release: merge back stable branch into default
marcink
r4528:5055a30b merge default


@@ -0,0 +1,54 b''
1 |RCE| 4.21.0 |RNS|
2 ------------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2020-09-28
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - Pull requests: overhaul of the UX/UI by adding a new sidebar
14 - Pull requests: new live reviewer presence indicator (requires channelstream enabled)
15 - Pull requests: new live comments indicator (requires channelstream enabled)
16 - Pull requests: new sidebar with comments/todos/referenced tickets navigation
17 - Commits page: introduced a sidebar for single commit pages
18
19
20 General
21 ^^^^^^^
22
23 - API: allow repo admins to get/set settings.
24 Previously it was only super-admins that could do that.
25 - Sessions: patch Beaker to take an expire time for Redis, for the auto session cleanup feature.
26 - Git: bumped git version to 2.27.0
27 - Packages: bumped to channelstream==0.6.14
28
29
30 Security
31 ^^^^^^^^
32
33 - Issue trackers: fix XSS in the description field.
34
35
36 Performance
37 ^^^^^^^^^^^
38
39 - Artifacts: speed-up of artifacts download request processing.
40
41
42 Fixes
43 ^^^^^
44
45 - Pull requests: properly save merge failure metadata.
46 In rare cases the merge check reported conflicts where there were none.
47 - Sessions: fixed a cleanup issue with corrupted session data.
48
49
50 Upgrade notes
51 ^^^^^^^^^^^^^
52
53 - Scheduled feature release.
54 - Git version was bumped to 2.27.0
@@ -0,0 +1,55 b''
1 |RCE| 4.22.0 |RNS|
2 ------------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2020-10-12
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - Reviewers: added observers as another role for reviewers.
14 Observer is a role that doesn't require voting, but still gets notified about the
15 PR and should participate in the review process.
16 - Issue trackers: implemented more sophisticated ticket data extraction based on the
17 advanced regex module. This allows using ticket references without false positives,
18 such as catching ticket data in a URL.
19 - Channelstream: notifications about updates and comments now work via the API, for both
20 pull requests and individual commits.
21
22
23 General
24 ^^^^^^^
25
26 - Data tables: unified the table look across the main RhodeCode repository pages.
27 - Users: autocomplete now sorts by matched username to show best matches first.
28 - Pull requests: only allow actual reviewers to leave status/votes, in order to not
29 confuse other users with votes from people who aren't actual reviewers.
30
31 Security
32 ^^^^^^^^
33
34
35
36 Performance
37 ^^^^^^^^^^^
38
39 - Default reviewers: optimized diff data and the creation of PRs with advanced default reviewers.
40 - Default reviewers: diff data now loads more things lazily for better performance.
41 - Pull requests: limit the amount of data saved in default reviewers data, for better memory usage.
42 - DB: don't use lazy loaders on PR-related objects, to optimize memory usage on large
43 pull requests with lots of comments and commits.
44
45 Fixes
46 ^^^^^
47
48 - Quick search bar: fixes #5634, a crash when searching on non-ASCII characters.
49 - Sidebar: a few fixes for panel rendering of reviewers/observers for both commits and PRs.
50
51 Upgrade notes
52 ^^^^^^^^^^^^^
53
54 - Scheduled feature release.
55
@@ -1,70 +1,72 b''
1 1 1bd3e92b7e2e2d2024152b34bb88dff1db544a71 v4.0.0
2 2 170c5398320ea6cddd50955e88d408794c21d43a v4.0.1
3 3 c3fe200198f5aa34cf2e4066df2881a9cefe3704 v4.1.0
4 4 7fd5c850745e2ea821fb4406af5f4bff9b0a7526 v4.1.1
5 5 41c87da28a179953df86061d817bc35533c66dd2 v4.1.2
6 6 baaf9f5bcea3bae0ef12ae20c8b270482e62abb6 v4.2.0
7 7 32a70c7e56844a825f61df496ee5eaf8c3c4e189 v4.2.1
8 8 fa695cdb411d294679ac081d595ac654e5613b03 v4.3.0
9 9 0e4dc11b58cad833c513fe17bac39e6850edf959 v4.3.1
10 10 8a876f48f5cb1d018b837db28ff928500cb32cfb v4.4.0
11 11 8dd86b410b1aac086ffdfc524ef300f896af5047 v4.4.1
12 12 d2514226abc8d3b4f6fb57765f47d1b6fb360a05 v4.4.2
13 13 27d783325930af6dad2741476c0d0b1b7c8415c2 v4.5.0
14 14 7f2016f352abcbdba4a19d4039c386e9629449da v4.5.1
15 15 416fec799314c70a5c780fb28b3357b08869333a v4.5.2
16 16 27c3b85fafc83143e6678fbc3da69e1615bcac55 v4.6.0
17 17 5ad13deb9118c2a5243d4032d4d9cc174e5872db v4.6.1
18 18 2be921e01fa24bb102696ada596f87464c3666f6 v4.7.0
19 19 7198bdec29c2872c974431d55200d0398354cdb1 v4.7.1
20 20 bd1c8d230fe741c2dfd7100a0ef39fd0774fd581 v4.7.2
21 21 9731914f89765d9628dc4dddc84bc9402aa124c8 v4.8.0
22 22 c5a2b7d0e4bbdebc4a62d7b624befe375207b659 v4.9.0
23 23 d9aa3b27ac9f7e78359775c75fedf7bfece232f1 v4.9.1
24 24 4ba4d74981cec5d6b28b158f875a2540952c2f74 v4.10.0
25 25 0a6821cbd6b0b3c21503002f88800679fa35ab63 v4.10.1
26 26 434ad90ec8d621f4416074b84f6e9ce03964defb v4.10.2
27 27 68baee10e698da2724c6e0f698c03a6abb993bf2 v4.10.3
28 28 00821d3afd1dce3f4767cc353f84a17f7d5218a1 v4.10.4
29 29 22f6744ad8cc274311825f63f953e4dee2ea5cb9 v4.10.5
30 30 96eb24bea2f5f9258775245e3f09f6fa0a4dda01 v4.10.6
31 31 3121217a812c956d7dd5a5875821bd73e8002a32 v4.11.0
32 32 fa98b454715ac5b912f39e84af54345909a2a805 v4.11.1
33 33 3982abcfdcc229a723cebe52d3a9bcff10bba08e v4.11.2
34 34 33195f145db9172f0a8f1487e09207178a6ab065 v4.11.3
35 35 194c74f33e32bbae6fc4d71ec5a999cff3c13605 v4.11.4
36 36 8fbd8b0c3ddc2fa4ac9e4ca16942a03eb593df2d v4.11.5
37 37 f0609aa5d5d05a1ca2f97c3995542236131c9d8a v4.11.6
38 38 b5b30547d90d2e088472a70c84878f429ffbf40d v4.12.0
39 39 9072253aa8894d20c00b4a43dc61c2168c1eff94 v4.12.1
40 40 6a517543ea9ef9987d74371bd2a315eb0b232dc9 v4.12.2
41 41 7fc0731b024c3114be87865eda7ab621cc957e32 v4.12.3
42 42 6d531c0b068c6eda62dddceedc9f845ecb6feb6f v4.12.4
43 43 3d6bf2d81b1564830eb5e83396110d2a9a93eb1e v4.13.0
44 44 5468fc89e708bd90e413cd0d54350017abbdbc0e v4.13.1
45 45 610d621550521c314ee97b3d43473ac0bcf06fb8 v4.13.2
46 46 7dc62c090881fb5d03268141e71e0940d7c3295d v4.13.3
47 47 9151328c1c46b72ba6f00d7640d9141e75aa1ca2 v4.14.0
48 48 a47eeac5dfa41fa6779d90452affba4091c3ade8 v4.14.1
49 49 4b34ce0d2c3c10510626b3b65044939bb7a2cddf v4.15.0
50 50 14502561d22e6b70613674cd675ae9a604b7989f v4.15.1
51 51 4aaa40b605b01af78a9f6882eca561c54b525ef0 v4.15.2
52 52 797744642eca86640ed20bef2cd77445780abaec v4.16.0
53 53 6c3452c7c25ed35ff269690929e11960ed6ad7d3 v4.16.1
54 54 5d8057df561c4b6b81b6401aed7d2f911e6e77f7 v4.16.2
55 55 13acfc008896ef4c62546bab5074e8f6f89b4fa7 v4.17.0
56 56 45b9b610976f483877142fe75321808ce9ebac59 v4.17.1
57 57 ad5bd0c4bd322fdbd04bb825a3d027e08f7a3901 v4.17.2
58 58 037f5794b55a6236d68f6485a485372dde6566e0 v4.17.3
59 59 83bc3100cfd6094c1d04f475ddb299b7dc3d0b33 v4.17.4
60 60 e3de8c95baf8cc9109ca56aee8193a2cb6a54c8a v4.17.4
61 61 f37a3126570477543507f0bc9d245ce75546181a v4.18.0
62 62 71d8791463e87b64c1a18475de330ee600d37561 v4.18.1
63 63 4bd6b75dac1d25c64885d4d49385e5533f21c525 v4.18.2
64 64 12ed92fe57f2e9fc7b71dc0b65e26c2da5c7085f v4.18.3
65 65 ddef396a6567117de531d67d44c739cbbfc3eebb v4.19.0
66 66 c0c65acd73914bf4368222d510afe1161ab8c07c v4.19.1
67 67 7ac623a4a2405917e2af660d645ded662011e40d v4.19.2
68 68 ef7ffda65eeb90c3ba88590a6cb816ef9b0bc232 v4.19.3
69 69 3e635489bb7961df93b01e42454ad1a8730ae968 v4.20.0
70 70 7e2eb896a02ca7cd2cd9f0f853ef3dac3f0039e3 v4.20.1
71 8bb5fece08ab65986225b184e46f53d2a71729cb v4.21.0
72 90734aac31ee4563bbe665a43ff73190cc762275 v4.22.0
@@ -1,448 +1,465 b''
1 1 .. _pull-request-methods-ref:
2 2
3 3 pull_request methods
4 4 ====================
5 5
6 6 close_pull_request
7 7 ------------------
8 8
9 9 .. py:function:: close_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>, message=<Optional:''>)
10 10
11 11 Close the pull request specified by `pullrequestid`.
12 12
13 13 :param apiuser: This is filled automatically from the |authtoken|.
14 14 :type apiuser: AuthUser
15 15 :param repoid: Repository name or repository ID to which the pull
16 16 request belongs.
17 17 :type repoid: str or int
18 18 :param pullrequestid: ID of the pull request to be closed.
19 19 :type pullrequestid: int
20 20 :param userid: Close the pull request as this user.
21 21 :type userid: Optional(str or int)
22 22 :param message: Optional message to close the Pull Request with. If not
23 23 specified it will be generated automatically.
24 24 :type message: Optional(str)
25 25
26 26 Example output:
27 27
28 28 .. code-block:: bash
29 29
30 30 "id": <id_given_in_input>,
31 31 "result": {
32 32 "pull_request_id": "<int>",
33 33 "close_status": "<str:status_lbl>,
34 34 "closed": "<bool>"
35 35 },
36 36 "error": null
37 37
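For illustration only, a minimal sketch of calling this method over the JSON-RPC API with Python ``requests``; the endpoint URL, auth token and values are hypothetical placeholders, not taken from this diff.

.. code-block:: python

    # Sketch of a close_pull_request call via the JSON-RPC API.
    # URL, token and values below are placeholders.
    import requests

    payload = {
        "id": 1,
        "auth_token": "SECRET_AUTH_TOKEN",
        "method": "close_pull_request",
        "args": {
            "repoid": "my-repo",          # repository name or ID
            "pullrequestid": 63,          # PR to close
            "message": "Superseded by another PR",
        },
    }
    response = requests.post("https://rhodecode.example.com/_admin/api", json=payload)
    print(response.json()["result"])      # e.g. {"pull_request_id": 63, "closed": True, ...}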
38 38
39 39 comment_pull_request
40 40 --------------------
41 41
42 42 .. py:function:: comment_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, message=<Optional:None>, commit_id=<Optional:None>, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, extra_recipients=<Optional:[]>, userid=<Optional:<OptionalAttr:apiuser>>, send_email=<Optional:True>)
43 43
44 44 Comment on the pull request specified with the `pullrequestid`,
45 45 in the |repo| specified by the `repoid`, and optionally change the
46 46 review status.
47 47
48 48 :param apiuser: This is filled automatically from the |authtoken|.
49 49 :type apiuser: AuthUser
50 50 :param repoid: Optional repository name or repository ID.
51 51 :type repoid: str or int
52 52 :param pullrequestid: The pull request ID.
53 53 :type pullrequestid: int
54 54 :param commit_id: Specify the commit_id for which to set a comment. If the
55 55 given commit_id is different than the latest in the PR, the status
56 56 change won't be performed.
57 57 :type commit_id: str
58 58 :param message: The text content of the comment.
59 59 :type message: str
60 60 :param status: (**Optional**) Set the approval status of the pull
61 61 request. One of: 'not_reviewed', 'approved', 'rejected',
62 62 'under_review'
63 63 :type status: str
64 64 :param comment_type: Comment type, one of: 'note', 'todo'
65 65 :type comment_type: Optional(str), default: 'note'
66 66 :param resolves_comment_id: ID of the comment which this one will resolve
67 67 :type resolves_comment_id: Optional(int)
68 68 :param extra_recipients: list of user IDs or usernames to additionally
69 69 notify about this comment. Acts like a CC for notifications
70 70 :type extra_recipients: Optional(list)
71 71 :param userid: Comment on the pull request as this user
72 72 :type userid: Optional(str or int)
73 73 :param send_email: Define if this comment should also send an email notification
74 74 :type send_email: Optional(bool)
75 75
76 76 Example output:
77 77
78 78 .. code-block:: bash
79 79
80 80 id : <id_given_in_input>
81 81 result : {
82 82 "pull_request_id": "<Integer>",
83 83 "comment_id": "<Integer>",
84 84 "status": {"given": <given_status>,
85 85 "was_changed": <bool status_was_actually_changed> },
86 86 },
87 87 error : null
88 88
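A sketch of the ``args`` payload for this method, to be sent in the same JSON-RPC envelope shown under ``close_pull_request``; all values are hypothetical.

.. code-block:: python

    # Hypothetical args for comment_pull_request.
    args = {
        "pullrequestid": 63,
        "message": "Please add a test for the edge case.",
        "comment_type": "todo",      # 'note' or 'todo'
        "status": "under_review",    # optional review status change
        "send_email": True,
    }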
89 89
90 90 create_pull_request
91 91 -------------------
92 92
93 .. py:function:: create_pull_request(apiuser, source_repo, target_repo, source_ref, target_ref, owner=<Optional:<OptionalAttr:apiuser>>, title=<Optional:''>, description=<Optional:''>, description_renderer=<Optional:''>, reviewers=<Optional:None>)
93 .. py:function:: create_pull_request(apiuser, source_repo, target_repo, source_ref, target_ref, owner=<Optional:<OptionalAttr:apiuser>>, title=<Optional:''>, description=<Optional:''>, description_renderer=<Optional:''>, reviewers=<Optional:None>, observers=<Optional:None>)
94 94
95 95 Creates a new pull request.
96 96
97 97 Accepts refs in the following formats:
98 98
99 99 * branch:<branch_name>:<sha>
100 100 * branch:<branch_name>
101 101 * bookmark:<bookmark_name>:<sha> (Mercurial only)
102 102 * bookmark:<bookmark_name> (Mercurial only)
103 103
104 104 :param apiuser: This is filled automatically from the |authtoken|.
105 105 :type apiuser: AuthUser
106 106 :param source_repo: Set the source repository name.
107 107 :type source_repo: str
108 108 :param target_repo: Set the target repository name.
109 109 :type target_repo: str
110 110 :param source_ref: Set the source ref name.
111 111 :type source_ref: str
112 112 :param target_ref: Set the target ref name.
113 113 :type target_ref: str
114 114 :param owner: user_id or username
115 115 :type owner: Optional(str)
116 116 :param title: Optionally set the pull request title; it is generated otherwise
117 117 :type title: str
118 118 :param description: Set the pull request description.
119 119 :type description: Optional(str)
120 120 :type description_renderer: Optional(str)
121 121 :param description_renderer: Set pull request renderer for the description.
122 122 It should be 'rst', 'markdown' or 'plain'. If not given, the default
123 123 system renderer will be used
124 124 :param reviewers: Set the new pull request reviewers list.
125 125 Reviewers defined by review rules will be added automatically to the
126 126 defined list.
127 127 :type reviewers: Optional(list)
128 128 Accepts username strings or objects of the format:
129 129
130 130 [{'username': 'nick', 'reasons': ['original author'], 'mandatory': <bool>}]
131 :param observers: Set the new pull request observers list.
132 Observers defined by review rules will be added automatically to the
133 defined list. This feature is only available in RhodeCode EE
134 :type observers: Optional(list)
135 Accepts username strings or objects of the format:
136
137 [{'username': 'nick', 'reasons': ['original author']}]
131 138
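A minimal sketch of creating a pull request with both reviewers and the new observers list; repository names, refs, token and endpoint are hypothetical placeholders.

.. code-block:: python

    # Sketch of create_pull_request with reviewers and observers (observers: EE only).
    # All names, refs and the endpoint are placeholders.
    import requests

    payload = {
        "id": 1,
        "auth_token": "SECRET_AUTH_TOKEN",
        "method": "create_pull_request",
        "args": {
            "source_repo": "my-repo-fork",
            "target_repo": "my-repo",
            "source_ref": "branch:feature-x",
            "target_ref": "branch:default",
            "title": "Add feature X",
            "reviewers": [
                {"username": "nick", "reasons": ["original author"], "mandatory": True},
            ],
            "observers": ["teamlead"],   # plain username strings are accepted too
        },
    }
    requests.post("https://rhodecode.example.com/_admin/api", json=payload)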
132 139
133 140 get_pull_request
134 141 ----------------
135 142
136 143 .. py:function:: get_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, merge_state=<Optional:False>)
137 144
138 145 Get a pull request based on the given ID.
139 146
140 147 :param apiuser: This is filled automatically from the |authtoken|.
141 148 :type apiuser: AuthUser
142 149 :param repoid: Optional, repository name or repository ID from where
143 150 the pull request was opened.
144 151 :type repoid: str or int
145 152 :param pullrequestid: ID of the requested pull request.
146 153 :type pullrequestid: int
147 154 :param merge_state: Optional. Calculate the merge state for each repository.
148 155 This could result in a longer time to fetch the data
149 156 :type merge_state: bool
150 157
151 158 Example output:
152 159
153 160 .. code-block:: bash
154 161
155 162 "id": <id_given_in_input>,
156 163 "result":
157 164 {
158 165 "pull_request_id": "<pull_request_id>",
159 166 "url": "<url>",
160 167 "title": "<title>",
161 168 "description": "<description>",
162 169 "status" : "<status>",
163 170 "created_on": "<date_time_created>",
164 171 "updated_on": "<date_time_updated>",
165 172 "versions": "<number_or_versions_of_pr>",
166 173 "commit_ids": [
167 174 ...
168 175 "<commit_id>",
169 176 "<commit_id>",
170 177 ...
171 178 ],
172 179 "review_status": "<review_status>",
173 180 "mergeable": {
174 181 "status": "<bool>",
175 182 "message": "<message>",
176 183 },
177 184 "source": {
178 185 "clone_url": "<clone_url>",
179 186 "repository": "<repository_name>",
180 187 "reference":
181 188 {
182 189 "name": "<name>",
183 190 "type": "<type>",
184 191 "commit_id": "<commit_id>",
185 192 }
186 193 },
187 194 "target": {
188 195 "clone_url": "<clone_url>",
189 196 "repository": "<repository_name>",
190 197 "reference":
191 198 {
192 199 "name": "<name>",
193 200 "type": "<type>",
194 201 "commit_id": "<commit_id>",
195 202 }
196 203 },
197 204 "merge": {
198 205 "clone_url": "<clone_url>",
199 206 "reference":
200 207 {
201 208 "name": "<name>",
202 209 "type": "<type>",
203 210 "commit_id": "<commit_id>",
204 211 }
205 212 },
206 213 "author": <user_obj>,
207 214 "reviewers": [
208 215 ...
209 216 {
210 217 "user": "<user_obj>",
211 218 "review_status": "<review_status>",
212 219 }
213 220 ...
214 221 ]
215 222 },
216 223 "error": null
217 224
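A sketch of the ``args`` for fetching a single pull request, reusing the JSON-RPC envelope shown under ``close_pull_request``; values are hypothetical.

.. code-block:: python

    # Hypothetical args for get_pull_request.
    args = {
        "pullrequestid": 63,
        "merge_state": True,   # optional; computing merge state takes longer
    }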
218 225
219 226 get_pull_request_comments
220 227 -------------------------
221 228
222 229 .. py:function:: get_pull_request_comments(apiuser, pullrequestid, repoid=<Optional:None>)
223 230
224 231 Get all comments of pull request specified with the `pullrequestid`
225 232
226 233 :param apiuser: This is filled automatically from the |authtoken|.
227 234 :type apiuser: AuthUser
228 235 :param repoid: Optional repository name or repository ID.
229 236 :type repoid: str or int
230 237 :param pullrequestid: The pull request ID.
231 238 :type pullrequestid: int
232 239
233 240 Example output:
234 241
235 242 .. code-block:: bash
236 243
237 244 id : <id_given_in_input>
238 245 result : [
239 246 {
240 247 "comment_author": {
241 248 "active": true,
242 249 "full_name_or_username": "Tom Gore",
243 250 "username": "admin"
244 251 },
245 252 "comment_created_on": "2017-01-02T18:43:45.533",
246 253 "comment_f_path": null,
247 254 "comment_id": 25,
248 255 "comment_lineno": null,
249 256 "comment_status": {
250 257 "status": "under_review",
251 258 "status_lbl": "Under Review"
252 259 },
253 260 "comment_text": "Example text",
254 261 "comment_type": null,
255 262 "comment_last_version: 0,
256 263 "pull_request_version": null,
257 264 "comment_commit_id": None,
258 265 "comment_pull_request_id": <pull_request_id>
259 266 }
260 267 ],
261 268 error : null
262 269
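A sketch of the ``args`` for listing all comments of a pull request (hypothetical IDs, same JSON-RPC envelope as above).

.. code-block:: python

    # Hypothetical args for get_pull_request_comments.
    args = {
        "repoid": "my-repo",     # optional
        "pullrequestid": 63,
    }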
263 270
264 271 get_pull_requests
265 272 -----------------
266 273
267 274 .. py:function:: get_pull_requests(apiuser, repoid, status=<Optional:'new'>, merge_state=<Optional:False>)
268 275
269 276 Get all pull requests from the repository specified in `repoid`.
270 277
271 278 :param apiuser: This is filled automatically from the |authtoken|.
272 279 :type apiuser: AuthUser
273 280 :param repoid: Optional repository name or repository ID.
274 281 :type repoid: str or int
275 282 :param status: Only return pull requests with the specified status.
276 283 Valid options are:
277 284 * ``new`` (default)
278 285 * ``open``
279 286 * ``closed``
280 287 :type status: str
281 288 :param merge_state: Optional. Calculate the merge state for each repository.
282 289 This could result in a longer time to fetch the data
283 290 :type merge_state: bool
284 291
285 292 Example output:
286 293
287 294 .. code-block:: bash
288 295
289 296 "id": <id_given_in_input>,
290 297 "result":
291 298 [
292 299 ...
293 300 {
294 301 "pull_request_id": "<pull_request_id>",
295 302 "url": "<url>",
296 303 "title" : "<title>",
297 304 "description": "<description>",
298 305 "status": "<status>",
299 306 "created_on": "<date_time_created>",
300 307 "updated_on": "<date_time_updated>",
301 308 "commit_ids": [
302 309 ...
303 310 "<commit_id>",
304 311 "<commit_id>",
305 312 ...
306 313 ],
307 314 "review_status": "<review_status>",
308 315 "mergeable": {
309 316 "status": "<bool>",
310 317 "message: "<message>",
311 318 },
312 319 "source": {
313 320 "clone_url": "<clone_url>",
314 321 "reference":
315 322 {
316 323 "name": "<name>",
317 324 "type": "<type>",
318 325 "commit_id": "<commit_id>",
319 326 }
320 327 },
321 328 "target": {
322 329 "clone_url": "<clone_url>",
323 330 "reference":
324 331 {
325 332 "name": "<name>",
326 333 "type": "<type>",
327 334 "commit_id": "<commit_id>",
328 335 }
329 336 },
330 337 "merge": {
331 338 "clone_url": "<clone_url>",
332 339 "reference":
333 340 {
334 341 "name": "<name>",
335 342 "type": "<type>",
336 343 "commit_id": "<commit_id>",
337 344 }
338 345 },
339 346 "author": <user_obj>,
340 347 "reviewers": [
341 348 ...
342 349 {
343 350 "user": "<user_obj>",
344 351 "review_status": "<review_status>",
345 352 }
346 353 ...
347 354 ]
348 355 }
349 356 ...
350 357 ],
351 358 "error": null
352 359
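A sketch of the ``args`` for listing pull requests of a repository, filtered by status; values are hypothetical and go in the same JSON-RPC envelope as above.

.. code-block:: python

    # Hypothetical args for get_pull_requests.
    args = {
        "repoid": "my-repo",
        "status": "open",       # 'new' (default), 'open' or 'closed'
        "merge_state": False,   # skip merge-state calculation for speed
    }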
353 360
354 361 merge_pull_request
355 362 ------------------
356 363
357 364 .. py:function:: merge_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>)
358 365
359 366 Merge the pull request specified by `pullrequestid` into its target
360 367 repository.
361 368
362 369 :param apiuser: This is filled automatically from the |authtoken|.
363 370 :type apiuser: AuthUser
364 371 :param repoid: Optional, repository name or repository ID of the
365 372 target repository to which the |pr| is to be merged.
366 373 :type repoid: str or int
367 374 :param pullrequestid: ID of the pull request which shall be merged.
368 375 :type pullrequestid: int
369 376 :param userid: Merge the pull request as this user.
370 377 :type userid: Optional(str or int)
371 378
372 379 Example output:
373 380
374 381 .. code-block:: bash
375 382
376 383 "id": <id_given_in_input>,
377 384 "result": {
378 385 "executed": "<bool>",
379 386 "failure_reason": "<int>",
380 387 "merge_status_message": "<str>",
381 388 "merge_commit_id": "<merge_commit_id>",
382 389 "possible": "<bool>",
383 390 "merge_ref": {
384 391 "commit_id": "<commit_id>",
385 392 "type": "<type>",
386 393 "name": "<name>"
387 394 }
388 395 },
389 396 "error": null
390 397
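A sketch of the ``args`` for merging a pull request into its target repository (hypothetical values, same JSON-RPC envelope as above).

.. code-block:: python

    # Hypothetical args for merge_pull_request.
    args = {
        "pullrequestid": 63,
        "userid": "merge-bot",   # optional: merge as this user
    }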
391 398
392 399 update_pull_request
393 400 -------------------
394 401
395 .. py:function:: update_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, title=<Optional:''>, description=<Optional:''>, description_renderer=<Optional:''>, reviewers=<Optional:None>, update_commits=<Optional:None>)
402 .. py:function:: update_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, title=<Optional:''>, description=<Optional:''>, description_renderer=<Optional:''>, reviewers=<Optional:None>, observers=<Optional:None>, update_commits=<Optional:None>)
396 403
397 404 Updates a pull request.
398 405
399 406 :param apiuser: This is filled automatically from the |authtoken|.
400 407 :type apiuser: AuthUser
401 408 :param repoid: Optional repository name or repository ID.
402 409 :type repoid: str or int
403 410 :param pullrequestid: The pull request ID.
404 411 :type pullrequestid: int
405 412 :param title: Set the pull request title.
406 413 :type title: str
407 414 :param description: Update pull request description.
408 415 :type description: Optional(str)
409 416 :type description_renderer: Optional(str)
410 417 :param description_renderer: Update pull request renderer for the description.
411 418 It should be 'rst', 'markdown' or 'plain'
412 419 :param reviewers: Update pull request reviewers list with new value.
413 420 :type reviewers: Optional(list)
414 421 Accepts username strings or objects of the format:
415 422
416 423 [{'username': 'nick', 'reasons': ['original author'], 'mandatory': <bool>}]
424 :param observers: Update pull request observers list with new value.
425 :type observers: Optional(list)
426 Accepts username strings or objects of the format:
417 427
428 [{'username': 'nick', 'reasons': ['should be aware about this PR']}]
418 429 :param update_commits: Trigger update of commits for this pull request
419 430 :type update_commits: Optional(bool)
420 431
421 432 Example output:
422 433
423 434 .. code-block:: bash
424 435
425 436 id : <id_given_in_input>
426 437 result : {
427 438 "msg": "Updated pull request `63`",
428 439 "pull_request": <pull_request_object>,
429 440 "updated_reviewers": {
430 441 "added": [
431 442 "username"
432 443 ],
433 444 "removed": []
434 445 },
446 "updated_observers": {
447 "added": [
448 "username"
449 ],
450 "removed": []
451 },
435 452 "updated_commits": {
436 453 "added": [
437 454 "<sha1_hash>"
438 455 ],
439 456 "common": [
440 457 "<sha1_hash>",
441 458 "<sha1_hash>",
442 459 ],
443 460 "removed": []
444 461 }
445 462 }
446 463 error : null
447 464
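A minimal sketch of updating a pull request's observers and pulling in new commits, using the same JSON-RPC envelope; endpoint, token and values are hypothetical.

.. code-block:: python

    # Sketch of update_pull_request adding an observer and refreshing commits.
    # Endpoint, token and values are placeholders.
    import requests

    payload = {
        "id": 1,
        "auth_token": "SECRET_AUTH_TOKEN",
        "method": "update_pull_request",
        "args": {
            "pullrequestid": 63,
            "observers": [
                {"username": "nick", "reasons": ["should be aware about this PR"]},
            ],
            "update_commits": True,   # pull in new commits from the source ref
        },
    }
    requests.post("https://rhodecode.example.com/_admin/api", json=payload)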
448 465
@@ -1,146 +1,148 b''
1 1 .. _rhodecode-release-notes-ref:
2 2
3 3 Release Notes
4 4 =============
5 5
6 6 |RCE| 4.x Versions
7 7 ------------------
8 8
9 9 .. toctree::
10 10 :maxdepth: 1
11 11
12 release-notes-4.22.0.rst
13 release-notes-4.21.0.rst
12 14 release-notes-4.20.1.rst
13 15 release-notes-4.20.0.rst
14 16 release-notes-4.19.3.rst
15 17 release-notes-4.19.2.rst
16 18 release-notes-4.19.1.rst
17 19 release-notes-4.19.0.rst
18 20 release-notes-4.18.3.rst
19 21 release-notes-4.18.2.rst
20 22 release-notes-4.18.1.rst
21 23 release-notes-4.18.0.rst
22 24 release-notes-4.17.4.rst
23 25 release-notes-4.17.3.rst
24 26 release-notes-4.17.2.rst
25 27 release-notes-4.17.1.rst
26 28 release-notes-4.17.0.rst
27 29 release-notes-4.16.2.rst
28 30 release-notes-4.16.1.rst
29 31 release-notes-4.16.0.rst
30 32 release-notes-4.15.2.rst
31 33 release-notes-4.15.1.rst
32 34 release-notes-4.15.0.rst
33 35 release-notes-4.14.1.rst
34 36 release-notes-4.14.0.rst
35 37 release-notes-4.13.3.rst
36 38 release-notes-4.13.2.rst
37 39 release-notes-4.13.1.rst
38 40 release-notes-4.13.0.rst
39 41 release-notes-4.12.4.rst
40 42 release-notes-4.12.3.rst
41 43 release-notes-4.12.2.rst
42 44 release-notes-4.12.1.rst
43 45 release-notes-4.12.0.rst
44 46 release-notes-4.11.6.rst
45 47 release-notes-4.11.5.rst
46 48 release-notes-4.11.4.rst
47 49 release-notes-4.11.3.rst
48 50 release-notes-4.11.2.rst
49 51 release-notes-4.11.1.rst
50 52 release-notes-4.11.0.rst
51 53 release-notes-4.10.6.rst
52 54 release-notes-4.10.5.rst
53 55 release-notes-4.10.4.rst
54 56 release-notes-4.10.3.rst
55 57 release-notes-4.10.2.rst
56 58 release-notes-4.10.1.rst
57 59 release-notes-4.10.0.rst
58 60 release-notes-4.9.1.rst
59 61 release-notes-4.9.0.rst
60 62 release-notes-4.8.0.rst
61 63 release-notes-4.7.2.rst
62 64 release-notes-4.7.1.rst
63 65 release-notes-4.7.0.rst
64 66 release-notes-4.6.1.rst
65 67 release-notes-4.6.0.rst
66 68 release-notes-4.5.2.rst
67 69 release-notes-4.5.1.rst
68 70 release-notes-4.5.0.rst
69 71 release-notes-4.4.2.rst
70 72 release-notes-4.4.1.rst
71 73 release-notes-4.4.0.rst
72 74 release-notes-4.3.1.rst
73 75 release-notes-4.3.0.rst
74 76 release-notes-4.2.1.rst
75 77 release-notes-4.2.0.rst
76 78 release-notes-4.1.2.rst
77 79 release-notes-4.1.1.rst
78 80 release-notes-4.1.0.rst
79 81 release-notes-4.0.1.rst
80 82 release-notes-4.0.0.rst
81 83
82 84 |RCE| 3.x Versions
83 85 ------------------
84 86
85 87 .. toctree::
86 88 :maxdepth: 1
87 89
88 90 release-notes-3.8.4.rst
89 91 release-notes-3.8.3.rst
90 92 release-notes-3.8.2.rst
91 93 release-notes-3.8.1.rst
92 94 release-notes-3.8.0.rst
93 95 release-notes-3.7.1.rst
94 96 release-notes-3.7.0.rst
95 97 release-notes-3.6.1.rst
96 98 release-notes-3.6.0.rst
97 99 release-notes-3.5.2.rst
98 100 release-notes-3.5.1.rst
99 101 release-notes-3.5.0.rst
100 102 release-notes-3.4.1.rst
101 103 release-notes-3.4.0.rst
102 104 release-notes-3.3.4.rst
103 105 release-notes-3.3.3.rst
104 106 release-notes-3.3.2.rst
105 107 release-notes-3.3.1.rst
106 108 release-notes-3.3.0.rst
107 109 release-notes-3.2.3.rst
108 110 release-notes-3.2.2.rst
109 111 release-notes-3.2.1.rst
110 112 release-notes-3.2.0.rst
111 113 release-notes-3.1.1.rst
112 114 release-notes-3.1.0.rst
113 115 release-notes-3.0.2.rst
114 116 release-notes-3.0.1.rst
115 117 release-notes-3.0.0.rst
116 118
117 119 |RCE| 2.x Versions
118 120 ------------------
119 121
120 122 .. toctree::
121 123 :maxdepth: 1
122 124
123 125 release-notes-2.2.8.rst
124 126 release-notes-2.2.7.rst
125 127 release-notes-2.2.6.rst
126 128 release-notes-2.2.5.rst
127 129 release-notes-2.2.4.rst
128 130 release-notes-2.2.3.rst
129 131 release-notes-2.2.2.rst
130 132 release-notes-2.2.1.rst
131 133 release-notes-2.2.0.rst
132 134 release-notes-2.1.0.rst
133 135 release-notes-2.0.2.rst
134 136 release-notes-2.0.1.rst
135 137 release-notes-2.0.0.rst
136 138
137 139 |RCE| 1.x Versions
138 140 ------------------
139 141
140 142 .. toctree::
141 143 :maxdepth: 1
142 144
143 145 release-notes-1.7.2.rst
144 146 release-notes-1.7.1.rst
145 147 release-notes-1.7.0.rst
146 148 release-notes-1.6.0.rst
@@ -1,2497 +1,2509 b''
1 1 # Generated by pip2nix 0.8.0.dev1
2 2 # See https://github.com/johbo/pip2nix
3 3
4 4 { pkgs, fetchurl, fetchgit, fetchhg }:
5 5
6 6 self: super: {
7 7 "alembic" = super.buildPythonPackage {
8 8 name = "alembic-1.4.2";
9 9 doCheck = false;
10 10 propagatedBuildInputs = [
11 11 self."sqlalchemy"
12 12 self."mako"
13 13 self."python-editor"
14 14 self."python-dateutil"
15 15 ];
16 16 src = fetchurl {
17 17 url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
18 18 sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
19 19 };
20 20 meta = {
21 21 license = [ pkgs.lib.licenses.mit ];
22 22 };
23 23 };
24 24 "amqp" = super.buildPythonPackage {
25 25 name = "amqp-2.5.2";
26 26 doCheck = false;
27 27 propagatedBuildInputs = [
28 28 self."vine"
29 29 ];
30 30 src = fetchurl {
31 31 url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
32 32 sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
33 33 };
34 34 meta = {
35 35 license = [ pkgs.lib.licenses.bsdOriginal ];
36 36 };
37 37 };
38 38 "apispec" = super.buildPythonPackage {
39 39 name = "apispec-1.0.0";
40 40 doCheck = false;
41 41 propagatedBuildInputs = [
42 42 self."PyYAML"
43 43 ];
44 44 src = fetchurl {
45 45 url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
46 46 sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
47 47 };
48 48 meta = {
49 49 license = [ pkgs.lib.licenses.mit ];
50 50 };
51 51 };
52 52 "appenlight-client" = super.buildPythonPackage {
53 53 name = "appenlight-client-0.6.26";
54 54 doCheck = false;
55 55 propagatedBuildInputs = [
56 56 self."webob"
57 57 self."requests"
58 58 self."six"
59 59 ];
60 60 src = fetchurl {
61 61 url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
62 62 sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
63 63 };
64 64 meta = {
65 65 license = [ pkgs.lib.licenses.bsdOriginal ];
66 66 };
67 67 };
68 68 "asn1crypto" = super.buildPythonPackage {
69 69 name = "asn1crypto-0.24.0";
70 70 doCheck = false;
71 71 src = fetchurl {
72 72 url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
73 73 sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
74 74 };
75 75 meta = {
76 76 license = [ pkgs.lib.licenses.mit ];
77 77 };
78 78 };
79 79 "atomicwrites" = super.buildPythonPackage {
80 80 name = "atomicwrites-1.3.0";
81 81 doCheck = false;
82 82 src = fetchurl {
83 83 url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
84 84 sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
85 85 };
86 86 meta = {
87 87 license = [ pkgs.lib.licenses.mit ];
88 88 };
89 89 };
90 90 "attrs" = super.buildPythonPackage {
91 91 name = "attrs-19.3.0";
92 92 doCheck = false;
93 93 src = fetchurl {
94 94 url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
95 95 sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
96 96 };
97 97 meta = {
98 98 license = [ pkgs.lib.licenses.mit ];
99 99 };
100 100 };
101 101 "babel" = super.buildPythonPackage {
102 102 name = "babel-1.3";
103 103 doCheck = false;
104 104 propagatedBuildInputs = [
105 105 self."pytz"
106 106 ];
107 107 src = fetchurl {
108 108 url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
109 109 sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
110 110 };
111 111 meta = {
112 112 license = [ pkgs.lib.licenses.bsdOriginal ];
113 113 };
114 114 };
115 115 "backports.shutil-get-terminal-size" = super.buildPythonPackage {
116 116 name = "backports.shutil-get-terminal-size-1.0.0";
117 117 doCheck = false;
118 118 src = fetchurl {
119 119 url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
120 120 sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
121 121 };
122 122 meta = {
123 123 license = [ pkgs.lib.licenses.mit ];
124 124 };
125 125 };
126 126 "beaker" = super.buildPythonPackage {
127 127 name = "beaker-1.9.1";
128 128 doCheck = false;
129 129 propagatedBuildInputs = [
130 130 self."funcsigs"
131 131 ];
132 132 src = fetchurl {
133 133 url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
134 134 sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
135 135 };
136 136 meta = {
137 137 license = [ pkgs.lib.licenses.bsdOriginal ];
138 138 };
139 139 };
140 140 "beautifulsoup4" = super.buildPythonPackage {
141 141 name = "beautifulsoup4-4.6.3";
142 142 doCheck = false;
143 143 src = fetchurl {
144 144 url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
145 145 sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
146 146 };
147 147 meta = {
148 148 license = [ pkgs.lib.licenses.mit ];
149 149 };
150 150 };
151 151 "billiard" = super.buildPythonPackage {
152 152 name = "billiard-3.6.1.0";
153 153 doCheck = false;
154 154 src = fetchurl {
155 155 url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
156 156 sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
157 157 };
158 158 meta = {
159 159 license = [ pkgs.lib.licenses.bsdOriginal ];
160 160 };
161 161 };
162 162 "bleach" = super.buildPythonPackage {
163 163 name = "bleach-3.1.3";
164 164 doCheck = false;
165 165 propagatedBuildInputs = [
166 166 self."six"
167 167 self."webencodings"
168 168 ];
169 169 src = fetchurl {
170 170 url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
171 171 sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
172 172 };
173 173 meta = {
174 174 license = [ pkgs.lib.licenses.asl20 ];
175 175 };
176 176 };
177 177 "bumpversion" = super.buildPythonPackage {
178 178 name = "bumpversion-0.5.3";
179 179 doCheck = false;
180 180 src = fetchurl {
181 181 url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
182 182 sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
183 183 };
184 184 meta = {
185 185 license = [ pkgs.lib.licenses.mit ];
186 186 };
187 187 };
188 188 "cachetools" = super.buildPythonPackage {
189 189 name = "cachetools-3.1.1";
190 190 doCheck = false;
191 191 src = fetchurl {
192 192 url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
193 193 sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
194 194 };
195 195 meta = {
196 196 license = [ pkgs.lib.licenses.mit ];
197 197 };
198 198 };
199 199 "celery" = super.buildPythonPackage {
200 200 name = "celery-4.3.0";
201 201 doCheck = false;
202 202 propagatedBuildInputs = [
203 203 self."pytz"
204 204 self."billiard"
205 205 self."kombu"
206 206 self."vine"
207 207 ];
208 208 src = fetchurl {
209 209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
210 210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
211 211 };
212 212 meta = {
213 213 license = [ pkgs.lib.licenses.bsdOriginal ];
214 214 };
215 215 };
216 216 "certifi" = super.buildPythonPackage {
217 217 name = "certifi-2020.4.5.1";
218 218 doCheck = false;
219 219 src = fetchurl {
220 220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
221 221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
222 222 };
223 223 meta = {
224 224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
225 225 };
226 226 };
227 227 "cffi" = super.buildPythonPackage {
228 228 name = "cffi-1.12.3";
229 229 doCheck = false;
230 230 propagatedBuildInputs = [
231 231 self."pycparser"
232 232 ];
233 233 src = fetchurl {
234 234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
235 235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
236 236 };
237 237 meta = {
238 238 license = [ pkgs.lib.licenses.mit ];
239 239 };
240 240 };
241 241 "chameleon" = super.buildPythonPackage {
242 242 name = "chameleon-2.24";
243 243 doCheck = false;
244 244 src = fetchurl {
245 245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
246 246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
247 247 };
248 248 meta = {
249 249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
250 250 };
251 251 };
252 252 "channelstream" = super.buildPythonPackage {
253 253 name = "channelstream-0.6.14";
254 254 doCheck = false;
255 255 propagatedBuildInputs = [
256 256 self."gevent"
257 257 self."ws4py"
258 258 self."marshmallow"
259 259 self."python-dateutil"
260 260 self."pyramid"
261 261 self."pyramid-jinja2"
262 262 self."pyramid-apispec"
263 263 self."itsdangerous"
264 264 self."requests"
265 265 self."six"
266 266 ];
267 267 src = fetchurl {
268 268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
269 269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
270 270 };
271 271 meta = {
272 272 license = [ pkgs.lib.licenses.bsdOriginal ];
273 273 };
274 274 };
275 275 "chardet" = super.buildPythonPackage {
276 276 name = "chardet-3.0.4";
277 277 doCheck = false;
278 278 src = fetchurl {
279 279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
280 280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
281 281 };
282 282 meta = {
283 283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
284 284 };
285 285 };
286 286 "click" = super.buildPythonPackage {
287 287 name = "click-7.0";
288 288 doCheck = false;
289 289 src = fetchurl {
290 290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
291 291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
292 292 };
293 293 meta = {
294 294 license = [ pkgs.lib.licenses.bsdOriginal ];
295 295 };
296 296 };
297 297 "colander" = super.buildPythonPackage {
298 298 name = "colander-1.7.0";
299 299 doCheck = false;
300 300 propagatedBuildInputs = [
301 301 self."translationstring"
302 302 self."iso8601"
303 303 self."enum34"
304 304 ];
305 305 src = fetchurl {
306 306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
307 307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
308 308 };
309 309 meta = {
310 310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
311 311 };
312 312 };
313 313 "configobj" = super.buildPythonPackage {
314 314 name = "configobj-5.0.6";
315 315 doCheck = false;
316 316 propagatedBuildInputs = [
317 317 self."six"
318 318 ];
319 319 src = fetchurl {
320 320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
321 321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
322 322 };
323 323 meta = {
324 324 license = [ pkgs.lib.licenses.bsdOriginal ];
325 325 };
326 326 };
327 327 "configparser" = super.buildPythonPackage {
328 328 name = "configparser-4.0.2";
329 329 doCheck = false;
330 330 src = fetchurl {
331 331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
332 332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
333 333 };
334 334 meta = {
335 335 license = [ pkgs.lib.licenses.mit ];
336 336 };
337 337 };
338 338 "contextlib2" = super.buildPythonPackage {
339 339 name = "contextlib2-0.6.0.post1";
340 340 doCheck = false;
341 341 src = fetchurl {
342 342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
343 343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
344 344 };
345 345 meta = {
346 346 license = [ pkgs.lib.licenses.psfl ];
347 347 };
348 348 };
349 349 "cov-core" = super.buildPythonPackage {
350 350 name = "cov-core-1.15.0";
351 351 doCheck = false;
352 352 propagatedBuildInputs = [
353 353 self."coverage"
354 354 ];
355 355 src = fetchurl {
356 356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
357 357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
358 358 };
359 359 meta = {
360 360 license = [ pkgs.lib.licenses.mit ];
361 361 };
362 362 };
363 363 "coverage" = super.buildPythonPackage {
364 364 name = "coverage-4.5.4";
365 365 doCheck = false;
366 366 src = fetchurl {
367 367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
368 368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
369 369 };
370 370 meta = {
371 371 license = [ pkgs.lib.licenses.asl20 ];
372 372 };
373 373 };
374 374 "cryptography" = super.buildPythonPackage {
375 375 name = "cryptography-2.6.1";
376 376 doCheck = false;
377 377 propagatedBuildInputs = [
378 378 self."asn1crypto"
379 379 self."six"
380 380 self."cffi"
381 381 self."enum34"
382 382 self."ipaddress"
383 383 ];
384 384 src = fetchurl {
385 385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
386 386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
387 387 };
388 388 meta = {
389 389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
390 390 };
391 391 };
392 392 "cssselect" = super.buildPythonPackage {
393 393 name = "cssselect-1.0.3";
394 394 doCheck = false;
395 395 src = fetchurl {
396 396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
397 397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
398 398 };
399 399 meta = {
400 400 license = [ pkgs.lib.licenses.bsdOriginal ];
401 401 };
402 402 };
403 403 "cssutils" = super.buildPythonPackage {
404 404 name = "cssutils-1.0.2";
405 405 doCheck = false;
406 406 src = fetchurl {
407 407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
408 408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
409 409 };
410 410 meta = {
411 411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
412 412 };
413 413 };
414 414 "decorator" = super.buildPythonPackage {
415 415 name = "decorator-4.1.2";
416 416 doCheck = false;
417 417 src = fetchurl {
418 418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
419 419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
420 420 };
421 421 meta = {
422 422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
423 423 };
424 424 };
425 425 "deform" = super.buildPythonPackage {
426 426 name = "deform-2.0.8";
427 427 doCheck = false;
428 428 propagatedBuildInputs = [
429 429 self."chameleon"
430 430 self."colander"
431 431 self."iso8601"
432 432 self."peppercorn"
433 433 self."translationstring"
434 434 self."zope.deprecation"
435 435 ];
436 436 src = fetchurl {
437 437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
438 438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
439 439 };
440 440 meta = {
441 441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
442 442 };
443 443 };
444 444 "defusedxml" = super.buildPythonPackage {
445 445 name = "defusedxml-0.6.0";
446 446 doCheck = false;
447 447 src = fetchurl {
448 448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
449 449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
450 450 };
451 451 meta = {
452 452 license = [ pkgs.lib.licenses.psfl ];
453 453 };
454 454 };
455 455 "dm.xmlsec.binding" = super.buildPythonPackage {
456 456 name = "dm.xmlsec.binding-1.3.7";
457 457 doCheck = false;
458 458 propagatedBuildInputs = [
459 459 self."setuptools"
460 460 self."lxml"
461 461 ];
462 462 src = fetchurl {
463 463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
464 464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
465 465 };
466 466 meta = {
467 467 license = [ pkgs.lib.licenses.bsdOriginal ];
468 468 };
469 469 };
470 470 "docutils" = super.buildPythonPackage {
471 471 name = "docutils-0.16";
472 472 doCheck = false;
473 473 src = fetchurl {
474 474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
475 475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
476 476 };
477 477 meta = {
478 478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
479 479 };
480 480 };
481 481 "dogpile.cache" = super.buildPythonPackage {
482 482 name = "dogpile.cache-0.9.0";
483 483 doCheck = false;
484 484 propagatedBuildInputs = [
485 485 self."decorator"
486 486 ];
487 487 src = fetchurl {
488 488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
489 489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
490 490 };
491 491 meta = {
492 492 license = [ pkgs.lib.licenses.bsdOriginal ];
493 493 };
494 494 };
495 495 "dogpile.core" = super.buildPythonPackage {
496 496 name = "dogpile.core-0.4.1";
497 497 doCheck = false;
498 498 src = fetchurl {
499 499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
500 500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
501 501 };
502 502 meta = {
503 503 license = [ pkgs.lib.licenses.bsdOriginal ];
504 504 };
505 505 };
506 506 "ecdsa" = super.buildPythonPackage {
507 507 name = "ecdsa-0.13.2";
508 508 doCheck = false;
509 509 src = fetchurl {
510 510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
511 511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
512 512 };
513 513 meta = {
514 514 license = [ pkgs.lib.licenses.mit ];
515 515 };
516 516 };
517 517 "elasticsearch" = super.buildPythonPackage {
518 518 name = "elasticsearch-6.3.1";
519 519 doCheck = false;
520 520 propagatedBuildInputs = [
521 521 self."urllib3"
522 522 ];
523 523 src = fetchurl {
524 524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
525 525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
526 526 };
527 527 meta = {
528 528 license = [ pkgs.lib.licenses.asl20 ];
529 529 };
530 530 };
531 531 "elasticsearch-dsl" = super.buildPythonPackage {
532 532 name = "elasticsearch-dsl-6.3.1";
533 533 doCheck = false;
534 534 propagatedBuildInputs = [
535 535 self."six"
536 536 self."python-dateutil"
537 537 self."elasticsearch"
538 538 self."ipaddress"
539 539 ];
540 540 src = fetchurl {
541 541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
542 542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
543 543 };
544 544 meta = {
545 545 license = [ pkgs.lib.licenses.asl20 ];
546 546 };
547 547 };
548 548 "elasticsearch1" = super.buildPythonPackage {
549 549 name = "elasticsearch1-1.10.0";
550 550 doCheck = false;
551 551 propagatedBuildInputs = [
552 552 self."urllib3"
553 553 ];
554 554 src = fetchurl {
555 555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
556 556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
557 557 };
558 558 meta = {
559 559 license = [ pkgs.lib.licenses.asl20 ];
560 560 };
561 561 };
562 562 "elasticsearch1-dsl" = super.buildPythonPackage {
563 563 name = "elasticsearch1-dsl-0.0.12";
564 564 doCheck = false;
565 565 propagatedBuildInputs = [
566 566 self."six"
567 567 self."python-dateutil"
568 568 self."elasticsearch1"
569 569 ];
570 570 src = fetchurl {
571 571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
572 572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
573 573 };
574 574 meta = {
575 575 license = [ pkgs.lib.licenses.asl20 ];
576 576 };
577 577 };
578 578 "elasticsearch2" = super.buildPythonPackage {
579 579 name = "elasticsearch2-2.5.1";
580 580 doCheck = false;
581 581 propagatedBuildInputs = [
582 582 self."urllib3"
583 583 ];
584 584 src = fetchurl {
585 585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
586 586 sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
587 587 };
588 588 meta = {
589 589 license = [ pkgs.lib.licenses.asl20 ];
590 590 };
591 591 };
592 592 "entrypoints" = super.buildPythonPackage {
593 593 name = "entrypoints-0.2.2";
594 594 doCheck = false;
595 595 propagatedBuildInputs = [
596 596 self."configparser"
597 597 ];
598 598 src = fetchurl {
599 599 url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
600 600 sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
601 601 };
602 602 meta = {
603 603 license = [ pkgs.lib.licenses.mit ];
604 604 };
605 605 };
606 606 "enum34" = super.buildPythonPackage {
607 607 name = "enum34-1.1.10";
608 608 doCheck = false;
609 609 src = fetchurl {
610 610 url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
611 611 sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
612 612 };
613 613 meta = {
614 614 license = [ pkgs.lib.licenses.bsdOriginal ];
615 615 };
616 616 };
617 617 "formencode" = super.buildPythonPackage {
618 618 name = "formencode-1.2.4";
619 619 doCheck = false;
620 620 src = fetchurl {
621 621 url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
622 622 sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
623 623 };
624 624 meta = {
625 625 license = [ pkgs.lib.licenses.psfl ];
626 626 };
627 627 };
628 628 "funcsigs" = super.buildPythonPackage {
629 629 name = "funcsigs-1.0.2";
630 630 doCheck = false;
631 631 src = fetchurl {
632 632 url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
633 633 sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
634 634 };
635 635 meta = {
636 636 license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
637 637 };
638 638 };
639 639 "functools32" = super.buildPythonPackage {
640 640 name = "functools32-3.2.3.post2";
641 641 doCheck = false;
642 642 src = fetchurl {
643 643 url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
644 644 sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
645 645 };
646 646 meta = {
647 647 license = [ pkgs.lib.licenses.psfl ];
648 648 };
649 649 };
650 650 "future" = super.buildPythonPackage {
651 651 name = "future-0.14.3";
652 652 doCheck = false;
653 653 src = fetchurl {
654 654 url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
655 655 sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
656 656 };
657 657 meta = {
658 658 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
659 659 };
660 660 };
661 661 "futures" = super.buildPythonPackage {
662 662 name = "futures-3.0.2";
663 663 doCheck = false;
664 664 src = fetchurl {
665 665 url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
666 666 sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
667 667 };
668 668 meta = {
669 669 license = [ pkgs.lib.licenses.bsdOriginal ];
670 670 };
671 671 };
672 672 "gevent" = super.buildPythonPackage {
673 673 name = "gevent-1.5.0";
674 674 doCheck = false;
675 675 propagatedBuildInputs = [
676 676 self."greenlet"
677 677 ];
678 678 src = fetchurl {
679 679 url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
680 680 sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
681 681 };
682 682 meta = {
683 683 license = [ pkgs.lib.licenses.mit ];
684 684 };
685 685 };
686 686 "gnureadline" = super.buildPythonPackage {
687 687 name = "gnureadline-6.3.8";
688 688 doCheck = false;
689 689 src = fetchurl {
690 690 url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
691 691 sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
692 692 };
693 693 meta = {
694 694 license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
695 695 };
696 696 };
697 697 "gprof2dot" = super.buildPythonPackage {
698 698 name = "gprof2dot-2017.9.19";
699 699 doCheck = false;
700 700 src = fetchurl {
701 701 url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
702 702 sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
703 703 };
704 704 meta = {
705 705 license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
706 706 };
707 707 };
708 708 "greenlet" = super.buildPythonPackage {
709 709 name = "greenlet-0.4.15";
710 710 doCheck = false;
711 711 src = fetchurl {
712 712 url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
713 713 sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
714 714 };
715 715 meta = {
716 716 license = [ pkgs.lib.licenses.mit ];
717 717 };
718 718 };
719 719 "gunicorn" = super.buildPythonPackage {
720 720 name = "gunicorn-19.9.0";
721 721 doCheck = false;
722 722 src = fetchurl {
723 723 url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
724 724 sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
725 725 };
726 726 meta = {
727 727 license = [ pkgs.lib.licenses.mit ];
728 728 };
729 729 };
730 730 "hupper" = super.buildPythonPackage {
731 731 name = "hupper-1.10.2";
732 732 doCheck = false;
733 733 src = fetchurl {
734 734 url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
735 735 sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
736 736 };
737 737 meta = {
738 738 license = [ pkgs.lib.licenses.mit ];
739 739 };
740 740 };
741 741 "idna" = super.buildPythonPackage {
742 742 name = "idna-2.8";
743 743 doCheck = false;
744 744 src = fetchurl {
745 745 url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
746 746 sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
747 747 };
748 748 meta = {
749 749 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
750 750 };
751 751 };
752 752 "importlib-metadata" = super.buildPythonPackage {
753 753 name = "importlib-metadata-1.6.0";
754 754 doCheck = false;
755 755 propagatedBuildInputs = [
756 756 self."zipp"
757 757 self."pathlib2"
758 758 self."contextlib2"
759 759 self."configparser"
760 760 ];
761 761 src = fetchurl {
762 762 url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
763 763 sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
764 764 };
765 765 meta = {
766 766 license = [ pkgs.lib.licenses.asl20 ];
767 767 };
768 768 };
769 769 "infrae.cache" = super.buildPythonPackage {
770 770 name = "infrae.cache-1.0.1";
771 771 doCheck = false;
772 772 propagatedBuildInputs = [
773 773 self."beaker"
774 774 self."repoze.lru"
775 775 ];
776 776 src = fetchurl {
777 777 url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
778 778 sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
779 779 };
780 780 meta = {
781 781 license = [ pkgs.lib.licenses.zpl21 ];
782 782 };
783 783 };
784 784 "invoke" = super.buildPythonPackage {
785 785 name = "invoke-0.13.0";
786 786 doCheck = false;
787 787 src = fetchurl {
788 788 url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
789 789 sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
790 790 };
791 791 meta = {
792 792 license = [ pkgs.lib.licenses.bsdOriginal ];
793 793 };
794 794 };
795 795 "ipaddress" = super.buildPythonPackage {
796 796 name = "ipaddress-1.0.23";
797 797 doCheck = false;
798 798 src = fetchurl {
799 799 url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
800 800 sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
801 801 };
802 802 meta = {
803 803 license = [ pkgs.lib.licenses.psfl ];
804 804 };
805 805 };
806 806 "ipdb" = super.buildPythonPackage {
807 807 name = "ipdb-0.13.2";
808 808 doCheck = false;
809 809 propagatedBuildInputs = [
810 810 self."setuptools"
811 811 self."ipython"
812 812 ];
813 813 src = fetchurl {
814 814 url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
815 815 sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
816 816 };
817 817 meta = {
818 818 license = [ pkgs.lib.licenses.bsdOriginal ];
819 819 };
820 820 };
821 821 "ipython" = super.buildPythonPackage {
822 822 name = "ipython-5.1.0";
823 823 doCheck = false;
824 824 propagatedBuildInputs = [
825 825 self."setuptools"
826 826 self."decorator"
827 827 self."pickleshare"
828 828 self."simplegeneric"
829 829 self."traitlets"
830 830 self."prompt-toolkit"
831 831 self."pygments"
832 832 self."pexpect"
833 833 self."backports.shutil-get-terminal-size"
834 834 self."pathlib2"
835 835 self."pexpect"
836 836 ];
837 837 src = fetchurl {
838 838 url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
839 839 sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
840 840 };
841 841 meta = {
842 842 license = [ pkgs.lib.licenses.bsdOriginal ];
843 843 };
844 844 };
845 845 "ipython-genutils" = super.buildPythonPackage {
846 846 name = "ipython-genutils-0.2.0";
847 847 doCheck = false;
848 848 src = fetchurl {
849 849 url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
850 850 sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
851 851 };
852 852 meta = {
853 853 license = [ pkgs.lib.licenses.bsdOriginal ];
854 854 };
855 855 };
856 856 "iso8601" = super.buildPythonPackage {
857 857 name = "iso8601-0.1.12";
858 858 doCheck = false;
859 859 src = fetchurl {
860 860 url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
861 861 sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
862 862 };
863 863 meta = {
864 864 license = [ pkgs.lib.licenses.mit ];
865 865 };
866 866 };
867 867 "isodate" = super.buildPythonPackage {
868 868 name = "isodate-0.6.0";
869 869 doCheck = false;
870 870 propagatedBuildInputs = [
871 871 self."six"
872 872 ];
873 873 src = fetchurl {
874 874 url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
875 875 sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
876 876 };
877 877 meta = {
878 878 license = [ pkgs.lib.licenses.bsdOriginal ];
879 879 };
880 880 };
881 881 "itsdangerous" = super.buildPythonPackage {
882 882 name = "itsdangerous-1.1.0";
883 883 doCheck = false;
884 884 src = fetchurl {
885 885 url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
886 886 sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
887 887 };
888 888 meta = {
889 889 license = [ pkgs.lib.licenses.bsdOriginal ];
890 890 };
891 891 };
892 892 "jinja2" = super.buildPythonPackage {
893 893 name = "jinja2-2.9.6";
894 894 doCheck = false;
895 895 propagatedBuildInputs = [
896 896 self."markupsafe"
897 897 ];
898 898 src = fetchurl {
899 899 url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
900 900 sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
901 901 };
902 902 meta = {
903 903 license = [ pkgs.lib.licenses.bsdOriginal ];
904 904 };
905 905 };
906 906 "jsonschema" = super.buildPythonPackage {
907 907 name = "jsonschema-2.6.0";
908 908 doCheck = false;
909 909 propagatedBuildInputs = [
910 910 self."functools32"
911 911 ];
912 912 src = fetchurl {
913 913 url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
914 914 sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
915 915 };
916 916 meta = {
917 917 license = [ pkgs.lib.licenses.mit ];
918 918 };
919 919 };
920 920 "jupyter-client" = super.buildPythonPackage {
921 921 name = "jupyter-client-5.0.0";
922 922 doCheck = false;
923 923 propagatedBuildInputs = [
924 924 self."traitlets"
925 925 self."jupyter-core"
926 926 self."pyzmq"
927 927 self."python-dateutil"
928 928 ];
929 929 src = fetchurl {
930 930 url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
931 931 sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
932 932 };
933 933 meta = {
934 934 license = [ pkgs.lib.licenses.bsdOriginal ];
935 935 };
936 936 };
937 937 "jupyter-core" = super.buildPythonPackage {
938 938 name = "jupyter-core-4.5.0";
939 939 doCheck = false;
940 940 propagatedBuildInputs = [
941 941 self."traitlets"
942 942 ];
943 943 src = fetchurl {
944 944 url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
945 945 sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
946 946 };
947 947 meta = {
948 948 license = [ pkgs.lib.licenses.bsdOriginal ];
949 949 };
950 950 };
951 951 "kombu" = super.buildPythonPackage {
952 952 name = "kombu-4.6.6";
953 953 doCheck = false;
954 954 propagatedBuildInputs = [
955 955 self."amqp"
956 956 self."importlib-metadata"
957 957 ];
958 958 src = fetchurl {
959 959 url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
960 960 sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
961 961 };
962 962 meta = {
963 963 license = [ pkgs.lib.licenses.bsdOriginal ];
964 964 };
965 965 };
966 966 "lxml" = super.buildPythonPackage {
967 967 name = "lxml-4.2.5";
968 968 doCheck = false;
969 969 src = fetchurl {
970 970 url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
971 971 sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
972 972 };
973 973 meta = {
974 974 license = [ pkgs.lib.licenses.bsdOriginal ];
975 975 };
976 976 };
977 977 "mako" = super.buildPythonPackage {
978 978 name = "mako-1.1.0";
979 979 doCheck = false;
980 980 propagatedBuildInputs = [
981 981 self."markupsafe"
982 982 ];
983 983 src = fetchurl {
984 984 url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
985 985 sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
986 986 };
987 987 meta = {
988 988 license = [ pkgs.lib.licenses.mit ];
989 989 };
990 990 };
991 991 "markdown" = super.buildPythonPackage {
992 992 name = "markdown-2.6.11";
993 993 doCheck = false;
994 994 src = fetchurl {
995 995 url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
996 996 sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
997 997 };
998 998 meta = {
999 999 license = [ pkgs.lib.licenses.bsdOriginal ];
1000 1000 };
1001 1001 };
1002 1002 "markupsafe" = super.buildPythonPackage {
1003 1003 name = "markupsafe-1.1.1";
1004 1004 doCheck = false;
1005 1005 src = fetchurl {
1006 1006 url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
1007 1007 sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
1008 1008 };
1009 1009 meta = {
1010 1010 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
1011 1011 };
1012 1012 };
1013 1013 "marshmallow" = super.buildPythonPackage {
1014 1014 name = "marshmallow-2.18.0";
1015 1015 doCheck = false;
1016 1016 src = fetchurl {
1017 1017 url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
1018 1018 sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
1019 1019 };
1020 1020 meta = {
1021 1021 license = [ pkgs.lib.licenses.mit ];
1022 1022 };
1023 1023 };
1024 1024 "mistune" = super.buildPythonPackage {
1025 1025 name = "mistune-0.8.4";
1026 1026 doCheck = false;
1027 1027 src = fetchurl {
1028 1028 url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
1029 1029 sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
1030 1030 };
1031 1031 meta = {
1032 1032 license = [ pkgs.lib.licenses.bsdOriginal ];
1033 1033 };
1034 1034 };
1035 1035 "mock" = super.buildPythonPackage {
1036 1036 name = "mock-3.0.5";
1037 1037 doCheck = false;
1038 1038 propagatedBuildInputs = [
1039 1039 self."six"
1040 1040 self."funcsigs"
1041 1041 ];
1042 1042 src = fetchurl {
1043 1043 url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
1044 1044 sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
1045 1045 };
1046 1046 meta = {
1047 1047 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
1048 1048 };
1049 1049 };
1050 1050 "more-itertools" = super.buildPythonPackage {
1051 1051 name = "more-itertools-5.0.0";
1052 1052 doCheck = false;
1053 1053 propagatedBuildInputs = [
1054 1054 self."six"
1055 1055 ];
1056 1056 src = fetchurl {
1057 1057 url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
1058 1058 sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
1059 1059 };
1060 1060 meta = {
1061 1061 license = [ pkgs.lib.licenses.mit ];
1062 1062 };
1063 1063 };
1064 1064 "msgpack-python" = super.buildPythonPackage {
1065 1065 name = "msgpack-python-0.5.6";
1066 1066 doCheck = false;
1067 1067 src = fetchurl {
1068 1068 url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
1069 1069 sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
1070 1070 };
1071 1071 meta = {
1072 1072 license = [ pkgs.lib.licenses.asl20 ];
1073 1073 };
1074 1074 };
1075 1075 "mysql-python" = super.buildPythonPackage {
1076 1076 name = "mysql-python-1.2.5";
1077 1077 doCheck = false;
1078 1078 src = fetchurl {
1079 1079 url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
1080 1080 sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
1081 1081 };
1082 1082 meta = {
1083 1083 license = [ pkgs.lib.licenses.gpl1 ];
1084 1084 };
1085 1085 };
1086 1086 "nbconvert" = super.buildPythonPackage {
1087 1087 name = "nbconvert-5.3.1";
1088 1088 doCheck = false;
1089 1089 propagatedBuildInputs = [
1090 1090 self."mistune"
1091 1091 self."jinja2"
1092 1092 self."pygments"
1093 1093 self."traitlets"
1094 1094 self."jupyter-core"
1095 1095 self."nbformat"
1096 1096 self."entrypoints"
1097 1097 self."bleach"
1098 1098 self."pandocfilters"
1099 1099 self."testpath"
1100 1100 ];
1101 1101 src = fetchurl {
1102 1102 url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
1103 1103 sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
1104 1104 };
1105 1105 meta = {
1106 1106 license = [ pkgs.lib.licenses.bsdOriginal ];
1107 1107 };
1108 1108 };
1109 1109 "nbformat" = super.buildPythonPackage {
1110 1110 name = "nbformat-4.4.0";
1111 1111 doCheck = false;
1112 1112 propagatedBuildInputs = [
1113 1113 self."ipython-genutils"
1114 1114 self."traitlets"
1115 1115 self."jsonschema"
1116 1116 self."jupyter-core"
1117 1117 ];
1118 1118 src = fetchurl {
1119 1119 url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
1120 1120 sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
1121 1121 };
1122 1122 meta = {
1123 1123 license = [ pkgs.lib.licenses.bsdOriginal ];
1124 1124 };
1125 1125 };
1126 1126 "packaging" = super.buildPythonPackage {
1127 1127 name = "packaging-20.3";
1128 1128 doCheck = false;
1129 1129 propagatedBuildInputs = [
1130 1130 self."pyparsing"
1131 1131 self."six"
1132 1132 ];
1133 1133 src = fetchurl {
1134 1134 url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
1135 1135 sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
1136 1136 };
1137 1137 meta = {
1138 1138 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
1139 1139 };
1140 1140 };
1141 1141 "pandocfilters" = super.buildPythonPackage {
1142 1142 name = "pandocfilters-1.4.2";
1143 1143 doCheck = false;
1144 1144 src = fetchurl {
1145 1145 url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
1146 1146 sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
1147 1147 };
1148 1148 meta = {
1149 1149 license = [ pkgs.lib.licenses.bsdOriginal ];
1150 1150 };
1151 1151 };
1152 1152 "paste" = super.buildPythonPackage {
1153 1153 name = "paste-3.4.0";
1154 1154 doCheck = false;
1155 1155 propagatedBuildInputs = [
1156 1156 self."six"
1157 1157 ];
1158 1158 src = fetchurl {
1159 1159 url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
1160 1160 sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
1161 1161 };
1162 1162 meta = {
1163 1163 license = [ pkgs.lib.licenses.mit ];
1164 1164 };
1165 1165 };
1166 1166 "pastedeploy" = super.buildPythonPackage {
1167 1167 name = "pastedeploy-2.1.0";
1168 1168 doCheck = false;
1169 1169 src = fetchurl {
1170 1170 url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
1171 1171 sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
1172 1172 };
1173 1173 meta = {
1174 1174 license = [ pkgs.lib.licenses.mit ];
1175 1175 };
1176 1176 };
1177 1177 "pastescript" = super.buildPythonPackage {
1178 1178 name = "pastescript-3.2.0";
1179 1179 doCheck = false;
1180 1180 propagatedBuildInputs = [
1181 1181 self."paste"
1182 1182 self."pastedeploy"
1183 1183 self."six"
1184 1184 ];
1185 1185 src = fetchurl {
1186 1186 url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
1187 1187 sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
1188 1188 };
1189 1189 meta = {
1190 1190 license = [ pkgs.lib.licenses.mit ];
1191 1191 };
1192 1192 };
1193 1193 "pathlib2" = super.buildPythonPackage {
1194 1194 name = "pathlib2-2.3.5";
1195 1195 doCheck = false;
1196 1196 propagatedBuildInputs = [
1197 1197 self."six"
1198 1198 self."scandir"
1199 1199 ];
1200 1200 src = fetchurl {
1201 1201 url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
1202 1202 sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
1203 1203 };
1204 1204 meta = {
1205 1205 license = [ pkgs.lib.licenses.mit ];
1206 1206 };
1207 1207 };
1208 1208 "peppercorn" = super.buildPythonPackage {
1209 1209 name = "peppercorn-0.6";
1210 1210 doCheck = false;
1211 1211 src = fetchurl {
1212 1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1213 1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1214 1214 };
1215 1215 meta = {
1216 1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1217 1217 };
1218 1218 };
1219 1219 "pexpect" = super.buildPythonPackage {
1220 1220 name = "pexpect-4.8.0";
1221 1221 doCheck = false;
1222 1222 propagatedBuildInputs = [
1223 1223 self."ptyprocess"
1224 1224 ];
1225 1225 src = fetchurl {
1226 1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1227 1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1228 1228 };
1229 1229 meta = {
1230 1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1231 1231 };
1232 1232 };
1233 1233 "pickleshare" = super.buildPythonPackage {
1234 1234 name = "pickleshare-0.7.5";
1235 1235 doCheck = false;
1236 1236 propagatedBuildInputs = [
1237 1237 self."pathlib2"
1238 1238 ];
1239 1239 src = fetchurl {
1240 1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1241 1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1242 1242 };
1243 1243 meta = {
1244 1244 license = [ pkgs.lib.licenses.mit ];
1245 1245 };
1246 1246 };
1247 1247 "plaster" = super.buildPythonPackage {
1248 1248 name = "plaster-1.0";
1249 1249 doCheck = false;
1250 1250 propagatedBuildInputs = [
1251 1251 self."setuptools"
1252 1252 ];
1253 1253 src = fetchurl {
1254 1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1255 1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1256 1256 };
1257 1257 meta = {
1258 1258 license = [ pkgs.lib.licenses.mit ];
1259 1259 };
1260 1260 };
1261 1261 "plaster-pastedeploy" = super.buildPythonPackage {
1262 1262 name = "plaster-pastedeploy-0.7";
1263 1263 doCheck = false;
1264 1264 propagatedBuildInputs = [
1265 1265 self."pastedeploy"
1266 1266 self."plaster"
1267 1267 ];
1268 1268 src = fetchurl {
1269 1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1270 1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1271 1271 };
1272 1272 meta = {
1273 1273 license = [ pkgs.lib.licenses.mit ];
1274 1274 };
1275 1275 };
1276 1276 "pluggy" = super.buildPythonPackage {
1277 1277 name = "pluggy-0.13.1";
1278 1278 doCheck = false;
1279 1279 propagatedBuildInputs = [
1280 1280 self."importlib-metadata"
1281 1281 ];
1282 1282 src = fetchurl {
1283 1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1284 1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1285 1285 };
1286 1286 meta = {
1287 1287 license = [ pkgs.lib.licenses.mit ];
1288 1288 };
1289 1289 };
1290 1290 "premailer" = super.buildPythonPackage {
1291 1291 name = "premailer-3.6.1";
1292 1292 doCheck = false;
1293 1293 propagatedBuildInputs = [
1294 1294 self."lxml"
1295 1295 self."cssselect"
1296 1296 self."cssutils"
1297 1297 self."requests"
1298 1298 self."cachetools"
1299 1299 ];
1300 1300 src = fetchurl {
1301 1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1302 1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1303 1303 };
1304 1304 meta = {
1305 1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1306 1306 };
1307 1307 };
1308 1308 "prompt-toolkit" = super.buildPythonPackage {
1309 1309 name = "prompt-toolkit-1.0.18";
1310 1310 doCheck = false;
1311 1311 propagatedBuildInputs = [
1312 1312 self."six"
1313 1313 self."wcwidth"
1314 1314 ];
1315 1315 src = fetchurl {
1316 1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1317 1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1318 1318 };
1319 1319 meta = {
1320 1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1321 1321 };
1322 1322 };
1323 1323 "psutil" = super.buildPythonPackage {
1324 1324 name = "psutil-5.7.0";
1325 1325 doCheck = false;
1326 1326 src = fetchurl {
1327 1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1328 1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1329 1329 };
1330 1330 meta = {
1331 1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1332 1332 };
1333 1333 };
1334 1334 "psycopg2" = super.buildPythonPackage {
1335 1335 name = "psycopg2-2.8.4";
1336 1336 doCheck = false;
1337 1337 src = fetchurl {
1338 1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1339 1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1340 1340 };
1341 1341 meta = {
1342 1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1343 1343 };
1344 1344 };
1345 1345 "ptyprocess" = super.buildPythonPackage {
1346 1346 name = "ptyprocess-0.6.0";
1347 1347 doCheck = false;
1348 1348 src = fetchurl {
1349 1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1350 1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1351 1351 };
1352 1352 meta = {
1353 1353 license = [ ];
1354 1354 };
1355 1355 };
1356 1356 "py" = super.buildPythonPackage {
1357 1357 name = "py-1.8.0";
1358 1358 doCheck = false;
1359 1359 src = fetchurl {
1360 1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1361 1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1362 1362 };
1363 1363 meta = {
1364 1364 license = [ pkgs.lib.licenses.mit ];
1365 1365 };
1366 1366 };
1367 1367 "py-bcrypt" = super.buildPythonPackage {
1368 1368 name = "py-bcrypt-0.4";
1369 1369 doCheck = false;
1370 1370 src = fetchurl {
1371 1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1372 1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1373 1373 };
1374 1374 meta = {
1375 1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1376 1376 };
1377 1377 };
1378 1378 "py-gfm" = super.buildPythonPackage {
1379 1379 name = "py-gfm-0.1.4";
1380 1380 doCheck = false;
1381 1381 propagatedBuildInputs = [
1382 1382 self."setuptools"
1383 1383 self."markdown"
1384 1384 ];
1385 1385 src = fetchurl {
1386 1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1387 1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1388 1388 };
1389 1389 meta = {
1390 1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1391 1391 };
1392 1392 };
1393 1393 "pyasn1" = super.buildPythonPackage {
1394 1394 name = "pyasn1-0.4.8";
1395 1395 doCheck = false;
1396 1396 src = fetchurl {
1397 1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1398 1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1399 1399 };
1400 1400 meta = {
1401 1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1402 1402 };
1403 1403 };
1404 1404 "pyasn1-modules" = super.buildPythonPackage {
1405 1405 name = "pyasn1-modules-0.2.6";
1406 1406 doCheck = false;
1407 1407 propagatedBuildInputs = [
1408 1408 self."pyasn1"
1409 1409 ];
1410 1410 src = fetchurl {
1411 1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1412 1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1413 1413 };
1414 1414 meta = {
1415 1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1416 1416 };
1417 1417 };
1418 1418 "pycparser" = super.buildPythonPackage {
1419 1419 name = "pycparser-2.20";
1420 1420 doCheck = false;
1421 1421 src = fetchurl {
1422 1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1423 1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1424 1424 };
1425 1425 meta = {
1426 1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1427 1427 };
1428 1428 };
1429 1429 "pycrypto" = super.buildPythonPackage {
1430 1430 name = "pycrypto-2.6.1";
1431 1431 doCheck = false;
1432 1432 src = fetchurl {
1433 1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1434 1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1435 1435 };
1436 1436 meta = {
1437 1437 license = [ pkgs.lib.licenses.publicDomain ];
1438 1438 };
1439 1439 };
1440 1440 "pycurl" = super.buildPythonPackage {
1441 1441 name = "pycurl-7.43.0.3";
1442 1442 doCheck = false;
1443 1443 src = fetchurl {
1444 1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1445 1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1446 1446 };
1447 1447 meta = {
1448 1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1449 1449 };
1450 1450 };
1451 1451 "pygments" = super.buildPythonPackage {
1452 1452 name = "pygments-2.4.2";
1453 1453 doCheck = false;
1454 1454 src = fetchurl {
1455 1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1456 1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1457 1457 };
1458 1458 meta = {
1459 1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1460 1460 };
1461 1461 };
1462 1462 "pymysql" = super.buildPythonPackage {
1463 1463 name = "pymysql-0.8.1";
1464 1464 doCheck = false;
1465 1465 src = fetchurl {
1466 1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1467 1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1468 1468 };
1469 1469 meta = {
1470 1470 license = [ pkgs.lib.licenses.mit ];
1471 1471 };
1472 1472 };
1473 1473 "pyotp" = super.buildPythonPackage {
1474 1474 name = "pyotp-2.3.0";
1475 1475 doCheck = false;
1476 1476 src = fetchurl {
1477 1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1478 1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1479 1479 };
1480 1480 meta = {
1481 1481 license = [ pkgs.lib.licenses.mit ];
1482 1482 };
1483 1483 };
1484 1484 "pyparsing" = super.buildPythonPackage {
1485 1485 name = "pyparsing-2.4.7";
1486 1486 doCheck = false;
1487 1487 src = fetchurl {
1488 1488 url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
1489 1489 sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
1490 1490 };
1491 1491 meta = {
1492 1492 license = [ pkgs.lib.licenses.mit ];
1493 1493 };
1494 1494 };
1495 1495 "pyramid" = super.buildPythonPackage {
1496 1496 name = "pyramid-1.10.4";
1497 1497 doCheck = false;
1498 1498 propagatedBuildInputs = [
1499 1499 self."hupper"
1500 1500 self."plaster"
1501 1501 self."plaster-pastedeploy"
1502 1502 self."setuptools"
1503 1503 self."translationstring"
1504 1504 self."venusian"
1505 1505 self."webob"
1506 1506 self."zope.deprecation"
1507 1507 self."zope.interface"
1508 1508 self."repoze.lru"
1509 1509 ];
1510 1510 src = fetchurl {
1511 1511 url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
1512 1512 sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
1513 1513 };
1514 1514 meta = {
1515 1515 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1516 1516 };
1517 1517 };
1518 1518 "pyramid-debugtoolbar" = super.buildPythonPackage {
1519 1519 name = "pyramid-debugtoolbar-4.6.1";
1520 1520 doCheck = false;
1521 1521 propagatedBuildInputs = [
1522 1522 self."pyramid"
1523 1523 self."pyramid-mako"
1524 1524 self."repoze.lru"
1525 1525 self."pygments"
1526 1526 self."ipaddress"
1527 1527 ];
1528 1528 src = fetchurl {
1529 1529 url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
1530 1530 sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
1531 1531 };
1532 1532 meta = {
1533 1533 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1534 1534 };
1535 1535 };
1536 1536 "pyramid-jinja2" = super.buildPythonPackage {
1537 1537 name = "pyramid-jinja2-2.7";
1538 1538 doCheck = false;
1539 1539 propagatedBuildInputs = [
1540 1540 self."pyramid"
1541 1541 self."zope.deprecation"
1542 1542 self."jinja2"
1543 1543 self."markupsafe"
1544 1544 ];
1545 1545 src = fetchurl {
1546 1546 url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
1547 1547 sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
1548 1548 };
1549 1549 meta = {
1550 1550 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1551 1551 };
1552 1552 };
1553 1553 "pyramid-apispec" = super.buildPythonPackage {
1554 1554 name = "pyramid-apispec-0.3.2";
1555 1555 doCheck = false;
1556 1556 propagatedBuildInputs = [
1557 1557 self."apispec"
1558 1558 ];
1559 1559 src = fetchurl {
1560 1560 url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
1561 1561 sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
1562 1562 };
1563 1563 meta = {
1564 1564 license = [ pkgs.lib.licenses.bsdOriginal ];
1565 1565 };
1566 1566 };
1567 1567 "pyramid-mailer" = super.buildPythonPackage {
1568 1568 name = "pyramid-mailer-0.15.1";
1569 1569 doCheck = false;
1570 1570 propagatedBuildInputs = [
1571 1571 self."pyramid"
1572 1572 self."repoze.sendmail"
1573 1573 self."transaction"
1574 1574 ];
1575 1575 src = fetchurl {
1576 1576 url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
1577 1577 sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
1578 1578 };
1579 1579 meta = {
1580 1580 license = [ pkgs.lib.licenses.bsdOriginal ];
1581 1581 };
1582 1582 };
1583 1583 "pyramid-mako" = super.buildPythonPackage {
1584 1584 name = "pyramid-mako-1.1.0";
1585 1585 doCheck = false;
1586 1586 propagatedBuildInputs = [
1587 1587 self."pyramid"
1588 1588 self."mako"
1589 1589 ];
1590 1590 src = fetchurl {
1591 1591 url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
1592 1592 sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
1593 1593 };
1594 1594 meta = {
1595 1595 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1596 1596 };
1597 1597 };
1598 1598 "pysqlite" = super.buildPythonPackage {
1599 1599 name = "pysqlite-2.8.3";
1600 1600 doCheck = false;
1601 1601 src = fetchurl {
1602 1602 url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
1603 1603 sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
1604 1604 };
1605 1605 meta = {
1606 1606 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1607 1607 };
1608 1608 };
1609 1609 "pytest" = super.buildPythonPackage {
1610 1610 name = "pytest-4.6.5";
1611 1611 doCheck = false;
1612 1612 propagatedBuildInputs = [
1613 1613 self."py"
1614 1614 self."six"
1615 1615 self."packaging"
1616 1616 self."attrs"
1617 1617 self."atomicwrites"
1618 1618 self."pluggy"
1619 1619 self."importlib-metadata"
1620 1620 self."wcwidth"
1621 1621 self."funcsigs"
1622 1622 self."pathlib2"
1623 1623 self."more-itertools"
1624 1624 ];
1625 1625 src = fetchurl {
1626 1626 url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
1627 1627 sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
1628 1628 };
1629 1629 meta = {
1630 1630 license = [ pkgs.lib.licenses.mit ];
1631 1631 };
1632 1632 };
1633 1633 "pytest-cov" = super.buildPythonPackage {
1634 1634 name = "pytest-cov-2.7.1";
1635 1635 doCheck = false;
1636 1636 propagatedBuildInputs = [
1637 1637 self."pytest"
1638 1638 self."coverage"
1639 1639 ];
1640 1640 src = fetchurl {
1641 1641 url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
1642 1642 sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
1643 1643 };
1644 1644 meta = {
1645 1645 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
1646 1646 };
1647 1647 };
1648 1648 "pytest-profiling" = super.buildPythonPackage {
1649 1649 name = "pytest-profiling-1.7.0";
1650 1650 doCheck = false;
1651 1651 propagatedBuildInputs = [
1652 1652 self."six"
1653 1653 self."pytest"
1654 1654 self."gprof2dot"
1655 1655 ];
1656 1656 src = fetchurl {
1657 1657 url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
1658 1658 sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
1659 1659 };
1660 1660 meta = {
1661 1661 license = [ pkgs.lib.licenses.mit ];
1662 1662 };
1663 1663 };
1664 1664 "pytest-runner" = super.buildPythonPackage {
1665 1665 name = "pytest-runner-5.1";
1666 1666 doCheck = false;
1667 1667 src = fetchurl {
1668 1668 url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
1669 1669 sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
1670 1670 };
1671 1671 meta = {
1672 1672 license = [ pkgs.lib.licenses.mit ];
1673 1673 };
1674 1674 };
1675 1675 "pytest-sugar" = super.buildPythonPackage {
1676 1676 name = "pytest-sugar-0.9.2";
1677 1677 doCheck = false;
1678 1678 propagatedBuildInputs = [
1679 1679 self."pytest"
1680 1680 self."termcolor"
1681 1681 self."packaging"
1682 1682 ];
1683 1683 src = fetchurl {
1684 1684 url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
1685 1685 sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
1686 1686 };
1687 1687 meta = {
1688 1688 license = [ pkgs.lib.licenses.bsdOriginal ];
1689 1689 };
1690 1690 };
1691 1691 "pytest-timeout" = super.buildPythonPackage {
1692 1692 name = "pytest-timeout-1.3.3";
1693 1693 doCheck = false;
1694 1694 propagatedBuildInputs = [
1695 1695 self."pytest"
1696 1696 ];
1697 1697 src = fetchurl {
1698 1698 url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1699 1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1700 1700 };
1701 1701 meta = {
1702 1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1703 1703 };
1704 1704 };
1705 1705 "python-dateutil" = super.buildPythonPackage {
1706 1706 name = "python-dateutil-2.8.1";
1707 1707 doCheck = false;
1708 1708 propagatedBuildInputs = [
1709 1709 self."six"
1710 1710 ];
1711 1711 src = fetchurl {
1712 1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1713 1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1714 1714 };
1715 1715 meta = {
1716 1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1717 1717 };
1718 1718 };
1719 1719 "python-editor" = super.buildPythonPackage {
1720 1720 name = "python-editor-1.0.4";
1721 1721 doCheck = false;
1722 1722 src = fetchurl {
1723 1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1724 1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1725 1725 };
1726 1726 meta = {
1727 1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1728 1728 };
1729 1729 };
1730 1730 "python-ldap" = super.buildPythonPackage {
1731 1731 name = "python-ldap-3.2.0";
1732 1732 doCheck = false;
1733 1733 propagatedBuildInputs = [
1734 1734 self."pyasn1"
1735 1735 self."pyasn1-modules"
1736 1736 ];
1737 1737 src = fetchurl {
1738 1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1739 1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1740 1740 };
1741 1741 meta = {
1742 1742 license = [ pkgs.lib.licenses.psfl ];
1743 1743 };
1744 1744 };
1745 1745 "python-memcached" = super.buildPythonPackage {
1746 1746 name = "python-memcached-1.59";
1747 1747 doCheck = false;
1748 1748 propagatedBuildInputs = [
1749 1749 self."six"
1750 1750 ];
1751 1751 src = fetchurl {
1752 1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1753 1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1754 1754 };
1755 1755 meta = {
1756 1756 license = [ pkgs.lib.licenses.psfl ];
1757 1757 };
1758 1758 };
1759 1759 "python-pam" = super.buildPythonPackage {
1760 1760 name = "python-pam-1.8.4";
1761 1761 doCheck = false;
1762 1762 src = fetchurl {
1763 1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1764 1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1765 1765 };
1766 1766 meta = {
1767 1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1768 1768 };
1769 1769 };
1770 1770 "python-saml" = super.buildPythonPackage {
1771 1771 name = "python-saml-2.4.2";
1772 1772 doCheck = false;
1773 1773 propagatedBuildInputs = [
1774 1774 self."dm.xmlsec.binding"
1775 1775 self."isodate"
1776 1776 self."defusedxml"
1777 1777 ];
1778 1778 src = fetchurl {
1779 1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1780 1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1781 1781 };
1782 1782 meta = {
1783 1783 license = [ pkgs.lib.licenses.mit ];
1784 1784 };
1785 1785 };
1786 1786 "pytz" = super.buildPythonPackage {
1787 1787 name = "pytz-2019.3";
1788 1788 doCheck = false;
1789 1789 src = fetchurl {
1790 1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1791 1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1792 1792 };
1793 1793 meta = {
1794 1794 license = [ pkgs.lib.licenses.mit ];
1795 1795 };
1796 1796 };
1797 1797 "pyzmq" = super.buildPythonPackage {
1798 1798 name = "pyzmq-14.6.0";
1799 1799 doCheck = false;
1800 1800 src = fetchurl {
1801 1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1802 1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1803 1803 };
1804 1804 meta = {
1805 1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1806 1806 };
1807 1807 };
1808 1808 "PyYAML" = super.buildPythonPackage {
1809 1809 name = "PyYAML-5.3.1";
1810 1810 doCheck = false;
1811 1811 src = fetchurl {
1812 1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1813 1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1814 1814 };
1815 1815 meta = {
1816 1816 license = [ pkgs.lib.licenses.mit ];
1817 1817 };
1818 1818 };
1819 "regex" = super.buildPythonPackage {
1820 name = "regex-2020.9.27";
1821 doCheck = false;
1822 src = fetchurl {
1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1825 };
1826 meta = {
1827 license = [ pkgs.lib.licenses.psfl ];
1828 };
1829 };
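The "regex" entry added above is a plain source pin (package name, tarball URL, sha256). If a different upstream release ever needed to be tried locally, the pin could be shadowed from a later override layer instead of editing this generated file. A minimal sketch, assuming the set in this diff is applied as a `self: super:` override function (as its use of `super.buildPythonPackage` suggests); the version, URL path and hash below are placeholders, not values from this repository:

    { pkgs }:

    self: super: {
      # Hypothetical local override layer; replace the placeholders with a real release.
      "regex" = super."regex".overridePythonAttrs (old: {
        name = "regex-<other-version>";
        src = pkgs.fetchurl {
          url = "https://files.pythonhosted.org/packages/<path>/regex-<other-version>.tar.gz";
          sha256 = "<sha256 of the chosen tarball>";
        };
      });
    }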
1819 1830 "redis" = super.buildPythonPackage {
1820 1831 name = "redis-3.4.1";
1821 1832 doCheck = false;
1822 1833 src = fetchurl {
1823 1834 url = "https://files.pythonhosted.org/packages/ef/2e/2c0f59891db7db087a7eeaa79bc7c7f2c039e71a2b5b0a41391e9d462926/redis-3.4.1.tar.gz";
1824 1835 sha256 = "07yaj0j9fs7xdkg5bg926fa990khyigjbp31si8ai20vj8sv7kqd";
1825 1836 };
1826 1837 meta = {
1827 1838 license = [ pkgs.lib.licenses.mit ];
1828 1839 };
1829 1840 };
1830 1841 "repoze.lru" = super.buildPythonPackage {
1831 1842 name = "repoze.lru-0.7";
1832 1843 doCheck = false;
1833 1844 src = fetchurl {
1834 1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1835 1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1836 1847 };
1837 1848 meta = {
1838 1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1839 1850 };
1840 1851 };
1841 1852 "repoze.sendmail" = super.buildPythonPackage {
1842 1853 name = "repoze.sendmail-4.4.1";
1843 1854 doCheck = false;
1844 1855 propagatedBuildInputs = [
1845 1856 self."setuptools"
1846 1857 self."zope.interface"
1847 1858 self."transaction"
1848 1859 ];
1849 1860 src = fetchurl {
1850 1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1851 1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1852 1863 };
1853 1864 meta = {
1854 1865 license = [ pkgs.lib.licenses.zpl21 ];
1855 1866 };
1856 1867 };
1857 1868 "requests" = super.buildPythonPackage {
1858 1869 name = "requests-2.22.0";
1859 1870 doCheck = false;
1860 1871 propagatedBuildInputs = [
1861 1872 self."chardet"
1862 1873 self."idna"
1863 1874 self."urllib3"
1864 1875 self."certifi"
1865 1876 ];
1866 1877 src = fetchurl {
1867 1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1868 1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1869 1880 };
1870 1881 meta = {
1871 1882 license = [ pkgs.lib.licenses.asl20 ];
1872 1883 };
1873 1884 };
1874 1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1875 name = "rhodecode-enterprise-ce-4.20.0";
1886 name = "rhodecode-enterprise-ce-4.22.0";
1876 1887 buildInputs = [
1877 1888 self."pytest"
1878 1889 self."py"
1879 1890 self."pytest-cov"
1880 1891 self."pytest-sugar"
1881 1892 self."pytest-runner"
1882 1893 self."pytest-profiling"
1883 1894 self."pytest-timeout"
1884 1895 self."gprof2dot"
1885 1896 self."mock"
1886 1897 self."cov-core"
1887 1898 self."coverage"
1888 1899 self."webtest"
1889 1900 self."beautifulsoup4"
1890 1901 self."configobj"
1891 1902 ];
1892 1903 doCheck = true;
1893 1904 propagatedBuildInputs = [
1894 1905 self."amqp"
1895 1906 self."babel"
1896 1907 self."beaker"
1897 1908 self."bleach"
1898 1909 self."celery"
1899 1910 self."channelstream"
1900 1911 self."click"
1901 1912 self."colander"
1902 1913 self."configobj"
1903 1914 self."cssselect"
1904 1915 self."cryptography"
1905 1916 self."decorator"
1906 1917 self."deform"
1907 1918 self."docutils"
1908 1919 self."dogpile.cache"
1909 1920 self."dogpile.core"
1910 1921 self."formencode"
1911 1922 self."future"
1912 1923 self."futures"
1913 1924 self."infrae.cache"
1914 1925 self."iso8601"
1915 1926 self."itsdangerous"
1916 1927 self."kombu"
1917 1928 self."lxml"
1918 1929 self."mako"
1919 1930 self."markdown"
1920 1931 self."markupsafe"
1921 1932 self."msgpack-python"
1922 1933 self."pyotp"
1923 1934 self."packaging"
1924 1935 self."pathlib2"
1925 1936 self."paste"
1926 1937 self."pastedeploy"
1927 1938 self."pastescript"
1928 1939 self."peppercorn"
1929 1940 self."premailer"
1930 1941 self."psutil"
1931 1942 self."py-bcrypt"
1932 1943 self."pycurl"
1933 1944 self."pycrypto"
1934 1945 self."pygments"
1935 1946 self."pyparsing"
1936 1947 self."pyramid-debugtoolbar"
1937 1948 self."pyramid-mako"
1938 1949 self."pyramid"
1939 1950 self."pyramid-mailer"
1940 1951 self."python-dateutil"
1941 1952 self."python-ldap"
1942 1953 self."python-memcached"
1943 1954 self."python-pam"
1944 1955 self."python-saml"
1945 1956 self."pytz"
1946 1957 self."tzlocal"
1947 1958 self."pyzmq"
1948 1959 self."py-gfm"
1960 self."regex"
1949 1961 self."redis"
1950 1962 self."repoze.lru"
1951 1963 self."requests"
1952 1964 self."routes"
1953 1965 self."simplejson"
1954 1966 self."six"
1955 1967 self."sqlalchemy"
1956 1968 self."sshpubkeys"
1957 1969 self."subprocess32"
1958 1970 self."supervisor"
1959 1971 self."translationstring"
1960 1972 self."urllib3"
1961 1973 self."urlobject"
1962 1974 self."venusian"
1963 1975 self."weberror"
1964 1976 self."webhelpers2"
1965 1977 self."webob"
1966 1978 self."whoosh"
1967 1979 self."wsgiref"
1968 1980 self."zope.cachedescriptors"
1969 1981 self."zope.deprecation"
1970 1982 self."zope.event"
1971 1983 self."zope.interface"
1972 1984 self."mysql-python"
1973 1985 self."pymysql"
1974 1986 self."pysqlite"
1975 1987 self."psycopg2"
1976 1988 self."nbconvert"
1977 1989 self."nbformat"
1978 1990 self."jupyter-client"
1979 1991 self."jupyter-core"
1980 1992 self."alembic"
1981 1993 self."invoke"
1982 1994 self."bumpversion"
1983 1995 self."gevent"
1984 1996 self."greenlet"
1985 1997 self."gunicorn"
1986 1998 self."waitress"
1987 1999 self."ipdb"
1988 2000 self."ipython"
1989 2001 self."rhodecode-tools"
1990 2002 self."appenlight-client"
1991 2003 self."pytest"
1992 2004 self."py"
1993 2005 self."pytest-cov"
1994 2006 self."pytest-sugar"
1995 2007 self."pytest-runner"
1996 2008 self."pytest-profiling"
1997 2009 self."pytest-timeout"
1998 2010 self."gprof2dot"
1999 2011 self."mock"
2000 2012 self."cov-core"
2001 2013 self."coverage"
2002 2014 self."webtest"
2003 2015 self."beautifulsoup4"
2004 2016 ];
2005 2017 src = ./.;
2006 2018 meta = {
2007 2019 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
2008 2020 };
2009 2021 };
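The "rhodecode-enterprise-ce" entry above (now at 4.22.0, with self."regex" added to its propagated inputs) sits at the top of this generated dependency set. The set itself has the shape of a `self: super:` override function that expects `pkgs` and `fetchurl` to be in scope, so it would typically be applied through the standard nixpkgs Python override mechanism. A minimal sketch under those assumptions; the file name, argument list and wiring are illustrative, not this repository's actual entry point:

    { pkgs ? import <nixpkgs> {} }:

    let
      # Assumed: the generated expression shown in this diff lives in ./python-packages.nix
      # and evaluates to `self: super: { ... }` once given its arguments.
      generatedOverrides = import ./python-packages.nix {
        inherit pkgs;
        inherit (pkgs) fetchurl;
      };

      python = pkgs.python27.override {
        packageOverrides = generatedOverrides;
      };
    in
      # All pinned dependencies, including the new "regex", are propagated from here.
      python.pkgs."rhodecode-enterprise-ce"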
2010 2022 "rhodecode-tools" = super.buildPythonPackage {
2011 2023 name = "rhodecode-tools-1.4.0";
2012 2024 doCheck = false;
2013 2025 propagatedBuildInputs = [
2014 2026 self."click"
2015 2027 self."future"
2016 2028 self."six"
2017 2029 self."mako"
2018 2030 self."markupsafe"
2019 2031 self."requests"
2020 2032 self."urllib3"
2021 2033 self."whoosh"
2022 2034 self."elasticsearch"
2023 2035 self."elasticsearch-dsl"
2024 2036 self."elasticsearch2"
2025 2037 self."elasticsearch1-dsl"
2026 2038 ];
2027 2039 src = fetchurl {
2028 2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2029 2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2030 2042 };
2031 2043 meta = {
2032 2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2033 2045 };
2034 2046 };
2035 2047 "routes" = super.buildPythonPackage {
2036 2048 name = "routes-2.4.1";
2037 2049 doCheck = false;
2038 2050 propagatedBuildInputs = [
2039 2051 self."six"
2040 2052 self."repoze.lru"
2041 2053 ];
2042 2054 src = fetchurl {
2043 2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2044 2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2045 2057 };
2046 2058 meta = {
2047 2059 license = [ pkgs.lib.licenses.mit ];
2048 2060 };
2049 2061 };
2050 2062 "scandir" = super.buildPythonPackage {
2051 2063 name = "scandir-1.10.0";
2052 2064 doCheck = false;
2053 2065 src = fetchurl {
2054 2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2055 2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2056 2068 };
2057 2069 meta = {
2058 2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2059 2071 };
2060 2072 };
2061 2073 "setproctitle" = super.buildPythonPackage {
2062 2074 name = "setproctitle-1.1.10";
2063 2075 doCheck = false;
2064 2076 src = fetchurl {
2065 2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2066 2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2067 2079 };
2068 2080 meta = {
2069 2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2070 2082 };
2071 2083 };
2072 2084 "setuptools" = super.buildPythonPackage {
2073 2085 name = "setuptools-44.1.0";
2074 2086 doCheck = false;
2075 2087 src = fetchurl {
2076 2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2077 2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2078 2090 };
2079 2091 meta = {
2080 2092 license = [ pkgs.lib.licenses.mit ];
2081 2093 };
2082 2094 };
2083 2095 "simplegeneric" = super.buildPythonPackage {
2084 2096 name = "simplegeneric-0.8.1";
2085 2097 doCheck = false;
2086 2098 src = fetchurl {
2087 2099 url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
2088 2100 sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
2089 2101 };
2090 2102 meta = {
2091 2103 license = [ pkgs.lib.licenses.zpl21 ];
2092 2104 };
2093 2105 };
2094 2106 "simplejson" = super.buildPythonPackage {
2095 2107 name = "simplejson-3.16.0";
2096 2108 doCheck = false;
2097 2109 src = fetchurl {
2098 2110 url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
2099 2111 sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
2100 2112 };
2101 2113 meta = {
2102 2114 license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
2103 2115 };
2104 2116 };
2105 2117 "six" = super.buildPythonPackage {
2106 2118 name = "six-1.11.0";
2107 2119 doCheck = false;
2108 2120 src = fetchurl {
2109 2121 url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
2110 2122 sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
2111 2123 };
2112 2124 meta = {
2113 2125 license = [ pkgs.lib.licenses.mit ];
2114 2126 };
2115 2127 };
2116 2128 "sqlalchemy" = super.buildPythonPackage {
2117 2129 name = "sqlalchemy-1.3.15";
2118 2130 doCheck = false;
2119 2131 src = fetchurl {
2120 2132 url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
2121 2133 sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
2122 2134 };
2123 2135 meta = {
2124 2136 license = [ pkgs.lib.licenses.mit ];
2125 2137 };
2126 2138 };
2127 2139 "sshpubkeys" = super.buildPythonPackage {
2128 2140 name = "sshpubkeys-3.1.0";
2129 2141 doCheck = false;
2130 2142 propagatedBuildInputs = [
2131 2143 self."cryptography"
2132 2144 self."ecdsa"
2133 2145 ];
2134 2146 src = fetchurl {
2135 2147 url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
2136 2148 sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
2137 2149 };
2138 2150 meta = {
2139 2151 license = [ pkgs.lib.licenses.bsdOriginal ];
2140 2152 };
2141 2153 };
2142 2154 "subprocess32" = super.buildPythonPackage {
2143 2155 name = "subprocess32-3.5.4";
2144 2156 doCheck = false;
2145 2157 src = fetchurl {
2146 2158 url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
2147 2159 sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
2148 2160 };
2149 2161 meta = {
2150 2162 license = [ pkgs.lib.licenses.psfl ];
2151 2163 };
2152 2164 };
2153 2165 "supervisor" = super.buildPythonPackage {
2154 2166 name = "supervisor-4.1.0";
2155 2167 doCheck = false;
2156 2168 src = fetchurl {
2157 2169 url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
2158 2170 sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
2159 2171 };
2160 2172 meta = {
2161 2173 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2162 2174 };
2163 2175 };
2164 2176 "tempita" = super.buildPythonPackage {
2165 2177 name = "tempita-0.5.2";
2166 2178 doCheck = false;
2167 2179 src = fetchurl {
2168 2180 url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
2169 2181 sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
2170 2182 };
2171 2183 meta = {
2172 2184 license = [ pkgs.lib.licenses.mit ];
2173 2185 };
2174 2186 };
2175 2187 "termcolor" = super.buildPythonPackage {
2176 2188 name = "termcolor-1.1.0";
2177 2189 doCheck = false;
2178 2190 src = fetchurl {
2179 2191 url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
2180 2192 sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
2181 2193 };
2182 2194 meta = {
2183 2195 license = [ pkgs.lib.licenses.mit ];
2184 2196 };
2185 2197 };
2186 2198 "testpath" = super.buildPythonPackage {
2187 2199 name = "testpath-0.4.4";
2188 2200 doCheck = false;
2189 2201 src = fetchurl {
2190 2202 url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
2191 2203 sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
2192 2204 };
2193 2205 meta = {
2194 2206 license = [ ];
2195 2207 };
2196 2208 };
2197 2209 "traitlets" = super.buildPythonPackage {
2198 2210 name = "traitlets-4.3.3";
2199 2211 doCheck = false;
2200 2212 propagatedBuildInputs = [
2201 2213 self."ipython-genutils"
2202 2214 self."six"
2203 2215 self."decorator"
2204 2216 self."enum34"
2205 2217 ];
2206 2218 src = fetchurl {
2207 2219 url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
2208 2220 sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
2209 2221 };
2210 2222 meta = {
2211 2223 license = [ pkgs.lib.licenses.bsdOriginal ];
2212 2224 };
2213 2225 };
2214 2226 "transaction" = super.buildPythonPackage {
2215 2227 name = "transaction-2.4.0";
2216 2228 doCheck = false;
2217 2229 propagatedBuildInputs = [
2218 2230 self."zope.interface"
2219 2231 ];
2220 2232 src = fetchurl {
2221 2233 url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
2222 2234 sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
2223 2235 };
2224 2236 meta = {
2225 2237 license = [ pkgs.lib.licenses.zpl21 ];
2226 2238 };
2227 2239 };
2228 2240 "translationstring" = super.buildPythonPackage {
2229 2241 name = "translationstring-1.3";
2230 2242 doCheck = false;
2231 2243 src = fetchurl {
2232 2244 url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
2233 2245 sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
2234 2246 };
2235 2247 meta = {
2236 2248 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
2237 2249 };
2238 2250 };
2239 2251 "tzlocal" = super.buildPythonPackage {
2240 2252 name = "tzlocal-1.5.1";
2241 2253 doCheck = false;
2242 2254 propagatedBuildInputs = [
2243 2255 self."pytz"
2244 2256 ];
2245 2257 src = fetchurl {
2246 2258 url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
2247 2259 sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
2248 2260 };
2249 2261 meta = {
2250 2262 license = [ pkgs.lib.licenses.mit ];
2251 2263 };
2252 2264 };
2253 2265 "urllib3" = super.buildPythonPackage {
2254 2266 name = "urllib3-1.25.2";
2255 2267 doCheck = false;
2256 2268 src = fetchurl {
2257 2269 url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
2258 2270 sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
2259 2271 };
2260 2272 meta = {
2261 2273 license = [ pkgs.lib.licenses.mit ];
2262 2274 };
2263 2275 };
2264 2276 "urlobject" = super.buildPythonPackage {
2265 2277 name = "urlobject-2.4.3";
2266 2278 doCheck = false;
2267 2279 src = fetchurl {
2268 2280 url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
2269 2281 sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
2270 2282 };
2271 2283 meta = {
2272 2284 license = [ pkgs.lib.licenses.publicDomain ];
2273 2285 };
2274 2286 };
2275 2287 "venusian" = super.buildPythonPackage {
2276 2288 name = "venusian-1.2.0";
2277 2289 doCheck = false;
2278 2290 src = fetchurl {
2279 2291 url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
2280 2292 sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
2281 2293 };
2282 2294 meta = {
2283 2295 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2284 2296 };
2285 2297 };
2286 2298 "vine" = super.buildPythonPackage {
2287 2299 name = "vine-1.3.0";
2288 2300 doCheck = false;
2289 2301 src = fetchurl {
2290 2302 url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
2291 2303 sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
2292 2304 };
2293 2305 meta = {
2294 2306 license = [ pkgs.lib.licenses.bsdOriginal ];
2295 2307 };
2296 2308 };
2297 2309 "waitress" = super.buildPythonPackage {
2298 2310 name = "waitress-1.3.1";
2299 2311 doCheck = false;
2300 2312 src = fetchurl {
2301 2313 url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
2302 2314 sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
2303 2315 };
2304 2316 meta = {
2305 2317 license = [ pkgs.lib.licenses.zpl21 ];
2306 2318 };
2307 2319 };
2308 2320 "wcwidth" = super.buildPythonPackage {
2309 2321 name = "wcwidth-0.1.9";
2310 2322 doCheck = false;
2311 2323 src = fetchurl {
2312 2324 url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
2313 2325 sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
2314 2326 };
2315 2327 meta = {
2316 2328 license = [ pkgs.lib.licenses.mit ];
2317 2329 };
2318 2330 };
2319 2331 "webencodings" = super.buildPythonPackage {
2320 2332 name = "webencodings-0.5.1";
2321 2333 doCheck = false;
2322 2334 src = fetchurl {
2323 2335 url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
2324 2336 sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
2325 2337 };
2326 2338 meta = {
2327 2339 license = [ pkgs.lib.licenses.bsdOriginal ];
2328 2340 };
2329 2341 };
2330 2342 "weberror" = super.buildPythonPackage {
2331 2343 name = "weberror-0.13.1";
2332 2344 doCheck = false;
2333 2345 propagatedBuildInputs = [
2334 2346 self."webob"
2335 2347 self."tempita"
2336 2348 self."pygments"
2337 2349 self."paste"
2338 2350 ];
2339 2351 src = fetchurl {
2340 2352 url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
2341 2353 sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
2342 2354 };
2343 2355 meta = {
2344 2356 license = [ pkgs.lib.licenses.mit ];
2345 2357 };
2346 2358 };
2347 2359 "webhelpers2" = super.buildPythonPackage {
2348 2360 name = "webhelpers2-2.0";
2349 2361 doCheck = false;
2350 2362 propagatedBuildInputs = [
2351 2363 self."markupsafe"
2352 2364 self."six"
2353 2365 ];
2354 2366 src = fetchurl {
2355 2367 url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
2356 2368 sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
2357 2369 };
2358 2370 meta = {
2359 2371 license = [ pkgs.lib.licenses.mit ];
2360 2372 };
2361 2373 };
2362 2374 "webob" = super.buildPythonPackage {
2363 2375 name = "webob-1.8.5";
2364 2376 doCheck = false;
2365 2377 src = fetchurl {
2366 2378 url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
2367 2379 sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
2368 2380 };
2369 2381 meta = {
2370 2382 license = [ pkgs.lib.licenses.mit ];
2371 2383 };
2372 2384 };
2373 2385 "webtest" = super.buildPythonPackage {
2374 2386 name = "webtest-2.0.34";
2375 2387 doCheck = false;
2376 2388 propagatedBuildInputs = [
2377 2389 self."six"
2378 2390 self."webob"
2379 2391 self."waitress"
2380 2392 self."beautifulsoup4"
2381 2393 ];
2382 2394 src = fetchurl {
2383 2395 url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
2384 2396 sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
2385 2397 };
2386 2398 meta = {
2387 2399 license = [ pkgs.lib.licenses.mit ];
2388 2400 };
2389 2401 };
2390 2402 "whoosh" = super.buildPythonPackage {
2391 2403 name = "whoosh-2.7.4";
2392 2404 doCheck = false;
2393 2405 src = fetchurl {
2394 2406 url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
2395 2407 sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
2396 2408 };
2397 2409 meta = {
2398 2410 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
2399 2411 };
2400 2412 };
2401 2413 "ws4py" = super.buildPythonPackage {
2402 2414 name = "ws4py-0.5.1";
2403 2415 doCheck = false;
2404 2416 src = fetchurl {
2405 2417 url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
2406 2418 sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
2407 2419 };
2408 2420 meta = {
2409 2421 license = [ pkgs.lib.licenses.bsdOriginal ];
2410 2422 };
2411 2423 };
2412 2424 "wsgiref" = super.buildPythonPackage {
2413 2425 name = "wsgiref-0.1.2";
2414 2426 doCheck = false;
2415 2427 src = fetchurl {
2416 2428 url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
2417 2429 sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
2418 2430 };
2419 2431 meta = {
2420 2432 license = [ { fullName = "PSF or ZPL"; } ];
2421 2433 };
2422 2434 };
2423 2435 "zipp" = super.buildPythonPackage {
2424 2436 name = "zipp-1.2.0";
2425 2437 doCheck = false;
2426 2438 propagatedBuildInputs = [
2427 2439 self."contextlib2"
2428 2440 ];
2429 2441 src = fetchurl {
2430 2442 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2431 2443 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2432 2444 };
2433 2445 meta = {
2434 2446 license = [ pkgs.lib.licenses.mit ];
2435 2447 };
2436 2448 };
2437 2449 "zope.cachedescriptors" = super.buildPythonPackage {
2438 2450 name = "zope.cachedescriptors-4.3.1";
2439 2451 doCheck = false;
2440 2452 propagatedBuildInputs = [
2441 2453 self."setuptools"
2442 2454 ];
2443 2455 src = fetchurl {
2444 2456 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2445 2457 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2446 2458 };
2447 2459 meta = {
2448 2460 license = [ pkgs.lib.licenses.zpl21 ];
2449 2461 };
2450 2462 };
2451 2463 "zope.deprecation" = super.buildPythonPackage {
2452 2464 name = "zope.deprecation-4.4.0";
2453 2465 doCheck = false;
2454 2466 propagatedBuildInputs = [
2455 2467 self."setuptools"
2456 2468 ];
2457 2469 src = fetchurl {
2458 2470 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2459 2471 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2460 2472 };
2461 2473 meta = {
2462 2474 license = [ pkgs.lib.licenses.zpl21 ];
2463 2475 };
2464 2476 };
2465 2477 "zope.event" = super.buildPythonPackage {
2466 2478 name = "zope.event-4.4";
2467 2479 doCheck = false;
2468 2480 propagatedBuildInputs = [
2469 2481 self."setuptools"
2470 2482 ];
2471 2483 src = fetchurl {
2472 2484 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2473 2485 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2474 2486 };
2475 2487 meta = {
2476 2488 license = [ pkgs.lib.licenses.zpl21 ];
2477 2489 };
2478 2490 };
2479 2491 "zope.interface" = super.buildPythonPackage {
2480 2492 name = "zope.interface-4.6.0";
2481 2493 doCheck = false;
2482 2494 propagatedBuildInputs = [
2483 2495 self."setuptools"
2484 2496 ];
2485 2497 src = fetchurl {
2486 2498 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2487 2499 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2488 2500 };
2489 2501 meta = {
2490 2502 license = [ pkgs.lib.licenses.zpl21 ];
2491 2503 };
2492 2504 };
2493 2505
2494 2506 ### Test requirements
2495 2507
2496 2508
2497 2509 }
@@ -1,123 +1,124 b''
1 1 ## dependencies
2 2
3 3 amqp==2.5.2
4 4 babel==1.3
5 5 beaker==1.9.1
6 6 bleach==3.1.3
7 7 celery==4.3.0
8 8 channelstream==0.6.14
9 9 click==7.0
10 10 colander==1.7.0
11 11 # our custom configobj
12 12 https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626#egg=configobj==5.0.6
13 13 cssselect==1.0.3
14 14 cryptography==2.6.1
15 15 decorator==4.1.2
16 16 deform==2.0.8
17 17 docutils==0.16.0
18 18 dogpile.cache==0.9.0
19 19 dogpile.core==0.4.1
20 20 formencode==1.2.4
21 21 future==0.14.3
22 22 futures==3.0.2
23 23 infrae.cache==1.0.1
24 24 iso8601==0.1.12
25 25 itsdangerous==1.1.0
26 26 kombu==4.6.6
27 27 lxml==4.2.5
28 28 mako==1.1.0
29 29 markdown==2.6.11
30 30 markupsafe==1.1.1
31 31 msgpack-python==0.5.6
32 32 pyotp==2.3.0
33 33 packaging==20.3
34 34 pathlib2==2.3.5
35 35 paste==3.4.0
36 36 pastedeploy==2.1.0
37 37 pastescript==3.2.0
38 38 peppercorn==0.6
39 39 premailer==3.6.1
40 40 psutil==5.7.0
41 41 py-bcrypt==0.4
42 42 pycurl==7.43.0.3
43 43 pycrypto==2.6.1
44 44 pygments==2.4.2
45 45 pyparsing==2.4.7
46 46 pyramid-debugtoolbar==4.6.1
47 47 pyramid-mako==1.1.0
48 48 pyramid==1.10.4
49 49 pyramid_mailer==0.15.1
50 50 python-dateutil==2.8.1
51 51 python-ldap==3.2.0
52 52 python-memcached==1.59
53 53 python-pam==1.8.4
54 54 python-saml==2.4.2
55 55 pytz==2019.3
56 56 tzlocal==1.5.1
57 57 pyzmq==14.6.0
58 58 py-gfm==0.1.4
59 regex==2020.9.27
59 60 redis==3.4.1
60 61 repoze.lru==0.7
61 62 requests==2.22.0
62 63 routes==2.4.1
63 64 simplejson==3.16.0
64 65 six==1.11.0
65 66 sqlalchemy==1.3.15
66 67 sshpubkeys==3.1.0
67 68 subprocess32==3.5.4
68 69 supervisor==4.1.0
69 70 translationstring==1.3
70 71 urllib3==1.25.2
71 72 urlobject==2.4.3
72 73 venusian==1.2.0
73 74 weberror==0.13.1
74 75 webhelpers2==2.0
75 76 webob==1.8.5
76 77 whoosh==2.7.4
77 78 wsgiref==0.1.2
78 79 zope.cachedescriptors==4.3.1
79 80 zope.deprecation==4.4.0
80 81 zope.event==4.4.0
81 82 zope.interface==4.6.0
82 83
83 84 # DB drivers
84 85 mysql-python==1.2.5
85 86 pymysql==0.8.1
86 87 pysqlite==2.8.3
87 88 psycopg2==2.8.4
88 89
89 90 # IPYTHON RENDERING
90 91 # entrypoints backport, pypi version doesn't support egg installs
91 92 https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d#egg=entrypoints==0.2.2.rhodecode-upstream1
92 93 nbconvert==5.3.1
93 94 nbformat==4.4.0
94 95 jupyter-client==5.0.0
95 96 jupyter-core==4.5.0
96 97
97 98 ## cli tools
98 99 alembic==1.4.2
99 100 invoke==0.13.0
100 101 bumpversion==0.5.3
101 102
102 103 ## http servers
103 104 gevent==1.5.0
104 105 greenlet==0.4.15
105 106 gunicorn==19.9.0
106 107 waitress==1.3.1
107 108
108 109 ## debug
109 110 ipdb==0.13.2
110 111 ipython==5.1.0
111 112
112 113 ## rhodecode-tools, special case: to test a local version, use file://PATH.tar.gz#egg=rhodecode-tools==X.Y.Z
113 114 https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a#egg=rhodecode-tools==1.4.0
114 115
115 116
116 117 ## appenlight
117 118 appenlight-client==0.6.26
118 119
119 120 ## test related requirements
120 121 -r requirements_test.txt
121 122
122 123 ## uncomment to add the debug libraries
123 124 #-r requirements_debug.txt
@@ -1,60 +1,60 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22 from collections import OrderedDict
23 23
24 24 import sys
25 25 import platform
26 26
27 27 VERSION = tuple(open(os.path.join(
28 28 os.path.dirname(__file__), 'VERSION')).read().split('.'))
29 29
30 30 BACKENDS = OrderedDict()
31 31
32 32 BACKENDS['hg'] = 'Mercurial repository'
33 33 BACKENDS['git'] = 'Git repository'
34 34 BACKENDS['svn'] = 'Subversion repository'
35 35
36 36
37 37 CELERY_ENABLED = False
38 38 CELERY_EAGER = False
39 39
40 40 # link to config for pyramid
41 41 CONFIG = {}
42 42
43 43 # Populated with the settings dictionary from application init in
44 44 # rhodecode.conf.environment.load_pyramid_environment
45 45 PYRAMID_SETTINGS = {}
46 46
47 47 # Linked module for extensions
48 48 EXTENSIONS = {}
49 49
50 50 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
51 __dbversion__ = 109 # defines current db version for migrations
51 __dbversion__ = 110 # defines current db version for migrations
52 52 __platform__ = platform.system()
53 53 __license__ = 'AGPLv3, and Commercial License'
54 54 __author__ = 'RhodeCode GmbH'
55 55 __url__ = 'https://code.rhodecode.com'
56 56
57 57 is_windows = __platform__ in ['Windows']
58 58 is_unix = not is_windows
59 59 is_test = False
60 60 disable_error_handler = False
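
The module above derives its version metadata from a plain-text VERSION file that sits next to it. A minimal standalone sketch of that parsing follows; it is only an illustration, and the '4.22.0' string is an assumed stand-in for the real file contents:

    # Hedged sketch of the VERSION handling above; '4.22.0' is an illustrative assumption.
    raw = '4.22.0'                       # stands in for open(<package_dir>/'VERSION').read()
    VERSION = tuple(raw.split('.'))      # -> ('4', '22', '0')
    __version__ = '.'.join(str(each) for each in VERSION[:3])
    assert __version__ == '4.22.0'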
@@ -1,368 +1,368 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import pytest
22 22
23 23 from rhodecode.model.db import User
24 24 from rhodecode.model.pull_request import PullRequestModel
25 25 from rhodecode.model.repo import RepoModel
26 26 from rhodecode.model.user import UserModel
27 27 from rhodecode.tests import TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_LOGIN
28 28 from rhodecode.api.tests.utils import build_data, api_call, assert_error
29 29
30 30
31 31 @pytest.mark.usefixtures("testuser_api", "app")
32 32 class TestCreatePullRequestApi(object):
33 33 finalizers = []
34 34
35 35 def teardown_method(self, method):
36 36 if self.finalizers:
37 37 for finalizer in self.finalizers:
38 38 finalizer()
39 39 self.finalizers = []
40 40
41 41 def test_create_with_wrong_data(self):
42 42 required_data = {
43 43 'source_repo': 'tests/source_repo',
44 44 'target_repo': 'tests/target_repo',
45 45 'source_ref': 'branch:default:initial',
46 46 'target_ref': 'branch:default:new-feature',
47 47 }
48 48 for key in required_data:
49 49 data = required_data.copy()
50 50 data.pop(key)
51 51 id_, params = build_data(
52 52 self.apikey, 'create_pull_request', **data)
53 53 response = api_call(self.app, params)
54 54
55 55 expected = 'Missing non optional `{}` arg in JSON DATA'.format(key)
56 56 assert_error(id_, expected, given=response.body)
57 57
58 58 @pytest.mark.backends("git", "hg")
59 59 @pytest.mark.parametrize('source_ref', [
60 60 'bookmarg:default:initial'
61 61 ])
62 62 def test_create_with_wrong_refs_data(self, backend, source_ref):
63 63
64 64 data = self._prepare_data(backend)
65 65 data['source_ref'] = source_ref
66 66
67 67 id_, params = build_data(
68 68 self.apikey_regular, 'create_pull_request', **data)
69 69
70 70 response = api_call(self.app, params)
71 71
72 72 expected = "Ref `{}` type is not allowed. " \
73 73 "Only:['bookmark', 'book', 'tag', 'branch'] " \
74 74 "are possible.".format(source_ref)
75 75 assert_error(id_, expected, given=response.body)
76 76
77 77 @pytest.mark.backends("git", "hg")
78 78 def test_create_with_correct_data(self, backend):
79 79 data = self._prepare_data(backend)
80 80 RepoModel().revoke_user_permission(
81 81 self.source.repo_name, User.DEFAULT_USER)
82 82 id_, params = build_data(
83 83 self.apikey_regular, 'create_pull_request', **data)
84 84 response = api_call(self.app, params)
85 85 expected_message = "Created new pull request `{title}`".format(
86 86 title=data['title'])
87 87 result = response.json
88 88 assert result['error'] is None
89 89 assert result['result']['msg'] == expected_message
90 90 pull_request_id = result['result']['pull_request_id']
91 91 pull_request = PullRequestModel().get(pull_request_id)
92 92 assert pull_request.title == data['title']
93 93 assert pull_request.description == data['description']
94 94 assert pull_request.source_ref == data['source_ref']
95 95 assert pull_request.target_ref == data['target_ref']
96 96 assert pull_request.source_repo.repo_name == data['source_repo']
97 97 assert pull_request.target_repo.repo_name == data['target_repo']
98 98 assert pull_request.revisions == [self.commit_ids['change']]
99 99 assert len(pull_request.reviewers) == 1
100 100
101 101 @pytest.mark.backends("git", "hg")
102 102 def test_create_with_empty_description(self, backend):
103 103 data = self._prepare_data(backend)
104 104 data.pop('description')
105 105 id_, params = build_data(
106 106 self.apikey_regular, 'create_pull_request', **data)
107 107 response = api_call(self.app, params)
108 108 expected_message = "Created new pull request `{title}`".format(
109 109 title=data['title'])
110 110 result = response.json
111 111 assert result['error'] is None
112 112 assert result['result']['msg'] == expected_message
113 113 pull_request_id = result['result']['pull_request_id']
114 114 pull_request = PullRequestModel().get(pull_request_id)
115 115 assert pull_request.description == ''
116 116
117 117 @pytest.mark.backends("git", "hg")
118 118 def test_create_with_empty_title(self, backend):
119 119 data = self._prepare_data(backend)
120 120 data.pop('title')
121 121 id_, params = build_data(
122 122 self.apikey_regular, 'create_pull_request', **data)
123 123 response = api_call(self.app, params)
124 124 result = response.json
125 125 pull_request_id = result['result']['pull_request_id']
126 126 pull_request = PullRequestModel().get(pull_request_id)
127 127 data['ref'] = backend.default_branch_name
128 128 title = '{source_repo}#{ref} to {target_repo}'.format(**data)
129 129 assert pull_request.title == title
130 130
131 131 @pytest.mark.backends("git", "hg")
132 132 def test_create_with_reviewers_specified_by_names(
133 133 self, backend, no_notifications):
134 134 data = self._prepare_data(backend)
135 135 reviewers = [
136 136 {'username': TEST_USER_REGULAR_LOGIN,
137 137 'reasons': ['{} added manually'.format(TEST_USER_REGULAR_LOGIN)]},
138 138 {'username': TEST_USER_ADMIN_LOGIN,
139 139 'reasons': ['{} added manually'.format(TEST_USER_ADMIN_LOGIN)],
140 140 'mandatory': True},
141 141 ]
142 142 data['reviewers'] = reviewers
143 143
144 144 id_, params = build_data(
145 145 self.apikey_regular, 'create_pull_request', **data)
146 146 response = api_call(self.app, params)
147 147
148 148 expected_message = "Created new pull request `{title}`".format(
149 149 title=data['title'])
150 150 result = response.json
151 151 assert result['error'] is None
152 152 assert result['result']['msg'] == expected_message
153 153 pull_request_id = result['result']['pull_request_id']
154 154 pull_request = PullRequestModel().get(pull_request_id)
155 155
156 156 actual_reviewers = []
157 157 for rev in pull_request.reviewers:
158 158 entry = {
159 159 'username': rev.user.username,
160 160 'reasons': rev.reasons,
161 161 }
162 162 if rev.mandatory:
163 163 entry['mandatory'] = rev.mandatory
164 164 actual_reviewers.append(entry)
165 165
166 166 owner_username = pull_request.target_repo.user.username
167 167 for spec_reviewer in reviewers[::]:
168 168 # default reviewer will be added who is an owner of the repo
169 169 # this get's overridden by a add owner to reviewers rule
170 170 if spec_reviewer['username'] == owner_username:
171 171 spec_reviewer['reasons'] = [u'Default reviewer', u'Repository owner']
172 172 # since owner is more important, we don't inherit mandatory flag
173 173 del spec_reviewer['mandatory']
174 174
175 175 assert sorted(actual_reviewers, key=lambda e: e['username']) \
176 176 == sorted(reviewers, key=lambda e: e['username'])
177 177
178 178 @pytest.mark.backends("git", "hg")
179 179 def test_create_with_reviewers_specified_by_ids(
180 180 self, backend, no_notifications):
181 181 data = self._prepare_data(backend)
182 182 reviewers = [
183 183 {'username': UserModel().get_by_username(
184 184 TEST_USER_REGULAR_LOGIN).user_id,
185 185 'reasons': ['added manually']},
186 186 {'username': UserModel().get_by_username(
187 187 TEST_USER_ADMIN_LOGIN).user_id,
188 188 'reasons': ['added manually']},
189 189 ]
190 190
191 191 data['reviewers'] = reviewers
192 192 id_, params = build_data(
193 193 self.apikey_regular, 'create_pull_request', **data)
194 194 response = api_call(self.app, params)
195 195
196 196 expected_message = "Created new pull request `{title}`".format(
197 197 title=data['title'])
198 198 result = response.json
199 199 assert result['error'] is None
200 200 assert result['result']['msg'] == expected_message
201 201 pull_request_id = result['result']['pull_request_id']
202 202 pull_request = PullRequestModel().get(pull_request_id)
203 203
204 204 actual_reviewers = []
205 205 for rev in pull_request.reviewers:
206 206 entry = {
207 207 'username': rev.user.user_id,
208 208 'reasons': rev.reasons,
209 209 }
210 210 if rev.mandatory:
211 211 entry['mandatory'] = rev.mandatory
212 212 actual_reviewers.append(entry)
213 213
214 214 owner_user_id = pull_request.target_repo.user.user_id
215 215 for spec_reviewer in reviewers[::]:
216 216 # a default reviewer (the owner of the repo) will be added;
217 217 # this gets overridden by an "add owner to reviewers" rule
218 218 if spec_reviewer['username'] == owner_user_id:
219 219 spec_reviewer['reasons'] = [u'Default reviewer', u'Repository owner']
220 220
221 221 assert sorted(actual_reviewers, key=lambda e: e['username']) \
222 222 == sorted(reviewers, key=lambda e: e['username'])
223 223
224 224 @pytest.mark.backends("git", "hg")
225 225 def test_create_fails_when_the_reviewer_is_not_found(self, backend):
226 226 data = self._prepare_data(backend)
227 227 data['reviewers'] = [{'username': 'somebody'}]
228 228 id_, params = build_data(
229 229 self.apikey_regular, 'create_pull_request', **data)
230 230 response = api_call(self.app, params)
231 231 expected_message = 'user `somebody` does not exist'
232 232 assert_error(id_, expected_message, given=response.body)
233 233
234 234 @pytest.mark.backends("git", "hg")
235 235 def test_cannot_create_with_reviewers_in_wrong_format(self, backend):
236 236 data = self._prepare_data(backend)
237 237 reviewers = ','.join([TEST_USER_REGULAR_LOGIN, TEST_USER_ADMIN_LOGIN])
238 238 data['reviewers'] = reviewers
239 239 id_, params = build_data(
240 240 self.apikey_regular, 'create_pull_request', **data)
241 241 response = api_call(self.app, params)
242 242 expected_message = {u'': '"test_regular,test_admin" is not iterable'}
243 243 assert_error(id_, expected_message, given=response.body)
244 244
245 245 @pytest.mark.backends("git", "hg")
246 246 def test_create_with_no_commit_hashes(self, backend):
247 247 data = self._prepare_data(backend)
248 248 expected_source_ref = data['source_ref']
249 249 expected_target_ref = data['target_ref']
250 250 data['source_ref'] = 'branch:{}'.format(backend.default_branch_name)
251 251 data['target_ref'] = 'branch:{}'.format(backend.default_branch_name)
252 252 id_, params = build_data(
253 253 self.apikey_regular, 'create_pull_request', **data)
254 254 response = api_call(self.app, params)
255 255 expected_message = "Created new pull request `{title}`".format(
256 256 title=data['title'])
257 257 result = response.json
258 258 assert result['result']['msg'] == expected_message
259 259 pull_request_id = result['result']['pull_request_id']
260 260 pull_request = PullRequestModel().get(pull_request_id)
261 261 assert pull_request.source_ref == expected_source_ref
262 262 assert pull_request.target_ref == expected_target_ref
263 263
264 264 @pytest.mark.backends("git", "hg")
265 265 @pytest.mark.parametrize("data_key", ["source_repo", "target_repo"])
266 266 def test_create_fails_with_wrong_repo(self, backend, data_key):
267 267 repo_name = 'fake-repo'
268 268 data = self._prepare_data(backend)
269 269 data[data_key] = repo_name
270 270 id_, params = build_data(
271 271 self.apikey_regular, 'create_pull_request', **data)
272 272 response = api_call(self.app, params)
273 273 expected_message = 'repository `{}` does not exist'.format(repo_name)
274 274 assert_error(id_, expected_message, given=response.body)
275 275
276 276 @pytest.mark.backends("git", "hg")
277 277 @pytest.mark.parametrize("data_key", ["source_ref", "target_ref"])
278 278 def test_create_fails_with_non_existing_branch(self, backend, data_key):
279 279 branch_name = 'test-branch'
280 280 data = self._prepare_data(backend)
281 281 data[data_key] = "branch:{}".format(branch_name)
282 282 id_, params = build_data(
283 283 self.apikey_regular, 'create_pull_request', **data)
284 284 response = api_call(self.app, params)
285 285 expected_message = 'The specified value:{type}:`{name}` ' \
286 286 'does not exist, or is not allowed.'.format(type='branch',
287 287 name=branch_name)
288 288 assert_error(id_, expected_message, given=response.body)
289 289
290 290 @pytest.mark.backends("git", "hg")
291 291 @pytest.mark.parametrize("data_key", ["source_ref", "target_ref"])
292 292 def test_create_fails_with_ref_in_a_wrong_format(self, backend, data_key):
293 293 data = self._prepare_data(backend)
294 294 ref = 'stange-ref'
295 295 data[data_key] = ref
296 296 id_, params = build_data(
297 297 self.apikey_regular, 'create_pull_request', **data)
298 298 response = api_call(self.app, params)
299 299 expected_message = (
300 300 'Ref `{ref}` given in a wrong format. Please check the API'
301 301 ' documentation for more details'.format(ref=ref))
302 302 assert_error(id_, expected_message, given=response.body)
303 303
304 304 @pytest.mark.backends("git", "hg")
305 305 @pytest.mark.parametrize("data_key", ["source_ref", "target_ref"])
306 306 def test_create_fails_with_non_existing_ref(self, backend, data_key):
307 307 commit_id = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa10'
308 308 ref = self._get_full_ref(backend, commit_id)
309 309 data = self._prepare_data(backend)
310 310 data[data_key] = ref
311 311 id_, params = build_data(
312 312 self.apikey_regular, 'create_pull_request', **data)
313 313 response = api_call(self.app, params)
314 314 expected_message = 'Ref `{}` does not exist'.format(ref)
315 315 assert_error(id_, expected_message, given=response.body)
316 316
317 317 @pytest.mark.backends("git", "hg")
318 318 def test_create_fails_when_no_revisions(self, backend):
319 319 data = self._prepare_data(backend, source_head='initial')
320 320 id_, params = build_data(
321 321 self.apikey_regular, 'create_pull_request', **data)
322 322 response = api_call(self.app, params)
323 expected_message = 'no commits found'
323 expected_message = 'no commits found for merge between specified references'
324 324 assert_error(id_, expected_message, given=response.body)
325 325
326 326 @pytest.mark.backends("git", "hg")
327 327 def test_create_fails_when_no_permissions(self, backend):
328 328 data = self._prepare_data(backend)
329 329 RepoModel().revoke_user_permission(
330 330 self.source.repo_name, self.test_user)
331 331 RepoModel().revoke_user_permission(
332 332 self.source.repo_name, User.DEFAULT_USER)
333 333
334 334 id_, params = build_data(
335 335 self.apikey_regular, 'create_pull_request', **data)
336 336 response = api_call(self.app, params)
337 337 expected_message = 'repository `{}` does not exist'.format(
338 338 self.source.repo_name)
339 339 assert_error(id_, expected_message, given=response.body)
340 340
341 341 def _prepare_data(
342 342 self, backend, source_head='change', target_head='initial'):
343 343 commits = [
344 344 {'message': 'initial'},
345 345 {'message': 'change'},
346 346 {'message': 'new-feature', 'parents': ['initial']},
347 347 ]
348 348 self.commit_ids = backend.create_master_repo(commits)
349 349 self.source = backend.create_repo(heads=[source_head])
350 350 self.target = backend.create_repo(heads=[target_head])
351 351
352 352 data = {
353 353 'source_repo': self.source.repo_name,
354 354 'target_repo': self.target.repo_name,
355 355 'source_ref': self._get_full_ref(
356 356 backend, self.commit_ids[source_head]),
357 357 'target_ref': self._get_full_ref(
358 358 backend, self.commit_ids[target_head]),
359 359 'title': 'Test PR 1',
360 360 'description': 'Test'
361 361 }
362 362 RepoModel().grant_user_permission(
363 363 self.source.repo_name, self.TEST_USER_LOGIN, 'repository.read')
364 364 return data
365 365
366 366 def _get_full_ref(self, backend, commit_id):
367 367 return 'branch:{branch}:{commit_id}'.format(
368 368 branch=backend.default_branch_name, commit_id=commit_id)
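
Taken together, `required_data` and `_prepare_data` above outline the payload that a `create_pull_request` API call expects. A condensed, hedged sketch using the same test helpers follows; the `app` object and `auth_token` are placeholder assumptions (in the tests they come from fixtures), and the repository names, refs and reviewer entry simply mirror the test data:

    # Hedged sketch only; `app` and `auth_token` are placeholder assumptions.
    from rhodecode.api.tests.utils import build_data, api_call

    def create_example_pull_request(app, auth_token):
        data = {
            'source_repo': 'tests/source_repo',
            'target_repo': 'tests/target_repo',
            'source_ref': 'branch:default:initial',
            'target_ref': 'branch:default:new-feature',
            'title': 'Test PR 1',
            'description': 'Test',
            # optional, same shape as in test_create_with_reviewers_specified_by_names
            'reviewers': [{'username': 'test_regular', 'reasons': ['added manually']}],
        }
        id_, params = build_data(auth_token, 'create_pull_request', **data)
        return api_call(app, params)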
@@ -1,80 +1,82 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 22 import pytest
23 23
24 24 from rhodecode.model.meta import Session
25 25 from rhodecode.model.pull_request import PullRequestModel
26 26 from rhodecode.api.tests.utils import (
27 27 build_data, api_call, assert_error)
28 28
29 29
30 30 @pytest.mark.usefixtures("testuser_api", "app")
31 31 class TestGetPullRequest(object):
32
32 33 @pytest.mark.backends("git", "hg")
33 34 def test_api_get_pull_requests(self, pr_util):
34 35 pull_request = pr_util.create_pull_request()
35 36 pull_request_2 = PullRequestModel().create(
36 37 created_by=pull_request.author,
37 38 source_repo=pull_request.source_repo,
38 39 source_ref=pull_request.source_ref,
39 40 target_repo=pull_request.target_repo,
40 41 target_ref=pull_request.target_ref,
41 42 revisions=pull_request.revisions,
42 43 reviewers=(),
44 observers=(),
43 45 title=pull_request.title,
44 46 description=pull_request.description,
45 47 )
46 48 Session().commit()
47 49 id_, params = build_data(
48 50 self.apikey, 'get_pull_requests',
49 51 repoid=pull_request.target_repo.repo_name)
50 52 response = api_call(self.app, params)
51 53 assert response.status == '200 OK'
52 54 assert len(response.json['result']) == 2
53 55
54 56 PullRequestModel().close_pull_request(
55 57 pull_request_2, pull_request_2.author)
56 58 Session().commit()
57 59
58 60 id_, params = build_data(
59 61 self.apikey, 'get_pull_requests',
60 62 repoid=pull_request.target_repo.repo_name,
61 63 status='new')
62 64 response = api_call(self.app, params)
63 65 assert response.status == '200 OK'
64 66 assert len(response.json['result']) == 1
65 67
66 68 id_, params = build_data(
67 69 self.apikey, 'get_pull_requests',
68 70 repoid=pull_request.target_repo.repo_name,
69 71 status='closed')
70 72 response = api_call(self.app, params)
71 73 assert response.status == '200 OK'
72 74 assert len(response.json['result']) == 1
73 75
74 76 @pytest.mark.backends("git", "hg")
75 77 def test_api_get_pull_requests_repo_error(self):
76 78 id_, params = build_data(self.apikey, 'get_pull_requests', repoid=666)
77 79 response = api_call(self.app, params)
78 80
79 81 expected = 'repository `666` does not exist'
80 82 assert_error(id_, expected, given=response.body)
@@ -1,212 +1,215 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import pytest
22 22
23 23 from rhodecode.lib.vcs.nodes import FileNode
24 24 from rhodecode.model.db import User
25 25 from rhodecode.model.pull_request import PullRequestModel
26 26 from rhodecode.tests import TEST_USER_ADMIN_LOGIN
27 27 from rhodecode.api.tests.utils import (
28 28 build_data, api_call, assert_ok, assert_error)
29 29
30 30
31 31 @pytest.mark.usefixtures("testuser_api", "app")
32 32 class TestUpdatePullRequest(object):
33 33
34 34 @pytest.mark.backends("git", "hg")
35 35 def test_api_update_pull_request_title_or_description(
36 36 self, pr_util, no_notifications):
37 37 pull_request = pr_util.create_pull_request()
38 38
39 39 id_, params = build_data(
40 40 self.apikey, 'update_pull_request',
41 41 repoid=pull_request.target_repo.repo_name,
42 42 pullrequestid=pull_request.pull_request_id,
43 43 title='New TITLE OF A PR',
44 44 description='New DESC OF A PR',
45 45 )
46 46 response = api_call(self.app, params)
47 47
48 48 expected = {
49 49 "msg": "Updated pull request `{}`".format(
50 50 pull_request.pull_request_id),
51 51 "pull_request": response.json['result']['pull_request'],
52 52 "updated_commits": {"added": [], "common": [], "removed": []},
53 53 "updated_reviewers": {"added": [], "removed": []},
54 "updated_observers": {"added": [], "removed": []},
54 55 }
55 56
56 57 response_json = response.json['result']
57 58 assert response_json == expected
58 59 pr = response_json['pull_request']
59 60 assert pr['title'] == 'New TITLE OF A PR'
60 61 assert pr['description'] == 'New DESC OF A PR'
61 62
62 63 @pytest.mark.backends("git", "hg")
63 64 def test_api_try_update_closed_pull_request(
64 65 self, pr_util, no_notifications):
65 66 pull_request = pr_util.create_pull_request()
66 67 PullRequestModel().close_pull_request(
67 68 pull_request, TEST_USER_ADMIN_LOGIN)
68 69
69 70 id_, params = build_data(
70 71 self.apikey, 'update_pull_request',
71 72 repoid=pull_request.target_repo.repo_name,
72 73 pullrequestid=pull_request.pull_request_id)
73 74 response = api_call(self.app, params)
74 75
75 76 expected = 'pull request `{}` update failed, pull request ' \
76 77 'is closed'.format(pull_request.pull_request_id)
77 78
78 79 assert_error(id_, expected, response.body)
79 80
80 81 @pytest.mark.backends("git", "hg")
81 82 def test_api_update_update_commits(self, pr_util, no_notifications):
82 83 commits = [
83 84 {'message': 'a'},
84 85 {'message': 'b', 'added': [FileNode('file_b', 'test_content\n')]},
85 86 {'message': 'c', 'added': [FileNode('file_c', 'test_content\n')]},
86 87 ]
87 88 pull_request = pr_util.create_pull_request(
88 89 commits=commits, target_head='a', source_head='b', revisions=['b'])
89 90 pr_util.update_source_repository(head='c')
90 91 repo = pull_request.source_repo.scm_instance()
91 92 commits = [x for x in repo.get_commits()]
92 93
93 94 added_commit_id = commits[-1].raw_id # c commit
94 95 common_commit_id = commits[1].raw_id # b commit is common ancestor
95 96 total_commits = [added_commit_id, common_commit_id]
96 97
97 98 id_, params = build_data(
98 99 self.apikey, 'update_pull_request',
99 100 repoid=pull_request.target_repo.repo_name,
100 101 pullrequestid=pull_request.pull_request_id,
101 102 update_commits=True
102 103 )
103 104 response = api_call(self.app, params)
104 105
105 106 expected = {
106 107 "msg": "Updated pull request `{}`".format(
107 108 pull_request.pull_request_id),
108 109 "pull_request": response.json['result']['pull_request'],
109 110 "updated_commits": {"added": [added_commit_id],
110 111 "common": [common_commit_id],
111 112 "total": total_commits,
112 113 "removed": []},
113 114 "updated_reviewers": {"added": [], "removed": []},
115 "updated_observers": {"added": [], "removed": []},
114 116 }
115 117
116 118 assert_ok(id_, expected, response.body)
117 119
118 120 @pytest.mark.backends("git", "hg")
119 121 def test_api_update_change_reviewers(
120 122 self, user_util, pr_util, no_notifications):
121 123 a = user_util.create_user()
122 124 b = user_util.create_user()
123 125 c = user_util.create_user()
124 126 new_reviewers = [
125 127 {'username': b.username, 'reasons': ['updated via API'],
126 128 'mandatory': False},
127 129 {'username': c.username, 'reasons': ['updated via API'],
128 130 'mandatory': False},
129 131 ]
130 132
131 133 added = [b.username, c.username]
132 134 removed = [a.username]
133 135
134 136 pull_request = pr_util.create_pull_request(
135 reviewers=[(a.username, ['added via API'], False, [])])
137 reviewers=[(a.username, ['added via API'], False, 'reviewer', [])])
136 138
137 139 id_, params = build_data(
138 140 self.apikey, 'update_pull_request',
139 141 repoid=pull_request.target_repo.repo_name,
140 142 pullrequestid=pull_request.pull_request_id,
141 143 reviewers=new_reviewers)
142 144 response = api_call(self.app, params)
143 145 expected = {
144 146 "msg": "Updated pull request `{}`".format(
145 147 pull_request.pull_request_id),
146 148 "pull_request": response.json['result']['pull_request'],
147 149 "updated_commits": {"added": [], "common": [], "removed": []},
148 150 "updated_reviewers": {"added": added, "removed": removed},
151 "updated_observers": {"added": [], "removed": []},
149 152 }
150 153
151 154 assert_ok(id_, expected, response.body)
152 155
153 156 @pytest.mark.backends("git", "hg")
154 157 def test_api_update_bad_user_in_reviewers(self, pr_util):
155 158 pull_request = pr_util.create_pull_request()
156 159
157 160 id_, params = build_data(
158 161 self.apikey, 'update_pull_request',
159 162 repoid=pull_request.target_repo.repo_name,
160 163 pullrequestid=pull_request.pull_request_id,
161 164 reviewers=[{'username': 'bad_name'}])
162 165 response = api_call(self.app, params)
163 166
164 167 expected = 'user `bad_name` does not exist'
165 168
166 169 assert_error(id_, expected, response.body)
167 170
168 171 @pytest.mark.backends("git", "hg")
169 172 def test_api_update_repo_error(self, pr_util):
170 173 pull_request = pr_util.create_pull_request()
171 174 id_, params = build_data(
172 175 self.apikey, 'update_pull_request',
173 176 repoid='fake',
174 177 pullrequestid=pull_request.pull_request_id,
175 178 reviewers=[{'username': 'bad_name'}])
176 179 response = api_call(self.app, params)
177 180
178 181 expected = 'repository `fake` does not exist'
179 182
180 183 response_json = response.json['error']
181 184 assert response_json == expected
182 185
183 186 @pytest.mark.backends("git", "hg")
184 187 def test_api_update_pull_request_error(self, pr_util):
185 188 pull_request = pr_util.create_pull_request()
186 189
187 190 id_, params = build_data(
188 191 self.apikey, 'update_pull_request',
189 192 repoid=pull_request.target_repo.repo_name,
190 193 pullrequestid=999999,
191 194 reviewers=[{'username': 'bad_name'}])
192 195 response = api_call(self.app, params)
193 196
194 197 expected = 'pull request `999999` does not exist'
195 198 assert_error(id_, expected, response.body)
196 199
197 200 @pytest.mark.backends("git", "hg")
198 201 def test_api_update_pull_request_no_perms_to_update(
199 202 self, user_util, pr_util):
200 203 user = user_util.create_user()
201 204 pull_request = pr_util.create_pull_request()
202 205
203 206 id_, params = build_data(
204 207 user.api_key, 'update_pull_request',
205 208 repoid=pull_request.target_repo.repo_name,
206 209 pullrequestid=pull_request.pull_request_id,)
207 210 response = api_call(self.app, params)
208 211
209 212 expected = ('pull request `%s` update failed, '
210 213 'no permission to update.') % pull_request.pull_request_id
211 214
212 215 assert_error(id_, expected, response.body)
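
Two API-shape changes run through the updated expectations in this file: reviewer tuples handed to `pr_util.create_pull_request` now carry an explicit role, and `update_pull_request` responses report observer changes next to reviewer changes. A hedged summary of the new shapes, with key names copied from the tests and concrete values purely illustrative (the meaning of the trailing list as "rules" is an assumption):

    # Shapes implied by the updated tests above; values are illustrative only.
    reviewer_entry = ('some_user', ['added via API'], False, 'reviewer', [])
    #                 username     reasons           mandatory  role     rules (assumed)

    expected_update_keys = {
        'updated_commits': {'added': [], 'common': [], 'removed': []},
        'updated_reviewers': {'added': [], 'removed': []},
        'updated_observers': {'added': [], 'removed': []},
    }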
@@ -1,1018 +1,1118 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 22 import logging
23 23
24 24 from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCValidationError
25 25 from rhodecode.api.utils import (
26 26 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
27 27 get_pull_request_or_error, get_commit_or_error, get_user_or_error,
28 28 validate_repo_permissions, resolve_ref_or_error, validate_set_owner_permissions)
29 from rhodecode.lib import channelstream
29 30 from rhodecode.lib.auth import (HasRepoPermissionAnyApi)
30 31 from rhodecode.lib.base import vcs_operation_context
31 32 from rhodecode.lib.utils2 import str2bool
33 from rhodecode.lib.vcs.backends.base import unicode_to_reference
32 34 from rhodecode.model.changeset_status import ChangesetStatusModel
33 35 from rhodecode.model.comment import CommentsModel
34 from rhodecode.model.db import Session, ChangesetStatus, ChangesetComment, PullRequest
36 from rhodecode.model.db import (
37 Session, ChangesetStatus, ChangesetComment, PullRequest, PullRequestReviewers)
35 38 from rhodecode.model.pull_request import PullRequestModel, MergeCheck
36 39 from rhodecode.model.settings import SettingsModel
37 40 from rhodecode.model.validation_schema import Invalid
38 41 from rhodecode.model.validation_schema.schemas.reviewer_schema import ReviewerListSchema
39 42
40 43 log = logging.getLogger(__name__)
41 44
42 45
43 46 @jsonrpc_method()
44 47 def get_pull_request(request, apiuser, pullrequestid, repoid=Optional(None),
45 48 merge_state=Optional(False)):
46 49 """
47 50 Get a pull request based on the given ID.
48 51
49 52 :param apiuser: This is filled automatically from the |authtoken|.
50 53 :type apiuser: AuthUser
51 54 :param repoid: Optional, repository name or repository ID from where
52 55 the pull request was opened.
53 56 :type repoid: str or int
54 57 :param pullrequestid: ID of the requested pull request.
55 58 :type pullrequestid: int
56 59 :param merge_state: Optional. Calculate the merge state for the pull request.
57 60 This can result in a longer time to fetch the data.
58 61 :type merge_state: bool
59 62
60 63 Example output:
61 64
62 65 .. code-block:: bash
63 66
64 67 "id": <id_given_in_input>,
65 68 "result":
66 69 {
67 70 "pull_request_id": "<pull_request_id>",
68 71 "url": "<url>",
69 72 "title": "<title>",
70 73 "description": "<description>",
71 74 "status" : "<status>",
72 75 "created_on": "<date_time_created>",
73 76 "updated_on": "<date_time_updated>",
74 77 "versions": "<number_or_versions_of_pr>",
75 78 "commit_ids": [
76 79 ...
77 80 "<commit_id>",
78 81 "<commit_id>",
79 82 ...
80 83 ],
81 84 "review_status": "<review_status>",
82 85 "mergeable": {
83 86 "status": "<bool>",
84 87 "message": "<message>",
85 88 },
86 89 "source": {
87 90 "clone_url": "<clone_url>",
88 91 "repository": "<repository_name>",
89 92 "reference":
90 93 {
91 94 "name": "<name>",
92 95 "type": "<type>",
93 96 "commit_id": "<commit_id>",
94 97 }
95 98 },
96 99 "target": {
97 100 "clone_url": "<clone_url>",
98 101 "repository": "<repository_name>",
99 102 "reference":
100 103 {
101 104 "name": "<name>",
102 105 "type": "<type>",
103 106 "commit_id": "<commit_id>",
104 107 }
105 108 },
106 109 "merge": {
107 110 "clone_url": "<clone_url>",
108 111 "reference":
109 112 {
110 113 "name": "<name>",
111 114 "type": "<type>",
112 115 "commit_id": "<commit_id>",
113 116 }
114 117 },
115 118 "author": <user_obj>,
116 119 "reviewers": [
117 120 ...
118 121 {
119 122 "user": "<user_obj>",
120 123 "review_status": "<review_status>",
121 124 }
122 125 ...
123 126 ]
124 127 },
125 128 "error": null
126 129 """
127 130
128 131 pull_request = get_pull_request_or_error(pullrequestid)
129 132 if Optional.extract(repoid):
130 133 repo = get_repo_or_error(repoid)
131 134 else:
132 135 repo = pull_request.target_repo
133 136
134 137 if not PullRequestModel().check_user_read(pull_request, apiuser, api=True):
135 138 raise JSONRPCError('repository `%s` or pull request `%s` '
136 139 'does not exist' % (repoid, pullrequestid))
137 140
138 141 # NOTE(marcink): only calculate and return merge state if the pr state is 'created'
139 142 # otherwise we could lock the repo while calculating the merge state during an
140 143 # update/merge that is in progress.
141 144 pr_created = pull_request.pull_request_state == pull_request.STATE_CREATED
142 145 merge_state = Optional.extract(merge_state, binary=True) and pr_created
143 146 data = pull_request.get_api_data(with_merge_state=merge_state)
144 147 return data
145 148
146 149
147 150 @jsonrpc_method()
148 151 def get_pull_requests(request, apiuser, repoid, status=Optional('new'),
149 152 merge_state=Optional(False)):
150 153 """
151 154 Get all pull requests from the repository specified in `repoid`.
152 155
153 156 :param apiuser: This is filled automatically from the |authtoken|.
154 157 :type apiuser: AuthUser
155 158 :param repoid: Repository name or repository ID.
156 159 :type repoid: str or int
157 160 :param status: Only return pull requests with the specified status.
158 161 Valid options are:
159 162 * ``new`` (default)
160 163 * ``open``
161 164 * ``closed``
162 165 :type status: str
163 166 :param merge_state: Optional. Calculate the merge state for each pull request.
164 167 This can result in a longer time to fetch the data.
165 168 :type merge_state: bool
166 169
167 170 Example output:
168 171
169 172 .. code-block:: bash
170 173
171 174 "id": <id_given_in_input>,
172 175 "result":
173 176 [
174 177 ...
175 178 {
176 179 "pull_request_id": "<pull_request_id>",
177 180 "url": "<url>",
178 181 "title" : "<title>",
179 182 "description": "<description>",
180 183 "status": "<status>",
181 184 "created_on": "<date_time_created>",
182 185 "updated_on": "<date_time_updated>",
183 186 "commit_ids": [
184 187 ...
185 188 "<commit_id>",
186 189 "<commit_id>",
187 190 ...
188 191 ],
189 192 "review_status": "<review_status>",
190 193 "mergeable": {
191 194 "status": "<bool>",
192 195 "message: "<message>",
193 196 },
194 197 "source": {
195 198 "clone_url": "<clone_url>",
196 199 "reference":
197 200 {
198 201 "name": "<name>",
199 202 "type": "<type>",
200 203 "commit_id": "<commit_id>",
201 204 }
202 205 },
203 206 "target": {
204 207 "clone_url": "<clone_url>",
205 208 "reference":
206 209 {
207 210 "name": "<name>",
208 211 "type": "<type>",
209 212 "commit_id": "<commit_id>",
210 213 }
211 214 },
212 215 "merge": {
213 216 "clone_url": "<clone_url>",
214 217 "reference":
215 218 {
216 219 "name": "<name>",
217 220 "type": "<type>",
218 221 "commit_id": "<commit_id>",
219 222 }
220 223 },
221 224 "author": <user_obj>,
222 225 "reviewers": [
223 226 ...
224 227 {
225 228 "user": "<user_obj>",
226 229 "review_status": "<review_status>",
227 230 }
228 231 ...
229 232 ]
230 233 }
231 234 ...
232 235 ],
233 236 "error": null
234 237
235 238 """
236 239 repo = get_repo_or_error(repoid)
237 240 if not has_superadmin_permission(apiuser):
238 241 _perms = (
239 242 'repository.admin', 'repository.write', 'repository.read',)
240 243 validate_repo_permissions(apiuser, repoid, repo, _perms)
241 244
242 245 status = Optional.extract(status)
243 246 merge_state = Optional.extract(merge_state, binary=True)
244 247 pull_requests = PullRequestModel().get_all(repo, statuses=[status],
245 248 order_by='id', order_dir='desc')
246 249 data = [pr.get_api_data(with_merge_state=merge_state) for pr in pull_requests]
247 250 return data
248 251
249 252
250 253 @jsonrpc_method()
251 254 def merge_pull_request(
252 255 request, apiuser, pullrequestid, repoid=Optional(None),
253 256 userid=Optional(OAttr('apiuser'))):
254 257 """
255 258 Merge the pull request specified by `pullrequestid` into its target
256 259 repository.
257 260
258 261 :param apiuser: This is filled automatically from the |authtoken|.
259 262 :type apiuser: AuthUser
260 263 :param repoid: Optional, repository name or repository ID of the
261 264 target repository to which the |pr| is to be merged.
262 265 :type repoid: str or int
263 266 :param pullrequestid: ID of the pull request which shall be merged.
264 267 :type pullrequestid: int
265 268 :param userid: Merge the pull request as this user.
266 269 :type userid: Optional(str or int)
267 270
268 271 Example output:
269 272
270 273 .. code-block:: bash
271 274
272 275 "id": <id_given_in_input>,
273 276 "result": {
274 277 "executed": "<bool>",
275 278 "failure_reason": "<int>",
276 279 "merge_status_message": "<str>",
277 280 "merge_commit_id": "<merge_commit_id>",
278 281 "possible": "<bool>",
279 282 "merge_ref": {
280 283 "commit_id": "<commit_id>",
281 284 "type": "<type>",
282 285 "name": "<name>"
283 286 }
284 287 },
285 288 "error": null
286 289 """
287 290 pull_request = get_pull_request_or_error(pullrequestid)
288 291 if Optional.extract(repoid):
289 292 repo = get_repo_or_error(repoid)
290 293 else:
291 294 repo = pull_request.target_repo
292 295 auth_user = apiuser
293 296
294 297 if not isinstance(userid, Optional):
295 298 is_repo_admin = HasRepoPermissionAnyApi('repository.admin')(
296 299 user=apiuser, repo_name=repo.repo_name)
297 300 if has_superadmin_permission(apiuser) or is_repo_admin:
298 301 apiuser = get_user_or_error(userid)
299 302 auth_user = apiuser.AuthUser()
300 303 else:
301 304 raise JSONRPCError('userid is not the same as your user')
302 305
303 306 if pull_request.pull_request_state != PullRequest.STATE_CREATED:
304 307 raise JSONRPCError(
305 308 'Operation forbidden because pull request is in state {}, '
306 309 'only state {} is allowed.'.format(
307 310 pull_request.pull_request_state, PullRequest.STATE_CREATED))
308 311
309 312 with pull_request.set_state(PullRequest.STATE_UPDATING):
310 313 check = MergeCheck.validate(pull_request, auth_user=auth_user,
311 314 translator=request.translate)
312 315 merge_possible = not check.failed
313 316
314 317 if not merge_possible:
315 318 error_messages = []
316 319 for err_type, error_msg in check.errors:
317 320 error_msg = request.translate(error_msg)
318 321 error_messages.append(error_msg)
319 322
320 323 reasons = ','.join(error_messages)
321 324 raise JSONRPCError(
322 325 'merge not possible for following reasons: {}'.format(reasons))
323 326
324 327 target_repo = pull_request.target_repo
325 328 extras = vcs_operation_context(
326 329 request.environ, repo_name=target_repo.repo_name,
327 330 username=auth_user.username, action='push',
328 331 scm=target_repo.repo_type)
329 332 with pull_request.set_state(PullRequest.STATE_UPDATING):
330 333 merge_response = PullRequestModel().merge_repo(
331 334 pull_request, apiuser, extras=extras)
332 335 if merge_response.executed:
333 336 PullRequestModel().close_pull_request(pull_request.pull_request_id, auth_user)
334 337
335 338 Session().commit()
336 339
337 340 # In previous versions the merge response directly contained the merge
338 341 # commit id. It is now contained in the merge reference object. To be
339 342 # backwards compatible we have to extract it again.
340 343 merge_response = merge_response.asdict()
341 344 merge_response['merge_commit_id'] = merge_response['merge_ref'].commit_id
342 345
343 346 return merge_response
344 347
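For illustration, a minimal client-side sketch of calling ``merge_pull_request`` over the JSON-RPC API follows; the endpoint URL, auth token and pull request id are placeholders, and the payload shape assumes the standard ``id``/``auth_token``/``method``/``args`` convention of the API.

.. code-block:: python

    import requests

    # Placeholder endpoint and token -- adjust for your own instance.
    API_URL = 'https://rhodecode.example.com/_admin/api'
    AUTH_TOKEN = '<auth_token>'

    payload = {
        'id': 1,
        'auth_token': AUTH_TOKEN,
        'method': 'merge_pull_request',
        'args': {'repoid': 'my-repo', 'pullrequestid': 42},
    }
    result = requests.post(API_URL, json=payload).json()['result']
    # result['executed'] is True when the merge ran; the merge commit id is
    # also exposed directly for backwards compatibility.
    print(result['executed'], result.get('merge_commit_id'))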
345 348
346 349 @jsonrpc_method()
347 350 def get_pull_request_comments(
348 351 request, apiuser, pullrequestid, repoid=Optional(None)):
349 352 """
350 353 Get all comments of pull request specified with the `pullrequestid`
351 354
352 355 :param apiuser: This is filled automatically from the |authtoken|.
353 356 :type apiuser: AuthUser
354 357 :param repoid: Optional repository name or repository ID.
355 358 :type repoid: str or int
356 359 :param pullrequestid: The pull request ID.
357 360 :type pullrequestid: int
358 361
359 362 Example output:
360 363
361 364 .. code-block:: bash
362 365
363 366 id : <id_given_in_input>
364 367 result : [
365 368 {
366 369 "comment_author": {
367 370 "active": true,
368 371 "full_name_or_username": "Tom Gore",
369 372 "username": "admin"
370 373 },
371 374 "comment_created_on": "2017-01-02T18:43:45.533",
372 375 "comment_f_path": null,
373 376 "comment_id": 25,
374 377 "comment_lineno": null,
375 378 "comment_status": {
376 379 "status": "under_review",
377 380 "status_lbl": "Under Review"
378 381 },
379 382 "comment_text": "Example text",
380 383 "comment_type": null,
381 384 "comment_last_version": 0,
382 385 "pull_request_version": null,
383 386 "comment_commit_id": null,
384 387 "comment_pull_request_id": <pull_request_id>
385 388 }
386 389 ],
387 390 error : null
388 391 """
389 392
390 393 pull_request = get_pull_request_or_error(pullrequestid)
391 394 if Optional.extract(repoid):
392 395 repo = get_repo_or_error(repoid)
393 396 else:
394 397 repo = pull_request.target_repo
395 398
396 399 if not PullRequestModel().check_user_read(
397 400 pull_request, apiuser, api=True):
398 401 raise JSONRPCError('repository `%s` or pull request `%s` '
399 402 'does not exist' % (repoid, pullrequestid))
400 403
401 404 (pull_request_latest,
402 405 pull_request_at_ver,
403 406 pull_request_display_obj,
404 407 at_version) = PullRequestModel().get_pr_version(
405 408 pull_request.pull_request_id, version=None)
406 409
407 410 versions = pull_request_display_obj.versions()
408 411 ver_map = {
409 412 ver.pull_request_version_id: cnt
410 413 for cnt, ver in enumerate(versions, 1)
411 414 }
412 415
413 416 # GENERAL COMMENTS with versions #
414 417 q = CommentsModel()._all_general_comments_of_pull_request(pull_request)
415 418 q = q.order_by(ChangesetComment.comment_id.asc())
416 419 general_comments = q.all()
417 420
418 421 # INLINE COMMENTS with versions #
419 422 q = CommentsModel()._all_inline_comments_of_pull_request(pull_request)
420 423 q = q.order_by(ChangesetComment.comment_id.asc())
421 424 inline_comments = q.all()
422 425
423 426 data = []
424 427 for comment in inline_comments + general_comments:
425 428 full_data = comment.get_api_data()
426 429 pr_version_id = None
427 430 if comment.pull_request_version_id:
428 431 pr_version_id = 'v{}'.format(
429 432 ver_map[comment.pull_request_version_id])
430 433
431 434 # sanitize some entries
432 435
433 436 full_data['pull_request_version'] = pr_version_id
434 437 full_data['comment_author'] = {
435 438 'username': full_data['comment_author'].username,
436 439 'full_name_or_username': full_data['comment_author'].full_name_or_username,
437 440 'active': full_data['comment_author'].active,
438 441 }
439 442
440 443 if full_data['comment_status']:
441 444 full_data['comment_status'] = {
442 445 'status': full_data['comment_status'][0].status,
443 446 'status_lbl': full_data['comment_status'][0].status_lbl,
444 447 }
445 448 else:
446 449 full_data['comment_status'] = {}
447 450
448 451 data.append(full_data)
449 452 return data
450 453
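As a hedged usage sketch (endpoint, token and pull request id are placeholders, not values from this changeset), the comments returned above can be fetched and then filtered client-side, for example to list TODO comments:

.. code-block:: python

    import requests

    payload = {
        'id': 2,
        'auth_token': '<auth_token>',
        'method': 'get_pull_request_comments',
        'args': {'pullrequestid': 42},
    }
    comments = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    # keep only TODO-type comments, e.g. to see what still blocks a merge
    todos = [c for c in comments if c['comment_type'] == 'todo']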
451 454
452 455 @jsonrpc_method()
453 456 def comment_pull_request(
454 457 request, apiuser, pullrequestid, repoid=Optional(None),
455 458 message=Optional(None), commit_id=Optional(None), status=Optional(None),
456 459 comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
457 460 resolves_comment_id=Optional(None), extra_recipients=Optional([]),
458 461 userid=Optional(OAttr('apiuser')), send_email=Optional(True)):
459 462 """
460 463 Comment on the pull request specified with the `pullrequestid`,
461 464 in the |repo| specified by the `repoid`, and optionally change the
462 465 review status.
463 466
464 467 :param apiuser: This is filled automatically from the |authtoken|.
465 468 :type apiuser: AuthUser
466 469 :param repoid: Optional repository name or repository ID.
467 470 :type repoid: str or int
468 471 :param pullrequestid: The pull request ID.
469 472 :type pullrequestid: int
470 473     :param commit_id: Specify the commit_id for which to set a comment. If the
471 474         given commit_id is different from the latest commit_id in the PR, the
472 475         status change won't be performed.
473 476 :type commit_id: str
474 477 :param message: The text content of the comment.
475 478 :type message: str
476 479 :param status: (**Optional**) Set the approval status of the pull
477 480 request. One of: 'not_reviewed', 'approved', 'rejected',
478 481 'under_review'
479 482 :type status: str
480 483 :param comment_type: Comment type, one of: 'note', 'todo'
481 484 :type comment_type: Optional(str), default: 'note'
482 485 :param resolves_comment_id: id of comment which this one will resolve
483 486 :type resolves_comment_id: Optional(int)
484 487 :param extra_recipients: list of user ids or usernames to add
485 488 notifications for this comment. Acts like a CC for notification
486 489 :type extra_recipients: Optional(list)
487 490 :param userid: Comment on the pull request as this user
488 491 :type userid: Optional(str or int)
489 492 :param send_email: Define if this comment should also send email notification
490 493 :type send_email: Optional(bool)
491 494
492 495 Example output:
493 496
494 497 .. code-block:: bash
495 498
496 499 id : <id_given_in_input>
497 500 result : {
498 501 "pull_request_id": "<Integer>",
499 502 "comment_id": "<Integer>",
500 503 "status": {"given": <given_status>,
501 504 "was_changed": <bool status_was_actually_changed> },
502 505 },
503 506 error : null
504 507 """
508 _ = request.translate
509
505 510 pull_request = get_pull_request_or_error(pullrequestid)
506 511 if Optional.extract(repoid):
507 512 repo = get_repo_or_error(repoid)
508 513 else:
509 514 repo = pull_request.target_repo
510 515
516 db_repo_name = repo.repo_name
511 517 auth_user = apiuser
512 518 if not isinstance(userid, Optional):
513 519 is_repo_admin = HasRepoPermissionAnyApi('repository.admin')(
514 user=apiuser, repo_name=repo.repo_name)
520 user=apiuser, repo_name=db_repo_name)
515 521 if has_superadmin_permission(apiuser) or is_repo_admin:
516 522 apiuser = get_user_or_error(userid)
517 523 auth_user = apiuser.AuthUser()
518 524 else:
519 525 raise JSONRPCError('userid is not the same as your user')
520 526
521 527 if pull_request.is_closed():
522 528 raise JSONRPCError(
523 529 'pull request `%s` comment failed, pull request is closed' % (
524 530 pullrequestid,))
525 531
526 532 if not PullRequestModel().check_user_read(
527 533 pull_request, apiuser, api=True):
528 534 raise JSONRPCError('repository `%s` does not exist' % (repoid,))
529 535 message = Optional.extract(message)
530 536 status = Optional.extract(status)
531 537 commit_id = Optional.extract(commit_id)
532 538 comment_type = Optional.extract(comment_type)
533 539 resolves_comment_id = Optional.extract(resolves_comment_id)
534 540 extra_recipients = Optional.extract(extra_recipients)
535 541 send_email = Optional.extract(send_email, binary=True)
536 542
537 543 if not message and not status:
538 544 raise JSONRPCError(
539 545 'Both message and status parameters are missing. '
540 546 'At least one is required.')
541 547
542 548 if (status not in (st[0] for st in ChangesetStatus.STATUSES) and
543 549 status is not None):
544 550 raise JSONRPCError('Unknown comment status: `%s`' % status)
545 551
546 552 if commit_id and commit_id not in pull_request.revisions:
547 553 raise JSONRPCError(
548 554 'Invalid commit_id `%s` for this pull request.' % commit_id)
549 555
550 556 allowed_to_change_status = PullRequestModel().check_user_change_status(
551 557 pull_request, apiuser)
552 558
553 559     # if commit_id is passed, re-validate if the user is allowed to change status
554 560 # based on latest commit_id from the PR
555 561 if commit_id:
556 562 commit_idx = pull_request.revisions.index(commit_id)
557 563 if commit_idx != 0:
558 564 allowed_to_change_status = False
559 565
560 566 if resolves_comment_id:
561 567 comment = ChangesetComment.get(resolves_comment_id)
562 568 if not comment:
563 569 raise JSONRPCError(
564 570 'Invalid resolves_comment_id `%s` for this pull request.'
565 571 % resolves_comment_id)
566 572 if comment.comment_type != ChangesetComment.COMMENT_TYPE_TODO:
567 573 raise JSONRPCError(
568 574 'Comment `%s` is wrong type for setting status to resolved.'
569 575 % resolves_comment_id)
570 576
571 577 text = message
572 578 status_label = ChangesetStatus.get_status_lbl(status)
573 579 if status and allowed_to_change_status:
574 580 st_message = ('Status change %(transition_icon)s %(status)s'
575 581 % {'transition_icon': '>', 'status': status_label})
576 582 text = message or st_message
577 583
578 584 rc_config = SettingsModel().get_all_settings()
579 585 renderer = rc_config.get('rhodecode_markup_renderer', 'rst')
580 586
581 587 status_change = status and allowed_to_change_status
582 588 comment = CommentsModel().create(
583 589 text=text,
584 590 repo=pull_request.target_repo.repo_id,
585 591 user=apiuser.user_id,
586 592 pull_request=pull_request.pull_request_id,
587 593 f_path=None,
588 594 line_no=None,
589 595 status_change=(status_label if status_change else None),
590 596 status_change_type=(status if status_change else None),
591 597 closing_pr=False,
592 598 renderer=renderer,
593 599 comment_type=comment_type,
594 600 resolves_comment_id=resolves_comment_id,
595 601 auth_user=auth_user,
596 602 extra_recipients=extra_recipients,
597 603 send_email=send_email
598 604 )
605 is_inline = comment.is_inline
599 606
600 607 if allowed_to_change_status and status:
601 608 old_calculated_status = pull_request.calculated_review_status()
602 609 ChangesetStatusModel().set_status(
603 610 pull_request.target_repo.repo_id,
604 611 status,
605 612 apiuser.user_id,
606 613 comment,
607 614 pull_request=pull_request.pull_request_id
608 615 )
609 616 Session().flush()
610 617
611 618 Session().commit()
612 619
613 620 PullRequestModel().trigger_pull_request_hook(
614 621 pull_request, apiuser, 'comment',
615 622 data={'comment': comment})
616 623
617 624 if allowed_to_change_status and status:
618 625 # we now calculate the status of pull request, and based on that
619 626 # calculation we set the commits status
620 627 calculated_status = pull_request.calculated_review_status()
621 628 if old_calculated_status != calculated_status:
622 629 PullRequestModel().trigger_pull_request_hook(
623 630 pull_request, apiuser, 'review_status_change',
624 631 data={'status': calculated_status})
625 632
626 633 data = {
627 634 'pull_request_id': pull_request.pull_request_id,
628 635 'comment_id': comment.comment_id if comment else None,
629 636 'status': {'given': status, 'was_changed': status_change},
630 637 }
638
639 comment_broadcast_channel = channelstream.comment_channel(
640 db_repo_name, pull_request_obj=pull_request)
641
642 comment_data = data
643 comment_type = 'inline' if is_inline else 'general'
644 channelstream.comment_channelstream_push(
645 request, comment_broadcast_channel, apiuser,
646 _('posted a new {} comment').format(comment_type),
647 comment_data=comment_data)
648
631 649 return data
632 650
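A minimal sketch of the method above: leaving an approving comment via the API. The endpoint, token and ids are placeholders, and the status value must be one of the ChangesetStatus codes listed in the docstring.

.. code-block:: python

    import requests

    payload = {
        'id': 3,
        'auth_token': '<auth_token>',
        'method': 'comment_pull_request',
        'args': {
            'pullrequestid': 42,
            'message': 'Looks good to me',
            'status': 'approved',      # only applied if the caller may change status
            'comment_type': 'note',
            'send_email': True,
        },
    }
    result = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    # was_changed reveals whether the vote was actually recorded
    print(result['status']['was_changed'])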
651 def _reviewers_validation(obj_list):
652 schema = ReviewerListSchema()
653 try:
654 reviewer_objects = schema.deserialize(obj_list)
655 except Invalid as err:
656 raise JSONRPCValidationError(colander_exc=err)
657
658 # validate users
659 for reviewer_object in reviewer_objects:
660 user = get_user_or_error(reviewer_object['username'])
661 reviewer_object['user_id'] = user.user_id
662 return reviewer_objects
663
633 664
634 665 @jsonrpc_method()
635 666 def create_pull_request(
636 667 request, apiuser, source_repo, target_repo, source_ref, target_ref,
637 668 owner=Optional(OAttr('apiuser')), title=Optional(''), description=Optional(''),
638 description_renderer=Optional(''), reviewers=Optional(None)):
669 description_renderer=Optional(''),
670 reviewers=Optional(None), observers=Optional(None)):
639 671 """
640 672 Creates a new pull request.
641 673
642 674 Accepts refs in the following formats:
643 675
644 676 * branch:<branch_name>:<sha>
645 677 * branch:<branch_name>
646 678 * bookmark:<bookmark_name>:<sha> (Mercurial only)
647 679 * bookmark:<bookmark_name> (Mercurial only)
648 680
649 681 :param apiuser: This is filled automatically from the |authtoken|.
650 682 :type apiuser: AuthUser
651 683 :param source_repo: Set the source repository name.
652 684 :type source_repo: str
653 685 :param target_repo: Set the target repository name.
654 686 :type target_repo: str
655 687 :param source_ref: Set the source ref name.
656 688 :type source_ref: str
657 689 :param target_ref: Set the target ref name.
658 690 :type target_ref: str
659 691 :param owner: user_id or username
660 692 :type owner: Optional(str)
661 693     :param title: Optionally set the pull request title; it is generated automatically otherwise
662 694 :type title: str
663 695 :param description: Set the pull request description.
664 696 :type description: Optional(str)
665 697 :type description_renderer: Optional(str)
666 698     :param description_renderer: Set the pull request renderer for the description.
667 699         It should be 'rst', 'markdown' or 'plain'. If not given, the default
668 700         system renderer will be used.
669 701 :param reviewers: Set the new pull request reviewers list.
670 702         Reviewers defined by review rules will be added automatically to the
671 703 defined list.
672 704 :type reviewers: Optional(list)
673 705 Accepts username strings or objects of the format:
674 706
675 707 [{'username': 'nick', 'reasons': ['original author'], 'mandatory': <bool>}]
708 :param observers: Set the new pull request observers list.
709         Reviewers defined by review rules will be added automatically to the
710 defined list. This feature is only available in RhodeCode EE
711 :type observers: Optional(list)
712 Accepts username strings or objects of the format:
713
714 [{'username': 'nick', 'reasons': ['original author']}]
676 715 """
677 716
678 717 source_db_repo = get_repo_or_error(source_repo)
679 718 target_db_repo = get_repo_or_error(target_repo)
680 719 if not has_superadmin_permission(apiuser):
681 720 _perms = ('repository.admin', 'repository.write', 'repository.read',)
682 721 validate_repo_permissions(apiuser, source_repo, source_db_repo, _perms)
683 722
684 723 owner = validate_set_owner_permissions(apiuser, owner)
685 724
686 725 full_source_ref = resolve_ref_or_error(source_ref, source_db_repo)
687 726 full_target_ref = resolve_ref_or_error(target_ref, target_db_repo)
688 727
689 source_commit = get_commit_or_error(full_source_ref, source_db_repo)
690 target_commit = get_commit_or_error(full_target_ref, target_db_repo)
728 get_commit_or_error(full_source_ref, source_db_repo)
729 get_commit_or_error(full_target_ref, target_db_repo)
691 730
692 731 reviewer_objects = Optional.extract(reviewers) or []
732 observer_objects = Optional.extract(observers) or []
693 733
694 734     # serialize and validate the passed-in reviewers
695 735 if reviewer_objects:
696 schema = ReviewerListSchema()
697 try:
698 reviewer_objects = schema.deserialize(reviewer_objects)
699 except Invalid as err:
700 raise JSONRPCValidationError(colander_exc=err)
736 reviewer_objects = _reviewers_validation(reviewer_objects)
737
738 if observer_objects:
739         observer_objects = _reviewers_validation(observer_objects)
701 740
702 # validate users
703 for reviewer_object in reviewer_objects:
704 user = get_user_or_error(reviewer_object['username'])
705 reviewer_object['user_id'] = user.user_id
741 get_default_reviewers_data, validate_default_reviewers, validate_observers = \
742 PullRequestModel().get_reviewer_functions()
706 743
707 get_default_reviewers_data, validate_default_reviewers = \
708 PullRequestModel().get_reviewer_functions()
744 source_ref_obj = unicode_to_reference(full_source_ref)
745 target_ref_obj = unicode_to_reference(full_target_ref)
709 746
710 747 # recalculate reviewers logic, to make sure we can validate this
711 748 default_reviewers_data = get_default_reviewers_data(
712 owner, source_db_repo,
713 source_commit, target_db_repo, target_commit)
749 owner,
750 source_db_repo,
751 source_ref_obj,
752 target_db_repo,
753 target_ref_obj,
754 )
714 755
715 # now MERGE our given with the calculated
716 reviewer_objects = default_reviewers_data['reviewers'] + reviewer_objects
756     # now MERGE our given reviewers with those calculated from the default rules
757 just_reviewers = [
758 x for x in default_reviewers_data['reviewers']
759 if x['role'] == PullRequestReviewers.ROLE_REVIEWER]
760 reviewer_objects = just_reviewers + reviewer_objects
717 761
718 762 try:
719 763 reviewers = validate_default_reviewers(
720 764 reviewer_objects, default_reviewers_data)
721 765 except ValueError as e:
722 766 raise JSONRPCError('Reviewers Validation: {}'.format(e))
723 767
768     # now MERGE our given observers with those calculated from the default rules
769 just_observers = [
770 x for x in default_reviewers_data['reviewers']
771 if x['role'] == PullRequestReviewers.ROLE_OBSERVER]
772 observer_objects = just_observers + observer_objects
773
774 try:
775 observers = validate_observers(
776 observer_objects, default_reviewers_data)
777 except ValueError as e:
778 raise JSONRPCError('Observer Validation: {}'.format(e))
779
724 780 title = Optional.extract(title)
725 781 if not title:
726 title_source_ref = source_ref.split(':', 2)[1]
782 title_source_ref = source_ref_obj.name
727 783 title = PullRequestModel().generate_pullrequest_title(
728 784 source=source_repo,
729 785 source_ref=title_source_ref,
730 786 target=target_repo
731 787 )
732 788
733 789 diff_info = default_reviewers_data['diff_info']
734 790 common_ancestor_id = diff_info['ancestor']
735 commits = diff_info['commits']
791 # NOTE(marcink): reversed is consistent with how we open it in the WEB interface
792 commits = [commit['commit_id'] for commit in reversed(diff_info['commits'])]
736 793
737 794 if not common_ancestor_id:
738 raise JSONRPCError('no common ancestor found')
795 raise JSONRPCError('no common ancestor found between specified references')
739 796
740 797 if not commits:
741 raise JSONRPCError('no commits found')
742
743 # NOTE(marcink): reversed is consistent with how we open it in the WEB interface
744 revisions = [commit.raw_id for commit in reversed(commits)]
798 raise JSONRPCError('no commits found for merge between specified references')
745 799
746 800 # recalculate target ref based on ancestor
747 target_ref_type, target_ref_name, __ = full_target_ref.split(':')
748 full_target_ref = ':'.join((target_ref_type, target_ref_name, common_ancestor_id))
801 full_target_ref = ':'.join((target_ref_obj.type, target_ref_obj.name, common_ancestor_id))
749 802
750 803 # fetch renderer, if set fallback to plain in case of PR
751 804 rc_config = SettingsModel().get_all_settings()
752 805 default_system_renderer = rc_config.get('rhodecode_markup_renderer', 'plain')
753 806 description = Optional.extract(description)
754 807 description_renderer = Optional.extract(description_renderer) or default_system_renderer
755 808
756 809 pull_request = PullRequestModel().create(
757 810 created_by=owner.user_id,
758 811 source_repo=source_repo,
759 812 source_ref=full_source_ref,
760 813 target_repo=target_repo,
761 814 target_ref=full_target_ref,
762 815 common_ancestor_id=common_ancestor_id,
763 revisions=revisions,
816 revisions=commits,
764 817 reviewers=reviewers,
818 observers=observers,
765 819 title=title,
766 820 description=description,
767 821 description_renderer=description_renderer,
768 822 reviewer_data=default_reviewers_data,
769 823 auth_user=apiuser
770 824 )
771 825
772 826 Session().commit()
773 827 data = {
774 828 'msg': 'Created new pull request `{}`'.format(title),
775 829 'pull_request_id': pull_request.pull_request_id,
776 830 }
777 831 return data
778 832
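A minimal sketch of creating a pull request with explicit reviewers and observers; all names, refs, the endpoint and the token are placeholders, and observers are only honoured on RhodeCode EE as noted above.

.. code-block:: python

    import requests

    payload = {
        'id': 4,
        'auth_token': '<auth_token>',
        'method': 'create_pull_request',
        'args': {
            'source_repo': 'my-repo-fork',
            'target_repo': 'my-repo',
            'source_ref': 'branch:feature-x',
            'target_ref': 'branch:default',
            'title': 'Feature X',
            'reviewers': [
                {'username': 'reviewer1', 'reasons': ['module owner'], 'mandatory': True},
            ],
            'observers': ['manager1'],
        },
    }
    response = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()
    if response['error'] is None:
        print(response['result']['msg'], response['result']['pull_request_id'])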
779 833
780 834 @jsonrpc_method()
781 835 def update_pull_request(
782 836 request, apiuser, pullrequestid, repoid=Optional(None),
783 837 title=Optional(''), description=Optional(''), description_renderer=Optional(''),
784 reviewers=Optional(None), update_commits=Optional(None)):
838 reviewers=Optional(None), observers=Optional(None), update_commits=Optional(None)):
785 839 """
786 840 Updates a pull request.
787 841
788 842 :param apiuser: This is filled automatically from the |authtoken|.
789 843 :type apiuser: AuthUser
790 844 :param repoid: Optional repository name or repository ID.
791 845 :type repoid: str or int
792 846 :param pullrequestid: The pull request ID.
793 847 :type pullrequestid: int
794 848 :param title: Set the pull request title.
795 849 :type title: str
796 850 :param description: Update pull request description.
797 851 :type description: Optional(str)
798 852 :type description_renderer: Optional(str)
799 853 :param description_renderer: Update pull request renderer for the description.
800 854 It should be 'rst', 'markdown' or 'plain'
801 855 :param reviewers: Update pull request reviewers list with new value.
802 856 :type reviewers: Optional(list)
803 857 Accepts username strings or objects of the format:
804 858
805 859 [{'username': 'nick', 'reasons': ['original author'], 'mandatory': <bool>}]
860 :param observers: Update pull request observers list with new value.
861 :type observers: Optional(list)
862 Accepts username strings or objects of the format:
806 863
864         [{'username': 'nick', 'reasons': ['should be aware of this PR']}]
807 865 :param update_commits: Trigger update of commits for this pull request
808 866 :type: update_commits: Optional(bool)
809 867
810 868 Example output:
811 869
812 870 .. code-block:: bash
813 871
814 872 id : <id_given_in_input>
815 873 result : {
816 874 "msg": "Updated pull request `63`",
817 875 "pull_request": <pull_request_object>,
818 876 "updated_reviewers": {
819 877 "added": [
820 878 "username"
821 879 ],
822 880 "removed": []
823 881 },
882 "updated_observers": {
883 "added": [
884 "username"
885 ],
886 "removed": []
887 },
824 888 "updated_commits": {
825 889 "added": [
826 890 "<sha1_hash>"
827 891 ],
828 892 "common": [
829 893 "<sha1_hash>",
830 894 "<sha1_hash>",
831 895 ],
832 896 "removed": []
833 897 }
834 898 }
835 899 error : null
836 900 """
837 901
838 902 pull_request = get_pull_request_or_error(pullrequestid)
839 903 if Optional.extract(repoid):
840 904 repo = get_repo_or_error(repoid)
841 905 else:
842 906 repo = pull_request.target_repo
843 907
844 908 if not PullRequestModel().check_user_update(
845 909 pull_request, apiuser, api=True):
846 910 raise JSONRPCError(
847 911 'pull request `%s` update failed, no permission to update.' % (
848 912 pullrequestid,))
849 913 if pull_request.is_closed():
850 914 raise JSONRPCError(
851 915 'pull request `%s` update failed, pull request is closed' % (
852 916 pullrequestid,))
853 917
854 918 reviewer_objects = Optional.extract(reviewers) or []
855
856 if reviewer_objects:
857 schema = ReviewerListSchema()
858 try:
859 reviewer_objects = schema.deserialize(reviewer_objects)
860 except Invalid as err:
861 raise JSONRPCValidationError(colander_exc=err)
862
863 # validate users
864 for reviewer_object in reviewer_objects:
865 user = get_user_or_error(reviewer_object['username'])
866 reviewer_object['user_id'] = user.user_id
867
868 get_default_reviewers_data, get_validated_reviewers = \
869 PullRequestModel().get_reviewer_functions()
870
871 # re-use stored rules
872 reviewer_rules = pull_request.reviewer_data
873 try:
874 reviewers = get_validated_reviewers(
875 reviewer_objects, reviewer_rules)
876 except ValueError as e:
877 raise JSONRPCError('Reviewers Validation: {}'.format(e))
878 else:
879 reviewers = []
919 observer_objects = Optional.extract(observers) or []
880 920
881 921 title = Optional.extract(title)
882 922 description = Optional.extract(description)
883 923 description_renderer = Optional.extract(description_renderer)
884 924
925 # Update title/description
926 title_changed = False
885 927 if title or description:
886 928 PullRequestModel().edit(
887 929 pull_request,
888 930 title or pull_request.title,
889 931 description or pull_request.description,
890 932 description_renderer or pull_request.description_renderer,
891 933 apiuser)
892 934 Session().commit()
935 title_changed = True
893 936
894 937 commit_changes = {"added": [], "common": [], "removed": []}
938
939 # Update commits
940 commits_changed = False
895 941 if str2bool(Optional.extract(update_commits)):
896 942
897 943 if pull_request.pull_request_state != PullRequest.STATE_CREATED:
898 944 raise JSONRPCError(
899 945 'Operation forbidden because pull request is in state {}, '
900 946 'only state {} is allowed.'.format(
901 947 pull_request.pull_request_state, PullRequest.STATE_CREATED))
902 948
903 949 with pull_request.set_state(PullRequest.STATE_UPDATING):
904 950 if PullRequestModel().has_valid_update_type(pull_request):
905 951 db_user = apiuser.get_instance()
906 952 update_response = PullRequestModel().update_commits(
907 953 pull_request, db_user)
908 954 commit_changes = update_response.changes or commit_changes
909 955 Session().commit()
956 commits_changed = True
910 957
958 # Update reviewers
959 # serialize and validate passed in given reviewers
960 if reviewer_objects:
961 reviewer_objects = _reviewers_validation(reviewer_objects)
962
963 if observer_objects:
963         observer_objects = _reviewers_validation(observer_objects)
965
966 # re-use stored rules
967 default_reviewers_data = pull_request.reviewer_data
968
969 __, validate_default_reviewers, validate_observers = \
970 PullRequestModel().get_reviewer_functions()
971
972 if reviewer_objects:
973 try:
974 reviewers = validate_default_reviewers(reviewer_objects, default_reviewers_data)
975 except ValueError as e:
976 raise JSONRPCError('Reviewers Validation: {}'.format(e))
977 else:
978 reviewers = []
979
980 if observer_objects:
981 try:
982             observers = validate_observers(observer_objects, default_reviewers_data)
983 except ValueError as e:
984 raise JSONRPCError('Observer Validation: {}'.format(e))
985 else:
986 observers = []
987
988 reviewers_changed = False
911 989 reviewers_changes = {"added": [], "removed": []}
912 990 if reviewers:
913 991 old_calculated_status = pull_request.calculated_review_status()
914 992 added_reviewers, removed_reviewers = \
915 PullRequestModel().update_reviewers(pull_request, reviewers, apiuser)
993 PullRequestModel().update_reviewers(pull_request, reviewers, apiuser.get_instance())
916 994
917 995 reviewers_changes['added'] = sorted(
918 996 [get_user_or_error(n).username for n in added_reviewers])
919 997 reviewers_changes['removed'] = sorted(
920 998 [get_user_or_error(n).username for n in removed_reviewers])
921 999 Session().commit()
922 1000
923 1001 # trigger status changed if change in reviewers changes the status
924 1002 calculated_status = pull_request.calculated_review_status()
925 1003 if old_calculated_status != calculated_status:
926 1004 PullRequestModel().trigger_pull_request_hook(
927 1005 pull_request, apiuser, 'review_status_change',
928 1006 data={'status': calculated_status})
1007 reviewers_changed = True
1008
1009 observers_changed = False
1010 observers_changes = {"added": [], "removed": []}
1011 if observers:
1012 added_observers, removed_observers = \
1013 PullRequestModel().update_observers(pull_request, observers, apiuser.get_instance())
1014
1015 observers_changes['added'] = sorted(
1016 [get_user_or_error(n).username for n in added_observers])
1017 observers_changes['removed'] = sorted(
1018 [get_user_or_error(n).username for n in removed_observers])
1019 Session().commit()
1020
1021         observers_changed = True
1022
1023 # push changed to channelstream
1024 if commits_changed or reviewers_changed or observers_changed:
1025 pr_broadcast_channel = channelstream.pr_channel(pull_request)
1026 msg = 'Pull request was updated.'
1027 channelstream.pr_update_channelstream_push(
1028 request, pr_broadcast_channel, apiuser, msg)
929 1029
930 1030 data = {
931 'msg': 'Updated pull request `{}`'.format(
932 pull_request.pull_request_id),
1031 'msg': 'Updated pull request `{}`'.format(pull_request.pull_request_id),
933 1032 'pull_request': pull_request.get_api_data(),
934 1033 'updated_commits': commit_changes,
935 'updated_reviewers': reviewers_changes
1034 'updated_reviewers': reviewers_changes,
1035 'updated_observers': observers_changes,
936 1036 }
937 1037
938 1038 return data
939 1039
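For reference, a sketch of pulling new commits into an existing pull request and replacing its reviewer list in one call; the endpoint, token, id and usernames are placeholders.

.. code-block:: python

    import requests

    payload = {
        'id': 5,
        'auth_token': '<auth_token>',
        'method': 'update_pull_request',
        'args': {
            'pullrequestid': 42,
            'update_commits': True,
            'reviewers': ['reviewer1', 'reviewer2'],
        },
    }
    result = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    print(result['updated_commits']['added'], result['updated_reviewers']['added'])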
940 1040
941 1041 @jsonrpc_method()
942 1042 def close_pull_request(
943 1043 request, apiuser, pullrequestid, repoid=Optional(None),
944 1044 userid=Optional(OAttr('apiuser')), message=Optional('')):
945 1045 """
946 1046 Close the pull request specified by `pullrequestid`.
947 1047
948 1048 :param apiuser: This is filled automatically from the |authtoken|.
949 1049 :type apiuser: AuthUser
950 1050 :param repoid: Repository name or repository ID to which the pull
951 1051 request belongs.
952 1052 :type repoid: str or int
953 1053 :param pullrequestid: ID of the pull request to be closed.
954 1054 :type pullrequestid: int
955 1055 :param userid: Close the pull request as this user.
956 1056 :type userid: Optional(str or int)
957 1057 :param message: Optional message to close the Pull Request with. If not
958 1058 specified it will be generated automatically.
959 1059 :type message: Optional(str)
960 1060
961 1061 Example output:
962 1062
963 1063 .. code-block:: bash
964 1064
965 1065 "id": <id_given_in_input>,
966 1066 "result": {
967 1067 "pull_request_id": "<int>",
968 1068 "close_status": "<str:status_lbl>",
969 1069 "closed": "<bool>"
970 1070 },
971 1071 "error": null
972 1072
973 1073 """
974 1074 _ = request.translate
975 1075
976 1076 pull_request = get_pull_request_or_error(pullrequestid)
977 1077 if Optional.extract(repoid):
978 1078 repo = get_repo_or_error(repoid)
979 1079 else:
980 1080 repo = pull_request.target_repo
981 1081
982 1082 is_repo_admin = HasRepoPermissionAnyApi('repository.admin')(
983 1083 user=apiuser, repo_name=repo.repo_name)
984 1084 if not isinstance(userid, Optional):
985 1085 if has_superadmin_permission(apiuser) or is_repo_admin:
986 1086 apiuser = get_user_or_error(userid)
987 1087 else:
988 1088 raise JSONRPCError('userid is not the same as your user')
989 1089
990 1090 if pull_request.is_closed():
991 1091 raise JSONRPCError(
992 1092 'pull request `%s` is already closed' % (pullrequestid,))
993 1093
994 1094 # only owner or admin or person with write permissions
995 1095 allowed_to_close = PullRequestModel().check_user_update(
996 1096 pull_request, apiuser, api=True)
997 1097
998 1098 if not allowed_to_close:
999 1099 raise JSONRPCError(
1000 1100 'pull request `%s` close failed, no permission to close.' % (
1001 1101 pullrequestid,))
1002 1102
1003 1103 # message we're using to close the PR, else it's automatically generated
1004 1104 message = Optional.extract(message)
1005 1105
1006 1106 # finally close the PR, with proper message comment
1007 1107 comment, status = PullRequestModel().close_pull_request_with_comment(
1008 1108 pull_request, apiuser, repo, message=message, auth_user=apiuser)
1009 1109 status_lbl = ChangesetStatus.get_status_lbl(status)
1010 1110
1011 1111 Session().commit()
1012 1112
1013 1113 data = {
1014 1114 'pull_request_id': pull_request.pull_request_id,
1015 1115 'close_status': status_lbl,
1016 1116 'closed': True,
1017 1117 }
1018 1118 return data
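Closing a pull request via the API, as a minimal sketch (placeholder endpoint, token and id; the message is optional and auto-generated when omitted):

.. code-block:: python

    import requests

    payload = {
        'id': 6,
        'auth_token': '<auth_token>',
        'method': 'close_pull_request',
        'args': {'pullrequestid': 42, 'message': 'Superseded by a newer pull request'},
    }
    result = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    print(result['closed'], result['close_status'])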
@@ -1,2507 +1,2523 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import time
23 23
24 24 import rhodecode
25 25 from rhodecode.api import (
26 26 jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
27 27 from rhodecode.api.utils import (
28 28 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
29 29 get_user_group_or_error, get_user_or_error, validate_repo_permissions,
30 30 get_perm_or_error, parse_args, get_origin, build_commit_data,
31 31 validate_set_owner_permissions)
32 from rhodecode.lib import audit_logger, rc_cache
32 from rhodecode.lib import audit_logger, rc_cache, channelstream
33 33 from rhodecode.lib import repo_maintenance
34 34 from rhodecode.lib.auth import (
35 35 HasPermissionAnyApi, HasUserGroupPermissionAnyApi,
36 36 HasRepoPermissionAnyApi)
37 37 from rhodecode.lib.celerylib.utils import get_task_id
38 38 from rhodecode.lib.utils2 import (
39 39 str2bool, time_to_datetime, safe_str, safe_int, safe_unicode)
40 40 from rhodecode.lib.ext_json import json
41 41 from rhodecode.lib.exceptions import (
42 42 StatusChangeOnClosedPullRequestError, CommentVersionMismatch)
43 43 from rhodecode.lib.vcs import RepositoryError
44 44 from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError
45 45 from rhodecode.model.changeset_status import ChangesetStatusModel
46 46 from rhodecode.model.comment import CommentsModel
47 47 from rhodecode.model.db import (
48 48 Session, ChangesetStatus, RepositoryField, Repository, RepoGroup,
49 49 ChangesetComment)
50 50 from rhodecode.model.permission import PermissionModel
51 51 from rhodecode.model.pull_request import PullRequestModel
52 52 from rhodecode.model.repo import RepoModel
53 53 from rhodecode.model.scm import ScmModel, RepoList
54 54 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
55 55 from rhodecode.model import validation_schema
56 56 from rhodecode.model.validation_schema.schemas import repo_schema
57 57
58 58 log = logging.getLogger(__name__)
59 59
60 60
61 61 @jsonrpc_method()
62 62 def get_repo(request, apiuser, repoid, cache=Optional(True)):
63 63 """
64 64 Gets an existing repository by its name or repository_id.
65 65
66 66     The members section of the output returns the user groups and users
67 67     associated with that repository.
68 68
69 69 This command can only be run using an |authtoken| with admin rights,
70 70 or users with at least read rights to the |repo|.
71 71
72 72 :param apiuser: This is filled automatically from the |authtoken|.
73 73 :type apiuser: AuthUser
74 74 :param repoid: The repository name or repository id.
75 75 :type repoid: str or int
76 76 :param cache: use the cached value for last changeset
77 77     :type cache: Optional(bool)
78 78
79 79 Example output:
80 80
81 81 .. code-block:: bash
82 82
83 83 {
84 84 "error": null,
85 85 "id": <repo_id>,
86 86 "result": {
87 87 "clone_uri": null,
88 88 "created_on": "timestamp",
89 89 "description": "repo description",
90 90 "enable_downloads": false,
91 91 "enable_locking": false,
92 92 "enable_statistics": false,
93 93 "followers": [
94 94 {
95 95 "active": true,
96 96 "admin": false,
97 97 "api_key": "****************************************",
98 98 "api_keys": [
99 99 "****************************************"
100 100 ],
101 101 "email": "user@example.com",
102 102 "emails": [
103 103 "user@example.com"
104 104 ],
105 105 "extern_name": "rhodecode",
106 106 "extern_type": "rhodecode",
107 107 "firstname": "username",
108 108 "ip_addresses": [],
109 109 "language": null,
110 110 "last_login": "2015-09-16T17:16:35.854",
111 111 "lastname": "surname",
112 112 "user_id": <user_id>,
113 113 "username": "name"
114 114 }
115 115 ],
116 116 "fork_of": "parent-repo",
117 117 "landing_rev": [
118 118 "rev",
119 119 "tip"
120 120 ],
121 121 "last_changeset": {
122 122 "author": "User <user@example.com>",
123 123 "branch": "default",
124 124 "date": "timestamp",
125 125 "message": "last commit message",
126 126 "parents": [
127 127 {
128 128 "raw_id": "commit-id"
129 129 }
130 130 ],
131 131 "raw_id": "commit-id",
132 132 "revision": <revision number>,
133 133 "short_id": "short id"
134 134 },
135 135 "lock_reason": null,
136 136 "locked_by": null,
137 137 "locked_date": null,
138 138 "owner": "owner-name",
139 139 "permissions": [
140 140 {
141 141 "name": "super-admin-name",
142 142 "origin": "super-admin",
143 143 "permission": "repository.admin",
144 144 "type": "user"
145 145 },
146 146 {
147 147 "name": "owner-name",
148 148 "origin": "owner",
149 149 "permission": "repository.admin",
150 150 "type": "user"
151 151 },
152 152 {
153 153 "name": "user-group-name",
154 154 "origin": "permission",
155 155 "permission": "repository.write",
156 156 "type": "user_group"
157 157 }
158 158 ],
159 159 "private": true,
160 160 "repo_id": 676,
161 161 "repo_name": "user-group/repo-name",
162 162 "repo_type": "hg"
163 163 }
164 164 }
165 165 """
166 166
167 167 repo = get_repo_or_error(repoid)
168 168 cache = Optional.extract(cache)
169 169
170 170 include_secrets = False
171 171 if has_superadmin_permission(apiuser):
172 172 include_secrets = True
173 173 else:
174 174 # check if we have at least read permission for this repo !
175 175 _perms = (
176 176 'repository.admin', 'repository.write', 'repository.read',)
177 177 validate_repo_permissions(apiuser, repoid, repo, _perms)
178 178
179 179 permissions = []
180 180 for _user in repo.permissions():
181 181 user_data = {
182 182 'name': _user.username,
183 183 'permission': _user.permission,
184 184 'origin': get_origin(_user),
185 185 'type': "user",
186 186 }
187 187 permissions.append(user_data)
188 188
189 189 for _user_group in repo.permission_user_groups():
190 190 user_group_data = {
191 191 'name': _user_group.users_group_name,
192 192 'permission': _user_group.permission,
193 193 'origin': get_origin(_user_group),
194 194 'type': "user_group",
195 195 }
196 196 permissions.append(user_group_data)
197 197
198 198 following_users = [
199 199 user.user.get_api_data(include_secrets=include_secrets)
200 200 for user in repo.followers]
201 201
202 202 if not cache:
203 203 repo.update_commit_cache()
204 204 data = repo.get_api_data(include_secrets=include_secrets)
205 205 data['permissions'] = permissions
206 206 data['followers'] = following_users
207 207 return data
208 208
209 209
210 210 @jsonrpc_method()
211 211 def get_repos(request, apiuser, root=Optional(None), traverse=Optional(True)):
212 212 """
213 213 Lists all existing repositories.
214 214
215 215 This command can only be run using an |authtoken| with admin rights,
216 216 or users with at least read rights to |repos|.
217 217
218 218 :param apiuser: This is filled automatically from the |authtoken|.
219 219 :type apiuser: AuthUser
220 220     :param root: Specify the root repository group from which to fetch repositories.
221 221         Filters the returned repositories to members of the given root group.
222 222 :type root: Optional(None)
223 223     :param traverse: Traverse the given root into subrepositories. With this flag
224 224         set to False, it will only return top-level repositories from `root`.
225 225         If root is empty, it will return just top-level repositories.
226 226 :type traverse: Optional(True)
227 227
228 228
229 229 Example output:
230 230
231 231 .. code-block:: bash
232 232
233 233 id : <id_given_in_input>
234 234 result: [
235 235 {
236 236 "repo_id" : "<repo_id>",
237 237 "repo_name" : "<reponame>",
238 238 "repo_type" : "<repo_type>",
239 239 "clone_uri" : "<clone_uri>",
240 240 "private" : "<bool>",
241 241 "created_on" : "<datetimecreated>",
242 242 "description" : "<description>",
243 243 "landing_rev": "<landing_rev>",
244 244 "owner": "<repo_owner>",
245 245 "fork_of": "<name_of_fork_parent>",
246 246 "enable_downloads": "<bool>",
247 247 "enable_locking": "<bool>",
248 248 "enable_statistics": "<bool>",
249 249 },
250 250 ...
251 251 ]
252 252 error: null
253 253 """
254 254
255 255 include_secrets = has_superadmin_permission(apiuser)
256 256 _perms = ('repository.read', 'repository.write', 'repository.admin',)
257 257 extras = {'user': apiuser}
258 258
259 259 root = Optional.extract(root)
260 260 traverse = Optional.extract(traverse, binary=True)
261 261
262 262 if root:
263 263     # verify parent existence, if it's empty return an error
264 264 parent = RepoGroup.get_by_group_name(root)
265 265 if not parent:
266 266 raise JSONRPCError(
267 267 'Root repository group `{}` does not exist'.format(root))
268 268
269 269 if traverse:
270 270 repos = RepoModel().get_repos_for_root(root=root, traverse=traverse)
271 271 else:
272 272 repos = RepoModel().get_repos_for_root(root=parent)
273 273 else:
274 274 if traverse:
275 275 repos = RepoModel().get_all()
276 276 else:
277 277 # return just top-level
278 278 repos = RepoModel().get_repos_for_root(root=None)
279 279
280 280 repo_list = RepoList(repos, perm_set=_perms, extra_kwargs=extras)
281 281 return [repo.get_api_data(include_secrets=include_secrets)
282 282 for repo in repo_list]
283 283
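A short usage sketch of the method above, listing only the top-level repositories of one repository group; the group name, endpoint and token are placeholders.

.. code-block:: python

    import requests

    payload = {
        'id': 7,
        'auth_token': '<auth_token>',
        'method': 'get_repos',
        'args': {'root': 'my-group', 'traverse': False},
    }
    repos = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    names = sorted(r['repo_name'] for r in repos)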
284 284
285 285 @jsonrpc_method()
286 286 def get_repo_changeset(request, apiuser, repoid, revision,
287 287 details=Optional('basic')):
288 288 """
289 289 Returns information about a changeset.
290 290
291 291     Additional parameters define the amount of detail returned by
292 292 this function.
293 293
294 294 This command can only be run using an |authtoken| with admin rights,
295 295 or users with at least read rights to the |repo|.
296 296
297 297 :param apiuser: This is filled automatically from the |authtoken|.
298 298 :type apiuser: AuthUser
299 299 :param repoid: The repository name or repository id
300 300 :type repoid: str or int
301 301 :param revision: revision for which listing should be done
302 302 :type revision: str
303 303     :param details: details can be 'basic', 'extended' or 'full'. 'full' gives diff
304 304         info like the diff itself, and the number of changed files, etc.
305 305 :type details: Optional(str)
306 306
307 307 """
308 308 repo = get_repo_or_error(repoid)
309 309 if not has_superadmin_permission(apiuser):
310 310 _perms = ('repository.admin', 'repository.write', 'repository.read',)
311 311 validate_repo_permissions(apiuser, repoid, repo, _perms)
312 312
313 313 changes_details = Optional.extract(details)
314 314 _changes_details_types = ['basic', 'extended', 'full']
315 315 if changes_details not in _changes_details_types:
316 316 raise JSONRPCError(
317 317 'ret_type must be one of %s' % (
318 318 ','.join(_changes_details_types)))
319 319
320 320 pre_load = ['author', 'branch', 'date', 'message', 'parents',
321 321 'status', '_commit', '_file_paths']
322 322
323 323 try:
324 324 cs = repo.get_commit(commit_id=revision, pre_load=pre_load)
325 325 except TypeError as e:
326 326 raise JSONRPCError(safe_str(e))
327 327 _cs_json = cs.__json__()
328 328 _cs_json['diff'] = build_commit_data(cs, changes_details)
329 329 if changes_details == 'full':
330 330 _cs_json['refs'] = cs._get_refs()
331 331 return _cs_json
332 332
333 333
334 334 @jsonrpc_method()
335 335 def get_repo_changesets(request, apiuser, repoid, start_rev, limit,
336 336 details=Optional('basic')):
337 337 """
338 338 Returns a set of commits limited by the number starting
339 339 from the `start_rev` option.
340 340
341 341 Additional parameters define the amount of details returned by this
342 342 function.
343 343
344 344 This command can only be run using an |authtoken| with admin rights,
345 345 or users with at least read rights to |repos|.
346 346
347 347 :param apiuser: This is filled automatically from the |authtoken|.
348 348 :type apiuser: AuthUser
349 349 :param repoid: The repository name or repository ID.
350 350 :type repoid: str or int
351 351 :param start_rev: The starting revision from where to get changesets.
352 352 :type start_rev: str
353 353 :param limit: Limit the number of commits to this amount
354 354 :type limit: str or int
355 355     :param details: Set the level of detail returned. Valid options are:
356 356 ``basic``, ``extended`` and ``full``.
357 357 :type details: Optional(str)
358 358
359 359 .. note::
360 360
361 361 Setting the parameter `details` to the value ``full`` is extensive
362 362 and returns details like the diff itself, and the number
363 363 of changed files.
364 364
365 365 """
366 366 repo = get_repo_or_error(repoid)
367 367 if not has_superadmin_permission(apiuser):
368 368 _perms = ('repository.admin', 'repository.write', 'repository.read',)
369 369 validate_repo_permissions(apiuser, repoid, repo, _perms)
370 370
371 371 changes_details = Optional.extract(details)
372 372 _changes_details_types = ['basic', 'extended', 'full']
373 373 if changes_details not in _changes_details_types:
374 374 raise JSONRPCError(
375 375 'ret_type must be one of %s' % (
376 376 ','.join(_changes_details_types)))
377 377
378 378 limit = int(limit)
379 379 pre_load = ['author', 'branch', 'date', 'message', 'parents',
380 380 'status', '_commit', '_file_paths']
381 381
382 382 vcs_repo = repo.scm_instance()
383 383 # SVN needs a special case to distinguish its index and commit id
384 384 if vcs_repo and vcs_repo.alias == 'svn' and (start_rev == '0'):
385 385 start_rev = vcs_repo.commit_ids[0]
386 386
387 387 try:
388 388 commits = vcs_repo.get_commits(
389 389 start_id=start_rev, pre_load=pre_load, translate_tags=False)
390 390 except TypeError as e:
391 391 raise JSONRPCError(safe_str(e))
392 392 except Exception:
393 393 log.exception('Fetching of commits failed')
394 394 raise JSONRPCError('Error occurred during commit fetching')
395 395
396 396 ret = []
397 397 for cnt, commit in enumerate(commits):
398 398 if cnt >= limit != -1:
399 399 break
400 400 _cs_json = commit.__json__()
401 401 _cs_json['diff'] = build_commit_data(commit, changes_details)
402 402 if changes_details == 'full':
403 403 _cs_json['refs'] = {
404 404 'branches': [commit.branch],
405 405 'bookmarks': getattr(commit, 'bookmarks', []),
406 406 'tags': commit.tags
407 407 }
408 408 ret.append(_cs_json)
409 409 return ret
410 410
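As a hedged sketch of the method above (placeholder repo, endpoint and token): fetching the first 50 commits with basic details. The ``start_rev`` value ``'0'`` assumes a Mercurial or SVN style first revision; use a commit hash or branch head as appropriate for Git.

.. code-block:: python

    import requests

    payload = {
        'id': 8,
        'auth_token': '<auth_token>',
        'method': 'get_repo_changesets',
        'args': {'repoid': 'my-repo', 'start_rev': '0', 'limit': 50, 'details': 'basic'},
    }
    commits = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    print('%d commits returned' % len(commits))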
411 411
412 412 @jsonrpc_method()
413 413 def get_repo_nodes(request, apiuser, repoid, revision, root_path,
414 414 ret_type=Optional('all'), details=Optional('basic'),
415 415 max_file_bytes=Optional(None)):
416 416 """
417 417 Returns a list of nodes and children in a flat list for a given
418 418 path at given revision.
419 419
420 420 It's possible to specify ret_type to show only `files` or `dirs`.
421 421
422 422 This command can only be run using an |authtoken| with admin rights,
423 423 or users with at least read rights to |repos|.
424 424
425 425 :param apiuser: This is filled automatically from the |authtoken|.
426 426 :type apiuser: AuthUser
427 427 :param repoid: The repository name or repository ID.
428 428 :type repoid: str or int
429 429 :param revision: The revision for which listing should be done.
430 430 :type revision: str
431 431 :param root_path: The path from which to start displaying.
432 432 :type root_path: str
433 433 :param ret_type: Set the return type. Valid options are
434 434 ``all`` (default), ``files`` and ``dirs``.
435 435 :type ret_type: Optional(str)
436 436 :param details: Returns extended information about nodes, such as
437 437         md5, binary, and/or content.
438 438 The valid options are ``basic`` and ``full``.
439 439 :type details: Optional(str)
440 440     :param max_file_bytes: Only return file content for files under this size, in bytes
441 441     :type max_file_bytes: Optional(int)
442 442
443 443 Example output:
444 444
445 445 .. code-block:: bash
446 446
447 447 id : <id_given_in_input>
448 448 result: [
449 449 {
450 450 "binary": false,
451 451 "content": "File line",
452 452 "extension": "md",
453 453 "lines": 2,
454 454 "md5": "059fa5d29b19c0657e384749480f6422",
455 455 "mimetype": "text/x-minidsrc",
456 456 "name": "file.md",
457 457 "size": 580,
458 458 "type": "file"
459 459 },
460 460 ...
461 461 ]
462 462 error: null
463 463 """
464 464
465 465 repo = get_repo_or_error(repoid)
466 466 if not has_superadmin_permission(apiuser):
467 467 _perms = ('repository.admin', 'repository.write', 'repository.read',)
468 468 validate_repo_permissions(apiuser, repoid, repo, _perms)
469 469
470 470 ret_type = Optional.extract(ret_type)
471 471 details = Optional.extract(details)
472 472 _extended_types = ['basic', 'full']
473 473 if details not in _extended_types:
474 474 raise JSONRPCError('ret_type must be one of %s' % (','.join(_extended_types)))
475 475 extended_info = False
476 476 content = False
477 477 if details == 'basic':
478 478 extended_info = True
479 479
480 480 if details == 'full':
481 481 extended_info = content = True
482 482
483 483 _map = {}
484 484 try:
485 485 # check if repo is not empty by any chance, skip quicker if it is.
486 486 _scm = repo.scm_instance()
487 487 if _scm.is_empty():
488 488 return []
489 489
490 490 _d, _f = ScmModel().get_nodes(
491 491 repo, revision, root_path, flat=False,
492 492 extended_info=extended_info, content=content,
493 493 max_file_bytes=max_file_bytes)
494 494 _map = {
495 495 'all': _d + _f,
496 496 'files': _f,
497 497 'dirs': _d,
498 498 }
499 499 return _map[ret_type]
500 500 except KeyError:
501 501 raise JSONRPCError(
502 502 'ret_type must be one of %s' % (','.join(sorted(_map.keys()))))
503 503 except Exception:
504 504 log.exception("Exception occurred while trying to get repo nodes")
505 505 raise JSONRPCError(
506 506 'failed to get repo: `%s` nodes' % repo.repo_name
507 507 )
508 508
509 509
510 510 @jsonrpc_method()
511 511 def get_repo_file(request, apiuser, repoid, commit_id, file_path,
512 512 max_file_bytes=Optional(None), details=Optional('basic'),
513 513 cache=Optional(True)):
514 514 """
515 515 Returns a single file from repository at given revision.
516 516
517 517 This command can only be run using an |authtoken| with admin rights,
518 518 or users with at least read rights to |repos|.
519 519
520 520 :param apiuser: This is filled automatically from the |authtoken|.
521 521 :type apiuser: AuthUser
522 522 :param repoid: The repository name or repository ID.
523 523 :type repoid: str or int
524 524 :param commit_id: The revision for which listing should be done.
525 525 :type commit_id: str
526 526 :param file_path: The path from which to start displaying.
527 527 :type file_path: str
528 528 :param details: Returns different set of information about nodes.
529 529 The valid options are ``minimal`` ``basic`` and ``full``.
530 530 :type details: Optional(str)
531 531 :param max_file_bytes: Only return file content under this file size bytes
532 532 :type max_file_bytes: Optional(int)
533 533 :param cache: Use internal caches for fetching files. If disabled fetching
534 534 files is slower but more memory efficient
535 535 :type cache: Optional(bool)
536 536
537 537 Example output:
538 538
539 539 .. code-block:: bash
540 540
541 541 id : <id_given_in_input>
542 542 result: {
543 543 "binary": false,
544 544 "extension": "py",
545 545 "lines": 35,
546 546 "content": "....",
547 547 "md5": "76318336366b0f17ee249e11b0c99c41",
548 548 "mimetype": "text/x-python",
549 549 "name": "python.py",
550 550 "size": 817,
551 551 "type": "file",
552 552 }
553 553 error: null
554 554 """
555 555
556 556 repo = get_repo_or_error(repoid)
557 557 if not has_superadmin_permission(apiuser):
558 558 _perms = ('repository.admin', 'repository.write', 'repository.read',)
559 559 validate_repo_permissions(apiuser, repoid, repo, _perms)
560 560
561 561 cache = Optional.extract(cache, binary=True)
562 562 details = Optional.extract(details)
563 563 _extended_types = ['minimal', 'minimal+search', 'basic', 'full']
564 564 if details not in _extended_types:
565 565 raise JSONRPCError(
566 566             'ret_type must be one of %s, got %s' % (','.join(_extended_types), details))
567 567 extended_info = False
568 568 content = False
569 569
570 570 if details == 'minimal':
571 571 extended_info = False
572 572
573 573 elif details == 'basic':
574 574 extended_info = True
575 575
576 576 elif details == 'full':
577 577 extended_info = content = True
578 578
579 579 file_path = safe_unicode(file_path)
580 580 try:
581 581 # check if repo is not empty by any chance, skip quicker if it is.
582 582 _scm = repo.scm_instance()
583 583 if _scm.is_empty():
584 584 return None
585 585
586 586 node = ScmModel().get_node(
587 587 repo, commit_id, file_path, extended_info=extended_info,
588 588 content=content, max_file_bytes=max_file_bytes, cache=cache)
589 589 except NodeDoesNotExistError:
590 590 raise JSONRPCError(u'There is no file in repo: `{}` at path `{}` for commit: `{}`'.format(
591 591 repo.repo_name, file_path, commit_id))
592 592 except Exception:
593 593 log.exception(u"Exception occurred while trying to get repo %s file",
594 594 repo.repo_name)
595 595 raise JSONRPCError(u'failed to get repo: `{}` file at path {}'.format(
596 596 repo.repo_name, file_path))
597 597
598 598 return node
599 599
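A sketch of reading one file's content through this method; the repo name, commit id, path, endpoint and token are all placeholders.

.. code-block:: python

    import requests

    payload = {
        'id': 9,
        'auth_token': '<auth_token>',
        'method': 'get_repo_file',
        'args': {
            'repoid': 'my-repo',
            'commit_id': '<commit_id_or_branch>',
            'file_path': 'README.rst',
            'details': 'full',              # include the file content
            'max_file_bytes': 1024 * 1024,  # skip content for files over 1 MB
        },
    }
    node = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()['result']
    if node and not node['binary']:
        print(node['content'])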
600 600
601 601 @jsonrpc_method()
602 602 def get_repo_fts_tree(request, apiuser, repoid, commit_id, root_path):
603 603 """
604 604     Returns a list of tree nodes for a path at a given revision. This API is built
605 605     strictly for usage in full text search index building, and shouldn't be consumed otherwise.
606 606
607 607 This command can only be run using an |authtoken| with admin rights,
608 608 or users with at least read rights to |repos|.
609 609
610 610 """
611 611
612 612 repo = get_repo_or_error(repoid)
613 613 if not has_superadmin_permission(apiuser):
614 614 _perms = ('repository.admin', 'repository.write', 'repository.read',)
615 615 validate_repo_permissions(apiuser, repoid, repo, _perms)
616 616
617 617 repo_id = repo.repo_id
618 618 cache_seconds = safe_int(rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
619 619 cache_on = cache_seconds > 0
620 620
621 621 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
622 622 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
623 623
624 624 def compute_fts_tree(cache_ver, repo_id, commit_id, root_path):
625 625 return ScmModel().get_fts_data(repo_id, commit_id, root_path)
626 626
627 627 try:
628 628 # check if repo is not empty by any chance, skip quicker if it is.
629 629 _scm = repo.scm_instance()
630 630 if _scm.is_empty():
631 631 return []
632 632 except RepositoryError:
633 633 log.exception("Exception occurred while trying to get repo nodes")
634 634 raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)
635 635
636 636 try:
637 637 # we need to resolve commit_id to a FULL sha for cache to work correctly.
638 638 # sending 'master' is a pointer that needs to be translated to current commit.
639 639 commit_id = _scm.get_commit(commit_id=commit_id).raw_id
640 640 log.debug(
641 641 'Computing FTS REPO TREE for repo_id %s commit_id `%s` '
642 642 'with caching: %s[TTL: %ss]' % (
643 643 repo_id, commit_id, cache_on, cache_seconds or 0))
644 644
645 645 tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path)
646 646 return tree_files
647 647
648 648 except Exception:
649 649 log.exception("Exception occurred while trying to get repo nodes")
650 650 raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)
651 651
652 652
653 653 @jsonrpc_method()
654 654 def get_repo_refs(request, apiuser, repoid):
655 655 """
656 656 Returns a dictionary of current references. It returns
657 657 bookmarks, branches, closed_branches, and tags for the given repository.
658 658
661 661 This command can only be run using an |authtoken| with admin rights,
662 662 or users with at least read rights to |repos|.
663 663
664 664 :param apiuser: This is filled automatically from the |authtoken|.
665 665 :type apiuser: AuthUser
666 666 :param repoid: The repository name or repository ID.
667 667 :type repoid: str or int
668 668
669 669 Example output:
670 670
671 671 .. code-block:: bash
672 672
673 673 id : <id_given_in_input>
674 674 "result": {
675 675 "bookmarks": {
676 676 "dev": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
677 677 "master": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
678 678 },
679 679 "branches": {
680 680 "default": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
681 681 "stable": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
682 682 },
683 683 "branches_closed": {},
684 684 "tags": {
685 685 "tip": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
686 686 "v4.4.0": "1232313f9e6adac5ce5399c2a891dc1e72b79022",
687 687 "v4.4.1": "cbb9f1d329ae5768379cdec55a62ebdd546c4e27",
688 688 "v4.4.2": "24ffe44a27fcd1c5b6936144e176b9f6dd2f3a17",
689 689 }
690 690 }
691 691 error: null
692 692 """
693 693
694 694 repo = get_repo_or_error(repoid)
695 695 if not has_superadmin_permission(apiuser):
696 696 _perms = ('repository.admin', 'repository.write', 'repository.read',)
697 697 validate_repo_permissions(apiuser, repoid, repo, _perms)
698 698
699 699 try:
700 700 # check if repo is not empty by any chance, skip quicker if it is.
701 701 vcs_instance = repo.scm_instance()
702 702 refs = vcs_instance.refs()
703 703 return refs
704 704 except Exception:
705 705 log.exception("Exception occurred while trying to get repo refs")
706 706 raise JSONRPCError(
707 707 'failed to get repo: `%s` references' % repo.repo_name
708 708 )
709 709
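# Illustrative sketch (not part of the original module): an example JSON-RPC
# payload for `get_repo_refs`. The `auth_token` field name is an assumption
# (older setups may use `api_key`); it is sent the same way as in the sketch above.
_example_get_repo_refs_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'get_repo_refs',
    'args': {'repoid': '<reponame or repo_id>'},
}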
710 710
711 711 @jsonrpc_method()
712 712 def create_repo(
713 713 request, apiuser, repo_name, repo_type,
714 714 owner=Optional(OAttr('apiuser')),
715 715 description=Optional(''),
716 716 private=Optional(False),
717 717 clone_uri=Optional(None),
718 718 push_uri=Optional(None),
719 719 landing_rev=Optional(None),
720 720 enable_statistics=Optional(False),
721 721 enable_locking=Optional(False),
722 722 enable_downloads=Optional(False),
723 723 copy_permissions=Optional(False)):
724 724 """
725 725 Creates a repository.
726 726
727 727 * If the repository name contains "/", the repository will be created inside
728 728 a repository group or nested repository groups
729 729
730 730 For example "foo/bar/repo1" will create a |repo| called "repo1" inside
731 731 group "foo/bar". You must have permissions to access and write to
732 732 the last repository group ("bar" in this example).
733 733
734 734 This command can only be run using an |authtoken| with at least
735 735 permissions to create repositories, or write permissions to
736 736 parent repository groups.
737 737
738 738 :param apiuser: This is filled automatically from the |authtoken|.
739 739 :type apiuser: AuthUser
740 740 :param repo_name: Set the repository name.
741 741 :type repo_name: str
742 742 :param repo_type: Set the repository type; 'hg','git', or 'svn'.
743 743 :type repo_type: str
744 744 :param owner: user_id or username
745 745 :type owner: Optional(str)
746 746 :param description: Set the repository description.
747 747 :type description: Optional(str)
748 748 :param private: set repository as private
749 749 :type private: bool
750 750 :param clone_uri: set clone_uri
751 751 :type clone_uri: str
752 752 :param push_uri: set push_uri
753 753 :type push_uri: str
754 754 :param landing_rev: <rev_type>:<rev>, e.g. branch:default, book:dev, rev:abcd
755 755 :type landing_rev: str
756 756 :param enable_locking:
757 757 :type enable_locking: bool
758 758 :param enable_downloads:
759 759 :type enable_downloads: bool
760 760 :param enable_statistics:
761 761 :type enable_statistics: bool
762 762 :param copy_permissions: Copy permission from group in which the
763 763 repository is being created.
764 764 :type copy_permissions: bool
765 765
766 766
767 767 Example output:
768 768
769 769 .. code-block:: bash
770 770
771 771 id : <id_given_in_input>
772 772 result: {
773 773 "msg": "Created new repository `<reponame>`",
774 774 "success": true,
775 775 "task": "<celery task id or None if done sync>"
776 776 }
777 777 error: null
778 778
779 779
780 780 Example error output:
781 781
782 782 .. code-block:: bash
783 783
784 784 id : <id_given_in_input>
785 785 result : null
786 786 error : {
787 787 'failed to create repository `<repo_name>`'
788 788 }
789 789
790 790 """
791 791
792 792 owner = validate_set_owner_permissions(apiuser, owner)
793 793
794 794 description = Optional.extract(description)
795 795 copy_permissions = Optional.extract(copy_permissions)
796 796 clone_uri = Optional.extract(clone_uri)
797 797 push_uri = Optional.extract(push_uri)
798 798
799 799 defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
800 800 if isinstance(private, Optional):
801 801 private = defs.get('repo_private') or Optional.extract(private)
802 802 if isinstance(repo_type, Optional):
803 803 repo_type = defs.get('repo_type')
804 804 if isinstance(enable_statistics, Optional):
805 805 enable_statistics = defs.get('repo_enable_statistics')
806 806 if isinstance(enable_locking, Optional):
807 807 enable_locking = defs.get('repo_enable_locking')
808 808 if isinstance(enable_downloads, Optional):
809 809 enable_downloads = defs.get('repo_enable_downloads')
810 810
811 811 landing_ref, _label = ScmModel.backend_landing_ref(repo_type)
812 812 ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
813 813 ref_choices = list(set(ref_choices + [landing_ref]))
814 814
815 815 landing_commit_ref = Optional.extract(landing_rev) or landing_ref
816 816
817 817 schema = repo_schema.RepoSchema().bind(
818 818 repo_type_options=rhodecode.BACKENDS.keys(),
819 819 repo_ref_options=ref_choices,
820 820 repo_type=repo_type,
821 821 # user caller
822 822 user=apiuser)
823 823
824 824 try:
825 825 schema_data = schema.deserialize(dict(
826 826 repo_name=repo_name,
827 827 repo_type=repo_type,
828 828 repo_owner=owner.username,
829 829 repo_description=description,
830 830 repo_landing_commit_ref=landing_commit_ref,
831 831 repo_clone_uri=clone_uri,
832 832 repo_push_uri=push_uri,
833 833 repo_private=private,
834 834 repo_copy_permissions=copy_permissions,
835 835 repo_enable_statistics=enable_statistics,
836 836 repo_enable_downloads=enable_downloads,
837 837 repo_enable_locking=enable_locking))
838 838 except validation_schema.Invalid as err:
839 839 raise JSONRPCValidationError(colander_exc=err)
840 840
841 841 try:
842 842 data = {
843 843 'owner': owner,
844 844 'repo_name': schema_data['repo_group']['repo_name_without_group'],
845 845 'repo_name_full': schema_data['repo_name'],
846 846 'repo_group': schema_data['repo_group']['repo_group_id'],
847 847 'repo_type': schema_data['repo_type'],
848 848 'repo_description': schema_data['repo_description'],
849 849 'repo_private': schema_data['repo_private'],
850 850 'clone_uri': schema_data['repo_clone_uri'],
851 851 'push_uri': schema_data['repo_push_uri'],
852 852 'repo_landing_rev': schema_data['repo_landing_commit_ref'],
853 853 'enable_statistics': schema_data['repo_enable_statistics'],
854 854 'enable_locking': schema_data['repo_enable_locking'],
855 855 'enable_downloads': schema_data['repo_enable_downloads'],
856 856 'repo_copy_permissions': schema_data['repo_copy_permissions'],
857 857 }
858 858
859 859 task = RepoModel().create(form_data=data, cur_user=owner.user_id)
860 860 task_id = get_task_id(task)
861 861 # no commit, it's done in RepoModel, or async via celery
862 862 return {
863 863 'msg': "Created new repository `%s`" % (schema_data['repo_name'],),
864 864 'success': True, # cannot return the repo data here since fork
865 865 # can be done async
866 866 'task': task_id
867 867 }
868 868 except Exception:
869 869 log.exception(
870 870 u"Exception while trying to create the repository %s",
871 871 schema_data['repo_name'])
872 872 raise JSONRPCError(
873 873 'failed to create repository `%s`' % (schema_data['repo_name'],))
874 874
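# Illustrative sketch (not part of the original module): creating a repository
# inside nested groups, mirroring the docstring example above. Argument names come
# from the method signature; the payload envelope follows the earlier sketch.
_example_create_repo_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'create_repo',
    'args': {
        'repo_name': 'foo/bar/repo1',   # created inside group "foo/bar"
        'repo_type': 'git',             # 'hg', 'git', or 'svn'
        'description': 'example repository',
        'private': True,
        'copy_permissions': True,       # inherit permissions from group "foo/bar"
    },
}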
875 875
876 876 @jsonrpc_method()
877 877 def add_field_to_repo(request, apiuser, repoid, key, label=Optional(''),
878 878 description=Optional('')):
879 879 """
880 880 Adds an extra field to a repository.
881 881
882 882 This command can only be run using an |authtoken| with at least
883 883 write permissions to the |repo|.
884 884
885 885 :param apiuser: This is filled automatically from the |authtoken|.
886 886 :type apiuser: AuthUser
887 887 :param repoid: Set the repository name or repository id.
888 888 :type repoid: str or int
889 889 :param key: Create a unique field key for this repository.
890 890 :type key: str
891 891 :param label:
892 892 :type label: Optional(str)
893 893 :param description:
894 894 :type description: Optional(str)
895 895 """
896 896 repo = get_repo_or_error(repoid)
897 897 if not has_superadmin_permission(apiuser):
898 898 _perms = ('repository.admin',)
899 899 validate_repo_permissions(apiuser, repoid, repo, _perms)
900 900
901 901 label = Optional.extract(label) or key
902 902 description = Optional.extract(description)
903 903
904 904 field = RepositoryField.get_by_key_name(key, repo)
905 905 if field:
906 906 raise JSONRPCError('Field with key '
907 907 '`%s` exists for repo `%s`' % (key, repoid))
908 908
909 909 try:
910 910 RepoModel().add_repo_field(repo, key, field_label=label,
911 911 field_desc=description)
912 912 Session().commit()
913 913 return {
914 914 'msg': "Added new repository field `%s`" % (key,),
915 915 'success': True,
916 916 }
917 917 except Exception:
918 918 log.exception("Exception occurred while trying to add field to repo")
919 919 raise JSONRPCError(
920 920 'failed to create new field for repository `%s`' % (repoid,))
921 921
922 922
923 923 @jsonrpc_method()
924 924 def remove_field_from_repo(request, apiuser, repoid, key):
925 925 """
926 926 Removes an extra field from a repository.
927 927
928 928 This command can only be run using an |authtoken| with at least
929 929 write permissions to the |repo|.
930 930
931 931 :param apiuser: This is filled automatically from the |authtoken|.
932 932 :type apiuser: AuthUser
933 933 :param repoid: Set the repository name or repository ID.
934 934 :type repoid: str or int
935 935 :param key: Set the unique field key for this repository.
936 936 :type key: str
937 937 """
938 938
939 939 repo = get_repo_or_error(repoid)
940 940 if not has_superadmin_permission(apiuser):
941 941 _perms = ('repository.admin',)
942 942 validate_repo_permissions(apiuser, repoid, repo, _perms)
943 943
944 944 field = RepositoryField.get_by_key_name(key, repo)
945 945 if not field:
946 946 raise JSONRPCError('Field with key `%s` does not '
947 947 'exist for repo `%s`' % (key, repoid))
948 948
949 949 try:
950 950 RepoModel().delete_repo_field(repo, field_key=key)
951 951 Session().commit()
952 952 return {
953 953 'msg': "Deleted repository field `%s`" % (key,),
954 954 'success': True,
955 955 }
956 956 except Exception:
957 957 log.exception(
958 958 "Exception occurred while trying to delete field from repo")
959 959 raise JSONRPCError(
960 960 'failed to delete field for repository `%s`' % (repoid,))
961 961
962 962
963 963 @jsonrpc_method()
964 964 def update_repo(
965 965 request, apiuser, repoid, repo_name=Optional(None),
966 966 owner=Optional(OAttr('apiuser')), description=Optional(''),
967 967 private=Optional(False),
968 968 clone_uri=Optional(None), push_uri=Optional(None),
969 969 landing_rev=Optional(None), fork_of=Optional(None),
970 970 enable_statistics=Optional(False),
971 971 enable_locking=Optional(False),
972 972 enable_downloads=Optional(False), fields=Optional('')):
973 973 """
974 974 Updates a repository with the given information.
975 975
976 976 This command can only be run using an |authtoken| with at least
977 977 admin permissions to the |repo|.
978 978
979 979 * If the repository name contains "/", the repository will be updated
980 980 and placed inside the corresponding repository group or nested repository groups
981 981
982 982 For example repoid=repo-test name="foo/bar/repo-test" will update the |repo|
983 983 called "repo-test" and place it inside group "foo/bar".
984 984 You must have permissions to access and write to the last repository
985 985 group ("bar" in this example).
986 986
987 987 :param apiuser: This is filled automatically from the |authtoken|.
988 988 :type apiuser: AuthUser
989 989 :param repoid: repository name or repository ID.
990 990 :type repoid: str or int
991 991 :param repo_name: Update the |repo| name, including the
992 992 repository group it's in.
993 993 :type repo_name: str
994 994 :param owner: Set the |repo| owner.
995 995 :type owner: str
996 996 :param fork_of: Set the |repo| as fork of another |repo|.
997 997 :type fork_of: str
998 998 :param description: Update the |repo| description.
999 999 :type description: str
1000 1000 :param private: Set the |repo| as private. (True | False)
1001 1001 :type private: bool
1002 1002 :param clone_uri: Update the |repo| clone URI.
1003 1003 :type clone_uri: str
1004 1004 :param landing_rev: Set the |repo| landing revision. e.g. branch:default, book:dev, rev:abcd
1005 1005 :type landing_rev: str
1006 1006 :param enable_statistics: Enable statistics on the |repo|, (True | False).
1007 1007 :type enable_statistics: bool
1008 1008 :param enable_locking: Enable |repo| locking.
1009 1009 :type enable_locking: bool
1010 1010 :param enable_downloads: Enable downloads from the |repo|, (True | False).
1011 1011 :type enable_downloads: bool
1012 1012 :param fields: Add extra fields to the |repo|. Use the following
1013 1013 example format: ``field_key=field_val,field_key2=fieldval2``.
1014 1014 To include a literal comma in a value, escape it as ``\,``
1015 1015 :type fields: str
1016 1016 """
1017 1017
1018 1018 repo = get_repo_or_error(repoid)
1019 1019
1020 1020 include_secrets = False
1021 1021 if not has_superadmin_permission(apiuser):
1022 1022 _perms = ('repository.admin',)
1023 1023 validate_repo_permissions(apiuser, repoid, repo, _perms)
1024 1024 else:
1025 1025 include_secrets = True
1026 1026
1027 1027 updates = dict(
1028 1028 repo_name=repo_name
1029 1029 if not isinstance(repo_name, Optional) else repo.repo_name,
1030 1030
1031 1031 fork_id=fork_of
1032 1032 if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None,
1033 1033
1034 1034 user=owner
1035 1035 if not isinstance(owner, Optional) else repo.user.username,
1036 1036
1037 1037 repo_description=description
1038 1038 if not isinstance(description, Optional) else repo.description,
1039 1039
1040 1040 repo_private=private
1041 1041 if not isinstance(private, Optional) else repo.private,
1042 1042
1043 1043 clone_uri=clone_uri
1044 1044 if not isinstance(clone_uri, Optional) else repo.clone_uri,
1045 1045
1046 1046 push_uri=push_uri
1047 1047 if not isinstance(push_uri, Optional) else repo.push_uri,
1048 1048
1049 1049 repo_landing_rev=landing_rev
1050 1050 if not isinstance(landing_rev, Optional) else repo._landing_revision,
1051 1051
1052 1052 repo_enable_statistics=enable_statistics
1053 1053 if not isinstance(enable_statistics, Optional) else repo.enable_statistics,
1054 1054
1055 1055 repo_enable_locking=enable_locking
1056 1056 if not isinstance(enable_locking, Optional) else repo.enable_locking,
1057 1057
1058 1058 repo_enable_downloads=enable_downloads
1059 1059 if not isinstance(enable_downloads, Optional) else repo.enable_downloads)
1060 1060
1061 1061 landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
1062 1062 ref_choices, _labels = ScmModel().get_repo_landing_revs(
1063 1063 request.translate, repo=repo)
1064 1064 ref_choices = list(set(ref_choices + [landing_ref]))
1065 1065
1066 1066 old_values = repo.get_api_data()
1067 1067 repo_type = repo.repo_type
1068 1068 schema = repo_schema.RepoSchema().bind(
1069 1069 repo_type_options=rhodecode.BACKENDS.keys(),
1070 1070 repo_ref_options=ref_choices,
1071 1071 repo_type=repo_type,
1072 1072 # user caller
1073 1073 user=apiuser,
1074 1074 old_values=old_values)
1075 1075 try:
1076 1076 schema_data = schema.deserialize(dict(
1077 1077 # we save old value, users cannot change type
1078 1078 repo_type=repo_type,
1079 1079
1080 1080 repo_name=updates['repo_name'],
1081 1081 repo_owner=updates['user'],
1082 1082 repo_description=updates['repo_description'],
1083 1083 repo_clone_uri=updates['clone_uri'],
1084 1084 repo_push_uri=updates['push_uri'],
1085 1085 repo_fork_of=updates['fork_id'],
1086 1086 repo_private=updates['repo_private'],
1087 1087 repo_landing_commit_ref=updates['repo_landing_rev'],
1088 1088 repo_enable_statistics=updates['repo_enable_statistics'],
1089 1089 repo_enable_downloads=updates['repo_enable_downloads'],
1090 1090 repo_enable_locking=updates['repo_enable_locking']))
1091 1091 except validation_schema.Invalid as err:
1092 1092 raise JSONRPCValidationError(colander_exc=err)
1093 1093
1094 1094 # save validated data back into the updates dict
1095 1095 validated_updates = dict(
1096 1096 repo_name=schema_data['repo_group']['repo_name_without_group'],
1097 1097 repo_group=schema_data['repo_group']['repo_group_id'],
1098 1098
1099 1099 user=schema_data['repo_owner'],
1100 1100 repo_description=schema_data['repo_description'],
1101 1101 repo_private=schema_data['repo_private'],
1102 1102 clone_uri=schema_data['repo_clone_uri'],
1103 1103 push_uri=schema_data['repo_push_uri'],
1104 1104 repo_landing_rev=schema_data['repo_landing_commit_ref'],
1105 1105 repo_enable_statistics=schema_data['repo_enable_statistics'],
1106 1106 repo_enable_locking=schema_data['repo_enable_locking'],
1107 1107 repo_enable_downloads=schema_data['repo_enable_downloads'],
1108 1108 )
1109 1109
1110 1110 if schema_data['repo_fork_of']:
1111 1111 fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
1112 1112 validated_updates['fork_id'] = fork_repo.repo_id
1113 1113
1114 1114 # extra fields
1115 1115 fields = parse_args(Optional.extract(fields), key_prefix='ex_')
1116 1116 if fields:
1117 1117 validated_updates.update(fields)
1118 1118
1119 1119 try:
1120 1120 RepoModel().update(repo, **validated_updates)
1121 1121 audit_logger.store_api(
1122 1122 'repo.edit', action_data={'old_data': old_values},
1123 1123 user=apiuser, repo=repo)
1124 1124 Session().commit()
1125 1125 return {
1126 1126 'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
1127 1127 'repository': repo.get_api_data(include_secrets=include_secrets)
1128 1128 }
1129 1129 except Exception:
1130 1130 log.exception(
1131 1131 u"Exception while trying to update the repository %s",
1132 1132 repoid)
1133 1133 raise JSONRPCError('failed to update repo `%s`' % repoid)
1134 1134
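# Illustrative sketch (not part of the original module): updating a repository,
# renaming it into group "foo/bar" and attaching extra fields using the
# ``field_key=field_val,field_key2=fieldval2`` format described in the docstring.
_example_update_repo_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'update_repo',
    'args': {
        'repoid': 'repo-test',
        'repo_name': 'foo/bar/repo-test',   # moves the repo into group "foo/bar"
        'landing_rev': 'branch:default',
        'fields': 'some_field=some_val,other_field=other_val',
    },
}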
1135 1135
1136 1136 @jsonrpc_method()
1137 1137 def fork_repo(request, apiuser, repoid, fork_name,
1138 1138 owner=Optional(OAttr('apiuser')),
1139 1139 description=Optional(''),
1140 1140 private=Optional(False),
1141 1141 clone_uri=Optional(None),
1142 1142 landing_rev=Optional(None),
1143 1143 copy_permissions=Optional(False)):
1144 1144 """
1145 1145 Creates a fork of the specified |repo|.
1146 1146
1147 1147 * If the fork_name contains "/", the fork will be created inside
1148 1148 a repository group or nested repository groups
1149 1149
1150 1150 For example "foo/bar/fork-repo" will create a fork called "fork-repo"
1151 1151 inside group "foo/bar". You must have permissions to access and
1152 1152 write to the last repository group ("bar" in this example).
1153 1153
1154 1154 This command can only be run using an |authtoken| with at least
1155 1155 read permissions to the forked repo, and fork creation permissions for the user.
1156 1156
1157 1157 :param apiuser: This is filled automatically from the |authtoken|.
1158 1158 :type apiuser: AuthUser
1159 1159 :param repoid: Set repository name or repository ID.
1160 1160 :type repoid: str or int
1161 1161 :param fork_name: Set the fork name, including its repository group membership.
1162 1162 :type fork_name: str
1163 1163 :param owner: Set the fork owner.
1164 1164 :type owner: str
1165 1165 :param description: Set the fork description.
1166 1166 :type description: str
1167 1167 :param copy_permissions: Copy permissions from parent |repo|. The
1168 1168 default is False.
1169 1169 :type copy_permissions: bool
1170 1170 :param private: Make the fork private. The default is False.
1171 1171 :type private: bool
1172 1172 :param landing_rev: Set the landing revision. e.g. branch:default, book:dev, rev:abcd
1173 1173
1174 1174 Example input:
1175 1175
1176 1176 .. code-block:: bash
1177 1177
1178 1178 id : <id_for_response>
1179 1179 api_key : "<api_key>"
1180 1180 args: {
1181 1181 "repoid" : "<reponame or repo_id>",
1182 1182 "fork_name": "<forkname>",
1183 1183 "owner": "<username or user_id = Optional(=apiuser)>",
1184 1184 "description": "<description>",
1185 1185 "copy_permissions": "<bool>",
1186 1186 "private": "<bool>",
1187 1187 "landing_rev": "<landing_rev>"
1188 1188 }
1189 1189
1190 1190 Example output:
1191 1191
1192 1192 .. code-block:: bash
1193 1193
1194 1194 id : <id_given_in_input>
1195 1195 result: {
1196 1196 "msg": "Created fork of `<reponame>` as `<forkname>`",
1197 1197 "success": true,
1198 1198 "task": "<celery task id or None if done sync>"
1199 1199 }
1200 1200 error: null
1201 1201
1202 1202 """
1203 1203
1204 1204 repo = get_repo_or_error(repoid)
1205 1205 repo_name = repo.repo_name
1206 1206
1207 1207 if not has_superadmin_permission(apiuser):
1208 1208 # check if we have at least read permission for
1209 1209 # this repo that we fork !
1210 1210 _perms = ('repository.admin', 'repository.write', 'repository.read')
1211 1211 validate_repo_permissions(apiuser, repoid, repo, _perms)
1212 1212
1213 1213 # check if the regular user has at least fork permissions as well
1214 1214 if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser):
1215 1215 raise JSONRPCForbidden()
1216 1216
1217 1217 # check if user can set owner parameter
1218 1218 owner = validate_set_owner_permissions(apiuser, owner)
1219 1219
1220 1220 description = Optional.extract(description)
1221 1221 copy_permissions = Optional.extract(copy_permissions)
1222 1222 clone_uri = Optional.extract(clone_uri)
1223 1223
1224 1224 landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
1225 1225 ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
1226 1226 ref_choices = list(set(ref_choices + [landing_ref]))
1227 1227 landing_commit_ref = Optional.extract(landing_rev) or landing_ref
1228 1228
1229 1229 private = Optional.extract(private)
1230 1230
1231 1231 schema = repo_schema.RepoSchema().bind(
1232 1232 repo_type_options=rhodecode.BACKENDS.keys(),
1233 1233 repo_ref_options=ref_choices,
1234 1234 repo_type=repo.repo_type,
1235 1235 # user caller
1236 1236 user=apiuser)
1237 1237
1238 1238 try:
1239 1239 schema_data = schema.deserialize(dict(
1240 1240 repo_name=fork_name,
1241 1241 repo_type=repo.repo_type,
1242 1242 repo_owner=owner.username,
1243 1243 repo_description=description,
1244 1244 repo_landing_commit_ref=landing_commit_ref,
1245 1245 repo_clone_uri=clone_uri,
1246 1246 repo_private=private,
1247 1247 repo_copy_permissions=copy_permissions))
1248 1248 except validation_schema.Invalid as err:
1249 1249 raise JSONRPCValidationError(colander_exc=err)
1250 1250
1251 1251 try:
1252 1252 data = {
1253 1253 'fork_parent_id': repo.repo_id,
1254 1254
1255 1255 'repo_name': schema_data['repo_group']['repo_name_without_group'],
1256 1256 'repo_name_full': schema_data['repo_name'],
1257 1257 'repo_group': schema_data['repo_group']['repo_group_id'],
1258 1258 'repo_type': schema_data['repo_type'],
1259 1259 'description': schema_data['repo_description'],
1260 1260 'private': schema_data['repo_private'],
1261 1261 'copy_permissions': schema_data['repo_copy_permissions'],
1262 1262 'landing_rev': schema_data['repo_landing_commit_ref'],
1263 1263 }
1264 1264
1265 1265 task = RepoModel().create_fork(data, cur_user=owner.user_id)
1266 1266 # no commit, it's done in RepoModel, or async via celery
1267 1267 task_id = get_task_id(task)
1268 1268
1269 1269 return {
1270 1270 'msg': 'Created fork of `%s` as `%s`' % (
1271 1271 repo.repo_name, schema_data['repo_name']),
1272 1272 'success': True, # cannot return the repo data here since fork
1273 1273 # can be done async
1274 1274 'task': task_id
1275 1275 }
1276 1276 except Exception:
1277 1277 log.exception(
1278 1278 u"Exception while trying to create fork %s",
1279 1279 schema_data['repo_name'])
1280 1280 raise JSONRPCError(
1281 1281 'failed to fork repository `%s` as `%s`' % (
1282 1282 repo_name, schema_data['repo_name']))
1283 1283
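# Illustrative sketch (not part of the original module): forking a repository
# into the nested group "foo/bar", as described in the docstring above.
_example_fork_repo_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'fork_repo',
    'args': {
        'repoid': '<reponame or repo_id>',
        'fork_name': 'foo/bar/fork-repo',
        'description': 'fork of <reponame>',
        'copy_permissions': False,
        'private': False,
        'landing_rev': 'branch:default',
    },
}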
1284 1284
1285 1285 @jsonrpc_method()
1286 1286 def delete_repo(request, apiuser, repoid, forks=Optional('')):
1287 1287 """
1288 1288 Deletes a repository.
1289 1289
1290 1290 * When the `forks` parameter is set, it's possible to detach or delete
1291 1291 forks of the deleted repository.
1292 1292
1293 1293 This command can only be run using an |authtoken| with admin
1294 1294 permissions on the |repo|.
1295 1295
1296 1296 :param apiuser: This is filled automatically from the |authtoken|.
1297 1297 :type apiuser: AuthUser
1298 1298 :param repoid: Set the repository name or repository ID.
1299 1299 :type repoid: str or int
1300 1300 :param forks: Set to `detach` or `delete` forks from the |repo|.
1301 1301 :type forks: Optional(str)
1302 1302
1303 1303 Example output:
1304 1304
1305 1305 .. code-block:: bash
1306 1306
1307 1307 id : <id_given_in_input>
1308 1308 result: {
1309 1309 "msg": "Deleted repository `<reponame>`",
1310 1310 "success": true
1311 1311 }
1312 1312 error: null
1313 1313 """
1314 1314
1315 1315 repo = get_repo_or_error(repoid)
1316 1316 repo_name = repo.repo_name
1317 1317 if not has_superadmin_permission(apiuser):
1318 1318 _perms = ('repository.admin',)
1319 1319 validate_repo_permissions(apiuser, repoid, repo, _perms)
1320 1320
1321 1321 try:
1322 1322 handle_forks = Optional.extract(forks)
1323 1323 _forks_msg = ''
1324 1324 _forks = [f for f in repo.forks]
1325 1325 if handle_forks == 'detach':
1326 1326 _forks_msg = ' ' + 'Detached %s forks' % len(_forks)
1327 1327 elif handle_forks == 'delete':
1328 1328 _forks_msg = ' ' + 'Deleted %s forks' % len(_forks)
1329 1329 elif _forks:
1330 1330 raise JSONRPCError(
1331 1331 'Cannot delete `%s` it still contains attached forks' %
1332 1332 (repo.repo_name,)
1333 1333 )
1334 1334 old_data = repo.get_api_data()
1335 1335 RepoModel().delete(repo, forks=handle_forks)
1336 1336
1337 1337 repo = audit_logger.RepoWrap(repo_id=None,
1338 1338 repo_name=repo.repo_name)
1339 1339
1340 1340 audit_logger.store_api(
1341 1341 'repo.delete', action_data={'old_data': old_data},
1342 1342 user=apiuser, repo=repo)
1343 1343
1344 1344 ScmModel().mark_for_invalidation(repo_name, delete=True)
1345 1345 Session().commit()
1346 1346 return {
1347 1347 'msg': 'Deleted repository `%s`%s' % (repo_name, _forks_msg),
1348 1348 'success': True
1349 1349 }
1350 1350 except Exception:
1351 1351 log.exception("Exception occurred while trying to delete repo")
1352 1352 raise JSONRPCError(
1353 1353 'failed to delete repository `%s`' % (repo_name,)
1354 1354 )
1355 1355
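# Illustrative sketch (not part of the original module): deleting a repository
# while detaching its forks. Passing 'delete' instead would remove the forks;
# omitting `forks` fails if any attached forks exist, as the code above shows.
_example_delete_repo_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'delete_repo',
    'args': {'repoid': '<reponame or repo_id>', 'forks': 'detach'},
}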
1356 1356
1357 1357 #TODO: marcink, change name ?
1358 1358 @jsonrpc_method()
1359 1359 def invalidate_cache(request, apiuser, repoid, delete_keys=Optional(False)):
1360 1360 """
1361 1361 Invalidates the cache for the specified repository.
1362 1362
1363 1363 This command can only be run using an |authtoken| with admin rights to
1364 1364 the specified repository.
1365 1365
1366 1366 This command takes the following options:
1367 1367
1368 1368 :param apiuser: This is filled automatically from |authtoken|.
1369 1369 :type apiuser: AuthUser
1370 1370 :param repoid: Sets the repository name or repository ID.
1371 1371 :type repoid: str or int
1372 1372 :param delete_keys: This deletes the invalidated keys instead of
1373 1373 just flagging them.
1374 1374 :type delete_keys: Optional(``True`` | ``False``)
1375 1375
1376 1376 Example output:
1377 1377
1378 1378 .. code-block:: bash
1379 1379
1380 1380 id : <id_given_in_input>
1381 1381 result : {
1382 1382 'msg': Cache for repository `<repository name>` was invalidated,
1383 1383 'repository': <repository name>
1384 1384 }
1385 1385 error : null
1386 1386
1387 1387 Example error output:
1388 1388
1389 1389 .. code-block:: bash
1390 1390
1391 1391 id : <id_given_in_input>
1392 1392 result : null
1393 1393 error : {
1394 1394 'Error occurred during cache invalidation action'
1395 1395 }
1396 1396
1397 1397 """
1398 1398
1399 1399 repo = get_repo_or_error(repoid)
1400 1400 if not has_superadmin_permission(apiuser):
1401 1401 _perms = ('repository.admin', 'repository.write',)
1402 1402 validate_repo_permissions(apiuser, repoid, repo, _perms)
1403 1403
1404 1404 delete = Optional.extract(delete_keys)
1405 1405 try:
1406 1406 ScmModel().mark_for_invalidation(repo.repo_name, delete=delete)
1407 1407 return {
1408 1408 'msg': 'Cache for repository `%s` was invalidated' % (repoid,),
1409 1409 'repository': repo.repo_name
1410 1410 }
1411 1411 except Exception:
1412 1412 log.exception(
1413 1413 "Exception occurred while trying to invalidate repo cache")
1414 1414 raise JSONRPCError(
1415 1415 'Error occurred during cache invalidation action'
1416 1416 )
1417 1417
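# Illustrative sketch (not part of the original module): invalidating a
# repository's cache and deleting the invalidated keys instead of just flagging them.
_example_invalidate_cache_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'invalidate_cache',
    'args': {'repoid': '<reponame or repo_id>', 'delete_keys': True},
}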
1418 1418
1419 1419 #TODO: marcink, change name ?
1420 1420 @jsonrpc_method()
1421 1421 def lock(request, apiuser, repoid, locked=Optional(None),
1422 1422 userid=Optional(OAttr('apiuser'))):
1423 1423 """
1424 1424 Sets the lock state of the specified |repo| by the given user.
1425 1425 For more information, see :ref:`repo-locking`.
1426 1426
1427 1427 * If the ``userid`` option is not set, the repository is locked to the
1428 1428 user who called the method.
1429 1429 * If the ``locked`` parameter is not set, the current lock state of the
1430 1430 repository is displayed.
1431 1431
1432 1432 This command can only be run using an |authtoken| with admin rights to
1433 1433 the specified repository.
1434 1434
1435 1435 This command takes the following options:
1436 1436
1437 1437 :param apiuser: This is filled automatically from the |authtoken|.
1438 1438 :type apiuser: AuthUser
1439 1439 :param repoid: Sets the repository name or repository ID.
1440 1440 :type repoid: str or int
1441 1441 :param locked: Sets the lock state.
1442 1442 :type locked: Optional(``True`` | ``False``)
1443 1443 :param userid: Set the repository lock to this user.
1444 1444 :type userid: Optional(str or int)
1445 1445
1446 1446 Example output:
1447 1447
1448 1448 .. code-block:: bash
1449 1449
1450 1450 id : <id_given_in_input>
1451 1451 result : {
1452 1452 'repo': '<reponame>',
1453 1453 'locked': <bool: lock state>,
1454 1454 'locked_since': <int: lock timestamp>,
1455 1455 'locked_by': <username of person who made the lock>,
1456 1456 'lock_reason': <str: reason for locking>,
1457 1457 'lock_state_changed': <bool: True if lock state has been changed in this request>,
1458 1458 'msg': 'Repo `<reponame>` locked by `<username>` on <timestamp>.'
1459 1459 or
1460 1460 'msg': 'Repo `<repository name>` not locked.'
1461 1461 or
1462 1462 'msg': 'User `<user name>` set lock state for repo `<repository name>` to `<new lock state>`'
1463 1463 }
1464 1464 error : null
1465 1465
1466 1466 Example error output:
1467 1467
1468 1468 .. code-block:: bash
1469 1469
1470 1470 id : <id_given_in_input>
1471 1471 result : null
1472 1472 error : {
1473 1473 'Error occurred locking repository `<reponame>`'
1474 1474 }
1475 1475 """
1476 1476
1477 1477 repo = get_repo_or_error(repoid)
1478 1478 if not has_superadmin_permission(apiuser):
1479 1479 # check if we have at least write permission for this repo !
1480 1480 _perms = ('repository.admin', 'repository.write',)
1481 1481 validate_repo_permissions(apiuser, repoid, repo, _perms)
1482 1482
1483 1483 # make sure a normal user does not pass someone else's userid,
1484 1484 # they are not allowed to do that
1485 1485 if not isinstance(userid, Optional) and userid != apiuser.user_id:
1486 1486 raise JSONRPCError('userid is not the same as your user')
1487 1487
1488 1488 if isinstance(userid, Optional):
1489 1489 userid = apiuser.user_id
1490 1490
1491 1491 user = get_user_or_error(userid)
1492 1492
1493 1493 if isinstance(locked, Optional):
1494 1494 lockobj = repo.locked
1495 1495
1496 1496 if lockobj[0] is None:
1497 1497 _d = {
1498 1498 'repo': repo.repo_name,
1499 1499 'locked': False,
1500 1500 'locked_since': None,
1501 1501 'locked_by': None,
1502 1502 'lock_reason': None,
1503 1503 'lock_state_changed': False,
1504 1504 'msg': 'Repo `%s` not locked.' % repo.repo_name
1505 1505 }
1506 1506 return _d
1507 1507 else:
1508 1508 _user_id, _time, _reason = lockobj
1509 1509 lock_user = get_user_or_error(_user_id)
1510 1510 _d = {
1511 1511 'repo': repo.repo_name,
1512 1512 'locked': True,
1513 1513 'locked_since': _time,
1514 1514 'locked_by': lock_user.username,
1515 1515 'lock_reason': _reason,
1516 1516 'lock_state_changed': False,
1517 1517 'msg': ('Repo `%s` locked by `%s` on `%s`.'
1518 1518 % (repo.repo_name, lock_user.username,
1519 1519 json.dumps(time_to_datetime(_time))))
1520 1520 }
1521 1521 return _d
1522 1522
1523 1523 # force locked state through a flag
1524 1524 else:
1525 1525 locked = str2bool(locked)
1526 1526 lock_reason = Repository.LOCK_API
1527 1527 try:
1528 1528 if locked:
1529 1529 lock_time = time.time()
1530 1530 Repository.lock(repo, user.user_id, lock_time, lock_reason)
1531 1531 else:
1532 1532 lock_time = None
1533 1533 Repository.unlock(repo)
1534 1534 _d = {
1535 1535 'repo': repo.repo_name,
1536 1536 'locked': locked,
1537 1537 'locked_since': lock_time,
1538 1538 'locked_by': user.username,
1539 1539 'lock_reason': lock_reason,
1540 1540 'lock_state_changed': True,
1541 1541 'msg': ('User `%s` set lock state for repo `%s` to `%s`'
1542 1542 % (user.username, repo.repo_name, locked))
1543 1543 }
1544 1544 return _d
1545 1545 except Exception:
1546 1546 log.exception(
1547 1547 "Exception occurred while trying to lock repository")
1548 1548 raise JSONRPCError(
1549 1549 'Error occurred locking repository `%s`' % repo.repo_name
1550 1550 )
1551 1551
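# Illustrative sketch (not part of the original module): two `lock` payloads.
# Omitting `locked` only reports the current lock state; passing it sets the state.
_example_lock_query_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'lock',
    'args': {'repoid': '<reponame or repo_id>'},                   # report current state
}
_example_lock_set_payload = {
    'id': 2,
    'auth_token': '<auth_token>',
    'method': 'lock',
    'args': {'repoid': '<reponame or repo_id>', 'locked': True},   # lock as apiuser
}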
1552 1552
1553 1553 @jsonrpc_method()
1554 1554 def comment_commit(
1555 1555 request, apiuser, repoid, commit_id, message, status=Optional(None),
1556 1556 comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
1557 1557 resolves_comment_id=Optional(None), extra_recipients=Optional([]),
1558 1558 userid=Optional(OAttr('apiuser')), send_email=Optional(True)):
1559 1559 """
1560 1560 Set a commit comment, and optionally change the status of the commit.
1561 1561
1562 1562 :param apiuser: This is filled automatically from the |authtoken|.
1563 1563 :type apiuser: AuthUser
1564 1564 :param repoid: Set the repository name or repository ID.
1565 1565 :type repoid: str or int
1566 1566 :param commit_id: Specify the commit_id for which to set a comment.
1567 1567 :type commit_id: str
1568 1568 :param message: The comment text.
1569 1569 :type message: str
1570 1570 :param status: (**Optional**) status of commit, one of: 'not_reviewed',
1571 1571 'approved', 'rejected', 'under_review'
1572 1572 :type status: str
1573 1573 :param comment_type: Comment type, one of: 'note', 'todo'
1574 1574 :type comment_type: Optional(str), default: 'note'
1575 1575 :param resolves_comment_id: id of comment which this one will resolve
1576 1576 :type resolves_comment_id: Optional(int)
1577 1577 :param extra_recipients: list of user ids or usernames to add
1578 1578 notifications for this comment. Acts like a CC for notification
1579 1579 :type extra_recipients: Optional(list)
1580 1580 :param userid: Set the user name of the comment creator.
1581 1581 :type userid: Optional(str or int)
1582 1582 :param send_email: Define if this comment should also send email notification
1583 1583 :type send_email: Optional(bool)
1584 1584
1585 1585 Example output:
1586 1586
1587 1587 .. code-block:: bash
1588 1588
1589 1589 {
1590 1590 "id" : <id_given_in_input>,
1591 1591 "result" : {
1592 1592 "msg": "Commented on commit `<commit_id>` for repository `<repoid>`",
1593 1593 "status_change": null or <status>,
1594 1594 "success": true
1595 1595 },
1596 1596 "error" : null
1597 1597 }
1598 1598
1599 1599 """
1600 _ = request.translate
1601
1600 1602 repo = get_repo_or_error(repoid)
1601 1603 if not has_superadmin_permission(apiuser):
1602 1604 _perms = ('repository.read', 'repository.write', 'repository.admin')
1603 1605 validate_repo_permissions(apiuser, repoid, repo, _perms)
1606 db_repo_name = repo.repo_name
1604 1607
1605 1608 try:
1606 1609 commit = repo.scm_instance().get_commit(commit_id=commit_id)
1607 1610 commit_id = commit.raw_id
1608 1611 except Exception as e:
1609 1612 log.exception('Failed to fetch commit')
1610 1613 raise JSONRPCError(safe_str(e))
1611 1614
1612 1615 if isinstance(userid, Optional):
1613 1616 userid = apiuser.user_id
1614 1617
1615 1618 user = get_user_or_error(userid)
1616 1619 status = Optional.extract(status)
1617 1620 comment_type = Optional.extract(comment_type)
1618 1621 resolves_comment_id = Optional.extract(resolves_comment_id)
1619 1622 extra_recipients = Optional.extract(extra_recipients)
1620 1623 send_email = Optional.extract(send_email, binary=True)
1621 1624
1622 1625 allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES]
1623 1626 if status and status not in allowed_statuses:
1627 1627 raise JSONRPCError('Bad status, must be one '
1628 1628 'of %s, got %s' % (allowed_statuses, status,))
1626 1629
1627 1630 if resolves_comment_id:
1628 1631 comment = ChangesetComment.get(resolves_comment_id)
1629 1632 if not comment:
1630 1633 raise JSONRPCError(
1631 1634 'Invalid resolves_comment_id `%s` for this commit.'
1632 1635 % resolves_comment_id)
1633 1636 if comment.comment_type != ChangesetComment.COMMENT_TYPE_TODO:
1634 1637 raise JSONRPCError(
1635 1638 'Comment `%s` is wrong type for setting status to resolved.'
1636 1639 % resolves_comment_id)
1637 1640
1638 1641 try:
1639 1642 rc_config = SettingsModel().get_all_settings()
1640 1643 renderer = rc_config.get('rhodecode_markup_renderer', 'rst')
1641 1644 status_change_label = ChangesetStatus.get_status_lbl(status)
1642 1645 comment = CommentsModel().create(
1643 1646 message, repo, user, commit_id=commit_id,
1644 1647 status_change=status_change_label,
1645 1648 status_change_type=status,
1646 1649 renderer=renderer,
1647 1650 comment_type=comment_type,
1648 1651 resolves_comment_id=resolves_comment_id,
1649 1652 auth_user=apiuser,
1650 1653 extra_recipients=extra_recipients,
1651 1654 send_email=send_email
1652 1655 )
1656 is_inline = comment.is_inline
1657
1653 1658 if status:
1654 1659 # also do a status change
1655 1660 try:
1656 1661 ChangesetStatusModel().set_status(
1657 1662 repo, status, user, comment, revision=commit_id,
1658 1663 dont_allow_on_closed_pull_request=True
1659 1664 )
1660 1665 except StatusChangeOnClosedPullRequestError:
1661 1666 log.exception(
1662 1667 "Exception occurred while trying to change repo commit status")
1663 1668 msg = ('Changing status on a commit associated with '
1664 1669 'a closed pull request is not allowed')
1665 1670 raise JSONRPCError(msg)
1666 1671
1667 1672 CommentsModel().trigger_commit_comment_hook(
1668 1673 repo, apiuser, 'create',
1669 1674 data={'comment': comment, 'commit': commit})
1670 1675
1671 1676 Session().commit()
1677
1678 comment_broadcast_channel = channelstream.comment_channel(
1679 db_repo_name, commit_obj=commit)
1680
1681 comment_data = {'comment': comment, 'comment_id': comment.comment_id}
1682 comment_type = 'inline' if is_inline else 'general'
1683 channelstream.comment_channelstream_push(
1684 request, comment_broadcast_channel, apiuser,
1685 _('posted a new {} comment').format(comment_type),
1686 comment_data=comment_data)
1687
1672 1688 return {
1673 1689 'msg': (
1674 1690 'Commented on commit `%s` for repository `%s`' % (
1675 1691 comment.revision, repo.repo_name)),
1676 1692 'status_change': status,
1677 1693 'success': True,
1678 1694 }
1679 1695 except JSONRPCError:
1680 1696 # catch any inside errors, and re-raise them to prevent from
1681 1697 # below global catch to silence them
1682 1698 raise
1683 1699 except Exception:
1684 1700 log.exception("Exception occurred while trying to comment on commit")
1685 1701 raise JSONRPCError(
1686 1702 'failed to set comment on repository `%s`' % (repo.repo_name,)
1687 1703 )
1688 1704
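# Illustrative sketch (not part of the original module): commenting on a commit
# and changing its status to 'approved' in the same call, per the docstring above.
_example_comment_commit_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'comment_commit',
    'args': {
        'repoid': '<reponame or repo_id>',
        'commit_id': '<commit sha>',
        'message': 'Looks good to me',
        'status': 'approved',      # one of the allowed ChangesetStatus values
        'comment_type': 'note',    # or 'todo'
    },
}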
1689 1705
1690 1706 @jsonrpc_method()
1691 1707 def get_repo_comments(request, apiuser, repoid,
1692 1708 commit_id=Optional(None), comment_type=Optional(None),
1693 1709 userid=Optional(None)):
1694 1710 """
1695 1711 Get all comments for a repository
1696 1712
1697 1713 :param apiuser: This is filled automatically from the |authtoken|.
1698 1714 :type apiuser: AuthUser
1699 1715 :param repoid: Set the repository name or repository ID.
1700 1716 :type repoid: str or int
1701 1717 :param commit_id: Optionally filter the comments by the commit_id
1702 1718 :type commit_id: Optional(str), default: None
1703 1719 :param comment_type: Optionally filter the comments by the comment_type
1704 1720 one of: 'note', 'todo'
1705 1721 :type comment_type: Optional(str), default: None
1706 1722 :param userid: Optionally filter the comments by the author of comment
1707 1723 :type userid: Optional(str or int), Default: None
1708 1724
1709 1725 Example output:
1710 1726
1711 1727 .. code-block:: bash
1712 1728
1713 1729 {
1714 1730 "id" : <id_given_in_input>,
1715 1731 "result" : [
1716 1732 {
1717 1733 "comment_author": <USER_DETAILS>,
1718 1734 "comment_created_on": "2017-02-01T14:38:16.309",
1719 1735 "comment_f_path": "file.txt",
1720 1736 "comment_id": 282,
1721 1737 "comment_lineno": "n1",
1722 1738 "comment_resolved_by": null,
1723 1739 "comment_status": [],
1724 1740 "comment_text": "This file needs a header",
1725 1741 "comment_type": "todo",
1726 1742 "comment_last_version: 0
1727 1743 }
1728 1744 ],
1729 1745 "error" : null
1730 1746 }
1731 1747
1732 1748 """
1733 1749 repo = get_repo_or_error(repoid)
1734 1750 if not has_superadmin_permission(apiuser):
1735 1751 _perms = ('repository.read', 'repository.write', 'repository.admin')
1736 1752 validate_repo_permissions(apiuser, repoid, repo, _perms)
1737 1753
1738 1754 commit_id = Optional.extract(commit_id)
1739 1755
1740 1756 userid = Optional.extract(userid)
1741 1757 if userid:
1742 1758 user = get_user_or_error(userid)
1743 1759 else:
1744 1760 user = None
1745 1761
1746 1762 comment_type = Optional.extract(comment_type)
1747 1763 if comment_type and comment_type not in ChangesetComment.COMMENT_TYPES:
1748 1764 raise JSONRPCError(
1749 1765 'comment_type must be one of `{}` got {}'.format(
1750 1766 ChangesetComment.COMMENT_TYPES, comment_type)
1751 1767 )
1752 1768
1753 1769 comments = CommentsModel().get_repository_comments(
1754 1770 repo=repo, comment_type=comment_type, user=user, commit_id=commit_id)
1755 1771 return comments
1756 1772
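# Illustrative sketch (not part of the original module): fetching only the 'todo'
# comments left on a single commit, using the filters described in the docstring.
_example_get_repo_comments_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'get_repo_comments',
    'args': {
        'repoid': '<reponame or repo_id>',
        'commit_id': '<commit sha>',
        'comment_type': 'todo',
    },
}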
1757 1773
1758 1774 @jsonrpc_method()
1759 1775 def get_comment(request, apiuser, comment_id):
1760 1776 """
1761 1777 Get single comment from repository or pull_request
1762 1778
1763 1779 :param apiuser: This is filled automatically from the |authtoken|.
1764 1780 :type apiuser: AuthUser
1765 1781 :param comment_id: comment id found in the URL of comment
1766 1782 :type comment_id: str or int
1767 1783
1768 1784 Example output:
1769 1785
1770 1786 .. code-block:: bash
1771 1787
1772 1788 {
1773 1789 "id" : <id_given_in_input>,
1774 1790 "result" : {
1775 1791 "comment_author": <USER_DETAILS>,
1776 1792 "comment_created_on": "2017-02-01T14:38:16.309",
1777 1793 "comment_f_path": "file.txt",
1778 1794 "comment_id": 282,
1779 1795 "comment_lineno": "n1",
1780 1796 "comment_resolved_by": null,
1781 1797 "comment_status": [],
1782 1798 "comment_text": "This file needs a header",
1783 1799 "comment_type": "todo",
1784 1800 "comment_last_version: 0
1785 1801 },
1786 1802 "error" : null
1787 1803 }
1788 1804
1789 1805 """
1790 1806
1791 1807 comment = ChangesetComment.get(comment_id)
1792 1808 if not comment:
1793 1809 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1794 1810
1795 1811 perms = ('repository.read', 'repository.write', 'repository.admin')
1796 1812 has_comment_perm = HasRepoPermissionAnyApi(*perms)\
1797 1813 (user=apiuser, repo_name=comment.repo.repo_name)
1798 1814
1799 1815 if not has_comment_perm:
1800 1816 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1801 1817
1802 1818 return comment
1803 1819
1804 1820
1805 1821 @jsonrpc_method()
1806 1822 def edit_comment(request, apiuser, message, comment_id, version,
1807 1823 userid=Optional(OAttr('apiuser'))):
1808 1824 """
1809 1825 Edit comment on the pull request or commit,
1810 1826 specified by the `comment_id` and version. Initially version should be 0
1811 1827
1812 1828 :param apiuser: This is filled automatically from the |authtoken|.
1813 1829 :type apiuser: AuthUser
1814 1830 :param comment_id: Specify the comment_id for editing
1815 1831 :type comment_id: int
1816 1832 :param version: version of the comment that will be created, starts from 0
1817 1833 :type version: int
1818 1834 :param message: The text content of the comment.
1819 1835 :type message: str
1820 1836 :param userid: Edit the comment as this user
1821 1837 :type userid: Optional(str or int)
1822 1838
1823 1839 Example output:
1824 1840
1825 1841 .. code-block:: bash
1826 1842
1827 1843 id : <id_given_in_input>
1828 1844 result : {
1829 1845 "comment": "<comment data>",
1830 1846 "version": "<Integer>",
1831 1847 },
1832 1848 error : null
1833 1849 """
1834 1850
1835 1851 auth_user = apiuser
1836 1852 comment = ChangesetComment.get(comment_id)
1837 1853 if not comment:
1838 1854 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1839 1855
1840 1856 is_super_admin = has_superadmin_permission(apiuser)
1841 1857 is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1842 1858 (user=apiuser, repo_name=comment.repo.repo_name)
1843 1859
1844 1860 if not isinstance(userid, Optional):
1845 1861 if is_super_admin or is_repo_admin:
1846 1862 apiuser = get_user_or_error(userid)
1847 1863 auth_user = apiuser.AuthUser()
1848 1864 else:
1849 1865 raise JSONRPCError('userid is not the same as your user')
1850 1866
1851 1867 comment_author = comment.author.user_id == auth_user.user_id
1852 1868 if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1853 1869 raise JSONRPCError("you don't have access to edit this comment")
1854 1870
1855 1871 try:
1856 1872 comment_history = CommentsModel().edit(
1857 1873 comment_id=comment_id,
1858 1874 text=message,
1859 1875 auth_user=auth_user,
1860 1876 version=version,
1861 1877 )
1862 1878 Session().commit()
1863 1879 except CommentVersionMismatch:
1864 1880 raise JSONRPCError(
1865 1881 'comment ({}) version ({}) mismatch'.format(comment_id, version)
1866 1882 )
1867 1883 if not comment_history and not message:
1868 1884 raise JSONRPCError(
1869 1885 "comment ({}) can't be changed with empty string".format(comment_id)
1870 1886 )
1871 1887
1872 1888 if comment.pull_request:
1873 1889 pull_request = comment.pull_request
1874 1890 PullRequestModel().trigger_pull_request_hook(
1875 1891 pull_request, apiuser, 'comment_edit',
1876 1892 data={'comment': comment})
1877 1893 else:
1878 1894 db_repo = comment.repo
1879 1895 commit_id = comment.revision
1880 1896 commit = db_repo.get_commit(commit_id)
1881 1897 CommentsModel().trigger_commit_comment_hook(
1882 1898 db_repo, apiuser, 'edit',
1883 1899 data={'comment': comment, 'commit': commit})
1884 1900
1885 1901 data = {
1886 1902 'comment': comment,
1887 1903 'version': comment_history.version if comment_history else None,
1888 1904 }
1889 1905 return data
1890 1906
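# Illustrative sketch (not part of the original module): editing a comment.
# The `version` must match the comment's current version (starting at 0);
# otherwise the call fails with a version-mismatch error, as implemented above.
_example_edit_comment_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'edit_comment',
    'args': {'comment_id': 282, 'version': 0, 'message': 'updated comment text'},
}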
1891 1907
1892 1908 # TODO(marcink): write this with all required logic for deleting a comments in PR or commits
1893 1909 # @jsonrpc_method()
1894 1910 # def delete_comment(request, apiuser, comment_id):
1895 1911 # auth_user = apiuser
1896 1912 #
1897 1913 # comment = ChangesetComment.get(comment_id)
1898 1914 # if not comment:
1899 1915 # raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1900 1916 #
1901 1917 # is_super_admin = has_superadmin_permission(apiuser)
1902 1918 # is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1903 1919 # (user=apiuser, repo_name=comment.repo.repo_name)
1904 1920 #
1905 1921 # comment_author = comment.author.user_id == auth_user.user_id
1906 1922 # if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1907 1923 # raise JSONRPCError("you don't have access to edit this comment")
1908 1924
1909 1925 @jsonrpc_method()
1910 1926 def grant_user_permission(request, apiuser, repoid, userid, perm):
1911 1927 """
1912 1928 Grant permissions for the specified user on the given repository,
1913 1929 or update existing permissions if found.
1914 1930
1915 1931 This command can only be run using an |authtoken| with admin
1916 1932 permissions on the |repo|.
1917 1933
1918 1934 :param apiuser: This is filled automatically from the |authtoken|.
1919 1935 :type apiuser: AuthUser
1920 1936 :param repoid: Set the repository name or repository ID.
1921 1937 :type repoid: str or int
1922 1938 :param userid: Set the user name.
1923 1939 :type userid: str
1924 1940 :param perm: Set the user permissions, using the following format
1925 1941 ``(repository.(none|read|write|admin))``
1926 1942 :type perm: str
1927 1943
1928 1944 Example output:
1929 1945
1930 1946 .. code-block:: bash
1931 1947
1932 1948 id : <id_given_in_input>
1933 1949 result: {
1934 1950 "msg" : "Granted perm: `<perm>` for user: `<username>` in repo: `<reponame>`",
1935 1951 "success": true
1936 1952 }
1937 1953 error: null
1938 1954 """
1939 1955
1940 1956 repo = get_repo_or_error(repoid)
1941 1957 user = get_user_or_error(userid)
1942 1958 perm = get_perm_or_error(perm)
1943 1959 if not has_superadmin_permission(apiuser):
1944 1960 _perms = ('repository.admin',)
1945 1961 validate_repo_permissions(apiuser, repoid, repo, _perms)
1946 1962
1947 1963 perm_additions = [[user.user_id, perm.permission_name, "user"]]
1948 1964 try:
1949 1965 changes = RepoModel().update_permissions(
1950 1966 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
1951 1967
1952 1968 action_data = {
1953 1969 'added': changes['added'],
1954 1970 'updated': changes['updated'],
1955 1971 'deleted': changes['deleted'],
1956 1972 }
1957 1973 audit_logger.store_api(
1958 1974 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
1959 1975 Session().commit()
1960 1976 PermissionModel().flush_user_permission_caches(changes)
1961 1977
1962 1978 return {
1963 1979 'msg': 'Granted perm: `%s` for user: `%s` in repo: `%s`' % (
1964 1980 perm.permission_name, user.username, repo.repo_name
1965 1981 ),
1966 1982 'success': True
1967 1983 }
1968 1984 except Exception:
1969 1985 log.exception("Exception occurred while trying to edit permissions for repo")
1970 1986 raise JSONRPCError(
1971 1987 'failed to edit permission for user: `%s` in repo: `%s`' % (
1972 1988 userid, repoid
1973 1989 )
1974 1990 )
1975 1991
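# Illustrative sketch (not part of the original module): granting a user
# write access to a repository, using the permission format from the docstring.
_example_grant_user_permission_payload = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'grant_user_permission',
    'args': {
        'repoid': '<reponame or repo_id>',
        'userid': '<username or user_id>',
        'perm': 'repository.write',   # repository.(none|read|write|admin)
    },
}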
1976 1992
1977 1993 @jsonrpc_method()
1978 1994 def revoke_user_permission(request, apiuser, repoid, userid):
1979 1995 """
1980 1996 Revoke permission for a user on the specified repository.
1981 1997
1982 1998 This command can only be run using an |authtoken| with admin
1983 1999 permissions on the |repo|.
1984 2000
1985 2001 :param apiuser: This is filled automatically from the |authtoken|.
1986 2002 :type apiuser: AuthUser
1987 2003 :param repoid: Set the repository name or repository ID.
1988 2004 :type repoid: str or int
1989 2005 :param userid: Set the user name of revoked user.
1990 2006 :type userid: str or int
1991 2007
1992 2008 Example output:
1993 2009
1994 2010 .. code-block:: bash
1995 2011
1996 2012 id : <id_given_in_input>
1997 2013 result: {
1998 2014 "msg" : "Revoked perm for user: `<username>` in repo: `<reponame>`",
1999 2015 "success": true
2000 2016 }
2001 2017 error: null
2002 2018 """
2003 2019
2004 2020 repo = get_repo_or_error(repoid)
2005 2021 user = get_user_or_error(userid)
2006 2022 if not has_superadmin_permission(apiuser):
2007 2023 _perms = ('repository.admin',)
2008 2024 validate_repo_permissions(apiuser, repoid, repo, _perms)
2009 2025
2010 2026 perm_deletions = [[user.user_id, None, "user"]]
2011 2027 try:
2012 2028 changes = RepoModel().update_permissions(
2013 2029 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2014 2030
2015 2031 action_data = {
2016 2032 'added': changes['added'],
2017 2033 'updated': changes['updated'],
2018 2034 'deleted': changes['deleted'],
2019 2035 }
2020 2036 audit_logger.store_api(
2021 2037 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2022 2038 Session().commit()
2023 2039 PermissionModel().flush_user_permission_caches(changes)
2024 2040
2025 2041 return {
2026 2042 'msg': 'Revoked perm for user: `%s` in repo: `%s`' % (
2027 2043 user.username, repo.repo_name
2028 2044 ),
2029 2045 'success': True
2030 2046 }
2031 2047 except Exception:
2032 2048 log.exception("Exception occurred while trying to revoke permissions for repo")
2033 2049 raise JSONRPCError(
2034 2050 'failed to edit permission for user: `%s` in repo: `%s`' % (
2035 2051 userid, repoid
2036 2052 )
2037 2053 )
2038 2054
2039 2055
2040 2056 @jsonrpc_method()
2041 2057 def grant_user_group_permission(request, apiuser, repoid, usergroupid, perm):
2042 2058 """
2043 2059 Grant permission for a user group on the specified repository,
2044 2060 or update existing permissions.
2045 2061
2046 2062 This command can only be run using an |authtoken| with admin
2047 2063 permissions on the |repo|.
2048 2064
2049 2065 :param apiuser: This is filled automatically from the |authtoken|.
2050 2066 :type apiuser: AuthUser
2051 2067 :param repoid: Set the repository name or repository ID.
2052 2068 :type repoid: str or int
2053 2069 :param usergroupid: Specify the ID of the user group.
2054 2070 :type usergroupid: str or int
2055 2071 :param perm: Set the user group permissions using the following
2056 2072 format: (repository.(none|read|write|admin))
2057 2073 :type perm: str
2058 2074
2059 2075 Example output:
2060 2076
2061 2077 .. code-block:: bash
2062 2078
2063 2079 id : <id_given_in_input>
2064 2080 result : {
2065 2081 "msg" : "Granted perm: `<perm>` for group: `<usersgroupname>` in repo: `<reponame>`",
2066 2082 "success": true
2067 2083
2068 2084 }
2069 2085 error : null
2070 2086
2071 2087 Example error output:
2072 2088
2073 2089 .. code-block:: bash
2074 2090
2075 2091 id : <id_given_in_input>
2076 2092 result : null
2077 2093 error : {
2078 2094 "failed to edit permission for user group: `<usergroup>` in repo `<repo>`'
2079 2095 }
2080 2096
2081 2097 """
2082 2098
2083 2099 repo = get_repo_or_error(repoid)
2084 2100 perm = get_perm_or_error(perm)
2085 2101 if not has_superadmin_permission(apiuser):
2086 2102 _perms = ('repository.admin',)
2087 2103 validate_repo_permissions(apiuser, repoid, repo, _perms)
2088 2104
2089 2105 user_group = get_user_group_or_error(usergroupid)
2090 2106 if not has_superadmin_permission(apiuser):
2091 2107 # check if we have at least read permission for this user group !
2092 2108 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2093 2109 if not HasUserGroupPermissionAnyApi(*_perms)(
2094 2110 user=apiuser, user_group_name=user_group.users_group_name):
2095 2111 raise JSONRPCError(
2096 2112 'user group `%s` does not exist' % (usergroupid,))
2097 2113
2098 2114 perm_additions = [[user_group.users_group_id, perm.permission_name, "user_group"]]
2099 2115 try:
2100 2116 changes = RepoModel().update_permissions(
2101 2117 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
2102 2118 action_data = {
2103 2119 'added': changes['added'],
2104 2120 'updated': changes['updated'],
2105 2121 'deleted': changes['deleted'],
2106 2122 }
2107 2123 audit_logger.store_api(
2108 2124 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2109 2125 Session().commit()
2110 2126 PermissionModel().flush_user_permission_caches(changes)
2111 2127
2112 2128 return {
2113 2129 'msg': 'Granted perm: `%s` for user group: `%s` in '
2114 2130 'repo: `%s`' % (
2115 2131 perm.permission_name, user_group.users_group_name,
2116 2132 repo.repo_name
2117 2133 ),
2118 2134 'success': True
2119 2135 }
2120 2136 except Exception:
2121 2137 log.exception(
2122 2138 "Exception occurred while trying change permission on repo")
2123 2139 raise JSONRPCError(
2124 2140 'failed to edit permission for user group: `%s` in '
2125 2141 'repo: `%s`' % (
2126 2142 usergroupid, repo.repo_name
2127 2143 )
2128 2144 )
2129 2145
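For reference, a minimal client-side sketch of calling this method over the JSON-RPC API. The endpoint path, auth token and repository/group names below are placeholders, and the request envelope (``id``/``auth_token``/``method``/``args``) is assumed to follow the standard RhodeCode API wire format.

.. code-block:: python

    # Hypothetical client call -- URL, token and names are placeholders.
    import requests

    payload = {
        'id': 1,
        'auth_token': '<auth_token_with_repository.admin_rights>',
        'method': 'grant_user_group_permission',
        'args': {
            'repoid': 'repo_group/test_repo1',
            'usergroupid': 'qa-team',
            'perm': 'repository.write',
        },
    }
    response = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()
    # On success: response['result']['success'] is True and response['error'] is None.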
2130 2146
2131 2147 @jsonrpc_method()
2132 2148 def revoke_user_group_permission(request, apiuser, repoid, usergroupid):
2133 2149 """
2134 2150 Revoke the permissions of a user group on a given repository.
2135 2151
2136 2152 This command can only be run using an |authtoken| with admin
2137 2153 permissions on the |repo|.
2138 2154
2139 2155 :param apiuser: This is filled automatically from the |authtoken|.
2140 2156 :type apiuser: AuthUser
2141 2157 :param repoid: Set the repository name or repository ID.
2142 2158 :type repoid: str or int
2143 2159 :param usergroupid: Specify the user group ID.
2144 2160 :type usergroupid: str or int
2145 2161
2146 2162 Example output:
2147 2163
2148 2164 .. code-block:: bash
2149 2165
2150 2166 id : <id_given_in_input>
2151 2167 result: {
2152 2168 "msg" : "Revoked perm for group: `<usersgroupname>` in repo: `<reponame>`",
2153 2169 "success": true
2154 2170 }
2155 2171 error: null
2156 2172 """
2157 2173
2158 2174 repo = get_repo_or_error(repoid)
2159 2175 if not has_superadmin_permission(apiuser):
2160 2176 _perms = ('repository.admin',)
2161 2177 validate_repo_permissions(apiuser, repoid, repo, _perms)
2162 2178
2163 2179 user_group = get_user_group_or_error(usergroupid)
2164 2180 if not has_superadmin_permission(apiuser):
2165 2181 # check if we have at least read permission for this user group !
2166 2182 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2167 2183 if not HasUserGroupPermissionAnyApi(*_perms)(
2168 2184 user=apiuser, user_group_name=user_group.users_group_name):
2169 2185 raise JSONRPCError(
2170 2186 'user group `%s` does not exist' % (usergroupid,))
2171 2187
2172 2188 perm_deletions = [[user_group.users_group_id, None, "user_group"]]
2173 2189 try:
2174 2190 changes = RepoModel().update_permissions(
2175 2191 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2176 2192 action_data = {
2177 2193 'added': changes['added'],
2178 2194 'updated': changes['updated'],
2179 2195 'deleted': changes['deleted'],
2180 2196 }
2181 2197 audit_logger.store_api(
2182 2198 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2183 2199 Session().commit()
2184 2200 PermissionModel().flush_user_permission_caches(changes)
2185 2201
2186 2202 return {
2187 2203 'msg': 'Revoked perm for user group: `%s` in repo: `%s`' % (
2188 2204 user_group.users_group_name, repo.repo_name
2189 2205 ),
2190 2206 'success': True
2191 2207 }
2192 2208 except Exception:
2193 2209 log.exception("Exception occurred while trying revoke "
2194 2210 "user group permission on repo")
2195 2211 raise JSONRPCError(
2196 2212 'failed to edit permission for user group: `%s` in '
2197 2213 'repo: `%s`' % (
2198 2214 user_group.users_group_name, repo.repo_name
2199 2215 )
2200 2216 )
2201 2217
2202 2218
2203 2219 @jsonrpc_method()
2204 2220 def pull(request, apiuser, repoid, remote_uri=Optional(None)):
2205 2221 """
2206 2222 Triggers a pull on the given repository from a remote location. You
2207 2223 can use this to keep remote repositories up-to-date.
2208 2224
2209 2225 This command can only be run using an |authtoken| with admin
2210 2226 rights to the specified repository. For more information,
2211 2227 see :ref:`config-token-ref`.
2212 2228
2213 2229 This command takes the following options:
2214 2230
2215 2231 :param apiuser: This is filled automatically from the |authtoken|.
2216 2232 :type apiuser: AuthUser
2217 2233 :param repoid: The repository name or repository ID.
2218 2234 :type repoid: str or int
2219 2235 :param remote_uri: Optional remote URI to pass in for pull
2220 2236 :type remote_uri: str
2221 2237
2222 2238 Example output:
2223 2239
2224 2240 .. code-block:: bash
2225 2241
2226 2242 id : <id_given_in_input>
2227 2243 result : {
2228 2244 "msg": "Pulled from url `<remote_url>` on repo `<repository name>`"
2229 2245 "repository": "<repository name>"
2230 2246 }
2231 2247 error : null
2232 2248
2233 2249 Example error output:
2234 2250
2235 2251 .. code-block:: bash
2236 2252
2237 2253 id : <id_given_in_input>
2238 2254 result : null
2239 2255 error : {
2240 2256 "Unable to push changes from `<remote_url>`"
2241 2257 }
2242 2258
2243 2259 """
2244 2260
2245 2261 repo = get_repo_or_error(repoid)
2246 2262 remote_uri = Optional.extract(remote_uri)
2247 2263 remote_uri_display = remote_uri or repo.clone_uri_hidden
2248 2264 if not has_superadmin_permission(apiuser):
2249 2265 _perms = ('repository.admin',)
2250 2266 validate_repo_permissions(apiuser, repoid, repo, _perms)
2251 2267
2252 2268 try:
2253 2269 ScmModel().pull_changes(
2254 2270 repo.repo_name, apiuser.username, remote_uri=remote_uri)
2255 2271 return {
2256 2272 'msg': 'Pulled from url `%s` on repo `%s`' % (
2257 2273 remote_uri_display, repo.repo_name),
2258 2274 'repository': repo.repo_name
2259 2275 }
2260 2276 except Exception:
2261 2277 log.exception("Exception occurred while trying to "
2262 2278 "pull changes from remote location")
2263 2279 raise JSONRPCError(
2264 2280 'Unable to pull changes from `%s`' % remote_uri_display
2265 2281 )
2266 2282
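A similar sketch for triggering a pull; ``remote_uri`` can be omitted to use the stored clone URI. All names below are placeholders.

.. code-block:: python

    import requests

    payload = {
        'id': 2,
        'auth_token': '<admin_auth_token>',                       # placeholder
        'method': 'pull',
        'args': {'repoid': 'repo_group/mirrored_repo',            # placeholder
                 'remote_uri': 'https://upstream.example.com/repo.git'},
    }
    result = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()
    print(result['result']['msg'])  # e.g. Pulled from url `...` on repo `...`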
2267 2283
2268 2284 @jsonrpc_method()
2269 2285 def strip(request, apiuser, repoid, revision, branch):
2270 2286 """
2271 2287 Strips the given revision from the specified repository.
2272 2288
2273 2289 * This will remove the revision and all of its descendants.
2274 2290
2275 2291 This command can only be run using an |authtoken| with admin rights to
2276 2292 the specified repository.
2277 2293
2278 2294 This command takes the following options:
2279 2295
2280 2296 :param apiuser: This is filled automatically from the |authtoken|.
2281 2297 :type apiuser: AuthUser
2282 2298 :param repoid: The repository name or repository ID.
2283 2299 :type repoid: str or int
2284 2300 :param revision: The revision you wish to strip.
2285 2301 :type revision: str
2286 2302 :param branch: The branch from which to strip the revision.
2287 2303 :type branch: str
2288 2304
2289 2305 Example output:
2290 2306
2291 2307 .. code-block:: bash
2292 2308
2293 2309 id : <id_given_in_input>
2294 2310 result : {
2295 2311 "msg": "'Stripped commit <commit_hash> from repo `<repository name>`'"
2296 2312 "repository": "<repository name>"
2297 2313 }
2298 2314 error : null
2299 2315
2300 2316 Example error output:
2301 2317
2302 2318 .. code-block:: bash
2303 2319
2304 2320 id : <id_given_in_input>
2305 2321 result : null
2306 2322 error : {
2307 2323 "Unable to strip commit <commit_hash> from repo `<repository name>`"
2308 2324 }
2309 2325
2310 2326 """
2311 2327
2312 2328 repo = get_repo_or_error(repoid)
2313 2329 if not has_superadmin_permission(apiuser):
2314 2330 _perms = ('repository.admin',)
2315 2331 validate_repo_permissions(apiuser, repoid, repo, _perms)
2316 2332
2317 2333 try:
2318 2334 ScmModel().strip(repo, revision, branch)
2319 2335 audit_logger.store_api(
2320 2336 'repo.commit.strip', action_data={'commit_id': revision},
2321 2337 repo=repo,
2322 2338 user=apiuser, commit=True)
2323 2339
2324 2340 return {
2325 2341 'msg': 'Stripped commit %s from repo `%s`' % (
2326 2342 revision, repo.repo_name),
2327 2343 'repository': repo.repo_name
2328 2344 }
2329 2345 except Exception:
2330 2346 log.exception("Exception while trying to strip")
2331 2347 raise JSONRPCError(
2332 2348 'Unable to strip commit %s from repo `%s`' % (
2333 2349 revision, repo.repo_name)
2334 2350 )
2335 2351
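Because ``strip`` permanently removes the commit and its descendants, a client sketch should check the ``error`` field before assuming success. Endpoint, token, repository and commit hash below are placeholders.

.. code-block:: python

    import requests

    payload = {
        'id': 3,
        'auth_token': '<admin_auth_token>',           # placeholder
        'method': 'strip',
        'args': {'repoid': 'repo_group/test_repo1',   # placeholder
                 'revision': 'deadbeefcafe',          # placeholder commit hash
                 'branch': 'default'},
    }
    resp = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()
    if resp['error'] is not None:
        raise RuntimeError('strip failed: %s' % resp['error'])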
2336 2352
2337 2353 @jsonrpc_method()
2338 2354 def get_repo_settings(request, apiuser, repoid, key=Optional(None)):
2339 2355 """
2340 2356 Returns all settings for a repository. If a key is given, only the
2341 2357 setting identified by that key is returned (or null if it does not exist).
2342 2358
2343 2359 :param apiuser: This is filled automatically from the |authtoken|.
2344 2360 :type apiuser: AuthUser
2345 2361 :param repoid: The repository name or repository id.
2346 2362 :type repoid: str or int
2347 2363 :param key: Key of the setting to return.
2348 2364 :type key: Optional(str)
2349 2365
2350 2366 Example output:
2351 2367
2352 2368 .. code-block:: bash
2353 2369
2354 2370 {
2355 2371 "error": null,
2356 2372 "id": 237,
2357 2373 "result": {
2358 2374 "extensions_largefiles": true,
2359 2375 "extensions_evolve": true,
2360 2376 "hooks_changegroup_push_logger": true,
2361 2377 "hooks_changegroup_repo_size": false,
2362 2378 "hooks_outgoing_pull_logger": true,
2363 2379 "phases_publish": "True",
2364 2380 "rhodecode_hg_use_rebase_for_merging": true,
2365 2381 "rhodecode_pr_merge_enabled": true,
2366 2382 "rhodecode_use_outdated_comments": true
2367 2383 }
2368 2384 }
2369 2385 """
2370 2386
2371 2387 # Restrict access to this api method to super-admins, and repo admins only.
2372 2388 repo = get_repo_or_error(repoid)
2373 2389 if not has_superadmin_permission(apiuser):
2374 2390 _perms = ('repository.admin',)
2375 2391 validate_repo_permissions(apiuser, repoid, repo, _perms)
2376 2392
2377 2393 try:
2378 2394 settings_model = VcsSettingsModel(repo=repo)
2379 2395 settings = settings_model.get_global_settings()
2380 2396 settings.update(settings_model.get_repo_settings())
2381 2397
2382 2398 # If only a single setting is requested fetch it from all settings.
2383 2399 key = Optional.extract(key)
2384 2400 if key is not None:
2385 2401 settings = settings.get(key, None)
2386 2402 except Exception:
2387 2403 msg = 'Failed to fetch settings for repository `{}`'.format(repoid)
2388 2404 log.exception(msg)
2389 2405 raise JSONRPCError(msg)
2390 2406
2391 2407 return settings
2392 2408
2393 2409
2394 2410 @jsonrpc_method()
2395 2411 def set_repo_settings(request, apiuser, repoid, settings):
2396 2412 """
2397 2413 Update repository settings. Returns true on success.
2398 2414
2399 2415 :param apiuser: This is filled automatically from the |authtoken|.
2400 2416 :type apiuser: AuthUser
2401 2417 :param repoid: The repository name or repository id.
2402 2418 :type repoid: str or int
2403 2419 :param settings: The new settings for the repository.
2404 2420 :type settings: dict
2405 2421
2406 2422 Example output:
2407 2423
2408 2424 .. code-block:: bash
2409 2425
2410 2426 {
2411 2427 "error": null,
2412 2428 "id": 237,
2413 2429 "result": true
2414 2430 }
2415 2431 """
2416 2432 # Restrict access to this api method to super-admins, and repo admins only.
2417 2433 repo = get_repo_or_error(repoid)
2418 2434 if not has_superadmin_permission(apiuser):
2419 2435 _perms = ('repository.admin',)
2420 2436 validate_repo_permissions(apiuser, repoid, repo, _perms)
2421 2437
2422 2438 if type(settings) is not dict:
2423 2439 raise JSONRPCError('Settings have to be a JSON Object.')
2424 2440
2425 2441 try:
2426 2442 settings_model = VcsSettingsModel(repo=repoid)
2427 2443
2428 2444 # Merge global, repo and incoming settings.
2429 2445 new_settings = settings_model.get_global_settings()
2430 2446 new_settings.update(settings_model.get_repo_settings())
2431 2447 new_settings.update(settings)
2432 2448
2433 2449 # Update the settings.
2434 2450 inherit_global_settings = new_settings.get(
2435 2451 'inherit_global_settings', False)
2436 2452 settings_model.create_or_update_repo_settings(
2437 2453 new_settings, inherit_global_settings=inherit_global_settings)
2438 2454 Session().commit()
2439 2455 except Exception:
2440 2456 msg = 'Failed to update settings for repository `{}`'.format(repoid)
2441 2457 log.exception(msg)
2442 2458 raise JSONRPCError(msg)
2443 2459
2444 2460 # Indicate success.
2445 2461 return True
2446 2462
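Because ``set_repo_settings`` merges global, per-repo and incoming values before saving, a typical client round trip reads the current settings, changes one key and writes the full dict back. A sketch with a small helper; URL, token and repository name are placeholders.

.. code-block:: python

    import requests

    API_URL = 'https://rhodecode.example.com/_admin/api'  # placeholder
    TOKEN = '<repo_admin_auth_token>'                     # placeholder

    def api_call(method, **args):
        # Minimal JSON-RPC helper assuming the standard id/auth_token/method/args envelope.
        payload = {'id': 1, 'auth_token': TOKEN, 'method': method, 'args': args}
        resp = requests.post(API_URL, json=payload).json()
        if resp['error'] is not None:
            raise RuntimeError(resp['error'])
        return resp['result']

    settings = api_call('get_repo_settings', repoid='repo_group/test_repo1')
    settings['rhodecode_pr_merge_enabled'] = False
    api_call('set_repo_settings', repoid='repo_group/test_repo1', settings=settings)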
2447 2463
2448 2464 @jsonrpc_method()
2449 2465 def maintenance(request, apiuser, repoid):
2450 2466 """
2451 2467 Triggers a maintenance on the given repository.
2452 2468
2453 2469 This command can only be run using an |authtoken| with admin
2454 2470 rights to the specified repository. For more information,
2455 2471 see :ref:`config-token-ref`.
2456 2472
2457 2473 This command takes the following options:
2458 2474
2459 2475 :param apiuser: This is filled automatically from the |authtoken|.
2460 2476 :type apiuser: AuthUser
2461 2477 :param repoid: The repository name or repository ID.
2462 2478 :type repoid: str or int
2463 2479
2464 2480 Example output:
2465 2481
2466 2482 .. code-block:: bash
2467 2483
2468 2484 id : <id_given_in_input>
2469 2485 result : {
2470 2486 "msg": "executed maintenance command",
2471 2487 "executed_actions": [
2472 2488 <action_message>, <action_message2>...
2473 2489 ],
2474 2490 "repository": "<repository name>"
2475 2491 }
2476 2492 error : null
2477 2493
2478 2494 Example error output:
2479 2495
2480 2496 .. code-block:: bash
2481 2497
2482 2498 id : <id_given_in_input>
2483 2499 result : null
2484 2500 error : {
2485 2501 "Unable to execute maintenance on `<reponame>`"
2486 2502 }
2487 2503
2488 2504 """
2489 2505
2490 2506 repo = get_repo_or_error(repoid)
2491 2507 if not has_superadmin_permission(apiuser):
2492 2508 _perms = ('repository.admin',)
2493 2509 validate_repo_permissions(apiuser, repoid, repo, _perms)
2494 2510
2495 2511 try:
2496 2512 maintenance = repo_maintenance.RepoMaintenance()
2497 2513 executed_actions = maintenance.execute(repo)
2498 2514
2499 2515 return {
2500 2516 'msg': 'executed maintenance command',
2501 2517 'executed_actions': executed_actions,
2502 2518 'repository': repo.repo_name
2503 2519 }
2504 2520 except Exception:
2505 2521 log.exception("Exception occurred while trying to run maintenance")
2506 2522 raise JSONRPCError(
2507 2523 'Unable to execute maintenance on `%s`' % repo.repo_name)
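The maintenance call follows the same request pattern; the ``executed_actions`` list in the result describes what was run. Placeholders as in the sketches above.

.. code-block:: python

    import requests

    payload = {'id': 4, 'auth_token': '<admin_auth_token>', 'method': 'maintenance',
               'args': {'repoid': 'repo_group/test_repo1'}}
    result = requests.post(
        'https://rhodecode.example.com/_admin/api', json=payload).json()
    for action in result['result']['executed_actions']:
        print(action)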
@@ -1,783 +1,792 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 22 import logging
23 23 import collections
24 24
25 25 import datetime
26 26 import formencode
27 27 import formencode.htmlfill
28 28
29 29 import rhodecode
30 30 from pyramid.view import view_config
31 31 from pyramid.httpexceptions import HTTPFound, HTTPNotFound
32 32 from pyramid.renderers import render
33 33 from pyramid.response import Response
34 34
35 35 from rhodecode.apps._base import BaseAppView
36 36 from rhodecode.apps._base.navigation import navigation_list
37 37 from rhodecode.apps.svn_support.config_keys import generate_config
38 38 from rhodecode.lib import helpers as h
39 39 from rhodecode.lib.auth import (
40 40 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
41 41 from rhodecode.lib.celerylib import tasks, run_task
42 42 from rhodecode.lib.utils import repo2db_mapper
43 43 from rhodecode.lib.utils2 import str2bool, safe_unicode, AttributeDict
44 44 from rhodecode.lib.index import searcher_from_config
45 45
46 46 from rhodecode.model.db import RhodeCodeUi, Repository
47 47 from rhodecode.model.forms import (ApplicationSettingsForm,
48 48 ApplicationUiSettingsForm, ApplicationVisualisationForm,
49 49 LabsSettingsForm, IssueTrackerPatternsForm)
50 50 from rhodecode.model.permission import PermissionModel
51 51 from rhodecode.model.repo_group import RepoGroupModel
52 52
53 53 from rhodecode.model.scm import ScmModel
54 54 from rhodecode.model.notification import EmailNotificationModel
55 55 from rhodecode.model.meta import Session
56 56 from rhodecode.model.settings import (
57 57 IssueTrackerSettingsModel, VcsSettingsModel, SettingNotFound,
58 58 SettingsModel)
59 59
60 60
61 61 log = logging.getLogger(__name__)
62 62
63 63
64 64 class AdminSettingsView(BaseAppView):
65 65
66 66 def load_default_context(self):
67 67 c = self._get_local_tmpl_context()
68 68 c.labs_active = str2bool(
69 69 rhodecode.CONFIG.get('labs_settings_active', 'true'))
70 70 c.navlist = navigation_list(self.request)
71 71
72 72 return c
73 73
74 74 @classmethod
75 75 def _get_ui_settings(cls):
76 76 ret = RhodeCodeUi.query().all()
77 77
78 78 if not ret:
79 79 raise Exception('Could not get application ui settings !')
80 80 settings = {}
81 81 for each in ret:
82 82 k = each.ui_key
83 83 v = each.ui_value
84 84 if k == '/':
85 85 k = 'root_path'
86 86
87 87 if k in ['push_ssl', 'publish', 'enabled']:
88 88 v = str2bool(v)
89 89
90 90 if k.find('.') != -1:
91 91 k = k.replace('.', '_')
92 92
93 93 if each.ui_section in ['hooks', 'extensions']:
94 94 v = each.ui_active
95 95
96 96 settings[each.ui_section + '_' + k] = v
97 97 return settings
98 98
99 99 @classmethod
100 100 def _form_defaults(cls):
101 101 defaults = SettingsModel().get_all_settings()
102 102 defaults.update(cls._get_ui_settings())
103 103
104 104 defaults.update({
105 105 'new_svn_branch': '',
106 106 'new_svn_tag': '',
107 107 })
108 108 return defaults
109 109
110 110 @LoginRequired()
111 111 @HasPermissionAllDecorator('hg.admin')
112 112 @view_config(
113 113 route_name='admin_settings_vcs', request_method='GET',
114 114 renderer='rhodecode:templates/admin/settings/settings.mako')
115 115 def settings_vcs(self):
116 116 c = self.load_default_context()
117 117 c.active = 'vcs'
118 118 model = VcsSettingsModel()
119 119 c.svn_branch_patterns = model.get_global_svn_branch_patterns()
120 120 c.svn_tag_patterns = model.get_global_svn_tag_patterns()
121 121
122 122 settings = self.request.registry.settings
123 123 c.svn_proxy_generate_config = settings[generate_config]
124 124
125 125 defaults = self._form_defaults()
126 126
127 127 model.create_largeobjects_dirs_if_needed(defaults['paths_root_path'])
128 128
129 129 data = render('rhodecode:templates/admin/settings/settings.mako',
130 130 self._get_template_context(c), self.request)
131 131 html = formencode.htmlfill.render(
132 132 data,
133 133 defaults=defaults,
134 134 encoding="UTF-8",
135 135 force_defaults=False
136 136 )
137 137 return Response(html)
138 138
139 139 @LoginRequired()
140 140 @HasPermissionAllDecorator('hg.admin')
141 141 @CSRFRequired()
142 142 @view_config(
143 143 route_name='admin_settings_vcs_update', request_method='POST',
144 144 renderer='rhodecode:templates/admin/settings/settings.mako')
145 145 def settings_vcs_update(self):
146 146 _ = self.request.translate
147 147 c = self.load_default_context()
148 148 c.active = 'vcs'
149 149
150 150 model = VcsSettingsModel()
151 151 c.svn_branch_patterns = model.get_global_svn_branch_patterns()
152 152 c.svn_tag_patterns = model.get_global_svn_tag_patterns()
153 153
154 154 settings = self.request.registry.settings
155 155 c.svn_proxy_generate_config = settings[generate_config]
156 156
157 157 application_form = ApplicationUiSettingsForm(self.request.translate)()
158 158
159 159 try:
160 160 form_result = application_form.to_python(dict(self.request.POST))
161 161 except formencode.Invalid as errors:
162 162 h.flash(
163 163 _("Some form inputs contain invalid data."),
164 164 category='error')
165 165 data = render('rhodecode:templates/admin/settings/settings.mako',
166 166 self._get_template_context(c), self.request)
167 167 html = formencode.htmlfill.render(
168 168 data,
169 169 defaults=errors.value,
170 170 errors=errors.error_dict or {},
171 171 prefix_error=False,
172 172 encoding="UTF-8",
173 173 force_defaults=False
174 174 )
175 175 return Response(html)
176 176
177 177 try:
178 178 if c.visual.allow_repo_location_change:
179 179 model.update_global_path_setting(form_result['paths_root_path'])
180 180
181 181 model.update_global_ssl_setting(form_result['web_push_ssl'])
182 182 model.update_global_hook_settings(form_result)
183 183
184 184 model.create_or_update_global_svn_settings(form_result)
185 185 model.create_or_update_global_hg_settings(form_result)
186 186 model.create_or_update_global_git_settings(form_result)
187 187 model.create_or_update_global_pr_settings(form_result)
188 188 except Exception:
189 189 log.exception("Exception while updating settings")
190 190 h.flash(_('Error occurred during updating '
191 191 'application settings'), category='error')
192 192 else:
193 193 Session().commit()
194 194 h.flash(_('Updated VCS settings'), category='success')
195 195 raise HTTPFound(h.route_path('admin_settings_vcs'))
196 196
197 197 data = render('rhodecode:templates/admin/settings/settings.mako',
198 198 self._get_template_context(c), self.request)
199 199 html = formencode.htmlfill.render(
200 200 data,
201 201 defaults=self._form_defaults(),
202 202 encoding="UTF-8",
203 203 force_defaults=False
204 204 )
205 205 return Response(html)
206 206
207 207 @LoginRequired()
208 208 @HasPermissionAllDecorator('hg.admin')
209 209 @CSRFRequired()
210 210 @view_config(
211 211 route_name='admin_settings_vcs_svn_pattern_delete', request_method='POST',
212 212 renderer='json_ext', xhr=True)
213 213 def settings_vcs_delete_svn_pattern(self):
214 214 delete_pattern_id = self.request.POST.get('delete_svn_pattern')
215 215 model = VcsSettingsModel()
216 216 try:
217 217 model.delete_global_svn_pattern(delete_pattern_id)
218 218 except SettingNotFound:
219 219 log.exception(
220 220 'Failed to delete svn_pattern with id %s', delete_pattern_id)
221 221 raise HTTPNotFound()
222 222
223 223 Session().commit()
224 224 return True
225 225
226 226 @LoginRequired()
227 227 @HasPermissionAllDecorator('hg.admin')
228 228 @view_config(
229 229 route_name='admin_settings_mapping', request_method='GET',
230 230 renderer='rhodecode:templates/admin/settings/settings.mako')
231 231 def settings_mapping(self):
232 232 c = self.load_default_context()
233 233 c.active = 'mapping'
234 234
235 235 data = render('rhodecode:templates/admin/settings/settings.mako',
236 236 self._get_template_context(c), self.request)
237 237 html = formencode.htmlfill.render(
238 238 data,
239 239 defaults=self._form_defaults(),
240 240 encoding="UTF-8",
241 241 force_defaults=False
242 242 )
243 243 return Response(html)
244 244
245 245 @LoginRequired()
246 246 @HasPermissionAllDecorator('hg.admin')
247 247 @CSRFRequired()
248 248 @view_config(
249 249 route_name='admin_settings_mapping_update', request_method='POST',
250 250 renderer='rhodecode:templates/admin/settings/settings.mako')
251 251 def settings_mapping_update(self):
252 252 _ = self.request.translate
253 253 c = self.load_default_context()
254 254 c.active = 'mapping'
255 255 rm_obsolete = self.request.POST.get('destroy', False)
256 256 invalidate_cache = self.request.POST.get('invalidate', False)
257 257 log.debug('rescanning repo location with destroy obsolete=%s', rm_obsolete)
258 258
259 259 if invalidate_cache:
260 260 log.debug('invalidating all repositories cache')
261 261 for repo in Repository.get_all():
262 262 ScmModel().mark_for_invalidation(repo.repo_name, delete=True)
263 263
264 264 filesystem_repos = ScmModel().repo_scan()
265 265 added, removed = repo2db_mapper(filesystem_repos, rm_obsolete)
266 266 PermissionModel().trigger_permission_flush()
267 267
268 268 _repr = lambda l: ', '.join(map(safe_unicode, l)) or '-'
269 269 h.flash(_('Repositories successfully '
270 270 'rescanned added: %s ; removed: %s') %
271 271 (_repr(added), _repr(removed)),
272 272 category='success')
273 273 raise HTTPFound(h.route_path('admin_settings_mapping'))
274 274
275 275 @LoginRequired()
276 276 @HasPermissionAllDecorator('hg.admin')
277 277 @view_config(
278 278 route_name='admin_settings', request_method='GET',
279 279 renderer='rhodecode:templates/admin/settings/settings.mako')
280 280 @view_config(
281 281 route_name='admin_settings_global', request_method='GET',
282 282 renderer='rhodecode:templates/admin/settings/settings.mako')
283 283 def settings_global(self):
284 284 c = self.load_default_context()
285 285 c.active = 'global'
286 286 c.personal_repo_group_default_pattern = RepoGroupModel()\
287 287 .get_personal_group_name_pattern()
288 288
289 289 data = render('rhodecode:templates/admin/settings/settings.mako',
290 290 self._get_template_context(c), self.request)
291 291 html = formencode.htmlfill.render(
292 292 data,
293 293 defaults=self._form_defaults(),
294 294 encoding="UTF-8",
295 295 force_defaults=False
296 296 )
297 297 return Response(html)
298 298
299 299 @LoginRequired()
300 300 @HasPermissionAllDecorator('hg.admin')
301 301 @CSRFRequired()
302 302 @view_config(
303 303 route_name='admin_settings_update', request_method='POST',
304 304 renderer='rhodecode:templates/admin/settings/settings.mako')
305 305 @view_config(
306 306 route_name='admin_settings_global_update', request_method='POST',
307 307 renderer='rhodecode:templates/admin/settings/settings.mako')
308 308 def settings_global_update(self):
309 309 _ = self.request.translate
310 310 c = self.load_default_context()
311 311 c.active = 'global'
312 312 c.personal_repo_group_default_pattern = RepoGroupModel()\
313 313 .get_personal_group_name_pattern()
314 314 application_form = ApplicationSettingsForm(self.request.translate)()
315 315 try:
316 316 form_result = application_form.to_python(dict(self.request.POST))
317 317 except formencode.Invalid as errors:
318 318 h.flash(
319 319 _("Some form inputs contain invalid data."),
320 320 category='error')
321 321 data = render('rhodecode:templates/admin/settings/settings.mako',
322 322 self._get_template_context(c), self.request)
323 323 html = formencode.htmlfill.render(
324 324 data,
325 325 defaults=errors.value,
326 326 errors=errors.error_dict or {},
327 327 prefix_error=False,
328 328 encoding="UTF-8",
329 329 force_defaults=False
330 330 )
331 331 return Response(html)
332 332
333 333 settings = [
334 334 ('title', 'rhodecode_title', 'unicode'),
335 335 ('realm', 'rhodecode_realm', 'unicode'),
336 336 ('pre_code', 'rhodecode_pre_code', 'unicode'),
337 337 ('post_code', 'rhodecode_post_code', 'unicode'),
338 338 ('captcha_public_key', 'rhodecode_captcha_public_key', 'unicode'),
339 339 ('captcha_private_key', 'rhodecode_captcha_private_key', 'unicode'),
340 340 ('create_personal_repo_group', 'rhodecode_create_personal_repo_group', 'bool'),
341 341 ('personal_repo_group_pattern', 'rhodecode_personal_repo_group_pattern', 'unicode'),
342 342 ]
343 343 try:
344 344 for setting, form_key, type_ in settings:
345 345 sett = SettingsModel().create_or_update_setting(
346 346 setting, form_result[form_key], type_)
347 347 Session().add(sett)
348 348
349 349 Session().commit()
350 350 SettingsModel().invalidate_settings_cache()
351 351 h.flash(_('Updated application settings'), category='success')
352 352 except Exception:
353 353 log.exception("Exception while updating application settings")
354 354 h.flash(
355 355 _('Error occurred during updating application settings'),
356 356 category='error')
357 357
358 358 raise HTTPFound(h.route_path('admin_settings_global'))
359 359
360 360 @LoginRequired()
361 361 @HasPermissionAllDecorator('hg.admin')
362 362 @view_config(
363 363 route_name='admin_settings_visual', request_method='GET',
364 364 renderer='rhodecode:templates/admin/settings/settings.mako')
365 365 def settings_visual(self):
366 366 c = self.load_default_context()
367 367 c.active = 'visual'
368 368
369 369 data = render('rhodecode:templates/admin/settings/settings.mako',
370 370 self._get_template_context(c), self.request)
371 371 html = formencode.htmlfill.render(
372 372 data,
373 373 defaults=self._form_defaults(),
374 374 encoding="UTF-8",
375 375 force_defaults=False
376 376 )
377 377 return Response(html)
378 378
379 379 @LoginRequired()
380 380 @HasPermissionAllDecorator('hg.admin')
381 381 @CSRFRequired()
382 382 @view_config(
383 383 route_name='admin_settings_visual_update', request_method='POST',
384 384 renderer='rhodecode:templates/admin/settings/settings.mako')
385 385 def settings_visual_update(self):
386 386 _ = self.request.translate
387 387 c = self.load_default_context()
388 388 c.active = 'visual'
389 389 application_form = ApplicationVisualisationForm(self.request.translate)()
390 390 try:
391 391 form_result = application_form.to_python(dict(self.request.POST))
392 392 except formencode.Invalid as errors:
393 393 h.flash(
394 394 _("Some form inputs contain invalid data."),
395 395 category='error')
396 396 data = render('rhodecode:templates/admin/settings/settings.mako',
397 397 self._get_template_context(c), self.request)
398 398 html = formencode.htmlfill.render(
399 399 data,
400 400 defaults=errors.value,
401 401 errors=errors.error_dict or {},
402 402 prefix_error=False,
403 403 encoding="UTF-8",
404 404 force_defaults=False
405 405 )
406 406 return Response(html)
407 407
408 408 try:
409 409 settings = [
410 410 ('show_public_icon', 'rhodecode_show_public_icon', 'bool'),
411 411 ('show_private_icon', 'rhodecode_show_private_icon', 'bool'),
412 412 ('stylify_metatags', 'rhodecode_stylify_metatags', 'bool'),
413 413 ('repository_fields', 'rhodecode_repository_fields', 'bool'),
414 414 ('dashboard_items', 'rhodecode_dashboard_items', 'int'),
415 415 ('admin_grid_items', 'rhodecode_admin_grid_items', 'int'),
416 416 ('show_version', 'rhodecode_show_version', 'bool'),
417 417 ('use_gravatar', 'rhodecode_use_gravatar', 'bool'),
418 418 ('markup_renderer', 'rhodecode_markup_renderer', 'unicode'),
419 419 ('gravatar_url', 'rhodecode_gravatar_url', 'unicode'),
420 420 ('clone_uri_tmpl', 'rhodecode_clone_uri_tmpl', 'unicode'),
421 421 ('clone_uri_ssh_tmpl', 'rhodecode_clone_uri_ssh_tmpl', 'unicode'),
422 422 ('support_url', 'rhodecode_support_url', 'unicode'),
423 423 ('show_revision_number', 'rhodecode_show_revision_number', 'bool'),
424 424 ('show_sha_length', 'rhodecode_show_sha_length', 'int'),
425 425 ]
426 426 for setting, form_key, type_ in settings:
427 427 sett = SettingsModel().create_or_update_setting(
428 428 setting, form_result[form_key], type_)
429 429 Session().add(sett)
430 430
431 431 Session().commit()
432 432 SettingsModel().invalidate_settings_cache()
433 433 h.flash(_('Updated visualisation settings'), category='success')
434 434 except Exception:
435 435 log.exception("Exception updating visualization settings")
436 436 h.flash(_('Error occurred during updating '
437 437 'visualisation settings'),
438 438 category='error')
439 439
440 440 raise HTTPFound(h.route_path('admin_settings_visual'))
441 441
442 442 @LoginRequired()
443 443 @HasPermissionAllDecorator('hg.admin')
444 444 @view_config(
445 445 route_name='admin_settings_issuetracker', request_method='GET',
446 446 renderer='rhodecode:templates/admin/settings/settings.mako')
447 447 def settings_issuetracker(self):
448 448 c = self.load_default_context()
449 449 c.active = 'issuetracker'
450 450 defaults = c.rc_config
451 451
452 452 entry_key = 'rhodecode_issuetracker_pat_'
453 453
454 454 c.issuetracker_entries = {}
455 455 for k, v in defaults.items():
456 456 if k.startswith(entry_key):
457 457 uid = k[len(entry_key):]
458 458 c.issuetracker_entries[uid] = None
459 459
460 460 for uid in c.issuetracker_entries:
461 461 c.issuetracker_entries[uid] = AttributeDict({
462 462 'pat': defaults.get('rhodecode_issuetracker_pat_' + uid),
463 463 'url': defaults.get('rhodecode_issuetracker_url_' + uid),
464 464 'pref': defaults.get('rhodecode_issuetracker_pref_' + uid),
465 465 'desc': defaults.get('rhodecode_issuetracker_desc_' + uid),
466 466 })
467 467
468 468 return self._get_template_context(c)
469 469
470 470 @LoginRequired()
471 471 @HasPermissionAllDecorator('hg.admin')
472 472 @CSRFRequired()
473 473 @view_config(
474 474 route_name='admin_settings_issuetracker_test', request_method='POST',
475 475 renderer='string', xhr=True)
476 476 def settings_issuetracker_test(self):
477 return h.urlify_commit_message(
477 error_container = []
478
479 urlified_commit = h.urlify_commit_message(
478 480 self.request.POST.get('test_text', ''),
479 'repo_group/test_repo1')
481 'repo_group/test_repo1', error_container=error_container)
482 if error_container:
483 def converter(inp):
484 return h.html_escape(unicode(inp))
485
486 return 'ERRORS: ' + '\n'.join(map(converter, error_container))
487
488 return urlified_commit
480 489
481 490 @LoginRequired()
482 491 @HasPermissionAllDecorator('hg.admin')
483 492 @CSRFRequired()
484 493 @view_config(
485 494 route_name='admin_settings_issuetracker_update', request_method='POST',
486 495 renderer='rhodecode:templates/admin/settings/settings.mako')
487 496 def settings_issuetracker_update(self):
488 497 _ = self.request.translate
489 498 self.load_default_context()
490 499 settings_model = IssueTrackerSettingsModel()
491 500
492 501 try:
493 502 form = IssueTrackerPatternsForm(self.request.translate)()
494 503 data = form.to_python(self.request.POST)
495 504 except formencode.Invalid as errors:
496 505 log.exception('Failed to add new pattern')
497 506 error = errors
498 507 h.flash(_('Invalid issue tracker pattern: {}'.format(error)),
499 508 category='error')
500 509 raise HTTPFound(h.route_path('admin_settings_issuetracker'))
501 510
502 511 if data:
503 512 for uid in data.get('delete_patterns', []):
504 513 settings_model.delete_entries(uid)
505 514
506 515 for pattern in data.get('patterns', []):
507 516 for setting, value, type_ in pattern:
508 517 sett = settings_model.create_or_update_setting(
509 518 setting, value, type_)
510 519 Session().add(sett)
511 520
512 521 Session().commit()
513 522
514 523 SettingsModel().invalidate_settings_cache()
515 524 h.flash(_('Updated issue tracker entries'), category='success')
516 525 raise HTTPFound(h.route_path('admin_settings_issuetracker'))
517 526
518 527 @LoginRequired()
519 528 @HasPermissionAllDecorator('hg.admin')
520 529 @CSRFRequired()
521 530 @view_config(
522 531 route_name='admin_settings_issuetracker_delete', request_method='POST',
523 532 renderer='json_ext', xhr=True)
524 533 def settings_issuetracker_delete(self):
525 534 _ = self.request.translate
526 535 self.load_default_context()
527 536 uid = self.request.POST.get('uid')
528 537 try:
529 538 IssueTrackerSettingsModel().delete_entries(uid)
530 539 except Exception:
531 540 log.exception('Failed to delete issue tracker setting %s', uid)
532 541 raise HTTPNotFound()
533 542
534 543 SettingsModel().invalidate_settings_cache()
535 544 h.flash(_('Removed issue tracker entry.'), category='success')
536 545
537 546 return {'deleted': uid}
538 547
539 548 @LoginRequired()
540 549 @HasPermissionAllDecorator('hg.admin')
541 550 @view_config(
542 551 route_name='admin_settings_email', request_method='GET',
543 552 renderer='rhodecode:templates/admin/settings/settings.mako')
544 553 def settings_email(self):
545 554 c = self.load_default_context()
546 555 c.active = 'email'
547 556 c.rhodecode_ini = rhodecode.CONFIG
548 557
549 558 data = render('rhodecode:templates/admin/settings/settings.mako',
550 559 self._get_template_context(c), self.request)
551 560 html = formencode.htmlfill.render(
552 561 data,
553 562 defaults=self._form_defaults(),
554 563 encoding="UTF-8",
555 564 force_defaults=False
556 565 )
557 566 return Response(html)
558 567
559 568 @LoginRequired()
560 569 @HasPermissionAllDecorator('hg.admin')
561 570 @CSRFRequired()
562 571 @view_config(
563 572 route_name='admin_settings_email_update', request_method='POST',
564 573 renderer='rhodecode:templates/admin/settings/settings.mako')
565 574 def settings_email_update(self):
566 575 _ = self.request.translate
567 576 c = self.load_default_context()
568 577 c.active = 'email'
569 578
570 579 test_email = self.request.POST.get('test_email')
571 580
572 581 if not test_email:
573 582 h.flash(_('Please enter email address'), category='error')
574 583 raise HTTPFound(h.route_path('admin_settings_email'))
575 584
576 585 email_kwargs = {
577 586 'date': datetime.datetime.now(),
578 587 'user': self._rhodecode_db_user
579 588 }
580 589
581 590 (subject, email_body, email_body_plaintext) = EmailNotificationModel().render_email(
582 591 EmailNotificationModel.TYPE_EMAIL_TEST, **email_kwargs)
583 592
584 593 recipients = [test_email] if test_email else None
585 594
586 595 run_task(tasks.send_email, recipients, subject,
587 596 email_body_plaintext, email_body)
588 597
589 598 h.flash(_('Send email task created'), category='success')
590 599 raise HTTPFound(h.route_path('admin_settings_email'))
591 600
592 601 @LoginRequired()
593 602 @HasPermissionAllDecorator('hg.admin')
594 603 @view_config(
595 604 route_name='admin_settings_hooks', request_method='GET',
596 605 renderer='rhodecode:templates/admin/settings/settings.mako')
597 606 def settings_hooks(self):
598 607 c = self.load_default_context()
599 608 c.active = 'hooks'
600 609
601 610 model = SettingsModel()
602 611 c.hooks = model.get_builtin_hooks()
603 612 c.custom_hooks = model.get_custom_hooks()
604 613
605 614 data = render('rhodecode:templates/admin/settings/settings.mako',
606 615 self._get_template_context(c), self.request)
607 616 html = formencode.htmlfill.render(
608 617 data,
609 618 defaults=self._form_defaults(),
610 619 encoding="UTF-8",
611 620 force_defaults=False
612 621 )
613 622 return Response(html)
614 623
615 624 @LoginRequired()
616 625 @HasPermissionAllDecorator('hg.admin')
617 626 @CSRFRequired()
618 627 @view_config(
619 628 route_name='admin_settings_hooks_update', request_method='POST',
620 629 renderer='rhodecode:templates/admin/settings/settings.mako')
621 630 @view_config(
622 631 route_name='admin_settings_hooks_delete', request_method='POST',
623 632 renderer='rhodecode:templates/admin/settings/settings.mako')
624 633 def settings_hooks_update(self):
625 634 _ = self.request.translate
626 635 c = self.load_default_context()
627 636 c.active = 'hooks'
628 637 if c.visual.allow_custom_hooks_settings:
629 638 ui_key = self.request.POST.get('new_hook_ui_key')
630 639 ui_value = self.request.POST.get('new_hook_ui_value')
631 640
632 641 hook_id = self.request.POST.get('hook_id')
633 642 new_hook = False
634 643
635 644 model = SettingsModel()
636 645 try:
637 646 if ui_value and ui_key:
638 647 model.create_or_update_hook(ui_key, ui_value)
639 648 h.flash(_('Added new hook'), category='success')
640 649 new_hook = True
641 650 elif hook_id:
642 651 RhodeCodeUi.delete(hook_id)
643 652 Session().commit()
644 653
645 654 # check for edits
646 655 update = False
647 656 _d = self.request.POST.dict_of_lists()
648 657 for k, v in zip(_d.get('hook_ui_key', []),
649 658 _d.get('hook_ui_value_new', [])):
650 659 model.create_or_update_hook(k, v)
651 660 update = True
652 661
653 662 if update and not new_hook:
654 663 h.flash(_('Updated hooks'), category='success')
655 664 Session().commit()
656 665 except Exception:
657 666 log.exception("Exception during hook creation")
658 667 h.flash(_('Error occurred during hook creation'),
659 668 category='error')
660 669
661 670 raise HTTPFound(h.route_path('admin_settings_hooks'))
662 671
663 672 @LoginRequired()
664 673 @HasPermissionAllDecorator('hg.admin')
665 674 @view_config(
666 675 route_name='admin_settings_search', request_method='GET',
667 676 renderer='rhodecode:templates/admin/settings/settings.mako')
668 677 def settings_search(self):
669 678 c = self.load_default_context()
670 679 c.active = 'search'
671 680
672 681 c.searcher = searcher_from_config(self.request.registry.settings)
673 682 c.statistics = c.searcher.statistics(self.request.translate)
674 683
675 684 return self._get_template_context(c)
676 685
677 686 @LoginRequired()
678 687 @HasPermissionAllDecorator('hg.admin')
679 688 @view_config(
680 689 route_name='admin_settings_automation', request_method='GET',
681 690 renderer='rhodecode:templates/admin/settings/settings.mako')
682 691 def settings_automation(self):
683 692 c = self.load_default_context()
684 693 c.active = 'automation'
685 694
686 695 return self._get_template_context(c)
687 696
688 697 @LoginRequired()
689 698 @HasPermissionAllDecorator('hg.admin')
690 699 @view_config(
691 700 route_name='admin_settings_labs', request_method='GET',
692 701 renderer='rhodecode:templates/admin/settings/settings.mako')
693 702 def settings_labs(self):
694 703 c = self.load_default_context()
695 704 if not c.labs_active:
696 705 raise HTTPFound(h.route_path('admin_settings'))
697 706
698 707 c.active = 'labs'
699 708 c.lab_settings = _LAB_SETTINGS
700 709
701 710 data = render('rhodecode:templates/admin/settings/settings.mako',
702 711 self._get_template_context(c), self.request)
703 712 html = formencode.htmlfill.render(
704 713 data,
705 714 defaults=self._form_defaults(),
706 715 encoding="UTF-8",
707 716 force_defaults=False
708 717 )
709 718 return Response(html)
710 719
711 720 @LoginRequired()
712 721 @HasPermissionAllDecorator('hg.admin')
713 722 @CSRFRequired()
714 723 @view_config(
715 724 route_name='admin_settings_labs_update', request_method='POST',
716 725 renderer='rhodecode:templates/admin/settings/settings.mako')
717 726 def settings_labs_update(self):
718 727 _ = self.request.translate
719 728 c = self.load_default_context()
720 729 c.active = 'labs'
721 730
722 731 application_form = LabsSettingsForm(self.request.translate)()
723 732 try:
724 733 form_result = application_form.to_python(dict(self.request.POST))
725 734 except formencode.Invalid as errors:
726 735 h.flash(
727 736 _("Some form inputs contain invalid data."),
728 737 category='error')
729 738 data = render('rhodecode:templates/admin/settings/settings.mako',
730 739 self._get_template_context(c), self.request)
731 740 html = formencode.htmlfill.render(
732 741 data,
733 742 defaults=errors.value,
734 743 errors=errors.error_dict or {},
735 744 prefix_error=False,
736 745 encoding="UTF-8",
737 746 force_defaults=False
738 747 )
739 748 return Response(html)
740 749
741 750 try:
742 751 session = Session()
743 752 for setting in _LAB_SETTINGS:
744 753 setting_name = setting.key[len('rhodecode_'):]
745 754 sett = SettingsModel().create_or_update_setting(
746 755 setting_name, form_result[setting.key], setting.type)
747 756 session.add(sett)
748 757
749 758 except Exception:
750 759 log.exception('Exception while updating lab settings')
751 760 h.flash(_('Error occurred during updating labs settings'),
752 761 category='error')
753 762 else:
754 763 Session().commit()
755 764 SettingsModel().invalidate_settings_cache()
756 765 h.flash(_('Updated Labs settings'), category='success')
757 766 raise HTTPFound(h.route_path('admin_settings_labs'))
758 767
759 768 data = render('rhodecode:templates/admin/settings/settings.mako',
760 769 self._get_template_context(c), self.request)
761 770 html = formencode.htmlfill.render(
762 771 data,
763 772 defaults=self._form_defaults(),
764 773 encoding="UTF-8",
765 774 force_defaults=False
766 775 )
767 776 return Response(html)
768 777
769 778
770 779 # :param key: name of the setting including the 'rhodecode_' prefix
771 780 # :param type: the RhodeCodeSetting type to use.
772 781 # :param group: the i18ned group in which we should display this setting
773 782 # :param label: the i18ned label we should display for this setting
774 783 # :param help: the i18ned help we should display for this setting
775 784 LabSetting = collections.namedtuple(
776 785 'LabSetting', ('key', 'type', 'group', 'label', 'help'))
777 786
778 787
779 788 # This list has to be kept in sync with the form
780 789 # rhodecode.model.forms.LabsSettingsForm.
781 790 _LAB_SETTINGS = [
782 791
783 792 ]
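The list above is empty in this release. For illustration only, a hypothetical entry, which would also need a matching field in ``rhodecode.model.forms.LabsSettingsForm``, could look like this (the key, type and texts below are invented):

.. code-block:: python

    # Hypothetical labs entry -- not part of the actual code base.
    _LAB_SETTINGS = [
        LabSetting(
            key='rhodecode_hypothetical_feature',  # must keep the 'rhodecode_' prefix
            type='bool',                           # stored as a bool setting
            group='Experimental',
            label='Enable hypothetical feature',
            help='Illustrative example of a labs toggle.',
        ),
    ]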
@@ -1,418 +1,486 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22 import logging
23 23 import datetime
24 24
25 25 from pyramid.view import view_config
26 26 from pyramid.renderers import render_to_response
27 27 from rhodecode.apps._base import BaseAppView
28 28 from rhodecode.lib.celerylib import run_task, tasks
29 29 from rhodecode.lib.utils2 import AttributeDict
30 30 from rhodecode.model.db import User
31 31 from rhodecode.model.notification import EmailNotificationModel
32 32
33 33 log = logging.getLogger(__name__)
34 34
35 35
36 36 class DebugStyleView(BaseAppView):
37
37 38 def load_default_context(self):
38 39 c = self._get_local_tmpl_context()
39 40
40 41 return c
41 42
42 43 @view_config(
43 44 route_name='debug_style_home', request_method='GET',
44 45 renderer=None)
45 46 def index(self):
46 47 c = self.load_default_context()
47 48 c.active = 'index'
48 49
49 50 return render_to_response(
50 51 'debug_style/index.html', self._get_template_context(c),
51 52 request=self.request)
52 53
53 54 @view_config(
54 55 route_name='debug_style_email', request_method='GET',
55 56 renderer=None)
56 57 @view_config(
57 58 route_name='debug_style_email_plain_rendered', request_method='GET',
58 59 renderer=None)
59 60 def render_email(self):
60 61 c = self.load_default_context()
61 62 email_id = self.request.matchdict['email_id']
62 63 c.active = 'emails'
63 64
64 65 pr = AttributeDict(
65 66 pull_request_id=123,
66 67 title='digital_ocean: fix redis, elastic search start on boot, '
67 68 'fix fd limits on supervisor, set postgres 11 version',
68 69 description='''
69 70 Check if we should use full-topic or mini-topic.
70 71
71 72 - full topic produces some problems with merge states etc
72 73 - server-mini-topic probably needs tweaks.
73 74 ''',
74 75 repo_name='foobar',
75 76 source_ref_parts=AttributeDict(type='branch', name='fix-ticket-2000'),
76 77 target_ref_parts=AttributeDict(type='branch', name='master'),
77 78 )
79
78 80 target_repo = AttributeDict(repo_name='repo_group/target_repo')
79 81 source_repo = AttributeDict(repo_name='repo_group/source_repo')
80 82 user = User.get_by_username(self.request.GET.get('user')) or self._rhodecode_db_user
81 83 # file/commit changes for PR update
82 84 commit_changes = AttributeDict({
83 85 'added': ['aaaaaaabbbbb', 'cccccccddddddd'],
84 86 'removed': ['eeeeeeeeeee'],
85 87 })
88
86 89 file_changes = AttributeDict({
87 90 'added': ['a/file1.md', 'file2.py'],
88 91 'modified': ['b/modified_file.rst'],
89 92 'removed': ['.idea'],
90 93 })
91 94
92 95 exc_traceback = {
93 96 'exc_utc_date': '2020-03-26T12:54:50.683281',
94 97 'exc_id': 139638856342656,
95 98 'exc_timestamp': '1585227290.683288',
96 99 'version': 'v1',
97 100 'exc_message': 'Traceback (most recent call last):\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/tweens.py", line 41, in excview_tween\n response = handler(request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/router.py", line 148, in handle_request\n registry, request, context, context_iface, view_name\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/view.py", line 667, in _call_view\n response = view_callable(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 188, in attr_view\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 214, in predicate_wrapper\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 401, in viewresult_to_response\n result = view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 132, in _class_view\n response = getattr(inst, attr)()\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/apps/debug_style/views.py", line 355, in render_email\n template_type, **email_kwargs.get(email_id, {}))\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/model/notification.py", line 402, in render_email\n body = email_template.render(None, **_kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 95, in render\n return self._render_with_exc(tmpl, args, kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 79, in _render_with_exc\n return render_func.render(*args, **kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/template.py", line 476, in render\n return runtime._render(self, self.callable_, args, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 883, in _render\n **_kwargs_for_callable(callable_, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 920, in _render_context\n _exec_template(inherit, lclcontext, args=args, kwargs=kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 947, in _exec_template\n callable_(context, *args, **kwargs)\n File "rhodecode_templates_email_templates_base_mako", line 63, in render_body\n File "rhodecode_templates_email_templates_exception_tracker_mako", line 43, in render_body\nAttributeError: \'str\' object has no attribute \'get\'\n',
98 101 'exc_type': 'AttributeError'
99 102 }
103
100 104 email_kwargs = {
101 105 'test': {},
106
102 107 'message': {
103 108 'body': 'message body !'
104 109 },
110
105 111 'email_test': {
106 112 'user': user,
107 113 'date': datetime.datetime.now(),
108 114 },
115
109 116 'exception': {
110 117 'email_prefix': '[RHODECODE ERROR]',
111 118 'exc_id': exc_traceback['exc_id'],
112 119 'exc_url': 'http://server-url/{}'.format(exc_traceback['exc_id']),
113 120 'exc_type_name': 'NameError',
114 121 'exc_traceback': exc_traceback,
115 122 },
123
116 124 'password_reset': {
117 125 'password_reset_url': 'http://example.com/reset-rhodecode-password/token',
118 126
119 127 'user': user,
120 128 'date': datetime.datetime.now(),
121 129 'email': 'test@rhodecode.com',
122 130 'first_admin_email': User.get_first_super_admin().email
123 131 },
132
124 133 'password_reset_confirmation': {
125 134 'new_password': 'new-password-example',
126 135 'user': user,
127 136 'date': datetime.datetime.now(),
128 137 'email': 'test@rhodecode.com',
129 138 'first_admin_email': User.get_first_super_admin().email
130 139 },
140
131 141 'registration': {
132 142 'user': user,
133 143 'date': datetime.datetime.now(),
134 144 },
135 145
136 146 'pull_request_comment': {
137 147 'user': user,
138 148
139 149 'status_change': None,
140 150 'status_change_type': None,
141 151
142 152 'pull_request': pr,
143 153 'pull_request_commits': [],
144 154
145 155 'pull_request_target_repo': target_repo,
146 156 'pull_request_target_repo_url': 'http://target-repo/url',
147 157
148 158 'pull_request_source_repo': source_repo,
149 159 'pull_request_source_repo_url': 'http://source-repo/url',
150 160
151 161 'pull_request_url': 'http://localhost/pr1',
152 162 'pr_comment_url': 'http://comment-url',
153 163 'pr_comment_reply_url': 'http://comment-url#reply',
154 164
155 165 'comment_file': None,
156 166 'comment_line': None,
157 167 'comment_type': 'note',
158 168 'comment_body': 'This is my comment body. *I like !*',
159 169 'comment_id': 2048,
160 170 'renderer_type': 'markdown',
161 171 'mention': True,
162 172
163 173 },
174
164 175 'pull_request_comment+status': {
165 176 'user': user,
166 177
167 178 'status_change': 'approved',
168 179 'status_change_type': 'approved',
169 180
170 181 'pull_request': pr,
171 182 'pull_request_commits': [],
172 183
173 184 'pull_request_target_repo': target_repo,
174 185 'pull_request_target_repo_url': 'http://target-repo/url',
175 186
176 187 'pull_request_source_repo': source_repo,
177 188 'pull_request_source_repo_url': 'http://source-repo/url',
178 189
179 190 'pull_request_url': 'http://localhost/pr1',
180 191 'pr_comment_url': 'http://comment-url',
181 192 'pr_comment_reply_url': 'http://comment-url#reply',
182 193
183 194 'comment_type': 'todo',
184 195 'comment_file': None,
185 196 'comment_line': None,
186 197 'comment_body': '''
187 198 I think something like this would be better
188 199
189 200 ```py
190 201 // markdown renderer
191 202
192 203 def db():
193 204 global connection
194 205 return connection
195 206
196 207 ```
197 208
198 209 ''',
199 210 'comment_id': 2048,
200 211 'renderer_type': 'markdown',
201 212 'mention': True,
202 213
203 214 },
215
204 216 'pull_request_comment+file': {
205 217 'user': user,
206 218
207 219 'status_change': None,
208 220 'status_change_type': None,
209 221
210 222 'pull_request': pr,
211 223 'pull_request_commits': [],
212 224
213 225 'pull_request_target_repo': target_repo,
214 226 'pull_request_target_repo_url': 'http://target-repo/url',
215 227
216 228 'pull_request_source_repo': source_repo,
217 229 'pull_request_source_repo_url': 'http://source-repo/url',
218 230
219 231 'pull_request_url': 'http://localhost/pr1',
220 232
221 233 'pr_comment_url': 'http://comment-url',
222 234 'pr_comment_reply_url': 'http://comment-url#reply',
223 235
224 236 'comment_file': 'rhodecode/model/get_flow_commits',
225 237 'comment_line': 'o1210',
226 238 'comment_type': 'todo',
227 239 'comment_body': '''
228 240 I like this !
229 241
230 242 But please check this code
231 243
232 244 .. code-block:: javascript
233 245
234 246 // THIS IS RST CODE
235 247
236 248 this.createResolutionComment = function(commentId) {
237 249 // hide the trigger text
238 250 $('#resolve-comment-{0}'.format(commentId)).hide();
239 251
240 252 var comment = $('#comment-'+commentId);
241 253 var commentData = comment.data();
242 254 if (commentData.commentInline) {
243 255 this.createComment(comment, commentId)
244 256 } else {
245 257 Rhodecode.comments.createGeneralComment('general', "$placeholder", commentId)
246 258 }
247 259
248 260 return false;
249 261 };
250 262
251 263 This should work better !
252 264 ''',
253 265 'comment_id': 2048,
254 266 'renderer_type': 'rst',
255 267 'mention': True,
256 268
257 269 },
258 270
259 271 'pull_request_update': {
260 272 'updating_user': user,
261 273
262 274 'status_change': None,
263 275 'status_change_type': None,
264 276
265 277 'pull_request': pr,
266 278 'pull_request_commits': [],
267 279
268 280 'pull_request_target_repo': target_repo,
269 281 'pull_request_target_repo_url': 'http://target-repo/url',
270 282
271 283 'pull_request_source_repo': source_repo,
272 284 'pull_request_source_repo_url': 'http://source-repo/url',
273 285
274 286 'pull_request_url': 'http://localhost/pr1',
275 287
276 288 # update comment links
277 289 'pr_comment_url': 'http://comment-url',
278 290 'pr_comment_reply_url': 'http://comment-url#reply',
279 291 'ancestor_commit_id': 'f39bd443',
280 292 'added_commits': commit_changes.added,
281 293 'removed_commits': commit_changes.removed,
282 294 'changed_files': (file_changes.added + file_changes.modified + file_changes.removed),
283 295 'added_files': file_changes.added,
284 296 'modified_files': file_changes.modified,
285 297 'removed_files': file_changes.removed,
286 298 },
287 299
288 300 'cs_comment': {
289 301 'user': user,
290 302 'commit': AttributeDict(idx=123, raw_id='a'*40, message='Commit message'),
291 303 'status_change': None,
292 304 'status_change_type': None,
293 305
294 306 'commit_target_repo_url': 'http://foo.example.com/#comment1',
295 307 'repo_name': 'test-repo',
296 308 'comment_type': 'note',
297 309 'comment_file': None,
298 310 'comment_line': None,
299 311 'commit_comment_url': 'http://comment-url',
300 312 'commit_comment_reply_url': 'http://comment-url#reply',
301 313 'comment_body': 'This is my comment body. *I like !*',
302 314 'comment_id': 2048,
303 315 'renderer_type': 'markdown',
304 316 'mention': True,
305 317 },
318
306 319 'cs_comment+status': {
307 320 'user': user,
308 321 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
309 322 'status_change': 'approved',
310 323 'status_change_type': 'approved',
311 324
312 325 'commit_target_repo_url': 'http://foo.example.com/#comment1',
313 326 'repo_name': 'test-repo',
314 327 'comment_type': 'note',
315 328 'comment_file': None,
316 329 'comment_line': None,
317 330 'commit_comment_url': 'http://comment-url',
318 331 'commit_comment_reply_url': 'http://comment-url#reply',
319 332 'comment_body': '''
320 333 Hello **world**
321 334
322 335 This is a multiline comment :)
323 336
324 337 - list
325 338 - list2
326 339 ''',
327 340 'comment_id': 2048,
328 341 'renderer_type': 'markdown',
329 342 'mention': True,
330 343 },
344
331 345 'cs_comment+file': {
332 346 'user': user,
333 347 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
334 348 'status_change': None,
335 349 'status_change_type': None,
336 350
337 351 'commit_target_repo_url': 'http://foo.example.com/#comment1',
338 352 'repo_name': 'test-repo',
339 353
340 354 'comment_type': 'note',
341 355 'comment_file': 'test-file.py',
342 356 'comment_line': 'n100',
343 357
344 358 'commit_comment_url': 'http://comment-url',
345 359 'commit_comment_reply_url': 'http://comment-url#reply',
346 360 'comment_body': 'This is my comment body. *I like !*',
347 361 'comment_id': 2048,
348 362 'renderer_type': 'markdown',
349 363 'mention': True,
350 364 },
351 365
352 366 'pull_request': {
353 367 'user': user,
354 368 'pull_request': pr,
355 369 'pull_request_commits': [
356 370 ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
357 371 my-account: moved email closer to profile as it's similar data just moved outside.
358 372 '''),
359 373 ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
360 374 users: description edit fixes
361 375
362 376 - tests
363 377 - added metatags info
364 378 '''),
365 379 ],
366 380
367 381 'pull_request_target_repo': target_repo,
368 382 'pull_request_target_repo_url': 'http://target-repo/url',
369 383
370 384 'pull_request_source_repo': source_repo,
371 385 'pull_request_source_repo_url': 'http://source-repo/url',
372 386
373 387 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
388 'user_role': 'reviewer',
389 },
390
391 'pull_request+reviewer_role': {
392 'user': user,
393 'pull_request': pr,
394 'pull_request_commits': [
395 ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
396 my-account: moved email closer to profile as it's similar data just moved outside.
397 '''),
398 ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
399 users: description edit fixes
400
401 - tests
402 - added metatags info
403 '''),
404 ],
405
406 'pull_request_target_repo': target_repo,
407 'pull_request_target_repo_url': 'http://target-repo/url',
408
409 'pull_request_source_repo': source_repo,
410 'pull_request_source_repo_url': 'http://source-repo/url',
411
412 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
413 'user_role': 'reviewer',
414 },
415
416 'pull_request+observer_role': {
417 'user': user,
418 'pull_request': pr,
419 'pull_request_commits': [
420 ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
421 my-account: moved email closer to profile as it's similar data just moved outside.
422 '''),
423 ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
424 users: description edit fixes
425
426 - tests
427 - added metatags info
428 '''),
429 ],
430
431 'pull_request_target_repo': target_repo,
432 'pull_request_target_repo_url': 'http://target-repo/url',
433
434 'pull_request_source_repo': source_repo,
435 'pull_request_source_repo_url': 'http://source-repo/url',
436
437 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
438 'user_role': 'observer'
374 439 }
375
376 440 }
377 441
378 442 template_type = email_id.split('+')[0]
379 443 (c.subject, c.email_body, c.email_body_plaintext) = EmailNotificationModel().render_email(
380 444 template_type, **email_kwargs.get(email_id, {}))
381 445
382 446 test_email = self.request.GET.get('email')
383 447 if test_email:
384 448 recipients = [test_email]
385 449 run_task(tasks.send_email, recipients, c.subject,
386 450 c.email_body_plaintext, c.email_body)
387 451
388 452 if self.request.matched_route.name == 'debug_style_email_plain_rendered':
389 453 template = 'debug_style/email_plain_rendered.mako'
390 454 else:
391 455 template = 'debug_style/email.mako'
392 456 return render_to_response(
393 457 template, self._get_template_context(c),
394 458 request=self.request)
395 459
396 460 @view_config(
397 461 route_name='debug_style_template', request_method='GET',
398 462 renderer=None)
399 463 def template(self):
400 464 t_path = self.request.matchdict['t_path']
401 465 c = self.load_default_context()
402 466 c.active = os.path.splitext(t_path)[0]
403 467 c.came_from = ''
468 # NOTE(marcink): extend the email types with variations based on data sets
404 469 c.email_types = {
405 470 'cs_comment+file': {},
406 471 'cs_comment+status': {},
407 472
408 473 'pull_request_comment+file': {},
409 474 'pull_request_comment+status': {},
410 475
411 476 'pull_request_update': {},
477
478 'pull_request+reviewer_role': {},
479 'pull_request+observer_role': {},
412 480 }
413 481 c.email_types.update(EmailNotificationModel.email_types)
414 482
415 483 return render_to_response(
416 484 'debug_style/' + t_path, self._get_template_context(c),
417 485 request=self.request)
418 486
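A note on the email preview variants above: a '+' suffix in keys like 'cs_comment+status' or 'pull_request+observer_role' never selects a different template, it only selects a richer kwargs data set; the template itself is resolved from the base name via email_id.split('+')[0]. A minimal sketch of that lookup, where resolve_preview and the inline sample kwargs are illustrative stand-ins rather than actual RhodeCode helpers:

```python
# Illustrative sketch only -- mirrors the split('+') lookup used by the debug
# view; resolve_preview and sample_kwargs are hypothetical, not RhodeCode API.
def resolve_preview(email_id, email_kwargs):
    template_type = email_id.split('+')[0]   # 'cs_comment+status' -> 'cs_comment'
    return template_type, email_kwargs.get(email_id, {})

sample_kwargs = {'cs_comment+status': {'status_change': 'approved'}}
template_type, kwargs = resolve_preview('cs_comment+status', sample_kwargs)
assert template_type == 'cs_comment' and kwargs['status_change'] == 'approved'
```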
@@ -1,896 +1,897 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import re
22 22 import logging
23 23 import collections
24 24
25 25 from pyramid.httpexceptions import HTTPNotFound
26 26 from pyramid.view import view_config
27 27
28 28 from rhodecode.apps._base import BaseAppView, DataGridAppView
29 29 from rhodecode.lib import helpers as h
30 30 from rhodecode.lib.auth import (
31 31 LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired,
32 32 HasRepoGroupPermissionAny, AuthUser)
33 33 from rhodecode.lib.codeblocks import filenode_as_lines_tokens
34 34 from rhodecode.lib.index import searcher_from_config
35 from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int
35 from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int, safe_str
36 36 from rhodecode.lib.vcs.nodes import FileNode
37 37 from rhodecode.model.db import (
38 38 func, true, or_, case, cast, in_filter_generator, String, Session,
39 39 Repository, RepoGroup, User, UserGroup, PullRequest)
40 40 from rhodecode.model.repo import RepoModel
41 41 from rhodecode.model.repo_group import RepoGroupModel
42 42 from rhodecode.model.user import UserModel
43 43 from rhodecode.model.user_group import UserGroupModel
44 44
45 45 log = logging.getLogger(__name__)
46 46
47 47
48 48 class HomeView(BaseAppView, DataGridAppView):
49 49
50 50 def load_default_context(self):
51 51 c = self._get_local_tmpl_context()
52 52 c.user = c.auth_user.get_instance()
53 53
54 54 return c
55 55
56 56 @LoginRequired()
57 57 @view_config(
58 58 route_name='user_autocomplete_data', request_method='GET',
59 59 renderer='json_ext', xhr=True)
60 60 def user_autocomplete_data(self):
61 61 self.load_default_context()
62 62 query = self.request.GET.get('query')
63 63 active = str2bool(self.request.GET.get('active') or True)
64 64 include_groups = str2bool(self.request.GET.get('user_groups'))
65 65 expand_groups = str2bool(self.request.GET.get('user_groups_expand'))
66 66 skip_default_user = str2bool(self.request.GET.get('skip_default_user'))
67 67
68 68 log.debug('generating user list, query:%s, active:%s, with_groups:%s',
69 69 query, active, include_groups)
70 70
71 71 _users = UserModel().get_users(
72 72 name_contains=query, only_active=active)
73 73
74 74 def maybe_skip_default_user(usr):
75 75 if skip_default_user and usr['username'] == UserModel.cls.DEFAULT_USER:
76 76 return False
77 77 return True
78 78 _users = filter(maybe_skip_default_user, _users)
79 79
80 80 if include_groups:
81 81 # extend with user groups
82 82 _user_groups = UserGroupModel().get_user_groups(
83 83 name_contains=query, only_active=active,
84 84 expand_groups=expand_groups)
85 85 _users = _users + _user_groups
86 86
87 87 return {'suggestions': _users}
88 88
89 89 @LoginRequired()
90 90 @NotAnonymous()
91 91 @view_config(
92 92 route_name='user_group_autocomplete_data', request_method='GET',
93 93 renderer='json_ext', xhr=True)
94 94 def user_group_autocomplete_data(self):
95 95 self.load_default_context()
96 96 query = self.request.GET.get('query')
97 97 active = str2bool(self.request.GET.get('active') or True)
98 98 expand_groups = str2bool(self.request.GET.get('user_groups_expand'))
99 99
100 100 log.debug('generating user group list, query:%s, active:%s',
101 101 query, active)
102 102
103 103 _user_groups = UserGroupModel().get_user_groups(
104 104 name_contains=query, only_active=active,
105 105 expand_groups=expand_groups)
106 106 _user_groups = _user_groups
107 107
108 108 return {'suggestions': _user_groups}
109 109
110 110 def _get_repo_list(self, name_contains=None, repo_type=None, repo_group_name='', limit=20):
111 111 org_query = name_contains
112 112 allowed_ids = self._rhodecode_user.repo_acl_ids(
113 113 ['repository.read', 'repository.write', 'repository.admin'],
114 114 cache=True, name_filter=name_contains) or [-1]
115 115
116 116 query = Session().query(
117 117 Repository.repo_name,
118 118 Repository.repo_id,
119 119 Repository.repo_type,
120 120 Repository.private,
121 121 )\
122 122 .filter(Repository.archived.isnot(true()))\
123 123 .filter(or_(
124 124 # generate multiple IN to fix limitation problems
125 125 *in_filter_generator(Repository.repo_id, allowed_ids)
126 126 ))
127 127
128 128 query = query.order_by(case(
129 129 [
130 130 (Repository.repo_name.startswith(repo_group_name), repo_group_name+'/'),
131 131 ],
132 132 ))
133 133 query = query.order_by(func.length(Repository.repo_name))
134 134 query = query.order_by(Repository.repo_name)
135 135
136 136 if repo_type:
137 137 query = query.filter(Repository.repo_type == repo_type)
138 138
139 139 if name_contains:
140 140 ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
141 141 query = query.filter(
142 142 Repository.repo_name.ilike(ilike_expression))
143 143 query = query.limit(limit)
144 144
145 145 acl_iter = query
146 146
147 147 return [
148 148 {
149 149 'id': obj.repo_name,
150 150 'value': org_query,
151 151 'value_display': obj.repo_name,
152 152 'text': obj.repo_name,
153 153 'type': 'repo',
154 154 'repo_id': obj.repo_id,
155 155 'repo_type': obj.repo_type,
156 156 'private': obj.private,
157 157 'url': h.route_path('repo_summary', repo_name=obj.repo_name)
158 158 }
159 159 for obj in acl_iter]
160 160
161 161 def _get_repo_group_list(self, name_contains=None, repo_group_name='', limit=20):
162 162 org_query = name_contains
163 163 allowed_ids = self._rhodecode_user.repo_group_acl_ids(
164 164 ['group.read', 'group.write', 'group.admin'],
165 165 cache=True, name_filter=name_contains) or [-1]
166 166
167 167 query = Session().query(
168 168 RepoGroup.group_id,
169 169 RepoGroup.group_name,
170 170 )\
171 171 .filter(or_(
172 172 # generate multiple IN to fix limitation problems
173 173 *in_filter_generator(RepoGroup.group_id, allowed_ids)
174 174 ))
175 175
176 176 query = query.order_by(case(
177 177 [
178 178 (RepoGroup.group_name.startswith(repo_group_name), repo_group_name+'/'),
179 179 ],
180 180 ))
181 181 query = query.order_by(func.length(RepoGroup.group_name))
182 182 query = query.order_by(RepoGroup.group_name)
183 183
184 184 if name_contains:
185 185 ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
186 186 query = query.filter(
187 187 RepoGroup.group_name.ilike(ilike_expression))
188 188 query = query.limit(limit)
189 189
190 190 acl_iter = query
191 191
192 192 return [
193 193 {
194 194 'id': obj.group_name,
195 195 'value': org_query,
196 196 'value_display': obj.group_name,
197 197 'text': obj.group_name,
198 198 'type': 'repo_group',
199 199 'repo_group_id': obj.group_id,
200 200 'url': h.route_path(
201 201 'repo_group_home', repo_group_name=obj.group_name)
202 202 }
203 203 for obj in acl_iter]
204 204
205 205 def _get_user_list(self, name_contains=None, limit=20):
206 206 org_query = name_contains
207 207 if not name_contains:
208 208 return [], False
209 209
210 210 # TODO(marcink): should all logged in users be allowed to search others?
211 211 allowed_user_search = self._rhodecode_user.username != User.DEFAULT_USER
212 212 if not allowed_user_search:
213 213 return [], False
214 214
215 215 name_contains = re.compile('(?:user:[ ]?)(.+)').findall(name_contains)
216 216 if len(name_contains) != 1:
217 217 return [], False
218 218
219 219 name_contains = name_contains[0]
220 220
221 221 query = User.query()\
222 222 .order_by(func.length(User.username))\
223 223 .order_by(User.username) \
224 224 .filter(User.username != User.DEFAULT_USER)
225 225
226 226 if name_contains:
227 227 ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
228 228 query = query.filter(
229 229 User.username.ilike(ilike_expression))
230 230 query = query.limit(limit)
231 231
232 232 acl_iter = query
233 233
234 234 return [
235 235 {
236 236 'id': obj.user_id,
237 237 'value': org_query,
238 238 'value_display': 'user: `{}`'.format(obj.username),
239 239 'type': 'user',
240 240 'icon_link': h.gravatar_url(obj.email, 30),
241 241 'url': h.route_path(
242 242 'user_profile', username=obj.username)
243 243 }
244 244 for obj in acl_iter], True
245 245
246 246 def _get_user_groups_list(self, name_contains=None, limit=20):
247 247 org_query = name_contains
248 248 if not name_contains:
249 249 return [], False
250 250
251 251 # TODO(marcink): should all logged in users be allowed to search others?
252 252 allowed_user_search = self._rhodecode_user.username != User.DEFAULT_USER
253 253 if not allowed_user_search:
254 254 return [], False
255 255
256 256 name_contains = re.compile('(?:user_group:[ ]?)(.+)').findall(name_contains)
257 257 if len(name_contains) != 1:
258 258 return [], False
259 259
260 260 name_contains = name_contains[0]
261 261
262 262 query = UserGroup.query()\
263 263 .order_by(func.length(UserGroup.users_group_name))\
264 264 .order_by(UserGroup.users_group_name)
265 265
266 266 if name_contains:
267 267 ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
268 268 query = query.filter(
269 269 UserGroup.users_group_name.ilike(ilike_expression))
270 270 query = query.limit(limit)
271 271
272 272 acl_iter = query
273 273
274 274 return [
275 275 {
276 276 'id': obj.users_group_id,
277 277 'value': org_query,
278 278 'value_display': 'user_group: `{}`'.format(obj.users_group_name),
279 279 'type': 'user_group',
280 280 'url': h.route_path(
281 281 'user_group_profile', user_group_name=obj.users_group_name)
282 282 }
283 283 for obj in acl_iter], True
284 284
285 285 def _get_pull_request_list(self, name_contains=None, limit=20):
286 286 org_query = name_contains
287 287 if not name_contains:
288 288 return [], False
289 289
290 290 # TODO(marcink): should all logged in users be allowed to search others?
291 291 allowed_user_search = self._rhodecode_user.username != User.DEFAULT_USER
292 292 if not allowed_user_search:
293 293 return [], False
294 294
295 295 name_contains = re.compile('(?:pr:[ ]?)(.+)').findall(name_contains)
296 296 if len(name_contains) != 1:
297 297 return [], False
298 298
299 299 name_contains = name_contains[0]
300 300
301 301 allowed_ids = self._rhodecode_user.repo_acl_ids(
302 302 ['repository.read', 'repository.write', 'repository.admin'],
303 303 cache=True) or [-1]
304 304
305 305 query = Session().query(
306 306 PullRequest.pull_request_id,
307 307 PullRequest.title,
308 308 )
309 309 query = query.join(Repository, Repository.repo_id == PullRequest.target_repo_id)
310 310
311 311 query = query.filter(or_(
312 312 # generate multiple IN to fix limitation problems
313 313 *in_filter_generator(Repository.repo_id, allowed_ids)
314 314 ))
315 315
316 316 query = query.order_by(PullRequest.pull_request_id)
317 317
318 318 if name_contains:
319 319 ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
320 320 query = query.filter(or_(
321 321 cast(PullRequest.pull_request_id, String).ilike(ilike_expression),
322 322 PullRequest.title.ilike(ilike_expression),
323 323 PullRequest.description.ilike(ilike_expression),
324 324 ))
325 325
326 326 query = query.limit(limit)
327 327
328 328 acl_iter = query
329 329
330 330 return [
331 331 {
332 332 'id': obj.pull_request_id,
333 333 'value': org_query,
334 'value_display': 'pull request: `!{} - {}`'.format(obj.pull_request_id, obj.title[:50]),
334 'value_display': 'pull request: `!{} - {}`'.format(
335 obj.pull_request_id, safe_str(obj.title[:50])),
335 336 'type': 'pull_request',
336 337 'url': h.route_path('pull_requests_global', pull_request_id=obj.pull_request_id)
337 338 }
338 339 for obj in acl_iter], True
339 340
340 341 def _get_hash_commit_list(self, auth_user, searcher, query, repo=None, repo_group=None):
341 342 repo_name = repo_group_name = None
342 343 if repo:
343 344 repo_name = repo.repo_name
344 345 if repo_group:
345 346 repo_group_name = repo_group.group_name
346 347
347 348 org_query = query
348 349 if not query or len(query) < 3 or not searcher:
349 350 return [], False
350 351
351 352 commit_hashes = re.compile('(?:commit:[ ]?)([0-9a-f]{2,40})').findall(query)
352 353
353 354 if len(commit_hashes) != 1:
354 355 return [], False
355 356
356 357 commit_hash = commit_hashes[0]
357 358
358 359 result = searcher.search(
359 360 'commit_id:{}*'.format(commit_hash), 'commit', auth_user,
360 361 repo_name, repo_group_name, raise_on_exc=False)
361 362
362 363 commits = []
363 364 for entry in result['results']:
364 365 repo_data = {
365 366 'repository_id': entry.get('repository_id'),
366 367 'repository_type': entry.get('repo_type'),
367 368 'repository_name': entry.get('repository'),
368 369 }
369 370
370 371 commit_entry = {
371 372 'id': entry['commit_id'],
372 373 'value': org_query,
373 374 'value_display': '`{}` commit: {}'.format(
374 375 entry['repository'], entry['commit_id']),
375 376 'type': 'commit',
376 377 'repo': entry['repository'],
377 378 'repo_data': repo_data,
378 379
379 380 'url': h.route_path(
380 381 'repo_commit',
381 382 repo_name=entry['repository'], commit_id=entry['commit_id'])
382 383 }
383 384
384 385 commits.append(commit_entry)
385 386 return commits, True
386 387
387 388 def _get_path_list(self, auth_user, searcher, query, repo=None, repo_group=None):
388 389 repo_name = repo_group_name = None
389 390 if repo:
390 391 repo_name = repo.repo_name
391 392 if repo_group:
392 393 repo_group_name = repo_group.group_name
393 394
394 395 org_query = query
395 396 if not query or len(query) < 3 or not searcher:
396 397 return [], False
397 398
398 399 paths_re = re.compile('(?:file:[ ]?)(.+)').findall(query)
399 400 if len(paths_re) != 1:
400 401 return [], False
401 402
402 403 file_path = paths_re[0]
403 404
404 405 search_path = searcher.escape_specials(file_path)
405 406 result = searcher.search(
406 407 'file.raw:*{}*'.format(search_path), 'path', auth_user,
407 408 repo_name, repo_group_name, raise_on_exc=False)
408 409
409 410 files = []
410 411 for entry in result['results']:
411 412 repo_data = {
412 413 'repository_id': entry.get('repository_id'),
413 414 'repository_type': entry.get('repo_type'),
414 415 'repository_name': entry.get('repository'),
415 416 }
416 417
417 418 file_entry = {
418 419 'id': entry['commit_id'],
419 420 'value': org_query,
420 421 'value_display': '`{}` file: {}'.format(
421 422 entry['repository'], entry['file']),
422 423 'type': 'file',
423 424 'repo': entry['repository'],
424 425 'repo_data': repo_data,
425 426
426 427 'url': h.route_path(
427 428 'repo_files',
428 429 repo_name=entry['repository'], commit_id=entry['commit_id'],
429 430 f_path=entry['file'])
430 431 }
431 432
432 433 files.append(file_entry)
433 434 return files, True
434 435
435 436 @LoginRequired()
436 437 @view_config(
437 438 route_name='repo_list_data', request_method='GET',
438 439 renderer='json_ext', xhr=True)
439 440 def repo_list_data(self):
440 441 _ = self.request.translate
441 442 self.load_default_context()
442 443
443 444 query = self.request.GET.get('query')
444 445 repo_type = self.request.GET.get('repo_type')
445 446 log.debug('generating repo list, query:%s, repo_type:%s',
446 447 query, repo_type)
447 448
448 449 res = []
449 450 repos = self._get_repo_list(query, repo_type=repo_type)
450 451 if repos:
451 452 res.append({
452 453 'text': _('Repositories'),
453 454 'children': repos
454 455 })
455 456
456 457 data = {
457 458 'more': False,
458 459 'results': res
459 460 }
460 461 return data
461 462
462 463 @LoginRequired()
463 464 @view_config(
464 465 route_name='repo_group_list_data', request_method='GET',
465 466 renderer='json_ext', xhr=True)
466 467 def repo_group_list_data(self):
467 468 _ = self.request.translate
468 469 self.load_default_context()
469 470
470 471 query = self.request.GET.get('query')
471 472
472 473 log.debug('generating repo group list, query:%s',
473 474 query)
474 475
475 476 res = []
476 477 repo_groups = self._get_repo_group_list(query)
477 478 if repo_groups:
478 479 res.append({
479 480 'text': _('Repository Groups'),
480 481 'children': repo_groups
481 482 })
482 483
483 484 data = {
484 485 'more': False,
485 486 'results': res
486 487 }
487 488 return data
488 489
489 490 def _get_default_search_queries(self, search_context, searcher, query):
490 491 if not searcher:
491 492 return []
492 493
493 494 is_es_6 = searcher.is_es_6
494 495
495 496 queries = []
496 497 repo_group_name, repo_name, repo_context = None, None, None
497 498
498 499 # repo group context
499 500 if search_context.get('search_context[repo_group_name]'):
500 501 repo_group_name = search_context.get('search_context[repo_group_name]')
501 502 if search_context.get('search_context[repo_name]'):
502 503 repo_name = search_context.get('search_context[repo_name]')
503 504 repo_context = search_context.get('search_context[repo_view_type]')
504 505
505 506 if is_es_6 and repo_name:
506 507 # files
507 508 def query_modifier():
508 509 qry = query
509 510 return {'q': qry, 'type': 'content'}
510 511
511 512 label = u'File content search for `{}`'.format(h.escape(query))
512 513 file_qry = {
513 514 'id': -10,
514 515 'value': query,
515 516 'value_display': label,
516 517 'value_icon': '<i class="icon-code"></i>',
517 518 'type': 'search',
518 519 'subtype': 'repo',
519 520 'url': h.route_path('search_repo',
520 521 repo_name=repo_name,
521 522 _query=query_modifier())
522 523 }
523 524
524 525 # commits
525 526 def query_modifier():
526 527 qry = query
527 528 return {'q': qry, 'type': 'commit'}
528 529
529 530 label = u'Commit search for `{}`'.format(h.escape(query))
530 531 commit_qry = {
531 532 'id': -20,
532 533 'value': query,
533 534 'value_display': label,
534 535 'value_icon': '<i class="icon-history"></i>',
535 536 'type': 'search',
536 537 'subtype': 'repo',
537 538 'url': h.route_path('search_repo',
538 539 repo_name=repo_name,
539 540 _query=query_modifier())
540 541 }
541 542
542 543 if repo_context in ['commit', 'commits']:
543 544 queries.extend([commit_qry, file_qry])
544 545 elif repo_context in ['files', 'summary']:
545 546 queries.extend([file_qry, commit_qry])
546 547 else:
547 548 queries.extend([commit_qry, file_qry])
548 549
549 550 elif is_es_6 and repo_group_name:
550 551 # files
551 552 def query_modifier():
552 553 qry = query
553 554 return {'q': qry, 'type': 'content'}
554 555
555 556 label = u'File content search for `{}`'.format(query)
556 557 file_qry = {
557 558 'id': -30,
558 559 'value': query,
559 560 'value_display': label,
560 561 'value_icon': '<i class="icon-code"></i>',
561 562 'type': 'search',
562 563 'subtype': 'repo_group',
563 564 'url': h.route_path('search_repo_group',
564 565 repo_group_name=repo_group_name,
565 566 _query=query_modifier())
566 567 }
567 568
568 569 # commits
569 570 def query_modifier():
570 571 qry = query
571 572 return {'q': qry, 'type': 'commit'}
572 573
573 574 label = u'Commit search for `{}`'.format(query)
574 575 commit_qry = {
575 576 'id': -40,
576 577 'value': query,
577 578 'value_display': label,
578 579 'value_icon': '<i class="icon-history"></i>',
579 580 'type': 'search',
580 581 'subtype': 'repo_group',
581 582 'url': h.route_path('search_repo_group',
582 583 repo_group_name=repo_group_name,
583 584 _query=query_modifier())
584 585 }
585 586
586 587 if repo_context in ['commit', 'commits']:
587 588 queries.extend([commit_qry, file_qry])
588 589 elif repo_context in ['files', 'summary']:
589 590 queries.extend([file_qry, commit_qry])
590 591 else:
591 592 queries.extend([commit_qry, file_qry])
592 593
593 594 # Global, not scoped
594 595 if not queries:
595 596 queries.append(
596 597 {
597 598 'id': -1,
598 599 'value': query,
599 600 'value_display': u'File content search for: `{}`'.format(query),
600 601 'value_icon': '<i class="icon-code"></i>',
601 602 'type': 'search',
602 603 'subtype': 'global',
603 604 'url': h.route_path('search',
604 605 _query={'q': query, 'type': 'content'})
605 606 })
606 607 queries.append(
607 608 {
608 609 'id': -2,
609 610 'value': query,
610 611 'value_display': u'Commit search for: `{}`'.format(query),
611 612 'value_icon': '<i class="icon-history"></i>',
612 613 'type': 'search',
613 614 'subtype': 'global',
614 615 'url': h.route_path('search',
615 616 _query={'q': query, 'type': 'commit'})
616 617 })
617 618
618 619 return queries
619 620
620 621 @LoginRequired()
621 622 @view_config(
622 623 route_name='goto_switcher_data', request_method='GET',
623 624 renderer='json_ext', xhr=True)
624 625 def goto_switcher_data(self):
625 626 c = self.load_default_context()
626 627
627 628 _ = self.request.translate
628 629
629 630 query = self.request.GET.get('query')
630 631 log.debug('generating main filter data, query %s', query)
631 632
632 633 res = []
633 634 if not query:
634 635 return {'suggestions': res}
635 636
636 637 def no_match(name):
637 638 return {
638 639 'id': -1,
639 640 'value': "",
640 641 'value_display': name,
641 642 'type': 'text',
642 643 'url': ""
643 644 }
644 645 searcher = searcher_from_config(self.request.registry.settings)
645 646 has_specialized_search = False
646 647
647 648 # set repo context
648 649 repo = None
649 650 repo_id = safe_int(self.request.GET.get('search_context[repo_id]'))
650 651 if repo_id:
651 652 repo = Repository.get(repo_id)
652 653
653 654 # set group context
654 655 repo_group = None
655 656 repo_group_id = safe_int(self.request.GET.get('search_context[repo_group_id]'))
656 657 if repo_group_id:
657 658 repo_group = RepoGroup.get(repo_group_id)
658 659 prefix_match = False
659 660
660 661 # user: type search
661 662 if not prefix_match:
662 663 users, prefix_match = self._get_user_list(query)
663 664 if users:
664 665 has_specialized_search = True
665 666 for serialized_user in users:
666 667 res.append(serialized_user)
667 668 elif prefix_match:
668 669 has_specialized_search = True
669 670 res.append(no_match('No matching users found'))
670 671
671 672 # user_group: type search
672 673 if not prefix_match:
673 674 user_groups, prefix_match = self._get_user_groups_list(query)
674 675 if user_groups:
675 676 has_specialized_search = True
676 677 for serialized_user_group in user_groups:
677 678 res.append(serialized_user_group)
678 679 elif prefix_match:
679 680 has_specialized_search = True
680 681 res.append(no_match('No matching user groups found'))
681 682
682 683 # pr: type search
683 684 if not prefix_match:
684 685 pull_requests, prefix_match = self._get_pull_request_list(query)
685 686 if pull_requests:
686 687 has_specialized_search = True
687 688 for serialized_pull_request in pull_requests:
688 689 res.append(serialized_pull_request)
689 690 elif prefix_match:
690 691 has_specialized_search = True
691 692 res.append(no_match('No matching pull requests found'))
692 693
693 694 # FTS commit: type search
694 695 if not prefix_match:
695 696 commits, prefix_match = self._get_hash_commit_list(
696 697 c.auth_user, searcher, query, repo, repo_group)
697 698 if commits:
698 699 has_specialized_search = True
699 700 unique_repos = collections.OrderedDict()
700 701 for commit in commits:
701 702 repo_name = commit['repo']
702 703 unique_repos.setdefault(repo_name, []).append(commit)
703 704
704 705 for _repo, commits in unique_repos.items():
705 706 for commit in commits:
706 707 res.append(commit)
707 708 elif prefix_match:
708 709 has_specialized_search = True
709 710 res.append(no_match('No matching commits found'))
710 711
711 712 # FTS file: type search
712 713 if not prefix_match:
713 714 paths, prefix_match = self._get_path_list(
714 715 c.auth_user, searcher, query, repo, repo_group)
715 716 if paths:
716 717 has_specialized_search = True
717 718 unique_repos = collections.OrderedDict()
718 719 for path in paths:
719 720 repo_name = path['repo']
720 721 unique_repos.setdefault(repo_name, []).append(path)
721 722
722 723 for repo, paths in unique_repos.items():
723 724 for path in paths:
724 725 res.append(path)
725 726 elif prefix_match:
726 727 has_specialized_search = True
727 728 res.append(no_match('No matching files found'))
728 729
729 730 # main suggestions
730 731 if not has_specialized_search:
731 732 repo_group_name = ''
732 733 if repo_group:
733 734 repo_group_name = repo_group.group_name
734 735
735 736 for _q in self._get_default_search_queries(self.request.GET, searcher, query):
736 737 res.append(_q)
737 738
738 739 repo_groups = self._get_repo_group_list(query, repo_group_name=repo_group_name)
739 740 for serialized_repo_group in repo_groups:
740 741 res.append(serialized_repo_group)
741 742
742 743 repos = self._get_repo_list(query, repo_group_name=repo_group_name)
743 744 for serialized_repo in repos:
744 745 res.append(serialized_repo)
745 746
746 747 if not repos and not repo_groups:
747 748 res.append(no_match('No matches found'))
748 749
749 750 return {'suggestions': res}
750 751
751 752 @LoginRequired()
752 753 @view_config(
753 754 route_name='home', request_method='GET',
754 755 renderer='rhodecode:templates/index.mako')
755 756 def main_page(self):
756 757 c = self.load_default_context()
757 758 c.repo_group = None
758 759 return self._get_template_context(c)
759 760
760 761 def _main_page_repo_groups_data(self, repo_group_id):
761 762 column_map = {
762 763 'name': 'group_name_hash',
763 764 'desc': 'group_description',
764 765 'last_change': 'updated_on',
765 766 'owner': 'user_username',
766 767 }
767 768 draw, start, limit = self._extract_chunk(self.request)
768 769 search_q, order_by, order_dir = self._extract_ordering(
769 770 self.request, column_map=column_map)
770 771 return RepoGroupModel().get_repo_groups_data_table(
771 772 draw, start, limit,
772 773 search_q, order_by, order_dir,
773 774 self._rhodecode_user, repo_group_id)
774 775
775 776 def _main_page_repos_data(self, repo_group_id):
776 777 column_map = {
777 778 'name': 'repo_name',
778 779 'desc': 'description',
779 780 'last_change': 'updated_on',
780 781 'owner': 'user_username',
781 782 }
782 783 draw, start, limit = self._extract_chunk(self.request)
783 784 search_q, order_by, order_dir = self._extract_ordering(
784 785 self.request, column_map=column_map)
785 786 return RepoModel().get_repos_data_table(
786 787 draw, start, limit,
787 788 search_q, order_by, order_dir,
788 789 self._rhodecode_user, repo_group_id)
789 790
790 791 @LoginRequired()
791 792 @view_config(
792 793 route_name='main_page_repo_groups_data',
793 794 request_method='GET', renderer='json_ext', xhr=True)
794 795 def main_page_repo_groups_data(self):
795 796 self.load_default_context()
796 797 repo_group_id = safe_int(self.request.GET.get('repo_group_id'))
797 798
798 799 if repo_group_id:
799 800 group = RepoGroup.get_or_404(repo_group_id)
800 801 _perms = AuthUser.repo_group_read_perms
801 802 if not HasRepoGroupPermissionAny(*_perms)(
802 803 group.group_name, 'user is allowed to list repo group children'):
803 804 raise HTTPNotFound()
804 805
805 806 return self._main_page_repo_groups_data(repo_group_id)
806 807
807 808 @LoginRequired()
808 809 @view_config(
809 810 route_name='main_page_repos_data',
810 811 request_method='GET', renderer='json_ext', xhr=True)
811 812 def main_page_repos_data(self):
812 813 self.load_default_context()
813 814 repo_group_id = safe_int(self.request.GET.get('repo_group_id'))
814 815
815 816 if repo_group_id:
816 817 group = RepoGroup.get_or_404(repo_group_id)
817 818 _perms = AuthUser.repo_group_read_perms
818 819 if not HasRepoGroupPermissionAny(*_perms)(
819 820 group.group_name, 'user is allowed to list repo group children'):
820 821 raise HTTPNotFound()
821 822
822 823 return self._main_page_repos_data(repo_group_id)
823 824
824 825 @LoginRequired()
825 826 @HasRepoGroupPermissionAnyDecorator(*AuthUser.repo_group_read_perms)
826 827 @view_config(
827 828 route_name='repo_group_home', request_method='GET',
828 829 renderer='rhodecode:templates/index_repo_group.mako')
829 830 @view_config(
830 831 route_name='repo_group_home_slash', request_method='GET',
831 832 renderer='rhodecode:templates/index_repo_group.mako')
832 833 def repo_group_main_page(self):
833 834 c = self.load_default_context()
834 835 c.repo_group = self.request.db_repo_group
835 836 return self._get_template_context(c)
836 837
837 838 @LoginRequired()
838 839 @CSRFRequired()
839 840 @view_config(
840 841 route_name='markup_preview', request_method='POST',
841 842 renderer='string', xhr=True)
842 843 def markup_preview(self):
843 844 # Technically a CSRF token is not needed as no state changes with this
844 845 # call. However, as this is a POST it is better to have it, so automated
845 846 # tools don't flag it as potential CSRF.
846 847 # POST is required because the payload could be bigger than the maximum
847 848 # allowed by GET.
848 849
849 850 text = self.request.POST.get('text')
850 851 renderer = self.request.POST.get('renderer') or 'rst'
851 852 if text:
852 853 return h.render(text, renderer=renderer, mentions=True)
853 854 return ''
854 855
855 856 @LoginRequired()
856 857 @CSRFRequired()
857 858 @view_config(
858 859 route_name='file_preview', request_method='POST',
859 860 renderer='string', xhr=True)
860 861 def file_preview(self):
861 862 # Technically a CSRF token is not needed as no state changes with this
862 863 # call. However, as this is a POST it is better to have it, so automated
863 864 # tools don't flag it as potential CSRF.
864 865 # POST is required because the payload could be bigger than the maximum
865 866 # allowed by GET.
866 867
867 868 text = self.request.POST.get('text')
868 869 file_path = self.request.POST.get('file_path')
869 870
870 871 renderer = h.renderer_from_filename(file_path)
871 872
872 873 if renderer:
873 874 return h.render(text, renderer=renderer, mentions=True)
874 875 else:
875 876 self.load_default_context()
876 877 _render = self.request.get_partial_renderer(
877 878 'rhodecode:templates/files/file_content.mako')
878 879
879 880 lines = filenode_as_lines_tokens(FileNode(file_path, text))
880 881
881 882 return _render('render_lines', lines)
882 883
883 884 @LoginRequired()
884 885 @CSRFRequired()
885 886 @view_config(
886 887 route_name='store_user_session_value', request_method='POST',
887 888 renderer='string', xhr=True)
888 889 def store_user_session_attr(self):
889 890 key = self.request.POST.get('key')
890 891 val = self.request.POST.get('val')
891 892
892 893 existing_value = self.request.session.get(key)
893 894 if existing_value != val:
894 895 self.request.session[key] = val
895 896
896 897 return 'stored:{}:{}'.format(key, val)
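The comment "generate multiple IN to fix limitation problems", repeated in the queries above, refers to chunking a potentially very long list of allowed ids into several smaller IN clauses, so databases with bound-parameter or IN-size limits still accept the query. A minimal sketch of that pattern; chunked_in_filters is a hypothetical stand-in for the real in_filter_generator in rhodecode.model.db, whose exact signature and chunk size may differ:

```python
# Hypothetical sketch of chunked IN filters, not the actual RhodeCode helper.
def chunked_in_filters(column, ids, chunk_size=500):
    """Yield column.in_(chunk) expressions so no single IN clause grows too large."""
    ids = list(ids) or [-1]                      # never emit an empty IN ()
    for start in range(0, len(ids), chunk_size):
        yield column.in_(ids[start:start + chunk_size])

# usage with SQLAlchemy's or_(), mirroring the ACL-filtered queries above:
# query = query.filter(or_(*chunked_in_filters(Repository.repo_id, allowed_ids)))
```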
@@ -1,822 +1,822 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import datetime
23 23 import string
24 24
25 25 import formencode
26 26 import formencode.htmlfill
27 27 import peppercorn
28 28 from pyramid.httpexceptions import HTTPFound, HTTPNotFound
29 29 from pyramid.view import view_config
30 30
31 31 from rhodecode.apps._base import BaseAppView, DataGridAppView
32 32 from rhodecode import forms
33 33 from rhodecode.lib import helpers as h
34 34 from rhodecode.lib import audit_logger
35 35 from rhodecode.lib.ext_json import json
36 36 from rhodecode.lib.auth import (
37 37 LoginRequired, NotAnonymous, CSRFRequired,
38 38 HasRepoPermissionAny, HasRepoGroupPermissionAny, AuthUser)
39 39 from rhodecode.lib.channelstream import (
40 40 channelstream_request, ChannelstreamException)
41 41 from rhodecode.lib.utils2 import safe_int, md5, str2bool
42 42 from rhodecode.model.auth_token import AuthTokenModel
43 43 from rhodecode.model.comment import CommentsModel
44 44 from rhodecode.model.db import (
45 45 IntegrityError, or_, in_filter_generator,
46 46 Repository, UserEmailMap, UserApiKeys, UserFollowing,
47 47 PullRequest, UserBookmark, RepoGroup)
48 48 from rhodecode.model.meta import Session
49 49 from rhodecode.model.pull_request import PullRequestModel
50 50 from rhodecode.model.user import UserModel
51 51 from rhodecode.model.user_group import UserGroupModel
52 52 from rhodecode.model.validation_schema.schemas import user_schema
53 53
54 54 log = logging.getLogger(__name__)
55 55
56 56
57 57 class MyAccountView(BaseAppView, DataGridAppView):
58 58 ALLOW_SCOPED_TOKENS = False
59 59 """
60 60 This view has alternative version inside EE, if modified please take a look
61 61 in there as well.
62 62 """
63 63
64 64 def load_default_context(self):
65 65 c = self._get_local_tmpl_context()
66 66 c.user = c.auth_user.get_instance()
67 67 c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS
68 68
69 69 return c
70 70
71 71 @LoginRequired()
72 72 @NotAnonymous()
73 73 @view_config(
74 74 route_name='my_account_profile', request_method='GET',
75 75 renderer='rhodecode:templates/admin/my_account/my_account.mako')
76 76 def my_account_profile(self):
77 77 c = self.load_default_context()
78 78 c.active = 'profile'
79 79 c.extern_type = c.user.extern_type
80 80 return self._get_template_context(c)
81 81
82 82 @LoginRequired()
83 83 @NotAnonymous()
84 84 @view_config(
85 85 route_name='my_account_password', request_method='GET',
86 86 renderer='rhodecode:templates/admin/my_account/my_account.mako')
87 87 def my_account_password(self):
88 88 c = self.load_default_context()
89 89 c.active = 'password'
90 90 c.extern_type = c.user.extern_type
91 91
92 92 schema = user_schema.ChangePasswordSchema().bind(
93 93 username=c.user.username)
94 94
95 95 form = forms.Form(
96 96 schema,
97 97 action=h.route_path('my_account_password_update'),
98 98 buttons=(forms.buttons.save, forms.buttons.reset))
99 99
100 100 c.form = form
101 101 return self._get_template_context(c)
102 102
103 103 @LoginRequired()
104 104 @NotAnonymous()
105 105 @CSRFRequired()
106 106 @view_config(
107 107 route_name='my_account_password_update', request_method='POST',
108 108 renderer='rhodecode:templates/admin/my_account/my_account.mako')
109 109 def my_account_password_update(self):
110 110 _ = self.request.translate
111 111 c = self.load_default_context()
112 112 c.active = 'password'
113 113 c.extern_type = c.user.extern_type
114 114
115 115 schema = user_schema.ChangePasswordSchema().bind(
116 116 username=c.user.username)
117 117
118 118 form = forms.Form(
119 119 schema, buttons=(forms.buttons.save, forms.buttons.reset))
120 120
121 121 if c.extern_type != 'rhodecode':
122 122 raise HTTPFound(self.request.route_path('my_account_password'))
123 123
124 124 controls = self.request.POST.items()
125 125 try:
126 126 valid_data = form.validate(controls)
127 127 UserModel().update_user(c.user.user_id, **valid_data)
128 128 c.user.update_userdata(force_password_change=False)
129 129 Session().commit()
130 130 except forms.ValidationFailure as e:
131 131 c.form = e
132 132 return self._get_template_context(c)
133 133
134 134 except Exception:
135 135 log.exception("Exception updating password")
136 136 h.flash(_('Error occurred during update of user password'),
137 137 category='error')
138 138 else:
139 139 instance = c.auth_user.get_instance()
140 140 self.session.setdefault('rhodecode_user', {}).update(
141 141 {'password': md5(instance.password)})
142 142 self.session.save()
143 143 h.flash(_("Successfully updated password"), category='success')
144 144
145 145 raise HTTPFound(self.request.route_path('my_account_password'))
146 146
147 147 @LoginRequired()
148 148 @NotAnonymous()
149 149 @view_config(
150 150 route_name='my_account_auth_tokens', request_method='GET',
151 151 renderer='rhodecode:templates/admin/my_account/my_account.mako')
152 152 def my_account_auth_tokens(self):
153 153 _ = self.request.translate
154 154
155 155 c = self.load_default_context()
156 156 c.active = 'auth_tokens'
157 157 c.lifetime_values = AuthTokenModel.get_lifetime_values(translator=_)
158 158 c.role_values = [
159 159 (x, AuthTokenModel.cls._get_role_name(x))
160 160 for x in AuthTokenModel.cls.ROLES]
161 161 c.role_options = [(c.role_values, _("Role"))]
162 162 c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
163 163 c.user.user_id, show_expired=True)
164 164 c.role_vcs = AuthTokenModel.cls.ROLE_VCS
165 165 return self._get_template_context(c)
166 166
167 167 @LoginRequired()
168 168 @NotAnonymous()
169 169 @CSRFRequired()
170 170 @view_config(
171 171 route_name='my_account_auth_tokens_view', request_method='POST', xhr=True,
172 172 renderer='json_ext')
173 173 def my_account_auth_tokens_view(self):
174 174 _ = self.request.translate
175 175 c = self.load_default_context()
176 176
177 177 auth_token_id = self.request.POST.get('auth_token_id')
178 178
179 179 if auth_token_id:
180 180 token = UserApiKeys.get_or_404(auth_token_id)
181 181 if token.user.user_id != c.user.user_id:
182 182 raise HTTPNotFound()
183 183
184 184 return {
185 185 'auth_token': token.api_key
186 186 }
187 187
188 188 def maybe_attach_token_scope(self, token):
189 189 # implemented in EE edition
190 190 pass
191 191
192 192 @LoginRequired()
193 193 @NotAnonymous()
194 194 @CSRFRequired()
195 195 @view_config(
196 196 route_name='my_account_auth_tokens_add', request_method='POST',)
197 197 def my_account_auth_tokens_add(self):
198 198 _ = self.request.translate
199 199 c = self.load_default_context()
200 200
201 201 lifetime = safe_int(self.request.POST.get('lifetime'), -1)
202 202 description = self.request.POST.get('description')
203 203 role = self.request.POST.get('role')
204 204
205 205 token = UserModel().add_auth_token(
206 206 user=c.user.user_id,
207 207 lifetime_minutes=lifetime, role=role, description=description,
208 208 scope_callback=self.maybe_attach_token_scope)
209 209 token_data = token.get_api_data()
210 210
211 211 audit_logger.store_web(
212 212 'user.edit.token.add', action_data={
213 213 'data': {'token': token_data, 'user': 'self'}},
214 214 user=self._rhodecode_user, )
215 215 Session().commit()
216 216
217 217 h.flash(_("Auth token successfully created"), category='success')
218 218 return HTTPFound(h.route_path('my_account_auth_tokens'))
219 219
220 220 @LoginRequired()
221 221 @NotAnonymous()
222 222 @CSRFRequired()
223 223 @view_config(
224 224 route_name='my_account_auth_tokens_delete', request_method='POST')
225 225 def my_account_auth_tokens_delete(self):
226 226 _ = self.request.translate
227 227 c = self.load_default_context()
228 228
229 229 del_auth_token = self.request.POST.get('del_auth_token')
230 230
231 231 if del_auth_token:
232 232 token = UserApiKeys.get_or_404(del_auth_token)
233 233 token_data = token.get_api_data()
234 234
235 235 AuthTokenModel().delete(del_auth_token, c.user.user_id)
236 236 audit_logger.store_web(
237 237 'user.edit.token.delete', action_data={
238 238 'data': {'token': token_data, 'user': 'self'}},
239 239 user=self._rhodecode_user,)
240 240 Session().commit()
241 241 h.flash(_("Auth token successfully deleted"), category='success')
242 242
243 243 return HTTPFound(h.route_path('my_account_auth_tokens'))
244 244
245 245 @LoginRequired()
246 246 @NotAnonymous()
247 247 @view_config(
248 248 route_name='my_account_emails', request_method='GET',
249 249 renderer='rhodecode:templates/admin/my_account/my_account.mako')
250 250 def my_account_emails(self):
251 251 _ = self.request.translate
252 252
253 253 c = self.load_default_context()
254 254 c.active = 'emails'
255 255
256 256 c.user_email_map = UserEmailMap.query()\
257 257 .filter(UserEmailMap.user == c.user).all()
258 258
259 259 schema = user_schema.AddEmailSchema().bind(
260 260 username=c.user.username, user_emails=c.user.emails)
261 261
262 262 form = forms.RcForm(schema,
263 263 action=h.route_path('my_account_emails_add'),
264 264 buttons=(forms.buttons.save, forms.buttons.reset))
265 265
266 266 c.form = form
267 267 return self._get_template_context(c)
268 268
269 269 @LoginRequired()
270 270 @NotAnonymous()
271 271 @CSRFRequired()
272 272 @view_config(
273 273 route_name='my_account_emails_add', request_method='POST',
274 274 renderer='rhodecode:templates/admin/my_account/my_account.mako')
275 275 def my_account_emails_add(self):
276 276 _ = self.request.translate
277 277 c = self.load_default_context()
278 278 c.active = 'emails'
279 279
280 280 schema = user_schema.AddEmailSchema().bind(
281 281 username=c.user.username, user_emails=c.user.emails)
282 282
283 283 form = forms.RcForm(
284 284 schema, action=h.route_path('my_account_emails_add'),
285 285 buttons=(forms.buttons.save, forms.buttons.reset))
286 286
287 287 controls = self.request.POST.items()
288 288 try:
289 289 valid_data = form.validate(controls)
290 290 UserModel().add_extra_email(c.user.user_id, valid_data['email'])
291 291 audit_logger.store_web(
292 292 'user.edit.email.add', action_data={
293 293 'data': {'email': valid_data['email'], 'user': 'self'}},
294 294 user=self._rhodecode_user,)
295 295 Session().commit()
296 296 except formencode.Invalid as error:
297 297 h.flash(h.escape(error.error_dict['email']), category='error')
298 298 except forms.ValidationFailure as e:
299 299 c.user_email_map = UserEmailMap.query() \
300 300 .filter(UserEmailMap.user == c.user).all()
301 301 c.form = e
302 302 return self._get_template_context(c)
303 303 except Exception:
304 304 log.exception("Exception adding email")
305 305 h.flash(_('Error occurred during adding email'),
306 306 category='error')
307 307 else:
308 308 h.flash(_("Successfully added email"), category='success')
309 309
310 310 raise HTTPFound(self.request.route_path('my_account_emails'))
311 311
312 312 @LoginRequired()
313 313 @NotAnonymous()
314 314 @CSRFRequired()
315 315 @view_config(
316 316 route_name='my_account_emails_delete', request_method='POST')
317 317 def my_account_emails_delete(self):
318 318 _ = self.request.translate
319 319 c = self.load_default_context()
320 320
321 321 del_email_id = self.request.POST.get('del_email_id')
322 322 if del_email_id:
323 323 email = UserEmailMap.get_or_404(del_email_id).email
324 324 UserModel().delete_extra_email(c.user.user_id, del_email_id)
325 325 audit_logger.store_web(
326 326 'user.edit.email.delete', action_data={
327 327 'data': {'email': email, 'user': 'self'}},
328 328 user=self._rhodecode_user,)
329 329 Session().commit()
330 330 h.flash(_("Email successfully deleted"),
331 331 category='success')
332 332 return HTTPFound(h.route_path('my_account_emails'))
333 333
334 334 @LoginRequired()
335 335 @NotAnonymous()
336 336 @CSRFRequired()
337 337 @view_config(
338 338 route_name='my_account_notifications_test_channelstream',
339 339 request_method='POST', renderer='json_ext')
340 340 def my_account_notifications_test_channelstream(self):
341 341 message = 'Test message sent via Channelstream by user: {}, on {}'.format(
342 342 self._rhodecode_user.username, datetime.datetime.now())
343 343 payload = {
344 344 # 'channel': 'broadcast',
345 345 'type': 'message',
346 346 'timestamp': datetime.datetime.utcnow(),
347 347 'user': 'system',
348 348 'pm_users': [self._rhodecode_user.username],
349 349 'message': {
350 350 'message': message,
351 351 'level': 'info',
352 352 'topic': '/notifications'
353 353 }
354 354 }
355 355
356 356 registry = self.request.registry
357 357 rhodecode_plugins = getattr(registry, 'rhodecode_plugins', {})
358 358 channelstream_config = rhodecode_plugins.get('channelstream', {})
359 359
360 360 try:
361 361 channelstream_request(channelstream_config, [payload], '/message')
362 362 except ChannelstreamException as e:
363 363 log.exception('Failed to send channelstream data')
364 364 return {"response": 'ERROR: {}'.format(e.__class__.__name__)}
365 365 return {"response": 'Channelstream data sent. '
366 366 'You should see a new live message now.'}
367 367
368 368 def _load_my_repos_data(self, watched=False):
369 369
370 370 allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(AuthUser.repo_read_perms)
371 371
372 372 if watched:
373 373 # repos user watch
374 374 repo_list = Session().query(
375 375 Repository
376 376 ) \
377 377 .join(
378 378 (UserFollowing, UserFollowing.follows_repo_id == Repository.repo_id)
379 379 ) \
380 380 .filter(
381 381 UserFollowing.user_id == self._rhodecode_user.user_id
382 382 ) \
383 383 .filter(or_(
384 384 # generate multiple IN to fix limitation problems
385 385 *in_filter_generator(Repository.repo_id, allowed_ids))
386 386 ) \
387 387 .order_by(Repository.repo_name) \
388 388 .all()
389 389
390 390 else:
391 391 # repos user is owner of
392 392 repo_list = Session().query(
393 393 Repository
394 394 ) \
395 395 .filter(
396 396 Repository.user_id == self._rhodecode_user.user_id
397 397 ) \
398 398 .filter(or_(
399 399 # generate multiple IN to fix limitation problems
400 400 *in_filter_generator(Repository.repo_id, allowed_ids))
401 401 ) \
402 402 .order_by(Repository.repo_name) \
403 403 .all()
404 404
405 405 _render = self.request.get_partial_renderer(
406 406 'rhodecode:templates/data_table/_dt_elements.mako')
407 407
408 408 def repo_lnk(name, rtype, rstate, private, archived, fork_of):
409 409 return _render('repo_name', name, rtype, rstate, private, archived, fork_of,
410 410 short_name=False, admin=False)
411 411
412 412 repos_data = []
413 413 for repo in repo_list:
414 414 row = {
415 415 "name": repo_lnk(repo.repo_name, repo.repo_type, repo.repo_state,
416 416 repo.private, repo.archived, repo.fork),
417 417 "name_raw": repo.repo_name.lower(),
418 418 }
419 419
420 420 repos_data.append(row)
421 421
422 422 # json used to render the grid
423 423 return json.dumps(repos_data)
424 424
425 425 @LoginRequired()
426 426 @NotAnonymous()
427 427 @view_config(
428 428 route_name='my_account_repos', request_method='GET',
429 429 renderer='rhodecode:templates/admin/my_account/my_account.mako')
430 430 def my_account_repos(self):
431 431 c = self.load_default_context()
432 432 c.active = 'repos'
433 433
434 434 # json used to render the grid
435 435 c.data = self._load_my_repos_data()
436 436 return self._get_template_context(c)
437 437
438 438 @LoginRequired()
439 439 @NotAnonymous()
440 440 @view_config(
441 441 route_name='my_account_watched', request_method='GET',
442 442 renderer='rhodecode:templates/admin/my_account/my_account.mako')
443 443 def my_account_watched(self):
444 444 c = self.load_default_context()
445 445 c.active = 'watched'
446 446
447 447 # json used to render the grid
448 448 c.data = self._load_my_repos_data(watched=True)
449 449 return self._get_template_context(c)
450 450
451 451 @LoginRequired()
452 452 @NotAnonymous()
453 453 @view_config(
454 454 route_name='my_account_bookmarks', request_method='GET',
455 455 renderer='rhodecode:templates/admin/my_account/my_account.mako')
456 456 def my_account_bookmarks(self):
457 457 c = self.load_default_context()
458 458 c.active = 'bookmarks'
459 459 c.bookmark_items = UserBookmark.get_bookmarks_for_user(
460 460 self._rhodecode_db_user.user_id, cache=False)
461 461 return self._get_template_context(c)
462 462
463 463 def _process_bookmark_entry(self, entry, user_id):
464 464 position = safe_int(entry.get('position'))
465 465 cur_position = safe_int(entry.get('cur_position'))
466 466 if position is None:
467 467 return
468 468
469 469 # check if this is an existing entry
470 470 is_new = False
471 471 db_entry = UserBookmark().get_by_position_for_user(cur_position, user_id)
472 472
473 473 if db_entry and str2bool(entry.get('remove')):
474 474 log.debug('Marked bookmark %s for deletion', db_entry)
475 475 Session().delete(db_entry)
476 476 return
477 477
478 478 if not db_entry:
479 479 # new
480 480 db_entry = UserBookmark()
481 481 is_new = True
482 482
483 483 should_save = False
484 484 default_redirect_url = ''
485 485
486 486 # save repo
487 487 if entry.get('bookmark_repo') and safe_int(entry.get('bookmark_repo')):
488 488 repo = Repository.get(entry['bookmark_repo'])
489 489 perm_check = HasRepoPermissionAny(
490 490 'repository.read', 'repository.write', 'repository.admin')
491 491 if repo and perm_check(repo_name=repo.repo_name):
492 492 db_entry.repository = repo
493 493 should_save = True
494 494 default_redirect_url = '${repo_url}'
495 495 # save repo group
496 496 elif entry.get('bookmark_repo_group') and safe_int(entry.get('bookmark_repo_group')):
497 497 repo_group = RepoGroup.get(entry['bookmark_repo_group'])
498 498 perm_check = HasRepoGroupPermissionAny(
499 499 'group.read', 'group.write', 'group.admin')
500 500
501 501 if repo_group and perm_check(group_name=repo_group.group_name):
502 502 db_entry.repository_group = repo_group
503 503 should_save = True
504 504 default_redirect_url = '${repo_group_url}'
505 505 # save generic info
506 506 elif entry.get('title') and entry.get('redirect_url'):
507 507 should_save = True
508 508
509 509 if should_save:
510 510 # mark user and position
511 511 db_entry.user_id = user_id
512 512 db_entry.position = position
513 513 db_entry.title = entry.get('title')
514 514 db_entry.redirect_url = entry.get('redirect_url') or default_redirect_url
515 515 log.debug('Saving bookmark %s, new:%s', db_entry, is_new)
516 516
517 517 Session().add(db_entry)
518 518
519 519 @LoginRequired()
520 520 @NotAnonymous()
521 521 @CSRFRequired()
522 522 @view_config(
523 523 route_name='my_account_bookmarks_update', request_method='POST')
524 524 def my_account_bookmarks_update(self):
525 525 _ = self.request.translate
526 526 c = self.load_default_context()
527 527 c.active = 'bookmarks'
528 528
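# peppercorn re-assembles the flat POST fields into a nested structure,
# roughly {'bookmarks': [{'position': ..., 'cur_position': ...,
# 'bookmark_repo': ..., 'title': ..., 'redirect_url': ..., 'remove': ...}]}
# (shape shown here is an illustrative sketch, not an exhaustive schema)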
529 529 controls = peppercorn.parse(self.request.POST.items())
530 530 user_id = c.user.user_id
531 531
532 532 # validate positions
533 533 positions = {}
534 534 for entry in controls.get('bookmarks', []):
535 535 position = safe_int(entry['position'])
536 536 if position is None:
537 537 continue
538 538
539 539 if position in positions:
540 540 h.flash(_("Position {} is defined twice. "
541 541 "Please correct this error.").format(position), category='error')
542 542 return HTTPFound(h.route_path('my_account_bookmarks'))
543 543
544 544 entry['position'] = position
545 545 entry['cur_position'] = safe_int(entry.get('cur_position'))
546 546 positions[position] = entry
547 547
548 548 try:
549 549 for entry in positions.values():
550 550 self._process_bookmark_entry(entry, user_id)
551 551
552 552 Session().commit()
553 553 h.flash(_("Update Bookmarks"), category='success')
554 554 except IntegrityError:
555 555 h.flash(_("Failed to update bookmarks. "
556 556 "Make sure an unique position is used."), category='error')
557 557
558 558 return HTTPFound(h.route_path('my_account_bookmarks'))
559 559
560 560 @LoginRequired()
561 561 @NotAnonymous()
562 562 @view_config(
563 563 route_name='my_account_goto_bookmark', request_method='GET',
564 564 renderer='rhodecode:templates/admin/my_account/my_account.mako')
565 565 def my_account_goto_bookmark(self):
566 566
567 567 bookmark_id = self.request.matchdict['bookmark_id']
568 568 user_bookmark = UserBookmark().query()\
569 569 .filter(UserBookmark.user_id == self.request.user.user_id) \
570 570 .filter(UserBookmark.position == bookmark_id).scalar()
571 571
572 572 redirect_url = h.route_path('my_account_bookmarks')
573 573 if not user_bookmark:
574 574 raise HTTPFound(redirect_url)
575 575
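# a stored redirect_url may contain template placeholders that are expanded
# below with string.Template.safe_substitute: ${repo_url}, ${repo_group_url}
# and ${server_url}; e.g. a bookmark saved as '${repo_url}/changelog' would
# resolve against the repo summary URL (the '/changelog' suffix is only a
# hypothetical example)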
576 576 # repository set
577 577 if user_bookmark.repository:
578 578 repo_name = user_bookmark.repository.repo_name
579 579 base_redirect_url = h.route_path(
580 580 'repo_summary', repo_name=repo_name)
581 581 if user_bookmark.redirect_url and \
582 582 '${repo_url}' in user_bookmark.redirect_url:
583 583 redirect_url = string.Template(user_bookmark.redirect_url)\
584 584 .safe_substitute({'repo_url': base_redirect_url})
585 585 else:
586 586 redirect_url = base_redirect_url
587 587 # repository group set
588 588 elif user_bookmark.repository_group:
589 589 repo_group_name = user_bookmark.repository_group.group_name
590 590 base_redirect_url = h.route_path(
591 591 'repo_group_home', repo_group_name=repo_group_name)
592 592 if user_bookmark.redirect_url and \
593 593 '${repo_group_url}' in user_bookmark.redirect_url:
594 594 redirect_url = string.Template(user_bookmark.redirect_url)\
595 595 .safe_substitute({'repo_group_url': base_redirect_url})
596 596 else:
597 597 redirect_url = base_redirect_url
598 598 # custom URL set
599 599 elif user_bookmark.redirect_url:
600 600 server_url = h.route_url('home').rstrip('/')
601 601 redirect_url = string.Template(user_bookmark.redirect_url) \
602 602 .safe_substitute({'server_url': server_url})
603 603
604 604 log.debug('Redirecting bookmark %s to %s', user_bookmark, redirect_url)
605 605 raise HTTPFound(redirect_url)
606 606
607 607 @LoginRequired()
608 608 @NotAnonymous()
609 609 @view_config(
610 610 route_name='my_account_perms', request_method='GET',
611 611 renderer='rhodecode:templates/admin/my_account/my_account.mako')
612 612 def my_account_perms(self):
613 613 c = self.load_default_context()
614 614 c.active = 'perms'
615 615
616 616 c.perm_user = c.auth_user
617 617 return self._get_template_context(c)
618 618
619 619 @LoginRequired()
620 620 @NotAnonymous()
621 621 @view_config(
622 622 route_name='my_account_notifications', request_method='GET',
623 623 renderer='rhodecode:templates/admin/my_account/my_account.mako')
624 624 def my_notifications(self):
625 625 c = self.load_default_context()
626 626 c.active = 'notifications'
627 627
628 628 return self._get_template_context(c)
629 629
630 630 @LoginRequired()
631 631 @NotAnonymous()
632 632 @CSRFRequired()
633 633 @view_config(
634 634 route_name='my_account_notifications_toggle_visibility',
635 635 request_method='POST', renderer='json_ext')
636 636 def my_notifications_toggle_visibility(self):
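# flips the per-user notification flag kept in user_data; when no preference
# has been stored yet, the default is treated as enabled (True)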
637 637 user = self._rhodecode_db_user
638 638 new_status = not user.user_data.get('notification_status', True)
639 639 user.update_userdata(notification_status=new_status)
640 640 Session().commit()
641 641 return user.user_data['notification_status']
642 642
643 643 @LoginRequired()
644 644 @NotAnonymous()
645 645 @view_config(
646 646 route_name='my_account_edit',
647 647 request_method='GET',
648 648 renderer='rhodecode:templates/admin/my_account/my_account.mako')
649 649 def my_account_edit(self):
650 650 c = self.load_default_context()
651 651 c.active = 'profile_edit'
652 652 c.extern_type = c.user.extern_type
653 653 c.extern_name = c.user.extern_name
654 654
655 655 schema = user_schema.UserProfileSchema().bind(
656 656 username=c.user.username, user_emails=c.user.emails)
657 657 appstruct = {
658 658 'username': c.user.username,
659 659 'email': c.user.email,
660 660 'firstname': c.user.firstname,
661 661 'lastname': c.user.lastname,
662 662 'description': c.user.description,
663 663 }
664 664 c.form = forms.RcForm(
665 665 schema, appstruct=appstruct,
666 666 action=h.route_path('my_account_update'),
667 667 buttons=(forms.buttons.save, forms.buttons.reset))
668 668
669 669 return self._get_template_context(c)
670 670
671 671 @LoginRequired()
672 672 @NotAnonymous()
673 673 @CSRFRequired()
674 674 @view_config(
675 675 route_name='my_account_update',
676 676 request_method='POST',
677 677 renderer='rhodecode:templates/admin/my_account/my_account.mako')
678 678 def my_account_update(self):
679 679 _ = self.request.translate
680 680 c = self.load_default_context()
681 681 c.active = 'profile_edit'
682 682 c.perm_user = c.auth_user
683 683 c.extern_type = c.user.extern_type
684 684 c.extern_name = c.user.extern_name
685 685
686 686 schema = user_schema.UserProfileSchema().bind(
687 687 username=c.user.username, user_emails=c.user.emails)
688 688 form = forms.RcForm(
689 689 schema, buttons=(forms.buttons.save, forms.buttons.reset))
690 690
691 691 controls = self.request.POST.items()
692 692 try:
693 693 valid_data = form.validate(controls)
694 694 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
695 695 'new_password', 'password_confirmation']
696 696 if c.extern_type != "rhodecode":
697 697 # forbid updating username for external accounts
698 698 skip_attrs.append('username')
699 699 old_email = c.user.email
700 700 UserModel().update_user(
701 701 self._rhodecode_user.user_id, skip_attrs=skip_attrs,
702 702 **valid_data)
703 703 if old_email != valid_data['email']:
704 704 old = UserEmailMap.query() \
705 705 .filter(UserEmailMap.user == c.user).filter(UserEmailMap.email == valid_data['email']).first()
706 706 old.email = old_email
707 707 h.flash(_('Your account was updated successfully'), category='success')
708 708 Session().commit()
709 709 except forms.ValidationFailure as e:
710 710 c.form = e
711 711 return self._get_template_context(c)
712 712 except Exception:
713 713 log.exception("Exception updating user")
714 714 h.flash(_('Error occurred during update of user'),
715 715 category='error')
716 716 raise HTTPFound(h.route_path('my_account_profile'))
717 717
718 718 def _get_pull_requests_list(self, statuses):
719 719 draw, start, limit = self._extract_chunk(self.request)
720 720 search_q, order_by, order_dir = self._extract_ordering(self.request)
721 721 _render = self.request.get_partial_renderer(
722 722 'rhodecode:templates/data_table/_dt_elements.mako')
723 723
724 724 pull_requests = PullRequestModel().get_im_participating_in(
725 725 user_id=self._rhodecode_user.user_id,
726 726 statuses=statuses, query=search_q,
727 727 offset=start, length=limit, order_by=order_by,
728 728 order_dir=order_dir)
729 729
730 730 pull_requests_total_count = PullRequestModel().count_im_participating_in(
731 731 user_id=self._rhodecode_user.user_id, statuses=statuses, query=search_q)
732 732
733 733 data = []
734 734 comments_model = CommentsModel()
735 735 for pr in pull_requests:
736 736 repo_id = pr.target_repo_id
737 comments = comments_model.get_all_comments(
738 repo_id, pull_request=pr)
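# fetch only an aggregate count (count_only=True) instead of materializing
# every comment object just to take its len(); this keeps memory usage low
# on heavily commented pull requests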
737 comments_count = comments_model.get_all_comments(
738 repo_id, pull_request=pr, count_only=True)
739 739 owned = pr.user_id == self._rhodecode_user.user_id
740 740
741 741 data.append({
742 742 'target_repo': _render('pullrequest_target_repo',
743 743 pr.target_repo.repo_name),
744 744 'name': _render('pullrequest_name',
745 745 pr.pull_request_id, pr.pull_request_state,
746 746 pr.work_in_progress, pr.target_repo.repo_name,
747 747 short=True),
748 748 'name_raw': pr.pull_request_id,
749 749 'status': _render('pullrequest_status',
750 750 pr.calculated_review_status()),
751 751 'title': _render('pullrequest_title', pr.title, pr.description),
752 752 'description': h.escape(pr.description),
753 753 'updated_on': _render('pullrequest_updated_on',
754 754 h.datetime_to_time(pr.updated_on)),
755 755 'updated_on_raw': h.datetime_to_time(pr.updated_on),
756 756 'created_on': _render('pullrequest_updated_on',
757 757 h.datetime_to_time(pr.created_on)),
758 758 'created_on_raw': h.datetime_to_time(pr.created_on),
759 759 'state': pr.pull_request_state,
760 760 'author': _render('pullrequest_author',
761 761 pr.author.full_contact, ),
762 762 'author_raw': pr.author.full_name,
763 'comments': _render('pullrequest_comments', len(comments)),
764 'comments_raw': len(comments),
763 'comments': _render('pullrequest_comments', comments_count),
764 'comments_raw': comments_count,
765 765 'closed': pr.is_closed(),
766 766 'owned': owned
767 767 })
768 768
769 769 # json used to render the grid
770 770 data = ({
771 771 'draw': draw,
772 772 'data': data,
773 773 'recordsTotal': pull_requests_total_count,
774 774 'recordsFiltered': pull_requests_total_count,
775 775 })
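# this payload follows the DataTables server-side processing contract:
# 'draw' echoes the request counter, while recordsTotal/recordsFiltered
# drive the grid's paging display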
776 776 return data
777 777
778 778 @LoginRequired()
779 779 @NotAnonymous()
780 780 @view_config(
781 781 route_name='my_account_pullrequests',
782 782 request_method='GET',
783 783 renderer='rhodecode:templates/admin/my_account/my_account.mako')
784 784 def my_account_pullrequests(self):
785 785 c = self.load_default_context()
786 786 c.active = 'pullrequests'
787 787 req_get = self.request.GET
788 788
789 789 c.closed = str2bool(req_get.get('pr_show_closed'))
790 790
791 791 return self._get_template_context(c)
792 792
793 793 @LoginRequired()
794 794 @NotAnonymous()
795 795 @view_config(
796 796 route_name='my_account_pullrequests_data',
797 797 request_method='GET', renderer='json_ext')
798 798 def my_account_pullrequests_data(self):
799 799 self.load_default_context()
800 800 req_get = self.request.GET
801 801 closed = str2bool(req_get.get('closed'))
802 802
803 803 statuses = [PullRequest.STATUS_NEW, PullRequest.STATUS_OPEN]
804 804 if closed:
805 805 statuses += [PullRequest.STATUS_CLOSED]
806 806
807 807 data = self._get_pull_requests_list(statuses=statuses)
808 808 return data
809 809
810 810 @LoginRequired()
811 811 @NotAnonymous()
812 812 @view_config(
813 813 route_name='my_account_user_group_membership',
814 814 request_method='GET',
815 815 renderer='rhodecode:templates/admin/my_account/my_account.mako')
816 816 def my_account_user_group_membership(self):
817 817 c = self.load_default_context()
818 818 c.active = 'user_group_membership'
819 819 groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
820 820 for group in self._rhodecode_db_user.group_member]
821 821 c.user_groups = json.dumps(groups)
822 822 return self._get_template_context(c)
@@ -1,1658 +1,1661 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20 import mock
21 21 import pytest
22 22
23 23 import rhodecode
24 24 from rhodecode.lib.vcs.backends.base import MergeResponse, MergeFailureReason
25 25 from rhodecode.lib.vcs.nodes import FileNode
26 26 from rhodecode.lib import helpers as h
27 27 from rhodecode.model.changeset_status import ChangesetStatusModel
28 28 from rhodecode.model.db import (
29 29 PullRequest, ChangesetStatus, UserLog, Notification, ChangesetComment, Repository)
30 30 from rhodecode.model.meta import Session
31 31 from rhodecode.model.pull_request import PullRequestModel
32 32 from rhodecode.model.user import UserModel
33 33 from rhodecode.model.comment import CommentsModel
34 34 from rhodecode.tests import (
35 35 assert_session_flash, TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_LOGIN)
36 36
37 37
38 38 def route_path(name, params=None, **kwargs):
39 39 import urllib
40 40
41 41 base_url = {
42 42 'repo_changelog': '/{repo_name}/changelog',
43 43 'repo_changelog_file': '/{repo_name}/changelog/{commit_id}/{f_path}',
44 44 'repo_commits': '/{repo_name}/commits',
45 45 'repo_commits_file': '/{repo_name}/commits/{commit_id}/{f_path}',
46 46 'pullrequest_show': '/{repo_name}/pull-request/{pull_request_id}',
47 47 'pullrequest_show_all': '/{repo_name}/pull-request',
48 48 'pullrequest_show_all_data': '/{repo_name}/pull-request-data',
49 49 'pullrequest_repo_refs': '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}',
50 50 'pullrequest_repo_targets': '/{repo_name}/pull-request/repo-destinations',
51 51 'pullrequest_new': '/{repo_name}/pull-request/new',
52 52 'pullrequest_create': '/{repo_name}/pull-request/create',
53 53 'pullrequest_update': '/{repo_name}/pull-request/{pull_request_id}/update',
54 54 'pullrequest_merge': '/{repo_name}/pull-request/{pull_request_id}/merge',
55 55 'pullrequest_delete': '/{repo_name}/pull-request/{pull_request_id}/delete',
56 56 'pullrequest_comment_create': '/{repo_name}/pull-request/{pull_request_id}/comment',
57 57 'pullrequest_comment_delete': '/{repo_name}/pull-request/{pull_request_id}/comment/{comment_id}/delete',
58 58 'pullrequest_comment_edit': '/{repo_name}/pull-request/{pull_request_id}/comment/{comment_id}/edit',
59 59 }[name].format(**kwargs)
60 60
61 61 if params:
62 62 base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
63 63 return base_url
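# e.g. route_path('pullrequest_show', repo_name='repo1', pull_request_id=1)
# resolves to '/repo1/pull-request/1' (repo name and id are illustrative)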
64 64
65 65
66 66 @pytest.mark.usefixtures('app', 'autologin_user')
67 67 @pytest.mark.backends("git", "hg")
68 68 class TestPullrequestsView(object):
69 69
70 70 def test_index(self, backend):
71 71 self.app.get(route_path(
72 72 'pullrequest_new',
73 73 repo_name=backend.repo_name))
74 74
75 75 def test_option_menu_create_pull_request_exists(self, backend):
76 76 repo_name = backend.repo_name
77 77 response = self.app.get(h.route_path('repo_summary', repo_name=repo_name))
78 78
79 79 create_pr_link = '<a href="%s">Create Pull Request</a>' % route_path(
80 80 'pullrequest_new', repo_name=repo_name)
81 81 response.mustcontain(create_pr_link)
82 82
83 83 def test_create_pr_form_with_raw_commit_id(self, backend):
84 84 repo = backend.repo
85 85
86 86 self.app.get(
87 87 route_path('pullrequest_new', repo_name=repo.repo_name,
88 88 commit=repo.get_commit().raw_id),
89 89 status=200)
90 90
91 91 @pytest.mark.parametrize('pr_merge_enabled', [True, False])
92 92 @pytest.mark.parametrize('range_diff', ["0", "1"])
93 93 def test_show(self, pr_util, pr_merge_enabled, range_diff):
94 94 pull_request = pr_util.create_pull_request(
95 95 mergeable=pr_merge_enabled, enable_notifications=False)
96 96
97 97 response = self.app.get(route_path(
98 98 'pullrequest_show',
99 99 repo_name=pull_request.target_repo.scm_instance().name,
100 100 pull_request_id=pull_request.pull_request_id,
101 101 params={'range-diff': range_diff}))
102 102
103 103 for commit_id in pull_request.revisions:
104 104 response.mustcontain(commit_id)
105 105
106 106 response.mustcontain(pull_request.target_ref_parts.type)
107 107 response.mustcontain(pull_request.target_ref_parts.name)
108 108
109 109 response.mustcontain('class="pull-request-merge"')
110 110
111 111 if pr_merge_enabled:
112 112 response.mustcontain('Pull request reviewer approval is pending')
113 113 else:
114 114 response.mustcontain('Server-side pull request merging is disabled.')
115 115
116 116 if range_diff == "1":
117 117 response.mustcontain('Turn off: Show the diff as commit range')
118 118
119 119 def test_show_versions_of_pr(self, backend, csrf_token):
120 120 commits = [
121 121 {'message': 'initial-commit',
122 122 'added': [FileNode('test-file.txt', 'LINE1\n')]},
123 123
124 124 {'message': 'commit-1',
125 125 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\n')]},
126 126 # Above is the initial version of PR that changes a single line
127 127
128 128 # from now on we'll add 3 more commits, each one adding another line
129 129 {'message': 'commit-2',
130 130 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\n')]},
131 131
132 132 {'message': 'commit-3',
133 133 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\nLINE4\n')]},
134 134
135 135 {'message': 'commit-4',
136 136 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\nLINE4\nLINE5\n')]},
137 137 ]
138 138
139 139 commit_ids = backend.create_master_repo(commits)
140 140 target = backend.create_repo(heads=['initial-commit'])
141 141 source = backend.create_repo(heads=['commit-1'])
142 142 source_repo_name = source.repo_name
143 143 target_repo_name = target.repo_name
144 144
145 145 target_ref = 'branch:{branch}:{commit_id}'.format(
146 146 branch=backend.default_branch_name, commit_id=commit_ids['initial-commit'])
147 147 source_ref = 'branch:{branch}:{commit_id}'.format(
148 148 branch=backend.default_branch_name, commit_id=commit_ids['commit-1'])
149 149
150 150 response = self.app.post(
151 151 route_path('pullrequest_create', repo_name=source.repo_name),
152 152 [
153 153 ('source_repo', source_repo_name),
154 154 ('source_ref', source_ref),
155 155 ('target_repo', target_repo_name),
156 156 ('target_ref', target_ref),
157 157 ('common_ancestor', commit_ids['initial-commit']),
158 158 ('pullrequest_title', 'Title'),
159 159 ('pullrequest_desc', 'Description'),
160 160 ('description_renderer', 'markdown'),
161 161 ('__start__', 'review_members:sequence'),
162 162 ('__start__', 'reviewer:mapping'),
163 163 ('user_id', '1'),
164 164 ('__start__', 'reasons:sequence'),
165 165 ('reason', 'Some reason'),
166 166 ('__end__', 'reasons:sequence'),
167 167 ('__start__', 'rules:sequence'),
168 168 ('__end__', 'rules:sequence'),
169 169 ('mandatory', 'False'),
170 170 ('__end__', 'reviewer:mapping'),
171 171 ('__end__', 'review_members:sequence'),
172 172 ('__start__', 'revisions:sequence'),
173 173 ('revisions', commit_ids['commit-1']),
174 174 ('__end__', 'revisions:sequence'),
175 175 ('user', ''),
176 176 ('csrf_token', csrf_token),
177 177 ],
178 178 status=302)
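# the '__start__'/'__end__' pairs above are peppercorn sequence/mapping
# markers; they let the flat form body be rebuilt server-side into the nested
# review_members -> reviewer -> reasons/rules structure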
179 179
180 180 location = response.headers['Location']
181 181
182 182 pull_request_id = location.rsplit('/', 1)[1]
183 183 assert pull_request_id != 'new'
184 184 pull_request = PullRequest.get(int(pull_request_id))
185 185
186 186 pull_request_id = pull_request.pull_request_id
187 187
188 188 # Show initial version of PR
189 189 response = self.app.get(
190 190 route_path('pullrequest_show',
191 191 repo_name=target_repo_name,
192 192 pull_request_id=pull_request_id))
193 193
194 194 response.mustcontain('commit-1')
195 195 response.mustcontain(no=['commit-2'])
196 196 response.mustcontain(no=['commit-3'])
197 197 response.mustcontain(no=['commit-4'])
198 198
199 199 response.mustcontain('cb-addition"></span><span>LINE2</span>')
200 200 response.mustcontain(no=['LINE3'])
201 201 response.mustcontain(no=['LINE4'])
202 202 response.mustcontain(no=['LINE5'])
203 203
204 204 # update PR #1
205 205 source_repo = Repository.get_by_repo_name(source_repo_name)
206 206 backend.pull_heads(source_repo, heads=['commit-2'])
207 207 response = self.app.post(
208 208 route_path('pullrequest_update',
209 209 repo_name=target_repo_name, pull_request_id=pull_request_id),
210 210 params={'update_commits': 'true', 'csrf_token': csrf_token})
211 211
212 212 # update PR #2
213 213 source_repo = Repository.get_by_repo_name(source_repo_name)
214 214 backend.pull_heads(source_repo, heads=['commit-3'])
215 215 response = self.app.post(
216 216 route_path('pullrequest_update',
217 217 repo_name=target_repo_name, pull_request_id=pull_request_id),
218 218 params={'update_commits': 'true', 'csrf_token': csrf_token})
219 219
220 220 # update PR #3
221 221 source_repo = Repository.get_by_repo_name(source_repo_name)
222 222 backend.pull_heads(source_repo, heads=['commit-4'])
223 223 response = self.app.post(
224 224 route_path('pullrequest_update',
225 225 repo_name=target_repo_name, pull_request_id=pull_request_id),
226 226 params={'update_commits': 'true', 'csrf_token': csrf_token})
227 227
228 228 # Show the final version!
229 229 response = self.app.get(
230 230 route_path('pullrequest_show',
231 231 repo_name=target_repo_name,
232 232 pull_request_id=pull_request_id))
233 233
234 234 # 3 updates, and the latest == 4
235 235 response.mustcontain('4 versions available for this pull request')
236 236 response.mustcontain(no=['rhodecode diff rendering error'])
237 237
238 238 # the final version must show all 4 commits, and 4 added lines
239 239 response.mustcontain('commit-1')
240 240 response.mustcontain('commit-2')
241 241 response.mustcontain('commit-3')
242 242 response.mustcontain('commit-4')
243 243
244 244 response.mustcontain('cb-addition"></span><span>LINE2</span>')
245 245 response.mustcontain('cb-addition"></span><span>LINE3</span>')
246 246 response.mustcontain('cb-addition"></span><span>LINE4</span>')
247 247 response.mustcontain('cb-addition"></span><span>LINE5</span>')
248 248
249 249 # fetch versions
250 250 pr = PullRequest.get(pull_request_id)
251 251 versions = [x.pull_request_version_id for x in pr.versions.all()]
252 252 assert len(versions) == 3
253 253
254 254 # show v1,v2,v3,v4
255 255 def cb_line(text):
256 256 return 'cb-addition"></span><span>{}</span>'.format(text)
257 257
258 258 def cb_context(text):
259 259 return '<span class="cb-code"><span class="cb-action cb-context">' \
260 260 '</span><span>{}</span></span>'.format(text)
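# cb_line matches the markup of an added line in the rendered diff, while
# cb_context matches an unchanged context line; both mirror the HTML emitted
# by the diff renderer and are used by the assertions below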
261 261
262 262 commit_tests = {
263 263 # (commits expected in the response, commits that must be absent)
264 264 1: (['commit-1'], ['commit-2', 'commit-3', 'commit-4']),
265 265 2: (['commit-1', 'commit-2'], ['commit-3', 'commit-4']),
266 266 3: (['commit-1', 'commit-2', 'commit-3'], ['commit-4']),
267 267 4: (['commit-1', 'commit-2', 'commit-3', 'commit-4'], []),
268 268 }
269 269 diff_tests = {
270 270 1: (['LINE2'], ['LINE3', 'LINE4', 'LINE5']),
271 271 2: (['LINE2', 'LINE3'], ['LINE4', 'LINE5']),
272 272 3: (['LINE2', 'LINE3', 'LINE4'], ['LINE5']),
273 273 4: (['LINE2', 'LINE3', 'LINE4', 'LINE5'], []),
274 274 }
275 275 for idx, ver in enumerate(versions, 1):
276 276
277 277 response = self.app.get(
278 278 route_path('pullrequest_show',
279 279 repo_name=target_repo_name,
280 280 pull_request_id=pull_request_id,
281 281 params={'version': ver}))
282 282
283 283 response.mustcontain(no=['rhodecode diff rendering error'])
284 284 response.mustcontain('Showing changes at v{}'.format(idx))
285 285
286 286 yes, no = commit_tests[idx]
287 287 for y in yes:
288 288 response.mustcontain(y)
289 289 for n in no:
290 290 response.mustcontain(no=n)
291 291
292 292 yes, no = diff_tests[idx]
293 293 for y in yes:
294 294 response.mustcontain(cb_line(y))
295 295 for n in no:
296 296 response.mustcontain(no=n)
297 297
298 298 # show diff between versions
299 299 diff_compare_tests = {
300 300 1: (['LINE3'], ['LINE1', 'LINE2']),
301 301 2: (['LINE3', 'LINE4'], ['LINE1', 'LINE2']),
302 302 3: (['LINE3', 'LINE4', 'LINE5'], ['LINE1', 'LINE2']),
303 303 }
304 304 for idx, ver in enumerate(versions, 1):
305 305 adds, context = diff_compare_tests[idx]
306 306
307 307 to_ver = ver+1
308 308 if idx == 3:
309 309 to_ver = 'latest'
310 310
311 311 response = self.app.get(
312 312 route_path('pullrequest_show',
313 313 repo_name=target_repo_name,
314 314 pull_request_id=pull_request_id,
315 315 params={'from_version': versions[0], 'version': to_ver}))
316 316
317 317 response.mustcontain(no=['rhodecode diff rendering error'])
318 318
319 319 for a in adds:
320 320 response.mustcontain(cb_line(a))
321 321 for c in context:
322 322 response.mustcontain(cb_context(c))
323 323
324 324 # test version v2 -> v3
325 325 response = self.app.get(
326 326 route_path('pullrequest_show',
327 327 repo_name=target_repo_name,
328 328 pull_request_id=pull_request_id,
329 329 params={'from_version': versions[1], 'version': versions[2]}))
330 330
331 331 response.mustcontain(cb_context('LINE1'))
332 332 response.mustcontain(cb_context('LINE2'))
333 333 response.mustcontain(cb_context('LINE3'))
334 334 response.mustcontain(cb_line('LINE4'))
335 335
336 336 def test_close_status_visibility(self, pr_util, user_util, csrf_token):
337 337 # Logout
338 338 response = self.app.post(
339 339 h.route_path('logout'),
340 340 params={'csrf_token': csrf_token})
341 341 # Login as regular user
342 342 response = self.app.post(h.route_path('login'),
343 343 {'username': TEST_USER_REGULAR_LOGIN,
344 344 'password': 'test12'})
345 345
346 346 pull_request = pr_util.create_pull_request(
347 347 author=TEST_USER_REGULAR_LOGIN)
348 348
349 349 response = self.app.get(route_path(
350 350 'pullrequest_show',
351 351 repo_name=pull_request.target_repo.scm_instance().name,
352 352 pull_request_id=pull_request.pull_request_id))
353 353
354 354 response.mustcontain('Server-side pull request merging is disabled.')
355 355
356 356 assert_response = response.assert_response()
357 357 # for a regular user without merge permissions, we don't see it
358 358 assert_response.no_element_exists('#close-pull-request-action')
359 359
360 360 user_util.grant_user_permission_to_repo(
361 361 pull_request.target_repo,
362 362 UserModel().get_by_username(TEST_USER_REGULAR_LOGIN),
363 363 'repository.write')
364 364 response = self.app.get(route_path(
365 365 'pullrequest_show',
366 366 repo_name=pull_request.target_repo.scm_instance().name,
367 367 pull_request_id=pull_request.pull_request_id))
368 368
369 369 response.mustcontain('Server-side pull request merging is disabled.')
370 370
371 371 assert_response = response.assert_response()
372 372 # now that the regular user has merge permissions, the CLOSE button is shown
373 373 assert_response.one_element_exists('#close-pull-request-action')
374 374
375 375 def test_show_invalid_commit_id(self, pr_util):
376 376 # Simulating invalid revisions which will cause a lookup error
377 377 pull_request = pr_util.create_pull_request()
378 378 pull_request.revisions = ['invalid']
379 379 Session().add(pull_request)
380 380 Session().commit()
381 381
382 382 response = self.app.get(route_path(
383 383 'pullrequest_show',
384 384 repo_name=pull_request.target_repo.scm_instance().name,
385 385 pull_request_id=pull_request.pull_request_id))
386 386
387 387 for commit_id in pull_request.revisions:
388 388 response.mustcontain(commit_id)
389 389
390 390 def test_show_invalid_source_reference(self, pr_util):
391 391 pull_request = pr_util.create_pull_request()
392 392 pull_request.source_ref = 'branch:b:invalid'
393 393 Session().add(pull_request)
394 394 Session().commit()
395 395
396 396 self.app.get(route_path(
397 397 'pullrequest_show',
398 398 repo_name=pull_request.target_repo.scm_instance().name,
399 399 pull_request_id=pull_request.pull_request_id))
400 400
401 401 def test_edit_title_description(self, pr_util, csrf_token):
402 402 pull_request = pr_util.create_pull_request()
403 403 pull_request_id = pull_request.pull_request_id
404 404
405 405 response = self.app.post(
406 406 route_path('pullrequest_update',
407 407 repo_name=pull_request.target_repo.repo_name,
408 408 pull_request_id=pull_request_id),
409 409 params={
410 410 'edit_pull_request': 'true',
411 411 'title': 'New title',
412 412 'description': 'New description',
413 413 'csrf_token': csrf_token})
414 414
415 415 assert_session_flash(
416 416 response, u'Pull request title & description updated.',
417 417 category='success')
418 418
419 419 pull_request = PullRequest.get(pull_request_id)
420 420 assert pull_request.title == 'New title'
421 421 assert pull_request.description == 'New description'
422 422
423 423 def test_edit_title_description_closed(self, pr_util, csrf_token):
424 424 pull_request = pr_util.create_pull_request()
425 425 pull_request_id = pull_request.pull_request_id
426 426 repo_name = pull_request.target_repo.repo_name
427 427 pr_util.close()
428 428
429 429 response = self.app.post(
430 430 route_path('pullrequest_update',
431 431 repo_name=repo_name, pull_request_id=pull_request_id),
432 432 params={
433 433 'edit_pull_request': 'true',
434 434 'title': 'New title',
435 435 'description': 'New description',
436 436 'csrf_token': csrf_token}, status=200)
437 437 assert_session_flash(
438 438 response, u'Cannot update closed pull requests.',
439 439 category='error')
440 440
441 441 def test_update_invalid_source_reference(self, pr_util, csrf_token):
442 442 from rhodecode.lib.vcs.backends.base import UpdateFailureReason
443 443
444 444 pull_request = pr_util.create_pull_request()
445 445 pull_request.source_ref = 'branch:invalid-branch:invalid-commit-id'
446 446 Session().add(pull_request)
447 447 Session().commit()
448 448
449 449 pull_request_id = pull_request.pull_request_id
450 450
451 451 response = self.app.post(
452 452 route_path('pullrequest_update',
453 453 repo_name=pull_request.target_repo.repo_name,
454 454 pull_request_id=pull_request_id),
455 455 params={'update_commits': 'true', 'csrf_token': csrf_token})
456 456
457 457 expected_msg = str(PullRequestModel.UPDATE_STATUS_MESSAGES[
458 458 UpdateFailureReason.MISSING_SOURCE_REF])
459 459 assert_session_flash(response, expected_msg, category='error')
460 460
461 461 def test_missing_target_reference(self, pr_util, csrf_token):
462 462 from rhodecode.lib.vcs.backends.base import MergeFailureReason
463 463 pull_request = pr_util.create_pull_request(
464 464 approved=True, mergeable=True)
465 465 unicode_reference = u'branch:invalid-branch:invalid-commit-id'
466 466 pull_request.target_ref = unicode_reference
467 467 Session().add(pull_request)
468 468 Session().commit()
469 469
470 470 pull_request_id = pull_request.pull_request_id
471 471 pull_request_url = route_path(
472 472 'pullrequest_show',
473 473 repo_name=pull_request.target_repo.repo_name,
474 474 pull_request_id=pull_request_id)
475 475
476 476 response = self.app.get(pull_request_url)
477 477 target_ref_id = 'invalid-branch'
478 478 merge_resp = MergeResponse(
479 479 True, True, '', MergeFailureReason.MISSING_TARGET_REF,
480 480 metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)})
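# MergeResponse is built here with positional arguments that presumably map
# to (possible, executed, merge_ref, failure_reason) plus metadata; its
# merge_status_message is what the view renders when the merge check fails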
481 481 response.assert_response().element_contains(
482 482 'div[data-role="merge-message"]', merge_resp.merge_status_message)
483 483
484 484 def test_comment_and_close_pull_request_custom_message_approved(
485 485 self, pr_util, csrf_token, xhr_header):
486 486
487 487 pull_request = pr_util.create_pull_request(approved=True)
488 488 pull_request_id = pull_request.pull_request_id
489 489 author = pull_request.user_id
490 490 repo = pull_request.target_repo.repo_id
491 491
492 492 self.app.post(
493 493 route_path('pullrequest_comment_create',
494 494 repo_name=pull_request.target_repo.scm_instance().name,
495 495 pull_request_id=pull_request_id),
496 496 params={
497 497 'close_pull_request': '1',
498 498 'text': 'Closing a PR',
499 499 'csrf_token': csrf_token},
500 500 extra_environ=xhr_header,)
501 501
502 502 journal = UserLog.query()\
503 503 .filter(UserLog.user_id == author)\
504 504 .filter(UserLog.repository_id == repo) \
505 505 .order_by(UserLog.user_log_id.asc()) \
506 506 .all()
507 507 assert journal[-1].action == 'repo.pull_request.close'
508 508
509 509 pull_request = PullRequest.get(pull_request_id)
510 510 assert pull_request.is_closed()
511 511
512 512 status = ChangesetStatusModel().get_status(
513 513 pull_request.source_repo, pull_request=pull_request)
514 514 assert status == ChangesetStatus.STATUS_APPROVED
515 515 comments = ChangesetComment().query() \
516 516 .filter(ChangesetComment.pull_request == pull_request) \
517 517 .order_by(ChangesetComment.comment_id.asc())\
518 518 .all()
519 519 assert comments[-1].text == 'Closing a PR'
520 520
521 521 def test_comment_force_close_pull_request_rejected(
522 522 self, pr_util, csrf_token, xhr_header):
523 523 pull_request = pr_util.create_pull_request()
524 524 pull_request_id = pull_request.pull_request_id
525 525 PullRequestModel().update_reviewers(
526 pull_request_id, [(1, ['reason'], False, []), (2, ['reason2'], False, [])],
526 pull_request_id, [
527 (1, ['reason'], False, 'reviewer', []),
528 (2, ['reason2'], False, 'reviewer', [])],
527 529 pull_request.author)
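# each reviewer entry above is a tuple, presumably of the form
# (user_id, reasons, mandatory, role, rules); the 'reviewer' string is the
# newly added role field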
528 530 author = pull_request.user_id
529 531 repo = pull_request.target_repo.repo_id
530 532
531 533 self.app.post(
532 534 route_path('pullrequest_comment_create',
533 535 repo_name=pull_request.target_repo.scm_instance().name,
534 536 pull_request_id=pull_request_id),
535 537 params={
536 538 'close_pull_request': '1',
537 539 'csrf_token': csrf_token},
538 540 extra_environ=xhr_header)
539 541
540 542 pull_request = PullRequest.get(pull_request_id)
541 543
542 544 journal = UserLog.query()\
543 545 .filter(UserLog.user_id == author, UserLog.repository_id == repo) \
544 546 .order_by(UserLog.user_log_id.asc()) \
545 547 .all()
546 548 assert journal[-1].action == 'repo.pull_request.close'
547 549
548 550 # check only the latest status, not the review status
549 551 status = ChangesetStatusModel().get_status(
550 552 pull_request.source_repo, pull_request=pull_request)
551 553 assert status == ChangesetStatus.STATUS_REJECTED
552 554
553 555 def test_comment_and_close_pull_request(
554 556 self, pr_util, csrf_token, xhr_header):
555 557 pull_request = pr_util.create_pull_request()
556 558 pull_request_id = pull_request.pull_request_id
557 559
558 560 response = self.app.post(
559 561 route_path('pullrequest_comment_create',
560 562 repo_name=pull_request.target_repo.scm_instance().name,
561 563 pull_request_id=pull_request.pull_request_id),
562 564 params={
563 565 'close_pull_request': 'true',
564 566 'csrf_token': csrf_token},
565 567 extra_environ=xhr_header)
566 568
567 569 assert response.json
568 570
569 571 pull_request = PullRequest.get(pull_request_id)
570 572 assert pull_request.is_closed()
571 573
572 574 # check only the latest status, not the review status
573 575 status = ChangesetStatusModel().get_status(
574 576 pull_request.source_repo, pull_request=pull_request)
575 577 assert status == ChangesetStatus.STATUS_REJECTED
576 578
577 579 def test_comment_and_close_pull_request_try_edit_comment(
578 580 self, pr_util, csrf_token, xhr_header
579 581 ):
580 582 pull_request = pr_util.create_pull_request()
581 583 pull_request_id = pull_request.pull_request_id
582 584 target_scm = pull_request.target_repo.scm_instance()
583 585 target_scm_name = target_scm.name
584 586
585 587 response = self.app.post(
586 588 route_path(
587 589 'pullrequest_comment_create',
588 590 repo_name=target_scm_name,
589 591 pull_request_id=pull_request_id,
590 592 ),
591 593 params={
592 594 'close_pull_request': 'true',
593 595 'csrf_token': csrf_token,
594 596 },
595 597 extra_environ=xhr_header)
596 598
597 599 assert response.json
598 600
599 601 pull_request = PullRequest.get(pull_request_id)
600 602 target_scm = pull_request.target_repo.scm_instance()
601 603 target_scm_name = target_scm.name
602 604 assert pull_request.is_closed()
603 605
604 606 # check only the latest status, not the review status
605 607 status = ChangesetStatusModel().get_status(
606 608 pull_request.source_repo, pull_request=pull_request)
607 609 assert status == ChangesetStatus.STATUS_REJECTED
608 610
609 611 comment_id = response.json.get('comment_id', None)
610 612 test_text = 'test'
611 613 response = self.app.post(
612 614 route_path(
613 615 'pullrequest_comment_edit',
614 616 repo_name=target_scm_name,
615 617 pull_request_id=pull_request_id,
616 618 comment_id=comment_id,
617 619 ),
618 620 extra_environ=xhr_header,
619 621 params={
620 622 'csrf_token': csrf_token,
621 623 'text': test_text,
622 624 },
623 625 status=403,
624 626 )
625 627 assert response.status_int == 403
626 628
627 629 def test_comment_and_comment_edit(self, pr_util, csrf_token, xhr_header):
628 630 pull_request = pr_util.create_pull_request()
629 631 target_scm = pull_request.target_repo.scm_instance()
630 632 target_scm_name = target_scm.name
631 633
632 634 response = self.app.post(
633 635 route_path(
634 636 'pullrequest_comment_create',
635 637 repo_name=target_scm_name,
636 638 pull_request_id=pull_request.pull_request_id),
637 639 params={
638 640 'csrf_token': csrf_token,
639 641 'text': 'init',
640 642 },
641 643 extra_environ=xhr_header,
642 644 )
643 645 assert response.json
644 646
645 647 comment_id = response.json.get('comment_id', None)
646 648 assert comment_id
647 649 test_text = 'test'
648 650 self.app.post(
649 651 route_path(
650 652 'pullrequest_comment_edit',
651 653 repo_name=target_scm_name,
652 654 pull_request_id=pull_request.pull_request_id,
653 655 comment_id=comment_id,
654 656 ),
655 657 extra_environ=xhr_header,
656 658 params={
657 659 'csrf_token': csrf_token,
658 660 'text': test_text,
659 661 'version': '0',
660 662 },
661 663
662 664 )
663 665 text_form_db = ChangesetComment.query().filter(
664 666 ChangesetComment.comment_id == comment_id).first().text
665 667 assert test_text == text_form_db
666 668
667 669 def test_comment_and_comment_edit_same_text(self, pr_util, csrf_token, xhr_header):
668 670 pull_request = pr_util.create_pull_request()
669 671 target_scm = pull_request.target_repo.scm_instance()
670 672 target_scm_name = target_scm.name
671 673
672 674 response = self.app.post(
673 675 route_path(
674 676 'pullrequest_comment_create',
675 677 repo_name=target_scm_name,
676 678 pull_request_id=pull_request.pull_request_id),
677 679 params={
678 680 'csrf_token': csrf_token,
679 681 'text': 'init',
680 682 },
681 683 extra_environ=xhr_header,
682 684 )
683 685 assert response.json
684 686
685 687 comment_id = response.json.get('comment_id', None)
686 688 assert comment_id
687 689 test_text = 'init'
688 690 response = self.app.post(
689 691 route_path(
690 692 'pullrequest_comment_edit',
691 693 repo_name=target_scm_name,
692 694 pull_request_id=pull_request.pull_request_id,
693 695 comment_id=comment_id,
694 696 ),
695 697 extra_environ=xhr_header,
696 698 params={
697 699 'csrf_token': csrf_token,
698 700 'text': test_text,
699 701 'version': '0',
700 702 },
701 703 status=404,
702 704
703 705 )
704 706 assert response.status_int == 404
705 707
706 708 def test_comment_and_try_edit_already_edited(self, pr_util, csrf_token, xhr_header):
707 709 pull_request = pr_util.create_pull_request()
708 710 target_scm = pull_request.target_repo.scm_instance()
709 711 target_scm_name = target_scm.name
710 712
711 713 response = self.app.post(
712 714 route_path(
713 715 'pullrequest_comment_create',
714 716 repo_name=target_scm_name,
715 717 pull_request_id=pull_request.pull_request_id),
716 718 params={
717 719 'csrf_token': csrf_token,
718 720 'text': 'init',
719 721 },
720 722 extra_environ=xhr_header,
721 723 )
722 724 assert response.json
723 725 comment_id = response.json.get('comment_id', None)
724 726 assert comment_id
725 727
726 728 test_text = 'test'
727 729 self.app.post(
728 730 route_path(
729 731 'pullrequest_comment_edit',
730 732 repo_name=target_scm_name,
731 733 pull_request_id=pull_request.pull_request_id,
732 734 comment_id=comment_id,
733 735 ),
734 736 extra_environ=xhr_header,
735 737 params={
736 738 'csrf_token': csrf_token,
737 739 'text': test_text,
738 740 'version': '0',
739 741 },
740 742
741 743 )
742 744 test_text_v2 = 'test_v2'
743 745 response = self.app.post(
744 746 route_path(
745 747 'pullrequest_comment_edit',
746 748 repo_name=target_scm_name,
747 749 pull_request_id=pull_request.pull_request_id,
748 750 comment_id=comment_id,
749 751 ),
750 752 extra_environ=xhr_header,
751 753 params={
752 754 'csrf_token': csrf_token,
753 755 'text': test_text_v2,
754 756 'version': '0',
755 757 },
756 758 status=409,
757 759 )
758 760 assert response.status_int == 409
759 761
760 762 text_form_db = ChangesetComment.query().filter(
761 763 ChangesetComment.comment_id == comment_id).first().text
762 764
763 765 assert test_text == text_form_db
764 766 assert test_text_v2 != text_form_db
765 767
766 768 def test_comment_and_comment_edit_permissions_forbidden(
767 769 self, autologin_regular_user, user_regular, user_admin, pr_util,
768 770 csrf_token, xhr_header):
769 771 pull_request = pr_util.create_pull_request(
770 772 author=user_admin.username, enable_notifications=False)
771 773 comment = CommentsModel().create(
772 774 text='test',
773 775 repo=pull_request.target_repo.scm_instance().name,
774 776 user=user_admin,
775 777 pull_request=pull_request,
776 778 )
777 779 response = self.app.post(
778 780 route_path(
779 781 'pullrequest_comment_edit',
780 782 repo_name=pull_request.target_repo.scm_instance().name,
781 783 pull_request_id=pull_request.pull_request_id,
782 784 comment_id=comment.comment_id,
783 785 ),
784 786 extra_environ=xhr_header,
785 787 params={
786 788 'csrf_token': csrf_token,
787 789 'text': 'test_text',
788 790 },
789 791 status=403,
790 792 )
791 793 assert response.status_int == 403
792 794
793 795 def test_create_pull_request(self, backend, csrf_token):
794 796 commits = [
795 797 {'message': 'ancestor'},
796 798 {'message': 'change'},
797 799 {'message': 'change2'},
798 800 ]
799 801 commit_ids = backend.create_master_repo(commits)
800 802 target = backend.create_repo(heads=['ancestor'])
801 803 source = backend.create_repo(heads=['change2'])
802 804
803 805 response = self.app.post(
804 806 route_path('pullrequest_create', repo_name=source.repo_name),
805 807 [
806 808 ('source_repo', source.repo_name),
807 809 ('source_ref', 'branch:default:' + commit_ids['change2']),
808 810 ('target_repo', target.repo_name),
809 811 ('target_ref', 'branch:default:' + commit_ids['ancestor']),
810 812 ('common_ancestor', commit_ids['ancestor']),
811 813 ('pullrequest_title', 'Title'),
812 814 ('pullrequest_desc', 'Description'),
813 815 ('description_renderer', 'markdown'),
814 816 ('__start__', 'review_members:sequence'),
815 817 ('__start__', 'reviewer:mapping'),
816 818 ('user_id', '1'),
817 819 ('__start__', 'reasons:sequence'),
818 820 ('reason', 'Some reason'),
819 821 ('__end__', 'reasons:sequence'),
820 822 ('__start__', 'rules:sequence'),
821 823 ('__end__', 'rules:sequence'),
822 824 ('mandatory', 'False'),
823 825 ('__end__', 'reviewer:mapping'),
824 826 ('__end__', 'review_members:sequence'),
825 827 ('__start__', 'revisions:sequence'),
826 828 ('revisions', commit_ids['change']),
827 829 ('revisions', commit_ids['change2']),
828 830 ('__end__', 'revisions:sequence'),
829 831 ('user', ''),
830 832 ('csrf_token', csrf_token),
831 833 ],
832 834 status=302)
833 835
834 836 location = response.headers['Location']
835 837 pull_request_id = location.rsplit('/', 1)[1]
836 838 assert pull_request_id != 'new'
837 839 pull_request = PullRequest.get(int(pull_request_id))
838 840
839 841 # check that we now have both revisions
840 842 assert pull_request.revisions == [commit_ids['change2'], commit_ids['change']]
841 843 assert pull_request.source_ref == 'branch:default:' + commit_ids['change2']
842 844 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
843 845 assert pull_request.target_ref == expected_target_ref
844 846
845 847 def test_reviewer_notifications(self, backend, csrf_token):
846 848 # We have to use app.post for this test so that the notifications are
847 849 # created properly along with the new PR
848 850 commits = [
849 851 {'message': 'ancestor',
850 852 'added': [FileNode('file_A', content='content_of_ancestor')]},
851 853 {'message': 'change',
852 854 'added': [FileNode('file_a', content='content_of_change')]},
853 855 {'message': 'change-child'},
854 856 {'message': 'ancestor-child', 'parents': ['ancestor'],
855 857 'added': [
856 858 FileNode('file_B', content='content_of_ancestor_child')]},
857 859 {'message': 'ancestor-child-2'},
858 860 ]
859 861 commit_ids = backend.create_master_repo(commits)
860 862 target = backend.create_repo(heads=['ancestor-child'])
861 863 source = backend.create_repo(heads=['change'])
862 864
863 865 response = self.app.post(
864 866 route_path('pullrequest_create', repo_name=source.repo_name),
865 867 [
866 868 ('source_repo', source.repo_name),
867 869 ('source_ref', 'branch:default:' + commit_ids['change']),
868 870 ('target_repo', target.repo_name),
869 871 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
870 872 ('common_ancestor', commit_ids['ancestor']),
871 873 ('pullrequest_title', 'Title'),
872 874 ('pullrequest_desc', 'Description'),
873 875 ('description_renderer', 'markdown'),
874 876 ('__start__', 'review_members:sequence'),
875 877 ('__start__', 'reviewer:mapping'),
876 878 ('user_id', '2'),
877 879 ('__start__', 'reasons:sequence'),
878 880 ('reason', 'Some reason'),
879 881 ('__end__', 'reasons:sequence'),
880 882 ('__start__', 'rules:sequence'),
881 883 ('__end__', 'rules:sequence'),
882 884 ('mandatory', 'False'),
883 885 ('__end__', 'reviewer:mapping'),
884 886 ('__end__', 'review_members:sequence'),
885 887 ('__start__', 'revisions:sequence'),
886 888 ('revisions', commit_ids['change']),
887 889 ('__end__', 'revisions:sequence'),
888 890 ('user', ''),
889 891 ('csrf_token', csrf_token),
890 892 ],
891 893 status=302)
892 894
893 895 location = response.headers['Location']
894 896
895 897 pull_request_id = location.rsplit('/', 1)[1]
896 898 assert pull_request_id != 'new'
897 899 pull_request = PullRequest.get(int(pull_request_id))
898 900
899 901 # Check that a notification was made
900 902 notifications = Notification.query()\
901 903 .filter(Notification.created_by == pull_request.author.user_id,
902 904 Notification.type_ == Notification.TYPE_PULL_REQUEST,
903 905 Notification.subject.contains(
904 906 "requested a pull request review. !%s" % pull_request_id))
905 907 assert len(notifications.all()) == 1
906 908
907 909 # Change reviewers and check that a notification was made
908 910 PullRequestModel().update_reviewers(
909 pull_request.pull_request_id, [(1, [], False, [])],
911 pull_request.pull_request_id, [
912 (1, [], False, 'reviewer', [])
913 ],
910 914 pull_request.author)
911 915 assert len(notifications.all()) == 2
912 916
913 def test_create_pull_request_stores_ancestor_commit_id(self, backend,
914 csrf_token):
917 def test_create_pull_request_stores_ancestor_commit_id(self, backend, csrf_token):
915 918 commits = [
916 919 {'message': 'ancestor',
917 920 'added': [FileNode('file_A', content='content_of_ancestor')]},
918 921 {'message': 'change',
919 922 'added': [FileNode('file_a', content='content_of_change')]},
920 923 {'message': 'change-child'},
921 924 {'message': 'ancestor-child', 'parents': ['ancestor'],
922 925 'added': [
923 926 FileNode('file_B', content='content_of_ancestor_child')]},
924 927 {'message': 'ancestor-child-2'},
925 928 ]
926 929 commit_ids = backend.create_master_repo(commits)
927 930 target = backend.create_repo(heads=['ancestor-child'])
928 931 source = backend.create_repo(heads=['change'])
929 932
930 933 response = self.app.post(
931 934 route_path('pullrequest_create', repo_name=source.repo_name),
932 935 [
933 936 ('source_repo', source.repo_name),
934 937 ('source_ref', 'branch:default:' + commit_ids['change']),
935 938 ('target_repo', target.repo_name),
936 939 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
937 940 ('common_ancestor', commit_ids['ancestor']),
938 941 ('pullrequest_title', 'Title'),
939 942 ('pullrequest_desc', 'Description'),
940 943 ('description_renderer', 'markdown'),
941 944 ('__start__', 'review_members:sequence'),
942 945 ('__start__', 'reviewer:mapping'),
943 946 ('user_id', '1'),
944 947 ('__start__', 'reasons:sequence'),
945 948 ('reason', 'Some reason'),
946 949 ('__end__', 'reasons:sequence'),
947 950 ('__start__', 'rules:sequence'),
948 951 ('__end__', 'rules:sequence'),
949 952 ('mandatory', 'False'),
950 953 ('__end__', 'reviewer:mapping'),
951 954 ('__end__', 'review_members:sequence'),
952 955 ('__start__', 'revisions:sequence'),
953 956 ('revisions', commit_ids['change']),
954 957 ('__end__', 'revisions:sequence'),
955 958 ('user', ''),
956 959 ('csrf_token', csrf_token),
957 960 ],
958 961 status=302)
959 962
960 963 location = response.headers['Location']
961 964
962 965 pull_request_id = location.rsplit('/', 1)[1]
963 966 assert pull_request_id != 'new'
964 967 pull_request = PullRequest.get(int(pull_request_id))
965 968
966 969 # target_ref has to point to the ancestor's commit_id in order to
967 970 # show the correct diff
968 971 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
969 972 assert pull_request.target_ref == expected_target_ref
970 973
971 974 # Check generated diff contents
972 975 response = response.follow()
973 976 response.mustcontain(no=['content_of_ancestor'])
974 977 response.mustcontain(no=['content_of_ancestor-child'])
975 978 response.mustcontain('content_of_change')
976 979
977 980 def test_merge_pull_request_enabled(self, pr_util, csrf_token):
978 981 # Clear any previous calls to rcextensions
979 982 rhodecode.EXTENSIONS.calls.clear()
980 983
981 984 pull_request = pr_util.create_pull_request(
982 985 approved=True, mergeable=True)
983 986 pull_request_id = pull_request.pull_request_id
984 987 repo_name = pull_request.target_repo.scm_instance().name
985 988
986 989 url = route_path('pullrequest_merge',
987 990 repo_name=repo_name,
988 991 pull_request_id=pull_request_id)
989 992 response = self.app.post(url, params={'csrf_token': csrf_token}).follow()
990 993
991 994 pull_request = PullRequest.get(pull_request_id)
992 995
993 996 assert response.status_int == 200
994 997 assert pull_request.is_closed()
995 998 assert_pull_request_status(
996 999 pull_request, ChangesetStatus.STATUS_APPROVED)
997 1000
998 1001 # Check the relevant log entries were added
999 1002 user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(3)
1000 1003 actions = [log.action for log in user_logs]
1001 1004 pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request)
1002 1005 expected_actions = [
1003 1006 u'repo.pull_request.close',
1004 1007 u'repo.pull_request.merge',
1005 1008 u'repo.pull_request.comment.create'
1006 1009 ]
1007 1010 assert actions == expected_actions
1008 1011
1009 1012 user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(4)
1010 1013 actions = [log for log in user_logs]
1011 1014 assert actions[-1].action == 'user.push'
1012 1015 assert actions[-1].action_data['commit_ids'] == pr_commit_ids
1013 1016
1014 1017 # Check post_push rcextension was really executed
1015 1018 push_calls = rhodecode.EXTENSIONS.calls['_push_hook']
1016 1019 assert len(push_calls) == 1
1017 1020 unused_last_call_args, last_call_kwargs = push_calls[0]
1018 1021 assert last_call_kwargs['action'] == 'push'
1019 1022 assert last_call_kwargs['commit_ids'] == pr_commit_ids
1020 1023
1021 1024 def test_merge_pull_request_disabled(self, pr_util, csrf_token):
1022 1025 pull_request = pr_util.create_pull_request(mergeable=False)
1023 1026 pull_request_id = pull_request.pull_request_id
1024 1027 pull_request = PullRequest.get(pull_request_id)
1025 1028
1026 1029 response = self.app.post(
1027 1030 route_path('pullrequest_merge',
1028 1031 repo_name=pull_request.target_repo.scm_instance().name,
1029 1032 pull_request_id=pull_request.pull_request_id),
1030 1033 params={'csrf_token': csrf_token}).follow()
1031 1034
1032 1035 assert response.status_int == 200
1033 1036 response.mustcontain(
1034 1037 'Merge is not currently possible because of below failed checks.')
1035 1038 response.mustcontain('Server-side pull request merging is disabled.')
1036 1039
1037 1040 @pytest.mark.skip_backends('svn')
1038 1041 def test_merge_pull_request_not_approved(self, pr_util, csrf_token):
1039 1042 pull_request = pr_util.create_pull_request(mergeable=True)
1040 1043 pull_request_id = pull_request.pull_request_id
1041 1044 repo_name = pull_request.target_repo.scm_instance().name
1042 1045
1043 1046 response = self.app.post(
1044 1047 route_path('pullrequest_merge',
1045 1048 repo_name=repo_name, pull_request_id=pull_request_id),
1046 1049 params={'csrf_token': csrf_token}).follow()
1047 1050
1048 1051 assert response.status_int == 200
1049 1052
1050 1053 response.mustcontain(
1051 1054 'Merge is not currently possible because of below failed checks.')
1052 1055 response.mustcontain('Pull request reviewer approval is pending.')
1053 1056
1054 1057 def test_merge_pull_request_renders_failure_reason(
1055 1058 self, user_regular, csrf_token, pr_util):
1056 1059 pull_request = pr_util.create_pull_request(mergeable=True, approved=True)
1057 1060 pull_request_id = pull_request.pull_request_id
1058 1061 repo_name = pull_request.target_repo.scm_instance().name
1059 1062
1060 1063 merge_resp = MergeResponse(True, False, 'STUB_COMMIT_ID',
1061 1064 MergeFailureReason.PUSH_FAILED,
1062 1065 metadata={'target': 'shadow repo',
1063 1066 'merge_commit': 'xxx'})
1064 1067 model_patcher = mock.patch.multiple(
1065 1068 PullRequestModel,
1066 1069 merge_repo=mock.Mock(return_value=merge_resp),
1067 1070 merge_status=mock.Mock(return_value=(None, True, 'WRONG_MESSAGE')))
1068 1071
1069 1072 with model_patcher:
1070 1073 response = self.app.post(
1071 1074 route_path('pullrequest_merge',
1072 1075 repo_name=repo_name,
1073 1076 pull_request_id=pull_request_id),
1074 1077 params={'csrf_token': csrf_token}, status=302)
1075 1078
1076 1079 merge_resp = MergeResponse(True, True, '', MergeFailureReason.PUSH_FAILED,
1077 1080 metadata={'target': 'shadow repo',
1078 1081 'merge_commit': 'xxx'})
1079 1082 assert_session_flash(response, merge_resp.merge_status_message)
1080 1083
1081 1084 def test_update_source_revision(self, backend, csrf_token):
1082 1085 commits = [
1083 1086 {'message': 'ancestor'},
1084 1087 {'message': 'change'},
1085 1088 {'message': 'change-2'},
1086 1089 ]
1087 1090 commit_ids = backend.create_master_repo(commits)
1088 1091 target = backend.create_repo(heads=['ancestor'])
1089 1092 source = backend.create_repo(heads=['change'])
1090 1093
1091 1094 # create a PR from the source repo into the target repo
1092 1095 pull_request = PullRequest()
1093 1096
1094 1097 pull_request.source_repo = source
1095 1098 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1096 1099 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1097 1100
1098 1101 pull_request.target_repo = target
1099 1102 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1100 1103 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1101 1104
1102 1105 pull_request.revisions = [commit_ids['change']]
1103 1106 pull_request.title = u"Test"
1104 1107 pull_request.description = u"Description"
1105 1108 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1106 1109 pull_request.pull_request_state = PullRequest.STATE_CREATED
1107 1110 Session().add(pull_request)
1108 1111 Session().commit()
1109 1112 pull_request_id = pull_request.pull_request_id
1110 1113
1111 1114 # source has ancestor - change - change-2
1112 1115 backend.pull_heads(source, heads=['change-2'])
1113 1116 target_repo_name = target.repo_name
1114 1117
1115 1118 # update PR
1116 1119 self.app.post(
1117 1120 route_path('pullrequest_update',
1118 1121 repo_name=target_repo_name, pull_request_id=pull_request_id),
1119 1122 params={'update_commits': 'true', 'csrf_token': csrf_token})
1120 1123
1121 1124 response = self.app.get(
1122 1125 route_path('pullrequest_show',
1123 1126 repo_name=target_repo_name,
1124 1127 pull_request_id=pull_request.pull_request_id))
1125 1128
1126 1129 assert response.status_int == 200
1127 1130 response.mustcontain('Pull request updated to')
1128 1131 response.mustcontain('with 1 added, 0 removed commits.')
1129 1132
1130 1133 # check that we now have both revisions
1131 1134 pull_request = PullRequest.get(pull_request_id)
1132 1135 assert pull_request.revisions == [commit_ids['change-2'], commit_ids['change']]
1133 1136
1134 1137 def test_update_target_revision(self, backend, csrf_token):
1135 1138 commits = [
1136 1139 {'message': 'ancestor'},
1137 1140 {'message': 'change'},
1138 1141 {'message': 'ancestor-new', 'parents': ['ancestor']},
1139 1142 {'message': 'change-rebased'},
1140 1143 ]
1141 1144 commit_ids = backend.create_master_repo(commits)
1142 1145 target = backend.create_repo(heads=['ancestor'])
1143 1146 source = backend.create_repo(heads=['change'])
1144 1147
1145 1148 # create pr from a in source to A in target
1146 1149 pull_request = PullRequest()
1147 1150
1148 1151 pull_request.source_repo = source
1149 1152 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1150 1153 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1151 1154
1152 1155 pull_request.target_repo = target
1153 1156 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1154 1157 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1155 1158
1156 1159 pull_request.revisions = [commit_ids['change']]
1157 1160 pull_request.title = u"Test"
1158 1161 pull_request.description = u"Description"
1159 1162 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1160 1163 pull_request.pull_request_state = PullRequest.STATE_CREATED
1161 1164
1162 1165 Session().add(pull_request)
1163 1166 Session().commit()
1164 1167 pull_request_id = pull_request.pull_request_id
1165 1168
1166 1169 # target has ancestor - ancestor-new
1167 1170 # source has ancestor - ancestor-new - change-rebased
1168 1171 backend.pull_heads(target, heads=['ancestor-new'])
1169 1172 backend.pull_heads(source, heads=['change-rebased'])
1170 1173 target_repo_name = target.repo_name
1171 1174
1172 1175 # update PR
1173 1176 url = route_path('pullrequest_update',
1174 1177 repo_name=target_repo_name,
1175 1178 pull_request_id=pull_request_id)
1176 1179 self.app.post(url,
1177 1180 params={'update_commits': 'true', 'csrf_token': csrf_token},
1178 1181 status=200)
1179 1182
1180 1183 # check that we now have both revisions
1181 1184 pull_request = PullRequest.get(pull_request_id)
1182 1185 assert pull_request.revisions == [commit_ids['change-rebased']]
1183 1186 assert pull_request.target_ref == 'branch:{branch}:{commit_id}'.format(
1184 1187 branch=backend.default_branch_name, commit_id=commit_ids['ancestor-new'])
1185 1188
1186 1189 response = self.app.get(
1187 1190 route_path('pullrequest_show',
1188 1191 repo_name=target_repo_name,
1189 1192 pull_request_id=pull_request.pull_request_id))
1190 1193 assert response.status_int == 200
1191 1194 response.mustcontain('Pull request updated to')
1192 1195 response.mustcontain('with 1 added, 1 removed commits.')
1193 1196
1194 1197 def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token):
1195 1198 backend = backend_git
1196 1199 commits = [
1197 1200 {'message': 'master-commit-1'},
1198 1201 {'message': 'master-commit-2-change-1'},
1199 1202 {'message': 'master-commit-3-change-2'},
1200 1203
1201 1204 {'message': 'feat-commit-1', 'parents': ['master-commit-1']},
1202 1205 {'message': 'feat-commit-2'},
1203 1206 ]
1204 1207 commit_ids = backend.create_master_repo(commits)
1205 1208 target = backend.create_repo(heads=['master-commit-3-change-2'])
1206 1209 source = backend.create_repo(heads=['feat-commit-2'])
1207 1210
1208 1211 # create pr from a in source to A in target
1209 1212 pull_request = PullRequest()
1210 1213 pull_request.source_repo = source
1211 1214
1212 1215 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1213 1216 branch=backend.default_branch_name,
1214 1217 commit_id=commit_ids['master-commit-3-change-2'])
1215 1218
1216 1219 pull_request.target_repo = target
1217 1220 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1218 1221 branch=backend.default_branch_name, commit_id=commit_ids['feat-commit-2'])
1219 1222
1220 1223 pull_request.revisions = [
1221 1224 commit_ids['feat-commit-1'],
1222 1225 commit_ids['feat-commit-2']
1223 1226 ]
1224 1227 pull_request.title = u"Test"
1225 1228 pull_request.description = u"Description"
1226 1229 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1227 1230 pull_request.pull_request_state = PullRequest.STATE_CREATED
1228 1231 Session().add(pull_request)
1229 1232 Session().commit()
1230 1233 pull_request_id = pull_request.pull_request_id
1231 1234
1232 1235 # PR is created, now we simulate a force-push into target,
1233 1236 # that drops the last 2 commits
1234 1237 vcsrepo = target.scm_instance()
1235 1238 vcsrepo.config.clear_section('hooks')
1236 1239 vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2'])
1237 1240 target_repo_name = target.repo_name
1238 1241
1239 1242 # update PR
1240 1243 url = route_path('pullrequest_update',
1241 1244 repo_name=target_repo_name,
1242 1245 pull_request_id=pull_request_id)
1243 1246 self.app.post(url,
1244 1247 params={'update_commits': 'true', 'csrf_token': csrf_token},
1245 1248 status=200)
1246 1249
1247 1250 response = self.app.get(route_path('pullrequest_new', repo_name=target_repo_name))
1248 1251 assert response.status_int == 200
1249 1252 response.mustcontain('Pull request updated to')
1250 1253 response.mustcontain('with 0 added, 0 removed commits.')
1251 1254
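The force-push situation simulated above (target history rewritten so the two newest commits disappear) can also be reproduced outside the test fixtures with plain git. The following standalone sketch assumes a local repository path of your own and is not part of the test API:

import subprocess

def drop_last_commits(repo_path, count=2):
    # Move the current branch head back by `count` commits without touching
    # the working tree, the same effect as the 'reset --soft HEAD~2' call above.
    subprocess.check_call(
        ['git', 'reset', '--soft', 'HEAD~{}'.format(count)], cwd=repo_path)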
1252 1255 def test_update_of_ancestor_reference(self, backend, csrf_token):
1253 1256 commits = [
1254 1257 {'message': 'ancestor'},
1255 1258 {'message': 'change'},
1256 1259 {'message': 'change-2'},
1257 1260 {'message': 'ancestor-new', 'parents': ['ancestor']},
1258 1261 {'message': 'change-rebased'},
1259 1262 ]
1260 1263 commit_ids = backend.create_master_repo(commits)
1261 1264 target = backend.create_repo(heads=['ancestor'])
1262 1265 source = backend.create_repo(heads=['change'])
1263 1266
1264 1267 # create pr from a in source to A in target
1265 1268 pull_request = PullRequest()
1266 1269 pull_request.source_repo = source
1267 1270
1268 1271 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1269 1272 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1270 1273 pull_request.target_repo = target
1271 1274 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1272 1275 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1273 1276 pull_request.revisions = [commit_ids['change']]
1274 1277 pull_request.title = u"Test"
1275 1278 pull_request.description = u"Description"
1276 1279 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1277 1280 pull_request.pull_request_state = PullRequest.STATE_CREATED
1278 1281 Session().add(pull_request)
1279 1282 Session().commit()
1280 1283 pull_request_id = pull_request.pull_request_id
1281 1284
1282 1285 # target has ancestor - ancestor-new
1283 1286 # source has ancestor - ancestor-new - change-rebased
1284 1287 backend.pull_heads(target, heads=['ancestor-new'])
1285 1288 backend.pull_heads(source, heads=['change-rebased'])
1286 1289 target_repo_name = target.repo_name
1287 1290
1288 1291 # update PR
1289 1292 self.app.post(
1290 1293 route_path('pullrequest_update',
1291 1294 repo_name=target_repo_name, pull_request_id=pull_request_id),
1292 1295 params={'update_commits': 'true', 'csrf_token': csrf_token},
1293 1296 status=200)
1294 1297
1295 1298 # Expect the target reference to be updated correctly
1296 1299 pull_request = PullRequest.get(pull_request_id)
1297 1300 assert pull_request.revisions == [commit_ids['change-rebased']]
1298 1301 expected_target_ref = 'branch:{branch}:{commit_id}'.format(
1299 1302 branch=backend.default_branch_name,
1300 1303 commit_id=commit_ids['ancestor-new'])
1301 1304 assert pull_request.target_ref == expected_target_ref
1302 1305
1303 1306 def test_remove_pull_request_branch(self, backend_git, csrf_token):
1304 1307 branch_name = 'development'
1305 1308 commits = [
1306 1309 {'message': 'initial-commit'},
1307 1310 {'message': 'old-feature'},
1308 1311 {'message': 'new-feature', 'branch': branch_name},
1309 1312 ]
1310 1313 repo = backend_git.create_repo(commits)
1311 1314 repo_name = repo.repo_name
1312 1315 commit_ids = backend_git.commit_ids
1313 1316
1314 1317 pull_request = PullRequest()
1315 1318 pull_request.source_repo = repo
1316 1319 pull_request.target_repo = repo
1317 1320 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1318 1321 branch=branch_name, commit_id=commit_ids['new-feature'])
1319 1322 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1320 1323 branch=backend_git.default_branch_name, commit_id=commit_ids['old-feature'])
1321 1324 pull_request.revisions = [commit_ids['new-feature']]
1322 1325 pull_request.title = u"Test"
1323 1326 pull_request.description = u"Description"
1324 1327 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1325 1328 pull_request.pull_request_state = PullRequest.STATE_CREATED
1326 1329 Session().add(pull_request)
1327 1330 Session().commit()
1328 1331
1329 1332 pull_request_id = pull_request.pull_request_id
1330 1333
1331 1334 vcs = repo.scm_instance()
1332 1335 vcs.remove_ref('refs/heads/{}'.format(branch_name))
1333 1336 # NOTE(marcink): run GC to ensure the commits are gone
1334 1337 vcs.run_gc()
1335 1338
1336 1339 response = self.app.get(route_path(
1337 1340 'pullrequest_show',
1338 1341 repo_name=repo_name,
1339 1342 pull_request_id=pull_request_id))
1340 1343
1341 1344 assert response.status_int == 200
1342 1345
1343 1346 response.assert_response().element_contains(
1344 1347 '#changeset_compare_view_content .alert strong',
1345 1348 'Missing commits')
1346 1349 response.assert_response().element_contains(
1347 1350 '#changeset_compare_view_content .alert',
1348 1351 'This pull request cannot be displayed, because one or more'
1349 1352 ' commits no longer exist in the source repository.')
1350 1353
1351 1354 def test_strip_commits_from_pull_request(
1352 1355 self, backend, pr_util, csrf_token):
1353 1356 commits = [
1354 1357 {'message': 'initial-commit'},
1355 1358 {'message': 'old-feature'},
1356 1359 {'message': 'new-feature', 'parents': ['initial-commit']},
1357 1360 ]
1358 1361 pull_request = pr_util.create_pull_request(
1359 1362 commits, target_head='initial-commit', source_head='new-feature',
1360 1363 revisions=['new-feature'])
1361 1364
1362 1365 vcs = pr_util.source_repository.scm_instance()
1363 1366 if backend.alias == 'git':
1364 1367 vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
1365 1368 else:
1366 1369 vcs.strip(pr_util.commit_ids['new-feature'])
1367 1370
1368 1371 response = self.app.get(route_path(
1369 1372 'pullrequest_show',
1370 1373 repo_name=pr_util.target_repository.repo_name,
1371 1374 pull_request_id=pull_request.pull_request_id))
1372 1375
1373 1376 assert response.status_int == 200
1374 1377
1375 1378 response.assert_response().element_contains(
1376 1379 '#changeset_compare_view_content .alert strong',
1377 1380 'Missing commits')
1378 1381 response.assert_response().element_contains(
1379 1382 '#changeset_compare_view_content .alert',
1380 1383 'This pull request cannot be displayed, because one or more'
1381 1384 ' commits no longer exist in the source repository.')
1382 1385 response.assert_response().element_contains(
1383 1386 '#update_commits',
1384 1387 'Update commits')
1385 1388
1386 1389 def test_strip_commits_and_update(
1387 1390 self, backend, pr_util, csrf_token):
1388 1391 commits = [
1389 1392 {'message': 'initial-commit'},
1390 1393 {'message': 'old-feature'},
1391 1394 {'message': 'new-feature', 'parents': ['old-feature']},
1392 1395 ]
1393 1396 pull_request = pr_util.create_pull_request(
1394 1397 commits, target_head='old-feature', source_head='new-feature',
1395 1398 revisions=['new-feature'], mergeable=True)
1396 1399 pr_id = pull_request.pull_request_id
1397 1400 target_repo_name = pull_request.target_repo.repo_name
1398 1401
1399 1402 vcs = pr_util.source_repository.scm_instance()
1400 1403 if backend.alias == 'git':
1401 1404 vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
1402 1405 else:
1403 1406 vcs.strip(pr_util.commit_ids['new-feature'])
1404 1407
1405 1408 url = route_path('pullrequest_update',
1406 1409 repo_name=target_repo_name,
1407 1410 pull_request_id=pr_id)
1408 1411 response = self.app.post(url,
1409 1412 params={'update_commits': 'true',
1410 1413 'csrf_token': csrf_token})
1411 1414
1412 1415 assert response.status_int == 200
1413 1416 assert response.body == '{"response": true, "redirect_url": null}'
1414 1417
1415 1418 # Make sure that after update, it won't raise 500 errors
1416 1419 response = self.app.get(route_path(
1417 1420 'pullrequest_show',
1418 1421 repo_name=target_repo_name,
1419 1422 pull_request_id=pr_id))
1420 1423
1421 1424 assert response.status_int == 200
1422 1425 response.assert_response().element_contains(
1423 1426 '#changeset_compare_view_content .alert strong',
1424 1427 'Missing commits')
1425 1428
1426 1429 def test_branch_is_a_link(self, pr_util):
1427 1430 pull_request = pr_util.create_pull_request()
1428 1431 pull_request.source_ref = 'branch:origin:1234567890abcdef'
1429 1432 pull_request.target_ref = 'branch:target:abcdef1234567890'
1430 1433 Session().add(pull_request)
1431 1434 Session().commit()
1432 1435
1433 1436 response = self.app.get(route_path(
1434 1437 'pullrequest_show',
1435 1438 repo_name=pull_request.target_repo.scm_instance().name,
1436 1439 pull_request_id=pull_request.pull_request_id))
1437 1440 assert response.status_int == 200
1438 1441
1439 1442 source = response.assert_response().get_element('.pr-source-info')
1440 1443 source_parent = source.getparent()
1441 1444 assert len(source_parent) == 1
1442 1445
1443 1446 target = response.assert_response().get_element('.pr-target-info')
1444 1447 target_parent = target.getparent()
1445 1448 assert len(target_parent) == 1
1446 1449
1447 1450 expected_origin_link = route_path(
1448 1451 'repo_commits',
1449 1452 repo_name=pull_request.source_repo.scm_instance().name,
1450 1453 params=dict(branch='origin'))
1451 1454 expected_target_link = route_path(
1452 1455 'repo_commits',
1453 1456 repo_name=pull_request.target_repo.scm_instance().name,
1454 1457 params=dict(branch='target'))
1455 1458 assert source_parent.attrib['href'] == expected_origin_link
1456 1459 assert target_parent.attrib['href'] == expected_target_link
1457 1460
1458 1461 def test_bookmark_is_not_a_link(self, pr_util):
1459 1462 pull_request = pr_util.create_pull_request()
1460 1463 pull_request.source_ref = 'bookmark:origin:1234567890abcdef'
1461 1464 pull_request.target_ref = 'bookmark:target:abcdef1234567890'
1462 1465 Session().add(pull_request)
1463 1466 Session().commit()
1464 1467
1465 1468 response = self.app.get(route_path(
1466 1469 'pullrequest_show',
1467 1470 repo_name=pull_request.target_repo.scm_instance().name,
1468 1471 pull_request_id=pull_request.pull_request_id))
1469 1472 assert response.status_int == 200
1470 1473
1471 1474 source = response.assert_response().get_element('.pr-source-info')
1472 1475 assert source.text.strip() == 'bookmark:origin'
1473 1476 assert source.getparent().attrib.get('href') is None
1474 1477
1475 1478 target = response.assert_response().get_element('.pr-target-info')
1476 1479 assert target.text.strip() == 'bookmark:target'
1477 1480 assert target.getparent().attrib.get('href') is None
1478 1481
1479 1482 def test_tag_is_not_a_link(self, pr_util):
1480 1483 pull_request = pr_util.create_pull_request()
1481 1484 pull_request.source_ref = 'tag:origin:1234567890abcdef'
1482 1485 pull_request.target_ref = 'tag:target:abcdef1234567890'
1483 1486 Session().add(pull_request)
1484 1487 Session().commit()
1485 1488
1486 1489 response = self.app.get(route_path(
1487 1490 'pullrequest_show',
1488 1491 repo_name=pull_request.target_repo.scm_instance().name,
1489 1492 pull_request_id=pull_request.pull_request_id))
1490 1493 assert response.status_int == 200
1491 1494
1492 1495 source = response.assert_response().get_element('.pr-source-info')
1493 1496 assert source.text.strip() == 'tag:origin'
1494 1497 assert source.getparent().attrib.get('href') is None
1495 1498
1496 1499 target = response.assert_response().get_element('.pr-target-info')
1497 1500 assert target.text.strip() == 'tag:target'
1498 1501 assert target.getparent().attrib.get('href') is None
1499 1502
1500 1503 @pytest.mark.parametrize('mergeable', [True, False])
1501 1504 def test_shadow_repository_link(
1502 1505 self, mergeable, pr_util, http_host_only_stub):
1503 1506 """
1504 1507 Check that the pull request summary page displays a link to the shadow
1505 1508 repository if the pull request is mergeable. If it is not mergeable
1506 1509 the link should not be displayed.
1507 1510 """
1508 1511 pull_request = pr_util.create_pull_request(
1509 1512 mergeable=mergeable, enable_notifications=False)
1510 1513 target_repo = pull_request.target_repo.scm_instance()
1511 1514 pr_id = pull_request.pull_request_id
1512 1515 shadow_url = '{host}/{repo}/pull-request/{pr_id}/repository'.format(
1513 1516 host=http_host_only_stub, repo=target_repo.name, pr_id=pr_id)
1514 1517
1515 1518 response = self.app.get(route_path(
1516 1519 'pullrequest_show',
1517 1520 repo_name=target_repo.name,
1518 1521 pull_request_id=pr_id))
1519 1522
1520 1523 if mergeable:
1521 1524 response.assert_response().element_value_contains(
1522 1525 'input.pr-mergeinfo', shadow_url)
1523 1526 response.assert_response().element_value_contains(
1524 1527 'input.pr-mergeinfo ', 'pr-merge')
1525 1528 else:
1526 1529 response.assert_response().no_element_exists('.pr-mergeinfo')
1527 1530
1528 1531
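Aside on the URL asserted above: the shadow-repository clone location checked by test_shadow_repository_link follows the host/repo/pull-request/<id>/repository pattern built in the test. A minimal standalone sketch of that composition, where the host and repository names are placeholders rather than fixture values:

def shadow_repo_url(host, repo_name, pr_id):
    # Merge-preview ("shadow") repository location for a pull request,
    # mirroring the format string used in the test above.
    return '{host}/{repo}/pull-request/{pr_id}/repository'.format(
        host=host, repo=repo_name, pr_id=pr_id)

print(shadow_repo_url('code.example.com', 'group/my-repo', 42))
# code.example.com/group/my-repo/pull-request/42/repository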
1529 1532 @pytest.mark.usefixtures('app')
1530 1533 @pytest.mark.backends("git", "hg")
1531 1534 class TestPullrequestsControllerDelete(object):
1532 1535 def test_pull_request_delete_button_permissions_admin(
1533 1536 self, autologin_user, user_admin, pr_util):
1534 1537 pull_request = pr_util.create_pull_request(
1535 1538 author=user_admin.username, enable_notifications=False)
1536 1539
1537 1540 response = self.app.get(route_path(
1538 1541 'pullrequest_show',
1539 1542 repo_name=pull_request.target_repo.scm_instance().name,
1540 1543 pull_request_id=pull_request.pull_request_id))
1541 1544
1542 1545 response.mustcontain('id="delete_pullrequest"')
1543 1546 response.mustcontain('Confirm to delete this pull request')
1544 1547
1545 1548 def test_pull_request_delete_button_permissions_owner(
1546 1549 self, autologin_regular_user, user_regular, pr_util):
1547 1550 pull_request = pr_util.create_pull_request(
1548 1551 author=user_regular.username, enable_notifications=False)
1549 1552
1550 1553 response = self.app.get(route_path(
1551 1554 'pullrequest_show',
1552 1555 repo_name=pull_request.target_repo.scm_instance().name,
1553 1556 pull_request_id=pull_request.pull_request_id))
1554 1557
1555 1558 response.mustcontain('id="delete_pullrequest"')
1556 1559 response.mustcontain('Confirm to delete this pull request')
1557 1560
1558 1561 def test_pull_request_delete_button_permissions_forbidden(
1559 1562 self, autologin_regular_user, user_regular, user_admin, pr_util):
1560 1563 pull_request = pr_util.create_pull_request(
1561 1564 author=user_admin.username, enable_notifications=False)
1562 1565
1563 1566 response = self.app.get(route_path(
1564 1567 'pullrequest_show',
1565 1568 repo_name=pull_request.target_repo.scm_instance().name,
1566 1569 pull_request_id=pull_request.pull_request_id))
1567 1570 response.mustcontain(no=['id="delete_pullrequest"'])
1568 1571 response.mustcontain(no=['Confirm to delete this pull request'])
1569 1572
1570 1573 def test_pull_request_delete_button_permissions_can_update_cannot_delete(
1571 1574 self, autologin_regular_user, user_regular, user_admin, pr_util,
1572 1575 user_util):
1573 1576
1574 1577 pull_request = pr_util.create_pull_request(
1575 1578 author=user_admin.username, enable_notifications=False)
1576 1579
1577 1580 user_util.grant_user_permission_to_repo(
1578 1581 pull_request.target_repo, user_regular,
1579 1582 'repository.write')
1580 1583
1581 1584 response = self.app.get(route_path(
1582 1585 'pullrequest_show',
1583 1586 repo_name=pull_request.target_repo.scm_instance().name,
1584 1587 pull_request_id=pull_request.pull_request_id))
1585 1588
1586 1589 response.mustcontain('id="open_edit_pullrequest"')
1587 1590 response.mustcontain('id="delete_pullrequest"')
1588 1591 response.mustcontain(no=['Confirm to delete this pull request'])
1589 1592
1590 1593 def test_delete_comment_returns_404_if_comment_does_not_exist(
1591 1594 self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):
1592 1595
1593 1596 pull_request = pr_util.create_pull_request(
1594 1597 author=user_admin.username, enable_notifications=False)
1595 1598
1596 1599 self.app.post(
1597 1600 route_path(
1598 1601 'pullrequest_comment_delete',
1599 1602 repo_name=pull_request.target_repo.scm_instance().name,
1600 1603 pull_request_id=pull_request.pull_request_id,
1601 1604 comment_id=1024404),
1602 1605 extra_environ=xhr_header,
1603 1606 params={'csrf_token': csrf_token},
1604 1607 status=404
1605 1608 )
1606 1609
1607 1610 def test_delete_comment(
1608 1611 self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):
1609 1612
1610 1613 pull_request = pr_util.create_pull_request(
1611 1614 author=user_admin.username, enable_notifications=False)
1612 1615 comment = pr_util.create_comment()
1613 1616 comment_id = comment.comment_id
1614 1617
1615 1618 response = self.app.post(
1616 1619 route_path(
1617 1620 'pullrequest_comment_delete',
1618 1621 repo_name=pull_request.target_repo.scm_instance().name,
1619 1622 pull_request_id=pull_request.pull_request_id,
1620 1623 comment_id=comment_id),
1621 1624 extra_environ=xhr_header,
1622 1625 params={'csrf_token': csrf_token},
1623 1626 status=200
1624 1627 )
1625 1628 assert response.body == 'true'
1626 1629
1627 1630 @pytest.mark.parametrize('url_type', [
1628 1631 'pullrequest_new',
1629 1632 'pullrequest_create',
1630 1633 'pullrequest_update',
1631 1634 'pullrequest_merge',
1632 1635 ])
1633 1636 def test_pull_request_is_forbidden_on_archived_repo(
1634 1637 self, autologin_user, backend, xhr_header, user_util, url_type):
1635 1638
1636 1639 # create a temporary repo
1637 1640 source = user_util.create_repo(repo_type=backend.alias)
1638 1641 repo_name = source.repo_name
1639 1642 repo = Repository.get_by_repo_name(repo_name)
1640 1643 repo.archived = True
1641 1644 Session().commit()
1642 1645
1643 1646 response = self.app.get(
1644 1647 route_path(url_type, repo_name=repo_name, pull_request_id=1), status=302)
1645 1648
1646 1649 msg = 'Action not supported for archived repository.'
1647 1650 assert_session_flash(response, msg)
1648 1651
1649 1652
1650 1653 def assert_pull_request_status(pull_request, expected_status):
1651 1654 status = ChangesetStatusModel().calculated_review_status(pull_request=pull_request)
1652 1655 assert status == expected_status
1653 1656
1654 1657
1655 1658 @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create'])
1656 1659 @pytest.mark.usefixtures("autologin_user")
1657 1660 def test_forbidden_to_repo_summary_for_svn_repositories(backend_svn, app, route):
1658 1661 app.get(route_path(route, repo_name=backend_svn.repo_name), status=404)
@@ -1,87 +1,111 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 from rhodecode.lib import helpers as h
21 from rhodecode.lib import helpers as h, rc_cache
22 22 from rhodecode.lib.utils2 import safe_int
23 23 from rhodecode.model.pull_request import get_diff_info
24
25 REVIEWER_API_VERSION = 'V3'
24 from rhodecode.model.db import PullRequestReviewers
25 # V3 - Reviewers, with default rules data
26 # V4 - Added observers metadata
27 REVIEWER_API_VERSION = 'V4'
26 28
27 29
28 def reviewer_as_json(user, reasons=None, mandatory=False, rules=None, user_group=None):
30 def reviewer_as_json(user, reasons=None, role=None, mandatory=False, rules=None, user_group=None):
29 31 """
30 32 Returns json struct of a reviewer for frontend
31 33
32 34 :param user: the reviewer
33 35 :param reasons: list of strings of why they are reviewers
34 36 :param mandatory: bool, to set user as mandatory
35 37 """
38 role = role or PullRequestReviewers.ROLE_REVIEWER
39 if role not in PullRequestReviewers.ROLES:
40 raise ValueError('role is not one of %s' % (PullRequestReviewers.ROLES,))
36 41
37 42 return {
38 43 'user_id': user.user_id,
39 44 'reasons': reasons or [],
40 45 'rules': rules or [],
46 'role': role,
41 47 'mandatory': mandatory,
42 48 'user_group': user_group,
43 49 'username': user.username,
44 50 'first_name': user.first_name,
45 51 'last_name': user.last_name,
46 52 'user_link': h.link_to_user(user),
47 53 'gravatar_link': h.gravatar_url(user.email, 14),
48 54 }
49 55
50 56
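To make the frontend contract above concrete, here is an illustrative payload using the field names from reviewer_as_json(). Every value is invented for the example, and the 'reviewer' role string assumes PullRequestReviewers.ROLE_REVIEWER resolves to 'reviewer':

example_reviewer = {
    'user_id': 2,
    'reasons': ['Default reviewer', 'Repository owner'],
    'rules': [],
    'role': 'reviewer',   # assumed value of PullRequestReviewers.ROLE_REVIEWER
    'mandatory': False,
    'user_group': None,
    'username': 'admin',
    'first_name': 'RhodeCode',
    'last_name': 'Admin',
    'user_link': '<a href="/_admin/users/2">admin</a>',      # invented h.link_to_user() output
    'gravatar_link': 'https://example.com/avatar/admin?s=14',  # invented
}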
51 def get_default_reviewers_data(
52 current_user, source_repo, source_commit, target_repo, target_commit):
57 def to_reviewers(e):
58 if isinstance(e, (tuple, list)):
59 return map(reviewer_as_json, e)
60 else:
61 return reviewer_as_json(e)
62
63
64 def get_default_reviewers_data(current_user, source_repo, source_ref, target_repo, target_ref,
65 include_diff_info=True):
53 66 """
54 67 Return json for default reviewers of a repository
55 68 """
56 69
70 diff_info = {}
71 if include_diff_info:
57 72 diff_info = get_diff_info(
58 source_repo, source_commit.raw_id, target_repo, target_commit.raw_id)
73 source_repo, source_ref.commit_id, target_repo, target_ref.commit_id)
59 74
60 75 reasons = ['Default reviewer', 'Repository owner']
61 76 json_reviewers = [reviewer_as_json(
62 user=target_repo.user, reasons=reasons, mandatory=False, rules=None)]
77 user=target_repo.user, reasons=reasons, mandatory=False, rules=None, role=None)]
78
79 compute_key = rc_cache.utils.compute_key_from_params(
80 current_user.user_id, source_repo.repo_id, source_ref.type, source_ref.name,
81 source_ref.commit_id, target_repo.repo_id, target_ref.type, target_ref.name,
82 target_ref.commit_id)
63 83
64 84 return {
65 85 'api_ver': REVIEWER_API_VERSION, # define version for later possible schema upgrade
86 'compute_key': compute_key,
66 87 'diff_info': diff_info,
67 88 'reviewers': json_reviewers,
68 89 'rules': {},
69 90 'rules_data': {},
70 91 }
71 92
72 93
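For reference, the dictionary built above has the following shape. The compute_key value is an invented placeholder for whatever rc_cache.utils.compute_key_from_params returns, and the reviewer entry is trimmed to a few fields:

example_default_reviewers_data = {
    'api_ver': 'V4',            # REVIEWER_API_VERSION
    'compute_key': '1f2e3d4c',  # placeholder; real value derived from user/repo/ref params
    'diff_info': {},            # stays empty when include_diff_info is False
    'reviewers': [{'user_id': 2, 'role': 'reviewer', 'mandatory': False}],
    'rules': {},
    'rules_data': {},
}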
73 94 def validate_default_reviewers(review_members, reviewer_rules):
74 95 """
75 96 Function to validate submitted reviewers against the saved rules
76
77 97 """
78 98 reviewers = []
79 99 reviewer_by_id = {}
80 100 for r in review_members:
81 101 reviewer_user_id = safe_int(r['user_id'])
82 entry = (reviewer_user_id, r['reasons'], r['mandatory'], r['rules'])
102 entry = (reviewer_user_id, r['reasons'], r['mandatory'], r['role'], r['rules'])
83 103
84 104 reviewer_by_id[reviewer_user_id] = entry
85 105 reviewers.append(entry)
86 106
87 107 return reviewers
108
109
110 def validate_observers(observer_members, reviewer_rules):
111 return {}
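A hand-written example of the input and output of validate_default_reviewers(), matching the tuple order used above (user_id, reasons, mandatory, role, rules); all values are made up:

submitted_member = {
    'user_id': '7',
    'reasons': ['Added manually'],
    'mandatory': True,
    'role': 'reviewer',
    'rules': [],
}
# one submitted member becomes one entry tuple:
expected_entry = (7, ['Added manually'], True, 'reviewer', [])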