.. _repo-methods-ref:

repo methods
============

add_field_to_repo
-----------------

.. py:function:: add_field_to_repo(apiuser, repoid, key, label=<Optional:''>, description=<Optional:''>)

    Adds an extra field to a repository.

    This command can only be run using an |authtoken| with at least
    write permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository id.
    :type repoid: str or int
    :param key: Create a unique field key for this repository.
    :type key: str
    :param label: Set an optional label for the field.
    :type label: Optional(str)
    :param description: Set an optional description for the field.
    :type description: Optional(str)


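The methods in this reference are called over the JSON-RPC style API, where each request body carries an ``id``, an ``auth_token``, the ``method`` name, and its ``args``. A minimal sketch of building such a request for ``add_field_to_repo`` (the endpoint URL and token are placeholders, and the ``ticket_prefix`` key is purely illustrative):

```python
import json

API_URL = "https://rhodecode.example.com/_admin/api"  # hypothetical endpoint
AUTH_TOKEN = "<auth_token>"  # placeholder; use a real |authtoken|


def make_rpc_payload(method, **args):
    """Build the JSON-RPC style request body the API expects."""
    return {
        "id": 1,
        "auth_token": AUTH_TOKEN,
        "method": method,
        "args": args,
    }


# Example call: add an extra field to a nested repository.
payload = make_rpc_payload(
    "add_field_to_repo",
    repoid="foo/bar/repo1",
    key="ticket_prefix",       # illustrative field key
    label="Ticket prefix",
)
body = json.dumps(payload)  # POST this body to API_URL
```

The same ``make_rpc_payload`` helper shape applies to every method below; only the ``method`` name and ``args`` change.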
comment_commit
--------------

.. py:function:: comment_commit(apiuser, repoid, commit_id, message, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>)

    Set a commit comment, and optionally change the status of the commit.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Specify the commit_id for which to set a comment.
    :type commit_id: str
    :param message: The comment text.
    :type message: str
    :param status: (**Optional**) status of commit, one of: 'not_reviewed',
        'approved', 'rejected', 'under_review'
    :type status: str
    :param comment_type: Comment type, one of: 'note', 'todo'
    :type comment_type: Optional(str), default: 'note'
    :param userid: Set the user name of the comment creator.
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        {
          "id" : <id_given_in_input>,
          "result" : {
            "msg": "Commented on commit `<commit_id>` for repository `<repoid>`",
            "status_change": null or <status>,
            "success": true
          },
          "error" : null
        }


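Every response in the examples below follows the same envelope: an ``id`` echoing the request, a ``result`` on success, and an ``error`` member that is ``null`` unless the call failed. A small sketch of handling that envelope client-side (the sample body below is illustrative, modeled on the example outputs in this document):

```python
import json


def parse_api_response(body):
    """Decode a JSON-RPC style response; raise if the `error` member is set."""
    resp = json.loads(body)
    if resp.get("error") is not None:
        raise RuntimeError("API call failed: %r" % (resp["error"],))
    return resp["result"]


# Illustrative success response, shaped like the examples in this reference:
body = (
    '{"id": 1, '
    '"result": {"msg": "Commented on commit `abc123` for repository `repo1`", '
    '"success": true}, '
    '"error": null}'
)
result = parse_api_response(body)
```

On an error response (``result`` null, ``error`` set), the helper raises instead of returning a partial result, which keeps callers from silently ignoring failures.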
create_repo
-----------

.. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>)

    Creates a repository.

    * If the repository name contains "/", the repository will be created
      inside a repository group or nested repository groups.

      For example "foo/bar/repo1" will create a |repo| called "repo1" inside
      group "foo/bar". You have to have permissions to access and write to
      the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least
    permissions to create repositories, or write permissions to
    parent repository groups.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repo_name: Set the repository name.
    :type repo_name: str
    :param repo_type: Set the repository type; 'hg', 'git', or 'svn'.
    :type repo_type: str
    :param owner: user_id or username
    :type owner: Optional(str)
    :param description: Set the repository description.
    :type description: Optional(str)
    :param private: Set the repository as private.
    :type private: bool
    :param clone_uri: Set the clone URI.
    :type clone_uri: str
    :param landing_rev: <rev_type>:<rev>
    :type landing_rev: str
    :param enable_locking: Enable repository locking.
    :type enable_locking: bool
    :param enable_downloads: Enable repository downloads.
    :type enable_downloads: bool
    :param enable_statistics: Enable repository statistics.
    :type enable_statistics: bool
    :param copy_permissions: Copy permission from group in which the
        repository is being created.
    :type copy_permissions: bool


    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg": "Created new repository `<reponame>`",
          "success": true,
          "task": "<celery task id or None if done sync>"
        }
        error: null


    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          'failed to create repository `<repo_name>`'
        }


delete_repo
-----------

.. py:function:: delete_repo(apiuser, repoid, forks=<Optional:''>)

    Deletes a repository.

    * When the `forks` parameter is set it's possible to detach or delete
      forks of the deleted repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param forks: Set to `detach` or `delete` forks from the |repo|.
    :type forks: Optional(str)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg": "Deleted repository `<reponame>`",
          "success": true
        }
        error: null


fork_repo
---------

.. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, copy_permissions=<Optional:False>)

    Creates a fork of the specified |repo|.

    * If the fork_name contains "/", the fork will be created inside
      a repository group or nested repository groups.

      For example "foo/bar/fork-repo" will create a fork called "fork-repo"
      inside group "foo/bar". You have to have permissions to access and
      write to the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least
    read permissions on the forked repository, and create-fork
    permissions for the user.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set repository name or repository ID.
    :type repoid: str or int
    :param fork_name: Set the fork name, including its repository group membership.
    :type fork_name: str
    :param owner: Set the fork owner.
    :type owner: str
    :param description: Set the fork description.
    :type description: str
    :param copy_permissions: Copy permissions from parent |repo|. The
        default is False.
    :type copy_permissions: bool
    :param private: Make the fork private. The default is False.
    :type private: bool
    :param landing_rev: Set the landing revision. The default is tip.

    Example output:

    .. code-block:: bash

        id : <id_for_response>
        api_key : "<api_key>"
        args: {
          "repoid" : "<reponame or repo_id>",
          "fork_name": "<forkname>",
          "owner": "<username or user_id = Optional(=apiuser)>",
          "description": "<description>",
          "copy_permissions": "<bool>",
          "private": "<bool>",
          "landing_rev": "<landing_rev>"
        }

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg": "Created fork of `<reponame>` as `<forkname>`",
          "success": true,
          "task": "<celery task id or None if done sync>"
        }
        error: null


get_repo
--------

.. py:function:: get_repo(apiuser, repoid, cache=<Optional:True>)

    Gets an existing repository by its name or repository_id.

    The members section of the output returns the user groups and users
    associated with that repository.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id.
    :type repoid: str or int
    :param cache: Use the cached value for the last changeset.
    :type cache: Optional(bool)

    Example output:

    .. code-block:: bash

        {
          "error": null,
          "id": <repo_id>,
          "result": {
            "clone_uri": null,
            "created_on": "timestamp",
            "description": "repo description",
            "enable_downloads": false,
            "enable_locking": false,
            "enable_statistics": false,
            "followers": [
              {
                "active": true,
                "admin": false,
                "api_key": "****************************************",
                "api_keys": [
                  "****************************************"
                ],
                "email": "user@example.com",
                "emails": [
                  "user@example.com"
                ],
                "extern_name": "rhodecode",
                "extern_type": "rhodecode",
                "firstname": "username",
                "ip_addresses": [],
                "language": null,
                "last_login": "2015-09-16T17:16:35.854",
                "lastname": "surname",
                "user_id": <user_id>,
                "username": "name"
              }
            ],
            "fork_of": "parent-repo",
            "landing_rev": [
              "rev",
              "tip"
            ],
            "last_changeset": {
              "author": "User <user@example.com>",
              "branch": "default",
              "date": "timestamp",
              "message": "last commit message",
              "parents": [
                {
                  "raw_id": "commit-id"
                }
              ],
              "raw_id": "commit-id",
              "revision": <revision number>,
              "short_id": "short id"
            },
            "lock_reason": null,
            "locked_by": null,
            "locked_date": null,
            "members": [
              {
                "name": "super-admin-name",
                "origin": "super-admin",
                "permission": "repository.admin",
                "type": "user"
              },
              {
                "name": "owner-name",
                "origin": "owner",
                "permission": "repository.admin",
                "type": "user"
              },
              {
                "name": "user-group-name",
                "origin": "permission",
                "permission": "repository.write",
                "type": "user_group"
              }
            ],
            "owner": "owner-name",
            "permissions": [
              {
                "name": "super-admin-name",
                "origin": "super-admin",
                "permission": "repository.admin",
                "type": "user"
              },
              {
                "name": "owner-name",
                "origin": "owner",
                "permission": "repository.admin",
                "type": "user"
              },
              {
                "name": "user-group-name",
                "origin": "permission",
                "permission": "repository.write",
                "type": "user_group"
              }
            ],
            "private": true,
            "repo_id": 676,
            "repo_name": "user-group/repo-name",
            "repo_type": "hg"
          }
        }


get_repo_changeset
------------------

.. py:function:: get_repo_changeset(apiuser, repoid, revision, details=<Optional:'basic'>)

    Returns information about a changeset.

    Additional parameters define the amount of detail returned by
    this function.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id.
    :type repoid: str or int
    :param revision: The revision for which listing should be done.
    :type revision: str
    :param details: One of 'basic', 'extended' or 'full'. The 'full'
        level includes diff details such as the diff itself and the
        number of changed files.
    :type details: Optional(str)


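A minimal sketch of building a ``get_repo_changeset`` request that validates the ``details`` level before sending (the token is a placeholder, and the repository name is illustrative):

```python
# Valid detail levels, per the parameter description above.
VALID_DETAILS = ("basic", "extended", "full")


def changeset_request(repoid, revision, details="basic", auth_token="<token>"):
    """Build a request body for get_repo_changeset, rejecting bad `details`."""
    if details not in VALID_DETAILS:
        raise ValueError("details must be one of %s" % (VALID_DETAILS,))
    return {
        "id": 1,
        "auth_token": auth_token,
        "method": "get_repo_changeset",
        "args": {"repoid": repoid, "revision": revision, "details": details},
    }


# Request full details (diff included) for the tip of an illustrative repo:
req = changeset_request("repo1", "tip", details="full")
```

Validating client-side avoids a round trip when a caller passes an unsupported detail level.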
get_repo_changesets
-------------------

.. py:function:: get_repo_changesets(apiuser, repoid, start_rev, limit, details=<Optional:'basic'>)

    Returns a set of commits limited by the number starting
    from the `start_rev` option.

    Additional parameters define the amount of detail returned by this
    function.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param start_rev: The starting revision from where to get changesets.
    :type start_rev: str
    :param limit: Limit the number of commits to this amount.
    :type limit: str or int
    :param details: Set the level of detail returned. Valid options are:
        ``basic``, ``extended`` and ``full``.
    :type details: Optional(str)

    .. note::

       Setting the parameter `details` to the value ``full`` is expensive
       and returns details such as the diff itself and the number
       of changed files.


get_repo_nodes
--------------

.. py:function:: get_repo_nodes(apiuser, repoid, revision, root_path, ret_type=<Optional:'all'>, details=<Optional:'basic'>, max_file_bytes=<Optional:None>)

    Returns a list of nodes and children in a flat list for a given
    path at given revision.

    It's possible to specify ret_type to show only `files` or `dirs`.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param revision: The revision for which listing should be done.
    :type revision: str
    :param root_path: The path from which to start displaying.
    :type root_path: str
    :param ret_type: Set the return type. Valid options are
        ``all`` (default), ``files`` and ``dirs``.
    :type ret_type: Optional(str)
    :param details: Returns extended information about nodes, such as
        md5, binary, and/or content. The valid options are ``basic`` and
        ``full``.
    :type details: Optional(str)
    :param max_file_bytes: Only return file content for files smaller
        than this size, in bytes.
    :type max_file_bytes: Optional(int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: [
          {
            "name" : "<name>"
            "type" : "<type>",
            "binary": "<true|false>" (only in extended mode)
            "md5"  : "<md5 of file content>" (only in extended mode)
          },
          ...
        ]
        error: null


get_repo_refs
-------------

.. py:function:: get_repo_refs(apiuser, repoid)

    Returns a dictionary of current references. It returns
    bookmarks, branches, closed_branches, and tags for the given repository.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        "result": {
          "bookmarks": {
            "dev": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
            "master": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
          },
          "branches": {
            "default": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
            "stable": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
          },
          "branches_closed": {},
          "tags": {
            "tip": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
            "v4.4.0": "1232313f9e6adac5ce5399c2a891dc1e72b79022",
            "v4.4.1": "cbb9f1d329ae5768379cdec55a62ebdd546c4e27",
            "v4.4.2": "24ffe44a27fcd1c5b6936144e176b9f6dd2f3a17"
          }
        }
        error: null


get_repo_settings
-----------------

.. py:function:: get_repo_settings(apiuser, repoid, key=<Optional:None>)

    Returns all settings for a repository. If key is given it only returns the
    setting identified by the key or null.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id.
    :type repoid: str or int
    :param key: Key of the setting to return.
    :type key: Optional(str)

    Example output:

    .. code-block:: bash

        {
          "error": null,
          "id": 237,
          "result": {
            "extensions_largefiles": true,
            "extensions_evolve": true,
            "hooks_changegroup_push_logger": true,
            "hooks_changegroup_repo_size": false,
            "hooks_outgoing_pull_logger": true,
            "phases_publish": "True",
            "rhodecode_hg_use_rebase_for_merging": true,
            "rhodecode_hg_close_branch_before_merging": false,
            "rhodecode_pr_merge_enabled": true,
            "rhodecode_use_outdated_comments": true
          }
        }


get_repos
---------

.. py:function:: get_repos(apiuser, root=<Optional:None>, traverse=<Optional:True>)

    Lists all existing repositories.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param root: Specify the root repository group to fetch repositories.
        This filters the returned repositories to members of the given
        root group.
    :type root: Optional(str)
    :param traverse: Traverse the given root into subrepositories. With this
        flag set to False, only top-level repositories from `root` are
        returned. If root is empty, just top-level repositories are returned.
    :type traverse: Optional(bool)


    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: [
          {
            "repo_id" : "<repo_id>",
            "repo_name" : "<reponame>"
            "repo_type" : "<repo_type>",
            "clone_uri" : "<clone_uri>",
            "private" : "<bool>",
            "created_on" : "<datetimecreated>",
            "description" : "<description>",
            "landing_rev": "<landing_rev>",
            "owner": "<repo_owner>",
            "fork_of": "<name_of_fork_parent>",
            "enable_downloads": "<bool>",
            "enable_locking": "<bool>",
            "enable_statistics": "<bool>"
          },
          ...
        ]
        error: null


grant_user_group_permission
---------------------------

.. py:function:: grant_user_group_permission(apiuser, repoid, usergroupid, perm)

    Grant permission for a user group on the specified repository,
    or update existing permissions.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param usergroupid: Specify the ID of the user group.
    :type usergroupid: str or int
    :param perm: Set the user group permissions using the following
        format: (repository.(none|read|write|admin))
    :type perm: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
          "msg" : "Granted perm: `<perm>` for group: `<usersgroupname>` in repo: `<reponame>`",
          "success": true
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          "failed to edit permission for user group: `<usergroup>` in repo `<repo>`"
        }


grant_user_permission
---------------------

.. py:function:: grant_user_permission(apiuser, repoid, userid, perm)

    Grant permissions for the specified user on the given repository,
    or update existing permissions if found.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param userid: Set the user name.
    :type userid: str
    :param perm: Set the user permissions, using the following format
        ``(repository.(none|read|write|admin))``
    :type perm: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg" : "Granted perm: `<perm>` for user: `<username>` in repo: `<reponame>`",
          "success": true
        }
        error: null


invalidate_cache
----------------

.. py:function:: invalidate_cache(apiuser, repoid, delete_keys=<Optional:False>)

    Invalidates the cache for the specified repository.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param delete_keys: This deletes the invalidated keys instead of
        just flagging them.
    :type delete_keys: Optional(``True`` | ``False``)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
          'msg': 'Cache for repository `<repository name>` was invalidated',
          'repository': '<repository name>'
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          'Error occurred during cache invalidation action'
        }


lock
----

.. py:function:: lock(apiuser, repoid, locked=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>)

    Sets the lock state of the specified |repo| by the given user.
    For more information, see :ref:`repo-locking`.

    * If the ``userid`` option is not set, the repository is locked to the
      user who called the method.
    * If the ``locked`` parameter is not set, the current lock state of the
      repository is displayed.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param locked: Sets the lock state.
    :type locked: Optional(``True`` | ``False``)
    :param userid: Set the repository lock to this user.
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
          'repo': '<reponame>',
          'locked': <bool: lock state>,
          'locked_since': <int: lock timestamp>,
          'locked_by': <username of person who made the lock>,
          'lock_reason': <str: reason for locking>,
          'lock_state_changed': <bool: True if lock state has been changed in this request>,
          'msg': 'Repo `<reponame>` locked by `<username>` on <timestamp>.'
          or
          'msg': 'Repo `<repository name>` not locked.'
          or
          'msg': 'User `<user name>` set lock state for repo `<repository name>` to `<new lock state>`'
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          'Error occurred locking repository `<reponame>`'
        }


maintenance
-----------

.. py:function:: maintenance(apiuser, repoid)

    Triggers maintenance on the given repository.

    This command can only be run using an |authtoken| with admin
    rights to the specified repository. For more information,
    see :ref:`config-token-ref`.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
          "msg": "executed maintenance command",
          "executed_actions": [
            <action_message>, <action_message2>...
          ],
          "repository": "<repository name>"
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          "Unable to execute maintenance on `<reponame>`"
        }


pull
----

.. py:function:: pull(apiuser, repoid)

    Triggers a pull on the given repository from a remote location. You
    can use this to keep remote repositories up-to-date.

    This command can only be run using an |authtoken| with admin
    rights to the specified repository. For more information,
    see :ref:`config-token-ref`.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
          "msg": "Pulled from `<repository name>`",
          "repository": "<repository name>"
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          "Unable to pull changes from `<reponame>`"
        }


remove_field_from_repo
----------------------

.. py:function:: remove_field_from_repo(apiuser, repoid, key)

    Removes an extra field from a repository.

    This command can only be run using an |authtoken| with at least
    write permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param key: Set the unique field key for this repository.
    :type key: str


revoke_user_group_permission
----------------------------

.. py:function:: revoke_user_group_permission(apiuser, repoid, usergroupid)

    Revoke the permissions of a user group on a given repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param usergroupid: Specify the user group ID.
    :type usergroupid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg" : "Revoked perm for group: `<usersgroupname>` in repo: `<reponame>`",
          "success": true
        }
        error: null


revoke_user_permission
----------------------

.. py:function:: revoke_user_permission(apiuser, repoid, userid)

    Revoke permission for a user on the specified repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param userid: Set the user name of the revoked user.
    :type userid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg" : "Revoked perm for user: `<username>` in repo: `<reponame>`",
          "success": true
        }
        error: null


set_repo_settings
-----------------

.. py:function:: set_repo_settings(apiuser, repoid, settings)

    Update repository settings. Returns true on success.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id.
    :type repoid: str or int
    :param settings: The new settings for the repository.
    :type settings: dict

    Example output:

    .. code-block:: bash

        {
          "error": null,
          "id": 237,
          "result": true
        }


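Since ``set_repo_settings`` takes a whole settings mapping, a typical pattern is read-modify-write: fetch the current settings with ``get_repo_settings``, flip the key you care about, and push the mapping back. A minimal sketch, with the HTTP transport left out and the starting settings hard-coded to a subset of the ``get_repo_settings`` example above:

```python
AUTH_TOKEN = "<auth_token>"  # placeholder


def rpc(method, **args):
    """Build a request body; actually sending it is outside this sketch."""
    return {"id": 1, "auth_token": AUTH_TOKEN, "method": method, "args": args}


# Settings as they might come back from get_repo_settings (illustrative subset):
settings = {
    "phases_publish": "True",
    "rhodecode_hg_use_rebase_for_merging": True,
    "rhodecode_pr_merge_enabled": True,
}

# Flip one flag and push the whole mapping back in one call:
settings["rhodecode_pr_merge_enabled"] = False
update = rpc("set_repo_settings", repoid="repo1", settings=settings)
```

Sending the complete mapping (rather than a single key) matches the method signature above, so unchanged settings travel along with the modified one.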
strip
-----

.. py:function:: strip(apiuser, repoid, revision, branch)

    Strips the given revision from the specified repository.

    * This will remove the revision and all of its descendants.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param revision: The revision you wish to strip.
    :type revision: str
    :param branch: The branch from which to strip the revision.
    :type branch: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
          "msg": "Stripped commit <commit_hash> from repo `<repository name>`",
          "repository": "<repository name>"
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
          "Unable to strip commit <commit_hash> from repo `<repository name>`"
        }


996 997 update_repo
997 998 -----------
998 999
999 1000 .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>)
1000 1001
1001 1002 Updates a repository with the given information.
1002 1003
1003 1004 This command can only be run using an |authtoken| with at least
1004 1005 admin permissions to the |repo|.
1005 1006
1006 1007 * If the repository name contains "/", the repository will be updated
1007 1008 and placed inside the matching repository group or nested repository groups
1008 1009
1009 1010 For example, repoid=repo-test name="foo/bar/repo-test" will update the |repo|
1010 1011 called "repo-test" and place it inside the group "foo/bar".
1011 1012 You must have write access to the last repository
1012 1013 group ("bar" in this example).
1013 1014
1014 1015 :param apiuser: This is filled automatically from the |authtoken|.
1015 1016 :type apiuser: AuthUser
1016 1017 :param repoid: repository name or repository ID.
1017 1018 :type repoid: str or int
1018 1019 :param repo_name: Update the |repo| name, including the
1019 1020 repository group it's in.
1020 1021 :type repo_name: str
1021 1022 :param owner: Set the |repo| owner.
1022 1023 :type owner: str
1023 1024 :param fork_of: Set the |repo| as fork of another |repo|.
1024 1025 :type fork_of: str
1025 1026 :param description: Update the |repo| description.
1026 1027 :type description: str
1027 1028 :param private: Set the |repo| as private. (True | False)
1028 1029 :type private: bool
1029 1030 :param clone_uri: Update the |repo| clone URI.
1030 1031 :type clone_uri: str
1031 1032 :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``.
1032 1033 :type landing_rev: str
1033 1034 :param enable_statistics: Enable statistics on the |repo|, (True | False).
1034 1035 :type enable_statistics: bool
1035 1036 :param enable_locking: Enable |repo| locking.
1036 1037 :type enable_locking: bool
1037 1038 :param enable_downloads: Enable downloads from the |repo|, (True | False).
1038 1039 :type enable_downloads: bool
1039 1040 :param fields: Add extra fields to the |repo|. Use the following
1040 1041 example format: ``field_key=field_val,field_key2=fieldval2``.
1041 1042 Escape ',' inside values with \,.
1042 1043 :type fields: str
1043 1044
1044 1045
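The ``fields`` value above is a flat string. A helper like the following (hypothetical, not part of this API) can build it, under the assumption that ``\,`` escapes literal commas inside values as described for the parameter:

```python
def encode_fields(field_map):
    # Join key=value pairs with ',', escaping literal ',' in values
    # as '\,' per the format described for the ``fields`` parameter.
    # This helper is an illustration, not part of the RhodeCode API.
    parts = []
    for key in sorted(field_map):
        value = str(field_map[key]).replace(',', r'\,')
        parts.append('%s=%s' % (key, value))
    return ','.join(parts)

encoded = encode_fields({'field_key': 'field_val', 'field_key2': 'fieldval2'})
```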
@@ -1,1588 +1,1590 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2014-2017 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 Base module for all VCS systems
23 23 """
24 24
25 25 import collections
26 26 import datetime
27 27 import itertools
28 28 import logging
29 29 import os
30 30 import time
31 31 import warnings
32 32
33 33 from zope.cachedescriptors.property import Lazy as LazyProperty
34 34
35 35 from rhodecode.lib.utils2 import safe_str, safe_unicode
36 36 from rhodecode.lib.vcs import connection
37 37 from rhodecode.lib.vcs.utils import author_name, author_email
38 38 from rhodecode.lib.vcs.conf import settings
39 39 from rhodecode.lib.vcs.exceptions import (
40 40 CommitError, EmptyRepositoryError, NodeAlreadyAddedError,
41 41 NodeAlreadyChangedError, NodeAlreadyExistsError, NodeAlreadyRemovedError,
42 42 NodeDoesNotExistError, NodeNotChangedError, VCSError,
43 43 ImproperArchiveTypeError, BranchDoesNotExistError, CommitDoesNotExistError,
44 44 RepositoryError)
45 45
46 46
47 47 log = logging.getLogger(__name__)
48 48
49 49
50 50 FILEMODE_DEFAULT = 0100644
51 51 FILEMODE_EXECUTABLE = 0100755
52 52
53 53 Reference = collections.namedtuple('Reference', ('type', 'name', 'commit_id'))
54 54 MergeResponse = collections.namedtuple(
55 55 'MergeResponse',
56 56 ('possible', 'executed', 'merge_ref', 'failure_reason'))
57 57
58 58
59 59 class MergeFailureReason(object):
60 60 """
61 61 Enumeration with all the reasons why the server side merge could fail.
62 62
63 63 DO NOT change the number of the reasons, as they may be stored in the
64 64 database.
65 65
66 66 Changing the name of a reason is acceptable and encouraged to deprecate old
67 67 reasons.
68 68 """
69 69
70 70 # Everything went well.
71 71 NONE = 0
72 72
73 73 # An unexpected exception was raised. Check the logs for more details.
74 74 UNKNOWN = 1
75 75
76 76 # The merge was not successful, there are conflicts.
77 77 MERGE_FAILED = 2
78 78
79 79 # The merge succeeded but we could not push it to the target repository.
80 80 PUSH_FAILED = 3
81 81
82 82 # The specified target is not a head in the target repository.
83 83 TARGET_IS_NOT_HEAD = 4
84 84
85 85 # The source repository contains more branches than the target. Pushing
86 86 # the merge will create additional branches in the target.
87 87 HG_SOURCE_HAS_MORE_BRANCHES = 5
88 88
89 89 # The target reference has multiple heads, so the target location cannot
90 90 # be identified unambiguously. This could only happen for mercurial
91 91 # branches.
92 92 HG_TARGET_HAS_MULTIPLE_HEADS = 6
93 93
94 94 # The target repository is locked
95 95 TARGET_IS_LOCKED = 7
96 96
97 97 # Deprecated, use MISSING_TARGET_REF or MISSING_SOURCE_REF instead.
98 98 # An involved commit could not be found.
99 99 _DEPRECATED_MISSING_COMMIT = 8
100 100
101 101 # The target repo reference is missing.
102 102 MISSING_TARGET_REF = 9
103 103
104 104 # The source repo reference is missing.
105 105 MISSING_SOURCE_REF = 10
106 106
107 107 # The merge was not successful, there are conflicts related to sub
108 108 # repositories.
109 109 SUBREPO_MERGE_FAILED = 11
110 110
111 111
112 112 class UpdateFailureReason(object):
113 113 """
114 114 Enumeration with all the reasons why the pull request update could fail.
115 115
116 116 DO NOT change the number of the reasons, as they may be stored in the
117 117 database.
118 118
119 119 Changing the name of a reason is acceptable and encouraged to deprecate old
120 120 reasons.
121 121 """
122 122
123 123 # Everything went well.
124 124 NONE = 0
125 125
126 126 # An unexpected exception was raised. Check the logs for more details.
127 127 UNKNOWN = 1
128 128
129 129 # The pull request is up to date.
130 130 NO_CHANGE = 2
131 131
132 132 # The pull request has a reference type that is not supported for update.
133 133 WRONG_REF_TYPE = 3
134 134
135 135 # Update failed because the target reference is missing.
136 136 MISSING_TARGET_REF = 4
137 137
138 138 # Update failed because the source reference is missing.
139 139 MISSING_SOURCE_REF = 5
140 140
141 141
142 142 class BaseRepository(object):
143 143 """
144 144 Base Repository for final backends
145 145
146 146 .. attribute:: DEFAULT_BRANCH_NAME
147 147
148 148 name of the default branch (e.g. "trunk" for svn, "master" for git, etc.)
149 149
150 150 .. attribute:: commit_ids
151 151
152 152 list of all available commit ids, in ascending order
153 153
154 154 .. attribute:: path
155 155
156 156 absolute path to the repository
157 157
158 158 .. attribute:: bookmarks
159 159
160 160 Mapping from name to :term:`Commit ID` of the bookmark. Empty in case
161 161 there are no bookmarks or the backend implementation does not support
162 162 bookmarks.
163 163
164 164 .. attribute:: tags
165 165
166 166 Mapping from name to :term:`Commit ID` of the tag.
167 167
168 168 """
169 169
170 170 DEFAULT_BRANCH_NAME = None
171 171 DEFAULT_CONTACT = u"Unknown"
172 172 DEFAULT_DESCRIPTION = u"unknown"
173 173 EMPTY_COMMIT_ID = '0' * 40
174 174
175 175 path = None
176 176
177 177 def __init__(self, repo_path, config=None, create=False, **kwargs):
178 178 """
179 179 Initializes the repository. Raises RepositoryError if the repository
180 180 could not be found at the given ``repo_path``, or if a directory at
181 181 ``repo_path`` already exists and ``create`` is set to True.
182 182
183 183 :param repo_path: local path of the repository
184 184 :param config: repository configuration
185 185 :param create=False: if set to True, will try to create the repository.
186 186 :param src_url=None: if set, should be a proper url from which the
187 187 repository would be cloned; requires the ``create`` parameter to be
188 188 set to True - raises RepositoryError if src_url is set and create
189 189 evaluates to False
190 190 """
191 191 raise NotImplementedError
192 192
193 193 def __repr__(self):
194 194 return '<%s at %s>' % (self.__class__.__name__, self.path)
195 195
196 196 def __len__(self):
197 197 return self.count()
198 198
199 199 def __eq__(self, other):
200 200 same_instance = isinstance(other, self.__class__)
201 201 return same_instance and other.path == self.path
202 202
203 203 def __ne__(self, other):
204 204 return not self.__eq__(other)
205 205
206 206 @LazyProperty
207 207 def EMPTY_COMMIT(self):
208 208 return EmptyCommit(self.EMPTY_COMMIT_ID)
209 209
210 210 @LazyProperty
211 211 def alias(self):
212 212 for k, v in settings.BACKENDS.items():
213 213 if v.split('.')[-1] == str(self.__class__.__name__):
214 214 return k
215 215
216 216 @LazyProperty
217 217 def name(self):
218 218 return safe_unicode(os.path.basename(self.path))
219 219
220 220 @LazyProperty
221 221 def description(self):
222 222 raise NotImplementedError
223 223
224 224 def refs(self):
225 225 """
226 226 returns a `dict` with branches, bookmarks, tags, and closed_branches
227 227 for this repository
228 228 """
229 229 return dict(
230 230 branches=self.branches,
231 231 branches_closed=self.branches_closed,
232 232 tags=self.tags,
233 233 bookmarks=self.bookmarks
234 234 )
235 235
236 236 @LazyProperty
237 237 def branches(self):
238 238 """
239 239 A `dict` which maps branch names to commit ids.
240 240 """
241 241 raise NotImplementedError
242 242
243 243 @LazyProperty
244 244 def tags(self):
245 245 """
246 246 A `dict` which maps tags names to commit ids.
247 247 """
248 248 raise NotImplementedError
249 249
250 250 @LazyProperty
251 251 def size(self):
252 252 """
253 253 Returns combined size in bytes for all repository files
254 254 """
255 255 tip = self.get_commit()
256 256 return tip.size
257 257
258 258 def size_at_commit(self, commit_id):
259 259 commit = self.get_commit(commit_id)
260 260 return commit.size
261 261
262 262 def is_empty(self):
263 263 return not bool(self.commit_ids)
264 264
265 265 @staticmethod
266 266 def check_url(url, config):
267 267 """
268 268 Check the given url and try to verify that it is a valid
269 269 link.
270 270 """
271 271 raise NotImplementedError
272 272
273 273 @staticmethod
274 274 def is_valid_repository(path):
275 275 """
276 276 Check if given `path` contains a valid repository of this backend
277 277 """
278 278 raise NotImplementedError
279 279
280 280 # ==========================================================================
281 281 # COMMITS
282 282 # ==========================================================================
283 283
284 284 def get_commit(self, commit_id=None, commit_idx=None, pre_load=None):
285 285 """
286 286 Returns instance of `BaseCommit` class. If `commit_id` and `commit_idx`
287 287 are both None, most recent commit is returned.
288 288
289 289 :param pre_load: Optional. List of commit attributes to load.
290 290
291 291 :raises ``EmptyRepositoryError``: if there are no commits
292 292 """
293 293 raise NotImplementedError
294 294
295 295 def __iter__(self):
296 296 for commit_id in self.commit_ids:
297 297 yield self.get_commit(commit_id=commit_id)
298 298
299 299 def get_commits(
300 300 self, start_id=None, end_id=None, start_date=None, end_date=None,
301 301 branch_name=None, pre_load=None):
302 302 """
303 303 Returns an iterator of `BaseCommit` objects from start to end.
304 304 Like a list slice, the range is end-exclusive: the end commit is
305 305 not included.
306 306
307 307 :param start_id: None or str, must be a valid commit id
308 308 :param end_id: None or str, must be a valid commit id
309 309 :param start_date:
310 310 :param end_date:
311 311 :param branch_name:
312 312 :param pre_load:
313 313 """
314 314 raise NotImplementedError
315 315
316 316 def __getitem__(self, key):
317 317 """
318 318 Allows index based access to the commit objects of this repository.
319 319 """
320 320 pre_load = ["author", "branch", "date", "message", "parents"]
321 321 if isinstance(key, slice):
322 322 return self._get_range(key, pre_load)
323 323 return self.get_commit(commit_idx=key, pre_load=pre_load)
324 324
325 325 def _get_range(self, slice_obj, pre_load):
326 326 for commit_id in self.commit_ids.__getitem__(slice_obj):
327 327 yield self.get_commit(commit_id=commit_id, pre_load=pre_load)
328 328
329 329 def count(self):
330 330 return len(self.commit_ids)
331 331
332 332 def tag(self, name, user, commit_id=None, message=None, date=None, **opts):
333 333 """
334 334 Creates and returns a tag for the given ``commit_id``.
335 335
336 336 :param name: name for new tag
337 337 :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
338 338 :param commit_id: commit id for which new tag would be created
339 339 :param message: message of the tag's commit
340 340 :param date: date of tag's commit
341 341
342 342 :raises TagAlreadyExistError: if tag with same name already exists
343 343 """
344 344 raise NotImplementedError
345 345
346 346 def remove_tag(self, name, user, message=None, date=None):
347 347 """
348 348 Removes tag with the given ``name``.
349 349
350 350 :param name: name of the tag to be removed
351 351 :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
352 352 :param message: message of the tag's removal commit
353 353 :param date: date of tag's removal commit
354 354
355 355 :raises TagDoesNotExistError: if tag with given name does not exist
356 356 """
357 357 raise NotImplementedError
358 358
359 359 def get_diff(
360 360 self, commit1, commit2, path=None, ignore_whitespace=False,
361 361 context=3, path1=None):
362 362 """
363 363 Returns (git like) *diff*, as plain text. Shows changes introduced by
364 364 `commit2` since `commit1`.
365 365
366 366 :param commit1: Entry point from which diff is shown. Can be
367 367 ``self.EMPTY_COMMIT`` - in this case, patch showing all
368 368 the changes since empty state of the repository until `commit2`
369 369 :param commit2: Until which commit changes should be shown.
370 370 :param path: Can be set to a path of a file to create a diff of that
371 371 file. If `path1` is also set, this value is only associated to
372 372 `commit2`.
373 373 :param ignore_whitespace: If set to ``True``, would not show whitespace
374 374 changes. Defaults to ``False``.
375 375 :param context: How many lines before/after changed lines should be
376 376 shown. Defaults to ``3``.
377 377 :param path1: Can be set to a path to associate with `commit1`. This
378 378 parameter works only for backends which support diff generation for
379 379 different paths. Other backends will raise a `ValueError` if `path1`
380 380 is set and has a different value than `path`.
381 381 :param file_path: filter this diff by given path pattern
382 382 """
383 383 raise NotImplementedError
384 384
385 385 def strip(self, commit_id, branch=None):
386 386 """
387 387 Strip given commit_id from the repository
388 388 """
389 389 raise NotImplementedError
390 390
391 391 def get_common_ancestor(self, commit_id1, commit_id2, repo2):
392 392 """
393 393 Return the latest common ancestor commit, if one exists, for this
394 394 repo's `commit_id1` vs `commit_id2` from `repo2`.
395 395
396 396 :param commit_id1: Commit id from this repository to use as a
397 397 target for the comparison.
398 398 :param commit_id2: Source commit id to use for comparison.
399 399 :param repo2: Source repository to use for comparison.
400 400 """
401 401 raise NotImplementedError
402 402
403 403 def compare(self, commit_id1, commit_id2, repo2, merge, pre_load=None):
404 404 """
405 405 Compare this repository's revision `commit_id1` with `commit_id2`.
406 406
407 407 Returns a tuple(commits, ancestor) that would be merged from
408 408 `commit_id2`. Doing a normal compare (``merge=False``), ``None``
409 409 will be returned as ancestor.
410 410
411 411 :param commit_id1: Commit id from this repository to use as a
412 412 target for the comparison.
413 413 :param commit_id2: Source commit id to use for comparison.
414 414 :param repo2: Source repository to use for comparison.
415 415 :param merge: If set to ``True`` will do a merge compare which also
416 416 returns the common ancestor.
417 417 :param pre_load: Optional. List of commit attributes to load.
418 418 """
419 419 raise NotImplementedError
420 420
421 421 def merge(self, target_ref, source_repo, source_ref, workspace_id,
422 422 user_name='', user_email='', message='', dry_run=False,
423 use_rebase=False):
423 use_rebase=False, close_branch=False):
424 424 """
425 425 Merge the revisions specified in `source_ref` from `source_repo`
426 426 onto the `target_ref` of this repository.
427 427
428 428 `source_ref` and `target_ref` are named tuples with the following
429 429 fields `type`, `name` and `commit_id`.
430 430
431 431 Returns a MergeResponse named tuple with the following fields:
432 432 'possible', 'executed', 'merge_ref', and
433 433 'failure_reason'.
434 434
435 435 :param target_ref: `target_ref` points to the commit on top of which
436 436 the `source_ref` should be merged.
437 437 :param source_repo: The repository that contains the commits to be
438 438 merged.
439 439 :param source_ref: `source_ref` points to the topmost commit from
440 440 the `source_repo` which should be merged.
441 441 :param workspace_id: `workspace_id` unique identifier.
442 442 :param user_name: Merge commit `user_name`.
443 443 :param user_email: Merge commit `user_email`.
444 444 :param message: Merge commit `message`.
445 445 :param dry_run: If `True` the merge will not take place.
446 446 :param use_rebase: If `True` commits from the source will be rebased
447 447 on top of the target instead of being merged.
448 :param close_branch: If `True`, the branch will be closed before merging.
448 449 """
449 450 if dry_run:
450 451 message = message or 'dry_run_merge_message'
451 452 user_email = user_email or 'dry-run-merge@rhodecode.com'
452 453 user_name = user_name or 'Dry-Run User'
453 454 else:
454 455 if not user_name:
455 456 raise ValueError('user_name cannot be empty')
456 457 if not user_email:
457 458 raise ValueError('user_email cannot be empty')
458 459 if not message:
459 460 raise ValueError('message cannot be empty')
460 461
461 462 shadow_repository_path = self._maybe_prepare_merge_workspace(
462 463 workspace_id, target_ref)
463 464
464 465 try:
465 466 return self._merge_repo(
466 467 shadow_repository_path, target_ref, source_repo,
467 468 source_ref, message, user_name, user_email, dry_run=dry_run,
468 use_rebase=use_rebase)
469 use_rebase=use_rebase, close_branch=close_branch)
469 470 except RepositoryError:
470 471 log.exception(
471 472 'Unexpected failure when running merge, dry-run=%s',
472 473 dry_run)
473 474 return MergeResponse(
474 475 False, False, None, MergeFailureReason.UNKNOWN)
475 476
476 477 def _merge_repo(self, shadow_repository_path, target_ref,
477 478 source_repo, source_ref, merge_message,
478 merger_name, merger_email, dry_run=False, use_rebase=False):
479 merger_name, merger_email, dry_run=False,
480 use_rebase=False, close_branch=False):
479 481 """Internal implementation of merge."""
480 482 raise NotImplementedError
481 483
482 484 def _maybe_prepare_merge_workspace(self, workspace_id, target_ref):
483 485 """
484 486 Create the merge workspace.
485 487
486 488 :param workspace_id: `workspace_id` unique identifier.
487 489 """
488 490 raise NotImplementedError
489 491
490 492 def cleanup_merge_workspace(self, workspace_id):
491 493 """
492 494 Remove merge workspace.
493 495
494 496 This function MUST not fail in case there is no workspace associated to
495 497 the given `workspace_id`.
496 498
497 499 :param workspace_id: `workspace_id` unique identifier.
498 500 """
499 501 raise NotImplementedError
500 502
501 503 # ========== #
502 504 # COMMIT API #
503 505 # ========== #
504 506
505 507 @LazyProperty
506 508 def in_memory_commit(self):
507 509 """
508 510 Returns :class:`InMemoryCommit` object for this repository.
509 511 """
510 512 raise NotImplementedError
511 513
512 514 # ======================== #
513 515 # UTILITIES FOR SUBCLASSES #
514 516 # ======================== #
515 517
516 518 def _validate_diff_commits(self, commit1, commit2):
517 519 """
518 520 Validates that the given commits are related to this repository.
519 521
520 522 Intended as a utility for subclasses to have a consistent validation
521 523 of input parameters in methods like :meth:`get_diff`.
522 524 """
523 525 self._validate_commit(commit1)
524 526 self._validate_commit(commit2)
525 527 if (isinstance(commit1, EmptyCommit) and
526 528 isinstance(commit2, EmptyCommit)):
527 529 raise ValueError("Cannot compare two empty commits")
528 530
529 531 def _validate_commit(self, commit):
530 532 if not isinstance(commit, BaseCommit):
531 533 raise TypeError(
532 534 "%s is not of type BaseCommit" % repr(commit))
533 535 if commit.repository != self and not isinstance(commit, EmptyCommit):
534 536 raise ValueError(
535 537 "Commit %s must be a valid commit from this repository %s; "
536 538 "instead it belongs to repository %s." %
537 539 (commit, self, commit.repository))
538 540
539 541 def _validate_commit_id(self, commit_id):
540 542 if not isinstance(commit_id, basestring):
541 543 raise TypeError("commit_id must be a string value")
542 544
543 545 def _validate_commit_idx(self, commit_idx):
544 546 if not isinstance(commit_idx, (int, long)):
545 547 raise TypeError("commit_idx must be a numeric value")
546 548
547 549 def _validate_branch_name(self, branch_name):
548 550 if branch_name and branch_name not in self.branches_all:
549 551 msg = ("Branch %s not found in %s" % (branch_name, self))
550 552 raise BranchDoesNotExistError(msg)
551 553
552 554 #
553 555 # Supporting deprecated API parts
554 556 # TODO: johbo: consider to move this into a mixin
555 557 #
556 558
557 559 @property
558 560 def EMPTY_CHANGESET(self):
559 561 warnings.warn(
560 562 "Use EMPTY_COMMIT or EMPTY_COMMIT_ID instead", DeprecationWarning)
561 563 return self.EMPTY_COMMIT_ID
562 564
563 565 @property
564 566 def revisions(self):
565 567 warnings.warn("Use commits attribute instead", DeprecationWarning)
566 568 return self.commit_ids
567 569
568 570 @revisions.setter
569 571 def revisions(self, value):
570 572 warnings.warn("Use commits attribute instead", DeprecationWarning)
571 573 self.commit_ids = value
572 574
573 575 def get_changeset(self, revision=None, pre_load=None):
574 576 warnings.warn("Use get_commit instead", DeprecationWarning)
575 577 commit_id = None
576 578 commit_idx = None
577 579 if isinstance(revision, basestring):
578 580 commit_id = revision
579 581 else:
580 582 commit_idx = revision
581 583 return self.get_commit(
582 584 commit_id=commit_id, commit_idx=commit_idx, pre_load=pre_load)
583 585
584 586 def get_changesets(
585 587 self, start=None, end=None, start_date=None, end_date=None,
586 588 branch_name=None, pre_load=None):
587 589 warnings.warn("Use get_commits instead", DeprecationWarning)
588 590 start_id = self._revision_to_commit(start)
589 591 end_id = self._revision_to_commit(end)
590 592 return self.get_commits(
591 593 start_id=start_id, end_id=end_id, start_date=start_date,
592 594 end_date=end_date, branch_name=branch_name, pre_load=pre_load)
593 595
594 596 def _revision_to_commit(self, revision):
595 597 """
596 598 Translates a revision to a commit_id
597 599
598 600 Helps to support the old changeset based API which allows to use
599 601 commit ids and commit indices interchangeable.
600 602 """
601 603 if revision is None:
602 604 return revision
603 605
604 606 if isinstance(revision, basestring):
605 607 commit_id = revision
606 608 else:
607 609 commit_id = self.commit_ids[revision]
608 610 return commit_id
609 611
610 612 @property
611 613 def in_memory_changeset(self):
612 614 warnings.warn("Use in_memory_commit instead", DeprecationWarning)
613 615 return self.in_memory_commit
614 616
615 617
616 618 class BaseCommit(object):
617 619 """
618 620 Each backend should implement its own commit representation.
619 621
620 622 **Attributes**
621 623
622 624 ``repository``
623 625 repository object within which commit exists
624 626
625 627 ``id``
626 628 The commit id; may be ``raw_id`` or, e.g. for mercurial's tip,
627 629 just ``tip``.
628 630
629 631 ``raw_id``
630 632 raw commit representation (i.e. full 40 length sha for git
631 633 backend)
632 634
633 635 ``short_id``
634 636 shortened (if applicable) version of ``raw_id``; a simple
635 637 shortcut for ``raw_id[:12]`` for the git/mercurial backends, or the same
636 638 as ``raw_id`` for subversion
637 639
638 640 ``idx``
639 641 commit index
640 642
641 643 ``files``
642 644 list of ``FileNode`` (``Node`` with NodeKind.FILE) objects
643 645
644 646 ``dirs``
645 647 list of ``DirNode`` (``Node`` with NodeKind.DIR) objects
646 648
647 649 ``nodes``
648 650 combined list of ``Node`` objects
649 651
650 652 ``author``
651 653 author of the commit, as unicode
652 654
653 655 ``message``
654 656 message of the commit, as unicode
655 657
656 658 ``parents``
657 659 list of parent commits
658 660
659 661 """
660 662
661 663 branch = None
662 664 """
663 665 Depending on the backend this should be set to the branch name of the
664 666 commit. Backends not supporting branches on commits should leave this
665 667 value as ``None``.
666 668 """
667 669
668 670 _ARCHIVE_PREFIX_TEMPLATE = b'{repo_name}-{short_id}'
669 671 """
670 672 This template is used to generate a default prefix for repository archives
671 673 if no prefix has been specified.
672 674 """
673 675
674 676 def __str__(self):
675 677 return '<%s at %s:%s>' % (
676 678 self.__class__.__name__, self.idx, self.short_id)
677 679
678 680 def __repr__(self):
679 681 return self.__str__()
680 682
681 683 def __unicode__(self):
682 684 return u'%s:%s' % (self.idx, self.short_id)
683 685
684 686 def __eq__(self, other):
685 687 same_instance = isinstance(other, self.__class__)
686 688 return same_instance and self.raw_id == other.raw_id
687 689
688 690 def __json__(self):
689 691 parents = []
690 692 try:
691 693 for parent in self.parents:
692 694 parents.append({'raw_id': parent.raw_id})
693 695 except NotImplementedError:
694 696 # empty commit doesn't have parents implemented
695 697 pass
696 698
697 699 return {
698 700 'short_id': self.short_id,
699 701 'raw_id': self.raw_id,
700 702 'revision': self.idx,
701 703 'message': self.message,
702 704 'date': self.date,
703 705 'author': self.author,
704 706 'parents': parents,
705 707 'branch': self.branch
706 708 }
707 709
708 710 @LazyProperty
709 711 def last(self):
710 712 """
711 713 ``True`` if this is the last commit in the repository, ``False``
712 714 otherwise; accessing this attribute when there are no
713 715 commits raises `EmptyRepositoryError`
714 716 """
715 717 if self.repository is None:
716 718 raise CommitError("Cannot check if it's the most recent commit")
717 719 return self.raw_id == self.repository.commit_ids[-1]
718 720
719 721 @LazyProperty
720 722 def parents(self):
721 723 """
722 724 Returns list of parent commits.
723 725 """
724 726 raise NotImplementedError
725 727
726 728 @property
727 729 def merge(self):
728 730 """
729 731 Returns ``True`` if the commit is a merge commit.
730 732 """
731 733 return len(self.parents) > 1
732 734
733 735 @LazyProperty
734 736 def children(self):
735 737 """
736 738 Returns list of child commits.
737 739 """
738 740 raise NotImplementedError
739 741
740 742 @LazyProperty
741 743 def id(self):
742 744 """
743 745 Returns string identifying this commit.
744 746 """
745 747 raise NotImplementedError
746 748
747 749 @LazyProperty
748 750 def raw_id(self):
749 751 """
750 752 Returns raw string identifying this commit.
751 753 """
752 754 raise NotImplementedError
753 755
754 756 @LazyProperty
755 757 def short_id(self):
756 758 """
757 759 Returns shortened version of ``raw_id`` attribute, as string,
758 760 identifying this commit, useful for presentation to users.
759 761 """
760 762 raise NotImplementedError
761 763
762 764 @LazyProperty
763 765 def idx(self):
764 766 """
765 767 Returns integer identifying this commit.
766 768 """
767 769 raise NotImplementedError
768 770
769 771 @LazyProperty
770 772 def committer(self):
771 773 """
772 774 Returns committer for this commit
773 775 """
774 776 raise NotImplementedError
775 777
776 778 @LazyProperty
777 779 def committer_name(self):
778 780 """
779 781 Returns committer name for this commit
780 782 """
781 783
782 784 return author_name(self.committer)
783 785
784 786 @LazyProperty
785 787 def committer_email(self):
786 788 """
787 789 Returns committer email address for this commit
788 790 """
789 791
790 792 return author_email(self.committer)
791 793
792 794 @LazyProperty
793 795 def author(self):
794 796 """
795 797 Returns author for this commit
796 798 """
797 799
798 800 raise NotImplementedError
799 801
800 802 @LazyProperty
801 803 def author_name(self):
802 804 """
803 805 Returns author name for this commit
804 806 """
805 807
806 808 return author_name(self.author)
807 809
808 810 @LazyProperty
809 811 def author_email(self):
810 812 """
811 813 Returns author email address for this commit
812 814 """
813 815
814 816 return author_email(self.author)
815 817
816 818 def get_file_mode(self, path):
817 819 """
818 820 Returns stat mode of the file at `path`.
819 821 """
820 822 raise NotImplementedError
821 823
822 824 def is_link(self, path):
823 825 """
824 826 Returns ``True`` if given `path` is a symlink
825 827 """
826 828 raise NotImplementedError
827 829
828 830 def get_file_content(self, path):
829 831 """
830 832 Returns content of the file at the given `path`.
831 833 """
832 834 raise NotImplementedError
833 835
834 836 def get_file_size(self, path):
835 837 """
836 838 Returns size of the file at the given `path`.
837 839 """
838 840 raise NotImplementedError
839 841
840 842 def get_file_commit(self, path, pre_load=None):
841 843 """
842 844 Returns last commit of the file at the given `path`.
843 845
844 846 :param pre_load: Optional. List of commit attributes to load.
845 847 """
846 848 commits = self.get_file_history(path, limit=1, pre_load=pre_load)
847 849 if not commits:
848 850 raise RepositoryError(
849 851 'Failed to fetch history for path {}. '
850 852 'Please check if such path exists in your repository'.format(
851 853 path))
852 854 return commits[0]
853 855
854 856 def get_file_history(self, path, limit=None, pre_load=None):
855 857 """
856 858 Returns history of the file as a reversed list of :class:`BaseCommit`
857 859 objects in which the file at the given `path` has been modified.
858 860
859 861 :param limit: Optional. Allows limiting the size of the returned
860 862 history. This is intended as a hint to the underlying backend, so
861 863 that it can apply optimizations depending on the limit.
862 864 :param pre_load: Optional. List of commit attributes to load.
863 865 """
864 866 raise NotImplementedError
865 867
866 868 def get_file_annotate(self, path, pre_load=None):
867 869 """
868 870 Returns a generator of four-element tuples of
869 871 (lineno, sha, commit lazy loader, line)
870 872
871 873 :param pre_load: Optional. List of commit attributes to load.
872 874 """
873 875 raise NotImplementedError
874 876
875 877 def get_nodes(self, path):
876 878 """
877 879 Returns combined ``DirNode`` and ``FileNode`` objects list representing
878 880 state of commit at the given ``path``.
879 881
880 882 :raises ``CommitError``: if node at the given ``path`` is not
881 883 instance of ``DirNode``
882 884 """
883 885 raise NotImplementedError
884 886
885 887 def get_node(self, path):
886 888 """
887 889 Returns ``Node`` object from the given ``path``.
888 890
889 891 :raises ``NodeDoesNotExistError``: if there is no node at the given
890 892 ``path``
891 893 """
892 894 raise NotImplementedError
893 895
894 896 def get_largefile_node(self, path):
895 897 """
896 898 Returns the path to a largefile from Mercurial/Git-LFS storage,
897 899 or None if it's not a largefile node.
898 900 """
899 901 return None
900 902
901 903 def archive_repo(self, file_path, kind='tgz', subrepos=None,
902 904 prefix=None, write_metadata=False, mtime=None):
903 905 """
904 906 Creates an archive containing the contents of the repository.
905 907
906 908 :param file_path: path of the file in which to create the archive.
907 909 :param kind: one of following: ``"tbz2"``, ``"tgz"``, ``"zip"``.
908 910 :param prefix: name of root directory in archive.
909 911 Default is repository name and commit's short_id joined with dash:
910 912 ``"{repo_name}-{short_id}"``.
911 913 :param write_metadata: write a metadata file into archive.
912 914 :param mtime: custom modification time for archive creation, defaults
913 915 to time.time() if not given.
914 916
915 917 :raise VCSError: If prefix has a problem.
916 918 """
917 919 allowed_kinds = settings.ARCHIVE_SPECS.keys()
918 920 if kind not in allowed_kinds:
919 921 raise ImproperArchiveTypeError(
920 922 'Archive kind (%s) not supported, use one of %s' %
921 923 (kind, allowed_kinds))
922 924
923 925 prefix = self._validate_archive_prefix(prefix)
924 926
925 927 mtime = mtime or time.mktime(self.date.timetuple())
926 928
927 929 file_info = []
928 930 cur_rev = self.repository.get_commit(commit_id=self.raw_id)
929 931 for _r, _d, files in cur_rev.walk('/'):
930 932 for f in files:
931 933 f_path = os.path.join(prefix, f.path)
932 934 file_info.append(
933 935 (f_path, f.mode, f.is_link(), f.raw_bytes))
934 936
935 937 if write_metadata:
936 938 metadata = [
937 939 ('repo_name', self.repository.name),
938 940 ('rev', self.raw_id),
939 941 ('create_time', mtime),
940 942 ('branch', self.branch),
941 943 ('tags', ','.join(self.tags)),
942 944 ]
943 945 meta = ["%s:%s" % (f_name, value) for f_name, value in metadata]
944 946 file_info.append(('.archival.txt', 0644, False, '\n'.join(meta)))
945 947
946 948 connection.Hg.archive_repo(file_path, mtime, file_info, kind)
947 949
948 950 def _validate_archive_prefix(self, prefix):
949 951 if prefix is None:
950 952 prefix = self._ARCHIVE_PREFIX_TEMPLATE.format(
951 953 repo_name=safe_str(self.repository.name),
952 954 short_id=self.short_id)
953 955 elif not isinstance(prefix, str):
954 956 raise ValueError("prefix not a bytes object: %s" % repr(prefix))
955 957 elif prefix.startswith('/'):
956 958 raise VCSError("Prefix cannot start with leading slash")
957 959 elif prefix.strip() == '':
958 960 raise VCSError("Prefix cannot be empty")
959 961 return prefix
960 962
961 963 @LazyProperty
962 964 def root(self):
963 965 """
964 966 Returns ``RootNode`` object for this commit.
965 967 """
966 968 return self.get_node('')
967 969
968 970 def next(self, branch=None):
969 971 """
970 972 Returns next commit from current; if branch is given it will return
971 973 the next commit belonging to this branch
972 974
973 975 :param branch: show commits within the given named branch
974 976 """
975 977 indexes = xrange(self.idx + 1, self.repository.count())
976 978 return self._find_next(indexes, branch)
977 979
978 980 def prev(self, branch=None):
979 981 """
980 982 Returns previous commit from current; if branch is given it will
981 983 return the previous commit belonging to this branch
982 984
983 985 :param branch: show commit within the given named branch
984 986 """
985 987 indexes = xrange(self.idx - 1, -1, -1)
986 988 return self._find_next(indexes, branch)
987 989
988 990 def _find_next(self, indexes, branch=None):
989 991 if branch and self.branch != branch:
990 992 raise VCSError('Branch option used on commit not belonging '
991 993 'to that branch')
992 994
993 995 for next_idx in indexes:
994 996 commit = self.repository.get_commit(commit_idx=next_idx)
995 997 if branch and branch != commit.branch:
996 998 continue
997 999 return commit
998 1000 raise CommitDoesNotExistError
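The ``next``/``prev`` helpers above reduce to scanning commit indices in one direction and returning the first commit that belongs to the requested branch. A minimal standalone sketch of the same logic (a hypothetical ``find_next`` over ``(commit_id, branch)`` pairs, not the actual repository API):

```python
class CommitDoesNotExistError(Exception):
    pass

def find_next(commits, start_idx, direction=1, branch=None):
    # commits: list of (commit_id, branch_name) pairs ordered by index;
    # direction=1 scans forward like next(), -1 scans backward like prev()
    if direction == 1:
        indexes = range(start_idx + 1, len(commits))
    else:
        indexes = range(start_idx - 1, -1, -1)
    for idx in indexes:
        commit_id, commit_branch = commits[idx]
        if branch and branch != commit_branch:
            continue  # skip commits belonging to other branches
        return commit_id
    raise CommitDoesNotExistError()

commits = [('a1', 'default'), ('b2', 'stable'), ('c3', 'default')]
print(find_next(commits, 0, branch='default'))  # -> c3
```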
999 1001
1000 1002 def diff(self, ignore_whitespace=True, context=3):
1001 1003 """
1002 1004 Returns a `Diff` object representing the change made by this commit.
1003 1005 """
1004 1006 parent = (
1005 1007 self.parents[0] if self.parents else self.repository.EMPTY_COMMIT)
1006 1008 diff = self.repository.get_diff(
1007 1009 parent, self,
1008 1010 ignore_whitespace=ignore_whitespace,
1009 1011 context=context)
1010 1012 return diff
1011 1013
1012 1014 @LazyProperty
1013 1015 def added(self):
1014 1016 """
1015 1017 Returns list of added ``FileNode`` objects.
1016 1018 """
1017 1019 raise NotImplementedError
1018 1020
1019 1021 @LazyProperty
1020 1022 def changed(self):
1021 1023 """
1022 1024 Returns list of modified ``FileNode`` objects.
1023 1025 """
1024 1026 raise NotImplementedError
1025 1027
1026 1028 @LazyProperty
1027 1029 def removed(self):
1028 1030 """
1029 1031 Returns list of removed ``FileNode`` objects.
1030 1032 """
1031 1033 raise NotImplementedError
1032 1034
1033 1035 @LazyProperty
1034 1036 def size(self):
1035 1037 """
1036 1038 Returns total number of bytes from contents of all filenodes.
1037 1039 """
1038 1040 return sum((node.size for node in self.get_filenodes_generator()))
1039 1041
1040 1042 def walk(self, topurl=''):
1041 1043 """
1042 1044 Similar to the os.walk method. Instead of a filesystem it walks
1043 1045 through the commit, starting at the given ``topurl``. Returns a
1044 1046 generator of (topnode, dirnodes, filenodes) tuples.
1045 1047 """
1046 1048 topnode = self.get_node(topurl)
1047 1049 if not topnode.is_dir():
1048 1050 return
1049 1051 yield (topnode, topnode.dirs, topnode.files)
1050 1052 for dirnode in topnode.dirs:
1051 1053 for tup in self.walk(dirnode.path):
1052 1054 yield tup
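``walk`` mirrors ``os.walk`` as a recursive generator over commit nodes. The same shape can be sketched over a plain dict-based tree (the names here are illustrative, not part of the backend):

```python
def walk(tree, toppath=''):
    # tree maps a directory path -> (subdir paths, file names);
    # yields (toppath, dirs, files) then recurses into each subdir
    dirs, files = tree[toppath]
    yield (toppath, dirs, files)
    for d in dirs:
        for tup in walk(tree, d):
            yield tup

tree = {
    '': (['docs'], ['setup.py']),
    'docs': ([], ['index.rst']),
}
print(list(walk(tree)))
```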
1053 1055
1054 1056 def get_filenodes_generator(self):
1055 1057 """
1056 1058 Returns generator that yields *all* file nodes.
1057 1059 """
1058 1060 for topnode, dirs, files in self.walk():
1059 1061 for node in files:
1060 1062 yield node
1061 1063
1062 1064 #
1063 1065 # Utilities for sub classes to support consistent behavior
1064 1066 #
1065 1067
1066 1068 def no_node_at_path(self, path):
1067 1069 return NodeDoesNotExistError(
1068 1070 u"There is no file nor directory at the given path: "
1069 1071 u"`%s` at commit %s" % (safe_unicode(path), self.short_id))
1070 1072
1071 1073 def _fix_path(self, path):
1072 1074 """
1073 1075 Paths are stored without a trailing slash, so we need to get rid of
1074 1076 it if needed.
1075 1077 """
1076 1078 return path.rstrip('/')
1077 1079
1078 1080 #
1079 1081 # Deprecated API based on changesets
1080 1082 #
1081 1083
1082 1084 @property
1083 1085 def revision(self):
1084 1086 warnings.warn("Use idx instead", DeprecationWarning)
1085 1087 return self.idx
1086 1088
1087 1089 @revision.setter
1088 1090 def revision(self, value):
1089 1091 warnings.warn("Use idx instead", DeprecationWarning)
1090 1092 self.idx = value
1091 1093
1092 1094 def get_file_changeset(self, path):
1093 1095 warnings.warn("Use get_file_commit instead", DeprecationWarning)
1094 1096 return self.get_file_commit(path)
1095 1097
1096 1098
1097 1099 class BaseChangesetClass(type):
1098 1100
1099 1101 def __instancecheck__(self, instance):
1100 1102 return isinstance(instance, BaseCommit)
1101 1103
1102 1104
1103 1105 class BaseChangeset(BaseCommit):
1104 1106
1105 1107 __metaclass__ = BaseChangesetClass
1106 1108
1107 1109 def __new__(cls, *args, **kwargs):
1108 1110 warnings.warn(
1109 1111 "Use BaseCommit instead of BaseChangeset", DeprecationWarning)
1110 1112 return super(BaseChangeset, cls).__new__(cls, *args, **kwargs)
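The ``BaseChangesetClass`` metaclass above overrides ``__instancecheck__`` so that ``isinstance`` checks against the deprecated name keep passing for instances of the new class. A self-contained sketch of the pattern (Python 3 metaclass syntax and hypothetical class names):

```python
import warnings

class Commit(object):
    pass

class ChangesetMeta(type):
    def __instancecheck__(cls, instance):
        # Route isinstance() checks on the old name to the new class
        return isinstance(instance, Commit)

class Changeset(Commit, metaclass=ChangesetMeta):
    def __new__(cls, *args, **kwargs):
        warnings.warn("Use Commit instead", DeprecationWarning)
        return super(Changeset, cls).__new__(cls)

# old-name isinstance checks keep working for new-class instances
print(isinstance(Commit(), Changeset))  # -> True
```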
1111 1113
1112 1114
1113 1115 class BaseInMemoryCommit(object):
1114 1116 """
1115 1117 Represents differences between repository's state (most recent head) and
1116 1118 changes made *in place*.
1117 1119
1118 1120 **Attributes**
1119 1121
1120 1122 ``repository``
1121 1123 repository object for this in-memory-commit
1122 1124
1123 1125 ``added``
1124 1126 list of ``FileNode`` objects marked as *added*
1125 1127
1126 1128 ``changed``
1127 1129 list of ``FileNode`` objects marked as *changed*
1128 1130
1129 1131 ``removed``
1130 1132 list of ``FileNode`` or ``RemovedFileNode`` objects marked to be
1131 1133 *removed*
1132 1134
1133 1135 ``parents``
1134 1136 list of :class:`BaseCommit` instances representing parents of
1135 1137 in-memory commit. Should always be a 2-element sequence.
1136 1138
1137 1139 """
1138 1140
1139 1141 def __init__(self, repository):
1140 1142 self.repository = repository
1141 1143 self.added = []
1142 1144 self.changed = []
1143 1145 self.removed = []
1144 1146 self.parents = []
1145 1147
1146 1148 def add(self, *filenodes):
1147 1149 """
1148 1150 Marks given ``FileNode`` objects as *to be committed*.
1149 1151
1150 1152 :raises ``NodeAlreadyExistsError``: if node with same path exists at
1151 1153 latest commit
1152 1154 :raises ``NodeAlreadyAddedError``: if node with same path is already
1153 1155 marked as *added*
1154 1156 """
1155 1157 # Check if not already marked as *added* first
1156 1158 for node in filenodes:
1157 1159 if node.path in (n.path for n in self.added):
1158 1160 raise NodeAlreadyAddedError(
1159 1161 "Such FileNode %s is already marked for addition"
1160 1162 % node.path)
1161 1163 for node in filenodes:
1162 1164 self.added.append(node)
1163 1165
1164 1166 def change(self, *filenodes):
1165 1167 """
1166 1168 Marks given ``FileNode`` objects to be *changed* in next commit.
1167 1169
1168 1170 :raises ``EmptyRepositoryError``: if there are no commits yet
1169 1171 :raises ``NodeAlreadyExistsError``: if node with same path is already
1170 1172 marked to be *changed*
1171 1173 :raises ``NodeAlreadyRemovedError``: if node with same path is already
1172 1174 marked to be *removed*
1173 1175 :raises ``NodeDoesNotExistError``: if node doesn't exist in latest
1174 1176 commit
1175 1177 :raises ``NodeNotChangedError``: if node hasn't really been changed
1176 1178 """
1177 1179 for node in filenodes:
1178 1180 if node.path in (n.path for n in self.removed):
1179 1181 raise NodeAlreadyRemovedError(
1180 1182 "Node at %s is already marked as removed" % node.path)
1181 1183 try:
1182 1184 self.repository.get_commit()
1183 1185 except EmptyRepositoryError:
1184 1186 raise EmptyRepositoryError(
1185 1187 "Nothing to change - try to *add* new nodes rather than "
1186 1188 "changing them")
1187 1189 for node in filenodes:
1188 1190 if node.path in (n.path for n in self.changed):
1189 1191 raise NodeAlreadyChangedError(
1190 1192 "Node at '%s' is already marked as changed" % node.path)
1191 1193 self.changed.append(node)
1192 1194
1193 1195 def remove(self, *filenodes):
1194 1196 """
1195 1197 Marks given ``FileNode`` (or ``RemovedFileNode``) objects to be
1196 1198 *removed* in next commit.
1197 1199
1198 1200 :raises ``NodeAlreadyRemovedError``: if node has been already marked to
1199 1201 be *removed*
1200 1202 :raises ``NodeAlreadyChangedError``: if node has been already marked to
1201 1203 be *changed*
1202 1204 """
1203 1205 for node in filenodes:
1204 1206 if node.path in (n.path for n in self.removed):
1205 1207 raise NodeAlreadyRemovedError(
1206 1208 "Node is already marked to for removal at %s" % node.path)
1207 1209 if node.path in (n.path for n in self.changed):
1208 1210 raise NodeAlreadyChangedError(
1209 1211 "Node is already marked to be changed at %s" % node.path)
1210 1212 # We only mark node as *removed* - real removal is done by
1211 1213 # commit method
1212 1214 self.removed.append(node)
1213 1215
1214 1216 def reset(self):
1215 1217 """
1216 1218 Resets this instance to initial state (cleans ``added``, ``changed``
1217 1219 and ``removed`` lists).
1218 1220 """
1219 1221 self.added = []
1220 1222 self.changed = []
1221 1223 self.removed = []
1222 1224 self.parents = []
1223 1225
1224 1226 def get_ipaths(self):
1225 1227 """
1226 1228 Returns generator of paths from nodes marked as added, changed or
1227 1229 removed.
1228 1230 """
1229 1231 for node in itertools.chain(self.added, self.changed, self.removed):
1230 1232 yield node.path
1231 1233
1232 1234 def get_paths(self):
1233 1235 """
1234 1236 Returns list of paths from nodes marked as added, changed or removed.
1235 1237 """
1236 1238 return list(self.get_ipaths())
1237 1239
1238 1240 def check_integrity(self, parents=None):
1239 1241 """
1240 1242 Checks in-memory commit's integrity. Also, sets parents if not
1241 1243 already set.
1242 1244
1243 1245 :raises CommitError: if any error occurs (i.e.
1244 1246 ``NodeDoesNotExistError``).
1245 1247 """
1246 1248 if not self.parents:
1247 1249 parents = parents or []
1248 1250 if len(parents) == 0:
1249 1251 try:
1250 1252 parents = [self.repository.get_commit(), None]
1251 1253 except EmptyRepositoryError:
1252 1254 parents = [None, None]
1253 1255 elif len(parents) == 1:
1254 1256 parents += [None]
1255 1257 self.parents = parents
1256 1258
1257 1259 # Local parents, only if not None
1258 1260 parents = [p for p in self.parents if p]
1259 1261
1260 1262 # Check nodes marked as added
1261 1263 for p in parents:
1262 1264 for node in self.added:
1263 1265 try:
1264 1266 p.get_node(node.path)
1265 1267 except NodeDoesNotExistError:
1266 1268 pass
1267 1269 else:
1268 1270 raise NodeAlreadyExistsError(
1269 1271 "Node `%s` already exists at %s" % (node.path, p))
1270 1272
1271 1273 # Check nodes marked as changed
1272 1274 missing = set(self.changed)
1273 1275 not_changed = set(self.changed)
1274 1276 if self.changed and not parents:
1275 1277 raise NodeDoesNotExistError(str(self.changed[0].path))
1276 1278 for p in parents:
1277 1279 for node in self.changed:
1278 1280 try:
1279 1281 old = p.get_node(node.path)
1280 1282 missing.remove(node)
1281 1283 # if content actually changed, remove node from not_changed
1282 1284 if old.content != node.content:
1283 1285 not_changed.remove(node)
1284 1286 except NodeDoesNotExistError:
1285 1287 pass
1286 1288 if self.changed and missing:
1287 1289 raise NodeDoesNotExistError(
1288 1290 "Node `%s` marked as modified but missing in parents: %s"
1289 1291 % (missing.pop().path, parents))
1290 1292
1291 1293 if self.changed and not_changed:
1292 1294 raise NodeNotChangedError(
1293 1295 "Node `%s` wasn't actually changed (parents: %s)"
1294 1296 % (not_changed.pop().path, parents))
1295 1297
1296 1298 # Check nodes marked as removed
1297 1299 if self.removed and not parents:
1298 1300 raise NodeDoesNotExistError(
1299 1301 "Cannot remove node at %s as there "
1300 1302 "were no parents specified" % self.removed[0].path)
1301 1303 really_removed = set()
1302 1304 for p in parents:
1303 1305 for node in self.removed:
1304 1306 try:
1305 1307 p.get_node(node.path)
1306 1308 really_removed.add(node)
1307 1309 except CommitError:
1308 1310 pass
1309 1311 not_removed = set(self.removed) - really_removed
1310 1312 if not_removed:
1311 1313 # TODO: johbo: This code branch does not seem to be covered
1312 1314 raise NodeDoesNotExistError(
1313 1315 "Cannot remove node at %s from "
1314 1316 "following parents: %s" % (not_removed, parents))
1315 1317
1316 1318 def commit(
1317 1319 self, message, author, parents=None, branch=None, date=None,
1318 1320 **kwargs):
1319 1321 """
1320 1322 Performs in-memory commit (doesn't check workdir in any way) and
1321 1323 returns newly created :class:`BaseCommit`. Updates repository's
1322 1324 attribute `commits`.
1323 1325
1324 1326 .. note::
1325 1327
1326 1328 While overriding this method, each backend should call
1327 1329 ``self.check_integrity(parents)`` in the first place.
1328 1330
1329 1331 :param message: message of the commit
1330 1332 :param author: full username, i.e. "Joe Doe <joe.doe@example.com>"
1331 1333 :param parents: single parent or sequence of parents from which commit
1332 1334 would be derived
1333 1335 :param date: ``datetime.datetime`` instance. Defaults to
1334 1336 ``datetime.datetime.now()``.
1335 1337 :param branch: branch name, as string. If none given, the backend's
1336 1338 default branch will be used.
1337 1339
1338 1340 :raises ``CommitError``: if any error occurs while committing
1339 1341 """
1340 1342 raise NotImplementedError
1341 1343
1342 1344
1343 1345 class BaseInMemoryChangesetClass(type):
1344 1346
1345 1347 def __instancecheck__(self, instance):
1346 1348 return isinstance(instance, BaseInMemoryCommit)
1347 1349
1348 1350
1349 1351 class BaseInMemoryChangeset(BaseInMemoryCommit):
1350 1352
1351 1353 __metaclass__ = BaseInMemoryChangesetClass
1352 1354
1353 1355 def __new__(cls, *args, **kwargs):
1354 1356 warnings.warn(
1355 1357 "Use BaseCommit instead of BaseInMemoryCommit", DeprecationWarning)
1356 1358 return super(BaseInMemoryChangeset, cls).__new__(cls, *args, **kwargs)
1357 1359
1358 1360
1359 1361 class EmptyCommit(BaseCommit):
1360 1362 """
1361 1363 A dummy empty commit. It's possible to pass a hash when creating
1362 1364 an EmptyCommit.
1363 1365 """
1364 1366
1365 1367 def __init__(
1366 1368 self, commit_id='0' * 40, repo=None, alias=None, idx=-1,
1367 1369 message='', author='', date=None):
1368 1370 self._empty_commit_id = commit_id
1369 1371 # TODO: johbo: Solve idx parameter, default value does not make
1370 1372 # too much sense
1371 1373 self.idx = idx
1372 1374 self.message = message
1373 1375 self.author = author
1374 1376 self.date = date or datetime.datetime.fromtimestamp(0)
1375 1377 self.repository = repo
1376 1378 self.alias = alias
1377 1379
1378 1380 @LazyProperty
1379 1381 def raw_id(self):
1380 1382 """
1381 1383 Returns raw string identifying this commit, useful for web
1382 1384 representation.
1383 1385 """
1384 1386
1385 1387 return self._empty_commit_id
1386 1388
1387 1389 @LazyProperty
1388 1390 def branch(self):
1389 1391 if self.alias:
1390 1392 from rhodecode.lib.vcs.backends import get_backend
1391 1393 return get_backend(self.alias).DEFAULT_BRANCH_NAME
1392 1394
1393 1395 @LazyProperty
1394 1396 def short_id(self):
1395 1397 return self.raw_id[:12]
1396 1398
1397 1399 @LazyProperty
1398 1400 def id(self):
1399 1401 return self.raw_id
1400 1402
1401 1403 def get_file_commit(self, path):
1402 1404 return self
1403 1405
1404 1406 def get_file_content(self, path):
1405 1407 return u''
1406 1408
1407 1409 def get_file_size(self, path):
1408 1410 return 0
1409 1411
1410 1412
1411 1413 class EmptyChangesetClass(type):
1412 1414
1413 1415 def __instancecheck__(self, instance):
1414 1416 return isinstance(instance, EmptyCommit)
1415 1417
1416 1418
1417 1419 class EmptyChangeset(EmptyCommit):
1418 1420
1419 1421 __metaclass__ = EmptyChangesetClass
1420 1422
1421 1423 def __new__(cls, *args, **kwargs):
1422 1424 warnings.warn(
1423 1425 "Use EmptyCommit instead of EmptyChangeset", DeprecationWarning)
1424 1426 return super(EmptyChangeset, cls).__new__(cls, *args, **kwargs)
1425 1427
1426 1428 def __init__(self, cs='0' * 40, repo=None, requested_revision=None,
1427 1429 alias=None, revision=-1, message='', author='', date=None):
1428 1430 if requested_revision is not None:
1429 1431 warnings.warn(
1430 1432 "Parameter requested_revision not supported anymore",
1431 1433 DeprecationWarning)
1432 1434 super(EmptyChangeset, self).__init__(
1433 1435 commit_id=cs, repo=repo, alias=alias, idx=revision,
1434 1436 message=message, author=author, date=date)
1435 1437
1436 1438 @property
1437 1439 def revision(self):
1438 1440 warnings.warn("Use idx instead", DeprecationWarning)
1439 1441 return self.idx
1440 1442
1441 1443 @revision.setter
1442 1444 def revision(self, value):
1443 1445 warnings.warn("Use idx instead", DeprecationWarning)
1444 1446 self.idx = value
1445 1447
1446 1448
1447 1449 class EmptyRepository(BaseRepository):
1448 1450 def __init__(self, repo_path=None, config=None, create=False, **kwargs):
1449 1451 pass
1450 1452
1451 1453 def get_diff(self, *args, **kwargs):
1452 1454 from rhodecode.lib.vcs.backends.git.diff import GitDiff
1453 1455 return GitDiff('')
1454 1456
1455 1457
1456 1458 class CollectionGenerator(object):
1457 1459
1458 1460 def __init__(self, repo, commit_ids, collection_size=None, pre_load=None):
1459 1461 self.repo = repo
1460 1462 self.commit_ids = commit_ids
1461 1463 # TODO: (oliver) this isn't currently hooked up
1462 1464 self.collection_size = None
1463 1465 self.pre_load = pre_load
1464 1466
1465 1467 def __len__(self):
1466 1468 if self.collection_size is not None:
1467 1469 return self.collection_size
1468 1470 return self.commit_ids.__len__()
1469 1471
1470 1472 def __iter__(self):
1471 1473 for commit_id in self.commit_ids:
1472 1474 # TODO: johbo: Mercurial passes in commit indices or commit ids
1473 1475 yield self._commit_factory(commit_id)
1474 1476
1475 1477 def _commit_factory(self, commit_id):
1476 1478 """
1477 1479 Allows backends to override the way commits are generated.
1478 1480 """
1479 1481 return self.repo.get_commit(commit_id=commit_id,
1480 1482 pre_load=self.pre_load)
1481 1483
1482 1484 def __getslice__(self, i, j):
1483 1485 """
1484 1486 Returns an iterator over the sliced commit collection
1485 1487 """
1486 1488 commit_ids = self.commit_ids[i:j]
1487 1489 return self.__class__(
1488 1490 self.repo, commit_ids, pre_load=self.pre_load)
1489 1491
1490 1492 def __repr__(self):
1491 1493 return '<CollectionGenerator[len:%s]>' % (self.__len__())
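``CollectionGenerator`` keeps only commit ids and materializes commit objects lazily during iteration; slicing returns a new lazy wrapper rather than a list. A sketch of that design (``__getslice__`` is Python 2 only; in Python 3 the same is achieved via ``__getitem__`` with a slice, as below; all names are illustrative):

```python
class LazyCollection(object):
    # hypothetical stand-in for CollectionGenerator: holds ids,
    # builds objects only while iterating
    def __init__(self, ids, factory):
        self.ids = ids
        self.factory = factory

    def __len__(self):
        return len(self.ids)

    def __iter__(self):
        for commit_id in self.ids:
            yield self.factory(commit_id)

    def __getitem__(self, key):
        # slicing stays lazy: wrap the sliced ids, don't build objects
        return LazyCollection(self.ids[key], self.factory)

coll = LazyCollection(['a', 'b', 'c'], factory=str.upper)
print(len(coll))       # -> 3
print(list(coll[1:]))  # -> ['B', 'C']
```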
1492 1494
1493 1495
1494 1496 class Config(object):
1495 1497 """
1496 1498 Represents the configuration for a repository.
1497 1499
1498 1500 The API is inspired by :class:`ConfigParser.ConfigParser` from the
1499 1501 standard library. It implements only the needed subset.
1500 1502 """
1501 1503
1502 1504 def __init__(self):
1503 1505 self._values = {}
1504 1506
1505 1507 def copy(self):
1506 1508 clone = Config()
1507 1509 for section, values in self._values.items():
1508 1510 clone._values[section] = values.copy()
1509 1511 return clone
1510 1512
1511 1513 def __repr__(self):
1512 1514 return '<Config(%s sections) at %s>' % (
1513 1515 len(self._values), hex(id(self)))
1514 1516
1515 1517 def items(self, section):
1516 1518 return self._values.get(section, {}).iteritems()
1517 1519
1518 1520 def get(self, section, option):
1519 1521 return self._values.get(section, {}).get(option)
1520 1522
1521 1523 def set(self, section, option, value):
1522 1524 section_values = self._values.setdefault(section, {})
1523 1525 section_values[option] = value
1524 1526
1525 1527 def clear_section(self, section):
1526 1528 self._values[section] = {}
1527 1529
1528 1530 def serialize(self):
1529 1531 """
1530 1532 Creates a list of (section, key, value) three-tuples representing
1531 1533 this config object.
1532 1534 """
1533 1535 items = []
1534 1536 for section in self._values:
1535 1537 for option, value in self._values[section].items():
1536 1538 items.append(
1537 1539 (safe_str(section), safe_str(option), safe_str(value)))
1538 1540 return items
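The ``Config`` subset above can be exercised as follows; this is a minimal re-implementation for illustration, mirroring the ``get``/``set``/``serialize`` behavior described in the docstrings:

```python
class Config(object):
    # minimal sketch of the ConfigParser-inspired subset above
    def __init__(self):
        self._values = {}

    def get(self, section, option):
        return self._values.get(section, {}).get(option)

    def set(self, section, option, value):
        self._values.setdefault(section, {})[option] = value

    def serialize(self):
        # flatten to (section, key, value) three-tuples
        return [
            (section, option, value)
            for section, options in self._values.items()
            for option, value in options.items()
        ]

cfg = Config()
cfg.set('ui', 'username', 'joe')
print(cfg.get('ui', 'username'))  # -> joe
print(cfg.serialize())            # -> [('ui', 'username', 'joe')]
```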
1539 1541
1540 1542
1541 1543 class Diff(object):
1542 1544 """
1543 1545 Represents a diff result from a repository backend.
1544 1546
1545 1547 Subclasses have to provide a backend specific value for
1546 1548 :attr:`_header_re` and :attr:`_meta_re`.
1547 1549 """
1548 1550 _meta_re = None
1549 1551 _header_re = None
1550 1552
1551 1553 def __init__(self, raw_diff):
1552 1554 self.raw = raw_diff
1553 1555
1554 1556 def chunks(self):
1555 1557 """
1556 1558 Split the diff into chunks of separate ``diff --git a/file b/file``
1557 1559 sections. To keep diffs consistent we must prepend each with \n, and
1558 1560 make sure we can detect the last chunk, as it also has a special rule.
1559 1561 """
1560 1562
1561 1563 diff_parts = ('\n' + self.raw).split('\ndiff --git')
1562 1564 header = diff_parts[0]
1563 1565
1564 1566 if self._meta_re:
1565 1567 match = self._meta_re.match(header)
1566 1568
1567 1569 chunks = diff_parts[1:]
1568 1570 total_chunks = len(chunks)
1569 1571
1570 1572 return (
1571 1573 DiffChunk(chunk, self, cur_chunk == total_chunks)
1572 1574 for cur_chunk, chunk in enumerate(chunks, start=1))
1573 1575
1574 1576
1575 1577 class DiffChunk(object):
1576 1578
1577 1579 def __init__(self, chunk, diff, last_chunk):
1578 1580 self._diff = diff
1579 1581
1580 1582 # since we split by \ndiff --git that part is lost from original diff
1581 1583 # we need to re-apply it at the end, EXCEPT ! if it's last chunk
1582 1584 if not last_chunk:
1583 1585 chunk += '\n'
1584 1586
1585 1587 match = self._diff._header_re.match(chunk)
1586 1588 self.header = match.groupdict()
1587 1589 self.diff = chunk[match.end():]
1588 1590 self.raw = chunk
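The chunking rule implemented by ``Diff.chunks`` and ``DiffChunk`` (prepend ``\n``, split on ``\ndiff --git``, restore the trailing newline on every chunk except the last) can be sketched standalone:

```python
def split_chunks(raw_diff):
    # prepend '\n' so the very first 'diff --git' line splits like the rest
    diff_parts = ('\n' + raw_diff).split('\ndiff --git')
    header, chunks = diff_parts[0], diff_parts[1:]
    total = len(chunks)
    out = []
    for idx, chunk in enumerate(chunks, start=1):
        # the split consumed '\ndiff --git'; restore the newline on
        # every chunk except the last one
        if idx != total:
            chunk += '\n'
        out.append(chunk)
    return header, out

raw = ('diff --git a/f b/f\n--- a/f\n+++ b/f\n'
       'diff --git a/g b/g\n--- a/g\n+++ b/g')
header, chunks = split_chunks(raw)
print(len(chunks))  # -> 2
```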
@@ -1,933 +1,933 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2014-2017 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 GIT repository module
23 23 """
24 24
25 25 import logging
26 26 import os
27 27 import re
28 28 import shutil
29 29
30 30 from zope.cachedescriptors.property import Lazy as LazyProperty
31 31
32 32 from rhodecode.lib.compat import OrderedDict
33 33 from rhodecode.lib.datelib import (
34 34 utcdate_fromtimestamp, makedate, date_astimestamp)
35 35 from rhodecode.lib.utils import safe_unicode, safe_str
36 36 from rhodecode.lib.vcs import connection, path as vcspath
37 37 from rhodecode.lib.vcs.backends.base import (
38 38 BaseRepository, CollectionGenerator, Config, MergeResponse,
39 39 MergeFailureReason, Reference)
40 40 from rhodecode.lib.vcs.backends.git.commit import GitCommit
41 41 from rhodecode.lib.vcs.backends.git.diff import GitDiff
42 42 from rhodecode.lib.vcs.backends.git.inmemory import GitInMemoryCommit
43 43 from rhodecode.lib.vcs.exceptions import (
44 44 CommitDoesNotExistError, EmptyRepositoryError,
45 45 RepositoryError, TagAlreadyExistError, TagDoesNotExistError, VCSError)
46 46
47 47
48 48 SHA_PATTERN = re.compile(r'^([0-9a-fA-F]{12}|[0-9a-fA-F]{40})$')
49 49
50 50 log = logging.getLogger(__name__)
51 51
52 52
53 53 class GitRepository(BaseRepository):
54 54 """
55 55 Git repository backend.
56 56 """
57 57 DEFAULT_BRANCH_NAME = 'master'
58 58
59 59 contact = BaseRepository.DEFAULT_CONTACT
60 60
61 61 def __init__(self, repo_path, config=None, create=False, src_url=None,
62 62 update_after_clone=False, with_wire=None, bare=False):
63 63
64 64 self.path = safe_str(os.path.abspath(repo_path))
65 65 self.config = config if config else Config()
66 66 self._remote = connection.Git(
67 67 self.path, self.config, with_wire=with_wire)
68 68
69 69 self._init_repo(create, src_url, update_after_clone, bare)
70 70
71 71 # caches
72 72 self._commit_ids = {}
73 73
74 74 self.bookmarks = {}
75 75
76 76 @LazyProperty
77 77 def bare(self):
78 78 return self._remote.bare()
79 79
80 80 @LazyProperty
81 81 def head(self):
82 82 return self._remote.head()
83 83
84 84 @LazyProperty
85 85 def commit_ids(self):
86 86 """
87 87 Returns list of commit ids, in ascending order. Being a lazy
88 88 attribute allows external tools to inject commit ids from a cache.
89 89 """
90 90 commit_ids = self._get_all_commit_ids()
91 91 self._rebuild_cache(commit_ids)
92 92 return commit_ids
93 93
94 94 def _rebuild_cache(self, commit_ids):
95 95 self._commit_ids = dict((commit_id, index)
96 96 for index, commit_id in enumerate(commit_ids))
97 97
98 98 def run_git_command(self, cmd, **opts):
99 99 """
100 100 Runs given ``cmd`` as git command and returns tuple
101 101 (stdout, stderr).
102 102
103 103 :param cmd: git command to be executed
104 104 :param opts: env options to pass into Subprocess command
105 105 """
106 106 if not isinstance(cmd, list):
107 107 raise ValueError('cmd must be a list, got %s instead' % type(cmd))
108 108
109 109 out, err = self._remote.run_git_command(cmd, **opts)
110 110 if err:
111 111 log.debug('Stderr output of git command "%s":\n%s', cmd, err)
112 112 return out, err
113 113
114 114 @staticmethod
115 115 def check_url(url, config):
116 116 """
117 117 Checks the given url and tries to verify that it's a valid
118 118 link. Sometimes it may happen that git issues a basic
119 119 auth request, which can cause the whole API to hang when used from
120 120 python or other external calls.
121 121
122 122 On failure it raises urllib2.HTTPError; the exception is also
123 123 thrown when the return code is not 200.
124 124 """
125 125 # check first if it's not an url
126 126 if os.path.isdir(url) or url.startswith('file:'):
127 127 return True
128 128
129 129 if '+' in url.split('://', 1)[0]:
130 130 url = url.split('+', 1)[1]
131 131
132 132 # Request the _remote to verify the url
133 133 return connection.Git.check_url(url, config.serialize())
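The scheme handling in ``check_url`` accepts local paths and ``file:`` urls as-is, and strips a composite ``scheme+`` prefix (e.g. ``git+https://``) before asking the remote to verify. Sketched as a hypothetical helper:

```python
import os

def normalize_url(url):
    # local paths / file: urls are accepted as-is
    if os.path.isdir(url) or url.startswith('file:'):
        return url
    # drop a composite scheme prefix such as git+https://
    if '+' in url.split('://', 1)[0]:
        url = url.split('+', 1)[1]
    return url

print(normalize_url('git+https://example.com/repo'))  # -> https://example.com/repo
```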
134 134
135 135 @staticmethod
136 136 def is_valid_repository(path):
137 137 if os.path.isdir(os.path.join(path, '.git')):
138 138 return True
139 139 # check case of bare repository
140 140 try:
141 141 GitRepository(path)
142 142 return True
143 143 except VCSError:
144 144 pass
145 145 return False
146 146
147 147 def _init_repo(self, create, src_url=None, update_after_clone=False,
148 148 bare=False):
149 149 if create and os.path.exists(self.path):
150 150 raise RepositoryError(
151 151 "Cannot create repository at %s, location already exist"
152 152 % self.path)
153 153
154 154 try:
155 155 if create and src_url:
156 156 GitRepository.check_url(src_url, self.config)
157 157 self.clone(src_url, update_after_clone, bare)
158 158 elif create:
159 159 os.makedirs(self.path, mode=0755)
160 160
161 161 if bare:
162 162 self._remote.init_bare()
163 163 else:
164 164 self._remote.init()
165 165 else:
166 166 self._remote.assert_correct_path()
167 167 # TODO: johbo: check if we have to translate the OSError here
168 168 except OSError as err:
169 169 raise RepositoryError(err)
170 170
171 171 def _get_all_commit_ids(self, filters=None):
172 172 # we must check that this repo is not empty, since the command
173 173 # below fails if it is; it's cheaper to ask than to handle the
174 174 # subprocess errors
175 175 try:
176 176 self._remote.head()
177 177 except KeyError:
178 178 return []
179 179
180 180 rev_filter = ['--branches', '--tags']
181 181 extra_filter = []
182 182
183 183 if filters:
184 184 if filters.get('since'):
185 185 extra_filter.append('--since=%s' % (filters['since']))
186 186 if filters.get('until'):
187 187 extra_filter.append('--until=%s' % (filters['until']))
188 188 if filters.get('branch_name'):
189 189 rev_filter = ['--tags']
190 190 extra_filter.append(filters['branch_name'])
191 191 rev_filter.extend(extra_filter)
192 192
193 193 # if filters.get('start') or filters.get('end'):
194 194 # # skip is offset, max-count is limit
195 195 # if filters.get('start'):
196 196 # extra_filter += ' --skip=%s' % filters['start']
197 197 # if filters.get('end'):
198 198 # extra_filter += ' --max-count=%s' % (filters['end'] - (filters['start'] or 0))
199 199
200 200 cmd = ['rev-list', '--reverse', '--date-order'] + rev_filter
201 201 try:
202 202 output, __ = self.run_git_command(cmd)
203 203 except RepositoryError:
204 204 # Can be raised for empty repositories
205 205 return []
206 206 return output.splitlines()
207 207
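The filter-to-arguments translation in `_get_all_commit_ids` can be exercised in isolation. The following is a standalone sketch of that logic (the function name is hypothetical, not part of the class), useful for checking which `rev-list` flags a given filter dict produces:

```python
def build_rev_list_cmd(filters=None):
    # Standalone mirror of the filter handling in _get_all_commit_ids:
    # translate a filter dict into `git rev-list` arguments.
    rev_filter = ['--branches', '--tags']
    extra_filter = []
    filters = filters or {}
    if filters.get('since'):
        extra_filter.append('--since=%s' % filters['since'])
    if filters.get('until'):
        extra_filter.append('--until=%s' % filters['until'])
    if filters.get('branch_name'):
        # a branch name replaces --branches and limits the walk
        # to commits reachable from that branch
        rev_filter = ['--tags']
        extra_filter.append(filters['branch_name'])
    rev_filter.extend(extra_filter)
    return ['rev-list', '--reverse', '--date-order'] + rev_filter
```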
208 208 def _get_commit_id(self, commit_id_or_idx):
209 209 def is_null(value):
210 210 return len(value) == commit_id_or_idx.count('0')
211 211
212 212 if self.is_empty():
213 213 raise EmptyRepositoryError("There are no commits yet")
214 214
215 215 if commit_id_or_idx in (None, '', 'tip', 'HEAD', 'head', -1):
216 216 return self.commit_ids[-1]
217 217
218 218 is_bstr = isinstance(commit_id_or_idx, (str, unicode))
219 219 if ((is_bstr and commit_id_or_idx.isdigit() and len(commit_id_or_idx) < 12)
220 220 or isinstance(commit_id_or_idx, int) or is_null(commit_id_or_idx)):
221 221 try:
222 222 commit_id_or_idx = self.commit_ids[int(commit_id_or_idx)]
223 223 except Exception:
224 224 msg = "Commit %s does not exist for %s" % (
225 225 commit_id_or_idx, self)
226 226 raise CommitDoesNotExistError(msg)
227 227
228 228 elif is_bstr:
229 229 # check full path ref, eg. refs/heads/master
230 230 ref_id = self._refs.get(commit_id_or_idx)
231 231 if ref_id:
232 232 return ref_id
233 233
234 234 # check branch name
235 235 branch_ids = self.branches.values()
236 236 ref_id = self._refs.get('refs/heads/%s' % commit_id_or_idx)
237 237 if ref_id:
238 238 return ref_id
239 239
240 240 # check tag name
241 241 ref_id = self._refs.get('refs/tags/%s' % commit_id_or_idx)
242 242 if ref_id:
243 243 return ref_id
244 244
245 245 if (not SHA_PATTERN.match(commit_id_or_idx) or
246 246 commit_id_or_idx not in self.commit_ids):
247 247 msg = "Commit %s does not exist for %s" % (
248 248 commit_id_or_idx, self)
249 249 raise CommitDoesNotExistError(msg)
250 250
251 251 # Ensure we return full id
252 252 if not SHA_PATTERN.match(str(commit_id_or_idx)):
253 253 raise CommitDoesNotExistError(
254 254 "Given commit id %s not recognized" % commit_id_or_idx)
255 255 return commit_id_or_idx
256 256
257 257 def get_hook_location(self):
258 258 """
259 259 Returns the absolute path to the location where hooks are stored.
260 260 """
261 261 loc = os.path.join(self.path, 'hooks')
262 262 if not self.bare:
263 263 loc = os.path.join(self.path, '.git', 'hooks')
264 264 return loc
265 265
266 266 @LazyProperty
267 267 def last_change(self):
268 268 """
269 269 Returns last change made on this repository as
270 270 `datetime.datetime` object.
271 271 """
272 272 try:
273 273 return self.get_commit().date
274 274 except RepositoryError:
275 275 tzoffset = makedate()[1]
276 276 return utcdate_fromtimestamp(self._get_fs_mtime(), tzoffset)
277 277
278 278 def _get_fs_mtime(self):
279 279 idx_loc = '' if self.bare else '.git'
280 280 # fallback to filesystem
281 281 in_path = os.path.join(self.path, idx_loc, "index")
282 282 he_path = os.path.join(self.path, idx_loc, "HEAD")
283 283 if os.path.exists(in_path):
284 284 return os.stat(in_path).st_mtime
285 285 else:
286 286 return os.stat(he_path).st_mtime
287 287
288 288 @LazyProperty
289 289 def description(self):
290 290 description = self._remote.get_description()
291 291 return safe_unicode(description or self.DEFAULT_DESCRIPTION)
292 292
293 293 def _get_refs_entries(self, prefix='', reverse=False, strip_prefix=True):
294 294 if self.is_empty():
295 295 return OrderedDict()
296 296
297 297 result = []
298 298 for ref, sha in self._refs.iteritems():
299 299 if ref.startswith(prefix):
300 300 ref_name = ref
301 301 if strip_prefix:
302 302 ref_name = ref[len(prefix):]
303 303 result.append((safe_unicode(ref_name), sha))
304 304
305 305 def get_name(entry):
306 306 return entry[0]
307 307
308 308 return OrderedDict(sorted(result, key=get_name, reverse=reverse))
309 309
310 310 def _get_branches(self):
311 311 return self._get_refs_entries(prefix='refs/heads/', strip_prefix=True)
312 312
313 313 @LazyProperty
314 314 def branches(self):
315 315 return self._get_branches()
316 316
317 317 @LazyProperty
318 318 def branches_closed(self):
319 319 return {}
320 320
321 321 @LazyProperty
322 322 def branches_all(self):
323 323 all_branches = {}
324 324 all_branches.update(self.branches)
325 325 all_branches.update(self.branches_closed)
326 326 return all_branches
327 327
328 328 @LazyProperty
329 329 def tags(self):
330 330 return self._get_tags()
331 331
332 332 def _get_tags(self):
333 333 return self._get_refs_entries(
334 334 prefix='refs/tags/', strip_prefix=True, reverse=True)
335 335
336 336 def tag(self, name, user, commit_id=None, message=None, date=None,
337 337 **kwargs):
338 338 # TODO: fix this method to apply annotated tags correct with message
339 339 """
340 340 Creates and returns a tag for the given ``commit_id``.
341 341
342 342 :param name: name for new tag
343 343 :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
344 344 :param commit_id: commit id for which new tag would be created
345 345 :param message: message of the tag's commit
346 346 :param date: date of tag's commit
347 347
348 348 :raises TagAlreadyExistError: if tag with same name already exists
349 349 """
350 350 if name in self.tags:
351 351 raise TagAlreadyExistError("Tag %s already exists" % name)
352 352 commit = self.get_commit(commit_id=commit_id)
353 353 message = message or "Added tag %s for commit %s" % (
354 354 name, commit.raw_id)
355 355 self._remote.set_refs('refs/tags/%s' % name, commit._commit['id'])
356 356
357 357 self._refs = self._get_refs()
358 358 self.tags = self._get_tags()
359 359 return commit
360 360
361 361 def remove_tag(self, name, user, message=None, date=None):
362 362 """
363 363 Removes tag with the given ``name``.
364 364
365 365 :param name: name of the tag to be removed
366 366 :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
367 367 :param message: message of the tag's removal commit
368 368 :param date: date of tag's removal commit
369 369
370 370 :raises TagDoesNotExistError: if tag with given name does not exist
371 371 """
372 372 if name not in self.tags:
373 373 raise TagDoesNotExistError("Tag %s does not exist" % name)
374 374 tagpath = vcspath.join(
375 375 self._remote.get_refs_path(), 'refs', 'tags', name)
376 376 try:
377 377 os.remove(tagpath)
378 378 self._refs = self._get_refs()
379 379 self.tags = self._get_tags()
380 380 except OSError as e:
381 381 raise RepositoryError(e.strerror)
382 382
383 383 def _get_refs(self):
384 384 return self._remote.get_refs()
385 385
386 386 @LazyProperty
387 387 def _refs(self):
388 388 return self._get_refs()
389 389
390 390 @property
391 391 def _ref_tree(self):
392 392 node = tree = {}
393 393 for ref, sha in self._refs.iteritems():
394 394 path = ref.split('/')
395 395 for bit in path[:-1]:
396 396 node = node.setdefault(bit, {})
397 397 node[path[-1]] = sha
398 398 node = tree
399 399 return tree
400 400
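The nesting performed by the `_ref_tree` property can be sketched as a standalone function (hypothetical name, Python 3 dict iteration used for brevity):

```python
def build_ref_tree(refs):
    # Nest flat ref names such as 'refs/heads/master' into a dict
    # tree keyed by each path segment; leaves hold the SHA.
    tree = {}
    for ref, sha in refs.items():
        node = tree
        path = ref.split('/')
        for bit in path[:-1]:
            node = node.setdefault(bit, {})
        node[path[-1]] = sha
    return tree
```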
401 401 def get_commit(self, commit_id=None, commit_idx=None, pre_load=None):
402 402 """
403 403 Returns `GitCommit` object representing commit from git repository
404 404 at the given `commit_id` or head (most recent commit) if None given.
405 405 """
406 406 if commit_id is not None:
407 407 self._validate_commit_id(commit_id)
408 408 elif commit_idx is not None:
409 409 self._validate_commit_idx(commit_idx)
410 410 commit_id = commit_idx
411 411 commit_id = self._get_commit_id(commit_id)
412 412 try:
413 413 # Need to call remote to translate id for tagging scenario
414 414 commit_id = self._remote.get_object(commit_id)["commit_id"]
415 415 idx = self._commit_ids[commit_id]
416 416 except KeyError:
417 417 raise RepositoryError("Cannot get object with id %s" % commit_id)
418 418
419 419 return GitCommit(self, commit_id, idx, pre_load=pre_load)
420 420
421 421 def get_commits(
422 422 self, start_id=None, end_id=None, start_date=None, end_date=None,
423 423 branch_name=None, pre_load=None):
424 424 """
425 425 Returns generator of `GitCommit` objects from start to end (both
426 426 are inclusive), in ascending date order.
427 427
428 428 :param start_id: None, str(commit_id)
429 429 :param end_id: None, str(commit_id)
430 430 :param start_date: if specified, commits with commit date less than
431 431 ``start_date`` would be filtered out from returned set
432 432 :param end_date: if specified, commits with commit date greater than
433 433 ``end_date`` would be filtered out from returned set
434 434 :param branch_name: if specified, commits not reachable from given
435 435 branch would be filtered out from returned set
436 436
437 437 :raise BranchDoesNotExistError: If given `branch_name` does not
438 438 exist.
439 439 :raise CommitDoesNotExistError: If commits for given `start` or
440 440 `end` could not be found.
441 441
442 442 """
443 443 if self.is_empty():
444 444 raise EmptyRepositoryError("There are no commits yet")
445 445 self._validate_branch_name(branch_name)
446 446
447 447 if start_id is not None:
448 448 self._validate_commit_id(start_id)
449 449 if end_id is not None:
450 450 self._validate_commit_id(end_id)
451 451
452 452 start_raw_id = self._get_commit_id(start_id)
453 453 start_pos = self._commit_ids[start_raw_id] if start_id else None
454 454 end_raw_id = self._get_commit_id(end_id)
455 455 end_pos = max(0, self._commit_ids[end_raw_id]) if end_id else None
456 456
457 457 if None not in [start_id, end_id] and start_pos > end_pos:
458 458 raise RepositoryError(
459 459 "Start commit '%s' cannot be after end commit '%s'" %
460 460 (start_id, end_id))
461 461
462 462 if end_pos is not None:
463 463 end_pos += 1
464 464
465 465 filter_ = []
466 466 if branch_name:
467 467 filter_.append({'branch_name': branch_name})
468 468 if start_date and not end_date:
469 469 filter_.append({'since': start_date})
470 470 if end_date and not start_date:
471 471 filter_.append({'until': end_date})
472 472 if start_date and end_date:
473 473 filter_.append({'since': start_date})
474 474 filter_.append({'until': end_date})
475 475
476 476 # if start_pos or end_pos:
477 477 # filter_.append({'start': start_pos})
478 478 # filter_.append({'end': end_pos})
479 479
480 480 if filter_:
481 481 revfilters = {
482 482 'branch_name': branch_name,
483 483 'since': start_date.strftime('%m/%d/%y %H:%M:%S') if start_date else None,
484 484 'until': end_date.strftime('%m/%d/%y %H:%M:%S') if end_date else None,
485 485 'start': start_pos,
486 486 'end': end_pos,
487 487 }
488 488 commit_ids = self._get_all_commit_ids(filters=revfilters)
489 489
490 490 # pure python stuff, it's slow due to walker walking whole repo
491 491 # def get_revs(walker):
492 492 # for walker_entry in walker:
493 493 # yield walker_entry.commit.id
494 494 # revfilters = {}
495 495 # commit_ids = list(reversed(list(get_revs(self._repo.get_walker(**revfilters)))))
496 496 else:
497 497 commit_ids = self.commit_ids
498 498
499 499 if start_pos or end_pos:
500 500 commit_ids = commit_ids[start_pos: end_pos]
501 501
502 502 return CollectionGenerator(self, commit_ids, pre_load=pre_load)
503 503
504 504 def get_diff(
505 505 self, commit1, commit2, path='', ignore_whitespace=False,
506 506 context=3, path1=None):
507 507 """
508 508 Returns (git like) *diff*, as plain text. Shows changes introduced by
509 509 ``commit2`` since ``commit1``.
510 510
511 511 :param commit1: Entry point from which diff is shown. Can be
512 512 ``self.EMPTY_COMMIT`` - in this case, patch showing all
513 513 the changes since empty state of the repository until ``commit2``
514 514 :param commit2: The commit until which changes should be shown.
515 515 :param ignore_whitespace: If set to ``True``, would not show whitespace
516 516 changes. Defaults to ``False``.
517 517 :param context: How many lines before/after changed lines should be
518 518 shown. Defaults to ``3``.
519 519 """
520 520 self._validate_diff_commits(commit1, commit2)
521 521 if path1 is not None and path1 != path:
522 522 raise ValueError("Diff of two different paths not supported.")
523 523
524 524 flags = [
525 525 '-U%s' % context, '--full-index', '--binary', '-p',
526 526 '-M', '--abbrev=40']
527 527 if ignore_whitespace:
528 528 flags.append('-w')
529 529
530 530 if commit1 == self.EMPTY_COMMIT:
531 531 cmd = ['show'] + flags + [commit2.raw_id]
532 532 else:
533 533 cmd = ['diff'] + flags + [commit1.raw_id, commit2.raw_id]
534 534
535 535 if path:
536 536 cmd.extend(['--', path])
537 537
538 538 stdout, __ = self.run_git_command(cmd)
539 539 # If we used 'show' command, strip first few lines (until actual diff
540 540 # starts)
541 541 if commit1 == self.EMPTY_COMMIT:
542 542 lines = stdout.splitlines()
543 543 x = 0
544 544 for line in lines:
545 545 if line.startswith('diff'):
546 546 break
547 547 x += 1
548 548 # Append a new line just like the 'diff' command does
549 549 stdout = '\n'.join(lines[x:]) + '\n'
550 550 return GitDiff(stdout)
551 551
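The header-stripping step at the end of `get_diff` can be isolated into a small helper (a sketch, not part of the class):

```python
def strip_show_header(stdout):
    # `git show` prefixes the patch with commit metadata; drop every
    # line before the first one starting with 'diff', then re-append
    # the trailing newline that `git diff` would emit.
    lines = stdout.splitlines()
    x = 0
    for line in lines:
        if line.startswith('diff'):
            break
        x += 1
    return '\n'.join(lines[x:]) + '\n'
```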
552 552 def strip(self, commit_id, branch_name):
553 553 commit = self.get_commit(commit_id=commit_id)
554 554 if commit.merge:
555 555 raise Exception('Cannot reset to merge commit')
556 556
557 557 # parent is going to be the new head now
558 558 commit = commit.parents[0]
559 559 self._remote.set_refs('refs/heads/%s' % branch_name, commit.raw_id)
560 560
561 561 self.commit_ids = self._get_all_commit_ids()
562 562 self._rebuild_cache(self.commit_ids)
563 563
564 564 def get_common_ancestor(self, commit_id1, commit_id2, repo2):
565 565 if commit_id1 == commit_id2:
566 566 return commit_id1
567 567
568 568 if self != repo2:
569 569 commits = self._remote.get_missing_revs(
570 570 commit_id1, commit_id2, repo2.path)
571 571 if commits:
572 572 commit = repo2.get_commit(commits[-1])
573 573 if commit.parents:
574 574 ancestor_id = commit.parents[0].raw_id
575 575 else:
576 576 ancestor_id = None
577 577 else:
578 578 # no commits from other repo, ancestor_id is the commit_id2
579 579 ancestor_id = commit_id2
580 580 else:
581 581 output, __ = self.run_git_command(
582 582 ['merge-base', commit_id1, commit_id2])
583 583 ancestor_id = re.findall(r'[0-9a-fA-F]{40}', output)[0]
584 584
585 585 return ancestor_id
586 586
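The `merge-base` output parsing in `get_common_ancestor` relies on matching a full 40-hex-digit SHA; as a standalone sketch (hypothetical helper name):

```python
import re

SHA_RE = re.compile(r'[0-9a-fA-F]{40}')

def first_sha(output):
    # Pull the first full 40-character SHA out of git's output,
    # mirroring the re.findall(...)[0] call in get_common_ancestor.
    return SHA_RE.findall(output)[0]
```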
587 587 def compare(self, commit_id1, commit_id2, repo2, merge, pre_load=None):
588 588 repo1 = self
589 589 ancestor_id = None
590 590
591 591 if commit_id1 == commit_id2:
592 592 commits = []
593 593 elif repo1 != repo2:
594 594 missing_ids = self._remote.get_missing_revs(commit_id1, commit_id2,
595 595 repo2.path)
596 596 commits = [
597 597 repo2.get_commit(commit_id=commit_id, pre_load=pre_load)
598 598 for commit_id in reversed(missing_ids)]
599 599 else:
600 600 output, __ = repo1.run_git_command(
601 601 ['log', '--reverse', '--pretty=format: %H', '-s',
602 602 '%s..%s' % (commit_id1, commit_id2)])
603 603 commits = [
604 604 repo1.get_commit(commit_id=commit_id, pre_load=pre_load)
605 605 for commit_id in re.findall(r'[0-9a-fA-F]{40}', output)]
606 606
607 607 return commits
608 608
609 609 @LazyProperty
610 610 def in_memory_commit(self):
611 611 """
612 612 Returns ``GitInMemoryCommit`` object for this repository.
613 613 """
614 614 return GitInMemoryCommit(self)
615 615
616 616 def clone(self, url, update_after_clone=True, bare=False):
617 617 """
618 618 Tries to clone commits from external location.
619 619
620 620 :param update_after_clone: If set to ``False``, git won't checkout
621 621 working directory
622 622 :param bare: If set to ``True``, repository would be cloned into
623 623 *bare* git repository (no working directory at all).
624 624 """
625 625 # init_bare and init expect empty dir created to proceed
626 626 if not os.path.exists(self.path):
627 627 os.mkdir(self.path)
628 628
629 629 if bare:
630 630 self._remote.init_bare()
631 631 else:
632 632 self._remote.init()
633 633
634 634 deferred = '^{}'
635 635 valid_refs = ('refs/heads', 'refs/tags', 'HEAD')
636 636
637 637 return self._remote.clone(
638 638 url, deferred, valid_refs, update_after_clone)
639 639
640 640 def pull(self, url, commit_ids=None):
641 641 """
642 642 Tries to pull changes from an external location. We use fetch here
643 643 since pull in git does merges and we want to be compatible with the
644 644 hg backend, so pull == fetch in this case
645 645 """
646 646 self.fetch(url, commit_ids=commit_ids)
647 647
648 648 def fetch(self, url, commit_ids=None):
649 649 """
650 650 Tries to fetch changes from external location.
651 651 """
652 652 refs = None
653 653
654 654 if commit_ids is not None:
655 655 remote_refs = self._remote.get_remote_refs(url)
656 656 refs = [
657 657 ref for ref in remote_refs if remote_refs[ref] in commit_ids]
658 658 self._remote.fetch(url, refs=refs)
659 659
660 660 def set_refs(self, ref_name, commit_id):
661 661 self._remote.set_refs(ref_name, commit_id)
662 662
663 663 def remove_ref(self, ref_name):
664 664 self._remote.remove_ref(ref_name)
665 665
666 666 def _update_server_info(self):
667 667 """
668 668 Runs git's update-server-info command in this repo instance.
669 669 """
670 670 self._remote.update_server_info()
671 671
672 672 def _current_branch(self):
673 673 """
674 674 Return the name of the current branch.
675 675
676 676 It only works for non-bare repositories (i.e. repositories with a
677 677 working copy)
678 678 """
679 679 if self.bare:
680 680 raise RepositoryError('Bare git repos do not have active branches')
681 681
682 682 if self.is_empty():
683 683 return None
684 684
685 685 stdout, _ = self.run_git_command(['rev-parse', '--abbrev-ref', 'HEAD'])
686 686 return stdout.strip()
687 687
688 688 def _checkout(self, branch_name, create=False):
689 689 """
690 690 Checkout a branch in the working directory.
691 691
692 692 It tries to create the branch if create is True, failing if the branch
693 693 already exists.
694 694
695 695 It only works for non-bare repositories (i.e. repositories with a
696 696 working copy)
697 697 """
698 698 if self.bare:
699 699 raise RepositoryError('Cannot checkout branches in a bare git repo')
700 700
701 701 cmd = ['checkout']
702 702 if create:
703 703 cmd.append('-b')
704 704 cmd.append(branch_name)
705 705 self.run_git_command(cmd, fail_on_stderr=False)
706 706
707 707 def _local_clone(self, clone_path, branch_name):
708 708 """
709 709 Create a local clone of the current repo.
710 710 """
711 711 # N.B.(skreft): the --branch option is required as otherwise the shallow
712 712 # clone will only fetch the active branch.
713 713 cmd = ['clone', '--branch', branch_name, '--single-branch',
714 714 self.path, os.path.abspath(clone_path)]
715 715 self.run_git_command(cmd, fail_on_stderr=False)
716 716
717 717 def _local_fetch(self, repository_path, branch_name):
718 718 """
719 719 Fetch a branch from a local repository.
720 720 """
721 721 repository_path = os.path.abspath(repository_path)
722 722 if repository_path == self.path:
723 723 raise ValueError('Cannot fetch from the same repository')
724 724
725 725 cmd = ['fetch', '--no-tags', repository_path, branch_name]
726 726 self.run_git_command(cmd, fail_on_stderr=False)
727 727
728 728 def _last_fetch_heads(self):
729 729 """
730 730 Return the last fetched heads that need merging.
731 731
732 732 The algorithm is defined at
733 733 https://github.com/git/git/blob/v2.1.3/git-pull.sh#L283
734 734 """
735 735 if not self.bare:
736 736 fetch_heads_path = os.path.join(self.path, '.git', 'FETCH_HEAD')
737 737 else:
738 738 fetch_heads_path = os.path.join(self.path, 'FETCH_HEAD')
739 739
740 740 heads = []
741 741 with open(fetch_heads_path) as f:
742 742 for line in f:
743 743 if ' not-for-merge ' in line:
744 744 continue
745 745 line = re.sub('\t.*', '', line, flags=re.DOTALL)
746 746 heads.append(line)
747 747
748 748 return heads
749 749
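The line filtering in `_last_fetch_heads` can be sketched on its own. The sample lines below are illustrative only, using the space-separated `not-for-merge` marker that the method's substring test expects:

```python
import re

def parse_fetch_heads(lines):
    # Mirror of the loop in _last_fetch_heads: skip entries git marked
    # as not-for-merge and strip everything from the first tab onward.
    heads = []
    for line in lines:
        if ' not-for-merge ' in line:
            continue
        heads.append(re.sub('\t.*', '', line, flags=re.DOTALL))
    return heads
```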
750 750 def _get_shadow_instance(self, shadow_repository_path, enable_hooks=False):
751 751 return GitRepository(shadow_repository_path)
752 752
753 753 def _local_pull(self, repository_path, branch_name):
754 754 """
755 755 Pull a branch from a local repository.
756 756 """
757 757 if self.bare:
758 758 raise RepositoryError('Cannot pull into a bare git repository')
759 759 # N.B.(skreft): The --ff-only option is to make sure this is a
760 760 # fast-forward (i.e., we are only pulling new changes and there are no
761 761 # conflicts with our current branch)
762 762 # Additionally, that option needs to go before --no-tags, otherwise git
763 763 # pull complains about it being an unknown flag.
764 764 cmd = ['pull', '--ff-only', '--no-tags', repository_path, branch_name]
765 765 self.run_git_command(cmd, fail_on_stderr=False)
766 766
767 767 def _local_merge(self, merge_message, user_name, user_email, heads):
768 768 """
769 769 Merge the given head into the checked out branch.
770 770
771 771 It will force a merge commit.
772 772
773 773 Currently it raises an error if the repo is empty, as it is not possible
774 774 to create a merge commit in an empty repo.
775 775
776 776 :param merge_message: The message to use for the merge commit.
777 777 :param heads: the heads to merge.
778 778 """
779 779 if self.bare:
780 780 raise RepositoryError('Cannot merge into a bare git repository')
781 781
782 782 if not heads:
783 783 return
784 784
785 785 if self.is_empty():
786 786 # TODO(skreft): do something more robust in this case.
787 787 raise RepositoryError(
788 788 'Do not know how to merge into empty repositories yet')
789 789
790 790 # N.B.(skreft): the --no-ff option is used to enforce the creation of a
791 791 # merge commit. We also specify the user who is doing the merge.
792 792 cmd = ['-c', 'user.name=%s' % safe_str(user_name),
793 793 '-c', 'user.email=%s' % safe_str(user_email),
794 794 'merge', '--no-ff', '-m', safe_str(merge_message)]
795 795 cmd.extend(heads)
796 796 try:
797 797 self.run_git_command(cmd, fail_on_stderr=False)
798 798 except RepositoryError:
799 799 # Cleanup any merge leftovers
800 800 self.run_git_command(['merge', '--abort'], fail_on_stderr=False)
801 801 raise
802 802
803 803 def _local_push(
804 804 self, source_branch, repository_path, target_branch,
805 805 enable_hooks=False, rc_scm_data=None):
806 806 """
807 807 Push the source_branch to the given repository and target_branch.
808 808
809 809 Currently, if the target_branch is not master and the target repo is
810 810 empty, the push will work, but then GitRepository won't be able to find
811 811 the pushed branch or the commits, as HEAD will be corrupted (i.e.,
812 812 pointing to master, which does not exist).
813 813
814 814 It does not run the hooks in the target repo.
815 815 """
816 816 # TODO(skreft): deal with the case in which the target repo is empty,
817 817 # and the target_branch is not master.
818 818 target_repo = GitRepository(repository_path)
819 819 if (not target_repo.bare and
820 820 target_repo._current_branch() == target_branch):
821 821 # Git prevents pushing to the checked out branch, so simulate it by
822 822 # pulling into the target repository.
823 823 target_repo._local_pull(self.path, source_branch)
824 824 else:
825 825 cmd = ['push', os.path.abspath(repository_path),
826 826 '%s:%s' % (source_branch, target_branch)]
827 827 gitenv = {}
828 828 if rc_scm_data:
829 829 gitenv.update({'RC_SCM_DATA': rc_scm_data})
830 830
831 831 if not enable_hooks:
832 832 gitenv['RC_SKIP_HOOKS'] = '1'
833 833 self.run_git_command(cmd, fail_on_stderr=False, extra_env=gitenv)
834 834
835 835 def _get_new_pr_branch(self, source_branch, target_branch):
836 836 prefix = 'pr_%s-%s_' % (source_branch, target_branch)
837 837 pr_branches = []
838 838 for branch in self.branches:
839 839 if branch.startswith(prefix):
840 840 pr_branches.append(int(branch[len(prefix):]))
841 841
842 842 if not pr_branches:
843 843 branch_id = 0
844 844 else:
845 845 branch_id = max(pr_branches) + 1
846 846
847 847 return '%s%d' % (prefix, branch_id)
848 848
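The branch-naming scheme in `_get_new_pr_branch` (`pr_<source>-<target>_<N>`) can be sketched as a pure function over an existing branch list (hypothetical name, not the class method):

```python
def next_pr_branch(existing_branches, source_branch, target_branch):
    # Mirror of _get_new_pr_branch: find the highest numbered
    # 'pr_<source>-<target>_<N>' branch and return the next one,
    # starting from 0 when no such branch exists yet.
    prefix = 'pr_%s-%s_' % (source_branch, target_branch)
    numbers = [int(b[len(prefix):]) for b in existing_branches
               if b.startswith(prefix)]
    branch_id = max(numbers) + 1 if numbers else 0
    return '%s%d' % (prefix, branch_id)
```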
849 849 def _merge_repo(self, shadow_repository_path, target_ref,
850 850 source_repo, source_ref, merge_message,
851 851 merger_name, merger_email, dry_run=False,
852 use_rebase=False):
852 use_rebase=False, close_branch=False):
853 853 if target_ref.commit_id != self.branches[target_ref.name]:
854 854 return MergeResponse(
855 855 False, False, None, MergeFailureReason.TARGET_IS_NOT_HEAD)
856 856
857 857 shadow_repo = GitRepository(shadow_repository_path)
858 858 shadow_repo._checkout(target_ref.name)
859 859 shadow_repo._local_pull(self.path, target_ref.name)
860 860 # Need to reload repo to invalidate the cache, or otherwise we cannot
861 861 # retrieve the last target commit.
862 862 shadow_repo = GitRepository(shadow_repository_path)
863 863 if target_ref.commit_id != shadow_repo.branches[target_ref.name]:
864 864 return MergeResponse(
865 865 False, False, None, MergeFailureReason.TARGET_IS_NOT_HEAD)
866 866
867 867 pr_branch = shadow_repo._get_new_pr_branch(
868 868 source_ref.name, target_ref.name)
869 869 shadow_repo._checkout(pr_branch, create=True)
870 870 try:
871 871 shadow_repo._local_fetch(source_repo.path, source_ref.name)
872 872 except RepositoryError:
873 873 log.exception('Failure when doing local fetch on git shadow repo')
874 874 return MergeResponse(
875 875 False, False, None, MergeFailureReason.MISSING_SOURCE_REF)
876 876
877 877 merge_ref = None
878 878 merge_failure_reason = MergeFailureReason.NONE
879 879 try:
880 880 shadow_repo._local_merge(merge_message, merger_name, merger_email,
881 881 [source_ref.commit_id])
882 882 merge_possible = True
883 883
884 884 # Need to reload repo to invalidate the cache, or otherwise we
885 885 # cannot retrieve the merge commit.
886 886 shadow_repo = GitRepository(shadow_repository_path)
887 887 merge_commit_id = shadow_repo.branches[pr_branch]
888 888
889 889 # Set a reference pointing to the merge commit. This reference may
890 890 # be used to easily identify the last successful merge commit in
891 891 # the shadow repository.
892 892 shadow_repo.set_refs('refs/heads/pr-merge', merge_commit_id)
893 893 merge_ref = Reference('branch', 'pr-merge', merge_commit_id)
894 894 except RepositoryError:
895 895 log.exception('Failure when doing local merge on git shadow repo')
896 896 merge_possible = False
897 897 merge_failure_reason = MergeFailureReason.MERGE_FAILED
898 898
899 899 if merge_possible and not dry_run:
900 900 try:
901 901 shadow_repo._local_push(
902 902 pr_branch, self.path, target_ref.name, enable_hooks=True,
903 903 rc_scm_data=self.config.get('rhodecode', 'RC_SCM_DATA'))
904 904 merge_succeeded = True
905 905 except RepositoryError:
906 906 log.exception(
907 907 'Failure when doing local push on git shadow repo')
908 908 merge_succeeded = False
909 909 merge_failure_reason = MergeFailureReason.PUSH_FAILED
910 910 else:
911 911 merge_succeeded = False
912 912
913 913 return MergeResponse(
914 914 merge_possible, merge_succeeded, merge_ref,
915 915 merge_failure_reason)
916 916
917 917 def _get_shadow_repository_path(self, workspace_id):
918 918 # The name of the shadow repository must start with '.', so it is
919 919 # skipped by 'rhodecode.lib.utils.get_filesystem_repos'.
920 920 return os.path.join(
921 921 os.path.dirname(self.path),
922 922 '.__shadow_%s_%s' % (os.path.basename(self.path), workspace_id))
923 923
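The shadow-repository naming in `_get_shadow_repository_path` can be sketched as a standalone function (hypothetical name; the example assumes a POSIX path separator):

```python
import os

def shadow_repo_path(repo_path, workspace_id):
    # Mirror of _get_shadow_repository_path: the shadow repo lives next
    # to the real one, with a leading '.' so filesystem repo scans skip it.
    return os.path.join(
        os.path.dirname(repo_path),
        '.__shadow_%s_%s' % (os.path.basename(repo_path), workspace_id))
```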
924 924 def _maybe_prepare_merge_workspace(self, workspace_id, target_ref):
925 925 shadow_repository_path = self._get_shadow_repository_path(workspace_id)
926 926 if not os.path.exists(shadow_repository_path):
927 927 self._local_clone(shadow_repository_path, target_ref.name)
928 928
929 929 return shadow_repository_path
930 930
931 931 def cleanup_merge_workspace(self, workspace_id):
932 932 shadow_repository_path = self._get_shadow_repository_path(workspace_id)
933 933 shutil.rmtree(shadow_repository_path, ignore_errors=True)
@@ -1,819 +1,855 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2014-2017 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 HG repository module
23 23 """
24 24
25 25 import logging
26 26 import binascii
27 27 import os
28 28 import shutil
29 29 import urllib
30 30
31 31 from zope.cachedescriptors.property import Lazy as LazyProperty
32 32
33 33 from rhodecode.lib.compat import OrderedDict
34 34 from rhodecode.lib.datelib import (
35 35 date_to_timestamp_plus_offset, utcdate_fromtimestamp, makedate,
36 36 date_astimestamp)
37 37 from rhodecode.lib.utils import safe_unicode, safe_str
38 38 from rhodecode.lib.vcs import connection
39 39 from rhodecode.lib.vcs.backends.base import (
40 40 BaseRepository, CollectionGenerator, Config, MergeResponse,
41 41 MergeFailureReason, Reference)
42 42 from rhodecode.lib.vcs.backends.hg.commit import MercurialCommit
43 43 from rhodecode.lib.vcs.backends.hg.diff import MercurialDiff
44 44 from rhodecode.lib.vcs.backends.hg.inmemory import MercurialInMemoryCommit
45 45 from rhodecode.lib.vcs.exceptions import (
46 46 EmptyRepositoryError, RepositoryError, TagAlreadyExistError,
47 47 TagDoesNotExistError, CommitDoesNotExistError, SubrepoMergeError)
48 48
49 49 hexlify = binascii.hexlify
50 50 nullid = "\0" * 20
51 51
52 52 log = logging.getLogger(__name__)
53 53
54 54
55 55 class MercurialRepository(BaseRepository):
56 56 """
57 57 Mercurial repository backend
58 58 """
59 59 DEFAULT_BRANCH_NAME = 'default'
60 60
61 61 def __init__(self, repo_path, config=None, create=False, src_url=None,
62 62 update_after_clone=False, with_wire=None):
63 63 """
64 64 Raises RepositoryError if the repository could not be found at the
65 65 given ``repo_path``.
66 66
67 67 :param repo_path: local path of the repository
68 68 :param config: config object containing the repo configuration
69 69 :param create=False: if set to True, will try to create the repository
70 70 if it does not exist rather than raising an exception
71 71 :param src_url=None: will try to clone the repository from the given location
72 72 :param update_after_clone=False: update the working copy after
73 73 making a clone
74 74 """
75 75 self.path = safe_str(os.path.abspath(repo_path))
76 76 self.config = config if config else Config()
77 77 self._remote = connection.Hg(
78 78 self.path, self.config, with_wire=with_wire)
79 79
80 80 self._init_repo(create, src_url, update_after_clone)
81 81
82 82 # caches
83 83 self._commit_ids = {}
84 84
85 85 @LazyProperty
86 86 def commit_ids(self):
87 87 """
88 88 Returns a list of commit ids, in ascending order. Being a lazy
89 89 attribute allows external tools to inject shas from cache.
90 90 """
91 91 commit_ids = self._get_all_commit_ids()
92 92 self._rebuild_cache(commit_ids)
93 93 return commit_ids
94 94
95 95 def _rebuild_cache(self, commit_ids):
96 96 self._commit_ids = dict((commit_id, index)
97 97 for index, commit_id in enumerate(commit_ids))
98 98
99 99 @LazyProperty
100 100 def branches(self):
101 101 return self._get_branches()
102 102
103 103 @LazyProperty
104 104 def branches_closed(self):
105 105 return self._get_branches(active=False, closed=True)
106 106
107 107 @LazyProperty
108 108 def branches_all(self):
109 109 all_branches = {}
110 110 all_branches.update(self.branches)
111 111 all_branches.update(self.branches_closed)
112 112 return all_branches
113 113
114 114 def _get_branches(self, active=True, closed=False):
115 115 """
116 116 Gets branches for this repository.
117 117 Returns only open (active, not closed) branches by default.
118 118 
119 119 :param active: include active branches
120 120 :param closed: include closed branches
121 121
122 122 """
123 123 if self.is_empty():
124 124 return {}
125 125
126 126 def get_name(ctx):
127 127 return ctx[0]
128 128
129 129 _branches = [(safe_unicode(n), hexlify(h),) for n, h in
130 130 self._remote.branches(active, closed).items()]
131 131
132 132 return OrderedDict(sorted(_branches, key=get_name, reverse=False))
133 133
134 134 @LazyProperty
135 135 def tags(self):
136 136 """
137 137 Gets tags for this repository
138 138 """
139 139 return self._get_tags()
140 140
141 141 def _get_tags(self):
142 142 if self.is_empty():
143 143 return {}
144 144
145 145 def get_name(ctx):
146 146 return ctx[0]
147 147
148 148 _tags = [(safe_unicode(n), hexlify(h),) for n, h in
149 149 self._remote.tags().items()]
150 150
151 151 return OrderedDict(sorted(_tags, key=get_name, reverse=True))
152 152
153 153 def tag(self, name, user, commit_id=None, message=None, date=None,
154 154 **kwargs):
155 155 """
156 156 Creates and returns a tag for the given ``commit_id``.
157 157
158 158 :param name: name for new tag
159 159 :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
160 160 :param commit_id: commit id for which new tag would be created
161 161 :param message: message of the tag's commit
162 162 :param date: date of tag's commit
163 163
164 164 :raises TagAlreadyExistError: if tag with same name already exists
165 165 """
166 166 if name in self.tags:
167 167 raise TagAlreadyExistError("Tag %s already exists" % name)
168 168 commit = self.get_commit(commit_id=commit_id)
169 169 local = kwargs.setdefault('local', False)
170 170
171 171 if message is None:
172 172 message = "Added tag %s for commit %s" % (name, commit.short_id)
173 173
174 174 date, tz = date_to_timestamp_plus_offset(date)
175 175
176 176 self._remote.tag(
177 177 name, commit.raw_id, message, local, user, date, tz)
178 178 self._remote.invalidate_vcs_cache()
179 179
180 180 # Reinitialize tags
181 181 self.tags = self._get_tags()
182 182 tag_id = self.tags[name]
183 183
184 184 return self.get_commit(commit_id=tag_id)
185 185
186 186 def remove_tag(self, name, user, message=None, date=None):
187 187 """
188 188 Removes tag with the given `name`.
189 189
190 190 :param name: name of the tag to be removed
191 191 :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
192 192 :param message: message of the tag's removal commit
193 193 :param date: date of tag's removal commit
194 194
195 195 :raises TagDoesNotExistError: if tag with given name does not exist
196 196 """
197 197 if name not in self.tags:
198 198 raise TagDoesNotExistError("Tag %s does not exist" % name)
199 199 if message is None:
200 200 message = "Removed tag %s" % name
201 201 local = False
202 202
203 203 date, tz = date_to_timestamp_plus_offset(date)
204 204
205 205 self._remote.tag(name, nullid, message, local, user, date, tz)
206 206 self._remote.invalidate_vcs_cache()
207 207 self.tags = self._get_tags()
208 208
209 209 @LazyProperty
210 210 def bookmarks(self):
211 211 """
212 212 Gets bookmarks for this repository
213 213 """
214 214 return self._get_bookmarks()
215 215
216 216 def _get_bookmarks(self):
217 217 if self.is_empty():
218 218 return {}
219 219
220 220 def get_name(ctx):
221 221 return ctx[0]
222 222
223 223 _bookmarks = [
224 224 (safe_unicode(n), hexlify(h)) for n, h in
225 225 self._remote.bookmarks().items()]
226 226
227 227 return OrderedDict(sorted(_bookmarks, key=get_name))
228 228
229 229 def _get_all_commit_ids(self):
230 230 return self._remote.get_all_commit_ids('visible')
231 231
232 232 def get_diff(
233 233 self, commit1, commit2, path='', ignore_whitespace=False,
234 234 context=3, path1=None):
235 235 """
236 236 Returns (git like) *diff*, as plain text. Shows changes introduced by
237 237 `commit2` since `commit1`.
238 238
239 239 :param commit1: Entry point from which diff is shown. Can be
240 240 ``self.EMPTY_COMMIT`` - in this case, patch showing all
241 241 the changes since empty state of the repository until `commit2`
242 242 :param commit2: Until which commit changes should be shown.
243 243 :param ignore_whitespace: If set to ``True``, would not show whitespace
244 244 changes. Defaults to ``False``.
245 245 :param context: How many lines before/after changed lines should be
246 246 shown. Defaults to ``3``.
247 247 """
248 248 self._validate_diff_commits(commit1, commit2)
249 249 if path1 is not None and path1 != path:
250 250 raise ValueError("Diff of two different paths not supported.")
251 251
252 252 if path:
253 253 file_filter = [self.path, path]
254 254 else:
255 255 file_filter = None
256 256
257 257 diff = self._remote.diff(
258 258 commit1.raw_id, commit2.raw_id, file_filter=file_filter,
259 259 opt_git=True, opt_ignorews=ignore_whitespace,
260 260 context=context)
261 261 return MercurialDiff(diff)
262 262
263 263 def strip(self, commit_id, branch=None):
264 264 self._remote.strip(commit_id, update=False, backup="none")
265 265
266 266 self._remote.invalidate_vcs_cache()
267 267 self.commit_ids = self._get_all_commit_ids()
268 268 self._rebuild_cache(self.commit_ids)
269 269
270 270 def verify(self):
271 271 verify = self._remote.verify()
272 272
273 273 self._remote.invalidate_vcs_cache()
274 274 return verify
275 275
276 276 def get_common_ancestor(self, commit_id1, commit_id2, repo2):
277 277 if commit_id1 == commit_id2:
278 278 return commit_id1
279 279
280 280 ancestors = self._remote.revs_from_revspec(
281 281 "ancestor(id(%s), id(%s))", commit_id1, commit_id2,
282 282 other_path=repo2.path)
283 283 return repo2[ancestors[0]].raw_id if ancestors else None
284 284
285 285 def compare(self, commit_id1, commit_id2, repo2, merge, pre_load=None):
286 286 if commit_id1 == commit_id2:
287 287 commits = []
288 288 else:
289 289 if merge:
290 290 indexes = self._remote.revs_from_revspec(
291 291 "ancestors(id(%s)) - ancestors(id(%s)) - id(%s)",
292 292 commit_id2, commit_id1, commit_id1, other_path=repo2.path)
293 293 else:
294 294 indexes = self._remote.revs_from_revspec(
295 295 "id(%s)..id(%s) - id(%s)", commit_id1, commit_id2,
296 296 commit_id1, other_path=repo2.path)
297 297
298 298 commits = [repo2.get_commit(commit_idx=idx, pre_load=pre_load)
299 299 for idx in indexes]
300 300
301 301 return commits
302 302
303 303 @staticmethod
304 304 def check_url(url, config):
305 305 """
306 306 Function will check the given url and try to verify if it's a valid
307 307 link. Sometimes it may happen that Mercurial issues a basic
308 308 auth request that can cause the whole API to hang when used from Python
309 309 or other external calls.
310 310 
311 311 On failure it raises urllib2.HTTPError; the exception is also thrown
312 312 when the return code is not 200
313 313 """
314 314 # check first if it's not a local url
315 315 if os.path.isdir(url) or url.startswith('file:'):
316 316 return True
317 317
318 318 # Request the _remote to verify the url
319 319 return connection.Hg.check_url(url, config.serialize())
320 320
321 321 @staticmethod
322 322 def is_valid_repository(path):
323 323 return os.path.isdir(os.path.join(path, '.hg'))
324 324
325 325 def _init_repo(self, create, src_url=None, update_after_clone=False):
326 326 """
327 327 Function will check for a Mercurial repository in the given path. If
328 328 there is no repository in that path it will raise an exception, unless
329 329 the `create` parameter is set to True - in that case the repository
330 330 will be created.
331 331 
332 332 If `src_url` is given, it will try to clone the repository from that
333 333 location. Additionally, the working copy will be updated according
334 334 to the `update_after_clone` flag.
335 335 """
336 336 if create and os.path.exists(self.path):
337 337 raise RepositoryError(
338 338 "Cannot create repository at %s, location already exists"
339 339 % self.path)
340 340
341 341 if src_url:
342 342 url = str(self._get_url(src_url))
343 343 MercurialRepository.check_url(url, self.config)
344 344
345 345 self._remote.clone(url, self.path, update_after_clone)
346 346
347 347 # Don't try to create if we've already cloned repo
348 348 create = False
349 349
350 350 if create:
351 351 os.makedirs(self.path, mode=0755)
352 352
353 353 self._remote.localrepository(create)
354 354
355 355 @LazyProperty
356 356 def in_memory_commit(self):
357 357 return MercurialInMemoryCommit(self)
358 358
359 359 @LazyProperty
360 360 def description(self):
361 361 description = self._remote.get_config_value(
362 362 'web', 'description', untrusted=True)
363 363 return safe_unicode(description or self.DEFAULT_DESCRIPTION)
364 364
365 365 @LazyProperty
366 366 def contact(self):
367 367 contact = (
368 368 self._remote.get_config_value("web", "contact") or
369 369 self._remote.get_config_value("ui", "username"))
370 370 return safe_unicode(contact or self.DEFAULT_CONTACT)
371 371
372 372 @LazyProperty
373 373 def last_change(self):
374 374 """
375 375 Returns last change made on this repository as
376 376 `datetime.datetime` object.
377 377 """
378 378 try:
379 379 return self.get_commit().date
380 380 except RepositoryError:
381 381 tzoffset = makedate()[1]
382 382 return utcdate_fromtimestamp(self._get_fs_mtime(), tzoffset)
383 383
384 384 def _get_fs_mtime(self):
385 385 # fallback to filesystem
386 386 cl_path = os.path.join(self.path, '.hg', "00changelog.i")
387 387 st_path = os.path.join(self.path, '.hg', "store")
388 388 if os.path.exists(cl_path):
389 389 return os.stat(cl_path).st_mtime
390 390 else:
391 391 return os.stat(st_path).st_mtime
392 392
393 393 def _sanitize_commit_idx(self, idx):
394 394 # Note: Mercurial has ``int(-1)`` reserved as a non-existing id_or_idx
395 395 # number. A `long` is treated in the correct way though. So we convert
396 396 # `int` to `long` here to make sure it is handled correctly.
397 397 if isinstance(idx, int):
398 398 return long(idx)
399 399 return idx
400 400
401 401 def _get_url(self, url):
402 402 """
403 403 Returns the normalized url. If no scheme is given, falls back
404 404 to the filesystem
405 405 (``file:///``) scheme.
406 406 """
407 407 url = url.encode('utf8')
408 408 if url != 'default' and '://' not in url:
409 409 url = "file:" + urllib.pathname2url(url)
410 410 return url
411 411
412 412 def get_hook_location(self):
413 413 """
414 414 returns absolute path to location where hooks are stored
415 415 """
416 416 return os.path.join(self.path, '.hg', '.hgrc')
417 417
418 418 def get_commit(self, commit_id=None, commit_idx=None, pre_load=None):
419 419 """
420 420 Returns ``MercurialCommit`` object representing repository's
421 421 commit at the given `commit_id` or `commit_idx`.
422 422 """
423 423 if self.is_empty():
424 424 raise EmptyRepositoryError("There are no commits yet")
425 425
426 426 if commit_id is not None:
427 427 self._validate_commit_id(commit_id)
428 428 try:
429 429 idx = self._commit_ids[commit_id]
430 430 return MercurialCommit(self, commit_id, idx, pre_load=pre_load)
431 431 except KeyError:
432 432 pass
433 433 elif commit_idx is not None:
434 434 self._validate_commit_idx(commit_idx)
435 435 commit_idx = self._sanitize_commit_idx(commit_idx)
436 436 try:
437 437 id_ = self.commit_ids[commit_idx]
438 438 if commit_idx < 0:
439 439 commit_idx += len(self.commit_ids)
440 440 return MercurialCommit(
441 441 self, id_, commit_idx, pre_load=pre_load)
442 442 except IndexError:
443 443 commit_id = commit_idx
444 444 else:
445 445 commit_id = "tip"
446 446
447 447 # TODO Paris: Ugly hack to "serialize" long for msgpack
448 448 if isinstance(commit_id, long):
449 449 commit_id = float(commit_id)
450 450
451 451 if isinstance(commit_id, unicode):
452 452 commit_id = safe_str(commit_id)
453 453
454 454 try:
455 455 raw_id, idx = self._remote.lookup(commit_id, both=True)
456 456 except CommitDoesNotExistError:
457 457 msg = "Commit %s does not exist for %s" % (
458 458 commit_id, self)
459 459 raise CommitDoesNotExistError(msg)
460 460
461 461 return MercurialCommit(self, raw_id, idx, pre_load=pre_load)
462 462
463 463 def get_commits(
464 464 self, start_id=None, end_id=None, start_date=None, end_date=None,
465 465 branch_name=None, pre_load=None):
466 466 """
467 467 Returns generator of ``MercurialCommit`` objects from start to end
468 468 (both are inclusive)
469 469
470 470 :param start_id: None, str(commit_id)
471 471 :param end_id: None, str(commit_id)
472 472 :param start_date: if specified, commits with commit date less than
473 473 ``start_date`` would be filtered out from returned set
474 474 :param end_date: if specified, commits with commit date greater than
475 475 ``end_date`` would be filtered out from returned set
476 476 :param branch_name: if specified, commits not reachable from given
477 477 branch would be filtered out from returned set
478 478
479 479 :raise BranchDoesNotExistError: If given ``branch_name`` does not
480 480 exist.
481 481 :raise CommitDoesNotExistError: If commit for given ``start`` or
482 482 ``end`` could not be found.
483 483 """
484 484 # actually we should check now if it's not an empty repo
485 485 branch_ancestors = False
486 486 if self.is_empty():
487 487 raise EmptyRepositoryError("There are no commits yet")
488 488 self._validate_branch_name(branch_name)
489 489
490 490 if start_id is not None:
491 491 self._validate_commit_id(start_id)
492 492 c_start = self.get_commit(commit_id=start_id)
493 493 start_pos = self._commit_ids[c_start.raw_id]
494 494 else:
495 495 start_pos = None
496 496
497 497 if end_id is not None:
498 498 self._validate_commit_id(end_id)
499 499 c_end = self.get_commit(commit_id=end_id)
500 500 end_pos = max(0, self._commit_ids[c_end.raw_id])
501 501 else:
502 502 end_pos = None
503 503
504 504 if None not in [start_id, end_id] and start_pos > end_pos:
505 505 raise RepositoryError(
506 506 "Start commit '%s' cannot be after end commit '%s'" %
507 507 (start_id, end_id))
508 508
509 509 if end_pos is not None:
510 510 end_pos += 1
511 511
512 512 commit_filter = []
513 513 if branch_name and not branch_ancestors:
514 514 commit_filter.append('branch("%s")' % branch_name)
515 515 elif branch_name and branch_ancestors:
516 516 commit_filter.append('ancestors(branch("%s"))' % branch_name)
517 517 if start_date and not end_date:
518 518 commit_filter.append('date(">%s")' % start_date)
519 519 if end_date and not start_date:
520 520 commit_filter.append('date("<%s")' % end_date)
521 521 if start_date and end_date:
522 522 commit_filter.append(
523 523 'date(">%s") and date("<%s")' % (start_date, end_date))
524 524
525 525 # TODO: johbo: Figure out a simpler way for this solution
526 526 collection_generator = CollectionGenerator
527 527 if commit_filter:
528 528 commit_filter = map(safe_str, commit_filter)
529 529 revisions = self._remote.rev_range(commit_filter)
530 530 collection_generator = MercurialIndexBasedCollectionGenerator
531 531 else:
532 532 revisions = self.commit_ids
533 533
534 534 if start_pos or end_pos:
535 535 revisions = revisions[start_pos:end_pos]
536 536
537 537 return collection_generator(self, revisions, pre_load=pre_load)
538 538
539 539 def pull(self, url, commit_ids=None):
540 540 """
541 541 Tries to pull changes from external location.
542 542
543 543 :param commit_ids: Optional. Can be set to a list of commit ids
544 544 which shall be pulled from the other repository.
545 545 """
546 546 url = self._get_url(url)
547 547 self._remote.pull(url, commit_ids=commit_ids)
548 548 self._remote.invalidate_vcs_cache()
549 549
550 550 def _local_clone(self, clone_path):
551 551 """
552 552 Create a local clone of the current repo.
553 553 """
554 554 self._remote.clone(self.path, clone_path, update_after_clone=True,
555 555 hooks=False)
556 556
557 557 def _update(self, revision, clean=False):
558 558 """
559 559 Update the working copy to the specified revision.
560 560 """
561 561 self._remote.update(revision, clean=clean)
562 562
563 563 def _identify(self):
564 564 """
565 565 Return the current state of the working directory.
566 566 """
567 567 return self._remote.identify().strip().rstrip('+')
568 568
569 569 def _heads(self, branch=None):
570 570 """
571 571 Return the commit ids of the repository heads.
572 572 """
573 573 return self._remote.heads(branch=branch).strip().split(' ')
574 574
575 575 def _ancestor(self, revision1, revision2):
576 576 """
577 577 Return the common ancestor of the two revisions.
578 578 """
579 579 return self._remote.ancestor(revision1, revision2)
580 580
581 581 def _local_push(
582 582 self, revision, repository_path, push_branches=False,
583 583 enable_hooks=False):
584 584 """
585 585 Push the given revision to the specified repository.
586 586
587 587 :param push_branches: allow creating branches in the target repo.
588 588 """
589 589 self._remote.push(
590 590 [revision], repository_path, hooks=enable_hooks,
591 591 push_branches=push_branches)
592 592
593 593 def _local_merge(self, target_ref, merge_message, user_name, user_email,
594 594 source_ref, use_rebase=False):
595 595 """
596 596 Merge the given source_revision into the checked out revision.
597 597
598 598 Returns the commit id of the merge and a boolean indicating if the
599 599 commit needs to be pushed.
600 600 """
601 601 self._update(target_ref.commit_id)
602 602
603 603 ancestor = self._ancestor(target_ref.commit_id, source_ref.commit_id)
604 604 is_the_same_branch = self._is_the_same_branch(target_ref, source_ref)
605 605
606 606 if ancestor == source_ref.commit_id:
607 607 # Nothing to do, the changes were already integrated
608 608 return target_ref.commit_id, False
609 609
610 610 elif ancestor == target_ref.commit_id and is_the_same_branch:
611 611 # In this case we should force a commit message
612 612 return source_ref.commit_id, True
613 613
614 614 if use_rebase:
615 615 try:
616 616 bookmark_name = 'rcbook%s%s' % (source_ref.commit_id,
617 617 target_ref.commit_id)
618 618 self.bookmark(bookmark_name, revision=source_ref.commit_id)
619 619 self._remote.rebase(
620 620 source=source_ref.commit_id, dest=target_ref.commit_id)
621 621 self._remote.invalidate_vcs_cache()
622 622 self._update(bookmark_name)
623 623 return self._identify(), True
624 624 except RepositoryError:
625 625 # The rebase-abort may raise another exception which 'hides'
626 626 # the original one, therefore we log it here.
627 627 log.exception('Error while rebasing shadow repo during merge.')
628 628
629 629 # Cleanup any rebase leftovers
630 630 self._remote.invalidate_vcs_cache()
631 631 self._remote.rebase(abort=True)
632 632 self._remote.invalidate_vcs_cache()
633 633 self._remote.update(clean=True)
634 634 raise
635 635 else:
636 636 try:
637 637 self._remote.merge(source_ref.commit_id)
638 638 self._remote.invalidate_vcs_cache()
639 639 self._remote.commit(
640 640 message=safe_str(merge_message),
641 641 username=safe_str('%s <%s>' % (user_name, user_email)))
642 642 self._remote.invalidate_vcs_cache()
643 643 return self._identify(), True
644 644 except RepositoryError:
645 645 # Cleanup any merge leftovers
646 646 self._remote.update(clean=True)
647 647 raise
648 648
649 def _local_close(self, target_ref, user_name, user_email,
650 source_ref, close_message=''):
651 """
652 Close the branch of the given source_revision.
653 
654 Returns the commit id of the closing commit and a boolean indicating if the
655 commit needs to be pushed.
656 """
657 self._update(target_ref.commit_id)
658 message = close_message or "Closing branch"
659 try:
660 self._remote.commit(
661 message=safe_str(message),
662 username=safe_str('%s <%s>' % (user_name, user_email)),
663 close_branch=True)
664 self._remote.invalidate_vcs_cache()
665 return self._identify(), True
666 except RepositoryError:
667 # Cleanup any commit leftovers
668 self._remote.update(clean=True)
669 raise
670
649 671 def _is_the_same_branch(self, target_ref, source_ref):
650 672 return (
651 673 self._get_branch_name(target_ref) ==
652 674 self._get_branch_name(source_ref))
653 675
654 676 def _get_branch_name(self, ref):
655 677 if ref.type == 'branch':
656 678 return ref.name
657 679 return self._remote.ctx_branch(ref.commit_id)
658 680
659 681 def _get_shadow_repository_path(self, workspace_id):
660 682 # The name of the shadow repository must start with '.', so it is
661 683 # skipped by 'rhodecode.lib.utils.get_filesystem_repos'.
662 684 return os.path.join(
663 685 os.path.dirname(self.path),
664 686 '.__shadow_%s_%s' % (os.path.basename(self.path), workspace_id))
665 687
666 688 def _maybe_prepare_merge_workspace(self, workspace_id, unused_target_ref):
667 689 shadow_repository_path = self._get_shadow_repository_path(workspace_id)
668 690 if not os.path.exists(shadow_repository_path):
669 691 self._local_clone(shadow_repository_path)
670 692 log.debug(
671 693 'Prepared shadow repository in %s', shadow_repository_path)
672 694
673 695 return shadow_repository_path
674 696
675 697 def cleanup_merge_workspace(self, workspace_id):
676 698 shadow_repository_path = self._get_shadow_repository_path(workspace_id)
677 699 shutil.rmtree(shadow_repository_path, ignore_errors=True)
678 700
679 701 def _merge_repo(self, shadow_repository_path, target_ref,
680 702 source_repo, source_ref, merge_message,
681 703 merger_name, merger_email, dry_run=False,
682 use_rebase=False):
704 use_rebase=False, close_branch=False):
683 705 if target_ref.commit_id not in self._heads():
684 706 return MergeResponse(
685 707 False, False, None, MergeFailureReason.TARGET_IS_NOT_HEAD)
686 708
687 709 try:
688 710 if (target_ref.type == 'branch' and
689 711 len(self._heads(target_ref.name)) != 1):
690 712 return MergeResponse(
691 713 False, False, None,
692 714 MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS)
693 715 except CommitDoesNotExistError as e:
694 716 log.exception('Failure when looking up branch heads on hg target')
695 717 return MergeResponse(
696 718 False, False, None, MergeFailureReason.MISSING_TARGET_REF)
697 719
698 720 shadow_repo = self._get_shadow_instance(shadow_repository_path)
699 721
700 722 log.debug('Pulling in target reference %s', target_ref)
701 723 self._validate_pull_reference(target_ref)
702 724 shadow_repo._local_pull(self.path, target_ref)
703 725 try:
704 726 log.debug('Pulling in source reference %s', source_ref)
705 727 source_repo._validate_pull_reference(source_ref)
706 728 shadow_repo._local_pull(source_repo.path, source_ref)
707 729 except CommitDoesNotExistError:
708 730 log.exception('Failure when doing local pull on hg shadow repo')
709 731 return MergeResponse(
710 732 False, False, None, MergeFailureReason.MISSING_SOURCE_REF)
711 733
712 734 merge_ref = None
713 735 merge_failure_reason = MergeFailureReason.NONE
714 736
715 try:
716 merge_commit_id, needs_push = shadow_repo._local_merge(
717 target_ref, merge_message, merger_name, merger_email,
718 source_ref, use_rebase=use_rebase)
737 if close_branch and not use_rebase:
738 try:
739 close_commit_id, needs_push = shadow_repo._local_close(
740 target_ref, merger_name, merger_email, source_ref)
741 target_ref.commit_id = close_commit_id
742 merge_possible = True
743 except RepositoryError:
744 log.exception('Failure when doing close branch on hg shadow repo')
745 merge_possible = False
746 merge_failure_reason = MergeFailureReason.MERGE_FAILED
747 else:
719 748 merge_possible = True
720 749
721 # Set a bookmark pointing to the merge commit. This bookmark may be
722 # used to easily identify the last successful merge commit in the
723 # shadow repository.
724 shadow_repo.bookmark('pr-merge', revision=merge_commit_id)
725 merge_ref = Reference('book', 'pr-merge', merge_commit_id)
726 except SubrepoMergeError:
727 log.exception(
728 'Subrepo merge error during local merge on hg shadow repo.')
729 merge_possible = False
730 merge_failure_reason = MergeFailureReason.SUBREPO_MERGE_FAILED
731 except RepositoryError:
732 log.exception('Failure when doing local merge on hg shadow repo')
733 merge_possible = False
734 merge_failure_reason = MergeFailureReason.MERGE_FAILED
750 if merge_possible:
751 try:
752 merge_commit_id, needs_push = shadow_repo._local_merge(
753 target_ref, merge_message, merger_name, merger_email,
754 source_ref, use_rebase=use_rebase)
755 merge_possible = True
756
757 # Set a bookmark pointing to the merge commit. This bookmark may be
758 # used to easily identify the last successful merge commit in the
759 # shadow repository.
760 shadow_repo.bookmark('pr-merge', revision=merge_commit_id)
761 merge_ref = Reference('book', 'pr-merge', merge_commit_id)
762 except SubrepoMergeError:
763 log.exception(
764 'Subrepo merge error during local merge on hg shadow repo.')
765 merge_possible = False
766 merge_failure_reason = MergeFailureReason.SUBREPO_MERGE_FAILED
767 except RepositoryError:
768 log.exception('Failure when doing local merge on hg shadow repo')
769 merge_possible = False
770 merge_failure_reason = MergeFailureReason.MERGE_FAILED
735 771
736 772 if merge_possible and not dry_run:
737 773 if needs_push:
738 774 # In case the target is a bookmark, update it, so after pushing
739 775 # the bookmark is also updated in the target.
740 776 if target_ref.type == 'book':
741 777 shadow_repo.bookmark(
742 778 target_ref.name, revision=merge_commit_id)
743 779
744 780 try:
745 781 shadow_repo_with_hooks = self._get_shadow_instance(
746 782 shadow_repository_path,
747 783 enable_hooks=True)
748 784 # Note: the push_branches option will push any new branch
749 785 # defined in the source repository to the target. This may
750 786 # be dangerous as branches are permanent in Mercurial.
751 787 # This feature was requested in issue #441.
752 788 shadow_repo_with_hooks._local_push(
753 789 merge_commit_id, self.path, push_branches=True,
754 790 enable_hooks=True)
755 791 merge_succeeded = True
756 792 except RepositoryError:
757 793 log.exception(
758 794 'Failure when doing local push from the shadow '
759 795 'repository to the target repository.')
760 796 merge_succeeded = False
761 797 merge_failure_reason = MergeFailureReason.PUSH_FAILED
762 798 else:
763 799 merge_succeeded = True
764 800 else:
765 801 merge_succeeded = False
766 802
767 803 return MergeResponse(
768 804 merge_possible, merge_succeeded, merge_ref, merge_failure_reason)
769 805
770 806 def _get_shadow_instance(
771 807 self, shadow_repository_path, enable_hooks=False):
772 808 config = self.config.copy()
773 809 if not enable_hooks:
774 810 config.clear_section('hooks')
775 811 return MercurialRepository(shadow_repository_path, config)
776 812
777 813 def _validate_pull_reference(self, reference):
778 814 if not (reference.name in self.bookmarks or
779 815 reference.name in self.branches or
780 816 self.get_commit(reference.commit_id)):
781 817 raise CommitDoesNotExistError(
782 818 'Unknown branch, bookmark or commit id')
783 819
784 820 def _local_pull(self, repository_path, reference):
785 821 """
786 822 Fetch a branch, bookmark or commit from a local repository.
787 823 """
788 824 repository_path = os.path.abspath(repository_path)
789 825 if repository_path == self.path:
790 826 raise ValueError('Cannot pull from the same repository')
791 827
792 828 reference_type_to_option_name = {
793 829 'book': 'bookmark',
794 830 'branch': 'branch',
795 831 }
796 832 option_name = reference_type_to_option_name.get(
797 833 reference.type, 'revision')
798 834
799 835 if option_name == 'revision':
800 836 ref = reference.commit_id
801 837 else:
802 838 ref = reference.name
803 839
804 840 options = {option_name: [ref]}
805 841 self._remote.pull_cmd(repository_path, hooks=False, **options)
806 842 self._remote.invalidate_vcs_cache()
807 843
808 844 def bookmark(self, bookmark, revision=None):
809 845 if isinstance(bookmark, unicode):
810 846 bookmark = safe_str(bookmark)
811 847 self._remote.bookmark(bookmark, revision=revision)
812 848 self._remote.invalidate_vcs_cache()
813 849
814 850
815 851 class MercurialIndexBasedCollectionGenerator(CollectionGenerator):
816 852
817 853 def _commit_factory(self, commit_id):
818 854 return self.repo.get_commit(
819 855 commit_idx=commit_id, pre_load=self.pre_load)
@@ -1,565 +1,567 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2017 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 This module contains form validation classes.
23 23 See http://formencode.org/module-formencode.validators.html
24 24 for a list of all available validators.
25 25
26 26 We can create our own validators.
27 27
28 28 The table below outlines the options which can be used in a schema in addition to the validators themselves
29 29 pre_validators [] These validators will be applied before the schema
30 30 chained_validators [] These validators will be applied after the schema
31 31 allow_extra_fields False If True, then it is not an error when keys that aren't associated with a validator are present
32 32 filter_extra_fields False If True, then keys that aren't associated with a validator are removed
33 33 if_key_missing NoDefault If this is given, then any keys that aren't available but are expected will be replaced with this value (and then validated). This does not override a present .if_missing attribute on validators. NoDefault is a special FormEncode class to mean that no default value has been specified and therefore missing keys shouldn't take a default value.
34 34 ignore_key_missing False If True, then missing keys will be missing in the result, if the validator doesn't have .if_missing on it already
35 35
36 36
37 37 <name> = formencode.validators.<name of validator>
38 38 <name> must equal form name
39 39 list=[1,2,3,4,5]
40 40 for SELECT use formencode.All(OneOf(list), Int())
41 41
42 42 """
43 43
44 44 import deform
45 45 import logging
46 46 import formencode
47 47
48 48 from pkg_resources import resource_filename
49 49 from formencode import All, Pipe
50 50
51 51 from pylons.i18n.translation import _
52 52 from pyramid.threadlocal import get_current_request
53 53
54 54 from rhodecode import BACKENDS
55 55 from rhodecode.lib import helpers
56 56 from rhodecode.model import validators as v
57 57
58 58 log = logging.getLogger(__name__)
59 59
60 60
61 61 deform_templates = resource_filename('deform', 'templates')
62 62 rhodecode_templates = resource_filename('rhodecode', 'templates/forms')
63 63 search_path = (rhodecode_templates, deform_templates)
64 64
65 65
66 66 class RhodecodeFormZPTRendererFactory(deform.ZPTRendererFactory):
67 67 """ Subclass of ZPTRendererFactory to add rhodecode context variables """
68 68 def __call__(self, template_name, **kw):
69 69 kw['h'] = helpers
70 70 kw['request'] = get_current_request()
71 71 return self.load(template_name)(**kw)
72 72
73 73
74 74 form_renderer = RhodecodeFormZPTRendererFactory(search_path)
75 75 deform.Form.set_default_renderer(form_renderer)
76 76
77 77
78 78 def LoginForm():
79 79 class _LoginForm(formencode.Schema):
80 80 allow_extra_fields = True
81 81 filter_extra_fields = True
82 82 username = v.UnicodeString(
83 83 strip=True,
84 84 min=1,
85 85 not_empty=True,
86 86 messages={
87 87 'empty': _(u'Please enter a login'),
88 88 'tooShort': _(u'Enter a value %(min)i characters long or more')
89 89 }
90 90 )
91 91
92 92 password = v.UnicodeString(
93 93 strip=False,
94 94 min=3,
95 95 not_empty=True,
96 96 messages={
97 97 'empty': _(u'Please enter a password'),
98 98 'tooShort': _(u'Enter %(min)i characters or more')}
99 99 )
100 100
101 101 remember = v.StringBoolean(if_missing=False)
102 102
103 103 chained_validators = [v.ValidAuth()]
104 104 return _LoginForm
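Every form in this module follows the same pattern: a factory function returns a freshly built schema class so per-request arguments (`edit`, `old_data`, translated messages) are closed over at call time. A plain-Python sketch of the pattern (`make_password_form` and `to_python` are illustrative names, not the formencode API):

```python
# Factory returning a class, not an instance: min_len is closed over
# when the form is built, mirroring how LoginForm()/UserForm() work.
def make_password_form(min_len):
    class _PasswordForm(object):
        @staticmethod
        def to_python(value):
            if len(value) < min_len:
                raise ValueError('Enter %d characters or more' % min_len)
            return value
    return _PasswordForm


LoginPassword = make_password_form(3)
print(LoginPassword.to_python('secret'))  # secret
```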
105 105
106 106
107 107 def UserForm(edit=False, available_languages=[], old_data={}):
108 108 class _UserForm(formencode.Schema):
109 109 allow_extra_fields = True
110 110 filter_extra_fields = True
111 111 username = All(v.UnicodeString(strip=True, min=1, not_empty=True),
112 112 v.ValidUsername(edit, old_data))
113 113 if edit:
114 114 new_password = All(
115 115 v.ValidPassword(),
116 116 v.UnicodeString(strip=False, min=6, not_empty=False)
117 117 )
118 118 password_confirmation = All(
119 119 v.ValidPassword(),
120 120 v.UnicodeString(strip=False, min=6, not_empty=False),
121 121 )
122 122 admin = v.StringBoolean(if_missing=False)
123 123 else:
124 124 password = All(
125 125 v.ValidPassword(),
126 126 v.UnicodeString(strip=False, min=6, not_empty=True)
127 127 )
128 128 password_confirmation = All(
129 129 v.ValidPassword(),
130 130 v.UnicodeString(strip=False, min=6, not_empty=False)
131 131 )
132 132
133 133 password_change = v.StringBoolean(if_missing=False)
134 134 create_repo_group = v.StringBoolean(if_missing=False)
135 135
136 136 active = v.StringBoolean(if_missing=False)
137 137 firstname = v.UnicodeString(strip=True, min=1, not_empty=False)
138 138 lastname = v.UnicodeString(strip=True, min=1, not_empty=False)
139 139 email = All(v.Email(not_empty=True), v.UniqSystemEmail(old_data))
140 140 extern_name = v.UnicodeString(strip=True)
141 141 extern_type = v.UnicodeString(strip=True)
142 142 language = v.OneOf(available_languages, hideList=False,
143 143 testValueList=True, if_missing=None)
144 144 chained_validators = [v.ValidPasswordsMatch()]
145 145 return _UserForm
146 146
147 147
148 148 def UserGroupForm(edit=False, old_data=None, allow_disabled=False):
149 149 old_data = old_data or {}
150 150
151 151 class _UserGroupForm(formencode.Schema):
152 152 allow_extra_fields = True
153 153 filter_extra_fields = True
154 154
155 155 users_group_name = All(
156 156 v.UnicodeString(strip=True, min=1, not_empty=True),
157 157 v.ValidUserGroup(edit, old_data)
158 158 )
159 159 user_group_description = v.UnicodeString(strip=True, min=1,
160 160 not_empty=False)
161 161
162 162 users_group_active = v.StringBoolean(if_missing=False)
163 163
164 164 if edit:
165 165 # this is user group owner
166 166 user = All(
167 167 v.UnicodeString(not_empty=True),
168 168 v.ValidRepoUser(allow_disabled))
169 169 return _UserGroupForm
170 170
171 171
172 172 def RepoGroupForm(edit=False, old_data=None, available_groups=None,
173 173 can_create_in_root=False, allow_disabled=False):
174 174 old_data = old_data or {}
175 175 available_groups = available_groups or []
176 176
177 177 class _RepoGroupForm(formencode.Schema):
178 178 allow_extra_fields = True
179 179 filter_extra_fields = False
180 180
181 181 group_name = All(v.UnicodeString(strip=True, min=1, not_empty=True),
182 182 v.SlugifyName(),)
183 183 group_description = v.UnicodeString(strip=True, min=1,
184 184 not_empty=False)
185 185 group_copy_permissions = v.StringBoolean(if_missing=False)
186 186
187 187 group_parent_id = v.OneOf(available_groups, hideList=False,
188 188 testValueList=True, not_empty=True)
189 189 enable_locking = v.StringBoolean(if_missing=False)
190 190 chained_validators = [
191 191 v.ValidRepoGroup(edit, old_data, can_create_in_root)]
192 192
193 193 if edit:
194 194 # this is repo group owner
195 195 user = All(
196 196 v.UnicodeString(not_empty=True),
197 197 v.ValidRepoUser(allow_disabled))
198 198
199 199 return _RepoGroupForm
200 200
201 201
202 202 def RegisterForm(edit=False, old_data={}):
203 203 class _RegisterForm(formencode.Schema):
204 204 allow_extra_fields = True
205 205 filter_extra_fields = True
206 206 username = All(
207 207 v.ValidUsername(edit, old_data),
208 208 v.UnicodeString(strip=True, min=1, not_empty=True)
209 209 )
210 210 password = All(
211 211 v.ValidPassword(),
212 212 v.UnicodeString(strip=False, min=6, not_empty=True)
213 213 )
214 214 password_confirmation = All(
215 215 v.ValidPassword(),
216 216 v.UnicodeString(strip=False, min=6, not_empty=True)
217 217 )
218 218 active = v.StringBoolean(if_missing=False)
219 219 firstname = v.UnicodeString(strip=True, min=1, not_empty=False)
220 220 lastname = v.UnicodeString(strip=True, min=1, not_empty=False)
221 221 email = All(v.Email(not_empty=True), v.UniqSystemEmail(old_data))
222 222
223 223 chained_validators = [v.ValidPasswordsMatch()]
224 224
225 225 return _RegisterForm
226 226
227 227
228 228 def PasswordResetForm():
229 229 class _PasswordResetForm(formencode.Schema):
230 230 allow_extra_fields = True
231 231 filter_extra_fields = True
232 232 email = All(v.ValidSystemEmail(), v.Email(not_empty=True))
233 233 return _PasswordResetForm
234 234
235 235
236 236 def RepoForm(edit=False, old_data=None, repo_groups=None, landing_revs=None,
237 237 allow_disabled=False):
238 238 old_data = old_data or {}
239 239 repo_groups = repo_groups or []
240 240 landing_revs = landing_revs or []
241 241 supported_backends = BACKENDS.keys()
242 242
243 243 class _RepoForm(formencode.Schema):
244 244 allow_extra_fields = True
245 245 filter_extra_fields = False
246 246 repo_name = All(v.UnicodeString(strip=True, min=1, not_empty=True),
247 247 v.SlugifyName(), v.CannotHaveGitSuffix())
248 248 repo_group = All(v.CanWriteGroup(old_data),
249 249 v.OneOf(repo_groups, hideList=True))
250 250 repo_type = v.OneOf(supported_backends, required=False,
251 251 if_missing=old_data.get('repo_type'))
252 252 repo_description = v.UnicodeString(strip=True, min=1, not_empty=False)
253 253 repo_private = v.StringBoolean(if_missing=False)
254 254 repo_landing_rev = v.OneOf(landing_revs, hideList=True)
255 255 repo_copy_permissions = v.StringBoolean(if_missing=False)
256 256 clone_uri = All(v.UnicodeString(strip=True, min=1, not_empty=False))
257 257
258 258 repo_enable_statistics = v.StringBoolean(if_missing=False)
259 259 repo_enable_downloads = v.StringBoolean(if_missing=False)
260 260 repo_enable_locking = v.StringBoolean(if_missing=False)
261 261
262 262 if edit:
263 263 # this is repo owner
264 264 user = All(
265 265 v.UnicodeString(not_empty=True),
266 266 v.ValidRepoUser(allow_disabled))
267 267 clone_uri_change = v.UnicodeString(
268 268 not_empty=False, if_missing=v.Missing)
269 269
270 270 chained_validators = [v.ValidCloneUri(),
271 271 v.ValidRepoName(edit, old_data)]
272 272 return _RepoForm
273 273
274 274
275 275 def RepoPermsForm():
276 276 class _RepoPermsForm(formencode.Schema):
277 277 allow_extra_fields = True
278 278 filter_extra_fields = False
279 279 chained_validators = [v.ValidPerms(type_='repo')]
280 280 return _RepoPermsForm
281 281
282 282
283 283 def RepoGroupPermsForm(valid_recursive_choices):
284 284 class _RepoGroupPermsForm(formencode.Schema):
285 285 allow_extra_fields = True
286 286 filter_extra_fields = False
287 287 recursive = v.OneOf(valid_recursive_choices)
288 288 chained_validators = [v.ValidPerms(type_='repo_group')]
289 289 return _RepoGroupPermsForm
290 290
291 291
292 292 def UserGroupPermsForm():
293 293 class _UserPermsForm(formencode.Schema):
294 294 allow_extra_fields = True
295 295 filter_extra_fields = False
296 296 chained_validators = [v.ValidPerms(type_='user_group')]
297 297 return _UserPermsForm
298 298
299 299
300 300 def RepoFieldForm():
301 301 class _RepoFieldForm(formencode.Schema):
302 302 filter_extra_fields = True
303 303 allow_extra_fields = True
304 304
305 305 new_field_key = All(v.FieldKey(),
306 306 v.UnicodeString(strip=True, min=3, not_empty=True))
307 307 new_field_value = v.UnicodeString(not_empty=False, if_missing=u'')
308 308 new_field_type = v.OneOf(['str', 'unicode', 'list', 'tuple'],
309 309 if_missing='str')
310 310 new_field_label = v.UnicodeString(not_empty=False)
311 311 new_field_desc = v.UnicodeString(not_empty=False)
312 312
313 313 return _RepoFieldForm
314 314
315 315
316 316 def RepoForkForm(edit=False, old_data={}, supported_backends=BACKENDS.keys(),
317 317 repo_groups=[], landing_revs=[]):
318 318 class _RepoForkForm(formencode.Schema):
319 319 allow_extra_fields = True
320 320 filter_extra_fields = False
321 321 repo_name = All(v.UnicodeString(strip=True, min=1, not_empty=True),
322 322 v.SlugifyName())
323 323 repo_group = All(v.CanWriteGroup(),
324 324 v.OneOf(repo_groups, hideList=True))
325 325 repo_type = All(v.ValidForkType(old_data), v.OneOf(supported_backends))
326 326 description = v.UnicodeString(strip=True, min=1, not_empty=True)
327 327 private = v.StringBoolean(if_missing=False)
328 328 copy_permissions = v.StringBoolean(if_missing=False)
329 329 fork_parent_id = v.UnicodeString()
330 330 chained_validators = [v.ValidForkName(edit, old_data)]
331 331 landing_rev = v.OneOf(landing_revs, hideList=True)
332 332
333 333 return _RepoForkForm
334 334
335 335
336 336 def ApplicationSettingsForm():
337 337 class _ApplicationSettingsForm(formencode.Schema):
338 338 allow_extra_fields = True
339 339 filter_extra_fields = False
340 340 rhodecode_title = v.UnicodeString(strip=True, max=40, not_empty=False)
341 341 rhodecode_realm = v.UnicodeString(strip=True, min=1, not_empty=True)
342 342 rhodecode_pre_code = v.UnicodeString(strip=True, min=1, not_empty=False)
343 343 rhodecode_post_code = v.UnicodeString(strip=True, min=1, not_empty=False)
344 344 rhodecode_captcha_public_key = v.UnicodeString(strip=True, min=1, not_empty=False)
345 345 rhodecode_captcha_private_key = v.UnicodeString(strip=True, min=1, not_empty=False)
346 346 rhodecode_create_personal_repo_group = v.StringBoolean(if_missing=False)
347 347 rhodecode_personal_repo_group_pattern = v.UnicodeString(strip=True, min=1, not_empty=False)
348 348
349 349 return _ApplicationSettingsForm
350 350
351 351
352 352 def ApplicationVisualisationForm():
353 353 class _ApplicationVisualisationForm(formencode.Schema):
354 354 allow_extra_fields = True
355 355 filter_extra_fields = False
356 356 rhodecode_show_public_icon = v.StringBoolean(if_missing=False)
357 357 rhodecode_show_private_icon = v.StringBoolean(if_missing=False)
358 358 rhodecode_stylify_metatags = v.StringBoolean(if_missing=False)
359 359
360 360 rhodecode_repository_fields = v.StringBoolean(if_missing=False)
361 361 rhodecode_lightweight_journal = v.StringBoolean(if_missing=False)
362 362 rhodecode_dashboard_items = v.Int(min=5, not_empty=True)
363 363 rhodecode_admin_grid_items = v.Int(min=5, not_empty=True)
364 364 rhodecode_show_version = v.StringBoolean(if_missing=False)
365 365 rhodecode_use_gravatar = v.StringBoolean(if_missing=False)
366 366 rhodecode_markup_renderer = v.OneOf(['markdown', 'rst'])
367 367 rhodecode_gravatar_url = v.UnicodeString(min=3)
368 368 rhodecode_clone_uri_tmpl = v.UnicodeString(min=3)
369 369 rhodecode_support_url = v.UnicodeString()
370 370 rhodecode_show_revision_number = v.StringBoolean(if_missing=False)
371 371 rhodecode_show_sha_length = v.Int(min=4, not_empty=True)
372 372
373 373 return _ApplicationVisualisationForm
374 374
375 375
376 376 class _BaseVcsSettingsForm(formencode.Schema):
377 377 allow_extra_fields = True
378 378 filter_extra_fields = False
379 379 hooks_changegroup_repo_size = v.StringBoolean(if_missing=False)
380 380 hooks_changegroup_push_logger = v.StringBoolean(if_missing=False)
381 381 hooks_outgoing_pull_logger = v.StringBoolean(if_missing=False)
382 382
383 383 # PR/Code-review
384 384 rhodecode_pr_merge_enabled = v.StringBoolean(if_missing=False)
385 385 rhodecode_use_outdated_comments = v.StringBoolean(if_missing=False)
386 386
387 387 # hg
388 388 extensions_largefiles = v.StringBoolean(if_missing=False)
389 389 extensions_evolve = v.StringBoolean(if_missing=False)
390 390 phases_publish = v.StringBoolean(if_missing=False)
391
391 392 rhodecode_hg_use_rebase_for_merging = v.StringBoolean(if_missing=False)
393 rhodecode_hg_close_branch_before_merging = v.StringBoolean(if_missing=False)
392 394
393 395 # git
394 396 vcs_git_lfs_enabled = v.StringBoolean(if_missing=False)
395 397
396 398 # svn
397 399 vcs_svn_proxy_http_requests_enabled = v.StringBoolean(if_missing=False)
398 400 vcs_svn_proxy_http_server_url = v.UnicodeString(strip=True, if_missing=None)
399 401
400 402
401 403 def ApplicationUiSettingsForm():
402 404 class _ApplicationUiSettingsForm(_BaseVcsSettingsForm):
403 405 web_push_ssl = v.StringBoolean(if_missing=False)
404 406 paths_root_path = All(
405 407 v.ValidPath(),
406 408 v.UnicodeString(strip=True, min=1, not_empty=True)
407 409 )
408 410 largefiles_usercache = All(
409 411 v.ValidPath(),
410 412 v.UnicodeString(strip=True, min=2, not_empty=True))
411 413 vcs_git_lfs_store_location = All(
412 414 v.ValidPath(),
413 415 v.UnicodeString(strip=True, min=2, not_empty=True))
414 416 extensions_hgsubversion = v.StringBoolean(if_missing=False)
415 417 extensions_hggit = v.StringBoolean(if_missing=False)
416 418 new_svn_branch = v.ValidSvnPattern(section='vcs_svn_branch')
417 419 new_svn_tag = v.ValidSvnPattern(section='vcs_svn_tag')
418 420
419 421 return _ApplicationUiSettingsForm
420 422
421 423
422 424 def RepoVcsSettingsForm(repo_name):
423 425 class _RepoVcsSettingsForm(_BaseVcsSettingsForm):
424 426 inherit_global_settings = v.StringBoolean(if_missing=False)
425 427 new_svn_branch = v.ValidSvnPattern(
426 428 section='vcs_svn_branch', repo_name=repo_name)
427 429 new_svn_tag = v.ValidSvnPattern(
428 430 section='vcs_svn_tag', repo_name=repo_name)
429 431
430 432 return _RepoVcsSettingsForm
431 433
432 434
433 435 def LabsSettingsForm():
434 436 class _LabSettingsForm(formencode.Schema):
435 437 allow_extra_fields = True
436 438 filter_extra_fields = False
437 439
438 440 return _LabSettingsForm
439 441
440 442
441 443 def ApplicationPermissionsForm(
442 444 register_choices, password_reset_choices, extern_activate_choices):
443 445 class _DefaultPermissionsForm(formencode.Schema):
444 446 allow_extra_fields = True
445 447 filter_extra_fields = True
446 448
447 449 anonymous = v.StringBoolean(if_missing=False)
448 450 default_register = v.OneOf(register_choices)
449 451 default_register_message = v.UnicodeString()
450 452 default_password_reset = v.OneOf(password_reset_choices)
451 453 default_extern_activate = v.OneOf(extern_activate_choices)
452 454
453 455 return _DefaultPermissionsForm
454 456
455 457
456 458 def ObjectPermissionsForm(repo_perms_choices, group_perms_choices,
457 459 user_group_perms_choices):
458 460 class _ObjectPermissionsForm(formencode.Schema):
459 461 allow_extra_fields = True
460 462 filter_extra_fields = True
461 463 overwrite_default_repo = v.StringBoolean(if_missing=False)
462 464 overwrite_default_group = v.StringBoolean(if_missing=False)
463 465 overwrite_default_user_group = v.StringBoolean(if_missing=False)
464 466 default_repo_perm = v.OneOf(repo_perms_choices)
465 467 default_group_perm = v.OneOf(group_perms_choices)
466 468 default_user_group_perm = v.OneOf(user_group_perms_choices)
467 469
468 470 return _ObjectPermissionsForm
469 471
470 472
471 473 def UserPermissionsForm(create_choices, create_on_write_choices,
472 474 repo_group_create_choices, user_group_create_choices,
473 475 fork_choices, inherit_default_permissions_choices):
474 476 class _DefaultPermissionsForm(formencode.Schema):
475 477 allow_extra_fields = True
476 478 filter_extra_fields = True
477 479
478 480 anonymous = v.StringBoolean(if_missing=False)
479 481
480 482 default_repo_create = v.OneOf(create_choices)
481 483 default_repo_create_on_write = v.OneOf(create_on_write_choices)
482 484 default_user_group_create = v.OneOf(user_group_create_choices)
483 485 default_repo_group_create = v.OneOf(repo_group_create_choices)
484 486 default_fork_create = v.OneOf(fork_choices)
485 487 default_inherit_default_permissions = v.OneOf(inherit_default_permissions_choices)
486 488
487 489 return _DefaultPermissionsForm
488 490
489 491
490 492 def UserIndividualPermissionsForm():
491 493 class _DefaultPermissionsForm(formencode.Schema):
492 494 allow_extra_fields = True
493 495 filter_extra_fields = True
494 496
495 497 inherit_default_permissions = v.StringBoolean(if_missing=False)
496 498
497 499 return _DefaultPermissionsForm
498 500
499 501
500 502 def DefaultsForm(edit=False, old_data={}, supported_backends=BACKENDS.keys()):
501 503 class _DefaultsForm(formencode.Schema):
502 504 allow_extra_fields = True
503 505 filter_extra_fields = True
504 506 default_repo_type = v.OneOf(supported_backends)
505 507 default_repo_private = v.StringBoolean(if_missing=False)
506 508 default_repo_enable_statistics = v.StringBoolean(if_missing=False)
507 509 default_repo_enable_downloads = v.StringBoolean(if_missing=False)
508 510 default_repo_enable_locking = v.StringBoolean(if_missing=False)
509 511
510 512 return _DefaultsForm
511 513
512 514
513 515 def AuthSettingsForm():
514 516 class _AuthSettingsForm(formencode.Schema):
515 517 allow_extra_fields = True
516 518 filter_extra_fields = True
517 519 auth_plugins = All(v.ValidAuthPlugins(),
518 520 v.UniqueListFromString()(not_empty=True))
519 521
520 522 return _AuthSettingsForm
521 523
522 524
523 525 def UserExtraEmailForm():
524 526 class _UserExtraEmailForm(formencode.Schema):
525 527 email = All(v.UniqSystemEmail(), v.Email(not_empty=True))
526 528 return _UserExtraEmailForm
527 529
528 530
529 531 def UserExtraIpForm():
530 532 class _UserExtraIpForm(formencode.Schema):
531 533 ip = v.ValidIp()(not_empty=True)
532 534 return _UserExtraIpForm
533 535
534 536
535 537
536 538 def PullRequestForm(repo_id):
537 539 class ReviewerForm(formencode.Schema):
538 540 user_id = v.Int(not_empty=True)
539 541 reasons = All()
540 542 mandatory = v.StringBoolean()
541 543
542 544 class _PullRequestForm(formencode.Schema):
543 545 allow_extra_fields = True
544 546 filter_extra_fields = True
545 547
546 548 common_ancestor = v.UnicodeString(strip=True, required=True)
547 549 source_repo = v.UnicodeString(strip=True, required=True)
548 550 source_ref = v.UnicodeString(strip=True, required=True)
549 551 target_repo = v.UnicodeString(strip=True, required=True)
550 552 target_ref = v.UnicodeString(strip=True, required=True)
551 553 revisions = All(#v.NotReviewedRevisions(repo_id)(),
552 554 v.UniqueList()(not_empty=True))
553 555 review_members = formencode.ForEach(ReviewerForm())
554 556 pullrequest_title = v.UnicodeString(strip=True, required=True)
555 557 pullrequest_desc = v.UnicodeString(strip=True, required=False)
556 558
557 559 return _PullRequestForm
558 560
559 561
560 562 def IssueTrackerPatternsForm():
561 563 class _IssueTrackerPatternsForm(formencode.Schema):
562 564 allow_extra_fields = True
563 565 filter_extra_fields = False
564 566 chained_validators = [v.ValidPattern()]
565 567 return _IssueTrackerPatternsForm
@@ -1,1584 +1,1595 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2012-2017 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 22 """
23 23 pull request model for RhodeCode
24 24 """
25 25
26 26 from collections import namedtuple
27 27 import json
28 28 import logging
29 29 import datetime
30 30 import urllib
31 31
32 32 from pylons.i18n.translation import _
33 33 from pylons.i18n.translation import lazy_ugettext
34 34 from pyramid.threadlocal import get_current_request
35 35 from sqlalchemy import or_
36 36
37 37 from rhodecode import events
38 38 from rhodecode.lib import helpers as h, hooks_utils, diffs
39 39 from rhodecode.lib import audit_logger
40 40 from rhodecode.lib.compat import OrderedDict
41 41 from rhodecode.lib.hooks_daemon import prepare_callback_daemon
42 42 from rhodecode.lib.markup_renderer import (
43 43 DEFAULT_COMMENTS_RENDERER, RstTemplateRenderer)
44 44 from rhodecode.lib.utils2 import safe_unicode, safe_str, md5_safe
45 45 from rhodecode.lib.vcs.backends.base import (
46 46 Reference, MergeResponse, MergeFailureReason, UpdateFailureReason)
47 47 from rhodecode.lib.vcs.conf import settings as vcs_settings
48 48 from rhodecode.lib.vcs.exceptions import (
49 49 CommitDoesNotExistError, EmptyRepositoryError)
50 50 from rhodecode.model import BaseModel
51 51 from rhodecode.model.changeset_status import ChangesetStatusModel
52 52 from rhodecode.model.comment import CommentsModel
53 53 from rhodecode.model.db import (
54 54 PullRequest, PullRequestReviewers, ChangesetStatus,
55 55 PullRequestVersion, ChangesetComment, Repository)
56 56 from rhodecode.model.meta import Session
57 57 from rhodecode.model.notification import NotificationModel, \
58 58 EmailNotificationModel
59 59 from rhodecode.model.scm import ScmModel
60 60 from rhodecode.model.settings import VcsSettingsModel
61 61
62 62
63 63 log = logging.getLogger(__name__)
64 64
65 65
66 66 # Data structure to hold the response data when updating commits during a pull
67 67 # request update.
68 68 UpdateResponse = namedtuple('UpdateResponse', [
69 69 'executed', 'reason', 'new', 'old', 'changes',
70 70 'source_changed', 'target_changed'])
71 71
72 72
73 73 class PullRequestModel(BaseModel):
74 74
75 75 cls = PullRequest
76 76
77 77 DIFF_CONTEXT = 3
78 78
79 79 MERGE_STATUS_MESSAGES = {
80 80 MergeFailureReason.NONE: lazy_ugettext(
81 81 'This pull request can be automatically merged.'),
82 82 MergeFailureReason.UNKNOWN: lazy_ugettext(
83 83 'This pull request cannot be merged because of an unhandled'
84 84 ' exception.'),
85 85 MergeFailureReason.MERGE_FAILED: lazy_ugettext(
86 86 'This pull request cannot be merged because of merge conflicts.'),
87 87 MergeFailureReason.PUSH_FAILED: lazy_ugettext(
88 88 'This pull request could not be merged because push to target'
89 89 ' failed.'),
90 90 MergeFailureReason.TARGET_IS_NOT_HEAD: lazy_ugettext(
91 91 'This pull request cannot be merged because the target is not a'
92 92 ' head.'),
93 93 MergeFailureReason.HG_SOURCE_HAS_MORE_BRANCHES: lazy_ugettext(
94 94 'This pull request cannot be merged because the source contains'
95 95 ' more branches than the target.'),
96 96 MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS: lazy_ugettext(
97 97 'This pull request cannot be merged because the target has'
98 98 ' multiple heads.'),
99 99 MergeFailureReason.TARGET_IS_LOCKED: lazy_ugettext(
100 100 'This pull request cannot be merged because the target repository'
101 101 ' is locked.'),
102 102 MergeFailureReason._DEPRECATED_MISSING_COMMIT: lazy_ugettext(
103 103 'This pull request cannot be merged because the target or the '
104 104 'source reference is missing.'),
105 105 MergeFailureReason.MISSING_TARGET_REF: lazy_ugettext(
106 106 'This pull request cannot be merged because the target '
107 107 'reference is missing.'),
108 108 MergeFailureReason.MISSING_SOURCE_REF: lazy_ugettext(
109 109 'This pull request cannot be merged because the source '
110 110 'reference is missing.'),
111 111 MergeFailureReason.SUBREPO_MERGE_FAILED: lazy_ugettext(
112 112 'This pull request cannot be merged because of conflicts related '
113 113 'to sub repositories.'),
114 114 }
115 115
116 116 UPDATE_STATUS_MESSAGES = {
117 117 UpdateFailureReason.NONE: lazy_ugettext(
118 118 'Pull request update successful.'),
119 119 UpdateFailureReason.UNKNOWN: lazy_ugettext(
120 120 'Pull request update failed because of an unknown error.'),
121 121 UpdateFailureReason.NO_CHANGE: lazy_ugettext(
122 122 'No update needed because the source and target have not changed.'),
123 123 UpdateFailureReason.WRONG_REF_TYPE: lazy_ugettext(
124 124 'Pull request cannot be updated because the reference type is '
125 125 'not supported for an update. Only Branch, Tag or Bookmark is allowed.'),
126 126 UpdateFailureReason.MISSING_TARGET_REF: lazy_ugettext(
127 127 'This pull request cannot be updated because the target '
128 128 'reference is missing.'),
129 129 UpdateFailureReason.MISSING_SOURCE_REF: lazy_ugettext(
130 130 'This pull request cannot be updated because the source '
131 131 'reference is missing.'),
132 132 }
133 133
134 134 def __get_pull_request(self, pull_request):
135 135 return self._get_instance((
136 136 PullRequest, PullRequestVersion), pull_request)
137 137
138 138 def _check_perms(self, perms, pull_request, user, api=False):
139 139 if not api:
140 140 return h.HasRepoPermissionAny(*perms)(
141 141 user=user, repo_name=pull_request.target_repo.repo_name)
142 142 else:
143 143 return h.HasRepoPermissionAnyApi(*perms)(
144 144 user=user, repo_name=pull_request.target_repo.repo_name)
145 145
146 146 def check_user_read(self, pull_request, user, api=False):
147 147 _perms = ('repository.admin', 'repository.write', 'repository.read',)
148 148 return self._check_perms(_perms, pull_request, user, api)
149 149
150 150 def check_user_merge(self, pull_request, user, api=False):
151 151 _perms = ('repository.admin', 'repository.write', 'hg.admin',)
152 152 return self._check_perms(_perms, pull_request, user, api)
153 153
154 154 def check_user_update(self, pull_request, user, api=False):
155 155 owner = user.user_id == pull_request.user_id
156 156 return self.check_user_merge(pull_request, user, api) or owner
157 157
158 158 def check_user_delete(self, pull_request, user):
159 159 owner = user.user_id == pull_request.user_id
160 160 _perms = ('repository.admin',)
161 161 return self._check_perms(_perms, pull_request, user) or owner
162 162
163 163 def check_user_change_status(self, pull_request, user, api=False):
164 164 reviewer = user.user_id in [x.user_id for x in
165 165 pull_request.reviewers]
166 166 return self.check_user_update(pull_request, user, api) or reviewer
167 167
168 168 def get(self, pull_request):
169 169 return self.__get_pull_request(pull_request)
170 170
171 171 def _prepare_get_all_query(self, repo_name, source=False, statuses=None,
172 172 opened_by=None, order_by=None,
173 173 order_dir='desc'):
174 174 repo = None
175 175 if repo_name:
176 176 repo = self._get_repo(repo_name)
177 177
178 178 q = PullRequest.query()
179 179
180 180 # source or target
181 181 if repo and source:
182 182 q = q.filter(PullRequest.source_repo == repo)
183 183 elif repo:
184 184 q = q.filter(PullRequest.target_repo == repo)
185 185
186 186 # closed, opened
187 187 if statuses:
188 188 q = q.filter(PullRequest.status.in_(statuses))
189 189
190 190 # opened by filter
191 191 if opened_by:
192 192 q = q.filter(PullRequest.user_id.in_(opened_by))
193 193
194 194 if order_by:
195 195 order_map = {
196 196 'name_raw': PullRequest.pull_request_id,
197 197 'title': PullRequest.title,
198 198 'updated_on_raw': PullRequest.updated_on,
199 199 'target_repo': PullRequest.target_repo_id
200 200 }
201 201 if order_dir == 'asc':
202 202 q = q.order_by(order_map[order_by].asc())
203 203 else:
204 204 q = q.order_by(order_map[order_by].desc())
205 205
206 206 return q
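`_prepare_get_all_query` dispatches the requested sort key through a column map before ordering. The same dispatch over plain dicts (hypothetical in-memory data standing in for the SQLAlchemy query) looks like:

```python
# Sketch of the order_map dispatch above, over plain dicts instead of
# a SQLAlchemy query. Keys mirror the ones accepted by the real method.
def sort_pull_requests(prs, order_by=None, order_dir='desc'):
    order_map = {
        'name_raw': 'pull_request_id',
        'title': 'title',
        'updated_on_raw': 'updated_on',
    }
    if order_by:
        field = order_map[order_by]
        prs = sorted(prs, key=lambda pr: pr[field],
                     reverse=(order_dir == 'desc'))
    return prs


prs = [{'pull_request_id': 2, 'title': 'b', 'updated_on': 5},
       {'pull_request_id': 1, 'title': 'a', 'updated_on': 9}]
print([p['pull_request_id']
       for p in sort_pull_requests(prs, 'name_raw', 'asc')])  # [1, 2]
```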
207 207
208 208 def count_all(self, repo_name, source=False, statuses=None,
209 209 opened_by=None):
210 210 """
211 211 Count the number of pull requests for a specific repository.
212 212
213 213 :param repo_name: target or source repo
214 214 :param source: boolean flag to specify if repo_name refers to source
215 215 :param statuses: list of pull request statuses
216 216 :param opened_by: author user of the pull request
217 217 :returns: int number of pull requests
218 218 """
219 219 q = self._prepare_get_all_query(
220 220 repo_name, source=source, statuses=statuses, opened_by=opened_by)
221 221
222 222 return q.count()
223 223
224 224 def get_all(self, repo_name, source=False, statuses=None, opened_by=None,
225 225 offset=0, length=None, order_by=None, order_dir='desc'):
226 226 """
227 227 Get all pull requests for a specific repository.
228 228
229 229 :param repo_name: target or source repo
230 230 :param source: boolean flag to specify if repo_name refers to source
231 231 :param statuses: list of pull request statuses
232 232 :param opened_by: author user of the pull request
233 233 :param offset: pagination offset
234 234 :param length: length of returned list
235 235 :param order_by: order of the returned list
236 236 :param order_dir: 'asc' or 'desc' ordering direction
237 237 :returns: list of pull requests
238 238 """
239 239 q = self._prepare_get_all_query(
240 240 repo_name, source=source, statuses=statuses, opened_by=opened_by,
241 241 order_by=order_by, order_dir=order_dir)
242 242
243 243 if length:
244 244 pull_requests = q.limit(length).offset(offset).all()
245 245 else:
246 246 pull_requests = q.all()
247 247
248 248 return pull_requests
249 249
250 250 def count_awaiting_review(self, repo_name, source=False, statuses=None,
251 251 opened_by=None):
252 252 """
253 253 Count the number of pull requests for a specific repository that are
254 254 awaiting review.
255 255
256 256 :param repo_name: target or source repo
257 257 :param source: boolean flag to specify if repo_name refers to source
258 258 :param statuses: list of pull request statuses
259 259 :param opened_by: author user of the pull request
260 260 :returns: int number of pull requests
261 261 """
262 262 pull_requests = self.get_awaiting_review(
263 263 repo_name, source=source, statuses=statuses, opened_by=opened_by)
264 264
265 265 return len(pull_requests)
266 266
267 267 def get_awaiting_review(self, repo_name, source=False, statuses=None,
268 268 opened_by=None, offset=0, length=None,
269 269 order_by=None, order_dir='desc'):
270 270 """
271 271 Get all pull requests for a specific repository that are awaiting
272 272 review.
273 273
274 274 :param repo_name: target or source repo
275 275 :param source: boolean flag to specify if repo_name refers to source
276 276 :param statuses: list of pull request statuses
277 277 :param opened_by: author user of the pull request
278 278 :param offset: pagination offset
279 279 :param length: length of returned list
280 280 :param order_by: order of the returned list
281 281 :param order_dir: 'asc' or 'desc' ordering direction
282 282 :returns: list of pull requests
283 283 """
284 284 pull_requests = self.get_all(
285 285 repo_name, source=source, statuses=statuses, opened_by=opened_by,
286 286 order_by=order_by, order_dir=order_dir)
287 287
288 288 _filtered_pull_requests = []
289 289 for pr in pull_requests:
290 290 status = pr.calculated_review_status()
291 291 if status in [ChangesetStatus.STATUS_NOT_REVIEWED,
292 292 ChangesetStatus.STATUS_UNDER_REVIEW]:
293 293 _filtered_pull_requests.append(pr)
294 294 if length:
295 295 return _filtered_pull_requests[offset:offset+length]
296 296 else:
297 297 return _filtered_pull_requests
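The in-memory offset/length pagination used by the two `get_awaiting_*` methods above can be sketched in isolation; the helper name is hypothetical and stands in for the inline slicing:

```python
def paginate(items, offset=0, length=None):
    # Mirrors the slicing logic above: a falsy length returns the
    # whole filtered list, otherwise a single page is sliced out.
    if length:
        return items[offset:offset + length]
    return items

page = paginate(['pr1', 'pr2', 'pr3', 'pr4'], offset=1, length=2)
# page == ['pr2', 'pr3']
```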
298 298
299 299 def count_awaiting_my_review(self, repo_name, source=False, statuses=None,
300 300 opened_by=None, user_id=None):
301 301 """
302 302 Count the number of pull requests for a specific repository that are
303 303 awaiting review from a specific user.
304 304
305 305 :param repo_name: target or source repo
306 306 :param source: boolean flag to specify if repo_name refers to source
307 307 :param statuses: list of pull request statuses
308 308 :param opened_by: author user of the pull request
309 309 :param user_id: reviewer user of the pull request
310 310 :returns: int number of pull requests
311 311 """
312 312 pull_requests = self.get_awaiting_my_review(
313 313 repo_name, source=source, statuses=statuses, opened_by=opened_by,
314 314 user_id=user_id)
315 315
316 316 return len(pull_requests)
317 317
318 318 def get_awaiting_my_review(self, repo_name, source=False, statuses=None,
319 319 opened_by=None, user_id=None, offset=0,
320 320 length=None, order_by=None, order_dir='desc'):
321 321 """
322 322 Get all pull requests for a specific repository that are awaiting
323 323 review from a specific user.
324 324
325 325 :param repo_name: target or source repo
326 326 :param source: boolean flag to specify if repo_name refers to source
327 327 :param statuses: list of pull request statuses
328 328 :param opened_by: author user of the pull request
329 329 :param user_id: reviewer user of the pull request
330 330 :param offset: pagination offset
331 331 :param length: length of returned list
332 332 :param order_by: order of the returned list
333 333 :param order_dir: 'asc' or 'desc' ordering direction
334 334 :returns: list of pull requests
335 335 """
336 336 pull_requests = self.get_all(
337 337 repo_name, source=source, statuses=statuses, opened_by=opened_by,
338 338 order_by=order_by, order_dir=order_dir)
339 339
340 340 _my = PullRequestModel().get_not_reviewed(user_id)
341 341 my_participation = []
342 342 for pr in pull_requests:
343 343 if pr in _my:
344 344 my_participation.append(pr)
345 345 _filtered_pull_requests = my_participation
346 346 if length:
347 347 return _filtered_pull_requests[offset:offset+length]
348 348 else:
349 349 return _filtered_pull_requests
350 350
351 351 def get_not_reviewed(self, user_id):
352 352 return [
353 353 x.pull_request for x in PullRequestReviewers.query().filter(
354 354 PullRequestReviewers.user_id == user_id).all()
355 355 ]
356 356
357 357 def _prepare_participating_query(self, user_id=None, statuses=None,
358 358 order_by=None, order_dir='desc'):
359 359 q = PullRequest.query()
360 360 if user_id:
361 361 reviewers_subquery = Session().query(
362 362 PullRequestReviewers.pull_request_id).filter(
363 363 PullRequestReviewers.user_id == user_id).subquery()
364 364             user_filter = or_(
365 365 PullRequest.user_id == user_id,
366 366 PullRequest.pull_request_id.in_(reviewers_subquery)
367 367 )
368 368 q = PullRequest.query().filter(user_filter)
369 369
370 370 # closed,opened
371 371 if statuses:
372 372 q = q.filter(PullRequest.status.in_(statuses))
373 373
374 374 if order_by:
375 375 order_map = {
376 376 'name_raw': PullRequest.pull_request_id,
377 377 'title': PullRequest.title,
378 378 'updated_on_raw': PullRequest.updated_on,
379 379 'target_repo': PullRequest.target_repo_id
380 380 }
381 381 if order_dir == 'asc':
382 382 q = q.order_by(order_map[order_by].asc())
383 383 else:
384 384 q = q.order_by(order_map[order_by].desc())
385 385
386 386 return q
387 387
388 388 def count_im_participating_in(self, user_id=None, statuses=None):
389 389 q = self._prepare_participating_query(user_id, statuses=statuses)
390 390 return q.count()
391 391
392 392 def get_im_participating_in(
393 393 self, user_id=None, statuses=None, offset=0,
394 394 length=None, order_by=None, order_dir='desc'):
395 395 """
396 396         Get all pull requests that I'm participating in or have opened
397 397 """
398 398
399 399 q = self._prepare_participating_query(
400 400 user_id, statuses=statuses, order_by=order_by,
401 401 order_dir=order_dir)
402 402
403 403 if length:
404 404 pull_requests = q.limit(length).offset(offset).all()
405 405 else:
406 406 pull_requests = q.all()
407 407
408 408 return pull_requests
409 409
410 410 def get_versions(self, pull_request):
411 411 """
412 412         returns versions of the pull request sorted by version ID ascending
413 413 """
414 414 return PullRequestVersion.query()\
415 415 .filter(PullRequestVersion.pull_request == pull_request)\
416 416 .order_by(PullRequestVersion.pull_request_version_id.asc())\
417 417 .all()
418 418
419 419 def create(self, created_by, source_repo, source_ref, target_repo,
420 420 target_ref, revisions, reviewers, title, description=None,
421 421 reviewer_data=None):
422 422
423 423 created_by_user = self._get_user(created_by)
424 424 source_repo = self._get_repo(source_repo)
425 425 target_repo = self._get_repo(target_repo)
426 426
427 427 pull_request = PullRequest()
428 428 pull_request.source_repo = source_repo
429 429 pull_request.source_ref = source_ref
430 430 pull_request.target_repo = target_repo
431 431 pull_request.target_ref = target_ref
432 432 pull_request.revisions = revisions
433 433 pull_request.title = title
434 434 pull_request.description = description
435 435 pull_request.author = created_by_user
436 436 pull_request.reviewer_data = reviewer_data
437 437
438 438 Session().add(pull_request)
439 439 Session().flush()
440 440
441 441 reviewer_ids = set()
442 442 # members / reviewers
443 443 for reviewer_object in reviewers:
444 444 user_id, reasons, mandatory = reviewer_object
445 445 user = self._get_user(user_id)
446 446
447 447 # skip duplicates
448 448 if user.user_id in reviewer_ids:
449 449 continue
450 450
451 451 reviewer_ids.add(user.user_id)
452 452
453 453 reviewer = PullRequestReviewers()
454 454 reviewer.user = user
455 455 reviewer.pull_request = pull_request
456 456 reviewer.reasons = reasons
457 457 reviewer.mandatory = mandatory
458 458 Session().add(reviewer)
459 459
460 460 # Set approval status to "Under Review" for all commits which are
461 461 # part of this pull request.
462 462 ChangesetStatusModel().set_status(
463 463 repo=target_repo,
464 464 status=ChangesetStatus.STATUS_UNDER_REVIEW,
465 465 user=created_by_user,
466 466 pull_request=pull_request
467 467 )
468 468
469 469 self.notify_reviewers(pull_request, reviewer_ids)
470 470 self._trigger_pull_request_hook(
471 471 pull_request, created_by_user, 'create')
472 472
473 473 creation_data = pull_request.get_api_data(with_merge_state=False)
474 474 self._log_audit_action(
475 475 'repo.pull_request.create', {'data': creation_data},
476 476 created_by_user, pull_request)
477 477
478 478 return pull_request
479 479
480 480 def _trigger_pull_request_hook(self, pull_request, user, action):
481 481 pull_request = self.__get_pull_request(pull_request)
482 482 target_scm = pull_request.target_repo.scm_instance()
483 483 if action == 'create':
484 484 trigger_hook = hooks_utils.trigger_log_create_pull_request_hook
485 485 elif action == 'merge':
486 486 trigger_hook = hooks_utils.trigger_log_merge_pull_request_hook
487 487 elif action == 'close':
488 488 trigger_hook = hooks_utils.trigger_log_close_pull_request_hook
489 489 elif action == 'review_status_change':
490 490 trigger_hook = hooks_utils.trigger_log_review_pull_request_hook
491 491 elif action == 'update':
492 492 trigger_hook = hooks_utils.trigger_log_update_pull_request_hook
493 493 else:
494 494 return
495 495
496 496 trigger_hook(
497 497 username=user.username,
498 498 repo_name=pull_request.target_repo.repo_name,
499 499 repo_alias=target_scm.alias,
500 500 pull_request=pull_request)
501 501
502 502 def _get_commit_ids(self, pull_request):
503 503 """
504 504 Return the commit ids of the merged pull request.
505 505
506 506         This method does not yet deal correctly with the lack of
507 507         autoupdates or with implicit target updates.
508 508         For example, if a commit in the source repo is already in the
509 509         target, it will still be reported.
510 510 """
511 511 merge_rev = pull_request.merge_rev
512 512 if merge_rev is None:
513 513 raise ValueError('This pull request was not merged yet')
514 514
515 515 commit_ids = list(pull_request.revisions)
516 516 if merge_rev not in commit_ids:
517 517 commit_ids.append(merge_rev)
518 518
519 519 return commit_ids
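The merge-rev handling above can be sketched as a standalone function (the name is hypothetical; `revisions` and `merge_rev` stand for the pull request attributes used above):

```python
def get_merged_commit_ids(revisions, merge_rev):
    # revisions: commit ids recorded on the pull request;
    # merge_rev: the merge commit created when it was merged.
    if merge_rev is None:
        raise ValueError('This pull request was not merged yet')
    commit_ids = list(revisions)
    # fast-forward style merges may reuse an existing commit id,
    # so only append the merge rev when it is not already listed
    if merge_rev not in commit_ids:
        commit_ids.append(merge_rev)
    return commit_ids
```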
520 520
521 521 def merge(self, pull_request, user, extras):
522 522 log.debug("Merging pull request %s", pull_request.pull_request_id)
523 523 merge_state = self._merge_pull_request(pull_request, user, extras)
524 524 if merge_state.executed:
525 525 log.debug(
526 526 "Merge was successful, updating the pull request comments.")
527 527 self._comment_and_close_pr(pull_request, user, merge_state)
528 528
529 529 self._log_audit_action(
530 530 'repo.pull_request.merge',
531 531 {'merge_state': merge_state.__dict__},
532 532 user, pull_request)
533 533
534 534 else:
535 535 log.warn("Merge failed, not updating the pull request.")
536 536 return merge_state
537 537
538 538 def _merge_pull_request(self, pull_request, user, extras):
539 539 target_vcs = pull_request.target_repo.scm_instance()
540 540 source_vcs = pull_request.source_repo.scm_instance()
541 541 target_ref = self._refresh_reference(
542 542 pull_request.target_ref_parts, target_vcs)
543 543
544 544 message = _(
545 545 'Merge pull request #%(pr_id)s from '
546 546 '%(source_repo)s %(source_ref_name)s\n\n %(pr_title)s') % {
547 547 'pr_id': pull_request.pull_request_id,
548 548 'source_repo': source_vcs.name,
549 549 'source_ref_name': pull_request.source_ref_parts.name,
550 550 'pr_title': pull_request.title
551 551 }
552 552
553 553 workspace_id = self._workspace_id(pull_request)
554 554 use_rebase = self._use_rebase_for_merging(pull_request)
555 close_branch = self._close_branch_before_merging(pull_request)
555 556
556 557 callback_daemon, extras = prepare_callback_daemon(
557 558 extras, protocol=vcs_settings.HOOKS_PROTOCOL,
558 559 use_direct_calls=vcs_settings.HOOKS_DIRECT_CALLS)
559 560
560 561 with callback_daemon:
561 562 # TODO: johbo: Implement a clean way to run a config_override
562 563 # for a single call.
563 564 target_vcs.config.set(
564 565 'rhodecode', 'RC_SCM_DATA', json.dumps(extras))
565 566 merge_state = target_vcs.merge(
566 567 target_ref, source_vcs, pull_request.source_ref_parts,
567 568 workspace_id, user_name=user.username,
568 user_email=user.email, message=message, use_rebase=use_rebase)
569 user_email=user.email, message=message, use_rebase=use_rebase,
570 close_branch=close_branch)
569 571 return merge_state
570 572
571 573 def _comment_and_close_pr(self, pull_request, user, merge_state):
572 574 pull_request.merge_rev = merge_state.merge_ref.commit_id
573 575 pull_request.updated_on = datetime.datetime.now()
574 576
575 577 CommentsModel().create(
576 578 text=unicode(_('Pull request merged and closed')),
577 579 repo=pull_request.target_repo.repo_id,
578 580 user=user.user_id,
579 581 pull_request=pull_request.pull_request_id,
580 582 f_path=None,
581 583 line_no=None,
582 584 closing_pr=True
583 585 )
584 586
585 587 Session().add(pull_request)
586 588 Session().flush()
589 591         # TODO: paris: replace invalidation with a less radical solution
588 590 ScmModel().mark_for_invalidation(
589 591 pull_request.target_repo.repo_name)
590 592 self._trigger_pull_request_hook(pull_request, user, 'merge')
591 593
592 594 def has_valid_update_type(self, pull_request):
593 595 source_ref_type = pull_request.source_ref_parts.type
594 596 return source_ref_type in ['book', 'branch', 'tag']
595 597
596 598 def update_commits(self, pull_request):
597 599 """
598 600 Get the updated list of commits for the pull request
599 601 and return the new pull request version and the list
600 602 of commits processed by this update action
601 603 """
602 604 pull_request = self.__get_pull_request(pull_request)
603 605 source_ref_type = pull_request.source_ref_parts.type
604 606 source_ref_name = pull_request.source_ref_parts.name
605 607 source_ref_id = pull_request.source_ref_parts.commit_id
606 608
607 609 target_ref_type = pull_request.target_ref_parts.type
608 610 target_ref_name = pull_request.target_ref_parts.name
609 611 target_ref_id = pull_request.target_ref_parts.commit_id
610 612
611 613 if not self.has_valid_update_type(pull_request):
612 614 log.debug(
613 615 "Skipping update of pull request %s due to ref type: %s",
614 616 pull_request, source_ref_type)
615 617 return UpdateResponse(
616 618 executed=False,
617 619 reason=UpdateFailureReason.WRONG_REF_TYPE,
618 620 old=pull_request, new=None, changes=None,
619 621 source_changed=False, target_changed=False)
620 622
621 623 # source repo
622 624 source_repo = pull_request.source_repo.scm_instance()
623 625 try:
624 626 source_commit = source_repo.get_commit(commit_id=source_ref_name)
625 627 except CommitDoesNotExistError:
626 628 return UpdateResponse(
627 629 executed=False,
628 630 reason=UpdateFailureReason.MISSING_SOURCE_REF,
629 631 old=pull_request, new=None, changes=None,
630 632 source_changed=False, target_changed=False)
631 633
632 634 source_changed = source_ref_id != source_commit.raw_id
633 635
634 636 # target repo
635 637 target_repo = pull_request.target_repo.scm_instance()
636 638 try:
637 639 target_commit = target_repo.get_commit(commit_id=target_ref_name)
638 640 except CommitDoesNotExistError:
639 641 return UpdateResponse(
640 642 executed=False,
641 643 reason=UpdateFailureReason.MISSING_TARGET_REF,
642 644 old=pull_request, new=None, changes=None,
643 645 source_changed=False, target_changed=False)
644 646 target_changed = target_ref_id != target_commit.raw_id
645 647
646 648 if not (source_changed or target_changed):
647 649 log.debug("Nothing changed in pull request %s", pull_request)
648 650 return UpdateResponse(
649 651 executed=False,
650 652 reason=UpdateFailureReason.NO_CHANGE,
651 653 old=pull_request, new=None, changes=None,
652 654                 source_changed=source_changed, target_changed=target_changed)
653 655
654 656 change_in_found = 'target repo' if target_changed else 'source repo'
655 657 log.debug('Updating pull request because of change in %s detected',
656 658 change_in_found)
657 659
658 660 # Finally there is a need for an update, in case of source change
659 661 # we create a new version, else just an update
660 662 if source_changed:
661 663 pull_request_version = self._create_version_from_snapshot(pull_request)
662 664 self._link_comments_to_version(pull_request_version)
663 665 else:
664 666 try:
665 667 ver = pull_request.versions[-1]
666 668 except IndexError:
667 669 ver = None
668 670
669 671 pull_request.pull_request_version_id = \
670 672 ver.pull_request_version_id if ver else None
671 673 pull_request_version = pull_request
672 674
673 675 try:
674 676 if target_ref_type in ('tag', 'branch', 'book'):
675 677 target_commit = target_repo.get_commit(target_ref_name)
676 678 else:
677 679 target_commit = target_repo.get_commit(target_ref_id)
678 680 except CommitDoesNotExistError:
679 681 return UpdateResponse(
680 682 executed=False,
681 683 reason=UpdateFailureReason.MISSING_TARGET_REF,
682 684 old=pull_request, new=None, changes=None,
683 685 source_changed=source_changed, target_changed=target_changed)
684 686
685 687 # re-compute commit ids
686 688 old_commit_ids = pull_request.revisions
687 689 pre_load = ["author", "branch", "date", "message"]
688 690 commit_ranges = target_repo.compare(
689 691 target_commit.raw_id, source_commit.raw_id, source_repo, merge=True,
690 692 pre_load=pre_load)
691 693
692 694 ancestor = target_repo.get_common_ancestor(
693 695 target_commit.raw_id, source_commit.raw_id, source_repo)
694 696
695 697 pull_request.source_ref = '%s:%s:%s' % (
696 698 source_ref_type, source_ref_name, source_commit.raw_id)
697 699 pull_request.target_ref = '%s:%s:%s' % (
698 700 target_ref_type, target_ref_name, ancestor)
699 701
700 702 pull_request.revisions = [
701 703 commit.raw_id for commit in reversed(commit_ranges)]
702 704 pull_request.updated_on = datetime.datetime.now()
703 705 Session().add(pull_request)
704 706 new_commit_ids = pull_request.revisions
705 707
706 708 old_diff_data, new_diff_data = self._generate_update_diffs(
707 709 pull_request, pull_request_version)
708 710
709 711 # calculate commit and file changes
710 712 changes = self._calculate_commit_id_changes(
711 713 old_commit_ids, new_commit_ids)
712 714 file_changes = self._calculate_file_changes(
713 715 old_diff_data, new_diff_data)
714 716
715 717 # set comments as outdated if DIFFS changed
716 718 CommentsModel().outdate_comments(
717 719 pull_request, old_diff_data=old_diff_data,
718 720 new_diff_data=new_diff_data)
719 721
720 722 commit_changes = (changes.added or changes.removed)
721 723 file_node_changes = (
722 724 file_changes.added or file_changes.modified or file_changes.removed)
723 725 pr_has_changes = commit_changes or file_node_changes
724 726
725 727 # Add an automatic comment to the pull request, in case
726 728 # anything has changed
727 729 if pr_has_changes:
728 730 update_comment = CommentsModel().create(
729 731 text=self._render_update_message(changes, file_changes),
730 732 repo=pull_request.target_repo,
731 733 user=pull_request.author,
732 734 pull_request=pull_request,
733 735 send_email=False, renderer=DEFAULT_COMMENTS_RENDERER)
734 736
735 737 # Update status to "Under Review" for added commits
736 738 for commit_id in changes.added:
737 739 ChangesetStatusModel().set_status(
738 740 repo=pull_request.source_repo,
739 741 status=ChangesetStatus.STATUS_UNDER_REVIEW,
740 742 comment=update_comment,
741 743 user=pull_request.author,
742 744 pull_request=pull_request,
743 745 revision=commit_id)
744 746
745 747 log.debug(
746 748 'Updated pull request %s, added_ids: %s, common_ids: %s, '
747 749 'removed_ids: %s', pull_request.pull_request_id,
748 750 changes.added, changes.common, changes.removed)
749 751 log.debug(
750 752 'Updated pull request with the following file changes: %s',
751 753 file_changes)
752 754
753 755 log.info(
754 756 "Updated pull request %s from commit %s to commit %s, "
755 757 "stored new version %s of this pull request.",
756 758 pull_request.pull_request_id, source_ref_id,
757 759 pull_request.source_ref_parts.commit_id,
758 760 pull_request_version.pull_request_version_id)
759 761 Session().commit()
760 762 self._trigger_pull_request_hook(
761 763 pull_request, pull_request.author, 'update')
762 764
763 765 return UpdateResponse(
764 766 executed=True, reason=UpdateFailureReason.NONE,
765 767 old=pull_request, new=pull_request_version, changes=changes,
766 768 source_changed=source_changed, target_changed=target_changed)
767 769
768 770 def _create_version_from_snapshot(self, pull_request):
769 771 version = PullRequestVersion()
770 772 version.title = pull_request.title
771 773 version.description = pull_request.description
772 774 version.status = pull_request.status
773 775 version.created_on = datetime.datetime.now()
774 776 version.updated_on = pull_request.updated_on
775 777 version.user_id = pull_request.user_id
776 778 version.source_repo = pull_request.source_repo
777 779 version.source_ref = pull_request.source_ref
778 780 version.target_repo = pull_request.target_repo
779 781 version.target_ref = pull_request.target_ref
780 782
781 783 version._last_merge_source_rev = pull_request._last_merge_source_rev
782 784 version._last_merge_target_rev = pull_request._last_merge_target_rev
783 785 version.last_merge_status = pull_request.last_merge_status
784 786 version.shadow_merge_ref = pull_request.shadow_merge_ref
785 787 version.merge_rev = pull_request.merge_rev
786 788 version.reviewer_data = pull_request.reviewer_data
787 789
788 790 version.revisions = pull_request.revisions
789 791 version.pull_request = pull_request
790 792 Session().add(version)
791 793 Session().flush()
792 794
793 795 return version
794 796
795 797 def _generate_update_diffs(self, pull_request, pull_request_version):
796 798
797 799 diff_context = (
798 800 self.DIFF_CONTEXT +
799 801 CommentsModel.needed_extra_diff_context())
800 802
801 803 source_repo = pull_request_version.source_repo
802 804 source_ref_id = pull_request_version.source_ref_parts.commit_id
803 805 target_ref_id = pull_request_version.target_ref_parts.commit_id
804 806 old_diff = self._get_diff_from_pr_or_version(
805 807 source_repo, source_ref_id, target_ref_id, context=diff_context)
806 808
807 809 source_repo = pull_request.source_repo
808 810 source_ref_id = pull_request.source_ref_parts.commit_id
809 811 target_ref_id = pull_request.target_ref_parts.commit_id
810 812
811 813 new_diff = self._get_diff_from_pr_or_version(
812 814 source_repo, source_ref_id, target_ref_id, context=diff_context)
813 815
814 816 old_diff_data = diffs.DiffProcessor(old_diff)
815 817 old_diff_data.prepare()
816 818 new_diff_data = diffs.DiffProcessor(new_diff)
817 819 new_diff_data.prepare()
818 820
819 821 return old_diff_data, new_diff_data
820 822
821 823 def _link_comments_to_version(self, pull_request_version):
822 824 """
823 825 Link all unlinked comments of this pull request to the given version.
824 826
825 827 :param pull_request_version: The `PullRequestVersion` to which
826 828 the comments shall be linked.
827 829
828 830 """
829 831 pull_request = pull_request_version.pull_request
830 832 comments = ChangesetComment.query()\
831 833 .filter(
832 834 # TODO: johbo: Should we query for the repo at all here?
833 835 # Pending decision on how comments of PRs are to be related
834 836 # to either the source repo, the target repo or no repo at all.
835 837 ChangesetComment.repo_id == pull_request.target_repo.repo_id,
836 838 ChangesetComment.pull_request == pull_request,
837 839 ChangesetComment.pull_request_version == None)\
838 840 .order_by(ChangesetComment.comment_id.asc())
839 841
840 842 # TODO: johbo: Find out why this breaks if it is done in a bulk
841 843 # operation.
842 844 for comment in comments:
843 845 comment.pull_request_version_id = (
844 846 pull_request_version.pull_request_version_id)
845 847 Session().add(comment)
846 848
847 849 def _calculate_commit_id_changes(self, old_ids, new_ids):
848 850 added = [x for x in new_ids if x not in old_ids]
849 851 common = [x for x in new_ids if x in old_ids]
850 852 removed = [x for x in old_ids if x not in new_ids]
851 853 total = new_ids
852 854 return ChangeTuple(added, common, removed, total)
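A minimal standalone sketch of the commit-id diffing in `_calculate_commit_id_changes`; `ChangeTuple` is assumed here to be a namedtuple with the same four fields:

```python
from collections import namedtuple

ChangeTuple = namedtuple('ChangeTuple', ['added', 'common', 'removed', 'total'])

def calculate_commit_id_changes(old_ids, new_ids):
    # Order-preserving list comprehensions, as in the method above:
    # 'added' keeps the order of the new list, 'removed' of the old.
    added = [x for x in new_ids if x not in old_ids]
    common = [x for x in new_ids if x in old_ids]
    removed = [x for x in old_ids if x not in new_ids]
    return ChangeTuple(added, common, removed, new_ids)

changes = calculate_commit_id_changes(['a', 'b', 'c'], ['b', 'c', 'd'])
# changes.added == ['d'], changes.common == ['b', 'c'], changes.removed == ['a']
```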
853 855
854 856 def _calculate_file_changes(self, old_diff_data, new_diff_data):
855 857
856 858 old_files = OrderedDict()
857 859 for diff_data in old_diff_data.parsed_diff:
858 860 old_files[diff_data['filename']] = md5_safe(diff_data['raw_diff'])
859 861
860 862 added_files = []
861 863 modified_files = []
862 864 removed_files = []
863 865 for diff_data in new_diff_data.parsed_diff:
864 866 new_filename = diff_data['filename']
865 867 new_hash = md5_safe(diff_data['raw_diff'])
866 868
867 869 old_hash = old_files.get(new_filename)
868 870 if not old_hash:
869 871 # file is not present in old diff, means it's added
870 872 added_files.append(new_filename)
871 873 else:
872 874 if new_hash != old_hash:
873 875 modified_files.append(new_filename)
874 876 # now remove a file from old, since we have seen it already
875 877 del old_files[new_filename]
876 878
877 879         # removed files are those present in old but not in new;
878 880         # since we delete old entries that appear in the new diff,
879 881         # any left-overs are the removed files
880 882 removed_files.extend(old_files.keys())
881 883
882 884 return FileChangeTuple(added_files, modified_files, removed_files)
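The same hash-comparison idea in isolation, using `hashlib.md5` as a stand-in for the `md5_safe` helper; `FileChangeTuple` is assumed to be a three-field namedtuple, and the `(filename, raw_diff)` pair input is a simplification of the parsed-diff dicts used above:

```python
import hashlib
from collections import namedtuple, OrderedDict

FileChangeTuple = namedtuple('FileChangeTuple', ['added', 'modified', 'removed'])

def _md5(text):
    return hashlib.md5(text.encode('utf-8')).hexdigest()

def calculate_file_changes(old_diffs, new_diffs):
    # old_diffs / new_diffs: lists of (filename, raw_diff) pairs.
    old_files = OrderedDict((name, _md5(raw)) for name, raw in old_diffs)
    added, modified = [], []
    for name, raw in new_diffs:
        old_hash = old_files.pop(name, None)
        if old_hash is None:
            # not present in the old diff, so it was added
            added.append(name)
        elif _md5(raw) != old_hash:
            modified.append(name)
    # anything still in old_files was never seen in the new diff: removed
    return FileChangeTuple(added, modified, list(old_files))

fc = calculate_file_changes([('a.py', 'x'), ('b.py', 'y')],
                            [('b.py', 'y2'), ('c.py', 'z')])
# fc == FileChangeTuple(added=['c.py'], modified=['b.py'], removed=['a.py'])
```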
883 885
884 886 def _render_update_message(self, changes, file_changes):
885 887 """
886 888         Render the message using DEFAULT_COMMENTS_RENDERER (RST renderer),
887 889         so it always looks the same regardless of which default
888 890         renderer the system is using.
889 891
890 892 :param changes: changes named tuple
891 893 :param file_changes: file changes named tuple
892 894
893 895 """
894 896 new_status = ChangesetStatus.get_status_lbl(
895 897 ChangesetStatus.STATUS_UNDER_REVIEW)
896 898
897 899 changed_files = (
898 900 file_changes.added + file_changes.modified + file_changes.removed)
899 901
900 902 params = {
901 903 'under_review_label': new_status,
902 904 'added_commits': changes.added,
903 905 'removed_commits': changes.removed,
904 906 'changed_files': changed_files,
905 907 'added_files': file_changes.added,
906 908 'modified_files': file_changes.modified,
907 909 'removed_files': file_changes.removed,
908 910 }
909 911 renderer = RstTemplateRenderer()
910 912 return renderer.render('pull_request_update.mako', **params)
911 913
912 914 def edit(self, pull_request, title, description, user):
913 915 pull_request = self.__get_pull_request(pull_request)
914 916 old_data = pull_request.get_api_data(with_merge_state=False)
915 917 if pull_request.is_closed():
916 918 raise ValueError('This pull request is closed')
917 919 if title:
918 920 pull_request.title = title
919 921 pull_request.description = description
920 922 pull_request.updated_on = datetime.datetime.now()
921 923 Session().add(pull_request)
922 924 self._log_audit_action(
923 925 'repo.pull_request.edit', {'old_data': old_data},
924 926 user, pull_request)
925 927
926 928 def update_reviewers(self, pull_request, reviewer_data, user):
927 929 """
928 930 Update the reviewers in the pull request
929 931
930 932 :param pull_request: the pr to update
931 933 :param reviewer_data: list of tuples
932 934 [(user, ['reason1', 'reason2'], mandatory_flag)]
933 935 """
934 936
935 937 reviewers = {}
936 938 for user_id, reasons, mandatory in reviewer_data:
937 939 if isinstance(user_id, (int, basestring)):
938 940 user_id = self._get_user(user_id).user_id
939 941 reviewers[user_id] = {
940 942 'reasons': reasons, 'mandatory': mandatory}
941 943
942 944 reviewers_ids = set(reviewers.keys())
943 945 pull_request = self.__get_pull_request(pull_request)
944 946 current_reviewers = PullRequestReviewers.query()\
945 947 .filter(PullRequestReviewers.pull_request ==
946 948 pull_request).all()
947 949 current_reviewers_ids = set([x.user.user_id for x in current_reviewers])
948 950
949 951 ids_to_add = reviewers_ids.difference(current_reviewers_ids)
950 952 ids_to_remove = current_reviewers_ids.difference(reviewers_ids)
951 953
952 954 log.debug("Adding %s reviewers", ids_to_add)
953 955 log.debug("Removing %s reviewers", ids_to_remove)
954 956 changed = False
955 957 for uid in ids_to_add:
956 958 changed = True
957 959 _usr = self._get_user(uid)
958 960 reviewer = PullRequestReviewers()
959 961 reviewer.user = _usr
960 962 reviewer.pull_request = pull_request
961 963 reviewer.reasons = reviewers[uid]['reasons']
962 964 # NOTE(marcink): mandatory shouldn't be changed now
963 965             # reviewer.mandatory = reviewers[uid]['mandatory']
964 966 Session().add(reviewer)
965 967 self._log_audit_action(
966 968 'repo.pull_request.reviewer.add', {'data': reviewer.get_dict()},
967 969 user, pull_request)
968 970
969 971 for uid in ids_to_remove:
970 972 changed = True
971 973 reviewers = PullRequestReviewers.query()\
972 974 .filter(PullRequestReviewers.user_id == uid,
973 975 PullRequestReviewers.pull_request == pull_request)\
974 976 .all()
975 977 # use .all() in case we accidentally added the same person twice
976 978 # this CAN happen due to the lack of DB checks
977 979 for obj in reviewers:
978 980 old_data = obj.get_dict()
979 981 Session().delete(obj)
980 982 self._log_audit_action(
981 983 'repo.pull_request.reviewer.delete',
982 984 {'old_data': old_data}, user, pull_request)
983 985
984 986 if changed:
985 987 pull_request.updated_on = datetime.datetime.now()
986 988 Session().add(pull_request)
987 989
988 990 self.notify_reviewers(pull_request, ids_to_add)
989 991 return ids_to_add, ids_to_remove
990 992
991 993 def get_url(self, pull_request, request=None, permalink=False):
992 994 if not request:
993 995 request = get_current_request()
994 996
995 997 if permalink:
996 998 return request.route_url(
997 999 'pull_requests_global',
998 1000 pull_request_id=pull_request.pull_request_id,)
999 1001 else:
1000 1002 return request.route_url('pullrequest_show',
1001 1003 repo_name=safe_str(pull_request.target_repo.repo_name),
1002 1004 pull_request_id=pull_request.pull_request_id,)
1003 1005
1004 1006 def get_shadow_clone_url(self, pull_request):
1005 1007 """
1006 1008         Returns a qualified URL pointing to the shadow repository. If this pull
1007 1009 request is closed there is no shadow repository and ``None`` will be
1008 1010 returned.
1009 1011 """
1010 1012 if pull_request.is_closed():
1011 1013 return None
1012 1014 else:
1013 1015 pr_url = urllib.unquote(self.get_url(pull_request))
1014 1016 return safe_unicode('{pr_url}/repository'.format(pr_url=pr_url))
1015 1017
1016 1018 def notify_reviewers(self, pull_request, reviewers_ids):
1017 1019 # notification to reviewers
1018 1020 if not reviewers_ids:
1019 1021 return
1020 1022
1021 1023 pull_request_obj = pull_request
1022 1024 # get the current participants of this pull request
1023 1025 recipients = reviewers_ids
1024 1026 notification_type = EmailNotificationModel.TYPE_PULL_REQUEST
1025 1027
1026 1028 pr_source_repo = pull_request_obj.source_repo
1027 1029 pr_target_repo = pull_request_obj.target_repo
1028 1030
1029 1031 pr_url = h.route_url('pullrequest_show',
1030 1032 repo_name=pr_target_repo.repo_name,
1031 1033 pull_request_id=pull_request_obj.pull_request_id,)
1032 1034
1033 1035 # set some variables for email notification
1034 1036 pr_target_repo_url = h.route_url(
1035 1037 'repo_summary', repo_name=pr_target_repo.repo_name)
1036 1038
1037 1039 pr_source_repo_url = h.route_url(
1038 1040 'repo_summary', repo_name=pr_source_repo.repo_name)
1039 1041
1040 1042 # pull request specifics
1041 1043 pull_request_commits = [
1042 1044 (x.raw_id, x.message)
1043 1045 for x in map(pr_source_repo.get_commit, pull_request.revisions)]
1044 1046
1045 1047 kwargs = {
1046 1048 'user': pull_request.author,
1047 1049 'pull_request': pull_request_obj,
1048 1050 'pull_request_commits': pull_request_commits,
1049 1051
1050 1052 'pull_request_target_repo': pr_target_repo,
1051 1053 'pull_request_target_repo_url': pr_target_repo_url,
1052 1054
1053 1055 'pull_request_source_repo': pr_source_repo,
1054 1056 'pull_request_source_repo_url': pr_source_repo_url,
1055 1057
1056 1058 'pull_request_url': pr_url,
1057 1059 }
1058 1060
1059 1061 # pre-generate the subject for notification itself
1060 1062 (subject,
1061 1063 _h, _e, # we don't care about those
1062 1064 body_plaintext) = EmailNotificationModel().render_email(
1063 1065 notification_type, **kwargs)
1064 1066
1065 1067 # create notification objects, and emails
1066 1068 NotificationModel().create(
1067 1069 created_by=pull_request.author,
1068 1070 notification_subject=subject,
1069 1071 notification_body=body_plaintext,
1070 1072 notification_type=notification_type,
1071 1073 recipients=recipients,
1072 1074 email_kwargs=kwargs,
1073 1075 )
1074 1076
1075 1077 def delete(self, pull_request, user):
1076 1078 pull_request = self.__get_pull_request(pull_request)
1077 1079 old_data = pull_request.get_api_data(with_merge_state=False)
1078 1080 self._cleanup_merge_workspace(pull_request)
1079 1081 self._log_audit_action(
1080 1082 'repo.pull_request.delete', {'old_data': old_data},
1081 1083 user, pull_request)
1082 1084 Session().delete(pull_request)
1083 1085
1084 1086 def close_pull_request(self, pull_request, user):
1085 1087 pull_request = self.__get_pull_request(pull_request)
1086 1088 self._cleanup_merge_workspace(pull_request)
1087 1089 pull_request.status = PullRequest.STATUS_CLOSED
1088 1090 pull_request.updated_on = datetime.datetime.now()
1089 1091 Session().add(pull_request)
1090 1092 self._trigger_pull_request_hook(
1091 1093 pull_request, pull_request.author, 'close')
1092 1094 self._log_audit_action(
1093 1095 'repo.pull_request.close', {}, user, pull_request)
1094 1096
1095 1097 def close_pull_request_with_comment(
1096 1098 self, pull_request, user, repo, message=None):
1097 1099
1098 1100 pull_request_review_status = pull_request.calculated_review_status()
1099 1101
1100 1102 if pull_request_review_status == ChangesetStatus.STATUS_APPROVED:
1101 1103 # approved only if we have voting consent
1102 1104 status = ChangesetStatus.STATUS_APPROVED
1103 1105 else:
1104 1106 status = ChangesetStatus.STATUS_REJECTED
1105 1107 status_lbl = ChangesetStatus.get_status_lbl(status)
1106 1108
1107 1109 default_message = (
1108 1110 _('Closing with status change {transition_icon} {status}.')
1109 1111 ).format(transition_icon='>', status=status_lbl)
1110 1112 text = message or default_message
1111 1113
1112 1114 # create a comment, and link it to new status
1113 1115 comment = CommentsModel().create(
1114 1116 text=text,
1115 1117 repo=repo.repo_id,
1116 1118 user=user.user_id,
1117 1119 pull_request=pull_request.pull_request_id,
1118 1120 status_change=status_lbl,
1119 1121 status_change_type=status,
1120 1122 closing_pr=True
1121 1123 )
1122 1124
1123 1125 # calculate old status before we change it
1124 1126 old_calculated_status = pull_request.calculated_review_status()
1125 1127 ChangesetStatusModel().set_status(
1126 1128 repo.repo_id,
1127 1129 status,
1128 1130 user.user_id,
1129 1131 comment=comment,
1130 1132 pull_request=pull_request.pull_request_id
1131 1133 )
1132 1134
1133 1135 Session().flush()
1134 1136 events.trigger(events.PullRequestCommentEvent(pull_request, comment))
1135 1137 # we now calculate the status of the pull request again, and based
1136 1138 # on that calculation trigger a status change. This can happen when
1137 1139 # a non-reviewer admin closes a PR: their vote doesn't change the
1138 1140 # status, while a reviewer's vote might.
1139 1141 calculated_status = pull_request.calculated_review_status()
1140 1142 if old_calculated_status != calculated_status:
1141 1143 self._trigger_pull_request_hook(
1142 1144 pull_request, user, 'review_status_change')
1143 1145
1144 1146 # finally close the PR
1145 1147 PullRequestModel().close_pull_request(
1146 1148 pull_request.pull_request_id, user)
1147 1149
1148 1150 return comment, status
1149 1151
1150 1152 def merge_status(self, pull_request):
1151 1153 if not self._is_merge_enabled(pull_request):
1152 1154 return False, _('Server-side pull request merging is disabled.')
1153 1155 if pull_request.is_closed():
1154 1156 return False, _('This pull request is closed.')
1155 1157 merge_possible, msg = self._check_repo_requirements(
1156 1158 target=pull_request.target_repo, source=pull_request.source_repo)
1157 1159 if not merge_possible:
1158 1160 return merge_possible, msg
1159 1161
1160 1162 try:
1161 1163 resp = self._try_merge(pull_request)
1162 1164 log.debug("Merge response: %s", resp)
1163 1165 status = resp.possible, self.merge_status_message(
1164 1166 resp.failure_reason)
1165 1167 except NotImplementedError:
1166 1168 status = False, _('Pull request merging is not supported.')
1167 1169
1168 1170 return status
1169 1171
1170 1172 def _check_repo_requirements(self, target, source):
1171 1173 """
1172 1174 Check if `target` and `source` have compatible requirements.
1173 1175
1174 1176 Currently this is just checking for largefiles.
1175 1177 """
1176 1178 target_has_largefiles = self._has_largefiles(target)
1177 1179 source_has_largefiles = self._has_largefiles(source)
1178 1180 merge_possible = True
1179 1181 message = u''
1180 1182
1181 1183 if target_has_largefiles != source_has_largefiles:
1182 1184 merge_possible = False
1183 1185 if source_has_largefiles:
1184 1186 message = _(
1185 1187 'Target repository large files support is disabled.')
1186 1188 else:
1187 1189 message = _(
1188 1190 'Source repository large files support is disabled.')
1189 1191
1190 1192 return merge_possible, message
1191 1193
1192 1194 def _has_largefiles(self, repo):
1193 1195 largefiles_ui = VcsSettingsModel(repo=repo).get_ui_settings(
1194 1196 'extensions', 'largefiles')
1195 1197 return largefiles_ui and largefiles_ui[0].active
1196 1198
1197 1199 def _try_merge(self, pull_request):
1198 1200 """
1199 1201 Try to merge the pull request and return the merge status.
1200 1202 """
1201 1203 log.debug(
1202 1204 "Trying out if the pull request %s can be merged.",
1203 1205 pull_request.pull_request_id)
1204 1206 target_vcs = pull_request.target_repo.scm_instance()
1205 1207
1206 1208 # Refresh the target reference.
1207 1209 try:
1208 1210 target_ref = self._refresh_reference(
1209 1211 pull_request.target_ref_parts, target_vcs)
1210 1212 except CommitDoesNotExistError:
1211 1213 merge_state = MergeResponse(
1212 1214 False, False, None, MergeFailureReason.MISSING_TARGET_REF)
1213 1215 return merge_state
1214 1216
1215 1217 target_locked = pull_request.target_repo.locked
1216 1218 if target_locked and target_locked[0]:
1217 1219 log.debug("The target repository is locked.")
1218 1220 merge_state = MergeResponse(
1219 1221 False, False, None, MergeFailureReason.TARGET_IS_LOCKED)
1220 1222 elif self._needs_merge_state_refresh(pull_request, target_ref):
1221 1223 log.debug("Refreshing the merge status of the repository.")
1222 1224 merge_state = self._refresh_merge_state(
1223 1225 pull_request, target_vcs, target_ref)
1224 1226 else:
1225 1227 possible = pull_request.\
1226 1228 last_merge_status == MergeFailureReason.NONE
1227 1229 merge_state = MergeResponse(
1228 1230 possible, False, None, pull_request.last_merge_status)
1229 1231
1230 1232 return merge_state
1231 1233
1232 1234 def _refresh_reference(self, reference, vcs_repository):
1233 1235 if reference.type in ('branch', 'book'):
1234 1236 name_or_id = reference.name
1235 1237 else:
1236 1238 name_or_id = reference.commit_id
1237 1239 refreshed_commit = vcs_repository.get_commit(name_or_id)
1238 1240 refreshed_reference = Reference(
1239 1241 reference.type, reference.name, refreshed_commit.raw_id)
1240 1242 return refreshed_reference
1241 1243
1242 1244 def _needs_merge_state_refresh(self, pull_request, target_reference):
1243 1245 return not (
1244 1246 pull_request.revisions and
1245 1247 pull_request.revisions[0] == pull_request._last_merge_source_rev and
1246 1248 target_reference.commit_id == pull_request._last_merge_target_rev)
1247 1249
1248 1250 def _refresh_merge_state(self, pull_request, target_vcs, target_reference):
1249 1251 workspace_id = self._workspace_id(pull_request)
1250 1252 source_vcs = pull_request.source_repo.scm_instance()
1251 1253 use_rebase = self._use_rebase_for_merging(pull_request)
1254 close_branch = self._close_branch_before_merging(pull_request)
1252 1255 merge_state = target_vcs.merge(
1253 1256 target_reference, source_vcs, pull_request.source_ref_parts,
1254 workspace_id, dry_run=True, use_rebase=use_rebase)
1257 workspace_id, dry_run=True, use_rebase=use_rebase,
1258 close_branch=close_branch)
1255 1259
1256 1260 # Do not store the response if there was an unknown error.
1257 1261 if merge_state.failure_reason != MergeFailureReason.UNKNOWN:
1258 1262 pull_request._last_merge_source_rev = \
1259 1263 pull_request.source_ref_parts.commit_id
1260 1264 pull_request._last_merge_target_rev = target_reference.commit_id
1261 1265 pull_request.last_merge_status = merge_state.failure_reason
1262 1266 pull_request.shadow_merge_ref = merge_state.merge_ref
1263 1267 Session().add(pull_request)
1264 1268 Session().commit()
1265 1269
1266 1270 return merge_state
1267 1271
1268 1272 def _workspace_id(self, pull_request):
1269 1273 workspace_id = 'pr-%s' % pull_request.pull_request_id
1270 1274 return workspace_id
1271 1275
1272 1276 def merge_status_message(self, status_code):
1273 1277 """
1274 1278 Return a human friendly error message for the given merge status code.
1275 1279 """
1276 1280 return self.MERGE_STATUS_MESSAGES[status_code]
1277 1281
1278 1282 def generate_repo_data(self, repo, commit_id=None, branch=None,
1279 1283 bookmark=None):
1280 1284 all_refs, selected_ref = \
1281 1285 self._get_repo_pullrequest_sources(
1282 1286 repo.scm_instance(), commit_id=commit_id,
1283 1287 branch=branch, bookmark=bookmark)
1284 1288
1285 1289 refs_select2 = []
1286 1290 for element in all_refs:
1287 1291 children = [{'id': x[0], 'text': x[1]} for x in element[0]]
1288 1292 refs_select2.append({'text': element[1], 'children': children})
1289 1293
1290 1294 return {
1291 1295 'user': {
1292 1296 'user_id': repo.user.user_id,
1293 1297 'username': repo.user.username,
1294 1298 'firstname': repo.user.first_name,
1295 1299 'lastname': repo.user.last_name,
1296 1300 'gravatar_link': h.gravatar_url(repo.user.email, 14),
1297 1301 },
1298 1302 'description': h.chop_at_smart(repo.description_safe, '\n'),
1299 1303 'refs': {
1300 1304 'all_refs': all_refs,
1301 1305 'selected_ref': selected_ref,
1302 1306 'select2_refs': refs_select2
1303 1307 }
1304 1308 }
1305 1309
1306 1310 def generate_pullrequest_title(self, source, source_ref, target):
1307 1311 return u'{source}#{at_ref} to {target}'.format(
1308 1312 source=source,
1309 1313 at_ref=source_ref,
1310 1314 target=target,
1311 1315 )
1312 1316
1313 1317 def _cleanup_merge_workspace(self, pull_request):
1314 1318 # Merging related cleanup
1315 1319 target_scm = pull_request.target_repo.scm_instance()
1316 1320 workspace_id = 'pr-%s' % pull_request.pull_request_id
1317 1321
1318 1322 try:
1319 1323 target_scm.cleanup_merge_workspace(workspace_id)
1320 1324 except NotImplementedError:
1321 1325 pass
1322 1326
1323 1327 def _get_repo_pullrequest_sources(
1324 1328 self, repo, commit_id=None, branch=None, bookmark=None):
1325 1329 """
1326 1330 Return a structure with repo's interesting commits, suitable for
1327 1331 the selectors in pullrequest controller
1328 1332
1329 1333 :param commit_id: a commit that must be in the list somehow
1330 1334 and selected by default
1331 1335 :param branch: a branch that must be in the list and selected
1332 1336 by default - even if closed
1333 1337 :param bookmark: a bookmark that must be in the list and selected by default
1334 1338 """
1335 1339
1336 1340 commit_id = safe_str(commit_id) if commit_id else None
1337 1341 branch = safe_str(branch) if branch else None
1338 1342 bookmark = safe_str(bookmark) if bookmark else None
1339 1343
1340 1344 selected = None
1341 1345
1342 1346 # order matters: first source that has commit_id in it will be selected
1343 1347 sources = []
1344 1348 sources.append(('book', repo.bookmarks.items(), _('Bookmarks'), bookmark))
1345 1349 sources.append(('branch', repo.branches.items(), _('Branches'), branch))
1346 1350
1347 1351 if commit_id:
1348 1352 ref_commit = (h.short_id(commit_id), commit_id)
1349 1353 sources.append(('rev', [ref_commit], _('Commit IDs'), commit_id))
1350 1354
1351 1355 sources.append(
1352 1356 ('branch', repo.branches_closed.items(), _('Closed Branches'), branch),
1353 1357 )
1354 1358
1355 1359 groups = []
1356 1360 for group_key, ref_list, group_name, match in sources:
1357 1361 group_refs = []
1358 1362 for ref_name, ref_id in ref_list:
1359 1363 ref_key = '%s:%s:%s' % (group_key, ref_name, ref_id)
1360 1364 group_refs.append((ref_key, ref_name))
1361 1365
1362 1366 if not selected:
1363 1367 if set([commit_id, match]) & set([ref_id, ref_name]):
1364 1368 selected = ref_key
1365 1369
1366 1370 if group_refs:
1367 1371 groups.append((group_refs, group_name))
1368 1372
1369 1373 if not selected:
1370 1374 ref = commit_id or branch or bookmark
1371 1375 if ref:
1372 1376 raise CommitDoesNotExistError(
1373 1377 'No commit refs could be found matching: %s' % ref)
1374 1378 elif repo.DEFAULT_BRANCH_NAME in repo.branches:
1375 1379 selected = 'branch:%s:%s' % (
1376 1380 repo.DEFAULT_BRANCH_NAME,
1377 1381 repo.branches[repo.DEFAULT_BRANCH_NAME]
1378 1382 )
1379 1383 elif repo.commit_ids:
1380 1384 rev = repo.commit_ids[0]
1381 1385 selected = 'rev:%s:%s' % (rev, rev)
1382 1386 else:
1383 1387 raise EmptyRepositoryError()
1384 1388 return groups, selected
1385 1389
1386 1390 def get_diff(self, source_repo, source_ref_id, target_ref_id, context=DIFF_CONTEXT):
1387 1391 return self._get_diff_from_pr_or_version(
1388 1392 source_repo, source_ref_id, target_ref_id, context=context)
1389 1393
1390 1394 def _get_diff_from_pr_or_version(
1391 1395 self, source_repo, source_ref_id, target_ref_id, context):
1392 1396 target_commit = source_repo.get_commit(
1393 1397 commit_id=safe_str(target_ref_id))
1394 1398 source_commit = source_repo.get_commit(
1395 1399 commit_id=safe_str(source_ref_id))
1396 1400 if isinstance(source_repo, Repository):
1397 1401 vcs_repo = source_repo.scm_instance()
1398 1402 else:
1399 1403 vcs_repo = source_repo
1400 1404
1401 1405 # TODO: johbo: In the context of an update, we cannot reach
1402 1406 # the old commit anymore with our normal mechanisms. It needs
1403 1407 # some sort of special support in the vcs layer to avoid this
1404 1408 # workaround.
1405 1409 if (source_commit.raw_id == vcs_repo.EMPTY_COMMIT_ID and
1406 1410 vcs_repo.alias == 'git'):
1407 1411 source_commit.raw_id = safe_str(source_ref_id)
1408 1412
1409 1413 log.debug('calculating diff between '
1410 1414 'source_ref:%s and target_ref:%s for repo `%s`',
1411 1415 target_ref_id, source_ref_id,
1412 1416 safe_unicode(vcs_repo.path))
1413 1417
1414 1418 vcs_diff = vcs_repo.get_diff(
1415 1419 commit1=target_commit, commit2=source_commit, context=context)
1416 1420 return vcs_diff
1417 1421
1418 1422 def _is_merge_enabled(self, pull_request):
1423 return self._get_general_setting(
1424 pull_request, 'rhodecode_pr_merge_enabled')
1425
1426 def _use_rebase_for_merging(self, pull_request):
1427 return self._get_general_setting(
1428 pull_request, 'rhodecode_hg_use_rebase_for_merging')
1429
1430 def _close_branch_before_merging(self, pull_request):
1431 return self._get_general_setting(
1432 pull_request, 'rhodecode_hg_close_branch_before_merging')
1433
1434 def _get_general_setting(self, pull_request, settings_key, default=False):
1419 1435 settings_model = VcsSettingsModel(repo=pull_request.target_repo)
1420 1436 settings = settings_model.get_general_settings()
1421 return settings.get('rhodecode_pr_merge_enabled', False)
1422
1423 def _use_rebase_for_merging(self, pull_request):
1424 settings_model = VcsSettingsModel(repo=pull_request.target_repo)
1425 settings = settings_model.get_general_settings()
1426 return settings.get('rhodecode_hg_use_rebase_for_merging', False)
1437 return settings.get(settings_key, default)
1427 1438
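The hunk above collapses three near-identical settings lookups (`_is_merge_enabled`, `_use_rebase_for_merging`, and the new `_close_branch_before_merging`) into one `_get_general_setting` helper. A minimal standalone sketch of that consolidation pattern, with a plain dict standing in for `VcsSettingsModel.get_general_settings()`:

```python
# Sketch of the settings-lookup consolidation above; the settings
# dict is a stand-in for VcsSettingsModel.get_general_settings().
def get_general_setting(settings, settings_key, default=False):
    # One parameterized helper replaces a per-key lookup method each.
    return settings.get(settings_key, default)

settings = {'rhodecode_pr_merge_enabled': True}

merge_enabled = get_general_setting(settings, 'rhodecode_pr_merge_enabled')
# A key that is absent falls back to the shared default (False).
close_branch = get_general_setting(
    settings, 'rhodecode_hg_close_branch_before_merging')
```

Each caller now differs only in the key it passes, so adding the new `rhodecode_hg_close_branch_before_merging` option is a three-line method rather than another copy of the query boilerplate.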
1428 1439 def _log_audit_action(self, action, action_data, user, pull_request):
1429 1440 audit_logger.store(
1430 1441 action=action,
1431 1442 action_data=action_data,
1432 1443 user=user,
1433 1444 repo=pull_request.target_repo)
1434 1445
1435 1446 def get_reviewer_functions(self):
1436 1447 """
1437 1448 Fetches functions for validating and fetching default reviewers.
1438 1449 If available, we use the EE package; otherwise we fall back to the
1439 1450 CE package functions.
1440 1451 """
1441 1452 try:
1442 1453 from rc_reviewers.utils import get_default_reviewers_data
1443 1454 from rc_reviewers.utils import validate_default_reviewers
1444 1455 except ImportError:
1445 1456 from rhodecode.apps.repository.utils import \
1446 1457 get_default_reviewers_data
1447 1458 from rhodecode.apps.repository.utils import \
1448 1459 validate_default_reviewers
1449 1460
1450 1461 return get_default_reviewers_data, validate_default_reviewers
1451 1462
1452 1463
1453 1464 class MergeCheck(object):
1454 1465 """
1455 1466 Performs merge checks and returns a check object which stores
1456 1467 information about merge errors and merge conditions.
1457 1468 """
1458 1469 TODO_CHECK = 'todo'
1459 1470 PERM_CHECK = 'perm'
1460 1471 REVIEW_CHECK = 'review'
1461 1472 MERGE_CHECK = 'merge'
1462 1473
1463 1474 def __init__(self):
1464 1475 self.review_status = None
1465 1476 self.merge_possible = None
1466 1477 self.merge_msg = ''
1467 1478 self.failed = None
1468 1479 self.errors = []
1469 1480 self.error_details = OrderedDict()
1470 1481
1471 1482 def push_error(self, error_type, message, error_key, details):
1472 1483 self.failed = True
1473 1484 self.errors.append([error_type, message])
1474 1485 self.error_details[error_key] = dict(
1475 1486 details=details,
1476 1487 error_type=error_type,
1477 1488 message=message
1478 1489 )
1479 1490
1480 1491 @classmethod
1481 1492 def validate(cls, pull_request, user, fail_early=False, translator=None):
1482 1493 # if migrated to pyramid...
1483 1494 # _ = lambda: translator or _ # use passed in translator if any
1484 1495
1485 1496 merge_check = cls()
1486 1497
1487 1498 # permissions to merge
1488 1499 user_allowed_to_merge = PullRequestModel().check_user_merge(
1489 1500 pull_request, user)
1490 1501 if not user_allowed_to_merge:
1491 1502 log.debug("MergeCheck: cannot merge, user not allowed to merge.")
1492 1503
1493 1504 msg = _('User `{}` not allowed to perform merge.').format(user.username)
1494 1505 merge_check.push_error('error', msg, cls.PERM_CHECK, user.username)
1495 1506 if fail_early:
1496 1507 return merge_check
1497 1508
1498 1509 # review status, must be always present
1499 1510 review_status = pull_request.calculated_review_status()
1500 1511 merge_check.review_status = review_status
1501 1512
1502 1513 status_approved = review_status == ChangesetStatus.STATUS_APPROVED
1503 1514 if not status_approved:
1504 1515 log.debug("MergeCheck: cannot merge, approval is pending.")
1505 1516
1506 1517 msg = _('Pull request reviewer approval is pending.')
1507 1518
1508 1519 merge_check.push_error(
1509 1520 'warning', msg, cls.REVIEW_CHECK, review_status)
1510 1521
1511 1522 if fail_early:
1512 1523 return merge_check
1513 1524
1514 1525 # left over TODOs
1515 1526 todos = CommentsModel().get_unresolved_todos(pull_request)
1516 1527 if todos:
1517 1528 log.debug("MergeCheck: cannot merge, {} "
1518 1529 "unresolved todos left.".format(len(todos)))
1519 1530
1520 1531 if len(todos) == 1:
1521 1532 msg = _('Cannot merge, {} TODO still not resolved.').format(
1522 1533 len(todos))
1523 1534 else:
1524 1535 msg = _('Cannot merge, {} TODOs still not resolved.').format(
1525 1536 len(todos))
1526 1537
1527 1538 merge_check.push_error('warning', msg, cls.TODO_CHECK, todos)
1528 1539
1529 1540 if fail_early:
1530 1541 return merge_check
1531 1542
1532 1543 # merge possible
1533 1544 merge_status, msg = PullRequestModel().merge_status(pull_request)
1534 1545 merge_check.merge_possible = merge_status
1535 1546 merge_check.merge_msg = msg
1536 1547 if not merge_status:
1537 1548 log.debug(
1538 1549 "MergeCheck: cannot merge, pull request merge not possible.")
1539 1550 merge_check.push_error('warning', msg, cls.MERGE_CHECK, None)
1540 1551
1541 1552 if fail_early:
1542 1553 return merge_check
1543 1554
1544 1555 log.debug('MergeCheck: is failed: %s', merge_check.failed)
1545 1556 return merge_check
1546 1557
1547 1558 @classmethod
1548 1559 def get_merge_conditions(cls, pull_request):
1549 1560 merge_details = {}
1550 1561
1551 1562 model = PullRequestModel()
1552 1563 use_rebase = model._use_rebase_for_merging(pull_request)
1553 1564
1554 1565 if use_rebase:
1555 1566 merge_details['merge_strategy'] = dict(
1556 1567 details={},
1557 1568 message=_('Merge strategy: rebase')
1558 1569 )
1559 1570 else:
1560 1571 merge_details['merge_strategy'] = dict(
1561 1572 details={},
1562 1573 message=_('Merge strategy: explicit merge commit')
1563 1574 )
1564 1575
1565 1576 close_branch = model._close_branch_before_merging(pull_request)
1566 1577 if close_branch:
1567 1578 repo_type = pull_request.target_repo.repo_type
1568 1579 if repo_type == 'hg':
1569 1580 close_msg = _('Source branch will be closed after merge.')
1570 1581 else:  # git: the branch is deleted rather than closed
1571 1582 close_msg = _('Source branch will be deleted after merge.')
1572 1583
1573 1584 merge_details['close_branch'] = dict(
1574 1585 details={},
1575 1586 message=close_msg
1576 1587 )
1577 1588
1578 1589 return merge_details
1579 1590
1580 1591 ChangeTuple = namedtuple('ChangeTuple',
1581 1592 ['added', 'common', 'removed', 'total'])
1582 1593
1583 1594 FileChangeTuple = namedtuple('FileChangeTuple',
1584 1595 ['added', 'modified', 'removed'])
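The core of this changeset is threading a `close_branch` flag from the new `_close_branch_before_merging` setting into the backend `merge()` call (both in the dry run inside `_refresh_merge_state` and, by extension, the real merge). A hedged sketch of that keyword-argument plumbing, with a stub in place of the real VCS backend:

```python
# Sketch of how the new close_branch flag reaches the backend merge
# call; merge() here is a stub, not the real target_vcs.merge().
def merge(target_ref, source_repo, source_ref, workspace_id,
          dry_run=False, use_rebase=False, close_branch=False):
    # The backend receives the flag and, on a successful non-dry-run
    # merge, can close the source branch (hg) or delete it (git).
    return {'dry_run': dry_run,
            'use_rebase': use_rebase,
            'close_branch': close_branch}

# Mirrors the dry-run call in _refresh_merge_state above.
state = merge('target-ref', 'source-repo', 'source-ref', 'pr-1',
              dry_run=True, use_rebase=False, close_branch=True)
```

Because the flag defaults to `False`, existing callers and backends that ignore the keyword keep their previous behavior.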
@@ -1,811 +1,812 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2017 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22 import hashlib
23 23 import logging
24 24 from collections import namedtuple
25 25 from functools import wraps
26 26
27 27 from rhodecode.lib import caches
28 28 from rhodecode.lib.utils2 import (
29 29 Optional, AttributeDict, safe_str, remove_prefix, str2bool)
30 30 from rhodecode.lib.vcs.backends import base
31 31 from rhodecode.model import BaseModel
32 32 from rhodecode.model.db import (
33 33 RepoRhodeCodeUi, RepoRhodeCodeSetting, RhodeCodeUi, RhodeCodeSetting)
34 34 from rhodecode.model.meta import Session
35 35
36 36
37 37 log = logging.getLogger(__name__)
38 38
39 39
40 40 UiSetting = namedtuple(
41 41 'UiSetting', ['section', 'key', 'value', 'active'])
42 42
43 43 SOCIAL_PLUGINS_LIST = ['github', 'bitbucket', 'twitter', 'google']
44 44
45 45
46 46 class SettingNotFound(Exception):
47 47 def __init__(self):
48 48 super(SettingNotFound, self).__init__('Setting is not found')
49 49
50 50
51 51 class SettingsModel(BaseModel):
52 52 BUILTIN_HOOKS = (
53 53 RhodeCodeUi.HOOK_REPO_SIZE, RhodeCodeUi.HOOK_PUSH,
54 54 RhodeCodeUi.HOOK_PRE_PUSH, RhodeCodeUi.HOOK_PRETX_PUSH,
55 55 RhodeCodeUi.HOOK_PULL, RhodeCodeUi.HOOK_PRE_PULL,
56 56 RhodeCodeUi.HOOK_PUSH_KEY,)
57 57 HOOKS_SECTION = 'hooks'
58 58
59 59 def __init__(self, sa=None, repo=None):
60 60 self.repo = repo
61 61 self.UiDbModel = RepoRhodeCodeUi if repo else RhodeCodeUi
62 62 self.SettingsDbModel = (
63 63 RepoRhodeCodeSetting if repo else RhodeCodeSetting)
64 64 super(SettingsModel, self).__init__(sa)
65 65
66 66 def get_ui_by_key(self, key):
67 67 q = self.UiDbModel.query()
68 68 q = q.filter(self.UiDbModel.ui_key == key)
69 69 q = self._filter_by_repo(RepoRhodeCodeUi, q)
70 70 return q.scalar()
71 71
72 72 def get_ui_by_section(self, section):
73 73 q = self.UiDbModel.query()
74 74 q = q.filter(self.UiDbModel.ui_section == section)
75 75 q = self._filter_by_repo(RepoRhodeCodeUi, q)
76 76 return q.all()
77 77
78 78 def get_ui_by_section_and_key(self, section, key):
79 79 q = self.UiDbModel.query()
80 80 q = q.filter(self.UiDbModel.ui_section == section)
81 81 q = q.filter(self.UiDbModel.ui_key == key)
82 82 q = self._filter_by_repo(RepoRhodeCodeUi, q)
83 83 return q.scalar()
84 84
85 85 def get_ui(self, section=None, key=None):
86 86 q = self.UiDbModel.query()
87 87 q = self._filter_by_repo(RepoRhodeCodeUi, q)
88 88
89 89 if section:
90 90 q = q.filter(self.UiDbModel.ui_section == section)
91 91 if key:
92 92 q = q.filter(self.UiDbModel.ui_key == key)
93 93
94 94 # TODO: mikhail: add caching
95 95 result = [
96 96 UiSetting(
97 97 section=safe_str(r.ui_section), key=safe_str(r.ui_key),
98 98 value=safe_str(r.ui_value), active=r.ui_active
99 99 )
100 100 for r in q.all()
101 101 ]
102 102 return result
103 103
104 104 def get_builtin_hooks(self):
105 105 q = self.UiDbModel.query()
106 106 q = q.filter(self.UiDbModel.ui_key.in_(self.BUILTIN_HOOKS))
107 107 return self._get_hooks(q)
108 108
109 109 def get_custom_hooks(self):
110 110 q = self.UiDbModel.query()
111 111 q = q.filter(~self.UiDbModel.ui_key.in_(self.BUILTIN_HOOKS))
112 112 return self._get_hooks(q)
113 113
114 114 def create_ui_section_value(self, section, val, key=None, active=True):
115 115 new_ui = self.UiDbModel()
116 116 new_ui.ui_section = section
117 117 new_ui.ui_value = val
118 118 new_ui.ui_active = active
119 119
120 120 if self.repo:
121 121 repo = self._get_repo(self.repo)
122 122 repository_id = repo.repo_id
123 123 new_ui.repository_id = repository_id
124 124
125 125 if not key:
126 126 # keys are unique so they need appended info
127 127 if self.repo:
128 128 key = hashlib.sha1(
129 129 '{}{}{}'.format(section, val, repository_id)).hexdigest()
130 130 else:
131 131 key = hashlib.sha1('{}{}'.format(section, val)).hexdigest()
132 132
133 133 new_ui.ui_key = key
134 134
135 135 Session().add(new_ui)
136 136 return new_ui
137 137
138 138 def create_or_update_hook(self, key, value):
139 139 ui = (
140 140 self.get_ui_by_section_and_key(self.HOOKS_SECTION, key) or
141 141 self.UiDbModel())
142 142 ui.ui_section = self.HOOKS_SECTION
143 143 ui.ui_active = True
144 144 ui.ui_key = key
145 145 ui.ui_value = value
146 146
147 147 if self.repo:
148 148 repo = self._get_repo(self.repo)
149 149 repository_id = repo.repo_id
150 150 ui.repository_id = repository_id
151 151
152 152 Session().add(ui)
153 153 return ui
154 154
155 155 def delete_ui(self, id_):
156 156 ui = self.UiDbModel.get(id_)
157 157 if not ui:
158 158 raise SettingNotFound()
159 159 Session().delete(ui)
160 160
161 161 def get_setting_by_name(self, name):
162 162 q = self._get_settings_query()
163 163 q = q.filter(self.SettingsDbModel.app_settings_name == name)
164 164 return q.scalar()
165 165
166 166 def create_or_update_setting(
167 167 self, name, val=Optional(''), type_=Optional('unicode')):
168 168 """
169 169 Creates or updates a RhodeCode setting. If an update is triggered,
170 170 only explicitly set parameters are updated; Optional instances
171 171 will be skipped.
172 172
173 173 :param name:
174 174 :param val:
175 175 :param type_:
176 176 :return:
177 177 """
178 178
179 179 res = self.get_setting_by_name(name)
180 180 repo = self._get_repo(self.repo) if self.repo else None
181 181
182 182 if not res:
183 183 val = Optional.extract(val)
184 184 type_ = Optional.extract(type_)
185 185
186 186 args = (
187 187 (repo.repo_id, name, val, type_)
188 188 if repo else (name, val, type_))
189 189 res = self.SettingsDbModel(*args)
190 190
191 191 else:
192 192 if self.repo:
193 193 res.repository_id = repo.repo_id
194 194
195 195 res.app_settings_name = name
196 196 if not isinstance(type_, Optional):
197 197 # update if set
198 198 res.app_settings_type = type_
199 199 if not isinstance(val, Optional):
200 200 # update if set
201 201 res.app_settings_value = val
202 202
203 203 Session().add(res)
204 204 return res
205 205
206 206 def invalidate_settings_cache(self):
207 207 namespace = 'rhodecode_settings'
208 208 cache_manager = caches.get_cache_manager('sql_cache_short', namespace)
209 209 caches.clear_cache_manager(cache_manager)
210 210
211 211 def get_all_settings(self, cache=False):
212 212
213 213 def _compute():
214 214 q = self._get_settings_query()
215 215 if not q:
216 216 raise Exception('Could not get application settings!')
217 217
218 218 settings = {
219 219 'rhodecode_' + result.app_settings_name: result.app_settings_value
220 220 for result in q
221 221 }
222 222 return settings
223 223
224 224 if cache:
225 225 log.debug('Fetching app settings using cache')
226 226 repo = self._get_repo(self.repo) if self.repo else None
227 227 namespace = 'rhodecode_settings'
228 228 cache_manager = caches.get_cache_manager(
229 229 'sql_cache_short', namespace)
230 230 _cache_key = (
231 231 "get_repo_{}_settings".format(repo.repo_id)
232 232 if repo else "get_app_settings")
233 233
234 234 return cache_manager.get(_cache_key, createfunc=_compute)
235 235
236 236 else:
237 237 return _compute()
238 238
239 239 def get_auth_settings(self):
240 240 q = self._get_settings_query()
241 241 q = q.filter(
242 242 self.SettingsDbModel.app_settings_name.startswith('auth_'))
243 243 rows = q.all()
244 244 auth_settings = {
245 245 row.app_settings_name: row.app_settings_value for row in rows}
246 246 return auth_settings
247 247
248 248 def get_auth_plugins(self):
249 249 auth_plugins = self.get_setting_by_name("auth_plugins")
250 250 return auth_plugins.app_settings_value
251 251
252 252 def get_default_repo_settings(self, strip_prefix=False):
253 253 q = self._get_settings_query()
254 254 q = q.filter(
255 255 self.SettingsDbModel.app_settings_name.startswith('default_'))
256 256 rows = q.all()
257 257
258 258 result = {}
259 259 for row in rows:
260 260 key = row.app_settings_name
261 261 if strip_prefix:
262 262 key = remove_prefix(key, prefix='default_')
263 263 result.update({key: row.app_settings_value})
264 264 return result
265 265
266 266 def get_repo(self):
267 267 repo = self._get_repo(self.repo)
268 268 if not repo:
269 269 raise Exception(
270 270 'Repository `{}` cannot be found inside the database'.format(
271 271 self.repo))
272 272 return repo
273 273
274 274 def _filter_by_repo(self, model, query):
275 275 if self.repo:
276 276 repo = self.get_repo()
277 277 query = query.filter(model.repository_id == repo.repo_id)
278 278 return query
279 279
280 280 def _get_hooks(self, query):
281 281 query = query.filter(self.UiDbModel.ui_section == self.HOOKS_SECTION)
282 282 query = self._filter_by_repo(RepoRhodeCodeUi, query)
283 283 return query.all()
284 284
285 285 def _get_settings_query(self):
286 286 q = self.SettingsDbModel.query()
287 287 return self._filter_by_repo(RepoRhodeCodeSetting, q)
288 288
289 289 def list_enabled_social_plugins(self, settings):
290 290 enabled = []
291 291 for plug in SOCIAL_PLUGINS_LIST:
292 292 if str2bool(settings.get(
293 293 'rhodecode_auth_{}_enabled'.format(plug))):
294 294 enabled.append(plug)
295 295 return enabled
296 296
297 297
298 298 def assert_repo_settings(func):
299 299 @wraps(func)
300 300 def _wrapper(self, *args, **kwargs):
301 301 if not self.repo_settings:
302 302 raise Exception('Repository is not specified')
303 303 return func(self, *args, **kwargs)
304 304 return _wrapper
305 305
306 306
307 307 class IssueTrackerSettingsModel(object):
308 308 INHERIT_SETTINGS = 'inherit_issue_tracker_settings'
309 309 SETTINGS_PREFIX = 'issuetracker_'
310 310
311 311 def __init__(self, sa=None, repo=None):
312 312 self.global_settings = SettingsModel(sa=sa)
313 313 self.repo_settings = SettingsModel(sa=sa, repo=repo) if repo else None
314 314
315 315 @property
316 316 def inherit_global_settings(self):
317 317 if not self.repo_settings:
318 318 return True
319 319 setting = self.repo_settings.get_setting_by_name(self.INHERIT_SETTINGS)
320 320 return setting.app_settings_value if setting else True
321 321
322 322 @inherit_global_settings.setter
323 323 def inherit_global_settings(self, value):
324 324 if self.repo_settings:
325 325 settings = self.repo_settings.create_or_update_setting(
326 326 self.INHERIT_SETTINGS, value, type_='bool')
327 327 Session().add(settings)
328 328
329 329 def _get_keyname(self, key, uid, prefix=''):
330 330 return '{0}{1}{2}_{3}'.format(
331 331 prefix, self.SETTINGS_PREFIX, key, uid)
332 332
333 333 def _make_dict_for_settings(self, qs):
334 334 prefix_match = self._get_keyname('pat', '', 'rhodecode_')
335 335
336 336 issuetracker_entries = {}
337 337 # create keys
338 338 for k, v in qs.items():
339 339 if k.startswith(prefix_match):
340 340 uid = k[len(prefix_match):]
341 341 issuetracker_entries[uid] = None
342 342
343 343 # populate
344 344 for uid in issuetracker_entries:
345 345 issuetracker_entries[uid] = AttributeDict({
346 346 'pat': qs.get(self._get_keyname('pat', uid, 'rhodecode_')),
347 347 'url': qs.get(self._get_keyname('url', uid, 'rhodecode_')),
348 348 'pref': qs.get(self._get_keyname('pref', uid, 'rhodecode_')),
349 349 'desc': qs.get(self._get_keyname('desc', uid, 'rhodecode_')),
350 350 })
351 351 return issuetracker_entries
352 352
353 353 def get_global_settings(self, cache=False):
354 354 """
355 355 Returns list of global issue tracker settings
356 356 """
357 357 defaults = self.global_settings.get_all_settings(cache=cache)
358 358 settings = self._make_dict_for_settings(defaults)
359 359 return settings
360 360
361 361 def get_repo_settings(self, cache=False):
362 362 """
363 363 Returns list of issue tracker settings per repository
364 364 """
365 365 if not self.repo_settings:
366 366 raise Exception('Repository is not specified')
367 367 all_settings = self.repo_settings.get_all_settings(cache=cache)
368 368 settings = self._make_dict_for_settings(all_settings)
369 369 return settings
370 370
371 371 def get_settings(self, cache=False):
372 372 if self.inherit_global_settings:
373 373 return self.get_global_settings(cache=cache)
374 374 else:
375 375 return self.get_repo_settings(cache=cache)
376 376
377 377 def delete_entries(self, uid):
378 378 if self.repo_settings:
379 379 all_patterns = self.get_repo_settings()
380 380 settings_model = self.repo_settings
381 381 else:
382 382 all_patterns = self.get_global_settings()
383 383 settings_model = self.global_settings
384 384 entries = all_patterns.get(uid)
385 385
386 386 for del_key in entries:
387 387 setting_name = self._get_keyname(del_key, uid)
388 388 entry = settings_model.get_setting_by_name(setting_name)
389 389 if entry:
390 390 Session().delete(entry)
391 391
392 392 Session().commit()
393 393
394 394 def create_or_update_setting(
395 395 self, name, val=Optional(''), type_=Optional('unicode')):
396 396 if self.repo_settings:
397 397 setting = self.repo_settings.create_or_update_setting(
398 398 name, val, type_)
399 399 else:
400 400 setting = self.global_settings.create_or_update_setting(
401 401 name, val, type_)
402 402 return setting
403 403
404 404
405 405 class VcsSettingsModel(object):
406 406
407 407 INHERIT_SETTINGS = 'inherit_vcs_settings'
408 408 GENERAL_SETTINGS = (
409 409 'use_outdated_comments',
410 410 'pr_merge_enabled',
411 'hg_use_rebase_for_merging')
411 'hg_use_rebase_for_merging',
412 'hg_close_branch_before_merging')
412 413
413 414 HOOKS_SETTINGS = (
414 415 ('hooks', 'changegroup.repo_size'),
415 416 ('hooks', 'changegroup.push_logger'),
416 417 ('hooks', 'outgoing.pull_logger'),)
417 418 HG_SETTINGS = (
418 419 ('extensions', 'largefiles'),
419 420 ('phases', 'publish'),
420 421 ('extensions', 'evolve'),)
421 422 GIT_SETTINGS = (
422 423 ('vcs_git_lfs', 'enabled'),)
423 424 GLOBAL_HG_SETTINGS = (
424 425 ('extensions', 'largefiles'),
425 426 ('largefiles', 'usercache'),
426 427 ('phases', 'publish'),
427 428 ('extensions', 'hgsubversion'),
428 429 ('extensions', 'evolve'),)
429 430 GLOBAL_GIT_SETTINGS = (
430 431 ('vcs_git_lfs', 'enabled'),
431 432 ('vcs_git_lfs', 'store_location'))
432 433 GLOBAL_SVN_SETTINGS = (
433 434 ('vcs_svn_proxy', 'http_requests_enabled'),
434 435 ('vcs_svn_proxy', 'http_server_url'))
435 436
436 437 SVN_BRANCH_SECTION = 'vcs_svn_branch'
437 438 SVN_TAG_SECTION = 'vcs_svn_tag'
438 439 SSL_SETTING = ('web', 'push_ssl')
439 440 PATH_SETTING = ('paths', '/')
440 441
441 442 def __init__(self, sa=None, repo=None):
442 443 self.global_settings = SettingsModel(sa=sa)
443 444 self.repo_settings = SettingsModel(sa=sa, repo=repo) if repo else None
444 445 self._ui_settings = (
445 446 self.HG_SETTINGS + self.GIT_SETTINGS + self.HOOKS_SETTINGS)
446 447 self._svn_sections = (self.SVN_BRANCH_SECTION, self.SVN_TAG_SECTION)
447 448
448 449 @property
449 450 @assert_repo_settings
450 451 def inherit_global_settings(self):
451 452 setting = self.repo_settings.get_setting_by_name(self.INHERIT_SETTINGS)
452 453 return setting.app_settings_value if setting else True
453 454
454 455 @inherit_global_settings.setter
455 456 @assert_repo_settings
456 457 def inherit_global_settings(self, value):
457 458 self.repo_settings.create_or_update_setting(
458 459 self.INHERIT_SETTINGS, value, type_='bool')
459 460
460 461 def get_global_svn_branch_patterns(self):
461 462 return self.global_settings.get_ui_by_section(self.SVN_BRANCH_SECTION)
462 463
463 464 @assert_repo_settings
464 465 def get_repo_svn_branch_patterns(self):
465 466 return self.repo_settings.get_ui_by_section(self.SVN_BRANCH_SECTION)
466 467
467 468 def get_global_svn_tag_patterns(self):
468 469 return self.global_settings.get_ui_by_section(self.SVN_TAG_SECTION)
469 470
470 471 @assert_repo_settings
471 472 def get_repo_svn_tag_patterns(self):
472 473 return self.repo_settings.get_ui_by_section(self.SVN_TAG_SECTION)
473 474
474 475 def get_global_settings(self):
475 476 return self._collect_all_settings(global_=True)
476 477
477 478 @assert_repo_settings
478 479 def get_repo_settings(self):
479 480 return self._collect_all_settings(global_=False)
480 481
481 482 @assert_repo_settings
482 483 def create_or_update_repo_settings(
483 484 self, data, inherit_global_settings=False):
484 485 from rhodecode.model.scm import ScmModel
485 486
486 487 self.inherit_global_settings = inherit_global_settings
487 488
488 489 repo = self.repo_settings.get_repo()
489 490 if not inherit_global_settings:
490 491 if repo.repo_type == 'svn':
491 492 self.create_repo_svn_settings(data)
492 493 else:
493 494 self.create_or_update_repo_hook_settings(data)
494 495 self.create_or_update_repo_pr_settings(data)
495 496
496 497 if repo.repo_type == 'hg':
497 498 self.create_or_update_repo_hg_settings(data)
498 499
499 500 if repo.repo_type == 'git':
500 501 self.create_or_update_repo_git_settings(data)
501 502
502 503 ScmModel().mark_for_invalidation(repo.repo_name, delete=True)
503 504
504 505 @assert_repo_settings
505 506 def create_or_update_repo_hook_settings(self, data):
506 507 for section, key in self.HOOKS_SETTINGS:
507 508 data_key = self._get_form_ui_key(section, key)
508 509 if data_key not in data:
509 510 raise ValueError(
510 511 'The given data does not contain {} key'.format(data_key))
511 512
512 513 active = data.get(data_key)
513 514 repo_setting = self.repo_settings.get_ui_by_section_and_key(
514 515 section, key)
515 516 if not repo_setting:
516 517 global_setting = self.global_settings.\
517 518 get_ui_by_section_and_key(section, key)
518 519 self.repo_settings.create_ui_section_value(
519 520 section, global_setting.ui_value, key=key, active=active)
520 521 else:
521 522 repo_setting.ui_active = active
522 523 Session().add(repo_setting)
523 524
524 525 def update_global_hook_settings(self, data):
525 526 for section, key in self.HOOKS_SETTINGS:
526 527 data_key = self._get_form_ui_key(section, key)
527 528 if data_key not in data:
528 529 raise ValueError(
529 530 'The given data does not contain {} key'.format(data_key))
530 531 active = data.get(data_key)
531 532 repo_setting = self.global_settings.get_ui_by_section_and_key(
532 533 section, key)
533 534 repo_setting.ui_active = active
534 535 Session().add(repo_setting)
535 536
536 537 @assert_repo_settings
537 538 def create_or_update_repo_pr_settings(self, data):
538 539 return self._create_or_update_general_settings(
539 540 self.repo_settings, data)
540 541
541 542 def create_or_update_global_pr_settings(self, data):
542 543 return self._create_or_update_general_settings(
543 544 self.global_settings, data)
544 545
545 546 @assert_repo_settings
546 547 def create_repo_svn_settings(self, data):
547 548 return self._create_svn_settings(self.repo_settings, data)
548 549
549 550 @assert_repo_settings
550 551 def create_or_update_repo_hg_settings(self, data):
551 552 largefiles, phases, evolve = \
552 553 self.HG_SETTINGS
553 554 largefiles_key, phases_key, evolve_key = \
554 555 self._get_settings_keys(self.HG_SETTINGS, data)
555 556
556 557 self._create_or_update_ui(
557 558 self.repo_settings, *largefiles, value='',
558 559 active=data[largefiles_key])
559 560 self._create_or_update_ui(
560 561 self.repo_settings, *evolve, value='',
561 562 active=data[evolve_key])
562 563 self._create_or_update_ui(
563 564 self.repo_settings, *phases, value=safe_str(data[phases_key]))
564 565
565 566 def create_or_update_global_hg_settings(self, data):
566 567 largefiles, largefiles_store, phases, hgsubversion, evolve \
567 568 = self.GLOBAL_HG_SETTINGS
568 569 largefiles_key, largefiles_store_key, phases_key, subversion_key, evolve_key \
569 570 = self._get_settings_keys(self.GLOBAL_HG_SETTINGS, data)
570 571
571 572 self._create_or_update_ui(
572 573 self.global_settings, *largefiles, value='',
573 574 active=data[largefiles_key])
574 575 self._create_or_update_ui(
575 576 self.global_settings, *largefiles_store,
576 577 value=data[largefiles_store_key])
577 578 self._create_or_update_ui(
578 579 self.global_settings, *phases, value=safe_str(data[phases_key]))
579 580 self._create_or_update_ui(
580 581 self.global_settings, *hgsubversion, active=data[subversion_key])
581 582 self._create_or_update_ui(
582 583 self.global_settings, *evolve, value='',
583 584 active=data[evolve_key])
584 585
585 586 def create_or_update_repo_git_settings(self, data):
586 587 # NOTE(marcink): # comma make unpack work properly
587 588 lfs_enabled, \
588 589 = self.GIT_SETTINGS
589 590
590 591 lfs_enabled_key, \
591 592 = self._get_settings_keys(self.GIT_SETTINGS, data)
592 593
593 594 self._create_or_update_ui(
594 595 self.repo_settings, *lfs_enabled, value=data[lfs_enabled_key],
595 596 active=data[lfs_enabled_key])
596 597
597 598 def create_or_update_global_git_settings(self, data):
598 599 lfs_enabled, lfs_store_location \
599 600 = self.GLOBAL_GIT_SETTINGS
600 601 lfs_enabled_key, lfs_store_location_key \
601 602 = self._get_settings_keys(self.GLOBAL_GIT_SETTINGS, data)
602 603
603 604 self._create_or_update_ui(
604 605 self.global_settings, *lfs_enabled, value=data[lfs_enabled_key],
605 606 active=data[lfs_enabled_key])
606 607 self._create_or_update_ui(
607 608 self.global_settings, *lfs_store_location,
608 609 value=data[lfs_store_location_key])
609 610
610 611 def create_or_update_global_svn_settings(self, data):
611 612 # branch/tags patterns
612 613 self._create_svn_settings(self.global_settings, data)
613 614
614 615 http_requests_enabled, http_server_url = self.GLOBAL_SVN_SETTINGS
615 616 http_requests_enabled_key, http_server_url_key = self._get_settings_keys(
616 617 self.GLOBAL_SVN_SETTINGS, data)
617 618
618 619 self._create_or_update_ui(
619 620 self.global_settings, *http_requests_enabled,
620 621 value=safe_str(data[http_requests_enabled_key]))
621 622 self._create_or_update_ui(
622 623 self.global_settings, *http_server_url,
623 624 value=data[http_server_url_key])
624 625
625 626 def update_global_ssl_setting(self, value):
626 627 self._create_or_update_ui(
627 628 self.global_settings, *self.SSL_SETTING, value=value)
628 629
629 630 def update_global_path_setting(self, value):
630 631 self._create_or_update_ui(
631 632 self.global_settings, *self.PATH_SETTING, value=value)
632 633
633 634 @assert_repo_settings
634 635 def delete_repo_svn_pattern(self, id_):
635 636 self.repo_settings.delete_ui(id_)
636 637
637 638 def delete_global_svn_pattern(self, id_):
638 639 self.global_settings.delete_ui(id_)
639 640
640 641 @assert_repo_settings
641 642 def get_repo_ui_settings(self, section=None, key=None):
642 643 global_uis = self.global_settings.get_ui(section, key)
643 644 repo_uis = self.repo_settings.get_ui(section, key)
644 645 filtered_repo_uis = self._filter_ui_settings(repo_uis)
645 646 filtered_repo_uis_keys = [
646 647 (s.section, s.key) for s in filtered_repo_uis]
647 648
648 649 def _is_global_ui_filtered(ui):
649 650 return (
650 651 (ui.section, ui.key) in filtered_repo_uis_keys
651 652 or ui.section in self._svn_sections)
652 653
653 654 filtered_global_uis = [
654 655 ui for ui in global_uis if not _is_global_ui_filtered(ui)]
655 656
656 657 return filtered_global_uis + filtered_repo_uis
657 658
658 659 def get_global_ui_settings(self, section=None, key=None):
659 660 return self.global_settings.get_ui(section, key)
660 661
661 662 def get_ui_settings_as_config_obj(self, section=None, key=None):
662 663 config = base.Config()
663 664
664 665 ui_settings = self.get_ui_settings(section=section, key=key)
665 666
666 667 for entry in ui_settings:
667 668 config.set(entry.section, entry.key, entry.value)
668 669
669 670 return config
670 671
671 672 def get_ui_settings(self, section=None, key=None):
672 673 if not self.repo_settings or self.inherit_global_settings:
673 674 return self.get_global_ui_settings(section, key)
674 675 else:
675 676 return self.get_repo_ui_settings(section, key)
676 677
677 678 def get_svn_patterns(self, section=None):
678 679 if not self.repo_settings:
679 680 return self.get_global_ui_settings(section)
680 681 else:
681 682 return self.get_repo_ui_settings(section)
682 683
683 684 @assert_repo_settings
684 685 def get_repo_general_settings(self):
685 686 global_settings = self.global_settings.get_all_settings()
686 687 repo_settings = self.repo_settings.get_all_settings()
687 688 filtered_repo_settings = self._filter_general_settings(repo_settings)
688 689 global_settings.update(filtered_repo_settings)
689 690 return global_settings
690 691
691 692 def get_global_general_settings(self):
692 693 return self.global_settings.get_all_settings()
693 694
694 695 def get_general_settings(self):
695 696 if not self.repo_settings or self.inherit_global_settings:
696 697 return self.get_global_general_settings()
697 698 else:
698 699 return self.get_repo_general_settings()
699 700
700 701 def get_repos_location(self):
701 702 return self.global_settings.get_ui_by_key('/').ui_value
702 703
703 704 def _filter_ui_settings(self, settings):
704 705 filtered_settings = [
705 706 s for s in settings if self._should_keep_setting(s)]
706 707 return filtered_settings
707 708
708 709 def _should_keep_setting(self, setting):
709 710 keep = (
710 711 (setting.section, setting.key) in self._ui_settings or
711 712 setting.section in self._svn_sections)
712 713 return keep
713 714
714 715 def _filter_general_settings(self, settings):
715 716 keys = ['rhodecode_{}'.format(key) for key in self.GENERAL_SETTINGS]
716 717 return {
717 718 k: settings[k]
718 719 for k in settings if k in keys}
719 720
720 721 def _collect_all_settings(self, global_=False):
721 722 settings = self.global_settings if global_ else self.repo_settings
722 723 result = {}
723 724
724 725 for section, key in self._ui_settings:
725 726 ui = settings.get_ui_by_section_and_key(section, key)
726 727 result_key = self._get_form_ui_key(section, key)
727 728
728 729 if ui:
729 730 if section in ('hooks', 'extensions'):
730 731 result[result_key] = ui.ui_active
731 732 elif result_key in ['vcs_git_lfs_enabled']:
732 733 result[result_key] = ui.ui_active
733 734 else:
734 735 result[result_key] = ui.ui_value
735 736
736 737 for name in self.GENERAL_SETTINGS:
737 738 setting = settings.get_setting_by_name(name)
738 739 if setting:
739 740 result_key = 'rhodecode_{}'.format(name)
740 741 result[result_key] = setting.app_settings_value
741 742
742 743 return result
743 744
744 745 def _get_form_ui_key(self, section, key):
745 746 return '{section}_{key}'.format(
746 747 section=section, key=key.replace('.', '_'))
747 748
748 749 def _create_or_update_ui(
749 750 self, settings, section, key, value=None, active=None):
750 751 ui = settings.get_ui_by_section_and_key(section, key)
751 752 if not ui:
752 753 active = True if active is None else active
753 754 settings.create_ui_section_value(
754 755 section, value, key=key, active=active)
755 756 else:
756 757 if active is not None:
757 758 ui.ui_active = active
758 759 if value is not None:
759 760 ui.ui_value = value
760 761 Session().add(ui)
761 762
762 763 def _create_svn_settings(self, settings, data):
763 764 svn_settings = {
764 765 'new_svn_branch': self.SVN_BRANCH_SECTION,
765 766 'new_svn_tag': self.SVN_TAG_SECTION
766 767 }
767 768 for key in svn_settings:
768 769 if data.get(key):
769 770 settings.create_ui_section_value(svn_settings[key], data[key])
770 771
771 772 def _create_or_update_general_settings(self, settings, data):
772 773 for name in self.GENERAL_SETTINGS:
773 774 data_key = 'rhodecode_{}'.format(name)
774 775 if data_key not in data:
775 776 raise ValueError(
776 777 'The given data does not contain {} key'.format(data_key))
777 778 setting = settings.create_or_update_setting(
778 779 name, data[data_key], 'bool')
779 780 Session().add(setting)
780 781
781 782 def _get_settings_keys(self, settings, data):
782 783 data_keys = [self._get_form_ui_key(*s) for s in settings]
783 784 for data_key in data_keys:
784 785 if data_key not in data:
785 786 raise ValueError(
786 787 'The given data does not contain {} key'.format(data_key))
787 788 return data_keys
788 789
789 790 def create_largeobjects_dirs_if_needed(self, repo_store_path):
790 791 """
791 792 This is subscribed to the `pyramid.events.ApplicationCreated` event. It
792 793 does a repository scan if enabled in the settings.
793 794 """
794 795
795 796 from rhodecode.lib.vcs.backends.hg import largefiles_store
796 797 from rhodecode.lib.vcs.backends.git import lfs_store
797 798
798 799 paths = [
799 800 largefiles_store(repo_store_path),
800 801 lfs_store(repo_store_path)]
801 802
802 803 for path in paths:
803 804 if os.path.isdir(path):
804 805 continue
805 806 if os.path.isfile(path):
806 807 continue
807 808 # not a file nor dir, we try to create it
808 809 try:
809 810 os.makedirs(path)
810 811 except Exception:
811 812 log.warning('Failed to create largefiles dir:%s', path)
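The new `hg_close_branch_before_merging` entry added to `GENERAL_SETTINGS` above is stored and read back under a `rhodecode_` prefix. A minimal standalone sketch of that key filtering, mirroring `VcsSettingsModel._filter_general_settings` (the sample settings dict is illustrative, not real data):

```python
GENERAL_SETTINGS = (
    'use_outdated_comments',
    'pr_merge_enabled',
    'hg_use_rebase_for_merging',
    'hg_close_branch_before_merging',  # newly added setting
)


def filter_general_settings(settings):
    # Mirrors _filter_general_settings: keep only the keys that match a
    # 'rhodecode_'-prefixed name from GENERAL_SETTINGS.
    keys = ['rhodecode_{}'.format(key) for key in GENERAL_SETTINGS]
    return {k: settings[k] for k in settings if k in keys}


repo_settings = {
    'rhodecode_hg_close_branch_before_merging': True,
    'rhodecode_unrelated_key': 'ignored',
}
print(filter_general_settings(repo_settings))
# -> {'rhodecode_hg_close_branch_before_merging': True}
```

Because `get_repo_general_settings` merges this filtered dict over the global settings, a repository-level value for the new flag overrides the global one only when it has been explicitly set on the repository.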
@@ -1,335 +1,343 b''
1 1 ## snippet for displaying vcs settings
2 2 ## usage:
3 3 ## <%namespace name="vcss" file="/base/vcssettings.mako"/>
4 4 ## ${vcss.vcs_settings_fields()}
5 5
6 6 <%def name="vcs_settings_fields(suffix='', svn_branch_patterns=None, svn_tag_patterns=None, repo_type=None, display_globals=False, allow_repo_location_change=False, **kwargs)">
7 7 % if display_globals:
8 8 <div class="panel panel-default">
9 9 <div class="panel-heading" id="general">
10 10 <h3 class="panel-title">${_('General')}<a class="permalink" href="#general"></a></h3>
11 11 </div>
12 12 <div class="panel-body">
13 13 <div class="field">
14 14 <div class="checkbox">
15 15 ${h.checkbox('web_push_ssl' + suffix, 'True')}
16 16 <label for="web_push_ssl${suffix}">${_('Require SSL for vcs operations')}</label>
17 17 </div>
18 18 <div class="label">
19 19 <span class="help-block">${_('Activate to set RhodeCode to require SSL for pushing or pulling. If the SSL certificate is missing, it will return an HTTP Error 406: Not Acceptable.')}</span>
20 20 </div>
21 21 </div>
22 22 </div>
23 23 </div>
24 24 % endif
25 25
26 26 % if display_globals:
27 27 <div class="panel panel-default">
28 28 <div class="panel-heading" id="vcs-storage-options">
29 29 <h3 class="panel-title">${_('Main Storage Location')}<a class="permalink" href="#vcs-storage-options"></a></h3>
30 30 </div>
31 31 <div class="panel-body">
32 32 <div class="field">
33 33 <div class="inputx locked_input">
34 34 %if allow_repo_location_change:
35 35 ${h.text('paths_root_path',size=59,readonly="readonly", class_="disabled")}
36 36 <span id="path_unlock" class="tooltip"
37 37 title="${h.tooltip(_('Click to unlock. You must restart RhodeCode in order to make this setting take effect.'))}">
38 38 <div class="btn btn-default lock_input_button"><i id="path_unlock_icon" class="icon-lock"></i></div>
39 39 </span>
40 40 %else:
41 41 ${_('Repository location change is disabled. You can enable this by changing the `allow_repo_location_change` inside .ini file.')}
42 42 ## form still requires this but we cannot internally change it anyway
43 43 ${h.hidden('paths_root_path',size=30,readonly="readonly", class_="disabled")}
44 44 %endif
45 45 </div>
46 46 </div>
47 47 <div class="label">
48 48 <span class="help-block">${_('Filesystem location where repositories should be stored. After changing this value a restart and rescan of the repository folder are required.')}</span>
49 49 </div>
50 50 </div>
51 51 </div>
52 52 % endif
53 53
54 54 % if display_globals or repo_type in ['git', 'hg']:
55 55 <div class="panel panel-default">
56 56 <div class="panel-heading" id="vcs-hooks-options">
57 57 <h3 class="panel-title">${_('Internal Hooks')}<a class="permalink" href="#vcs-hooks-options"></a></h3>
58 58 </div>
59 59 <div class="panel-body">
60 60 <div class="field">
61 61 <div class="checkbox">
62 62 ${h.checkbox('hooks_changegroup_repo_size' + suffix, 'True', **kwargs)}
63 63 <label for="hooks_changegroup_repo_size${suffix}">${_('Show repository size after push')}</label>
64 64 </div>
65 65
66 66 <div class="label">
67 67 <span class="help-block">${_('Trigger a hook that calculates repository size after each push.')}</span>
68 68 </div>
69 69 <div class="checkbox">
70 70 ${h.checkbox('hooks_changegroup_push_logger' + suffix, 'True', **kwargs)}
71 71 <label for="hooks_changegroup_push_logger${suffix}">${_('Execute pre/post push hooks')}</label>
72 72 </div>
73 73 <div class="label">
74 74 <span class="help-block">${_('Execute built-in pre/post push hooks. This also executes rcextensions hooks.')}</span>
75 75 </div>
76 76 <div class="checkbox">
77 77 ${h.checkbox('hooks_outgoing_pull_logger' + suffix, 'True', **kwargs)}
78 78 <label for="hooks_outgoing_pull_logger${suffix}">${_('Execute pre/post pull hooks')}</label>
79 79 </div>
80 80 <div class="label">
81 81 <span class="help-block">${_('Execute built-in pre/post pull hooks. This also executes rcextensions hooks.')}</span>
82 82 </div>
83 83 </div>
84 84 </div>
85 85 </div>
86 86 % endif
87 87
88 88 % if display_globals or repo_type in ['hg']:
89 89 <div class="panel panel-default">
90 90 <div class="panel-heading" id="vcs-hg-options">
91 91 <h3 class="panel-title">${_('Mercurial Settings')}<a class="permalink" href="#vcs-hg-options"></a></h3>
92 92 </div>
93 93 <div class="panel-body">
94 94 <div class="checkbox">
95 95 ${h.checkbox('extensions_largefiles' + suffix, 'True', **kwargs)}
96 96 <label for="extensions_largefiles${suffix}">${_('Enable largefiles extension')}</label>
97 97 </div>
98 98 <div class="label">
99 99 % if display_globals:
100 100 <span class="help-block">${_('Enable Largefiles extensions for all repositories.')}</span>
101 101 % else:
102 102 <span class="help-block">${_('Enable Largefiles extensions for this repository.')}</span>
103 103 % endif
104 104 </div>
105 105
106 106 % if display_globals:
107 107 <div class="field">
108 108 <div class="input">
109 109 ${h.text('largefiles_usercache' + suffix, size=59)}
110 110 </div>
111 111 </div>
112 112 <div class="label">
113 113 <span class="help-block">${_('Filesystem location where Mercurial largefile objects should be stored.')}</span>
114 114 </div>
115 115 % endif
116 116
117 117 <div class="checkbox">
118 118 ${h.checkbox('phases_publish' + suffix, 'True', **kwargs)}
119 119 <label for="phases_publish${suffix}">${_('Set repositories as publishing') if display_globals else _('Set repository as publishing')}</label>
120 120 </div>
121 121 <div class="label">
122 122 <span class="help-block">${_('When this is enabled all commits in the repository are seen as public commits by clients.')}</span>
123 123 </div>
124 124 % if display_globals:
125 125 <div class="checkbox">
126 126 ${h.checkbox('extensions_hgsubversion' + suffix,'True')}
127 127 <label for="extensions_hgsubversion${suffix}">${_('Enable hgsubversion extension')}</label>
128 128 </div>
129 129 <div class="label">
130 130 <span class="help-block">${_('Requires hgsubversion library to be installed. Allows cloning remote SVN repositories and migrates them to Mercurial type.')}</span>
131 131 </div>
132 132 % endif
133 133
134 134 <div class="checkbox">
135 135 ${h.checkbox('extensions_evolve' + suffix, 'True', **kwargs)}
136 136 <label for="extensions_evolve${suffix}">${_('Enable evolve extension')}</label>
137 137 </div>
138 138 <div class="label">
139 139 % if display_globals:
140 140 <span class="help-block">${_('Enable evolve extension for all repositories.')}</span>
141 141 % else:
142 142 <span class="help-block">${_('Enable evolve extension for this repository.')}</span>
143 143 % endif
144 144 </div>
145 145
146 146 </div>
147 147 </div>
148 148 ## LABS for HG
149 149 % if c.labs_active:
150 150 <div class="panel panel-danger">
151 151 <div class="panel-heading">
152 152 <h3 class="panel-title">${_('Mercurial Labs Settings')} (${_('These features are considered experimental and may not work as expected.')})</h3>
153 153 </div>
154 154 <div class="panel-body">
155 155
156 156 <div class="checkbox">
157 157 ${h.checkbox('rhodecode_hg_use_rebase_for_merging' + suffix, 'True', **kwargs)}
158 158 <label for="rhodecode_hg_use_rebase_for_merging${suffix}">${_('Use rebase as merge strategy')}</label>
159 159 </div>
160 160 <div class="label">
161 161 <span class="help-block">${_('Use rebase instead of creating a merge commit when merging via web interface.')}</span>
162 162 </div>
163 163
164 <div class="checkbox">
165 ${h.checkbox('rhodecode_hg_close_branch_before_merging' + suffix, 'True', **kwargs)}
166 <label for="rhodecode_hg_close_branch_before_merging${suffix}">${_('Close branch before merging it')}</label>
167 </div>
168 <div class="label">
169 <span class="help-block">${_('Close the branch before merging it into the destination branch. This has no effect when the rebase merge strategy is used.')}</span>
170 </div>
171
164 172 </div>
165 173 </div>
166 174 % endif
167 175
168 176 % endif
169 177
170 178 % if display_globals or repo_type in ['git']:
171 179 <div class="panel panel-default">
172 180 <div class="panel-heading" id="vcs-git-options">
173 181 <h3 class="panel-title">${_('Git Settings')}<a class="permalink" href="#vcs-git-options"></a></h3>
174 182 </div>
175 183 <div class="panel-body">
176 184 <div class="checkbox">
177 185 ${h.checkbox('vcs_git_lfs_enabled' + suffix, 'True', **kwargs)}
178 186 <label for="vcs_git_lfs_enabled${suffix}">${_('Enable lfs extension')}</label>
179 187 </div>
180 188 <div class="label">
181 189 % if display_globals:
182 190 <span class="help-block">${_('Enable lfs extensions for all repositories.')}</span>
183 191 % else:
184 192 <span class="help-block">${_('Enable lfs extensions for this repository.')}</span>
185 193 % endif
186 194 </div>
187 195
188 196 % if display_globals:
189 197 <div class="field">
190 198 <div class="input">
191 199 ${h.text('vcs_git_lfs_store_location' + suffix, size=59)}
192 200 </div>
193 201 </div>
194 202 <div class="label">
195 203 <span class="help-block">${_('Filesystem location where Git lfs objects should be stored.')}</span>
196 204 </div>
197 205 % endif
198 206 </div>
199 207 </div>
200 208 % endif
201 209
202 210
203 211 % if display_globals:
204 212 <div class="panel panel-default">
205 213 <div class="panel-heading" id="vcs-global-svn-options">
206 214 <h3 class="panel-title">${_('Global Subversion Settings')}<a class="permalink" href="#vcs-global-svn-options"></a></h3>
207 215 </div>
208 216 <div class="panel-body">
209 217 <div class="field">
210 218 <div class="checkbox">
211 219 ${h.checkbox('vcs_svn_proxy_http_requests_enabled' + suffix, 'True', **kwargs)}
212 220 <label for="vcs_svn_proxy_http_requests_enabled${suffix}">${_('Proxy subversion HTTP requests')}</label>
213 221 </div>
214 222 <div class="label">
215 223 <span class="help-block">
216 224 ${_('Subversion HTTP Support. Enables communication with SVN over HTTP protocol.')}
217 225 <a href="${h.route_url('enterprise_svn_setup')}" target="_blank">${_('SVN Protocol setup Documentation')}</a>.
218 226 </span>
219 227 </div>
220 228 </div>
221 229 <div class="field">
222 230 <div class="label">
223 231 <label for="vcs_svn_proxy_http_server_url">${_('Subversion HTTP Server URL')}</label><br/>
224 232 </div>
225 233 <div class="input">
226 234 ${h.text('vcs_svn_proxy_http_server_url',size=59)}
227 235 % if c.svn_proxy_generate_config:
228 236 <span class="buttons">
229 237 <button class="btn btn-primary" id="vcs_svn_generate_cfg">${_('Generate Apache Config')}</button>
230 238 </span>
231 239 % endif
232 240 </div>
233 241 </div>
234 242 </div>
235 243 </div>
236 244 % endif
237 245
238 246 % if display_globals or repo_type in ['svn']:
239 247 <div class="panel panel-default">
240 248 <div class="panel-heading" id="vcs-svn-options">
241 249 <h3 class="panel-title">${_('Subversion Settings')}<a class="permalink" href="#vcs-svn-options"></a></h3>
242 250 </div>
243 251 <div class="panel-body">
244 252 <div class="field">
245 253 <div class="content" >
246 254 <label>${_('Repository patterns')}</label><br/>
247 255 </div>
248 256 </div>
249 257 <div class="label">
250 258 <span class="help-block">${_('Patterns for identifying SVN branches and tags. For recursive search, use "*". Eg.: "/branches/*"')}</span>
251 259 </div>
252 260
253 261 <div class="field branch_patterns">
254 262 <div class="input" >
255 263 <label>${_('Branches')}:</label><br/>
256 264 </div>
257 265 % if svn_branch_patterns:
258 266 % for branch in svn_branch_patterns:
259 267 <div class="input adjacent" id="${'id%s' % branch.ui_id}">
260 268 ${h.hidden('branch_ui_key' + suffix, branch.ui_key)}
261 269 ${h.text('branch_value_%d' % branch.ui_id + suffix, branch.ui_value, size=59, readonly="readonly", class_='disabled')}
262 270 % if kwargs.get('disabled') != 'disabled':
263 271 <span class="btn btn-x" onclick="ajaxDeletePattern(${branch.ui_id},'${'id%s' % branch.ui_id}')">
264 272 ${_('Delete')}
265 273 </span>
266 274 % endif
267 275 </div>
268 276 % endfor
269 277 %endif
270 278 </div>
271 279 % if kwargs.get('disabled') != 'disabled':
272 280 <div class="field branch_patterns">
273 281 <div class="input" >
274 282 ${h.text('new_svn_branch',size=59,placeholder='New branch pattern')}
275 283 </div>
276 284 </div>
277 285 % endif
278 286 <div class="field tag_patterns">
279 287 <div class="input" >
280 288 <label>${_('Tags')}:</label><br/>
281 289 </div>
282 290 % if svn_tag_patterns:
283 291 % for tag in svn_tag_patterns:
284 292 <div class="input" id="${'id%s' % tag.ui_id + suffix}">
285 293 ${h.hidden('tag_ui_key' + suffix, tag.ui_key)}
286 294 ${h.text('tag_ui_value_new_%d' % tag.ui_id + suffix, tag.ui_value, size=59, readonly="readonly", class_='disabled tag_input')}
287 295 % if kwargs.get('disabled') != 'disabled':
288 296 <span class="btn btn-x" onclick="ajaxDeletePattern(${tag.ui_id},'${'id%s' % tag.ui_id}')">
289 297 ${_('Delete')}
290 298 </span>
291 299 %endif
292 300 </div>
293 301 % endfor
294 302 % endif
295 303 </div>
296 304 % if kwargs.get('disabled') != 'disabled':
297 305 <div class="field tag_patterns">
298 306 <div class="input" >
299 307 ${h.text('new_svn_tag' + suffix, size=59, placeholder='New tag pattern')}
300 308 </div>
301 309 </div>
302 310 %endif
303 311 </div>
304 312 </div>
305 313 % else:
306 314 ${h.hidden('new_svn_branch' + suffix, '')}
307 315 ${h.hidden('new_svn_tag' + suffix, '')}
308 316 % endif
309 317
310 318
311 319 % if display_globals or repo_type in ['hg', 'git']:
312 320 <div class="panel panel-default">
313 321 <div class="panel-heading" id="vcs-pull-requests-options">
314 322 <h3 class="panel-title">${_('Pull Request Settings')}<a class="permalink" href="#vcs-pull-requests-options"></a></h3>
315 323 </div>
316 324 <div class="panel-body">
317 325 <div class="checkbox">
318 326 ${h.checkbox('rhodecode_pr_merge_enabled' + suffix, 'True', **kwargs)}
319 327 <label for="rhodecode_pr_merge_enabled${suffix}">${_('Enable server-side merge for pull requests')}</label>
320 328 </div>
321 329 <div class="label">
322 330 <span class="help-block">${_('Note: when this feature is enabled, it only runs hooks defined in the rcextension package. Custom hooks added on the Admin -> Settings -> Hooks page will not be run when pull requests are automatically merged from the web interface.')}</span>
323 331 </div>
324 332 <div class="checkbox">
325 333 ${h.checkbox('rhodecode_use_outdated_comments' + suffix, 'True', **kwargs)}
326 334 <label for="rhodecode_use_outdated_comments${suffix}">${_('Invalidate and relocate inline comments during update')}</label>
327 335 </div>
328 336 <div class="label">
329 337 <span class="help-block">${_('During the update of a pull request, the position of inline comments will be updated and outdated inline comments will be hidden.')}</span>
330 338 </div>
331 339 </div>
332 340 </div>
333 341 % endif
334 342
335 343 </%def>
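The checkbox rendered above posts its value under the `rhodecode_`-prefixed form key that `_create_or_update_general_settings` expects. A hedged sketch of that key contract (a simplified stand-in, not the actual form handler):

```python
GENERAL_SETTINGS = (
    'use_outdated_comments',
    'pr_merge_enabled',
    'hg_use_rebase_for_merging',
    'hg_close_branch_before_merging',
)


def check_general_settings_data(data):
    # Mirrors the validation loop in _create_or_update_general_settings:
    # every setting name must arrive as a 'rhodecode_<name>' form key.
    for name in GENERAL_SETTINGS:
        data_key = 'rhodecode_{}'.format(name)
        if data_key not in data:
            raise ValueError(
                'The given data does not contain {} key'.format(data_key))


form_data = {'rhodecode_{}'.format(n): True for n in GENERAL_SETTINGS}
check_general_settings_data(form_data)  # complete data passes silently

incomplete = dict(form_data)
del incomplete['rhodecode_hg_close_branch_before_merging']
try:
    check_general_settings_data(incomplete)
except ValueError as exc:
    print(exc)  # the missing form key is reported by name
```

This is why omitting the checkbox from a form submission raises `ValueError` rather than silently defaulting the setting: the template and the model must stay in sync on the full `GENERAL_SETTINGS` list.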