@@ -0,0 +1,80 b''
.. _svn-path-permissions:

|svn| Enabling Path Permissions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Because |RCEE| serves |svn| through the standard Apache ``mod_dav_svn`` module,
we can take advantage of its authz configuration to protect paths and branches.


Configuring RhodeCode
=====================


1. To configure path-based permissions, we first need to use a customized
   mod_dav_svn.conf.

   Open the :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file
   and find the `svn.proxy.config_template` setting. Set a new path to read
   the template from. For example:

   .. code-block:: ini

       svn.proxy.config_template = /home/ubuntu/rhodecode/custom_mod_dav_svn.conf.mako


2. Create the file, e.g. `/home/ubuntu/rhodecode/custom_mod_dav_svn.conf.mako`.
   You can download one from:

   `<https://code.rhodecode.com/rhodecode-enterprise-ce/files/default/rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako/>`_

3. Add an `AuthzSVNReposRelativeAccessFile` directive (if it does not yet
   exist) in order to read the path auth file.

   Example modified config section enabling reading of the authz file relative
   to the repository path, i.e. located in `/storage_dir/repo_name/conf/authz`:

   .. code-block:: text


       # snip ...

       # use specific SVN conf/authz file for each repository
       AuthzSVNReposRelativeAccessFile authz

       Allow from all
       # snip ...

   .. note::

      The `AuthzSVNReposRelativeAccessFile` directive should go above the
      `Allow from all` directive.


4. Restart RhodeCode, go to
   the :menuselection:`Admin --> Settings --> VCS` page, and
   click :guilabel:`Generate Apache Config`.
   This generates a new configuration with the changes needed to read
   the authz file. You can verify the changes by checking the generated
   mod_dav_svn.conf file, which is included in your Apache configuration.

5. Specify new rules in the repository authz configuration by editing
   the :file:`repo_name/conf/authz` file. For example, we specify that
   only admin is allowed to push to the develop branch:

   .. code-block:: ini

       [/branches/develop]
       * = r
       admin = rw


   For more examples see:
   `<https://svn.apache.org/repos/asf/subversion/trunk/subversion/mod_authz_svn/INSTALL/>`_

   Those rules also work for paths, so not only branches but any
   path inside the repository can be protected.

6. Reload Apache. If everything is configured correctly, commits are now
   allowed or denied according to the specified rules.
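The authz rules from step 5 can also be granted to groups of users rather than
individuals. The sketch below is a hedged illustration of the standard
Subversion authz group syntax; the group, user, and path names are hypothetical
and must be adapted to your repository:

.. code-block:: ini

    # conf/authz -- hypothetical names, for illustration only
    [groups]
    release-team = alice, bob

    # everyone may read release branches, only the team may write
    [/branches/release]
    * = r
    @release-team = rw

    # lock a single path down completely except for admin
    [/trunk/secrets]
    * =
    admin = rw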
@@ -0,0 +1,48 b''
.. _views-ref:

views
=====

push (EE only)
--------------

.. py:function:: push(apiuser, repoid, remote_uri=<Optional:None>)

   Triggers a push from the given repository to a remote location. You
   can use this to keep remote repositories up-to-date.

   This command can only be run using an |authtoken| with admin
   rights to the specified repository. For more information,
   see :ref:`config-token-ref`.

   This command takes the following options:

   :param apiuser: This is filled automatically from the |authtoken|.
   :type apiuser: AuthUser
   :param repoid: The repository name or repository ID.
   :type repoid: str or int
   :param remote_uri: Optional remote URI to push to.
   :type remote_uri: str

   Example output:

   .. code-block:: bash

       id : <id_given_in_input>
       result : {
         "msg": "Pushed to url `<remote_url>` on repo `<repository name>`",
         "repository": "<repository name>"
       }
       error : null

   Example error output:

   .. code-block:: bash

       id : <id_given_in_input>
       result : null
       error : {
         "Unable to push changes to `<remote_url>`"
       }

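The API call above can be sketched as a plain JSON-RPC request. This is a
minimal sketch, assuming the usual RhodeCode API endpoint at ``/_admin/api``;
the host name and token below are placeholders, not real values:

```python
import json
from urllib.request import Request, urlopen

API_URL = "https://rhodecode.example.com/_admin/api"  # placeholder host


def build_push_payload(auth_token, repoid, remote_uri=None, request_id=1):
    """Build the JSON-RPC payload for the `push` API method."""
    args = {"repoid": repoid}
    if remote_uri is not None:
        # only include the optional argument when it was actually given
        args["remote_uri"] = remote_uri
    return {
        "id": request_id,
        "auth_token": auth_token,
        "method": "push",
        "args": args,
    }


def call_api(payload):
    """POST the payload as JSON and decode the JSON-RPC response."""
    req = Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        return json.loads(resp.read())


payload = build_push_payload("secret-token", "my-repo")
```

The returned dictionary then carries `id`, `result`, and `error` keys in the
shape shown in the example outputs above.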
@@ -0,0 +1,112 b''
.. _config-ldap-groups-ref:

LDAP/AD With User Groups Sync
-----------------------------

|RCM| supports LDAP (Lightweight Directory Access Protocol) and
AD (Active Directory) authentication.
All LDAP versions are supported, with the following |RCM| plugin managing each:

* For LDAP/AD with user group sync use ``LDAP + User Groups (egg:rhodecode-enterprise-ee#ldap_group)``

RhodeCode reads all data defined by the plugin and creates corresponding
accounts in the local database after receiving data from LDAP. This is done on
every user log-in, including operations like push/pull/checkout.
In addition, group membership is read from LDAP and the following operations are performed:

- automatic addition of the user to the |RCM| user group
- automatic removal of the user from any other |RCM| user groups not specified in LDAP.
  The removal is done *only* on groups that are marked to be synced from LDAP.
  This setting can be changed in the advanced settings of user groups.
- automatic creation of user groups if they do not yet exist in |RCM|
- marking users as super-admins if they are members of any admin group defined in the plugin settings

This plugin is available only in the EE Edition.

.. important::

   The email used with your |RCE| super-admin account needs to match the email
   address attached to your admin profile in LDAP. This is because
   within |RCE| the user email needs to be unique, and multiple users
   cannot share an email account.

   Likewise, if as an admin you also have a user account, the email address
   attached to the user account needs to be different.


LDAP Configuration Steps
^^^^^^^^^^^^^^^^^^^^^^^^

To configure |LDAP|, use the following steps:

1. From the |RCM| interface, select
   :menuselection:`Admin --> Authentication`
2. Enable the LDAP + User Groups plugin and select :guilabel:`Save`
3. Select the :guilabel:`Enabled` check box in the plugin configuration section
4. Add the required LDAP information and :guilabel:`Save`; for more details,
   see :ref:`config-ldap-groups-examples`

For a more detailed description of LDAP objects, see :ref:`ldap-gloss-ref`.

.. _config-ldap-groups-examples:

Example LDAP configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^

.. code-block:: bash

    # Auth Cache TTL. Defines the caching for authentication to offload the LDAP server.
    # This means the cached result will be kept for 3600 seconds before contacting the LDAP server again to verify the user access.
    3600
    # Host; a comma separated list can be used to specify more than one server
    https://ldap1.server.com/ldap-admin/,https://ldap2.server.com/ldap-admin/
    # Default LDAP Port, use 636 for LDAPS
    389
    # Account, used for SimpleBind if the LDAP server requires authentication
    admin@server.com
    # Password used for simple bind
    ldap-user-password
    # LDAP connection security
    LDAPS
    # Certificate checks level
    DEMAND
    # Base DN
    cn=Rufus Magillacuddy,ou=users,dc=rhodecode,dc=com
    # User Search Base
    ou=groups,ou=users
    # LDAP search filter to narrow the results
    (objectClass=person)
    # LDAP search scope
    SUBTREE
    # Login attribute
    sAMAccountName
    # First Name Attribute to read
    givenName
    # Last Name Attribute to read
    sn
    # Email Attribute to read the email address from

    # Group extraction method
    rfc2307bis
    # Group search base
    ou=RC-Groups
    # Group Name Attribute, field to read the group name from
    sAMAccountName
    # User Member of Attribute, field in which groups are stored
    memberOf
    # LDAP Group Search Filter, allows narrowing the results

    # Admin Groups. Comma separated list of groups. If the user is a member of
    # any of these, they will be marked as a super-admin in RhodeCode
    admins, management


Below is an example setup that can be used with Active Directory and LDAP groups.

.. image:: ../images/ldap-groups-example.png
   :alt: LDAP/AD setup example
   :scale: 50 %

.. toctree::

   ldap-active-directory
   ldap-authentication
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,139 b''
|RCE| 4.12.0 |RNS|
------------------

Release Date
^^^^^^^^^^^^

- 2018-04-24


New Features
^^^^^^^^^^^^

- Svn: added support for the RhodeCode integration framework. All integrations
  like Slack, email, and Jenkins now also fully work for SVN.
- Integrations: added a new dedicated Jenkins integration with support for
  CSRF authentication. Available in EE edition only.
- Automation: added new bi-directional remote sync. RhodeCode instances can now
  automatically push to or pull from remote locations. This feature is powered
  by the Scheduler of the 4.11 release, which must be enabled for this feature to work.
  Available in EE edition only.
- Mercurial: path-based permissions. RhodeCode can now use Mercurial's narrowhg
  to implement path-based permissions. All permissions are read from .hg/hgacl.
  Thanks to the great contribution from Sandu Turcan.
- VCS: added new diff caches. Available as an option under VCS settings.
  Diff caches work on pull requests or individual commits for greater
  performance and reduced memory usage. This feature increases the speed of
  large pull requests significantly. In addition, for pull requests it allows
  showing old closed pull requests even if commits from the source were removed,
  further enhancing auditing capabilities.
- Audit: added a few new audit log entries, especially around changing permissions.
- LDAP: added connection pinning and a timeout option to the LDAP plugin. This
  should prevent problems when the connection to LDAP is not stable, causing
  RhodeCode instances to freeze waiting on LDAP connections.
- User groups: expose public user group profiles. Allows members of a user
  group to be seen by other team members, if they have proper permissions.
- UI: show the pull request page in the quick nav menu on my account for quicker access.
- UI: hidden/outdated comments now have visible markers next to line numbers.
  This allows access to them without showing all hidden comments.


General
^^^^^^^

- Ssh: show the conflicting fingerprint when adding an already existing key.
  Helps to track why adding a key failed.
- System info: added ulimit to system info. Hitting any of those limits causes
  lots of problems, which is why it's important to show them.
- Repository settings: added a hidden view to force re-installation of hooks.
  Available under /{repo_name}/settings/advanced/hooks
- Integrations: Webhook now handles response errors and shows the response for
  easier debugging.
- Cli: sped up CLI execution start by skipping the auth plugin search/registry.
- SVN: added an example in the docs on how to enable path-based permissions.
- LDAP: enabled connection recycling in the LDAP plugin.
- Auth plugins: use a nicer visual display of auth plugins that
  highlights that the order of enabled plugins does matter.
- Events: expose the shadow repo build url.
- Events: expose pull request title and uid in event data.
- API: enable setting the sync flag for user groups on create/edit.
- API: updated the pull method with an optional specification of the url.
- Logging: improved consistency of auth plugin logs.
- Logging: improved the log message for ssl required.
- Dependencies: bumped mercurial to the 4.4 series
- Dependencies: bumped zope.cachedescriptors==4.3.1
- Dependencies: bumped zope.deprecation==4.3.0
- Dependencies: bumped zope.event==4.3.0
- Dependencies: bumped zope.interface==4.4.3
- Dependencies: bumped graphviz to 0.8.2
- Dependencies: bumped ipaddress to 0.1.19
- Dependencies: bumped pyexpect to 4.3.1
- Dependencies: bumped ws4py to 0.4.3
- Dependencies: bumped bleach to 2.1.2
- Dependencies: bumped html5lib to 1.0.1
- Dependencies: bumped greenlet to 0.4.13
- Dependencies: bumped markdown to 2.6.11
- Dependencies: bumped psutil to 5.4.3
- Dependencies: bumped beaker to 1.9.1
- Dependencies: bumped alembic to the 0.6.8 release.
- Dependencies: bumped supervisor to 3.3.4
- Dependencies: bumped pyexpect to 4.4.0 and scandir to 1.7
- Dependencies: bumped appenlight client to 0.6.25
- Dependencies: don't require the full mysql lib for the db driver.
  Reduces the installation package size by around 100MB.


Security
^^^^^^^^

- My account: changing the email in my account now requires providing the
  user's password. This applies only to RhodeCode built-in accounts.
  Prevents adding a recovery email by unauthorized users who gain
  access to a user's logged-in session.
- Logging: fixed leaking of tokens into logs.
- General: serialize the repo name in repo checks to prevent potential
  html injections via a malformed url.


Performance
^^^^^^^^^^^

- Diffs: don't use recursive diffset attachment in diffs. That structure is
  much harder to garbage collect. Reduces memory usage.
- Diff cache: added caching for better performance of large pull requests.


Fixes
^^^^^

- Age helper: fixed issues with proper timezone detection for certain timezones.
  Fixes wrong age display in a few cases.
- API: added audit logs for user group related calls that were
  accidentally missing.
- Diffs: fixed and improved line selections and anchor links.
- Pull requests: fixed cases where the default expected refs are closed or
  unavailable. For Mercurial with a closed default branch, a compare across
  forks could fail.
- Core: properly report 502 errors for gevent and gunicorn.
  Gevent with Gunicorn doesn't raise normal pycurl errors.
- Auth plugins: fixed a problem with the settings cache in multi-worker mode.
  The previous implementation had a bug that cached the settings in each class,
  which prevented updated settings from being refreshed in multi-worker mode.
  Only a restart of RhodeCode loaded new settings.
- Audit logs: properly handle query syntax in the search field.
- Repositories: better handling of missing requirements errors for repositories.
- API: fixed problems with repository fork/create using the celery backend.
- VCS settings: added a missing flash message on validation errors to avoid
  missing out on some field input validation problems.


Upgrade notes
^^^^^^^^^^^^^

- This release adds support for SVN hooks. This required many changes to how we
  handle the SVN protocol. We did thorough tests for SVN compatibility.
  Please check the behaviour of SVN repositories after this update.

- Diff caches are turned off by default for backward compatibility. We however
  recommend turning them on, either individually for bigger repositories or
  globally for every repository. This setting can be found in
  admin > settings > vcs, or repository > settings > vcs.
@@ -0,0 +1,20 b''
diff -rup Beaker-1.9.1-orig/beaker/container.py Beaker-1.9.1/beaker/container.py
--- Beaker-1.9.1-orig/beaker/container.py	2018-04-10 10:23:04.000000000 +0200
+++ Beaker-1.9.1/beaker/container.py	2018-04-10 10:23:34.000000000 +0200
@@ -353,13 +353,13 @@ class Value(object):
                     debug("get_value returning old value while new one is created")
                     return value
                 else:
-                    debug("lock_creatfunc (didnt wait)")
+                    debug("lock_creatfunc `%s` (didnt wait)", self.createfunc.__name__)
                     has_createlock = True

         if not has_createlock:
-            debug("lock_createfunc (waiting)")
+            debug("lock_createfunc `%s` (waiting)", self.createfunc.__name__)
             creation_lock.acquire()
-            debug("lock_createfunc (waited)")
+            debug("lock_createfunc `%s` (waited)", self.createfunc.__name__)

         try:
             # see if someone created the value already
@@ -0,0 +1,46 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2018 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging

from pyramid.view import view_config

from rhodecode.apps._base import RepoAppView
from rhodecode.apps.repository.utils import get_default_reviewers_data
from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator

log = logging.getLogger(__name__)


class RepoAutomationView(RepoAppView):
    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.admin')
    @view_config(
        route_name='repo_automation', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')
    def repo_automation(self):
        c = self.load_default_context()
        c.active = 'automation'

        return self._get_template_context(c)
@@ -0,0 +1,27 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2018 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


def includeme(config):
    config.add_route(
        name='user_group_profile',
        pattern='/_profile_user_group/{user_group_name}')
    # Scan module for configuration decorators.
    config.scan('.views', ignore='.tests')
@@ -0,0 +1,19 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2018 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
@@ -0,0 +1,76 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2018 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
from rhodecode.model.user_group import UserGroupModel
from rhodecode.tests import (
    TestController, TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
from rhodecode.tests.fixture import Fixture
from rhodecode.tests.utils import AssertResponse

fixture = Fixture()


def route_path(name, **kwargs):
    return '/_profile_user_group/{user_group_name}'.format(**kwargs)


class TestUsersController(TestController):

    def test_user_group_profile(self, user_util):
        self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
        user, usergroup = user_util.create_user_with_group()

        response = self.app.get(
            route_path('profile_user_group',
                       user_group_name=usergroup.users_group_name))
        response.mustcontain(usergroup.users_group_name)
        response.mustcontain(user.username)

    def test_user_can_check_own_group(self, user_util):
        user = user_util.create_user(
            TEST_USER_REGULAR_LOGIN, password=TEST_USER_REGULAR_PASS,
            email='testme@rhodecode.org')
        self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
        usergroup = user_util.create_user_group(owner=user)
        response = self.app.get(
            route_path('profile_user_group',
                       user_group_name=usergroup.users_group_name))
        response.mustcontain(usergroup.users_group_name)
        response.mustcontain(user.username)

    def test_user_can_not_check_other_group(self, user_util):
        self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
        user_group = user_util.create_user_group()
        UserGroupModel().grant_user_permission(
            user_group, self._get_logged_user(), 'usergroup.none')
        response = self.app.get(
            route_path('profile_user_group',
                       user_group_name=user_group.users_group_name),
            status=404)
        assert response.status_code == 404

    def test_another_user_can_check_if_he_is_in_group(self, user_util):
        self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
        user = user_util.create_user(
            'test-my-user', password='qweqwe', email='testme@rhodecode.org')
        user_group = user_util.create_user_group()
        UserGroupModel().add_user_to_group(user_group, user)
        UserGroupModel().grant_user_permission(
            user_group, self._get_logged_user(), 'usergroup.read')
        response = self.app.get(
            route_path('profile_user_group',
                       user_group_name=user_group.users_group_name))
        response.mustcontain(user_group.users_group_name)
        response.mustcontain(user.username)

    def test_with_anonymous_user(self, user_util):
        user = user_util.create_user(
            'test-my-user', password='qweqwe', email='testme@rhodecode.org')
        user_group = user_util.create_user_group()
        UserGroupModel().add_user_to_group(user_group, user)
        response = self.app.get(
            route_path('profile_user_group',
                       user_group_name=user_group.users_group_name),
            status=302)
        assert response.status_code == 302
@@ -0,0 +1,53 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2018 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging

from pyramid.httpexceptions import HTTPNotFound
from pyramid.view import view_config

from rhodecode.apps._base import BaseAppView
from rhodecode.lib.auth import (
    HasUserGroupPermissionAnyDecorator, LoginRequired, NotAnonymous)
from rhodecode.model.db import UserGroup, User


log = logging.getLogger(__name__)


class UserGroupProfileView(BaseAppView):

    @LoginRequired()
    @NotAnonymous()
    @HasUserGroupPermissionAnyDecorator(
        'usergroup.read', 'usergroup.write', 'usergroup.admin')
    @view_config(
        route_name='user_group_profile', request_method='GET',
        renderer='rhodecode:templates/user_group/user_group.mako')
    def user_group_profile(self):
        c = self._get_local_tmpl_context()
        c.active = 'profile'
        self.db_user_group_name = self.request.matchdict.get('user_group_name')
        c.user_group = UserGroup().get_by_group_name(self.db_user_group_name)
        if not c.user_group:
            raise HTTPNotFound()
        group_members_obj = sorted(
            (x.user for x in c.user_group.members),
            key=lambda u: u.username.lower())
        c.group_members = group_members_obj
        c.anonymous = self._rhodecode_user.username == User.DEFAULT_USER
        return self._get_template_context(c)
@@ -0,0 +1,39 b''
import logging

from sqlalchemy import *

from rhodecode.model import meta
from rhodecode.lib.dbmigrate.versions import _reset_base, notify

log = logging.getLogger(__name__)


def upgrade(migrate_engine):
    """
    Upgrade operations go here.
    Don't create your own engine; bind migrate_engine to your metadata
    """
    _reset_base(migrate_engine)
    from rhodecode.lib.dbmigrate.schema import db_4_11_0_0 as db

    repository_table = db.Repository.__table__

    push_uri = Column(
        "push_uri", db.EncryptedTextValue(), nullable=True, unique=False,
        default=None)

    push_uri.create(table=repository_table)

    # issue fixups
    fixups(db, meta.Session)


def downgrade(migrate_engine):
    meta = MetaData()
    meta.bind = migrate_engine


def fixups(models, _SESSION):
    pass
@@ -0,0 +1,1 b''
from pyramid.compat import configparser
NO CONTENT: new file 100644, binary diff hidden
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,9 b''
<div class="panel panel-default">
    <div class="panel-heading">
        <h3 class="panel-title">${_('Repo Automation')}</h3>
    </div>
    <div class="panel-body">
        <h4>${_('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')|n}</h4>
        <img style="width: 100%; height: 100%" src="${h.asset('images/ee_features/repo_automation.png')}"/>
    </div>
</div>
@@ -0,0 +1,9 b'' | |||
|
1 | <div class="panel panel-default"> | |
|
2 | <div class="panel-heading"> | |
|
3 | <h3 class="panel-title">${_('Admin Automation')}</h3> | |
|
4 | </div> | |
|
5 | <div class="panel-body"> | |
|
6 | <h4>${_('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')|n}</h4> | |
|
7 | <img style="width: 100%; height: 100%" src="${h.asset('images/ee_features/admin_automation.png')}"/> | |
|
8 | </div> | |
|
9 | </div> |
@@ -0,0 +1,27 b'' | |||
|
1 | <span tal:define="name name|field.name; | |
|
2 | true_val true_val|field.widget.true_val; | |
|
3 | css_class css_class|field.widget.css_class; | |
|
4 | style style|field.widget.style; | |
|
5 | oid oid|field.oid; | |
|
6 | help_block help_block|field.widget.help_block|''; | |
|
7 | " | |
|
8 | tal:omit-tag=""> | |
|
9 | ||
|
10 | <div class="checkbox"> | |
|
11 | <input type="checkbox" name="${name}" value="${true_val}" | |
|
12 | readonly="readonly" disabled="disabled" | |
|
13 | id="${oid}" | |
|
14 | tal:attributes="checked cstruct == true_val; | |
|
15 | class css_class; | |
|
16 | style style;" | |
|
17 | /> | |
|
18 | <p tal:condition="help_block" class="help-block">${help_block}</p> | |
|
19 | <label for="${field.oid}"> | |
|
20 | <span tal:condition="hasattr(field, 'schema') and hasattr(field.schema, 'label')" | |
|
21 | tal:replace="field.schema.label" class="checkbox-label" > | |
|
22 | </span> | |
|
23 | ||
|
24 | </label> | |
|
25 | </div> | |
|
26 | ||
|
27 | </span> No newline at end of file |
@@ -0,0 +1,70 b'' | |||
|
1 | <%namespace name="base" file="/base/base.mako"/> | |
|
2 | ||
|
3 | <div class="panel panel-default user-profile"> | |
|
4 | <div class="panel-heading"> | |
|
5 | <h3 class="panel-title">${_('User group profile')}</h3> | |
|
6 | %if h.HasPermissionAny('hg.admin')(): | |
|
7 | ${h.link_to(_('Edit'), h.route_path('edit_user_group', user_group_id=c.user_group.users_group_id), class_='panel-edit')} | |
|
8 | %endif | |
|
9 | </div> | |
|
10 | ||
|
11 | <div class="panel-body user-profile-content"> | |
|
12 | ||
|
13 | <div class="fieldset"> | |
|
14 | <div class="left-label"> | |
|
15 | ${_('Group Name')}: | |
|
16 | </div> | |
|
17 | <div class="right-content"> | |
|
18 | ${c.user_group.users_group_name} | |
|
19 | </div> | |
|
20 | </div> | |
|
21 | <div class="fieldset"> | |
|
22 | <div class="left-label"> | |
|
23 | ${_('Owner')}: | |
|
24 | </div> | |
|
25 | <div class="group_member"> | |
|
26 | ${base.gravatar(c.user_group.user.email, 16)} | |
|
27 | <span class="username user">${h.link_to_user(c.user_group.user)}</span> | |
|
28 | ||
|
29 | </div> | |
|
30 | </div> | |
|
31 | <div class="fieldset"> | |
|
32 | <div class="left-label"> | |
|
33 | ${_('Active')}: | |
|
34 | </div> | |
|
35 | <div class="right-content"> | |
|
36 | ${c.user_group.users_group_active} | |
|
37 | </div> | |
|
38 | </div> | |
|
39 | % if not c.anonymous: | |
|
40 | <div class="fieldset"> | |
|
41 | <div class="left-label"> | |
|
42 | ${_('Members')}: | |
|
43 | </div> | |
|
44 | <div class="right-content"> | |
|
45 | <table id="group_members_placeholder" class="rctable group_members"> | |
|
46 | <th>${_('Username')}</th> | |
|
47 | % if c.group_members: | |
|
48 | % for user in c.group_members: | |
|
49 | <tr> | |
|
50 | <td id="member_user_${user.user_id}" class="td-author"> | |
|
51 | <div class="group_member"> | |
|
52 | ${base.gravatar(user.email, 16)} | |
|
53 | <span class="username user">${h.link_to(h.person(user), h.route_path('user_edit',user_id=user.user_id))}</span> | |
|
54 | <input type="hidden" name="__start__" value="member:mapping"> | |
|
55 | <input type="hidden" name="member_user_id" value="${user.user_id}"> | |
|
56 | <input type="hidden" name="type" value="existing" id="member_${user.user_id}"> | |
|
57 | <input type="hidden" name="__end__" value="member:mapping"> | |
|
58 | </div> | |
|
59 | </td> | |
|
60 | </tr> | |
|
61 | % endfor | |
|
62 | % else: | |
|
63 | <tr><td colspan="2">${_('No members yet')}</td></tr> | |
|
64 | % endif | |
|
65 | </table> | |
|
66 | </div> | |
|
67 | </div> | |
|
68 | % endif | |
|
69 | </div> | |
|
70 | </div> No newline at end of file |
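The hidden `__start__`/`__end__` inputs rendered around each member follow a peppercorn-style convention for grouping flat form fields into mappings on the server. A simplified, illustrative re-implementation of that grouping (not the actual library RhodeCode uses):

```python
# Simplified sketch of how "__start__"/"__end__" marker fields group a
# flat list of form fields into nested mappings. Illustrative only.
def parse_mappings(fields):
    result = []
    current = None
    for name, value in fields:
        if name == "__start__" and value.endswith(":mapping"):
            current = {}
        elif name == "__end__":
            result.append(current)
            current = None
        elif current is not None:
            current[name] = value
    return result

# The hidden inputs the template above renders for one member:
fields = [
    ("__start__", "member:mapping"),
    ("member_user_id", "2"),
    ("type", "existing"),
    ("__end__", "member:mapping"),
]
print(parse_mappings(fields))  # [{'member_user_id': '2', 'type': 'existing'}]
```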
@@ -0,0 +1,46 b'' | |||
|
1 | <%inherit file="/base/base.mako"/> | |
|
2 | ||
|
3 | <%def name="title()"> | |
|
4 | ${_('User group')}: ${c.user_group.users_group_name} | |
|
5 | %if c.rhodecode_name: | |
|
6 | · ${h.branding(c.rhodecode_name)} | |
|
7 | %endif | |
|
8 | </%def> | |
|
9 | ||
|
10 | <%def name="breadcrumbs_links()"> | |
|
11 | ${_('User group')}: ${c.user_group.users_group_name} | |
|
12 | </%def> | |
|
13 | ||
|
14 | <%def name="menu_bar_nav()"> | |
|
15 | ${self.menu_items(active='my_account')} | |
|
16 | </%def> | |
|
17 | ||
|
18 | <%def name="main()"> | |
|
19 | <div class="box"> | |
|
20 | <div class="title"> | |
|
21 | ${self.breadcrumbs()} | |
|
22 | </div> | |
|
23 | ||
|
24 | <div class="sidebar-col-wrapper scw-small"> | |
|
25 | ##main | |
|
26 | <div class="sidebar"> | |
|
27 | <ul class="nav nav-pills nav-stacked"> | |
|
28 | <li class="${'active' if c.active=='profile' else ''}"> | |
|
29 | <a href="${h.route_path('user_group_profile', user_group_name=c.user_group.users_group_name)}">${_('User Group Profile')}</a></li> | |
|
30 | ## These placeholders are here only for styling purposes. For every new item added to the list, you should remove one placeholder | |
|
31 | <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li> | |
|
32 | <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li> | |
|
33 | <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li> | |
|
34 | <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li> | |
|
35 | <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li> | |
|
36 | <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li> | |
|
37 | </ul> | |
|
38 | </div> | |
|
39 | ||
|
40 | <div class="main-content-full-width"> | |
|
41 | <%include file="/user_group/${c.active}.mako"/> | |
|
42 | </div> | |
|
43 | </div> | |
|
44 | </div> | |
|
45 | ||
|
46 | </%def> |
@@ -1,5 +1,5 b'' | |||
|
1 | 1 | [bumpversion] |
|
2 | current_version = 4.1 |
|
2 | current_version = 4.12.0 | |
|
3 | 3 | message = release: Bump version {current_version} to {new_version} |
|
4 | 4 | |
|
5 | 5 | [bumpversion:file:rhodecode/VERSION] |
@@ -5,25 +5,20 b' done = false' | |||
|
5 | 5 | done = true |
|
6 | 6 | |
|
7 | 7 | [task:rc_tools_pinned] |
|
8 | done = true | |
|
9 | 8 | |
|
10 | 9 | [task:fixes_on_stable] |
|
11 | done = true | |
|
12 | 10 | |
|
13 | 11 | [task:pip2nix_generated] |
|
14 | done = true | |
|
15 | 12 | |
|
16 | 13 | [task:changelog_updated] |
|
17 | done = true | |
|
18 | 14 | |
|
19 | 15 | [task:generate_api_docs] |
|
20 | done = true | |
|
16 | ||
|
17 | [task:updated_translation] | |
|
21 | 18 | |
|
22 | 19 | [release] |
|
23 | state = |
|
24 | version = 4.1 |
|
25 | ||
|
26 | [task:updated_translation] | |
|
20 | state = in_progress | |
|
21 | version = 4.12.0 | |
|
27 | 22 | |
|
28 | 23 | [task:generate_js_routes] |
|
29 | 24 |
@@ -16,9 +16,6 b' recursive-include configs *' | |||
|
16 | 16 | # translations |
|
17 | 17 | recursive-include rhodecode/i18n * |
|
18 | 18 | |
|
19 | # hook templates | |
|
20 | recursive-include rhodecode/config/hook_templates * | |
|
21 | ||
|
22 | 19 | # non-python core stuff |
|
23 | 20 | recursive-include rhodecode *.cfg |
|
24 | 21 | recursive-include rhodecode *.json |
@@ -190,9 +190,6 b' force_https = false' | |||
|
190 | 190 | ## use Strict-Transport-Security headers |
|
191 | 191 | use_htsts = false |
|
192 | 192 | |
|
193 | ## number of commits stats will parse on each iteration | |
|
194 | commit_parse_limit = 25 | |
|
195 | ||
|
196 | 193 | ## git rev filter option, --all is the default filter, if you need to |
|
197 | 194 | ## hide all refs in changelog switch this to --branches --tags |
|
198 | 195 | git_rev_filter = --branches --tags |
@@ -708,12 +705,12 b' formatter = color_formatter_sql' | |||
|
708 | 705 | |
|
709 | 706 | [formatter_generic] |
|
710 | 707 | class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter |
|
711 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
|
708 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | |
|
712 | 709 | datefmt = %Y-%m-%d %H:%M:%S |
|
713 | 710 | |
|
714 | 711 | [formatter_color_formatter] |
|
715 | 712 | class = rhodecode.lib.logging_formatter.ColorFormatter |
|
716 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
|
713 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | |
|
717 | 714 | datefmt = %Y-%m-%d %H:%M:%S |
|
718 | 715 | |
|
719 | 716 | [formatter_color_formatter_sql] |
@@ -55,12 +55,13 b' keepalive = 2' | |||
|
55 | 55 | |
|
56 | 56 | # SERVER MECHANICS |
|
57 | 57 | # None == system temp dir |
|
58 | # worker_tmp_dir is recommended to be set to some tmpfs | |
|
58 | 59 | worker_tmp_dir = None |
|
59 | 60 | tmp_upload_dir = None |
|
60 | 61 | |
|
61 | 62 | # Custom log format |
|
62 | 63 | access_log_format = ( |
|
63 | '%(t)s |
|
64 | '%(t)s [%(p)-8s] GNCRN %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"') | |
|
64 | 65 | |
|
65 | 66 | # self adjust workers based on CPU count |
|
66 | 67 | # workers = multiprocessing.cpu_count() * 2 + 1 |
@@ -119,16 +120,16 b' def child_exit(server, worker):' | |||
|
119 | 120 | |
|
120 | 121 | |
|
121 | 122 | def pre_request(worker, req): |
|
122 | return | |
|
123 | worker.log.debug("[<%-10s>] PRE WORKER: %s %s", | |
|
124 |
|
|
|
123 | worker.start_time = time.time() | |
|
124 | worker.log.debug( | |
|
125 | "GNCRN PRE WORKER [cnt:%s]: %s %s", worker.nr, req.method, req.path) | |
|
125 | 126 | |
|
126 | 127 | |
|
127 | 128 | def post_request(worker, req, environ, resp): |
|
128 | return | |
|
129 | worker.log.debug("[<%-10s>] POST WORKER: %s %s resp: %s", worker.pid, | |
|
130 | req.method, req.path, resp.status_code) | |
|
131 | ||
|
129 | total_time = time.time() - worker.start_time | |
|
130 | worker.log.debug( | |
|
131 | "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.3fs", | |
|
132 | worker.nr, req.method, req.path, resp.status_code, total_time) | |
|
132 | 133 | |
|
133 | 134 | |
|
134 | 135 | class RhodeCodeLogger(Logger): |
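The gunicorn config change replaces the previously stubbed-out hooks with per-request timing: `pre_request` stamps the worker with a start time, and `post_request` computes the elapsed time for the debug log. The pattern, sketched against a dummy worker object (the `nr` attribute mimics gunicorn's per-worker request counter):

```python
import time

# Dummy stand-in for gunicorn's worker object; "nr" mimics the
# per-worker request counter used in the log messages above.
class DummyWorker(object):
    nr = 1

def pre_request(worker, req):
    # Stamp the worker so the matching post_request can measure duration.
    worker.start_time = time.time()

def post_request(worker, req, environ, resp):
    # The real hook reports this via worker.log.debug(...).
    return time.time() - worker.start_time

worker = DummyWorker()
pre_request(worker, None)
time.sleep(0.01)  # pretend the request took some time
elapsed = post_request(worker, None, None, None)
print("GNCRN POST WORKER [cnt:%s] Load Time: %.3fs" % (worker.nr, elapsed))
```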
@@ -165,9 +165,6 b' force_https = false' | |||
|
165 | 165 | ## use Strict-Transport-Security headers |
|
166 | 166 | use_htsts = false |
|
167 | 167 | |
|
168 | ## number of commits stats will parse on each iteration | |
|
169 | commit_parse_limit = 25 | |
|
170 | ||
|
171 | 168 | ## git rev filter option, --all is the default filter, if you need to |
|
172 | 169 | ## hide all refs in changelog switch this to --branches --tags |
|
173 | 170 | git_rev_filter = --branches --tags |
@@ -678,12 +675,12 b' formatter = generic' | |||
|
678 | 675 | |
|
679 | 676 | [formatter_generic] |
|
680 | 677 | class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter |
|
681 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
|
678 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | |
|
682 | 679 | datefmt = %Y-%m-%d %H:%M:%S |
|
683 | 680 | |
|
684 | 681 | [formatter_color_formatter] |
|
685 | 682 | class = rhodecode.lib.logging_formatter.ColorFormatter |
|
686 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
|
683 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | |
|
687 | 684 | datefmt = %Y-%m-%d %H:%M:%S |
|
688 | 685 | |
|
689 | 686 | [formatter_color_formatter_sql] |
@@ -1,4 +1,4 b'' | |||
|
1 | .. _sec-your-server: | |
|
1 | .. _sec-sophos-umc: | |
|
2 | 2 | |
|
3 | 3 | Securing Your Server via Sophos UTM 9 |
|
4 | 4 | ------------------------------------- |
@@ -19,6 +19,7 b' The following are the most common system' | |||
|
19 | 19 | config-files-overview |
|
20 | 20 | vcs-server |
|
21 | 21 | svn-http |
|
22 | svn-path-permissions | |
|
22 | 23 | gunicorn-ssl-support |
|
23 | 24 | apache-config |
|
24 | 25 | nginx-config |
@@ -196,6 +196,7 b' are not required in args.' | |||
|
196 | 196 | .. --- API DEFS MARKER --- |
|
197 | 197 | .. toctree:: |
|
198 | 198 | |
|
199 | methods/views | |
|
199 | 200 | methods/license-methods |
|
200 | 201 | methods/deprecated-methods |
|
201 | 202 | methods/gist-methods |
@@ -66,7 +66,7 b' comment_commit' | |||
|
66 | 66 | create_repo |
|
67 | 67 | ----------- |
|
68 | 68 | |
|
69 | .. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>) | |
|
69 | .. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>) | |
|
70 | 70 | |
|
71 | 71 | Creates a repository. |
|
72 | 72 | |
@@ -95,6 +95,8 b' create_repo' | |||
|
95 | 95 | :type private: bool |
|
96 | 96 | :param clone_uri: set clone_uri |
|
97 | 97 | :type clone_uri: str |
|
98 | :param push_uri: set push_uri | |
|
99 | :type push_uri: str | |
|
98 | 100 | :param landing_rev: <rev_type>:<rev> |
|
99 | 101 | :type landing_rev: str |
|
100 | 102 | :param enable_locking: |
@@ -789,7 +791,7 b' maintenance' | |||
|
789 | 791 | pull |
|
790 | 792 | ---- |
|
791 | 793 | |
|
792 | .. py:function:: pull(apiuser, repoid) | |
|
794 | .. py:function:: pull(apiuser, repoid, remote_uri=<Optional:None>) | |
|
793 | 795 | |
|
794 | 796 | Triggers a pull on the given repository from a remote location. You |
|
795 | 797 | can use this to keep remote repositories up-to-date. |
@@ -804,6 +806,8 b' pull' | |||
|
804 | 806 | :type apiuser: AuthUser |
|
805 | 807 | :param repoid: The repository name or repository ID. |
|
806 | 808 | :type repoid: str or int |
|
809 | :param remote_uri: Optional remote URI to pass in for pull | |
|
810 | :type remote_uri: str | |
|
807 | 811 | |
|
808 | 812 | Example output: |
|
809 | 813 | |
@@ -811,7 +815,7 b' pull' | |||
|
811 | 815 | |
|
812 | 816 | id : <id_given_in_input> |
|
813 | 817 | result : { |
|
814 | "msg": "Pulled from `<repository name>`" | |
|
818 | "msg": "Pulled from url `<remote_url>` on repo `<repository name>`" | |
|
815 | 819 | "repository": "<repository name>" |
|
816 | 820 | } |
|
817 | 821 | error : null |
@@ -823,7 +827,7 b' pull' | |||
|
823 | 827 | id : <id_given_in_input> |
|
824 | 828 | result : null |
|
825 | 829 | error : { |
|
826 | "Unable to pu |
|
830 | "Unable to push changes from `<remote_url>`" | |
|
827 | 831 | } |
|
828 | 832 | |
|
829 | 833 | |
@@ -976,7 +980,7 b' strip' | |||
|
976 | 980 | update_repo |
|
977 | 981 | ----------- |
|
978 | 982 | |
|
979 | .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
|
983 | .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
|
980 | 984 | |
|
981 | 985 | Updates a repository with the given information. |
|
982 | 986 |
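The method signatures above gain `push_uri`/`remote_uri` parameters. RhodeCode API calls are JSON-RPC-style POST bodies; a sketch of the payload for the extended `pull` call (the token, repository name, and remote URI below are placeholders, not real values):

```python
import json

# Placeholder values throughout; the endpoint URL and auth token come
# from your own instance. Shown is only the request body shape.
payload = {
    "id": 1,
    "auth_token": "SECRET_TOKEN",
    "method": "pull",
    "args": {
        "repoid": "my-repo",
        "remote_uri": "https://example.com/upstream-repo",
    },
}
body = json.dumps(payload)
print(body)
```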
@@ -51,7 +51,7 b' add_user_to_user_group' | |||
|
51 | 51 | create_user_group |
|
52 | 52 | ----------------- |
|
53 | 53 | |
|
54 | .. py:function:: create_user_group(apiuser, group_name, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, active=<Optional:True>) | |
|
54 | .. py:function:: create_user_group(apiuser, group_name, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, active=<Optional:True>, sync=<Optional:None>) | |
|
55 | 55 | |
|
56 | 56 | Creates a new user group. |
|
57 | 57 | |
@@ -71,6 +71,9 b' create_user_group' | |||
|
71 | 71 | :type owner: Optional(str or int) |
|
72 | 72 | :param active: Set this group as active. |
|
73 | 73 | :type active: Optional(``True`` | ``False``) |
|
74 | :param sync: Enable or disable the automatic sync from | |
|
75 | external authentication types such as LDAP. | |
|
76 | :type sync: Optional(``True`` | ``False``) | |
|
74 | 77 | |
|
75 | 78 | Example output: |
|
76 | 79 | |
@@ -368,7 +371,7 b' revoke_user_permission_from_user_group' | |||
|
368 | 371 | update_user_group |
|
369 | 372 | ----------------- |
|
370 | 373 | |
|
371 | .. py:function:: update_user_group(apiuser, usergroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:None>, active=<Optional:True>) | |
|
374 | .. py:function:: update_user_group(apiuser, usergroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:None>, active=<Optional:True>, sync=<Optional:None>) | |
|
372 | 375 | |
|
373 | 376 | Updates the specified `user group` with the details provided. |
|
374 | 377 | |
@@ -387,6 +390,9 b' update_user_group' | |||
|
387 | 390 | :type owner: Optional(str or int) |
|
388 | 391 | :param active: Set the group as active. |
|
389 | 392 | :type active: Optional(``True`` | ``False``) |
|
393 | :param sync: Enable or disable the automatic sync from | |
|
394 | external authentication types such as LDAP. | |
|
395 | :type sync: Optional(``True`` | ``False``) | |
|
390 | 396 | |
|
391 | 397 | Example output: |
|
392 | 398 |
|
1 | NO CONTENT: file renamed from docs/auth/crowd-auth.rst to docs/auth/auth-crowd.rst |
@@ -1,14 +1,17 b'' | |||
|
1 | 1 | .. _config-ldap-ref: |
|
2 | 2 | |
|
3 | LDAP | |
|
4 | ---- | |
|
3 | LDAP/AD | |
|
4 | ------- | |
|
5 | 5 | |
|
6 | 6 | |RCM| supports LDAP (Lightweight Directory Access Protocol) or |
|
7 | 7 | AD (Active Directory) authentication. |
|
8 | 8 | All LDAP versions are supported, with the following |RCM| plugins managing each: |
|
9 | 9 | |
|
10 | * For LDAP |
|
11 | * For LDAPv3 with user group sync use ``LDAP + User Groups (egg:rhodecode-enterprise-ee#ldap_group)`` | |
|
10 | * For LDAP or Active Directory use ``LDAP (egg:rhodecode-enterprise-ce#ldap)`` | |
|
11 | ||
|
12 | RhodeCode reads all data defined by the plugin and creates corresponding | |
|
13 | accounts in the local database after receiving data from LDAP. This is done on | |
|
14 | every user log-in, including operations like push/pull/checkout. | |
|
12 | 15 | |
|
13 | 16 | |
|
14 | 17 | .. important:: |
@@ -21,6 +24,7 b' All LDAP versions are supported, with th' | |||
|
21 | 24 | Likewise, if as an admin you also have a user account, the email address |
|
22 | 25 | attached to the user account needs to be different. |
|
23 | 26 | |
|
27 | ||
|
24 | 28 | LDAP Configuration Steps |
|
25 | 29 | ^^^^^^^^^^^^^^^^^^^^^^^^ |
|
26 | 30 | |
@@ -28,7 +32,7 b' To configure |LDAP|, use the following s' | |||
|
28 | 32 | |
|
29 | 33 | 1. From the |RCM| interface, select |
|
30 | 34 | :menuselection:`Admin --> Authentication` |
|
31 | 2. Enable the |
|
35 | 2. Enable the LDAP plugin and select :guilabel:`Save` | |
|
32 | 36 | 3. Select the :guilabel:`Enabled` check box in the plugin configuration section |
|
33 | 37 | 4. Add the required LDAP information and :guilabel:`Save`, for more details, |
|
34 | 38 | see :ref:`config-ldap-examples` |
@@ -41,15 +45,16 b' Example LDAP configuration' | |||
|
41 | 45 | ^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
42 | 46 | .. code-block:: bash |
|
43 | 47 | |
|
44 | # Auth Cache TTL | |
|
48 | # Auth Cache TTL. Defines the caching of authentication results to offload the LDAP server. | |
|
49 | # This means a cached result is kept for 3600 seconds before the LDAP server is contacted again to verify user access | |
|
45 | 50 | 3600 |
|
46 | # Host | |
|
51 | # Host; a comma-separated list can be used to specify more than one server |
|
47 | 52 | https://ldap1.server.com/ldap-admin/,https://ldap2.server.com/ldap-admin/ |
|
48 | # Port | |
|
53 | # Default LDAP Port, use 636 for LDAPS |
|
49 | 54 | 389 |
|
50 | # Account | |
|
51 | cn=admin,dc=rhodecode,dc=com | |
|
52 | # Password | |
|
55 | # Account, used for SimpleBind if the LDAP server requires authentication | |
|
56 | e.g. admin@server.com | |
|
57 | # Password used for simple bind | |
|
53 | 58 | ldap-user-password |
|
54 | 59 | # LDAP connection security |
|
55 | 60 | LDAPS |
@@ -57,32 +62,26 b' Example LDAP configuration' | |||
|
57 | 62 | DEMAND |
|
58 | 63 | # Base DN |
|
59 | 64 | cn=Rufus Magillacuddy,ou=users,dc=rhodecode,dc=com |
|
60 | # User Search Base | |
|
61 | ou=groups,ou=users | |
|
62 | # LDAP search filter | |
|
65 | # LDAP search filter to narrow the results | |
|
63 | 66 | (objectClass=person) |
|
64 | 67 | # LDAP search scope |
|
65 | 68 | SUBTREE |
|
66 | 69 | # Login attribute |
|
67 | rmagillacuddy | |
|
68 | # First Name Attribute | |
|
69 | Rufus | |
|
70 | # Last Name Attribute | |
|
71 | Magillacuddy | |
|
72 | # Email Attribute | |
|
73 | LDAP-Registered@email.ac | |
|
74 | # User Member of Attribute | |
|
75 | Organizational Role | |
|
76 | # Group search base | |
|
77 | cn=users,ou=groups,dc=rhodecode,dc=com | |
|
78 | # LDAP Group Search Filter | |
|
79 | (objectclass=posixGroup) | |
|
80 | # Group Name Attribute | |
|
81 | users | |
|
82 | # Group Member Of Attribute | |
|
83 | cn | |
|
84 | # Admin Groups | |
|
85 | admin,devops,qa | |
|
70 | sAMAccountName | |
|
71 | # First Name Attribute to read | |
|
72 | givenName | |
|
73 | # Last Name Attribute to read | |
|
74 | sn | |
|
75 | # Email Attribute to read email address from | |
|
76 | ||
|
77 | ||
|
78 | ||
|
79 | Below is an example setup that can be used with an Active Directory/LDAP server. | |
|
80 | ||
|
81 | .. image:: ../images/ldap-example.png | |
|
82 | :alt: LDAP/AD setup example | |
|
83 | :scale: 50 % | |
|
84 | ||
|
86 | 85 | |
|
87 | 86 | .. toctree:: |
|
88 | 87 |
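The example configuration above pairs an LDAP search filter with a login attribute; at login time these are combined into a per-user lookup filter, with the user-supplied login escaped per RFC 4515. A hedged sketch of that composition (the helper names are illustrative, not RhodeCode's API; `sAMAccountName` matches the example configuration):

```python
# RFC 4515 escaping of user-supplied values in LDAP filters.
def escape_filter_chars(value):
    for char, repl in (("\\", r"\5c"), ("*", r"\2a"), ("(", r"\28"),
                       (")", r"\29"), ("\0", r"\00")):
        value = value.replace(char, repl)
    return value

def build_login_filter(base_filter, login_attr, username):
    # AND the configured base filter with a login-attribute match.
    return "(&%s(%s=%s))" % (base_filter, login_attr,
                             escape_filter_chars(username))

print(build_login_filter("(objectClass=person)", "sAMAccountName",
                         "rmagillacuddy"))
# (&(objectClass=person)(sAMAccountName=rmagillacuddy))
```

Escaping matters because an unescaped login such as `a*b` would otherwise change the filter semantics.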
|
1 | NO CONTENT: file renamed from docs/auth/pam-auth.rst to docs/auth/auth-pam.rst |
|
1 | NO CONTENT: file renamed from docs/auth/token-auth.rst to docs/auth/auth-token.rst |
@@ -3,35 +3,30 b'' | |||
|
3 | 3 | Authentication Options |
|
4 | 4 | ====================== |
|
5 | 5 | |
|
6 | |RCE| provides a built in authentication |
|
7 | ``rhodecode.lib.auth_rhodecode``. This is enabled by default and accessed | |
|
8 | through the administrative interface. Additionally, | |
|
9 | |RCE| provides a Pluggable Authentication System |
|
6 | |RCE| provides built-in authentication against its own database. This is | |
|
7 | implemented by the ``rhodecode.lib.auth_rhodecode`` plugin, which is | |
|
8 | enabled by default. | |
|
9 | Additionally, |RCE| provides a Pluggable Authentication System. This gives the | |
|
10 | 10 | administrator greater control over how users authenticate with the system. |
|
11 | 11 | |
|
12 | 12 | .. important:: |
|
13 | 13 | |
|
14 | 14 | You can disable the built in |RCM| authentication plugin |
|
15 | 15 | ``rhodecode.lib.auth_rhodecode`` and force all authentication to go |
|
16 | through your authentication plugin |
|
17 | and your external authentication tools fails, |
|
18 | access |RCM|. | |
|
16 | through your authentication plugin of choice, e.g. LDAP only. | |
|
17 | However, if you do this and your external authentication tool fails, | |
|
18 | you will be unable to access |RCM|. | |
|
19 | 19 | |
|
20 | 20 | |RCM| comes with the following user authentication management plugins: |
|
21 | 21 | |
|
22 | .. only:: latex | |
|
23 | ||
|
24 | * :ref:`config-ldap-ref` | |
|
25 | * :ref:`config-pam-ref` | |
|
26 | * :ref:`config-crowd-ref` | |
|
27 | * :ref:`config-token-ref` | |
|
28 | 22 | |
|
29 | 23 | .. toctree:: |
|
30 | 24 | |
|
31 | ldap-config-steps | |
|
32 | crowd-auth | |
|
33 |
|
|
|
34 |
|
|
|
25 | auth-ldap | |
|
26 | auth-ldap-groups | |
|
27 | auth-crowd | |
|
28 | auth-pam | |
|
29 | auth-token | |
|
35 | 30 | ssh-connection |
|
36 | 31 | |
|
37 | 32 |
@@ -34,7 +34,7 b' import common' | |||
|
34 | 34 | extensions = [ |
|
35 | 35 | 'sphinx.ext.intersphinx', |
|
36 | 36 | 'sphinx.ext.todo', |
|
37 | 'sphinx.ext. |
|
37 | 'sphinx.ext.imgmath' | |
|
38 | 38 | ] |
|
39 | 39 | |
|
40 | 40 | intersphinx_mapping = { |
@@ -104,7 +104,6 b' exclude_patterns = [' | |||
|
104 | 104 | |
|
105 | 105 | # Other RST files |
|
106 | 106 | 'admin/rhodecode-backup.rst', |
|
107 | 'auth/ldap-configuration-example.rst', | |
|
108 | 107 | 'issue-trackers/redmine.rst', |
|
109 | 108 | 'known-issues/error-msg-guide.rst', |
|
110 | 109 | 'tutorials/docs-build.rst', |
@@ -34,6 +34,13 b' following commands:' | |||
|
34 | 34 | |
|
35 | 35 | Update your channels frequently by running ``nix-channel --update``. |
|
36 | 36 | |
|
37 | .. note:: | |
|
38 | ||
|
39 | To uninstall Nix, run the following: | |
|
40 | ||
|
41 | remove the . "$HOME/.nix-profile/etc/profile.d/nix.sh" line from your ~/.profile or ~/.bash_profile, then run: | |
|
42 | rm -rf $HOME/{.nix-channels,.nix-defexpr,.nix-profile,.config/nixpkgs} | |
|
43 | sudo rm -rf /nix | |
|
37 | 44 | |
|
38 | 45 | Switch nix to the latest STABLE channel |
|
39 | 46 | --------------------------------------- |
@@ -169,17 +169,6 b' let' | |||
|
169 | 169 | }; |
|
170 | 170 | }; |
|
171 | 171 | |
|
172 | setuptools = buildPythonPackage { | |
|
173 | name = "setuptools-36.6.0"; | |
|
174 | buildInputs = []; | |
|
175 | doCheck = false; | |
|
176 | propagatedBuildInputs = []; | |
|
177 | src = fetchurl { | |
|
178 | url = "https://pypi.python.org/packages/45/29/8814bf414e7cd1031e1a3c8a4169218376e284ea2553cc0822a6ea1c2d78/setuptools-36.6.0.zip"; | |
|
179 | md5 = "74663b15117d9a2cc5295d76011e6fd1"; | |
|
180 | }; | |
|
181 | }; | |
|
182 | ||
|
183 | 172 | six = buildPythonPackage { |
|
184 | 173 | name = "six-1.11.0"; |
|
185 | 174 | buildInputs = []; |
@@ -259,6 +248,9 b' let' | |||
|
259 | 248 | |
|
260 | 249 | |
|
261 | 250 | }; |
|
251 | # Avoid that setuptools is replaced, this leads to trouble | |
|
252 | # with buildPythonPackage. | |
|
253 | setuptools = pkgs.python27Packages.setuptools; | |
|
262 | 254 | |
|
263 | 255 | in python.buildEnv.override { |
|
264 | 256 | inherit python; |
@@ -9,6 +9,7 b' Release Notes' | |||
|
9 | 9 | .. toctree:: |
|
10 | 10 | :maxdepth: 1 |
|
11 | 11 | |
|
12 | release-notes-4.12.0.rst | |
|
12 | 13 | release-notes-4.11.6.rst |
|
13 | 14 | release-notes-4.11.5.rst |
|
14 | 15 | release-notes-4.11.4.rst |
@@ -88,7 +88,7 b' 2. Once downloaded to your computer, tra' | |||
|
88 | 88 | |
|
89 | 89 | The |RCC| installer is now on your server, and you can read the full |
|
90 | 90 | instructions here |
|
91 | :ref:`Install RhodeCode Control <control:rcc- |
|
91 | :ref:`Install RhodeCode Control <control:rcc-linux-ref>`, | |
|
92 | 92 | but below is the example shortcut. |
|
93 | 93 | |
|
94 | 94 | .. code-block:: bash |
@@ -25,6 +25,12 b' self: super: {' | |||
|
25 | 25 | }; |
|
26 | 26 | }); |
|
27 | 27 | |
|
28 | Beaker = super.Beaker.override (attrs: { | |
|
29 | patches = [ | |
|
30 | ./patch-beaker-lock-func-debug.diff | |
|
31 | ]; | |
|
32 | }); | |
|
33 | ||
|
28 | 34 | future = super.future.override (attrs: { |
|
29 | 35 | meta = { |
|
30 | 36 | license = [ pkgs.lib.licenses.mit ]; |
@@ -82,7 +88,7 b' self: super: {' | |||
|
82 | 88 | pkgs.openssl |
|
83 | 89 | ]; |
|
84 | 90 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
85 |
pkgs.mysql |
|
|
91 | pkgs.libmysql | |
|
86 | 92 | pkgs.zlib |
|
87 | 93 | ]; |
|
88 | 94 | }); |
@@ -16,13 +16,13 b'' | |||
|
16 | 16 | }; |
|
17 | 17 | }; |
|
18 | 18 | Beaker = super.buildPythonPackage { |
|
19 |
name = "Beaker-1.9. |
|
|
19 | name = "Beaker-1.9.1"; | |
|
20 | 20 | buildInputs = with self; []; |
|
21 | 21 | doCheck = false; |
|
22 | 22 | propagatedBuildInputs = with self; [funcsigs]; |
|
23 | 23 | src = fetchurl { |
|
24 |
url = "https://pypi.python.org/packages/ |
|
|
25 | md5 = "38b3fcdfa24faf97c6cf66991eb54e9c"; | |
|
24 | url = "https://pypi.python.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz"; | |
|
25 | md5 = "46fda0a164e2b0d24ccbda51a2310301"; | |
|
26 | 26 | }; |
|
27 | 27 | meta = { |
|
28 | 28 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -94,13 +94,13 b'' | |||
|
94 | 94 | }; |
|
95 | 95 | }; |
|
96 | 96 | Markdown = super.buildPythonPackage { |
|
97 |
name = "Markdown-2.6. |
|
|
97 | name = "Markdown-2.6.11"; | |
|
98 | 98 | buildInputs = with self; []; |
|
99 | 99 | doCheck = false; |
|
100 | 100 | propagatedBuildInputs = with self; []; |
|
101 | 101 | src = fetchurl { |
|
102 |
url = "https://pypi.python.org/packages/2 |
|
|
103 | md5 = "56547d362a9abcf30955b8950b08b5e3"; | |
|
102 | url = "https://pypi.python.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz"; | |
|
103 | md5 = "a67c1b2914f7d74eeede2ebe0fdae470"; | |
|
104 | 104 | }; |
|
105 | 105 | meta = { |
|
106 | 106 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -315,13 +315,13 b'' | |||
|
315 | 315 | }; |
|
316 | 316 | }; |
|
317 | 317 | alembic = super.buildPythonPackage { |
|
318 |
name = "alembic-0.9. |
|
|
318 | name = "alembic-0.9.8"; | |
|
319 | 319 | buildInputs = with self; []; |
|
320 | 320 | doCheck = false; |
|
321 | 321 | propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor python-dateutil]; |
|
322 | 322 | src = fetchurl { |
|
323 |
url = "https://pypi.python.org/packages/ |
|
|
324 | md5 = "fcb096bccc87c8770bd07a04606cb989"; | |
|
323 | url = "https://pypi.python.org/packages/a1/95/2252783859df9ec76b9a25d968c2827ed75a43ba34c6e8d38f87a5c0fb26/alembic-0.9.8.tar.gz"; | |
|
324 | md5 = "5cfef58641c9a94d4a5d547e951a7dda"; | |
|
325 | 325 | }; |
|
326 | 326 | meta = { |
|
327 | 327 | license = [ pkgs.lib.licenses.mit ]; |
@@ -341,13 +341,13 b'' | |||
|
341 | 341 | }; |
|
342 | 342 | }; |
|
343 | 343 | appenlight-client = super.buildPythonPackage { |
|
344 |
name = "appenlight-client-0.6.2 |
|
|
344 | name = "appenlight-client-0.6.25"; | |
|
345 | 345 | buildInputs = with self; []; |
|
346 | 346 | doCheck = false; |
|
347 | 347 | propagatedBuildInputs = with self; [WebOb requests six]; |
|
348 | 348 | src = fetchurl { |
|
349 |
url = "https://pypi.python.org/packages/ |
|
|
350 | md5 = "641afc114a9a3b3af4f75b11c70968ee"; | |
|
349 | url = "https://pypi.python.org/packages/fa/44/2911ef85ea4f4fe65058fd22959d8dad598fab6a3c84e5bcb569d15c8783/appenlight_client-0.6.25.tar.gz"; | |
|
350 | md5 = "76dd2f9d42659fae8f290982078dc880"; | |
|
351 | 351 | }; |
|
352 | 352 | meta = { |
|
353 | 353 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -406,13 +406,13 b'' | |||
|
406 | 406 | }; |
|
407 | 407 | }; |
|
408 | 408 | bleach = super.buildPythonPackage { |
|
409 |
name = "bleach-2.1. |
|
|
409 | name = "bleach-2.1.2"; | |
|
410 | 410 | buildInputs = with self; []; |
|
411 | 411 | doCheck = false; |
|
412 | 412 | propagatedBuildInputs = with self; [six html5lib]; |
|
413 | 413 | src = fetchurl { |
|
414 |
url = "https://pypi.python.org/packages/d |
|
|
415 | md5 = "7c5dfb1d66ea979b5a465afb12c82ec4"; | |
|
414 | url = "https://pypi.python.org/packages/b3/5f/0da670d30d3ffbc57cc97fa82947f81bbe3eab8d441e2d42e661f215baf2/bleach-2.1.2.tar.gz"; | |
|
415 | md5 = "d0b14ae43a437ee0c650e04c6063eedd"; | |
|
416 | 416 | }; |
|
417 | 417 | meta = { |
|
418 | 418 | license = [ pkgs.lib.licenses.asl20 ]; |
@@ -710,8 +710,8 b'' | |||
|
710 | 710 | doCheck = false; |
|
711 | 711 | propagatedBuildInputs = with self; []; |
|
712 | 712 | src = fetchurl { |
|
713 | url = "https://pypi.python.org/packages/5e/1a/0aa2c8195a204a9f51284018562dea77e25511f02fe924fac202fc012172/functools32-3.2.3-2.zip"; | |
|
714 | md5 = "d55232eb132ec779e6893c902a0bc5ad"; | |
|
713 | url = "https://pypi.python.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz"; | |
|
714 | md5 = "09f24ffd9af9f6cd0f63cb9f4e23d4b2"; | |
|
715 | 715 | }; |
|
716 | 716 | meta = { |
|
717 | 717 | license = [ pkgs.lib.licenses.psfl ]; |
@@ -783,26 +783,26 b'' | |||
|
783 | 783 | }; |
|
784 | 784 | }; |
|
785 | 785 | graphviz = super.buildPythonPackage { |
|
786 | name = "graphviz-0.8.1"; | |
|
786 | name = "graphviz-0.8.2"; | |
|
787 | 787 | buildInputs = with self; []; |
|
788 | 788 | doCheck = false; |
|
789 | 789 | propagatedBuildInputs = with self; []; |
|
790 | 790 | src = fetchurl { |
|
791 | url = "https://pypi.python.org/packages/a9/a6/ee6721349489a2da6eedd3dba124f2b5ac15ee1e0a7bd4d3cfdc4fff0327/graphviz-0.8.1.zip"; | |
|
792 | md5 = "88d8efa88c02a735b3659fe0feaf0b96"; | |
|
791 | url = "https://pypi.python.org/packages/fa/d1/63b62dee9e55368f60b5ea445e6afb361bb47e692fc27553f3672e16efb8/graphviz-0.8.2.zip"; | |
|
792 | md5 = "50866e780f43e1cb0d073c70424fcaff"; | |
|
793 | 793 | }; |
|
794 | 794 | meta = { |
|
795 | 795 | license = [ pkgs.lib.licenses.mit ]; |
|
796 | 796 | }; |
|
797 | 797 | }; |
|
798 | 798 | greenlet = super.buildPythonPackage { |
|
799 | name = "greenlet-0.4.1 | |
|
799 | name = "greenlet-0.4.13"; | |
|
800 | 800 | buildInputs = with self; []; |
|
801 | 801 | doCheck = false; |
|
802 | 802 | propagatedBuildInputs = with self; []; |
|
803 | 803 | src = fetchurl { |
|
804 | url = "https://pypi.python.org/packages/ | |
|
805 | md5 = "e8637647d58a26c4a1f51ca393e53c00"; | |
|
804 | url = "https://pypi.python.org/packages/13/de/ba92335e9e76040ca7274224942282a80d54f85e342a5e33c5277c7f87eb/greenlet-0.4.13.tar.gz"; | |
|
805 | md5 = "6e0b9dd5385f81d478451ec8ed1d62b3"; | |
|
806 | 806 | }; |
|
807 | 807 | meta = { |
|
808 | 808 | license = [ pkgs.lib.licenses.mit ]; |
@@ -822,13 +822,13 b'' | |||
|
822 | 822 | }; |
|
823 | 823 | }; |
|
824 | 824 | html5lib = super.buildPythonPackage { |
|
825 | name = "html5lib-1.0b10"; | |
|
825 | name = "html5lib-1.0.1"; | |
|
826 | 826 | buildInputs = with self; []; |
|
827 | 827 | doCheck = false; |
|
828 | propagatedBuildInputs = with self; [six webencodings | |
|
828 | propagatedBuildInputs = with self; [six webencodings]; | |
|
829 | 829 | src = fetchurl { |
|
830 | url = "https://pypi.python.org/packages/97/16/982214624095c1420c75f3bd295d9e658794aafb95fc075823de107e0ae4/html5lib-1.0b10.tar.gz"; | |
|
831 | md5 = "5ada1243b7a863624b2f35245b2186e9"; | |
|
830 | url = "https://pypi.python.org/packages/85/3e/cf449cf1b5004e87510b9368e7a5f1acd8831c2d6691edd3c62a0823f98f/html5lib-1.0.1.tar.gz"; | |
|
831 | md5 = "942a0688d6bdf20d087c9805c40182ad"; | |
|
832 | 832 | }; |
|
833 | 833 | meta = { |
|
834 | 834 | license = [ pkgs.lib.licenses.mit ]; |
@@ -874,13 +874,13 b'' | |||
|
874 | 874 | }; |
|
875 | 875 | }; |
|
876 | 876 | ipaddress = super.buildPythonPackage { |
|
877 | name = "ipaddress-1.0.1 | |
|
877 | name = "ipaddress-1.0.19"; | |
|
878 | 878 | buildInputs = with self; []; |
|
879 | 879 | doCheck = false; |
|
880 | 880 | propagatedBuildInputs = with self; []; |
|
881 | 881 | src = fetchurl { |
|
882 | url = "https://pypi.python.org/packages/4e | |
|
883 | md5 = "310c2dfd64eb6f0df44aa8c59f2334a7"; | |
|
882 | url = "https://pypi.python.org/packages/f0/ba/860a4a3e283456d6b7e2ab39ce5cf11a3490ee1a363652ac50abf9f0f5df/ipaddress-1.0.19.tar.gz"; | |
|
883 | md5 = "d0687efaf93a32476d81e90ba0609c57"; | |
|
884 | 884 | }; |
|
885 | 885 | meta = { |
|
886 | 886 | license = [ pkgs.lib.licenses.psfl ]; |
@@ -1048,8 +1048,8 b'' | |||
|
1048 | 1048 | doCheck = false; |
|
1049 | 1049 | propagatedBuildInputs = with self; []; |
|
1050 | 1050 | src = fetchurl { |
|
1051 | url = "https://pypi.python.org/packages/15/45/30273ee91feb60dabb8fbb2da7868520525f02cf910279b3047182feed80/mock-1.0.1.zip"; | |
|
1052 | md5 = "869f08d003c289a97c1a6610faf5e913"; | |
|
1051 | url = "https://pypi.python.org/packages/a2/52/7edcd94f0afb721a2d559a5b9aae8af4f8f2c79bc63fdbe8a8a6c9b23bbe/mock-1.0.1.tar.gz"; | |
|
1052 | md5 = "c3971991738caa55ec7c356bbc154ee2"; | |
|
1053 | 1053 | }; |
|
1054 | 1054 | meta = { |
|
1055 | 1055 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -1160,13 +1160,13 b'' | |||
|
1160 | 1160 | }; |
|
1161 | 1161 | }; |
|
1162 | 1162 | pexpect = super.buildPythonPackage { |
|
1163 | name = "pexpect-4.3.0"; | |
|
1163 | name = "pexpect-4.4.0"; | |
|
1164 | 1164 | buildInputs = with self; []; |
|
1165 | 1165 | doCheck = false; |
|
1166 | 1166 | propagatedBuildInputs = with self; [ptyprocess]; |
|
1167 | 1167 | src = fetchurl { |
|
1168 | url = "https://pypi.python.org/packages/f8/44/5466c30e49762bb92e442bbdf4472d6904608d211258eb3198a11f0309a4/pexpect-4.3.0.tar.gz"; | |
|
1169 | md5 = "047a486dcd26134b74f2e67046bb61a0"; | |
|
1168 | url = "https://pypi.python.org/packages/fa/c3/60c0cbf96f242d0b47a82e9ca634dcd6dcb043832cf05e17540812e1c707/pexpect-4.4.0.tar.gz"; | |
|
1169 | md5 = "e9b07f0765df8245ac72201d757baaef"; | |
|
1170 | 1170 | }; |
|
1171 | 1171 | meta = { |
|
1172 | 1172 | license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ]; |
@@ -1225,13 +1225,13 b'' | |||
|
1225 | 1225 | }; |
|
1226 | 1226 | }; |
|
1227 | 1227 | psutil = super.buildPythonPackage { |
|
1228 | name = "psutil-5.4.0"; | |
|
1228 | name = "psutil-5.4.3"; | |
|
1229 | 1229 | buildInputs = with self; []; |
|
1230 | 1230 | doCheck = false; |
|
1231 | 1231 | propagatedBuildInputs = with self; []; |
|
1232 | 1232 | src = fetchurl { |
|
1233 | url = "https://pypi.python.org/packages/8d/96/1fc6468be91521192861966c40bd73fdf8b065eae6d82dd0f870b9825a65/psutil-5.4.0.tar.gz"; | |
|
1234 | md5 = "01af6219b1e8fcfd53603023967713bf"; | |
|
1233 | url = "https://pypi.python.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz"; | |
|
1234 | md5 = "3b291833dbea631db9d271aa602a169a"; | |
|
1235 | 1235 | }; |
|
1236 | 1236 | meta = { |
|
1237 | 1237 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -1360,8 +1360,8 b'' | |||
|
1360 | 1360 | doCheck = false; |
|
1361 | 1361 | propagatedBuildInputs = with self; []; |
|
1362 | 1362 | src = fetchurl { |
|
1363 | url = "https://pypi.python.org/packages/ | |
|
1364 | md5 = "b86854857a368d6ccb4d5b6e76d0637f"; | |
|
1363 | url = "https://pypi.python.org/packages/6f/2c/47457771c02a8ff0f302b695e094ec309e30452232bd79198ee94fda689f/pyparsing-1.5.7.tar.gz"; | |
|
1364 | md5 = "9be0fcdcc595199c646ab317c1d9a709"; | |
|
1365 | 1365 | }; |
|
1366 | 1366 | meta = { |
|
1367 | 1367 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1602,13 +1602,13 b'' | |||
|
1602 | 1602 | }; |
|
1603 | 1603 | }; |
|
1604 | 1604 | pytz = super.buildPythonPackage { |
|
1605 | name = "pytz-2017.3"; | |
|
1605 | name = "pytz-2018.3"; | |
|
1606 | 1606 | buildInputs = with self; []; |
|
1607 | 1607 | doCheck = false; |
|
1608 | 1608 | propagatedBuildInputs = with self; []; |
|
1609 | 1609 | src = fetchurl { |
|
1610 | url = "https://pypi.python.org/packages/60/88/d3152c234da4b2a1f7a989f89609ea488225eaea015bc16fbde2b3fdfefa/pytz-2017.3.zip"; | |
|
1611 | md5 = "7006b56c0d68a162d9fe57d4249c3171"; | |
|
1610 | url = "https://pypi.python.org/packages/1b/50/4cdc62fc0753595fc16c8f722a89740f487c6e5670c644eb8983946777be/pytz-2018.3.tar.gz"; | |
|
1611 | md5 = "abb07c09c79f78d7c04f222a550c99ef"; | |
|
1612 | 1612 | }; |
|
1613 | 1613 | meta = { |
|
1614 | 1614 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1680,10 +1680,10 b'' | |||
|
1680 | 1680 | }; |
|
1681 | 1681 | }; |
|
1682 | 1682 | rhodecode-enterprise-ce = super.buildPythonPackage { |
|
1683 | name = "rhodecode-enterprise-ce-4.1 | |
|
1683 | name = "rhodecode-enterprise-ce-4.12.0"; | |
|
1684 | 1684 | buildInputs = with self; [pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage configobj]; |
|
1685 | 1685 | doCheck = true; |
|
1686 | propagatedBuildInputs = with self; [setuptools-scm amqp authomatic Babel Beaker celery Chameleon channelstream click colander configobj cssselect decorator deform docutils dogpile.cache dogpile.core ecdsa FormEncode future futures gnureadline infrae.cache iso8601 itsdangerous Jinja2 billiard kombu lxml Mako Markdown MarkupSafe msgpack-python MySQL-python objgraph packaging Paste PasteDeploy PasteScript pathlib2 peppercorn psutil psycopg2 py-bcrypt pycrypto pycurl pyflakes pygments-markdown-lexer Pygments pyparsing pyramid-beaker pyramid-debugtoolbar pyramid-jinja2 pyramid-mako pyramid pysqlite python-dateutil python-ldap python-memcached python-pam pytz pyzmq py-gfm recaptcha-client redis repoze.lru requests Routes setproctitle simplejson six SQLAlchemy sshpubkeys subprocess32 supervisor Tempita translationstring trollius urllib3 URLObject venusian WebError WebHelpers2 WebHelpers WebOb Whoosh wsgiref zope.cachedescriptors zope.deprecation zope.event zope.interface nbconvert bleach nbformat jupyter-client alembic invoke bumpversion transifex-client gevent greenlet gunicorn waitress uWSGI ipdb ipython CProfileV bottle rhodecode-tools appenlight-client pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage]; | |
|
1686 | propagatedBuildInputs = with self; [setuptools-scm amqp authomatic Babel Beaker celery Chameleon channelstream click colander configobj cssselect decorator deform docutils dogpile.cache dogpile.core ecdsa FormEncode future futures gnureadline infrae.cache iso8601 itsdangerous Jinja2 billiard kombu lxml Mako Markdown MarkupSafe msgpack-python MySQL-python objgraph packaging Paste PasteDeploy PasteScript pathlib2 peppercorn psutil psycopg2 py-bcrypt pycrypto pycurl pyflakes pygments-markdown-lexer Pygments pyparsing pyramid-beaker pyramid-debugtoolbar pyramid-jinja2 pyramid-mako pyramid pysqlite python-dateutil python-ldap python-memcached python-pam pytz tzlocal pyzmq py-gfm recaptcha-client redis repoze.lru requests Routes setproctitle simplejson six SQLAlchemy sshpubkeys subprocess32 supervisor Tempita translationstring trollius urllib3 URLObject venusian WebError WebHelpers2 WebHelpers WebOb Whoosh wsgiref zope.cachedescriptors zope.deprecation zope.event zope.interface nbconvert bleach nbformat jupyter-client alembic invoke bumpversion transifex-client gevent greenlet gunicorn waitress uWSGI ipdb ipython CProfileV bottle rhodecode-tools appenlight-client pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage]; | |
|
1687 | 1687 | src = ./.; |
|
1688 | 1688 | meta = { |
|
1689 | 1689 | license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ]; |
@@ -1703,13 +1703,13 b'' | |||
|
1703 | 1703 | }; |
|
1704 | 1704 | }; |
|
1705 | 1705 | scandir = super.buildPythonPackage { |
|
1706 | name = "scandir-1. | |
|
1706 | name = "scandir-1.7"; | |
|
1707 | 1707 | buildInputs = with self; []; |
|
1708 | 1708 | doCheck = false; |
|
1709 | 1709 | propagatedBuildInputs = with self; []; |
|
1710 | 1710 | src = fetchurl { |
|
1711 | url = "https://pypi.python.org/packages/77 | |
|
1712 | md5 = "0180ddb97c96cbb2d4f25d2ae11c64ac"; | |
|
1711 | url = "https://pypi.python.org/packages/13/bb/e541b74230bbf7a20a3949a2ee6631be299378a784f5445aa5d0047c192b/scandir-1.7.tar.gz"; | |
|
1712 | md5 = "037e5f24d1a0e78b17faca72dea9555f"; | |
|
1713 | 1713 | }; |
|
1714 | 1714 | meta = { |
|
1715 | 1715 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ]; |
@@ -1820,13 +1820,13 b'' | |||
|
1820 | 1820 | }; |
|
1821 | 1821 | }; |
|
1822 | 1822 | supervisor = super.buildPythonPackage { |
|
1823 | name = "supervisor-3.3.3"; | |
|
1823 | name = "supervisor-3.3.4"; | |
|
1824 | 1824 | buildInputs = with self; []; |
|
1825 | 1825 | doCheck = false; |
|
1826 | 1826 | propagatedBuildInputs = with self; [meld3]; |
|
1827 | 1827 | src = fetchurl { |
|
1828 | url = "https://pypi.python.org/packages/31/7e/788fc6566211e77c395ea272058eb71299c65cc5e55b6214d479c6c2ec9a/supervisor-3.3.3.tar.gz"; | |
|
1829 | md5 = "0fe86dfec4e5c5d98324d24c4cf944bd"; | |
|
1828 | url = "https://pypi.python.org/packages/44/60/698e54b4a4a9b956b2d709b4b7b676119c833d811d53ee2500f1b5e96dc3/supervisor-3.3.4.tar.gz"; | |
|
1829 | md5 = "f1814d71d820ddfa8c86d46a72314cec"; | |
|
1830 | 1830 | }; |
|
1831 | 1831 | meta = { |
|
1832 | 1832 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
@@ -1910,6 +1910,19 b'' | |||
|
1910 | 1910 | license = [ pkgs.lib.licenses.asl20 ]; |
|
1911 | 1911 | }; |
|
1912 | 1912 | }; |
|
1913 | tzlocal = super.buildPythonPackage { | |
|
1914 | name = "tzlocal-1.5.1"; | |
|
1915 | buildInputs = with self; []; | |
|
1916 | doCheck = false; | |
|
1917 | propagatedBuildInputs = with self; [pytz]; | |
|
1918 | src = fetchurl { | |
|
1919 | url = "https://pypi.python.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz"; | |
|
1920 | md5 = "4553be891efa0812c4adfb0c6e818eec"; | |
|
1921 | }; | |
|
1922 | meta = { | |
|
1923 | license = [ pkgs.lib.licenses.mit ]; | |
|
1924 | }; | |
|
1925 | }; | |
|
1913 | 1926 | uWSGI = super.buildPythonPackage { |
|
1914 | 1927 | name = "uWSGI-2.0.15"; |
|
1915 | 1928 | buildInputs = with self; []; |
@@ -2002,13 +2015,13 b'' | |||
|
2002 | 2015 | }; |
|
2003 | 2016 | }; |
|
2004 | 2017 | ws4py = super.buildPythonPackage { |
|
2005 | name = "ws4py-0.4. | |
|
2018 | name = "ws4py-0.4.3"; | |
|
2006 | 2019 | buildInputs = with self; []; |
|
2007 | 2020 | doCheck = false; |
|
2008 | 2021 | propagatedBuildInputs = with self; []; |
|
2009 | 2022 | src = fetchurl { |
|
2010 | url = "https://pypi.python.org/packages/ | |
|
2011 | md5 = "f0603ae376707a58d205bd87a67758a2"; | |
|
2023 | url = "https://pypi.python.org/packages/fa/a1/33c43a4304ac3b4dc81deb93cbd329de9297dd034d75c47cce64fda806bc/ws4py-0.4.3.tar.gz"; | |
|
2024 | md5 = "d5834cf7d3965bb0da31bbb02bd8513a"; | |
|
2012 | 2025 | }; |
|
2013 | 2026 | meta = { |
|
2014 | 2027 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -2028,52 +2041,52 b'' | |||
|
2028 | 2041 | }; |
|
2029 | 2042 | }; |
|
2030 | 2043 | zope.cachedescriptors = super.buildPythonPackage { |
|
2031 | name = "zope.cachedescriptors-4. | |
|
2044 | name = "zope.cachedescriptors-4.3.1"; | |
|
2032 | 2045 | buildInputs = with self; []; |
|
2033 | 2046 | doCheck = false; |
|
2034 | 2047 | propagatedBuildInputs = with self; [setuptools]; |
|
2035 | 2048 | src = fetchurl { |
|
2036 | url = "https://pypi.python.org/packages/ | |
|
2037 | md5 = "8d308de8c936792c8e758058fcb7d0f0"; | |
|
2049 | url = "https://pypi.python.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz"; | |
|
2050 | md5 = "42f3693f43bc93f3b1eb86940f58acf3"; | |
|
2038 | 2051 | }; |
|
2039 | 2052 | meta = { |
|
2040 | 2053 | license = [ pkgs.lib.licenses.zpt21 ]; |
|
2041 | 2054 | }; |
|
2042 | 2055 | }; |
|
2043 | 2056 | zope.deprecation = super.buildPythonPackage { |
|
2044 | name = "zope.deprecation-4. | |
|
2057 | name = "zope.deprecation-4.3.0"; | |
|
2045 | 2058 | buildInputs = with self; []; |
|
2046 | 2059 | doCheck = false; |
|
2047 | 2060 | propagatedBuildInputs = with self; [setuptools]; |
|
2048 | 2061 | src = fetchurl { |
|
2049 | url = "https://pypi.python.org/packages/ | |
|
2050 | md5 = "e9a663ded58f4f9f7881beb56cae2782"; | |
|
2062 | url = "https://pypi.python.org/packages/a1/18/2dc5e6bfe64fdc3b79411b67464c55bb0b43b127051a20f7f492ab767758/zope.deprecation-4.3.0.tar.gz"; | |
|
2063 | md5 = "2166b2cb7e0e96a21104e6f8f9b696bb"; | |
|
2051 | 2064 | }; |
|
2052 | 2065 | meta = { |
|
2053 | 2066 | license = [ pkgs.lib.licenses.zpt21 ]; |
|
2054 | 2067 | }; |
|
2055 | 2068 | }; |
|
2056 | 2069 | zope.event = super.buildPythonPackage { |
|
2057 | name = "zope.event-4.0.3"; | |
|
2070 | name = "zope.event-4.3.0"; | |
|
2058 | 2071 | buildInputs = with self; []; |
|
2059 | 2072 | doCheck = false; |
|
2060 | 2073 | propagatedBuildInputs = with self; [setuptools]; |
|
2061 | 2074 | src = fetchurl { |
|
2062 | url = "https://pypi.python.org/packages/c1/29/91ba884d7d6d96691df592e9e9c2bfa57a47040ec1ff47eff18c85137152/zope.event-4.0.3.tar.gz"; | |
|
2063 | md5 = "9a3780916332b18b8b85f522bcc3e249"; | |
|
2075 | url = "https://pypi.python.org/packages/9e/d0/54ba59f19a0635f6591b74be259cf6fbf67e73f4edda27b5cd0cf4d26efa/zope.event-4.3.0.tar.gz"; | |
|
2076 | md5 = "8ca737960741c6fd112972f3313303bd"; | |
|
2064 | 2077 | }; |
|
2065 | 2078 | meta = { |
|
2066 | 2079 | license = [ pkgs.lib.licenses.zpt21 ]; |
|
2067 | 2080 | }; |
|
2068 | 2081 | }; |
|
2069 | 2082 | zope.interface = super.buildPythonPackage { |
|
2070 | name = "zope.interface-4. | |
|
2083 | name = "zope.interface-4.4.3"; | |
|
2071 | 2084 | buildInputs = with self; []; |
|
2072 | 2085 | doCheck = false; |
|
2073 | 2086 | propagatedBuildInputs = with self; [setuptools]; |
|
2074 | 2087 | src = fetchurl { |
|
2075 | url = "https://pypi.python.org/packages/ | |
|
2076 | md5 = "9ae3d24c0c7415deb249dd1a132f0f79"; | |
|
2088 | url = "https://pypi.python.org/packages/bd/d2/25349ed41f9dcff7b3baf87bd88a4c82396cf6e02f1f42bb68657a3132af/zope.interface-4.4.3.tar.gz"; | |
|
2089 | md5 = "8700a4f527c1203b34b10c2b4e7a6912"; | |
|
2077 | 2090 | }; |
|
2078 | 2091 | meta = { |
|
2079 | 2092 | license = [ pkgs.lib.licenses.zpt21 ]; |
@@ -5,7 +5,7 b' setuptools-scm==1.15.6' | |||
|
5 | 5 | amqp==2.2.2 |
|
6 | 6 | authomatic==0.1.0.post1 |
|
7 | 7 | Babel==1.3 |
|
8 | Beaker==1.9. | |
|
8 | Beaker==1.9.1 | |
|
9 | 9 | celery==4.1.0 |
|
10 | 10 | Chameleon==2.24 |
|
11 | 11 | channelstream==0.5.2 |
@@ -31,7 +31,7 b' billiard==3.5.0.3' | |||
|
31 | 31 | kombu==4.1.0 |
|
32 | 32 | lxml==3.7.3 |
|
33 | 33 | Mako==1.0.7 |
|
34 | Markdown==2.6. | |
|
34 | Markdown==2.6.11 | |
|
35 | 35 | MarkupSafe==1.0.0 |
|
36 | 36 | msgpack-python==0.4.8 |
|
37 | 37 | MySQL-python==1.2.5 |
@@ -42,7 +42,7 b' PasteDeploy==1.5.2' | |||
|
42 | 42 | PasteScript==2.0.2 |
|
43 | 43 | pathlib2==2.3.0 |
|
44 | 44 | peppercorn==0.5 |
|
45 | psutil==5.4.0 | |
|
45 | psutil==5.4.3 | |
|
46 | 46 | psycopg2==2.7.3.2 |
|
47 | 47 | py-bcrypt==0.4 |
|
48 | 48 | pycrypto==2.6.1 |
@@ -61,7 +61,8 b' python-dateutil' | |||
|
61 | 61 | python-ldap==2.4.45 |
|
62 | 62 | python-memcached==1.58 |
|
63 | 63 | python-pam==1.8.2 |
|
64 | pytz==2017.3 | |
|
64 | pytz==2018.3 | |
|
65 | tzlocal==1.5.1 | |
|
65 | 66 | pyzmq==14.6.0 |
|
66 | 67 | py-gfm==0.1.3 |
|
67 | 68 | recaptcha-client==1.0.6 |
@@ -75,7 +76,7 b' six==1.11.0' | |||
|
75 | 76 | SQLAlchemy==1.1.15 |
|
76 | 77 | sshpubkeys==2.2.0 |
|
77 | 78 | subprocess32==3.2.7 |
|
78 | supervisor==3.3.3 | |
|
79 | supervisor==3.3.4 | |
|
79 | 80 | Tempita==0.5.2 |
|
80 | 81 | translationstring==1.3 |
|
81 | 82 | trollius==1.0.4 |
@@ -88,29 +89,29 b' WebHelpers==1.3' | |||
|
88 | 89 | WebOb==1.7.4 |
|
89 | 90 | Whoosh==2.7.4 |
|
90 | 91 | wsgiref==0.1.2 |
|
91 | zope.cachedescriptors==4. | |
|
92 | zope.deprecation==4. | |
|
93 | zope.event==4.0.3 | |
|
94 | zope.interface==4. | |
|
92 | zope.cachedescriptors==4.3.1 | |
|
93 | zope.deprecation==4.3.0 | |
|
94 | zope.event==4.3.0 | |
|
95 | zope.interface==4.4.3 | |
|
95 | 96 | |
|
96 | 97 | |
|
97 | 98 | # IPYTHON RENDERING |
|
98 | 99 | # entrypoints backport, pypi version doesn't support egg installs |
|
99 | 100 | https://code.rhodecode.com/upstream/entrypoints/archive/96e6d645684e1af3d7df5b5272f3fe85a546b233.tar.gz?md5=7db37771aea9ac9fefe093e5d6987313#egg=entrypoints==0.2.2.rhodecode-upstream1 |
|
100 | 101 | nbconvert==5.3.1 |
|
101 | bleach==2.1. | |
|
102 | bleach==2.1.2 | |
|
102 | 103 | nbformat==4.4.0 |
|
103 | 104 | jupyter_client==5.0.0 |
|
104 | 105 | |
|
105 | 106 | ## cli tools |
|
106 | alembic==0.9. | |
|
107 | alembic==0.9.8 | |
|
107 | 108 | invoke==0.13.0 |
|
108 | 109 | bumpversion==0.5.3 |
|
109 | 110 | transifex-client==0.12.5 |
|
110 | 111 | |
|
111 | 112 | ## http servers |
|
112 | 113 | gevent==1.2.2 |
|
113 | greenlet==0.4.1 | |
|
114 | greenlet==0.4.13 | |
|
114 | 115 | gunicorn==19.7.1 |
|
115 | 116 | waitress==1.1.0 |
|
116 | 117 | uWSGI==2.0.15 |
@@ -125,7 +126,7 b' bottle==0.12.13' | |||
|
125 | 126 | https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.14.1.tar.gz?md5=0b9c2caad160b68889f8172ea54af7b2#egg=rhodecode-tools==0.14.1 |
|
126 | 127 | |
|
127 | 128 | ## appenlight |
|
128 | appenlight-client==0.6.2 | |
|
129 | appenlight-client==0.6.25 | |
|
129 | 130 | |
|
130 | 131 | ## test related requirements |
|
131 | 132 | -r requirements_test.txt |
@@ -51,7 +51,7 b' PYRAMID_SETTINGS = {}' | |||
|
51 | 51 | EXTENSIONS = {} |
|
52 | 52 | |
|
53 | 53 | __version__ = ('.'.join((str(each) for each in VERSION[:3]))) |
|
54 | __dbversion__ = 8 | |
|
54 | __dbversion__ = 86 # defines current db version for migrations | |
|
55 | 55 | __platform__ = platform.system() |
|
56 | 56 | __license__ = 'AGPLv3, and Commercial License' |
|
57 | 57 | __author__ = 'RhodeCode GmbH' |
@@ -42,7 +42,8 b' class TestGetRepoChangeset(object):' | |||
|
42 | 42 | if details == 'full': |
|
43 | 43 | assert result['refs']['bookmarks'] == getattr( |
|
44 | 44 | commit, 'bookmarks', []) |
|
45 | assert result['refs']['branches'] == [commit.branch] | |
|
45 | branches = [commit.branch] if commit.branch else [] | |
|
46 | assert result['refs']['branches'] == branches | |
|
46 | 47 | assert result['refs']['tags'] == commit.tags |
|
47 | 48 | |
|
48 | 49 | @pytest.mark.parametrize("details", ['basic', 'extended', 'full']) |
@@ -33,12 +33,14 b' class TestPull(object):' | |||
|
33 | 33 | def test_api_pull(self, backend): |
|
34 | 34 | r = backend.create_repo() |
|
35 | 35 | repo_name = r.repo_name |
|
36 | | |
|
36 | clone_uri = os.path.join(TESTS_TMP_PATH, backend.repo_name) | |
|
37 | r.clone_uri = clone_uri | |
|
37 | 38 | |
|
38 | 39 | id_, params = build_data(self.apikey, 'pull', repoid=repo_name,) |
|
39 | 40 | response = api_call(self.app, params) |
|
40 | ||
|
41 | expected = {'msg': 'Pulled from `%s`' % (repo_name,), | |
|
41 | msg = 'Pulled from url `%s` on repo `%s`' % ( | |
|
42 | clone_uri, repo_name) | |
|
43 | expected = {'msg': msg, | |
|
42 | 44 | 'repository': repo_name} |
|
43 | 45 | assert_ok(id_, expected, given=response.body) |
|
44 | 46 | |
@@ -47,5 +49,5 b' class TestPull(object):' | |||
|
47 | 49 | self.apikey, 'pull', repoid=backend.repo_name) |
|
48 | 50 | response = api_call(self.app, params) |
|
49 | 51 | |
|
50 | expected = 'Unable to pull changes from ` | |
|
52 | expected = 'Unable to pull changes from `None`' | |
|
51 | 53 | assert_error(id_, expected, given=response.body) |
@@ -56,6 +56,9 b' class TestApiUpdateRepo(object):' | |||
|
56 | 56 | ({'clone_uri': ''}, |
|
57 | 57 | {'clone_uri': ''}), |
|
58 | 58 | |
|
59 | ({'push_uri': ''}, | |
|
60 | {'push_uri': ''}), | |
|
61 | ||
|
59 | 62 | ({'landing_rev': 'rev:tip'}, |
|
60 | 63 | {'landing_rev': ['rev', 'tip']}), |
|
61 | 64 |
@@ -36,7 +36,9 b' class TestUpdateUserGroup(object):' | |||
|
36 | 36 | # ('owner', {'owner': TEST_USER_REGULAR_LOGIN}), |
|
37 | 37 | ('owner_email', {'owner_email': TEST_USER_ADMIN_EMAIL}), |
|
38 | 38 | ('active', {'active': False}), |
|
39 | ('active', {'active': True}) | |
|
39 | ('active', {'active': True}), | |
|
40 | ('sync', {'sync': False}), | |
|
41 | ('sync', {'sync': True}) | |
|
40 | 42 | ]) |
|
41 | 43 | def test_api_update_user_group(self, changing_attr, updates, user_util): |
|
42 | 44 | user_group = user_util.create_user_group() |
@@ -49,6 +51,12 b' class TestUpdateUserGroup(object):' | |||
|
49 | 51 | **updates) |
|
50 | 52 | response = api_call(self.app, params) |
|
51 | 53 | |
|
54 | # special case for sync | |
|
55 | if changing_attr == 'sync' and updates['sync'] is False: | |
|
56 | expected_api_data['sync'] = None | |
|
57 | elif changing_attr == 'sync' and updates['sync'] is True: | |
|
58 | expected_api_data['sync'] = 'manual_api' | |
|
59 | ||
|
52 | 60 | expected = { |
|
53 | 61 | 'msg': 'updated user group ID:%s %s' % ( |
|
54 | 62 | user_group.users_group_id, user_group.users_group_name), |
@@ -63,7 +71,9 b' class TestUpdateUserGroup(object):' | |||
|
63 | 71 | # ('owner', {'owner': TEST_USER_REGULAR_LOGIN}), |
|
64 | 72 | ('owner_email', {'owner_email': TEST_USER_ADMIN_EMAIL}), |
|
65 | 73 | ('active', {'active': False}), |
|
66 | ('active', {'active': True}) | |
|
74 | ('active', {'active': True}), | |
|
75 | ('sync', {'sync': False}), | |
|
76 | ('sync', {'sync': True}) | |
|
67 | 77 | ]) |
|
68 | 78 | def test_api_update_user_group_regular_user( |
|
69 | 79 | self, changing_attr, updates, user_util): |
@@ -72,7 +82,6 b' class TestUpdateUserGroup(object):' | |||
|
72 | 82 | expected_api_data = user_group.get_api_data() |
|
73 | 83 | expected_api_data.update(updates) |
|
74 | 84 | |
|
75 | ||
|
76 | 85 | # grant permission to this user |
|
77 | 86 | user = UserModel().get_by_username(self.TEST_USER_LOGIN) |
|
78 | 87 | |
@@ -82,6 +91,12 b' class TestUpdateUserGroup(object):' | |||
|
82 | 91 | self.apikey_regular, 'update_user_group', |
|
83 | 92 | usergroupid=group_name, **updates) |
|
84 | 93 | response = api_call(self.app, params) |
|
94 | # special case for sync | |
|
95 | if changing_attr == 'sync' and updates['sync'] is False: | |
|
96 | expected_api_data['sync'] = None | |
|
97 | elif changing_attr == 'sync' and updates['sync'] is True: | |
|
98 | expected_api_data['sync'] = 'manual_api' | |
|
99 | ||
|
85 | 100 | expected = { |
|
86 | 101 | 'msg': 'updated user group ID:%s %s' % ( |
|
87 | 102 | user_group.users_group_id, user_group.users_group_name), |
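The sync expectations asserted in the tests above reduce to one mapping: the boolean `sync` flag passed to `update_user_group` is stored as a sync-source marker, not echoed back as a boolean. A minimal sketch of that mapping (the helper name is illustrative, not part of RhodeCode's API):

```python
def expected_sync_value(sync_flag):
    # Mirror the test expectations above: enabling sync via the API
    # records the source 'manual_api'; disabling it clears the value.
    return 'manual_api' if sync_flag else None

assert expected_sync_value(True) == 'manual_api'
assert expected_sync_value(False) is None
```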
@@ -563,6 +563,7 b' def create_repo(' | |||
|
563 | 563 | description=Optional(''), |
|
564 | 564 | private=Optional(False), |
|
565 | 565 | clone_uri=Optional(None), |
|
566 | push_uri=Optional(None), | |
|
566 | 567 | landing_rev=Optional('rev:tip'), |
|
567 | 568 | enable_statistics=Optional(False), |
|
568 | 569 | enable_locking=Optional(False), |
@@ -596,6 +597,8 b' def create_repo(' | |||
|
596 | 597 | :type private: bool |
|
597 | 598 | :param clone_uri: set clone_uri |
|
598 | 599 | :type clone_uri: str |
|
600 | :param push_uri: set push_uri | |
|
601 | :type push_uri: str | |
|
599 | 602 | :param landing_rev: <rev_type>:<rev> |
|
600 | 603 | :type landing_rev: str |
|
601 | 604 | :param enable_locking: |
@@ -639,6 +642,7 b' def create_repo(' | |||
|
639 | 642 | description = Optional.extract(description) |
|
640 | 643 | copy_permissions = Optional.extract(copy_permissions) |
|
641 | 644 | clone_uri = Optional.extract(clone_uri) |
|
645 | push_uri = Optional.extract(push_uri) | |
|
642 | 646 | landing_commit_ref = Optional.extract(landing_rev) |
|
643 | 647 | |
|
644 | 648 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) |
@@ -667,6 +671,7 b' def create_repo(' | |||
|
667 | 671 | repo_description=description, |
|
668 | 672 | repo_landing_commit_ref=landing_commit_ref, |
|
669 | 673 | repo_clone_uri=clone_uri, |
|
674 | repo_push_uri=push_uri, | |
|
670 | 675 | repo_private=private, |
|
671 | 676 | repo_copy_permissions=copy_permissions, |
|
672 | 677 | repo_enable_statistics=enable_statistics, |
@@ -685,6 +690,7 b' def create_repo(' | |||
|
685 | 690 | 'repo_description': schema_data['repo_description'], |
|
686 | 691 | 'repo_private': schema_data['repo_private'], |
|
687 | 692 | 'clone_uri': schema_data['repo_clone_uri'], |
|
693 | 'push_uri': schema_data['repo_push_uri'], | |
|
688 | 694 | 'repo_landing_rev': schema_data['repo_landing_commit_ref'], |
|
689 | 695 | 'enable_statistics': schema_data['repo_enable_statistics'], |
|
690 | 696 | 'enable_locking': schema_data['repo_enable_locking'], |
@@ -692,7 +698,7 b' def create_repo(' | |||
|
692 | 698 | 'repo_copy_permissions': schema_data['repo_copy_permissions'], |
|
693 | 699 | } |
|
694 | 700 | |
|
695 | task = RepoModel().create(form_data=data, cur_user=owner) | |
|
701 | task = RepoModel().create(form_data=data, cur_user=owner.user_id) | |
|
696 | 702 | task_id = get_task_id(task) |
|
697 | 703 | # no commit, it's done in RepoModel, or async via celery |
|
698 | 704 | return { |
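The `create_repo` changes above thread a new `push_uri` argument through the API alongside `clone_uri`. A hedged sketch of a JSON-RPC request body exercising the new parameter; the token, host, and repository values are placeholders:

```python
import json

# Placeholder values; only the 'push_uri' argument is the new piece here.
payload = {
    'id': 1,
    'auth_token': 'SECRET_TOKEN',
    'method': 'create_repo',
    'args': {
        'repo_name': 'my-repo',
        'clone_uri': 'https://upstream.example.com/my-repo',
        'push_uri': 'https://mirror.example.com/my-repo',
    },
}
body = json.dumps(payload)  # POST this to the instance's API endpoint
```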
@@ -800,7 +806,8 b' def remove_field_from_repo(request, apiu' | |||
|
800 | 806 | def update_repo( |
|
801 | 807 | request, apiuser, repoid, repo_name=Optional(None), |
|
802 | 808 | owner=Optional(OAttr('apiuser')), description=Optional(''), |
|
803 | private=Optional(False), | |
|
809 | private=Optional(False), | |
|
810 | clone_uri=Optional(None), push_uri=Optional(None), | |
|
804 | 811 | landing_rev=Optional('rev:tip'), fork_of=Optional(None), |
|
805 | 812 | enable_statistics=Optional(False), |
|
806 | 813 | enable_locking=Optional(False), |
@@ -877,6 +884,9 b' def update_repo(' | |||
|
877 | 884 | clone_uri=clone_uri |
|
878 | 885 | if not isinstance(clone_uri, Optional) else repo.clone_uri, |
|
879 | 886 | |
|
887 | push_uri=push_uri | |
|
888 | if not isinstance(push_uri, Optional) else repo.push_uri, | |
|
889 | ||
|
880 | 890 | repo_landing_rev=landing_rev |
|
881 | 891 | if not isinstance(landing_rev, Optional) else repo._landing_revision, |
|
882 | 892 | |
@@ -910,6 +920,7 b' def update_repo(' | |||
|
910 | 920 | repo_owner=updates['user'], |
|
911 | 921 | repo_description=updates['repo_description'], |
|
912 | 922 | repo_clone_uri=updates['clone_uri'], |
|
923 | repo_push_uri=updates['push_uri'], | |
|
913 | 924 | repo_fork_of=updates['fork_id'], |
|
914 | 925 | repo_private=updates['repo_private'], |
|
915 | 926 | repo_landing_commit_ref=updates['repo_landing_rev'], |
@@ -928,6 +939,7 b' def update_repo(' | |||
|
928 | 939 | repo_description=schema_data['repo_description'], |
|
929 | 940 | repo_private=schema_data['repo_private'], |
|
930 | 941 | clone_uri=schema_data['repo_clone_uri'], |
|
942 | push_uri=schema_data['repo_push_uri'], | |
|
931 | 943 | repo_landing_rev=schema_data['repo_landing_commit_ref'], |
|
932 | 944 | repo_enable_statistics=schema_data['repo_enable_statistics'], |
|
933 | 945 | repo_enable_locking=schema_data['repo_enable_locking'], |
@@ -1084,7 +1096,7 b' def fork_repo(request, apiuser, repoid, ' | |||
|
1084 | 1096 | 'landing_rev': schema_data['repo_landing_commit_ref'], |
|
1085 | 1097 | } |
|
1086 | 1098 | |
|
1087 | task = RepoModel().create_fork(data, cur_user=owner) | |
|
1099 | task = RepoModel().create_fork(data, cur_user=owner.user_id) | |
|
1088 | 1100 | # no commit, it's done in RepoModel, or async via celery |
|
1089 | 1101 | task_id = get_task_id(task) |
|
1090 | 1102 | |
@@ -1749,7 +1761,7 b' def revoke_user_group_permission(request' | |||
|
1749 | 1761 | |
|
1750 | 1762 | |
|
1751 | 1763 | @jsonrpc_method() |
|
1752 | def pull(request, apiuser, repoid): | |
|
1764 | def pull(request, apiuser, repoid, remote_uri=Optional(None)): | |
|
1753 | 1765 | """ |
|
1754 | 1766 | Triggers a pull on the given repository from a remote location. You |
|
1755 | 1767 | can use this to keep remote repositories up-to-date. |
@@ -1764,6 +1776,8 b' def pull(request, apiuser, repoid):' | |||
|
1764 | 1776 | :type apiuser: AuthUser |
|
1765 | 1777 | :param repoid: The repository name or repository ID. |
|
1766 | 1778 | :type repoid: str or int |
|
1779 | :param remote_uri: Optional remote URI to pass in for pull | |
|
1780 | :type remote_uri: str | |
|
1767 | 1781 | |
|
1768 | 1782 | Example output: |
|
1769 | 1783 | |
@@ -1771,7 +1785,7 b' def pull(request, apiuser, repoid):' | |||
|
1771 | 1785 | |
|
1772 | 1786 | id : <id_given_in_input> |
|
1773 | 1787 | result : { |
|
1774 | "msg": "Pulled from `<repository name>`" | |
|
1788 | "msg": "Pulled from url `<remote_url>` on repo `<repository name>`" | |
|
1775 | 1789 | "repository": "<repository name>" |
|
1776 | 1790 | } |
|
1777 | 1791 | error : null |
@@ -1783,27 +1797,31 b' def pull(request, apiuser, repoid):' | |||
|
1783 | 1797 | id : <id_given_in_input> |
|
1784 | 1798 | result : null |
|
1785 | 1799 | error : { |
|
1786 | "Unable to pull changes from `<reponame>`" | 

1800 | "Unable to pull changes from `<remote_url>`" | 
|
1787 | 1801 | } |
|
1788 | 1802 | |
|
1789 | 1803 | """ |
|
1790 | 1804 | |
|
1791 | 1805 | repo = get_repo_or_error(repoid) |
|
1806 | remote_uri = Optional.extract(remote_uri) | |
|
1807 | remote_uri_display = remote_uri or repo.clone_uri_hidden | |
|
1792 | 1808 | if not has_superadmin_permission(apiuser): |
|
1793 | 1809 | _perms = ('repository.admin',) |
|
1794 | 1810 | validate_repo_permissions(apiuser, repoid, repo, _perms) |
|
1795 | 1811 | |
|
1796 | 1812 | try: |
|
1797 | ScmModel().pull_changes(repo.repo_name, apiuser.username) | 

1813 | ScmModel().pull_changes( | 
|
1814 | repo.repo_name, apiuser.username, remote_uri=remote_uri) | |
|
1798 | 1815 | return { |
|
1799 | 'msg': 'Pulled from `%s`' % repo.repo_name, | 

1816 | 'msg': 'Pulled from url `%s` on repo `%s`' % ( | 
|
1817 | remote_uri_display, repo.repo_name), | |
|
1800 | 1818 | 'repository': repo.repo_name |
|
1801 | 1819 | } |
|
1802 | 1820 | except Exception: |
|
1803 | 1821 | log.exception("Exception occurred while trying to " |
|
1804 | 1822 | "pull changes from remote location") |
|
1805 | 1823 | raise JSONRPCError( |
|
1806 | 'Unable to pull changes from `%s`' % repo.repo_name | 

1824 | 'Unable to pull changes from `%s`' % remote_uri_display | 
|
1807 | 1825 | ) |
|
1808 | 1826 | |
|
1809 | 1827 |
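For reference, the extended `pull` method above accepts an optional `remote_uri` through the standard JSON-RPC envelope. A minimal sketch of building such a request payload; the `id`/`auth_token`/`method`/`args` keys follow the usual RhodeCode API conventions, and the token and repository name are placeholders:

```python
import json


def build_pull_payload(auth_token, repoid, remote_uri=None):
    """Build a JSON-RPC payload for the `pull` API method.

    `remote_uri` is optional; when omitted, the server falls back to the
    repository's stored clone URI (see `remote_uri_display` in the diff).
    """
    args = {'repoid': repoid}
    if remote_uri is not None:
        args['remote_uri'] = remote_uri
    return {
        'id': 1,
        'auth_token': auth_token,
        'method': 'pull',
        'args': args,
    }


payload = build_pull_payload('secret-token', 'my-repo',
                             'https://example.com/upstream')
print(json.dumps(payload, sort_keys=True))
```

The same helper without `remote_uri` produces an `args` dict containing only `repoid`, matching the old single-argument behavior.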
@@ -168,7 +168,8 b' def get_user_groups(request, apiuser):' | |||
|
168 | 168 | @jsonrpc_method() |
|
169 | 169 | def create_user_group( |
|
170 | 170 | request, apiuser, group_name, description=Optional(''), |
|
171 | owner=Optional(OAttr('apiuser')), active=Optional(True)): | 

171 | owner=Optional(OAttr('apiuser')), active=Optional(True), | 
|
172 | sync=Optional(None)): | |
|
172 | 173 | """ |
|
173 | 174 | Creates a new user group. |
|
174 | 175 | |
@@ -188,6 +189,9 b' def create_user_group(' | |||
|
188 | 189 | :type owner: Optional(str or int) |
|
189 | 190 | :param active: Set this group as active. |
|
190 | 191 | :type active: Optional(``True`` | ``False``) |
|
192 | :param sync: Enable or disable automatic synchronization from | 

193 | external authentication types such as LDAP. | 
|
194 | :type sync: Optional(``True`` | ``False``) | |
|
191 | 195 | |
|
192 | 196 | Example output: |
|
193 | 197 | |
@@ -227,6 +231,15 b' def create_user_group(' | |||
|
227 | 231 | owner = get_user_or_error(owner) |
|
228 | 232 | active = Optional.extract(active) |
|
229 | 233 | description = Optional.extract(description) |
|
234 | sync = Optional.extract(sync) | |
|
235 | ||
|
236 | # set the sync option based on group_data | |
|
237 | group_data = None | |
|
238 | if sync: | |
|
239 | group_data = { | |
|
240 | 'extern_type': 'manual_api', | |
|
241 | 'extern_type_set_by': apiuser.username | |
|
242 | } | |
|
230 | 243 | |
|
231 | 244 | schema = user_group_schema.UserGroupSchema().bind( |
|
232 | 245 | # user caller |
@@ -246,7 +259,7 b' def create_user_group(' | |||
|
246 | 259 | name=schema_data['user_group_name'], |
|
247 | 260 | description=schema_data['user_group_description'], |
|
248 | 261 | owner=owner, |
|
249 | active=schema_data['user_group_active']) | |
|
262 | active=schema_data['user_group_active'], group_data=group_data) | |
|
250 | 263 | Session().flush() |
|
251 | 264 | creation_data = user_group.get_api_data() |
|
252 | 265 | audit_logger.store_api( |
@@ -265,7 +278,7 b' def create_user_group(' | |||
|
265 | 278 | @jsonrpc_method() |
|
266 | 279 | def update_user_group(request, apiuser, usergroupid, group_name=Optional(''), |
|
267 | 280 | description=Optional(''), owner=Optional(None), |
|
268 | active=Optional(True)): | |
|
281 | active=Optional(True), sync=Optional(None)): | |
|
269 | 282 | """ |
|
270 | 283 | Updates the specified `user group` with the details provided. |
|
271 | 284 | |
@@ -284,6 +297,9 b' def update_user_group(request, apiuser, ' | |||
|
284 | 297 | :type owner: Optional(str or int) |
|
285 | 298 | :param active: Set the group as active. |
|
286 | 299 | :type active: Optional(``True`` | ``False``) |
|
300 | :param sync: Enable or disable automatic synchronization from | 

301 | external authentication types such as LDAP. | 
|
302 | :type sync: Optional(``True`` | ``False``) | |
|
287 | 303 | |
|
288 | 304 | Example output: |
|
289 | 305 | |
@@ -329,8 +345,21 b' def update_user_group(request, apiuser, ' | |||
|
329 | 345 | store_update(updates, description, 'user_group_description') |
|
330 | 346 | store_update(updates, owner, 'user') |
|
331 | 347 | store_update(updates, active, 'users_group_active') |
|
348 | ||
|
349 | sync = Optional.extract(sync) | |
|
350 | group_data = None | |
|
351 | if sync is True: | |
|
352 | group_data = { | |
|
353 | 'extern_type': 'manual_api', | |
|
354 | 'extern_type_set_by': apiuser.username | |
|
355 | } | |
|
356 | if sync is False: | |
|
357 | group_data = user_group.group_data | |
|
358 | if group_data and "extern_type" in group_data: | |
|
359 | del group_data["extern_type"] | |
|
360 | ||
|
332 | 361 | try: |
|
333 | UserGroupModel().update(user_group, updates) | |
|
362 | UserGroupModel().update(user_group, updates, group_data=group_data) | |
|
334 | 363 | audit_logger.store_api( |
|
335 | 364 | 'user_group.edit', action_data={'old_data': old_data}, |
|
336 | 365 | user=apiuser) |
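The three-way `sync` handling added above (`True` marks the group as `manual_api`, `False` strips the `extern_type` marker, `None` leaves the stored data untouched) can be sketched in isolation. `resolve_group_data` below is a hypothetical helper written for illustration, not part of the diff:

```python
def resolve_group_data(sync, current_group_data, api_username):
    """Mirror the three-way sync handling from update_user_group.

    True  -> mark the group as manually synced via the API
    False -> drop any existing extern_type marker from the stored data
    None  -> return None, meaning "do not change group_data"
    """
    if sync is True:
        return {
            'extern_type': 'manual_api',
            'extern_type_set_by': api_username,
        }
    if sync is False:
        group_data = dict(current_group_data or {})
        group_data.pop('extern_type', None)
        return group_data
    return None


print(resolve_group_data(True, None, 'admin'))
```

Note that the `sync is True` / `sync is False` identity checks matter: a `None` default must fall through both branches, which a plain truthiness test would not distinguish from `False`.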
@@ -609,8 +638,18 b' def grant_user_permission_to_user_group(' | |||
|
609 | 638 | perm = get_perm_or_error(perm, prefix='usergroup.') |
|
610 | 639 | |
|
611 | 640 | try: |
|
612 | UserGroupModel().grant_user_permission( | |
|
641 | changes = UserGroupModel().grant_user_permission( | |
|
613 | 642 | user_group=user_group, user=user, perm=perm) |
|
643 | ||
|
644 | action_data = { | |
|
645 | 'added': changes['added'], | |
|
646 | 'updated': changes['updated'], | |
|
647 | 'deleted': changes['deleted'], | |
|
648 | } | |
|
649 | audit_logger.store_api( | |
|
650 | 'user_group.edit.permissions', action_data=action_data, | |
|
651 | user=apiuser) | |
|
652 | ||
|
614 | 653 | Session().commit() |
|
615 | 654 | return { |
|
616 | 655 | 'msg': |
@@ -669,8 +708,17 b' def revoke_user_permission_from_user_gro' | |||
|
669 | 708 | user = get_user_or_error(userid) |
|
670 | 709 | |
|
671 | 710 | try: |
|
672 | UserGroupModel().revoke_user_permission( | |
|
711 | changes = UserGroupModel().revoke_user_permission( | |
|
673 | 712 | user_group=user_group, user=user) |
|
713 | action_data = { | |
|
714 | 'added': changes['added'], | |
|
715 | 'updated': changes['updated'], | |
|
716 | 'deleted': changes['deleted'], | |
|
717 | } | |
|
718 | audit_logger.store_api( | |
|
719 | 'user_group.edit.permissions', action_data=action_data, | |
|
720 | user=apiuser) | |
|
721 | ||
|
674 | 722 | Session().commit() |
|
675 | 723 | return { |
|
676 | 724 | 'msg': 'Revoked perm for user: `%s` in user group: `%s`' % ( |
@@ -735,11 +783,20 b' def grant_user_group_permission_to_user_' | |||
|
735 | 783 | 'user group `%s` does not exist' % (sourceusergroupid,)) |
|
736 | 784 | |
|
737 | 785 | try: |
|
738 | UserGroupModel().grant_user_group_permission( | |
|
786 | changes = UserGroupModel().grant_user_group_permission( | |
|
739 | 787 | target_user_group=target_user_group, |
|
740 | 788 | user_group=user_group, perm=perm) |
|
789 | ||
|
790 | action_data = { | |
|
791 | 'added': changes['added'], | |
|
792 | 'updated': changes['updated'], | |
|
793 | 'deleted': changes['deleted'], | |
|
794 | } | |
|
795 | audit_logger.store_api( | |
|
796 | 'user_group.edit.permissions', action_data=action_data, | |
|
797 | user=apiuser) | |
|
798 | ||
|
741 | 799 | Session().commit() |
|
742 | ||
|
743 | 800 | return { |
|
744 | 801 | 'msg': 'Granted perm: `%s` for user group: `%s` ' |
|
745 | 802 | 'in user group: `%s`' % ( |
@@ -806,8 +863,17 b' def revoke_user_group_permission_from_us' | |||
|
806 | 863 | 'user group `%s` does not exist' % (sourceusergroupid,)) |
|
807 | 864 | |
|
808 | 865 | try: |
|
809 | UserGroupModel().revoke_user_group_permission( | |
|
866 | changes = UserGroupModel().revoke_user_group_permission( | |
|
810 | 867 | target_user_group=target_user_group, user_group=user_group) |
|
868 | action_data = { | |
|
869 | 'added': changes['added'], | |
|
870 | 'updated': changes['updated'], | |
|
871 | 'deleted': changes['deleted'], | |
|
872 | } | |
|
873 | audit_logger.store_api( | |
|
874 | 'user_group.edit.permissions', action_data=action_data, | |
|
875 | user=apiuser) | |
|
876 | ||
|
811 | 877 | Session().commit() |
|
812 | 878 | |
|
813 | 879 | return { |
@@ -22,9 +22,9 b' import time' | |||
|
22 | 22 | import logging |
|
23 | 23 | import operator |
|
24 | 24 | |
|
25 | from pyramid.httpexceptions import HTTPFound | |
|
25 | from pyramid.httpexceptions import HTTPFound, HTTPForbidden | |
|
26 | 26 | |
|
27 | from rhodecode.lib import helpers as h | |
|
27 | from rhodecode.lib import helpers as h, diffs | |
|
28 | 28 | from rhodecode.lib.utils2 import StrictAttributeDict, safe_int, datetime_to_time |
|
29 | 29 | from rhodecode.lib.vcs.exceptions import RepositoryRequirementError |
|
30 | 30 | from rhodecode.model import repo |
@@ -33,6 +33,7 b' from rhodecode.model import user_group' | |||
|
33 | 33 | from rhodecode.model import user |
|
34 | 34 | from rhodecode.model.db import User |
|
35 | 35 | from rhodecode.model.scm import ScmModel |
|
36 | from rhodecode.model.settings import VcsSettingsModel | |
|
36 | 37 | |
|
37 | 38 | log = logging.getLogger(__name__) |
|
38 | 39 | |
@@ -204,28 +205,47 b' class RepoAppView(BaseAppView):' | |||
|
204 | 205 | c.rhodecode_db_repo = self.db_repo |
|
205 | 206 | c.repo_name = self.db_repo_name |
|
206 | 207 | c.repository_pull_requests = self.db_repo_pull_requests |
|
208 | self.path_filter = PathFilter(None) | |
|
207 | 209 | |
|
208 | c.repository_requirements_missing = False | 

210 | c.repository_requirements_missing = {} | 
|
209 | 211 | try: |
|
210 | 212 | self.rhodecode_vcs_repo = self.db_repo.scm_instance() |
|
213 | if self.rhodecode_vcs_repo: | |
|
214 | path_perms = self.rhodecode_vcs_repo.get_path_permissions( | |
|
215 | c.auth_user.username) | |
|
216 | self.path_filter = PathFilter(path_perms) | |
|
211 | 217 | except RepositoryRequirementError as e: |
|
212 |
c.repository_requirements_missing = |
|
|
218 | c.repository_requirements_missing = {'error': str(e)} | |
|
213 | 219 | self._handle_missing_requirements(e) |
|
214 | 220 | self.rhodecode_vcs_repo = None |
|
215 | 221 | |
|
216 | if (not c.repository_requirements_missing | |
|
217 | and self.rhodecode_vcs_repo is None): | |
|
222 | c.path_filter = self.path_filter # used by atom_feed_entry.mako | |
|
223 | ||
|
224 | if self.rhodecode_vcs_repo is None: | |
|
218 | 225 | # unable to fetch this repo as vcs instance, report back to user |
|
219 | 226 | h.flash(_( |
|
220 | 227 | "The repository `%(repo_name)s` cannot be loaded in filesystem. " |
|
221 | 228 | "Please check if it exist, or is not damaged.") % |
|
222 | 229 | {'repo_name': c.repo_name}, |
|
223 | 230 | category='error', ignore_duplicate=True) |
|
231 | if c.repository_requirements_missing: | |
|
232 | route = self.request.matched_route.name | |
|
233 | if route.startswith(('edit_repo', 'repo_summary')): | |
|
234 | # allow summary and edit repo on missing requirements | |
|
235 | return c | |
|
236 | ||
|
237 | raise HTTPFound( | |
|
238 | h.route_path('repo_summary', repo_name=self.db_repo_name)) | |
|
239 | ||
|
240 | else: # redirect if we don't show missing requirements | |
|
224 | 241 | raise HTTPFound(h.route_path('home')) |
|
225 | 242 | |
|
226 | 243 | return c |
|
227 | 244 | |
|
228 | def _get_f_path(self, matchdict, default=None): | |
|
245 | def _get_f_path_unchecked(self, matchdict, default=None): | |
|
246 | """ | |
|
247 | Should only be used by redirects, everything else should call _get_f_path | |
|
248 | """ | |
|
229 | 249 | f_path = matchdict.get('f_path') |
|
230 | 250 | if f_path: |
|
231 | 251 | # fix for multiple initial slashes that causes errors for GIT |
@@ -233,6 +253,63 b' class RepoAppView(BaseAppView):' | |||
|
233 | 253 | |
|
234 | 254 | return default |
|
235 | 255 | |
|
256 | def _get_f_path(self, matchdict, default=None): | |
|
257 | f_path_match = self._get_f_path_unchecked(matchdict, default) | |
|
258 | return self.path_filter.assert_path_permissions(f_path_match) | |
|
259 | ||
|
260 | def _get_general_setting(self, target_repo, settings_key, default=False): | |
|
261 | settings_model = VcsSettingsModel(repo=target_repo) | |
|
262 | settings = settings_model.get_general_settings() | |
|
263 | return settings.get(settings_key, default) | |
|
264 | ||
|
265 | ||
|
266 | class PathFilter(object): | |
|
267 | ||
|
268 | # Expects an instance of BasePathPermissionChecker or None | 
|
269 | def __init__(self, permission_checker): | |
|
270 | self.permission_checker = permission_checker | |
|
271 | ||
|
272 | def assert_path_permissions(self, path): | |
|
273 | if path and self.permission_checker and not self.permission_checker.has_access(path): | |
|
274 | raise HTTPForbidden() | |
|
275 | return path | |
|
276 | ||
|
277 | def filter_patchset(self, patchset): | |
|
278 | if not self.permission_checker or not patchset: | |
|
279 | return patchset, False | |
|
280 | had_filtered = False | |
|
281 | filtered_patchset = [] | |
|
282 | for patch in patchset: | |
|
283 | filename = patch.get('filename', None) | |
|
284 | if not filename or self.permission_checker.has_access(filename): | |
|
285 | filtered_patchset.append(patch) | |
|
286 | else: | |
|
287 | had_filtered = True | |
|
288 | if had_filtered: | |
|
289 | if isinstance(patchset, diffs.LimitedDiffContainer): | |
|
290 | filtered_patchset = diffs.LimitedDiffContainer(patchset.diff_limit, patchset.cur_diff_size, filtered_patchset) | |
|
291 | return filtered_patchset, True | |
|
292 | else: | |
|
293 | return patchset, False | |
|
294 | ||
|
295 | def render_patchset_filtered(self, diffset, patchset, source_ref=None, target_ref=None): | |
|
296 | filtered_patchset, has_hidden_changes = self.filter_patchset(patchset) | |
|
297 | result = diffset.render_patchset(filtered_patchset, source_ref=source_ref, target_ref=target_ref) | |
|
298 | result.has_hidden_changes = has_hidden_changes | |
|
299 | return result | |
|
300 | ||
|
301 | def get_raw_patch(self, diff_processor): | |
|
302 | if self.permission_checker is None: | |
|
303 | return diff_processor.as_raw() | |
|
304 | elif self.permission_checker.has_full_access: | |
|
305 | return diff_processor.as_raw() | |
|
306 | else: | |
|
307 | return '# Repository has user-specific filters, raw patch generation is disabled.' | |
|
308 | ||
|
309 | @property | |
|
310 | def is_enabled(self): | |
|
311 | return self.permission_checker is not None | |
|
312 | ||
|
236 | 313 | |
|
237 | 314 | class RepoGroupAppView(BaseAppView): |
|
238 | 315 | def __init__(self, context, request): |
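The `PathFilter.filter_patchset` logic introduced above drops patches whose filename the checker rejects and reports whether anything was hidden. A standalone sketch with a toy checker; `AllowListChecker` is illustrative only, not RhodeCode's `BasePathPermissionChecker`:

```python
class AllowListChecker(object):
    """Toy permission checker: grants access only to an explicit allow-list."""

    def __init__(self, allowed):
        self.allowed = set(allowed)

    def has_access(self, path):
        return path in self.allowed


def filter_patchset(checker, patchset):
    """Same contract as PathFilter.filter_patchset:
    returns (filtered_patchset, had_filtered)."""
    if not checker or not patchset:
        return patchset, False
    # patches without a filename are kept, matching the diff's behavior
    filtered = [p for p in patchset
                if not p.get('filename') or checker.has_access(p['filename'])]
    return filtered, len(filtered) != len(patchset)


checker = AllowListChecker(['docs/index.rst'])
patchset = [{'filename': 'docs/index.rst'}, {'filename': 'secret/key.pem'}]
visible, had_filtered = filter_patchset(checker, patchset)
print(visible, had_filtered)
```

The `had_filtered` flag is what lets the view render a "repository has hidden changes" notice instead of silently shrinking the diff.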
@@ -169,6 +169,11 b' def admin_routes(config):' | |||
|
169 | 169 | name='admin_settings_labs_update', |
|
170 | 170 | pattern='/settings/labs/update') |
|
171 | 171 | |
|
172 | # Automation EE feature | |
|
173 | config.add_route( | |
|
174 | 'admin_settings_automation', | |
|
175 | pattern=ADMIN_PREFIX + '/settings/automation') | |
|
176 | ||
|
172 | 177 | # global permissions |
|
173 | 178 | |
|
174 | 179 | config.add_route( |
@@ -95,7 +95,8 b' class NavigationRegistry(object):' | |||
|
95 | 95 | 'admin_settings_sessions'), |
|
96 | 96 | NavEntry('open_source', _('Open Source Licenses'), |
|
97 | 97 | 'admin_settings_open_source'), |
|
98 | ||
|
98 | NavEntry('automation', _('Automation'), | |
|
99 | 'admin_settings_automation') | |
|
99 | 100 | ] |
|
100 | 101 | |
|
101 | 102 | _labs_entry = NavEntry('labs', _('Labs'), |
@@ -86,6 +86,7 b' class TestAuthSettingsView(object):' | |||
|
86 | 86 | |
|
87 | 87 | 'host': 'dc.example.com', |
|
88 | 88 | 'port': '999', |
|
89 | 'timeout': 3600, | |
|
89 | 90 | 'tls_kind': 'PLAIN', |
|
90 | 91 | 'tls_reqcert': 'NEVER', |
|
91 | 92 |
@@ -67,6 +67,7 b' class TestAdminUsersSshKeysView(TestCont' | |||
|
67 | 67 | 'I4fG8+hBHzpeFxUGvSGNtXPUbwaAY8j/oHYrTpMgkj6pUEFsiKfC5zPq' \ |
|
68 | 68 | 'PFR5HyKTCHW0nFUJnZsbyFT5hMiF/hZkJc9A0ZbdSvJwCRQ/g3bmdL ' \ |
|
69 | 69 | 'your_email@example.com' |
|
70 | FINGERPRINT = 'MD5:01:4f:ad:29:22:6e:01:37:c9:d2:52:26:52:b0:2d:93' | |
|
70 | 71 | |
|
71 | 72 | def test_ssh_keys_default_user(self): |
|
72 | 73 | self.log_user() |
@@ -111,9 +112,11 b' class TestAdminUsersSshKeysView(TestCont' | |||
|
111 | 112 | route_path('edit_user_ssh_keys_add', user_id=user_id), |
|
112 | 113 | {'description': desc, 'key_data': key_data, |
|
113 | 114 | 'csrf_token': self.csrf_token}) |
|
115 | ||
|
116 | err = 'Such key with fingerprint `{}` already exists, ' \ | |
|
117 | 'please use a different one'.format(self.FINGERPRINT) | |
|
114 | 118 | assert_session_flash(response, 'An error occurred during ssh key ' |
|
115 | 'saving: Such key already exists, ' | 

116 | 'please use a different one') | 
|
119 | 'saving: {}'.format(err)) | |
|
117 | 120 | |
|
118 | 121 | def test_add_ssh_key(self, user_util): |
|
119 | 122 | self.log_user() |
@@ -28,7 +28,7 b' from rhodecode.apps._base import BaseApp' | |||
|
28 | 28 | from rhodecode.apps.admin.navigation import navigation_list |
|
29 | 29 | from rhodecode.lib.auth import ( |
|
30 | 30 | LoginRequired, HasPermissionAllDecorator, CSRFRequired) |
|
31 | from rhodecode.lib.utils2 import safe_int | |
|
31 | from rhodecode.lib.utils2 import safe_int, StrictAttributeDict | |
|
32 | 32 | |
|
33 | 33 | log = logging.getLogger(__name__) |
|
34 | 34 | |
@@ -36,8 +36,40 b' log = logging.getLogger(__name__)' | |||
|
36 | 36 | class AdminProcessManagementView(BaseAppView): |
|
37 | 37 | def load_default_context(self): |
|
38 | 38 | c = self._get_local_tmpl_context() |
|
39 | return c | |
|
39 | 40 | |
|
40 | return c | |
|
41 | def _format_proc(self, proc, with_children=False): | |
|
42 | try: | |
|
43 | mem = proc.memory_info() | |
|
44 | proc_formatted = StrictAttributeDict({ | |
|
45 | 'pid': proc.pid, | |
|
46 | 'name': proc.name(), | |
|
47 | 'mem_rss': mem.rss, | |
|
48 | 'mem_vms': mem.vms, | |
|
49 | 'cpu_percent': proc.cpu_percent(), | |
|
50 | 'create_time': proc.create_time(), | |
|
51 | 'cmd': ' '.join(proc.cmdline()), | |
|
52 | }) | |
|
53 | ||
|
54 | if with_children: | |
|
55 | proc_formatted.update({ | |
|
56 | 'children': [self._format_proc(x) | |
|
57 | for x in proc.children(recursive=True)] | |
|
58 | }) | |
|
59 | except Exception: | |
|
60 | log.exception('Failed to load proc') | |
|
61 | proc_formatted = None | |
|
62 | return proc_formatted | |
|
63 | ||
|
64 | def get_processes(self): | |
|
65 | proc_list = [] | |
|
66 | for p in psutil.process_iter(): | |
|
67 | if 'gunicorn' in p.name(): | |
|
68 | proc = self._format_proc(p, with_children=True) | |
|
69 | if proc: | |
|
70 | proc_list.append(proc) | |
|
71 | ||
|
72 | return proc_list | |
|
41 | 73 | |
|
42 | 74 | @LoginRequired() |
|
43 | 75 | @HasPermissionAllDecorator('hg.admin') |
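The new `_format_proc` above deliberately wraps every per-process read in a try/except, since a process can exit or deny access between `psutil.process_iter()` and the attribute reads. The same defensive pattern can be sketched without psutil; `FakeProc` here is a stand-in for `psutil.Process`, and `format_proc` only reads two fields for brevity:

```python
import logging

log = logging.getLogger(__name__)


def format_proc(proc):
    """Return a plain dict of process info, or None if the process
    disappeared or denied access mid-read."""
    try:
        return {'pid': proc.pid, 'name': proc.name()}
    except Exception:
        log.exception('Failed to load proc')
        return None


class FakeProc(object):
    """Minimal stand-in for psutil.Process used for illustration."""

    def __init__(self, pid, name, broken=False):
        self.pid = pid
        self._name = name
        self._broken = broken

    def name(self):
        if self._broken:
            raise OSError('process gone')
        return self._name


procs = [FakeProc(1, 'gunicorn'), FakeProc(2, 'gunicorn', broken=True)]
formatted = [p for p in (format_proc(x) for x in procs) if p]
print(formatted)
```

Returning `None` for unreadable processes and filtering afterwards keeps one flaky process from taking down the whole listing, which is exactly why the view replaced the bare generator expression with `get_processes()`.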
@@ -50,8 +82,7 b' class AdminProcessManagementView(BaseApp' | |||
|
50 | 82 | |
|
51 | 83 | c.active = 'process_management' |
|
52 | 84 | c.navlist = navigation_list(self.request) |
|
53 | c.gunicorn_processes = ( | |
|
54 | p for p in psutil.process_iter() if 'gunicorn' in p.name()) | |
|
85 | c.gunicorn_processes = self.get_processes() | |
|
55 | 86 | return self._get_template_context(c) |
|
56 | 87 | |
|
57 | 88 | @LoginRequired() |
@@ -62,8 +93,7 b' class AdminProcessManagementView(BaseApp' | |||
|
62 | 93 | def process_management_data(self): |
|
63 | 94 | _ = self.request.translate |
|
64 | 95 | c = self.load_default_context() |
|
65 | c.gunicorn_processes = ( | |
|
66 | p for p in psutil.process_iter() if 'gunicorn' in p.name()) | |
|
96 | c.gunicorn_processes = self.get_processes() | |
|
67 | 97 | return self._get_template_context(c) |
|
68 | 98 | |
|
69 | 99 | @LoginRequired() |
@@ -75,9 +105,10 b' class AdminProcessManagementView(BaseApp' | |||
|
75 | 105 | def process_management_signal(self): |
|
76 | 106 | pids = self.request.json.get('pids', []) |
|
77 | 107 | result = [] |
|
108 | ||
|
78 | 109 | def on_terminate(proc): |
|
79 | 110 | msg = "process `PID:{}` terminated with exit code {}".format( |
|
80 | proc.pid, proc.returncode) | |
|
111 | proc.pid, proc.returncode or 0) | |
|
81 | 112 | result.append(msg) |
|
82 | 113 | |
|
83 | 114 | procs = [] |
@@ -91,15 +122,22 b' class AdminProcessManagementView(BaseApp' | |||
|
91 | 122 | |
|
92 | 123 | children = proc.children(recursive=True) |
|
93 | 124 | if children: |
|
94 |
|
|
|
125 | log.warning('Wont kill Master Process') | |
|
95 | 126 | else: |
|
96 | 127 | procs.append(proc) |
|
97 | 128 | |
|
98 | 129 | for p in procs: |
|
130 | try: | |
|
99 | 131 | p.terminate() |
|
132 | except psutil.AccessDenied as e: | |
|
133 | log.warning('Access denied: {}'.format(e)) | |
|
134 | ||
|
100 | 135 | gone, alive = psutil.wait_procs(procs, timeout=10, callback=on_terminate) |
|
101 | 136 | for p in alive: |
|
137 | try: | |
|
102 | 138 | p.kill() |
|
139 | except psutil.AccessDenied as e: | |
|
140 | log.warning('Access denied: {}'.format(e)) | |
|
103 | 141 | |
|
104 | 142 | return {'result': result} |
|
105 | 143 |
@@ -314,6 +314,9 b' class AdminSettingsView(BaseAppView):' | |||
|
314 | 314 | try: |
|
315 | 315 | form_result = application_form.to_python(dict(self.request.POST)) |
|
316 | 316 | except formencode.Invalid as errors: |
|
317 | h.flash( | |
|
318 | _("Some form inputs contain invalid data."), | |
|
319 | category='error') | |
|
317 | 320 | data = render('rhodecode:templates/admin/settings/settings.mako', |
|
318 | 321 | self._get_template_context(c), self.request) |
|
319 | 322 | html = formencode.htmlfill.render( |
@@ -386,6 +389,9 b' class AdminSettingsView(BaseAppView):' | |||
|
386 | 389 | try: |
|
387 | 390 | form_result = application_form.to_python(dict(self.request.POST)) |
|
388 | 391 | except formencode.Invalid as errors: |
|
392 | h.flash( | |
|
393 | _("Some form inputs contain invalid data."), | |
|
394 | category='error') | |
|
389 | 395 | data = render('rhodecode:templates/admin/settings/settings.mako', |
|
390 | 396 | self._get_template_context(c), self.request) |
|
391 | 397 | html = formencode.htmlfill.render( |
@@ -669,6 +675,17 b' class AdminSettingsView(BaseAppView):' | |||
|
669 | 675 | @LoginRequired() |
|
670 | 676 | @HasPermissionAllDecorator('hg.admin') |
|
671 | 677 | @view_config( |
|
678 | route_name='admin_settings_automation', request_method='GET', | |
|
679 | renderer='rhodecode:templates/admin/settings/settings.mako') | |
|
680 | def settings_automation(self): | |
|
681 | c = self.load_default_context() | |
|
682 | c.active = 'automation' | |
|
683 | ||
|
684 | return self._get_template_context(c) | |
|
685 | ||
|
686 | @LoginRequired() | |
|
687 | @HasPermissionAllDecorator('hg.admin') | |
|
688 | @view_config( | |
|
672 | 689 | route_name='admin_settings_labs', request_method='GET', |
|
673 | 690 | renderer='rhodecode:templates/admin/settings/settings.mako') |
|
674 | 691 | def settings_labs(self): |
@@ -705,7 +722,7 b' class AdminSettingsView(BaseAppView):' | |||
|
705 | 722 | form_result = application_form.to_python(dict(self.request.POST)) |
|
706 | 723 | except formencode.Invalid as errors: |
|
707 | 724 | h.flash( |
|
708 |
_( |
|
|
725 | _("Some form inputs contain invalid data."), | |
|
709 | 726 | category='error') |
|
710 | 727 | data = render('rhodecode:templates/admin/settings/settings.mako', |
|
711 | 728 | self._get_template_context(c), self.request) |
@@ -123,6 +123,9 b' class AdminSystemInfoSettingsView(BaseAp' | |||
|
123 | 123 | (_('Uptime'), val('uptime')['text'], state('uptime')), |
|
124 | 124 | ('', '', ''), # spacer |
|
125 | 125 | |
|
126 | # ulimit | |
|
127 | (_('Ulimit'), val('ulimit')['text'], state('ulimit')), | |
|
128 | ||
|
126 | 129 | # Repo storage |
|
127 | 130 | (_('Storage location'), val('storage')['path'], state('storage')), |
|
128 | 131 | (_('Storage info'), val('storage')['text'], state('storage')), |
@@ -88,8 +88,8 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
88 | 88 | _render = self.request.get_partial_renderer( |
|
89 | 89 | 'rhodecode:templates/data_table/_dt_elements.mako') |
|
90 | 90 | |
|
91 | def user_group_name(user_group_id, user_group_name): | 

92 | return _render("user_group_name", user_group_id, user_group_name) | 
|
|
91 | def user_group_name(user_group_name): | |
|
92 | return _render("user_group_name", user_group_name) | |
|
93 | 93 | |
|
94 | 94 | def user_group_actions(user_group_id, user_group_name): |
|
95 | 95 | return _render("user_group_actions", user_group_id, user_group_name) |
@@ -153,15 +153,14 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
153 | 153 | user_groups_data = [] |
|
154 | 154 | for user_gr in auth_user_group_list: |
|
155 | 155 | user_groups_data.append({ |
|
156 | "users_group_name": user_group_name( | |
|
157 | user_gr.users_group_id, h.escape(user_gr.users_group_name)), | |
|
156 | "users_group_name": user_group_name(user_gr.users_group_name), | |
|
158 | 157 | "name_raw": h.escape(user_gr.users_group_name), |
|
159 | 158 | "description": h.escape(user_gr.user_group_description), |
|
160 | 159 | "members": user_gr.member_count, |
|
161 | 160 | # NOTE(marcink): because of advanced query we |
|
162 | 161 | # need to load it like that |
|
163 | "sync": UserGroup._load_group_data( | 

164 | user_gr.group_data) | 
|
|
162 | "sync": UserGroup._load_sync( | |
|
163 | UserGroup._load_group_data(user_gr.group_data)), | |
|
165 | 164 | "active": h.bool2icon(user_gr.users_group_active), |
|
166 | 165 | "owner": user_profile(user_gr.User.username), |
|
167 | 166 | "action": user_group_actions( |
@@ -817,6 +817,7 b' class UsersView(UserAppView):' | |||
|
817 | 817 | key_data = self.request.POST.get('key_data') |
|
818 | 818 | description = self.request.POST.get('description') |
|
819 | 819 | |
|
820 | fingerprint = 'unknown' | |
|
820 | 821 | try: |
|
821 | 822 | if not key_data: |
|
822 | 823 | raise ValueError('Please add a valid public key') |
@@ -841,8 +842,9 b' class UsersView(UserAppView):' | |||
|
841 | 842 | |
|
842 | 843 | except IntegrityError: |
|
843 | 844 | log.exception("Exception during ssh key saving") |
|
844 | h.flash(_('An error occurred during ssh key saving: {}').format( | |
|
845 | 'Such key already exists, please use a different one'), | 
|
|
845 | err = 'Such key with fingerprint `{}` already exists, ' \ | |
|
846 | 'please use a different one'.format(fingerprint) | |
|
847 | h.flash(_('An error occurred during ssh key saving: {}').format(err), | |
|
846 | 848 | category='error') |
|
847 | 849 | except Exception as e: |
|
848 | 850 | log.exception("Exception during ssh key saving") |
@@ -109,7 +109,7 b' class TestMyAccountEdit(TestController):' | |||
|
109 | 109 | # ('extern_name', {'extern_name': None}), |
|
110 | 110 | ('active', {'active': False}), |
|
111 | 111 | ('active', {'active': True}), |
|
112 | ('email', {'email': 'some@email.com'}), | |
|
112 | ('email', {'email': u'some@email.com'}), | |
|
113 | 113 | ]) |
|
114 | 114 | def test_my_account_update(self, name, attrs, user_util): |
|
115 | 115 | usr = user_util.create_user(password='qweqwe') |
@@ -120,13 +120,17 b' class TestMyAccountEdit(TestController):' | |||
|
120 | 120 | |
|
121 | 121 | params.update({'password_confirmation': ''}) |
|
122 | 122 | params.update({'new_password': ''}) |
|
123 | params.update({'extern_type': 'rhodecode'}) | |
|
124 | params.update({'extern_name': 'rhodecode'}) | |
|
123 | params.update({'extern_type': u'rhodecode'}) | |
|
124 | params.update({'extern_name': u'rhodecode'}) | |
|
125 | 125 | params.update({'csrf_token': self.csrf_token}) |
|
126 | 126 | |
|
127 | 127 | params.update(attrs) |
|
128 | 128 | # my account page cannot set language param yet, only for admins |
|
129 | 129 | del params['language'] |
|
130 | if name == 'email': | |
|
131 | uem = user_util.create_additional_user_email(usr, attrs['email']) | |
|
132 | email_before = User.get(user_id).email | |
|
133 | ||
|
130 | 134 | response = self.app.post(route_path('my_account_update'), params) |
|
131 | 135 | |
|
132 | 136 | assert_session_flash( |
@@ -146,7 +150,7 b' class TestMyAccountEdit(TestController):' | |||
|
146 | 150 | params['language'] = updated_params['language'] |
|
147 | 151 | |
|
148 | 152 | if name == 'email': |
|
149 | params['emails'] = [attrs['email']] | |
|
153 | params['emails'] = [attrs['email'], email_before] | |
|
150 | 154 | if name == 'extern_type': |
|
151 | 155 | # cannot update this via form, expected value is original one |
|
152 | 156 | params['extern_type'] = "rhodecode" |
@@ -162,10 +166,10 b' class TestMyAccountEdit(TestController):' | |||
|
162 | 166 | |
|
163 | 167 | assert params == updated_params |
|
164 | 168 | |
|
165 | def test_my_account_update_err_email_exists(self): | |
|
169 | def test_my_account_update_err_email_not_exists_in_emails(self): | |
|
166 | 170 | self.log_user() |
|
167 | 171 | |
|
168 | new_email = 'test_regular@mail.com' # already existing email | 
|
|
172 | new_email = 'test_regular@mail.com' # not in emails | |
|
169 | 173 | params = { |
|
170 | 174 | 'username': 'test_admin', |
|
171 | 175 | 'new_password': 'test12', |
@@ -179,7 +183,7 b' class TestMyAccountEdit(TestController):' | |||
|
179 | 183 | response = self.app.post(route_path('my_account_update'), |
|
180 | 184 | params=params) |
|
181 | 185 | |
|
182 | response.mustcontain('This e-mail address is already taken') | |
|
186 | response.mustcontain('"test_regular@mail.com" is not one of test_admin@mail.com') | |
|
183 | 187 | |
|
184 | 188 | def test_my_account_update_bad_email_address(self): |
|
185 | 189 | self.log_user('test_regular2', 'test12') |
@@ -197,7 +201,4 b' class TestMyAccountEdit(TestController):' | |||
|
197 | 201 | response = self.app.post(route_path('my_account_update'), |
|
198 | 202 | params=params) |
|
199 | 203 | |
|
200 | response.mustcontain('An email address must contain a single @') | |
|
201 | msg = u'Username "%(username)s" already exists' | |
|
202 | msg = h.html_escape(msg % {'username': 'test_admin'}) | |
|
203 | response.mustcontain(u"%s" % msg) | |
|
204 | response.mustcontain('"newmail.pl" is not one of test_regular2@mail.com') |
@@ -24,7 +24,7 b' from rhodecode.apps._base import ADMIN_P' | |||
|
24 | 24 | from rhodecode.model.db import User, UserEmailMap |
|
25 | 25 | from rhodecode.tests import ( |
|
26 | 26 | TestController, TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_EMAIL, |
|
27 | assert_session_flash) | |
|
27 | assert_session_flash, TEST_USER_REGULAR_PASS) | |
|
28 | 28 | from rhodecode.tests.fixture import Fixture |
|
29 | 29 | |
|
30 | 30 | fixture = Fixture() |
@@ -47,30 +47,14 b' class TestMyAccountEmails(TestController' | |||
|
47 | 47 | response = self.app.get(route_path('my_account_emails')) |
|
48 | 48 | response.mustcontain('No additional emails specified') |
|
49 | 49 | |
|
50 | def test_my_account_my_emails_add_existing_email(self): | |
|
51 | self.log_user() | |
|
52 | response = self.app.get(route_path('my_account_emails')) | |
|
53 | response.mustcontain('No additional emails specified') | |
|
54 | response = self.app.post(route_path('my_account_emails_add'), | |
|
55 | {'new_email': TEST_USER_REGULAR_EMAIL, | |
|
56 | 'csrf_token': self.csrf_token}) | |
|
57 | assert_session_flash(response, 'This e-mail address is already taken') | |
|
58 | ||
|
59 | def test_my_account_my_emails_add_mising_email_in_form(self): | |
|
60 | self.log_user() | |
|
61 | response = self.app.get(route_path('my_account_emails')) | |
|
62 | response.mustcontain('No additional emails specified') | |
|
63 | response = self.app.post(route_path('my_account_emails_add'), | |
|
64 | {'csrf_token': self.csrf_token}) | |
|
65 | assert_session_flash(response, 'Please enter an email address') | |
|
66 | ||
|
67 | 50 | def test_my_account_my_emails_add_remove(self): |
|
68 | 51 | self.log_user() |
|
69 | 52 | response = self.app.get(route_path('my_account_emails')) |
|
70 | 53 | response.mustcontain('No additional emails specified') |
|
71 | 54 | |
|
72 | 55 | response = self.app.post(route_path('my_account_emails_add'), |
|
73 | {'new_email': 'foo@barz.com', |
|
56 | {'email': 'foo@barz.com', | |
|
57 | 'current_password': TEST_USER_REGULAR_PASS, | |
|
74 | 58 | 'csrf_token': self.csrf_token}) |
|
75 | 59 | |
|
76 | 60 | response = self.app.get(route_path('my_account_emails')) |
@@ -66,6 +66,7 b' class TestMyAccountSshKeysView(TestContr' | |||
|
66 | 66 | 'I4fG8+hBHzpeFxUGvSGNtXPUbwaAY8j/oHYrTpMgkj6pUEFsiKfC5zPq' \ |
|
67 | 67 | 'PFR5HyKTCHW0nFUJnZsbyFT5hMiF/hZkJc9A0ZbdSvJwCRQ/g3bmdL ' \ |
|
68 | 68 | 'your_email@example.com' |
|
69 | FINGERPRINT = 'MD5:01:4f:ad:29:22:6e:01:37:c9:d2:52:26:52:b0:2d:93' | |
|
69 | 70 | |
|
70 | 71 | def test_add_ssh_key_error(self, user_util): |
|
71 | 72 | user = user_util.create_user(password='qweqwe') |
@@ -100,9 +101,11 b' class TestMyAccountSshKeysView(TestContr' | |||
|
100 | 101 | route_path('my_account_ssh_keys_add'), |
|
101 | 102 | {'description': desc, 'key_data': key_data, |
|
102 | 103 | 'csrf_token': self.csrf_token}) |
|
104 | ||
|
105 | err = 'Such key with fingerprint `{}` already exists, ' \ | |
|
106 | 'please use a different one'.format(self.FINGERPRINT) | |
|
103 | 107 | assert_session_flash(response, 'An error occurred during ssh key ' |
|
104 | 'saving: Such key with fingerprint `{}` already exists, ' |
|
105 | 'please use a different one') | |
|
108 | 'saving: {}'.format(err)) | |
|
106 | 109 | |
|
107 | 110 | def test_add_ssh_key(self, user_util): |
|
108 | 111 | user = user_util.create_user(password='qweqwe') |
@@ -232,40 +232,59 b' class MyAccountView(BaseAppView, DataGri' | |||
|
232 | 232 | |
|
233 | 233 | c.user_email_map = UserEmailMap.query()\ |
|
234 | 234 | .filter(UserEmailMap.user == c.user).all() |
|
235 | ||
|
236 | schema = user_schema.AddEmailSchema().bind( | |
|
237 | username=c.user.username, user_emails=c.user.emails) | |
|
238 | ||
|
239 | form = forms.RcForm(schema, | |
|
240 | action=h.route_path('my_account_emails_add'), | |
|
241 | buttons=(forms.buttons.save, forms.buttons.reset)) | |
|
242 | ||
|
243 | c.form = form | |
|
235 | 244 | return self._get_template_context(c) |
|
236 | 245 | |
|
237 | 246 | @LoginRequired() |
|
238 | 247 | @NotAnonymous() |
|
239 | 248 | @CSRFRequired() |
|
240 | 249 | @view_config( |
|
241 | route_name='my_account_emails_add', request_method='POST') |
|
250 | route_name='my_account_emails_add', request_method='POST', | |
|
251 | renderer='rhodecode:templates/admin/my_account/my_account.mako') | |
|
242 | 252 | def my_account_emails_add(self): |
|
243 | 253 | _ = self.request.translate |
|
244 | 254 | c = self.load_default_context() |
|
255 | c.active = 'emails' | |
|
245 | 256 | |
|
246 | email = self.request.POST.get('new_email') | |
|
257 | schema = user_schema.AddEmailSchema().bind( | |
|
258 | username=c.user.username, user_emails=c.user.emails) | |
|
247 | 259 | |
|
260 | form = forms.RcForm( | |
|
261 | schema, action=h.route_path('my_account_emails_add'), | |
|
262 | buttons=(forms.buttons.save, forms.buttons.reset)) | |
|
263 | ||
|
264 | controls = self.request.POST.items() | |
|
248 | 265 | try: |
|
249 | form = UserExtraEmailForm(self.request.translate)() | |
|
250 | data = form.to_python({'email': email}) | |
|
251 | email = data['email'] | |
|
252 | ||
|
253 | UserModel().add_extra_email(c.user.user_id, email) | |
|
266 | valid_data = form.validate(controls) | |
|
267 | UserModel().add_extra_email(c.user.user_id, valid_data['email']) | |
|
254 | 268 | audit_logger.store_web( |
|
255 | 269 | 'user.edit.email.add', action_data={ |
|
256 | 'data': {'email': email, 'user': 'self'}}, | |
|
270 | 'data': {'email': valid_data['email'], 'user': 'self'}}, | |
|
257 | 271 | user=self._rhodecode_user,) |
|
258 | ||
|
259 | 272 | Session().commit() |
|
260 | h.flash(_("Added new email address `%s` for user account") % email, | |
|
261 | category='success') | |
|
262 | 273 | except formencode.Invalid as error: |
|
263 | 274 | h.flash(h.escape(error.error_dict['email']), category='error') |
|
275 | except forms.ValidationFailure as e: | |
|
276 | c.user_email_map = UserEmailMap.query() \ | |
|
277 | .filter(UserEmailMap.user == c.user).all() | |
|
278 | c.form = e | |
|
279 | return self._get_template_context(c) | |
|
264 | 280 | except Exception: |
|
265 | log.exception("Exception in my_account_emails") |
|
266 | h.flash(_('An error occurred during email saving'), |
|
281 | log.exception("Exception adding email") | |
|
282 | h.flash(_('Error occurred during adding email'), | |
|
267 | 283 | category='error') |
|
268 | return HTTPFound(h.route_path('my_account_emails')) | |
|
284 | else: | |
|
285 | h.flash(_("Successfully added email"), category='success') | |
|
286 | ||
|
287 | raise HTTPFound(self.request.route_path('my_account_emails')) | |
|
269 | 288 | |
|
270 | 289 | @LoginRequired() |
|
271 | 290 | @NotAnonymous() |
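The hunk above swaps formencode-style validation for a schema-bound form that raises on invalid input and re-renders the page with the failed form attached. A minimal stand-in sketch of that validate-or-re-render flow (`ValidationFailure` and `EmailForm` here are simplified stand-ins for deform's `ValidationFailure` and RhodeCode's `forms.RcForm`; the returned tuples stand in for template rendering vs. the `HTTPFound` redirect):

```python
class ValidationFailure(Exception):
    """Stand-in for deform.ValidationFailure: carries field-level errors."""
    def __init__(self, errors):
        super().__init__('form validation failed')
        self.errors = errors


class EmailForm:
    """Tiny stand-in form; the real code binds a colander schema."""
    def validate(self, controls):
        data = dict(controls)
        email = data.get('email', '').strip()
        if '@' not in email:
            raise ValidationFailure({'email': 'Please enter a valid email address'})
        return {'email': email}


def emails_add_view(post_items):
    form = EmailForm()
    try:
        valid_data = form.validate(post_items)
    except ValidationFailure as exc:
        # validation failed: re-render the same template with the failed
        # form so field-level errors show inline next to the inputs
        return ('render', exc.errors)
    # success path: persist, flash, then redirect (HTTPFound) so a
    # browser refresh cannot re-submit the POST
    return ('redirect', valid_data['email'])
```

The redirect-on-success / re-render-on-failure split mirrors the view's `else: ... raise HTTPFound(...)` versus `except forms.ValidationFailure` branches.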
@@ -414,22 +433,23 b' class MyAccountView(BaseAppView, DataGri' | |||
|
414 | 433 | def my_account_edit(self): |
|
415 | 434 | c = self.load_default_context() |
|
416 | 435 | c.active = 'profile_edit' |
|
417 | ||
|
418 | c.perm_user = c.auth_user | |
|
419 | 436 | c.extern_type = c.user.extern_type |
|
420 | 437 | c.extern_name = c.user.extern_name |
|
421 | 438 | |
|
422 | defaults = c.user.get_dict() | |
|
439 | schema = user_schema.UserProfileSchema().bind( | |
|
440 | username=c.user.username, user_emails=c.user.emails) | |
|
441 | appstruct = { | |
|
442 | 'username': c.user.username, | |
|
443 | 'email': c.user.email, | |
|
444 | 'firstname': c.user.firstname, | |
|
445 | 'lastname': c.user.lastname, | |
|
446 | } | |
|
447 | c.form = forms.RcForm( | |
|
448 | schema, appstruct=appstruct, | |
|
449 | action=h.route_path('my_account_update'), | |
|
450 | buttons=(forms.buttons.save, forms.buttons.reset)) | |
|
423 | 451 | |
|
424 | data = render('rhodecode:templates/admin/my_account/my_account.mako', | |
|
425 | self._get_template_context(c), self.request) | |
|
426 | html = formencode.htmlfill.render( | |
|
427 | data, | |
|
428 | defaults=defaults, | |
|
429 | encoding="UTF-8", | |
|
430 | force_defaults=False | |
|
431 | ) | |
|
432 | return Response(html) | |
|
452 | return self._get_template_context(c) | |
|
433 | 453 | |
|
434 | 454 | @LoginRequired() |
|
435 | 455 | @NotAnonymous() |
@@ -442,55 +462,40 b' class MyAccountView(BaseAppView, DataGri' | |||
|
442 | 462 | _ = self.request.translate |
|
443 | 463 | c = self.load_default_context() |
|
444 | 464 | c.active = 'profile_edit' |
|
445 | ||
|
446 | 465 | c.perm_user = c.auth_user |
|
447 | 466 | c.extern_type = c.user.extern_type |
|
448 | 467 | c.extern_name = c.user.extern_name |
|
449 | 468 | |
|
450 | _form = UserForm(self.request.translate, edit=True, | |
|
451 | old_data={'user_id': self._rhodecode_user.user_id, | |
|
452 | 'email': self._rhodecode_user.email})() | |
|
453 | form_result = {} | |
|
469 | schema = user_schema.UserProfileSchema().bind( | |
|
470 | username=c.user.username, user_emails=c.user.emails) | |
|
471 | form = forms.RcForm( | |
|
472 | schema, buttons=(forms.buttons.save, forms.buttons.reset)) | |
|
473 | ||
|
474 | controls = self.request.POST.items() | |
|
454 | 475 | try: |
|
455 | post_data = dict(self.request.POST) | |
|
456 | post_data['new_password'] = '' | |
|
457 | post_data['password_confirmation'] = '' | |
|
458 | form_result = _form.to_python(post_data) | |
|
459 | # skip updating those attrs for my account | |
|
476 | valid_data = form.validate(controls) | |
|
460 | 477 | skip_attrs = ['admin', 'active', 'extern_type', 'extern_name', |
|
461 | 478 | 'new_password', 'password_confirmation'] |
|
462 | # TODO: plugin should define if username can be updated | |
|
463 | 479 | if c.extern_type != "rhodecode": |
|
464 | 480 | # forbid updating username for external accounts |
|
465 | 481 | skip_attrs.append('username') |
|
466 | ||
|
482 | old_email = c.user.email | |
|
467 | 483 | UserModel().update_user( |
|
468 | 484 | self._rhodecode_user.user_id, skip_attrs=skip_attrs, |
|
469 | **form_result) |
|
470 | h.flash(_('Your account was updated successfully'), | |
|
471 | category='success') | |
|
485 | **valid_data) | |
|
486 | if old_email != valid_data['email']: | |
|
487 | old = UserEmailMap.query() \ | |
|
488 | .filter(UserEmailMap.user == c.user).filter(UserEmailMap.email == valid_data['email']).first() | |
|
489 | old.email = old_email | |
|
490 | h.flash(_('Your account was updated successfully'), category='success') | |
|
472 | 491 | Session().commit() |
|
473 | ||
|
474 | except formencode.Invalid as errors: | |
|
475 | data = render( | |
|
476 | 'rhodecode:templates/admin/my_account/my_account.mako', | |
|
477 | self._get_template_context(c), self.request) | |
|
478 | ||
|
479 | html = formencode.htmlfill.render( | |
|
480 | data, | |
|
481 | defaults=errors.value, | |
|
482 | errors=errors.error_dict or {}, | |
|
483 | prefix_error=False, | |
|
484 | encoding="UTF-8", | |
|
485 | force_defaults=False) | |
|
486 | return Response(html) | |
|
487 | ||
|
492 | except forms.ValidationFailure as e: | |
|
493 | c.form = e | |
|
494 | return self._get_template_context(c) | |
|
488 | 495 | except Exception: |
|
489 | 496 | log.exception("Exception updating user") |
|
490 | h.flash(_('Error occurred during update of user'), |
|
491 | category='error') |
|
492 | raise HTTPFound(h.route_path('my_account_profile')) | |
|
493 | ||
|
497 | h.flash(_('Error occurred during update of user'), | |
|
498 | category='error') | |
|
494 | 499 | raise HTTPFound(h.route_path('my_account_profile')) |
|
495 | 500 | |
|
496 | 501 | def _get_pull_requests_list(self, statuses): |
@@ -89,7 +89,7 b' class MyAccountSshKeysView(BaseAppView, ' | |||
|
89 | 89 | user_data = c.user.get_api_data() |
|
90 | 90 | key_data = self.request.POST.get('key_data') |
|
91 | 91 | description = self.request.POST.get('description') |
|
92 | ||
|
92 | fingerprint = 'unknown' | |
|
93 | 93 | try: |
|
94 | 94 | if not key_data: |
|
95 | 95 | raise ValueError('Please add a valid public key') |
@@ -114,8 +114,9 b' class MyAccountSshKeysView(BaseAppView, ' | |||
|
114 | 114 | |
|
115 | 115 | except IntegrityError: |
|
116 | 116 | log.exception("Exception during ssh key saving") |
|
117 | h.flash(_('An error occurred during ssh key saving: {}').format( | |
|
118 | 'Such key with fingerprint `{}` already exists, please use a different one'), |
|
117 | err = 'Such key with fingerprint `{}` already exists, ' \ | |
|
118 | 'please use a different one'.format(fingerprint) | |
|
119 | h.flash(_('An error occurred during ssh key saving: {}').format(err), | |
|
119 | 120 | category='error') |
|
120 | 121 | except Exception as e: |
|
121 | 122 | log.exception("Exception during ssh key saving") |
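The `FINGERPRINT` constant added to the test above is an OpenSSH-style MD5 key fingerprint. As a rough stdlib-only sketch of how such a fingerprint is derived from an authorized_keys-style line (`md5_fingerprint` is a hypothetical name; the view itself goes through RhodeCode's own key-parsing helpers, which also validate key type and structure):

```python
import base64
import hashlib


def md5_fingerprint(public_key_line):
    """Return 'MD5:aa:bb:...' for a '<type> <base64-blob> [comment]' line.

    Sketch only: real key handling should also validate the decoded
    blob's structure, not just hash it.
    """
    blob = base64.b64decode(public_key_line.split()[1])
    digest = hashlib.md5(blob).hexdigest()
    # group the 32 hex chars into 16 colon-separated byte pairs
    return 'MD5:' + ':'.join(digest[i:i + 2] for i in range(0, len(digest), 2))
```

The base64 blob in the example test below is a placeholder, not a real key.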
@@ -331,6 +331,10 b' def includeme(config):' | |||
|
331 | 331 | name='edit_repo_advanced_fork', |
|
332 | 332 | pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True) |
|
333 | 333 | |
|
334 | config.add_route( | |
|
335 | name='edit_repo_advanced_hooks', | |
|
336 | pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True) | |
|
337 | ||
|
334 | 338 | # Caches |
|
335 | 339 | config.add_route( |
|
336 | 340 | name='edit_repo_caches', |
@@ -373,6 +377,9 b' def includeme(config):' | |||
|
373 | 377 | config.add_route( |
|
374 | 378 | name='edit_repo_remote_pull', |
|
375 | 379 | pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True) |
|
380 | config.add_route( | |
|
381 | name='edit_repo_remote_push', | |
|
382 | pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True) | |
|
376 | 383 | |
|
377 | 384 | # Statistics |
|
378 | 385 | config.add_route( |
@@ -418,6 +425,11 b' def includeme(config):' | |||
|
418 | 425 | name='repo_default_reviewers_data', |
|
419 | 426 | pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True) |
|
420 | 427 | |
|
428 | # Repo Automation (EE feature) | |
|
429 | config.add_route( | |
|
430 | name='repo_automation', | |
|
431 | pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True) | |
|
432 | ||
|
421 | 433 | # Strip |
|
422 | 434 | config.add_route( |
|
423 | 435 | name='edit_repo_strip', |
@@ -234,7 +234,8 b' class TestSummaryView(object):' | |||
|
234 | 234 | Repository, 'scm_instance', side_effect=RepositoryRequirementError) |
|
235 | 235 | |
|
236 | 236 | with scm_patcher: |
|
237 | response = self.app.get(route_path('repo_summary', repo_name=repo_name)) |
|
237 | response = self.app.get( | |
|
238 | route_path('repo_summary', repo_name=repo_name)) | |
|
238 | 239 | assert_response = AssertResponse(response) |
|
239 | 240 | assert_response.element_contains( |
|
240 | 241 | '.main .alert-warning strong', 'Missing requirements') |
@@ -18,6 +18,7 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | import os | |
|
21 | 22 | import logging |
|
22 | 23 | |
|
23 | 24 | from pyramid.httpexceptions import HTTPFound |
@@ -27,6 +28,7 b' from rhodecode.apps._base import RepoApp' | |||
|
27 | 28 | from rhodecode.lib.auth import ( |
|
28 | 29 | LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired) |
|
29 | 30 | from rhodecode.lib import helpers as h |
|
31 | from rhodecode.lib import system_info | |
|
30 | 32 | from rhodecode.model.meta import Session |
|
31 | 33 | from rhodecode.model.scm import ScmModel |
|
32 | 34 | |
@@ -36,8 +38,6 b' log = logging.getLogger(__name__)' | |||
|
36 | 38 | class RepoCachesView(RepoAppView): |
|
37 | 39 | def load_default_context(self): |
|
38 | 40 | c = self._get_local_tmpl_context() |
|
39 | ||
|
40 | ||
|
41 | 41 | return c |
|
42 | 42 | |
|
43 | 43 | @LoginRequired() |
@@ -48,6 +48,11 b' class RepoCachesView(RepoAppView):' | |||
|
48 | 48 | def repo_caches(self): |
|
49 | 49 | c = self.load_default_context() |
|
50 | 50 | c.active = 'caches' |
|
51 | cached_diffs_dir = c.rhodecode_db_repo.cached_diffs_dir | |
|
52 | c.cached_diff_count = len(c.rhodecode_db_repo.cached_diffs()) | |
|
53 | c.cached_diff_size = 0 | |
|
54 | if os.path.isdir(cached_diffs_dir): | |
|
55 | c.cached_diff_size = system_info.get_storage_size(cached_diffs_dir) | |
|
51 | 56 | |
|
52 | 57 | return self._get_template_context(c) |
|
53 | 58 |
@@ -34,17 +34,18 b' from rhodecode.lib.auth import (' | |||
|
34 | 34 | LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired) |
|
35 | 35 | |
|
36 | 36 | from rhodecode.lib.compat import OrderedDict |
|
37 | from rhodecode.lib.diffs import cache_diff, load_cached_diff, diff_cache_exist | |
|
37 | 38 | from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError |
|
38 | 39 | import rhodecode.lib.helpers as h |
|
39 | from rhodecode.lib.utils2 import safe_unicode | |
|
40 | from rhodecode.lib.utils2 import safe_unicode, str2bool | |
|
40 | 41 | from rhodecode.lib.vcs.backends.base import EmptyCommit |
|
41 | 42 | from rhodecode.lib.vcs.exceptions import ( |
|
42 | RepositoryError, CommitDoesNotExistError, NodeDoesNotExistError) |
|
43 | RepositoryError, CommitDoesNotExistError) | |
|
43 | 44 | from rhodecode.model.db import ChangesetComment, ChangesetStatus |
|
44 | 45 | from rhodecode.model.changeset_status import ChangesetStatusModel |
|
45 | 46 | from rhodecode.model.comment import CommentsModel |
|
46 | 47 | from rhodecode.model.meta import Session |
|
47 | ||
|
48 | from rhodecode.model.settings import VcsSettingsModel | |
|
48 | 49 | |
|
49 | 50 | log = logging.getLogger(__name__) |
|
50 | 51 | |
@@ -152,6 +153,12 b' class RepoCommitsView(RepoAppView):' | |||
|
152 | 153 | |
|
153 | 154 | return c |
|
154 | 155 | |
|
156 | def _is_diff_cache_enabled(self, target_repo): | |
|
157 | caching_enabled = self._get_general_setting( | |
|
158 | target_repo, 'rhodecode_diff_cache') | |
|
159 | log.debug('Diff caching enabled: %s', caching_enabled) | |
|
160 | return caching_enabled | |
|
161 | ||
|
155 | 162 | def _commit(self, commit_id_range, method): |
|
156 | 163 | _ = self.request.translate |
|
157 | 164 | c = self.load_default_context() |
@@ -240,45 +247,65 b' class RepoCommitsView(RepoAppView):' | |||
|
240 | 247 | commit2 = commit |
|
241 | 248 | commit1 = commit.parents[0] if commit.parents else EmptyCommit() |
|
242 | 249 | |
|
250 | if method == 'show': | |
|
251 | inline_comments = CommentsModel().get_inline_comments( | |
|
252 | self.db_repo.repo_id, revision=commit.raw_id) | |
|
253 | c.inline_cnt = CommentsModel().get_inline_comments_count( | |
|
254 | inline_comments) | |
|
255 | c.inline_comments = inline_comments | |
|
256 | ||
|
257 | cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path( | |
|
258 | self.db_repo) | |
|
259 | cache_file_path = diff_cache_exist( | |
|
260 | cache_path, 'diff', commit.raw_id, | |
|
261 | ign_whitespace_lcl, context_lcl, c.fulldiff) | |
|
262 | ||
|
263 | caching_enabled = self._is_diff_cache_enabled(self.db_repo) | |
|
264 | force_recache = str2bool(self.request.GET.get('force_recache')) | |
|
265 | ||
|
266 | cached_diff = None | |
|
267 | if caching_enabled: | |
|
268 | cached_diff = load_cached_diff(cache_file_path) | |
|
269 | ||
|
270 | has_proper_diff_cache = cached_diff and cached_diff.get('diff') | |
|
271 | if not force_recache and has_proper_diff_cache: | |
|
272 | diffset = cached_diff['diff'] | |
|
273 | else: | |
|
274 | vcs_diff = self.rhodecode_vcs_repo.get_diff( | |
|
275 | commit1, commit2, | |
|
276 | ignore_whitespace=ign_whitespace_lcl, | |
|
277 | context=context_lcl) | |
|
278 | ||
|
279 | diff_processor = diffs.DiffProcessor( | |
|
280 | vcs_diff, format='newdiff', diff_limit=diff_limit, | |
|
281 | file_limit=file_limit, show_full_diff=c.fulldiff) | |
|
282 | ||
|
283 | _parsed = diff_processor.prepare() | |
|
284 | ||
|
285 | diffset = codeblocks.DiffSet( | |
|
286 | repo_name=self.db_repo_name, | |
|
287 | source_node_getter=codeblocks.diffset_node_getter(commit1), | |
|
288 | target_node_getter=codeblocks.diffset_node_getter(commit2)) | |
|
289 | ||
|
290 | diffset = self.path_filter.render_patchset_filtered( | |
|
291 | diffset, _parsed, commit1.raw_id, commit2.raw_id) | |
|
292 | ||
|
293 | # save cached diff | |
|
294 | if caching_enabled: | |
|
295 | cache_diff(cache_file_path, diffset, None) | |
|
296 | ||
|
297 | c.limited_diff = diffset.limited_diff | |
|
298 | c.changes[commit.raw_id] = diffset | |
|
299 | else: | |
|
300 | # TODO(marcink): no cache usage here... | |
|
243 | 301 | _diff = self.rhodecode_vcs_repo.get_diff( |
|
244 | 302 | commit1, commit2, |
|
245 | 303 | ignore_whitespace=ign_whitespace_lcl, context=context_lcl) |
|
246 | 304 | diff_processor = diffs.DiffProcessor( |
|
247 | 305 | _diff, format='newdiff', diff_limit=diff_limit, |
|
248 | 306 | file_limit=file_limit, show_full_diff=c.fulldiff) |
|
249 | ||
|
250 | commit_changes = OrderedDict() | |
|
251 | if method == 'show': | |
|
252 | _parsed = diff_processor.prepare() | |
|
253 | c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer) | |
|
254 | ||
|
255 | _parsed = diff_processor.prepare() | |
|
256 | ||
|
257 | def _node_getter(commit): | |
|
258 | def get_node(fname): | |
|
259 | try: | |
|
260 | return commit.get_node(fname) | |
|
261 | except NodeDoesNotExistError: | |
|
262 | return None | |
|
263 | return get_node | |
|
264 | ||
|
265 | inline_comments = CommentsModel().get_inline_comments( | |
|
266 | self.db_repo.repo_id, revision=commit.raw_id) | |
|
267 | c.inline_cnt = CommentsModel().get_inline_comments_count( | |
|
268 | inline_comments) | |
|
269 | ||
|
270 | diffset = codeblocks.DiffSet( | |
|
271 | repo_name=self.db_repo_name, | |
|
272 | source_node_getter=_node_getter(commit1), | |
|
273 | target_node_getter=_node_getter(commit2), | |
|
274 | comments=inline_comments) | |
|
275 | diffset = diffset.render_patchset( | |
|
276 | _parsed, commit1.raw_id, commit2.raw_id) | |
|
277 | ||
|
278 | c.changes[commit.raw_id] = diffset | |
|
279 | else: | |
|
280 | 307 | # downloads/raw we only need RAW diff nothing else |
|
281 | diff = diff_processor.as_raw() |
|
308 | diff = self.path_filter.get_raw_patch(diff_processor) | |
|
282 | 309 | c.changes[commit.raw_id] = [None, None, None, None, diff, None, None] |
|
283 | 310 | |
|
284 | 311 | # sort comments by how they were generated |
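The hunk above introduces the load-or-recompute pattern for commit diffs: consult the cached diff unless caching is disabled or `?force_recache` is set, otherwise compute and write back. A dict-backed sketch of that control flow (the real code persists via `load_cached_diff`/`cache_diff` on disk under the repo's shadow path; `get_diffset` here is a simplified stand-in):

```python
# In-memory stand-in for the on-disk diff cache.
DIFF_CACHE = {}


def str2bool(value):
    # permissive parsing, as used for the ?force_recache query flag
    return str(value).lower() in ('1', 'true', 'yes', 'on')


def get_diffset(commit_id, compute, caching_enabled, force_recache='false'):
    cached = DIFF_CACHE.get(commit_id) if caching_enabled else None
    has_proper_diff_cache = bool(cached and cached.get('diff'))
    if not str2bool(force_recache) and has_proper_diff_cache:
        return cached['diff']          # cache hit: skip the expensive path
    diffset = compute()                # expensive: vcs diff + parse + render
    if caching_enabled:
        DIFF_CACHE[commit_id] = {'diff': diffset}
    return diffset
```

The `has_proper_diff_cache` guard mirrors the hunk's defensive check that a cached payload actually contains a `'diff'` entry before trusting it.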
@@ -295,22 +295,13 b' class RepoCompareView(RepoAppView):' | |||
|
295 | 295 | file_limit=file_limit, show_full_diff=c.fulldiff) |
|
296 | 296 | _parsed = diff_processor.prepare() |
|
297 | 297 | |
|
298 | def _node_getter(commit): | |
|
299 | """ Returns a function that returns a node for a commit or None """ | |
|
300 | def get_node(fname): | |
|
301 | try: | |
|
302 | return commit.get_node(fname) | |
|
303 | except NodeDoesNotExistError: | |
|
304 | return None | |
|
305 | return get_node | |
|
306 | ||
|
307 | 298 | diffset = codeblocks.DiffSet( |
|
308 | 299 | repo_name=source_repo.repo_name, |
|
309 | source_node_getter=_node_getter(source_commit), | |
|
310 | target_node_getter=_node_getter(target_commit), | |
|
300 | source_node_getter=codeblocks.diffset_node_getter(source_commit), | |
|
301 | target_node_getter=codeblocks.diffset_node_getter(target_commit), | |
|
311 | 302 | ) |
|
312 | c.diffset = diffset.render_patchset( |
|
313 | _parsed, source_ref, target_ref) | |
|
303 | c.diffset = self.path_filter.render_patchset_filtered( | |
|
304 | diffset, _parsed, source_ref, target_ref) | |
|
314 | 305 | |
|
315 | 306 | c.preview_mode = merge |
|
316 | 307 | c.source_commit = source_commit |
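This hunk (like the commits and pull-request views) replaces the duplicated local `_node_getter` closures with a shared `codeblocks.diffset_node_getter`. A self-contained sketch of that helper, with a fake commit object for illustration (`FakeCommit` and the local `NodeDoesNotExistError` are stand-ins for the real vcs classes):

```python
class NodeDoesNotExistError(Exception):
    """Stand-in for rhodecode.lib.vcs.exceptions.NodeDoesNotExistError."""


def diffset_node_getter(commit):
    """Close over a commit and return a lookup that yields None for
    paths missing on that side of the diff (added/deleted/renamed files)."""
    def get_node(fname):
        try:
            return commit.get_node(fname)
        except NodeDoesNotExistError:
            return None
    return get_node


class FakeCommit:
    """Minimal commit double for the sketch."""
    def __init__(self, files):
        self._files = files

    def get_node(self, fname):
        if fname not in self._files:
            raise NodeDoesNotExistError(fname)
        return self._files[fname]
```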
@@ -90,13 +90,15 b' class RepoFeedView(RepoAppView):' | |||
|
90 | 90 | _renderer = self.request.get_partial_renderer( |
|
91 | 91 | 'rhodecode:templates/feed/atom_feed_entry.mako') |
|
92 | 92 | diff_processor, parsed_diff, limited_diff = self._changes(commit) |
|
93 | filtered_parsed_diff, has_hidden_changes = self.path_filter.filter_patchset(parsed_diff) | |
|
93 | 94 | return _renderer( |
|
94 | 95 | 'body', |
|
95 | 96 | commit=commit, |
|
96 | parsed_diff=parsed_diff, | |
|
97 | parsed_diff=filtered_parsed_diff, | |
|
97 | 98 | limited_diff=limited_diff, |
|
98 | 99 | feed_include_diff=self.feed_include_diff, |
|
99 | 100 | diff_processor=diff_processor, |
|
101 | has_hidden_changes=has_hidden_changes | |
|
100 | 102 | ) |
|
101 | 103 | |
|
102 | 104 | def _set_timezone(self, date, tzinfo=pytz.utc): |
@@ -122,8 +124,7 b' class RepoFeedView(RepoAppView):' | |||
|
122 | 124 | """ |
|
123 | 125 | self.load_default_context() |
|
124 | 126 | |
|
125 | @cache_region('long_term') | |
|
126 | def _generate_feed(cache_key): | |
|
127 | def _generate_feed(): | |
|
127 | 128 | feed = Atom1Feed( |
|
128 | 129 | title=self.title % self.db_repo_name, |
|
129 | 130 | link=h.route_url('repo_summary', repo_name=self.db_repo_name), |
@@ -146,12 +147,18 b' class RepoFeedView(RepoAppView):' | |||
|
146 | 147 | |
|
147 | 148 | return feed.mime_type, feed.writeString('utf-8') |
|
148 | 149 | |
|
150 | @cache_region('long_term') | |
|
151 | def _generate_feed_and_cache(cache_key): | |
|
152 | return _generate_feed() | |
|
153 | ||
|
154 | if self.path_filter.is_enabled: | |
|
149 | 155 | invalidator_context = CacheKey.repo_context_cache( |
|
150 | _generate_feed, self.db_repo_name, CacheKey.CACHE_TYPE_ATOM) | |
|
151 | ||
|
156 | _generate_feed_and_cache, self.db_repo_name, CacheKey.CACHE_TYPE_ATOM) | |
|
152 | 157 | with invalidator_context as context: |
|
153 | 158 | context.invalidate() |
|
154 | 159 | mime_type, feed = context.compute() |
|
160 | else: | |
|
161 | mime_type, feed = _generate_feed() | |
|
155 | 162 | |
|
156 | 163 | response = Response(feed) |
|
157 | 164 | response.content_type = mime_type |
@@ -169,8 +176,7 b' class RepoFeedView(RepoAppView):' | |||
|
169 | 176 | """ |
|
170 | 177 | self.load_default_context() |
|
171 | 178 | |
|
172 | @cache_region('long_term') | |
|
173 | def _generate_feed(cache_key): | |
|
179 | def _generate_feed(): | |
|
174 | 180 | feed = Rss201rev2Feed( |
|
175 | 181 | title=self.title % self.db_repo_name, |
|
176 | 182 | link=h.route_url('repo_summary', repo_name=self.db_repo_name), |
@@ -193,12 +199,19 b' class RepoFeedView(RepoAppView):' | |||
|
193 | 199 | |
|
194 | 200 | return feed.mime_type, feed.writeString('utf-8') |
|
195 | 201 | |
|
202 | @cache_region('long_term') | |
|
203 | def _generate_feed_and_cache(cache_key): | |
|
204 | return _generate_feed() | |
|
205 | ||
|
206 | if self.path_filter.is_enabled: | |
|
196 | 207 | invalidator_context = CacheKey.repo_context_cache( |
|
197 | _generate_feed, self.db_repo_name, CacheKey.CACHE_TYPE_RSS) | |
|
208 | _generate_feed_and_cache, self.db_repo_name, CacheKey.CACHE_TYPE_RSS) | |
|
198 | 209 | |
|
199 | 210 | with invalidator_context as context: |
|
200 | 211 | context.invalidate() |
|
201 | 212 | mime_type, feed = context.compute() |
|
213 | else: | |
|
214 | mime_type, feed = _generate_feed() | |
|
202 | 215 | |
|
203 | 216 | response = Response(feed) |
|
204 | 217 | response.content_type = mime_type |
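Both feed views above factor the feed builder out of the cache decorator, so the same generator can run either through the long-term region cache or fresh per request, depending on a runtime condition (`self.path_filter.is_enabled` in the views). A sketch of that shape, with a plain dict standing in for the beaker cache region and a neutral `use_cache` flag standing in for the runtime check:

```python
# Stand-in for the @cache_region('long_term') backing store.
REGION_CACHE = {}


def generate_feed(repo_name):
    """Build the feed payload (expensive in the real view)."""
    return 'application/atom+xml', '<feed repo="%s"/>' % repo_name


def feed_response(repo_name, use_cache):
    if use_cache:
        # cached path: keyed per repository, computed at most once
        key = ('atom', repo_name)
        if key not in REGION_CACHE:
            REGION_CACHE[key] = generate_feed(repo_name)
        mime_type, feed = REGION_CACHE[key]
    else:
        # uncached path: serve a freshly generated copy per request
        mime_type, feed = generate_feed(repo_name)
    return mime_type, feed
```

Keeping one `generate_feed` for both branches avoids the duplication the old inline `@cache_region`-decorated closure forced.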
@@ -426,7 +426,7 b' class RepoFilesView(RepoAppView):' | |||
|
426 | 426 | context=line_context) |
|
427 | 427 | diff = diffs.DiffProcessor(_diff, format='gitdiff') |
|
428 | 428 | |
|
429 | response = Response(diff.as_raw()) |
|
429 | response = Response(self.path_filter.get_raw_patch(diff)) | |
|
430 | 430 | response.content_type = 'text/plain' |
|
431 | 431 | response.content_disposition = ( |
|
432 | 432 | 'attachment; filename=%s_%s_vs_%s.diff' % (f_path, diff1, diff2) |
@@ -442,7 +442,7 b' class RepoFilesView(RepoAppView):' | |||
|
442 | 442 | context=line_context) |
|
443 | 443 | diff = diffs.DiffProcessor(_diff, format='gitdiff') |
|
444 | 444 | |
|
445 | response = Response(diff.as_raw()) |
|
445 | response = Response(self.path_filter.get_raw_patch(diff)) | |
|
446 | 446 | response.content_type = 'text/plain' |
|
447 | 447 | charset = self._get_default_encoding(c) |
|
448 | 448 | if charset: |
@@ -462,7 +462,7 b' class RepoFilesView(RepoAppView):' | |||
|
462 | 462 | """ |
|
463 | 463 | Kept only to make OLD links work |
|
464 | 464 | """ |
|
465 | f_path = self._get_f_path(self.request.matchdict) | |
|
465 | f_path = self._get_f_path_unchecked(self.request.matchdict) | |
|
466 | 466 | diff1 = self.request.GET.get('diff1', '') |
|
467 | 467 | diff2 = self.request.GET.get('diff2', '') |
|
468 | 468 |
@@ -34,6 +34,7 b' from rhodecode.apps._base import RepoApp' | |||
|
34 | 34 | |
|
35 | 35 | from rhodecode.lib import helpers as h, diffs, codeblocks, channelstream |
|
36 | 36 | from rhodecode.lib.base import vcs_operation_context |
|
37 | from rhodecode.lib.diffs import load_cached_diff, cache_diff, diff_cache_exist | |
|
37 | 38 | from rhodecode.lib.ext_json import json |
|
38 | 39 | from rhodecode.lib.auth import ( |
|
39 | 40 | LoginRequired, HasRepoPermissionAny, HasRepoPermissionAnyDecorator, |
@@ -41,7 +42,7 b' from rhodecode.lib.auth import (' | |||
|
41 | 42 | from rhodecode.lib.utils2 import str2bool, safe_str, safe_unicode |
|
42 | 43 | from rhodecode.lib.vcs.backends.base import EmptyCommit, UpdateFailureReason |
|
43 | 44 | from rhodecode.lib.vcs.exceptions import (CommitDoesNotExistError, |
|
44 |
RepositoryRequirementError, |
|
|
45 | RepositoryRequirementError, EmptyRepositoryError) | |
|
45 | 46 | from rhodecode.model.changeset_status import ChangesetStatusModel |
|
46 | 47 | from rhodecode.model.comment import CommentsModel |
|
47 | 48 | from rhodecode.model.db import (func, or_, PullRequest, PullRequestVersion, |
@@ -201,10 +202,16 b' class RepoPullRequestsView(RepoAppView, ' | |||
|
201 | 202 | |
|
202 | 203 | return data |
|
203 | 204 | |
|
205 | def _is_diff_cache_enabled(self, target_repo): | |
|
206 | caching_enabled = self._get_general_setting( | |
|
207 | target_repo, 'rhodecode_diff_cache') | |
|
208 | log.debug('Diff caching enabled: %s', caching_enabled) | |
|
209 | return caching_enabled | |
|
210 | ||
|
204 | 211 | def _get_diffset(self, source_repo_name, source_repo, |
|
205 | 212 | source_ref_id, target_ref_id, |
|
206 | target_commit, source_commit, diff_limit, file_limit, |
|
207 | fulldiff, display_inline_comments): |
|
213 | target_commit, source_commit, diff_limit, file_limit, | |
|
214 | fulldiff): | |
|
208 | 215 | |
|
209 | 216 | vcs_diff = PullRequestModel().get_diff( |
|
210 | 217 | source_repo, source_ref_id, target_ref_id) |
@@ -215,24 +222,14 b' class RepoPullRequestsView(RepoAppView, ' | |||
|
215 | 222 | |
|
216 | 223 | _parsed = diff_processor.prepare() |
|
217 | 224 | |
|
218 | def _node_getter(commit): | |
|
219 | def get_node(fname): | |
|
220 | try: | |
|
221 | return commit.get_node(fname) | |
|
222 | except NodeDoesNotExistError: | |
|
223 | return None | |
|
224 | ||
|
225 | return get_node | |
|
226 | ||
|
227 | 225 | diffset = codeblocks.DiffSet( |
|
228 | 226 | repo_name=self.db_repo_name, |
|
229 | 227 | source_repo_name=source_repo_name, |
|
230 | source_node_getter=_node_getter(target_commit), | |
|
231 | target_node_getter=_node_getter(source_commit), | |
|
232 | comments=display_inline_comments | |
|
228 | source_node_getter=codeblocks.diffset_node_getter(target_commit), | |
|
229 | target_node_getter=codeblocks.diffset_node_getter(source_commit), | |
|
233 | 230 | ) |
|
234 | diffset = diffset.render_patchset( |
|
235 | _parsed, target_commit.raw_id, source_commit.raw_id) | |
|
231 | diffset = self.path_filter.render_patchset_filtered( | |
|
232 | diffset, _parsed, target_commit.raw_id, source_commit.raw_id) | |
|
236 | 233 | |
|
237 | 234 | return diffset |
|
238 | 235 | |
@@ -443,42 +440,54 b' class RepoPullRequestsView(RepoAppView, ' | |||
|
443 | 440 | commits_source_repo = source_scm |
|
444 | 441 | |
|
445 | 442 | c.commits_source_repo = commits_source_repo |
|
446 | commit_cache = {} | |
|
447 | try: | |
|
448 | pre_load = ["author", "branch", "date", "message"] | |
|
449 | show_revs = pull_request_at_ver.revisions | |
|
450 | for rev in show_revs: | |
|
451 | comm = commits_source_repo.get_commit( | |
|
452 | commit_id=rev, pre_load=pre_load) | |
|
453 | c.commit_ranges.append(comm) | |
|
454 | commit_cache[comm.raw_id] = comm | |
|
455 | ||
|
456 | # Order here matters, we first need to get target, and then | |
|
457 | # the source | |
|
458 | target_commit = commits_source_repo.get_commit( | |
|
459 | commit_id=safe_str(target_ref_id)) | |
|
460 | ||
|
461 | source_commit = commits_source_repo.get_commit( | |
|
462 | commit_id=safe_str(source_ref_id)) | |
|
463 | ||
|
464 | except CommitDoesNotExistError: | |
|
465 | log.warning( | |
|
466 | 'Failed to get commit from `{}` repo'.format( | |
|
467 | commits_source_repo), exc_info=True) | |
|
468 | except RepositoryRequirementError: | |
|
469 | log.warning( | |
|
470 | 'Failed to get all required data from repo', exc_info=True) | |
|
471 | c.missing_requirements = True | |
|
472 | ||
|
473 | 443 | c.ancestor = None # set it to None, to hide it from PR view |
|
474 | 444 | |
|
475 | try: | |
|
476 | ancestor_id = source_scm.get_common_ancestor( | |
|
477 | source_commit.raw_id, target_commit.raw_id, target_scm) | |
|
478 | c.ancestor_commit = source_scm.get_commit(ancestor_id) | |
|
479 | except Exception: | |
|
480 | c.ancestor_commit = None | |
|
445 | # empty version means latest, so we keep this to prevent | |
|
446 | # double caching | |
|
447 | version_normalized = version or 'latest' | |
|
448 | from_version_normalized = from_version or 'latest' | |
|
449 | ||
|
450 | cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path( | |
|
451 | target_repo) | |
|
452 | cache_file_path = diff_cache_exist( | |
|
453 | cache_path, 'pull_request', pull_request_id, version_normalized, | |
|
454 | from_version_normalized, source_ref_id, target_ref_id, c.fulldiff) | |
|
455 | ||
|
456 | caching_enabled = self._is_diff_cache_enabled(c.target_repo) | |
|
457 | force_recache = str2bool(self.request.GET.get('force_recache')) | |
|
458 | ||
|
459 | cached_diff = None | |
|
460 | if caching_enabled: | |
|
461 | cached_diff = load_cached_diff(cache_file_path) | |
|
481 | 462 | |
|
463 | has_proper_commit_cache = ( | |
|
464 | cached_diff and cached_diff.get('commits') | |
|
465 | and len(cached_diff.get('commits', [])) == 5 | |
|
466 | and cached_diff.get('commits')[0] | |
|
467 | and cached_diff.get('commits')[3]) | |
|
468 | if not force_recache and has_proper_commit_cache: | |
|
469 | diff_commit_cache = \ | |
|
470 | (ancestor_commit, commit_cache, missing_requirements, | |
|
471 | source_commit, target_commit) = cached_diff['commits'] | |
|
472 | else: | |
|
473 | diff_commit_cache = \ | |
|
474 | (ancestor_commit, commit_cache, missing_requirements, | |
|
475 | source_commit, target_commit) = self.get_commits( | |
|
476 | commits_source_repo, | |
|
477 | pull_request_at_ver, | |
|
478 | source_commit, | |
|
479 | source_ref_id, | |
|
480 | source_scm, | |
|
481 | target_commit, | |
|
482 | target_ref_id, | |
|
483 | target_scm) | |
|
484 | ||
|
485 | # register our commit range | |
|
486 | for comm in commit_cache.values(): | |
|
487 | c.commit_ranges.append(comm) | |
|
488 | ||
|
489 | c.missing_requirements = missing_requirements | |
|
490 | c.ancestor_commit = ancestor_commit | |
|
482 | 491 | c.statuses = source_repo.statuses( |
|
483 | 492 | [x.raw_id for x in c.commit_ranges]) |
|
484 | 493 | |
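The commit-cache validation in the hunk above (length check plus truthy ancestor and source entries) can be sketched as a small predicate. The helper name and the 5-tuple layout `(ancestor_commit, commit_cache, missing_requirements, source_commit, target_commit)` are read from the diff; wrapping it in a standalone function is an assumption for illustration:

```python
def has_proper_commit_cache(cached_diff):
    """A cached entry is usable only if its 'commits' member is the full
    5-tuple (ancestor_commit, commit_cache, missing_requirements,
    source_commit, target_commit) with ancestor and source present."""
    commits = (cached_diff or {}).get('commits') or []
    return (len(commits) == 5
            and bool(commits[0])    # ancestor_commit
            and bool(commits[3]))   # source_commit
```

A partially written or truncated cache entry then falls through to a full recompute instead of crashing the pull-request view.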
@@ -500,12 +509,23 b' class RepoPullRequestsView(RepoAppView, ' | |||
|
500 | 509 | |
|
501 | 510 | c.missing_commits = True |
|
502 | 511 | else: |
|
512 | c.inline_comments = display_inline_comments | |
|
503 | 513 | |
|
514 | has_proper_diff_cache = cached_diff and cached_diff.get('commits') | |
|
515 | if not force_recache and has_proper_diff_cache: | |
|
516 | c.diffset = cached_diff['diff'] | |
|
517 | (ancestor_commit, commit_cache, missing_requirements, | |
|
518 | source_commit, target_commit) = cached_diff['commits'] | |
|
519 | else: | |
|
504 | 520 | c.diffset = self._get_diffset( |
|
505 | 521 | c.source_repo.repo_name, commits_source_repo, |
|
506 | 522 | source_ref_id, target_ref_id, |
|
507 | 523 | target_commit, source_commit, |
|
508 | diff_limit, c.fulldiff) | |
|
524 | diff_limit, file_limit, c.fulldiff) | |
|
525 | ||
|
526 | # save cached diff | |
|
527 | if caching_enabled: | |
|
528 | cache_diff(cache_file_path, c.diffset, diff_commit_cache) | |
|
509 | 529 | |
|
510 | 530 | c.limited_diff = c.diffset.limited_diff |
|
511 | 531 | |
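The diff caching added above keys cache files on the pull request id, both normalized versions, the refs, and the fulldiff flag, and treats a missing cache as "recompute". The helper names mirror `diff_cache_exist`, `load_cached_diff`, and `cache_diff` from the diff, but the pickle-based storage and the exact key layout here are assumptions:

```python
import hashlib
import os
import pickle

def diff_cache_path(cache_dir, pull_request_id, version, from_version,
                    source_ref, target_ref, fulldiff):
    # empty version means latest; normalizing prevents double caching
    version = version or 'latest'
    from_version = from_version or 'latest'
    raw = ':'.join(map(str, ['pull_request', pull_request_id, version,
                             from_version, source_ref, target_ref,
                             int(bool(fulldiff))]))
    digest = hashlib.md5(raw.encode('utf-8')).hexdigest()
    return os.path.join(cache_dir, 'diff-{}.cache'.format(digest))

def load_cached_diff(path):
    # a missing or unreadable cache file simply means "recompute"
    try:
        with open(path, 'rb') as f:
            return pickle.load(f)
    except Exception:
        return None

def cache_diff(path, diffset, commits):
    with open(path, 'wb') as f:
        pickle.dump({'diff': diffset, 'commits': commits}, f)
```

A `?force_recache=1` style override, as in the diff, simply bypasses `load_cached_diff` and rewrites the entry.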
@@ -568,7 +588,6 b' class RepoPullRequestsView(RepoAppView, ' | |||
|
568 | 588 | if self._rhodecode_user.user_id in allowed_reviewers: |
|
569 | 589 | for co in general_comments: |
|
570 | 590 | if co.author.user_id == self._rhodecode_user.user_id: |
|
571 | # each comment has a status change | |
|
572 | 591 | status = co.status_change |
|
573 | 592 | if status: |
|
574 | 593 | _ver_pr = status[0].comment.pull_request_version_id |
@@ -576,6 +595,43 b' class RepoPullRequestsView(RepoAppView, ' | |||
|
576 | 595 | |
|
577 | 596 | return self._get_template_context(c) |
|
578 | 597 | |
|
598 | def get_commits( | |
|
599 | self, commits_source_repo, pull_request_at_ver, source_commit, | |
|
600 | source_ref_id, source_scm, target_commit, target_ref_id, target_scm): | |
|
601 | commit_cache = collections.OrderedDict() | |
|
602 | missing_requirements = False | |
|
603 | try: | |
|
604 | pre_load = ["author", "branch", "date", "message"] | |
|
605 | show_revs = pull_request_at_ver.revisions | |
|
606 | for rev in show_revs: | |
|
607 | comm = commits_source_repo.get_commit( | |
|
608 | commit_id=rev, pre_load=pre_load) | |
|
609 | commit_cache[comm.raw_id] = comm | |
|
610 | ||
|
611 | # Order here matters, we first need to get target, and then | |
|
612 | # the source | |
|
613 | target_commit = commits_source_repo.get_commit( | |
|
614 | commit_id=safe_str(target_ref_id)) | |
|
615 | ||
|
616 | source_commit = commits_source_repo.get_commit( | |
|
617 | commit_id=safe_str(source_ref_id)) | |
|
618 | except CommitDoesNotExistError: | |
|
619 | log.warning( | |
|
620 | 'Failed to get commit from `{}` repo'.format( | |
|
621 | commits_source_repo), exc_info=True) | |
|
622 | except RepositoryRequirementError: | |
|
623 | log.warning( | |
|
624 | 'Failed to get all required data from repo', exc_info=True) | |
|
625 | missing_requirements = True | |
|
626 | ancestor_commit = None | |
|
627 | try: | |
|
628 | ancestor_id = source_scm.get_common_ancestor( | |
|
629 | source_commit.raw_id, target_commit.raw_id, target_scm) | |
|
630 | ancestor_commit = source_scm.get_commit(ancestor_id) | |
|
631 | except Exception: | |
|
632 | ancestor_commit = None | |
|
633 | return ancestor_commit, commit_cache, missing_requirements, source_commit, target_commit | |
|
634 | ||
|
579 | 635 | def assure_not_empty_repo(self): |
|
580 | 636 | _ = self.request.translate |
|
581 | 637 |
@@ -138,15 +138,19 b' class RepoSettingsView(RepoAppView):' | |||
|
138 | 138 | repo_description=schema_data['repo_description'], |
|
139 | 139 | repo_private=schema_data['repo_private'], |
|
140 | 140 | clone_uri=schema_data['repo_clone_uri'], |
|
141 | push_uri=schema_data['repo_push_uri'], | |
|
141 | 142 | repo_landing_rev=schema_data['repo_landing_commit_ref'], |
|
142 | 143 | repo_enable_statistics=schema_data['repo_enable_statistics'], |
|
143 | 144 | repo_enable_locking=schema_data['repo_enable_locking'], |
|
144 | 145 | repo_enable_downloads=schema_data['repo_enable_downloads'], |
|
145 | 146 | ) |
|
146 | # detect if CLONE URI changed, if we get OLD means we keep old values | |
|
147 | # detect if SYNC URI changed, if we get OLD means we keep old values | |
|
147 | 148 | if schema_data['repo_clone_uri_change'] == 'OLD': |
|
148 | 149 | validated_updates['clone_uri'] = self.db_repo.clone_uri |
|
149 | 150 | |
|
151 | if schema_data['repo_push_uri_change'] == 'OLD': | |
|
152 | validated_updates['push_uri'] = self.db_repo.push_uri | |
|
153 | ||
|
150 | 154 | # use the new full name for redirect |
|
151 | 155 | new_repo_name = schema_data['repo_group']['repo_name_with_group'] |
|
152 | 156 |
@@ -43,8 +43,6 b' class RepoSettingsView(RepoAppView):' | |||
|
43 | 43 | |
|
44 | 44 | def load_default_context(self): |
|
45 | 45 | c = self._get_local_tmpl_context() |
|
46 | ||
|
47 | ||
|
48 | 46 | return c |
|
49 | 47 | |
|
50 | 48 | @LoginRequired() |
@@ -231,3 +229,19 b' class RepoSettingsView(RepoAppView):' | |||
|
231 | 229 | |
|
232 | 230 | raise HTTPFound( |
|
233 | 231 | h.route_path('edit_repo_advanced', repo_name=self.db_repo_name)) |
|
232 | ||
|
233 | @LoginRequired() | |
|
234 | @HasRepoPermissionAnyDecorator('repository.admin') | |
|
235 | @view_config( | |
|
236 | route_name='edit_repo_advanced_hooks', request_method='GET', | |
|
237 | renderer='rhodecode:templates/admin/repos/repo_edit.mako') | |
|
238 | def edit_advanced_install_hooks(self): | |
|
239 | """ | |
|
240 | Install Hooks for repository | |
|
241 | """ | |
|
242 | _ = self.request.translate | |
|
243 | self.load_default_context() | |
|
244 | self.rhodecode_vcs_repo.install_hooks(force=True) | |
|
245 | h.flash(_('installed hooks repository'), category='success') | |
|
246 | raise HTTPFound( | |
|
247 | h.route_path('edit_repo_advanced', repo_name=self.db_repo_name)) |
@@ -35,8 +35,6 b' log = logging.getLogger(__name__)' | |||
|
35 | 35 | class RepoSettingsRemoteView(RepoAppView): |
|
36 | 36 | def load_default_context(self): |
|
37 | 37 | c = self._get_local_tmpl_context() |
|
38 | ||
|
39 | ||
|
40 | 38 | return c |
|
41 | 39 | |
|
42 | 40 | @LoginRequired() |
@@ -117,6 +117,7 b' class SubversionTunnelWrapper(object):' | |||
|
117 | 117 | message=self._svn_string(message))) |
|
118 | 118 | self.remove_configs() |
|
119 | 119 | self.process.kill() |
|
120 | return 1 | |
|
120 | 121 | |
|
121 | 122 | def interrupt(self, signum, frame): |
|
122 | 123 | self.fail("Exited by timeout") |
@@ -171,7 +172,7 b' class SubversionTunnelWrapper(object):' | |||
|
171 | 172 | |
|
172 | 173 | first_response = self.get_first_client_response() |
|
173 | 174 | if not first_response: |
|
174 | self.fail("Repository name cannot be extracted") | |
|
175 | return self.fail("Repository name cannot be extracted") | |
|
175 | 176 | |
|
176 | 177 | url_parts = urlparse.urlparse(first_response['url']) |
|
177 | 178 | self.server.repo_name = url_parts.path.strip('/') |
@@ -68,7 +68,7 b' def main(ini_path, mode, user, user_id, ' | |||
|
68 | 68 | 'of this script.') |
|
69 | 69 | connection_info = os.environ.get('SSH_CONNECTION', '') |
|
70 | 70 | |
|
71 | with bootstrap(ini_path) as env: | |
|
71 | with bootstrap(ini_path, env={'RC_CMD_SSH_WRAPPER': '1'}) as env: | |
|
72 | 72 | try: |
|
73 | 73 | ssh_wrapper = SshWrapper( |
|
74 | 74 | command, connection_info, mode, |
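The SSH wrapper change above passes an `env` dict into `bootstrap` so the `RC_CMD_SSH_WRAPPER` marker is visible during application configuration. A minimal sketch of that behaviour, using a hypothetical stand-in rather than pyramid's real `bootstrap`:

```python
import os
from contextlib import contextmanager

@contextmanager
def bootstrap(ini_path, env=None):
    """Very small stand-in for pyramid's `bootstrap`: it exports the
    given marker variables before the app would be configured and
    restores the previous environment on exit."""
    env = env or {}
    previous = {key: os.environ.get(key) for key in env}
    os.environ.update(env)
    try:
        yield {'ini_path': ini_path}
    finally:
        for key, value in previous.items():
            if value is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = value

with bootstrap('rhodecode.ini', env={'RC_CMD_SSH_WRAPPER': '1'}):
    marker_inside = os.environ.get('RC_CMD_SSH_WRAPPER')
```

Restoring the environment on exit keeps the marker from leaking into unrelated code paths of the same process.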
@@ -103,7 +103,7 b' class TestSubversionServer(object):' | |||
|
103 | 103 | return_value=0): |
|
104 | 104 | with mock.patch.object( |
|
105 | 105 | SubversionTunnelWrapper, 'command', |
|
106 | return_value='date'): | |
|
106 | return_value=['date']): | |
|
107 | 107 | |
|
108 | 108 | exit_code = server.run() |
|
109 | 109 | # SVN has this differently configured, and we get in our mock env |
@@ -115,7 +115,7 b' class TestSubversionServer(object):' | |||
|
115 | 115 | from rhodecode.apps.ssh_support.lib.backends.svn import SubversionTunnelWrapper |
|
116 | 116 | with mock.patch.object( |
|
117 | 117 | SubversionTunnelWrapper, 'command', |
|
118 | return_value='date'): | |
|
118 | return_value=['date']): | |
|
119 | 119 | with mock.patch.object( |
|
120 | 120 | SubversionTunnelWrapper, 'get_first_client_response', |
|
121 | 121 | return_value=None): |
@@ -64,6 +64,9 b' RequestHeader edit Destination ^https: h' | |||
|
64 | 64 | SVNParentPath "${parent_path_root|n}" |
|
65 | 65 | SVNListParentPath ${"On" if svn_list_parent_path else "Off"|n} |
|
66 | 66 | |
|
67 | # use specific SVN conf/authz file for each repository | |
|
68 | #AuthzSVNReposRelativeAccessFile authz | |
|
69 | ||
|
67 | 70 | Allow from all |
|
68 | 71 | Order allow,deny |
|
69 | 72 | </Location> |
@@ -82,6 +85,9 b' RequestHeader edit Destination ^https: h' | |||
|
82 | 85 | SVNParentPath "${parent_path|n}" |
|
83 | 86 | SVNListParentPath ${"On" if svn_list_parent_path else "Off"|n} |
|
84 | 87 | |
|
88 | # use specific SVN conf/authz file for each repository | |
|
89 | #AuthzSVNReposRelativeAccessFile authz | |
|
90 | ||
|
85 | 91 | Allow from all |
|
86 | 92 | Order allow,deny |
|
87 | 93 | </Location> |
@@ -18,6 +18,7 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | import os | |
|
21 | 22 | import logging |
|
22 | 23 | import importlib |
|
23 | 24 | |
@@ -71,9 +72,12 b' def _discover_legacy_plugins(config, pre' | |||
|
71 | 72 | setting in database which are using the specified prefix. Normally 'py:' is |
|
72 | 73 | used for the legacy plugins. |
|
73 | 74 | """ |
|
75 | try: | |
|
74 | 76 | auth_plugins = SettingsModel().get_setting_by_name('auth_plugins') |
|
75 | 77 | enabled_plugins = auth_plugins.app_settings_value |
|
76 | 78 | legacy_plugins = [id_ for id_ in enabled_plugins if id_.startswith(prefix)] |
|
79 | except Exception: | |
|
80 | legacy_plugins = [] | |
|
77 | 81 | |
|
78 | 82 | for plugin_id in legacy_plugins: |
|
79 | 83 | log.debug('Legacy plugin discovered: "%s"', plugin_id) |
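The try/except added around the legacy-plugin lookup above makes discovery degrade to "no legacy plugins" when the settings storage is not readable yet. A sketch of that fallback, with the reader callable and function name being hypothetical (the real code queries `SettingsModel` directly):

```python
def discover_legacy_plugins(read_enabled_plugins, prefix='py:'):
    """If the settings table cannot be read yet (e.g. during initial
    setup or a DB upgrade), behave as if no legacy plugins are enabled
    instead of failing application start-up."""
    try:
        enabled_plugins = read_enabled_plugins()
        legacy_plugins = [id_ for id_ in enabled_plugins
                          if id_.startswith(prefix)]
    except Exception:
        legacy_plugins = []
    return legacy_plugins

def broken_reader():
    raise RuntimeError('settings table missing')
```

This mirrors the guard in the hunk: the `except Exception` is deliberately broad because any storage-layer failure should be non-fatal here.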
@@ -117,6 +121,11 b' def includeme(config):' | |||
|
117 | 121 | route_name='auth_home', |
|
118 | 122 | context=AuthnRootResource) |
|
119 | 123 | |
|
124 | for key in ['RC_CMD_SETUP_RC', 'RC_CMD_UPGRADE_DB', 'RC_CMD_SSH_WRAPPER']: | |
|
125 | if os.environ.get(key): | |
|
126 | # skip this heavy step below on certain CLI commands | |
|
127 | return | |
|
128 | ||
|
120 | 129 | # Auto discover authentication plugins and include their configuration. |
|
121 | 130 | _discover_plugins(config) |
|
122 | 131 | _discover_legacy_plugins(config) |
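The early-return block added to `includeme` above skips plugin auto-discovery when one of the CLI marker variables is set. The gate itself, extracted into a hypothetical predicate for illustration:

```python
CLI_MARKERS = ('RC_CMD_SETUP_RC', 'RC_CMD_UPGRADE_DB', 'RC_CMD_SSH_WRAPPER')

def should_skip_plugin_discovery(environ, markers=CLI_MARKERS):
    """CLI entry points export one of the marker variables; for those
    commands the expensive plugin auto-discovery step is skipped."""
    return any(environ.get(key) for key in markers)
```

Note that an empty-string value counts as unset, matching the truthiness check `if os.environ.get(key)` in the diff.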
@@ -171,11 +171,6 b' class RhodeCodeAuthPluginBase(object):' | |||
|
171 | 171 | db_type = '{}.encrypted'.format(db_type) |
|
172 | 172 | return db_type |
|
173 | 173 | |
|
174 | @LazyProperty | |
|
175 | def plugin_settings(self): | |
|
176 | settings = SettingsModel().get_all_settings() | |
|
177 | return settings | |
|
178 | ||
|
179 | 174 | def is_enabled(self): |
|
180 | 175 | """ |
|
181 | 176 | Returns true if this plugin is enabled. An enabled plugin can be |
@@ -185,12 +180,13 b' class RhodeCodeAuthPluginBase(object):' | |||
|
185 | 180 | auth_plugins = SettingsModel().get_auth_plugins() |
|
186 | 181 | return self.get_id() in auth_plugins |
|
187 | 182 | |
|
188 | def is_active(self): | |
|
183 | def is_active(self, plugin_cached_settings=None): | |
|
189 | 184 | """ |
|
190 | 185 | Returns true if the plugin is activated. An activated plugin is |
|
191 | 186 | consulted during authentication, assumed it is also enabled. |
|
192 | 187 | """ |
|
193 | return self.get_setting_by_name('enabled') | |
|
188 | return self.get_setting_by_name( | |
|
189 | 'enabled', plugin_cached_settings=plugin_cached_settings) | |
|
194 | 190 | |
|
195 | 191 | def get_id(self): |
|
196 | 192 | """ |
@@ -210,13 +206,24 b' class RhodeCodeAuthPluginBase(object):' | |||
|
210 | 206 | """ |
|
211 | 207 | return AuthnPluginSettingsSchemaBase() |
|
212 | 208 | |
|
213 | def get_setting_by_name(self, name, default=None, cache=True): | |
|
209 | def get_settings(self): | |
|
210 | """ | |
|
211 | Returns the plugin settings as dictionary. | |
|
212 | """ | |
|
213 | settings = {} | |
|
214 | raw_settings = SettingsModel().get_all_settings() | |
|
215 | for node in self.get_settings_schema(): | |
|
216 | settings[node.name] = self.get_setting_by_name( | |
|
217 | node.name, plugin_cached_settings=raw_settings) | |
|
218 | return settings | |
|
219 | ||
|
220 | def get_setting_by_name(self, name, default=None, plugin_cached_settings=None): | |
|
214 | 221 | """ |
|
215 | 222 | Returns a plugin setting by name. |
|
216 | 223 | """ |
|
217 | 224 | full_name = 'rhodecode_{}'.format(self._get_setting_full_name(name)) |
|
218 | if cache: | |
|
219 | plugin_settings = self.plugin_settings | |
|
225 | if plugin_cached_settings: | |
|
226 | plugin_settings = plugin_cached_settings | |
|
220 | 227 | else: |
|
221 | 228 | plugin_settings = SettingsModel().get_all_settings() |
|
222 | 229 | |
@@ -235,15 +242,6 b' class RhodeCodeAuthPluginBase(object):' | |||
|
235 | 242 | full_name, value, type_) |
|
236 | 243 | return db_setting.app_settings_value |
|
237 | 244 | |
|
238 | def get_settings(self): | |
|
239 | """ | |
|
240 | Returns the plugin settings as dictionary. | |
|
241 | """ | |
|
242 | settings = {} | |
|
243 | for node in self.get_settings_schema(): | |
|
244 | settings[node.name] = self.get_setting_by_name(node.name) | |
|
245 | return settings | |
|
246 | ||
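The `plugin_cached_settings` parameter introduced above lets callers fetch the full settings dict once and share it across every per-plugin lookup, instead of one DB query per setting. A sketch with a counting stand-in for `SettingsModel` (class and key names here are illustrative):

```python
class FakeSettingsModel:
    """Stand-in for SettingsModel that counts full settings fetches."""
    fetches = 0

    @classmethod
    def get_all_settings(cls):
        cls.fetches += 1
        return {'rhodecode_auth_ldap_enabled': True,
                'rhodecode_auth_crowd_enabled': False}

def get_setting_by_name(full_name, plugin_cached_settings=None):
    # an already-fetched settings dict can be handed in; only fall back
    # to a fresh (DB) fetch when the caller did not provide one
    if plugin_cached_settings:
        plugin_settings = plugin_cached_settings
    else:
        plugin_settings = FakeSettingsModel.get_all_settings()
    return plugin_settings.get(full_name)

# one fetch is shared across every per-plugin lookup
raw_settings = FakeSettingsModel.get_all_settings()
states = [get_setting_by_name(name, plugin_cached_settings=raw_settings)
          for name in ('rhodecode_auth_ldap_enabled',
                       'rhodecode_auth_crowd_enabled')]
```

This is exactly the pattern the plugin registry hunk later in the diff uses: it fetches `raw_settings` once and passes it to every `is_active()` call.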
|
247 | 245 | def log_safe_settings(self, settings): |
|
248 | 246 | """ |
|
249 | 247 | returns a log safe representation of settings, without any secrets |
@@ -625,12 +623,13 b' def authenticate(username, password, env' | |||
|
625 | 623 | 'headers plugin, skipping...', plugin.get_id()) |
|
626 | 624 | continue |
|
627 | 625 | |
|
626 | log.debug('Trying authentication using ** %s **', plugin.get_id()) | |
|
627 | ||
|
628 | 628 | # load plugin settings from RhodeCode database |
|
629 | 629 | plugin_settings = plugin.get_settings() |
|
630 | 630 | plugin_sanitized_settings = plugin.log_safe_settings(plugin_settings) |
|
631 | log.debug('Plugin settings:%s', plugin_sanitized_settings) | |
|
631 | log.debug('Plugin `%s` settings:%s', plugin.get_id(), plugin_sanitized_settings) | |
|
632 | 632 | |
|
633 | log.debug('Trying authentication using ** %s **', plugin.get_id()) | |
|
634 | 633 | # use plugin's method of user extraction. |
|
635 | 634 | user = plugin.get_user(username, environ=environ, |
|
636 | 635 | settings=plugin_settings) |
@@ -684,7 +683,8 b' def authenticate(username, password, env' | |||
|
684 | 683 | environ=environ or {}) |
|
685 | 684 | |
|
686 | 685 | if plugin_cache_active: |
|
687 | log.debug('Trying to fetch cached auth by `...%s`', _password_hash[:6]) | |
|
686 | log.debug('Trying to fetch cached auth by pwd hash `...%s`', | |
|
687 | _password_hash[:6]) | |
|
688 | 688 | plugin_user = cache_manager.get( |
|
689 | 689 | _password_hash, createfunc=auth_func) |
|
690 | 690 | else: |
@@ -281,5 +281,5 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
281 | 281 | if group in user_attrs["groups"]: |
|
282 | 282 | user_attrs["admin"] = True |
|
283 | 283 | log.debug("Final crowd user object: \n%s" % (formatted_json(user_attrs))) |
|
284 | log.info('user %s authenticated correctly' % user_attrs['username']) | |
|
284 | log.info('user `%s` authenticated correctly' % user_attrs['username']) | |
|
285 | 285 | return user_attrs |
@@ -163,5 +163,5 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
163 | 163 | 'extern_type': extern_type, |
|
164 | 164 | } |
|
165 | 165 | |
|
166 | log.info('user %s authenticated correctly' % user_attrs['username']) | |
|
166 | log.info('user `%s` authenticated correctly' % user_attrs['username']) | |
|
167 | 167 | return user_attrs |
@@ -22,10 +22,12 b'' | |||
|
22 | 22 | RhodeCode authentication plugin for LDAP |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | ||
|
25 | import socket | |
|
26 | import re | |
|
26 | 27 | import colander |
|
27 | 28 | import logging |
|
28 | 29 | import traceback |
|
30 | import string | |
|
29 | 31 | |
|
30 | 32 | from rhodecode.translation import _ |
|
31 | 33 | from rhodecode.authentication.base import ( |
@@ -50,6 +52,9 b' except ImportError:' | |||
|
50 | 52 | ldap = Missing |
|
51 | 53 | |
|
52 | 54 | |
|
55 | class LdapError(Exception): | |
|
56 | pass | |
|
57 | ||
|
53 | 58 | def plugin_factory(plugin_id, *args, **kwds): |
|
54 | 59 | """ |
|
55 | 60 | Factory function that is called during plugin discovery. |
@@ -86,6 +91,16 b' class LdapSettingsSchema(AuthnPluginSett' | |||
|
86 | 91 | title=_('Port'), |
|
87 | 92 | validator=colander.Range(min=0, max=65536), |
|
88 | 93 | widget='int') |
|
94 | ||
|
95 | timeout = colander.SchemaNode( | |
|
96 | colander.Int(), | |
|
97 | default=60 * 5, | |
|
98 | description=_('Timeout for LDAP connection'), | |
|
99 | preparer=strip_whitespace, | |
|
100 | title=_('Connection timeout'), | |
|
101 | validator=colander.Range(min=1), | |
|
102 | widget='int') | |
|
103 | ||
|
89 | 104 | dn_user = colander.SchemaNode( |
|
90 | 105 | colander.String(), |
|
91 | 106 | default='', |
@@ -187,19 +202,47 b' class LdapSettingsSchema(AuthnPluginSett' | |||
|
187 | 202 | class AuthLdap(object): |
|
188 | 203 | |
|
189 | 204 | def _build_servers(self): |
|
205 | def host_resolver(host, port): | |
|
206 | """ | |
|
207 | Main work for this function is to prevent ldap connection issues, | |
|
208 | and detect them early using a "greenified" sockets | |
|
209 | """ | |
|
210 | host = host.strip() | |
|
211 | ||
|
212 | log.info('Resolving LDAP host %s', host) | |
|
213 | try: | |
|
214 | ip = socket.gethostbyname(host) | |
|
215 | log.info('Got LDAP server %s ip %s', host, ip) | |
|
216 | except Exception: | |
|
217 | raise LdapConnectionError( | |
|
218 | 'Failed to resolve host: `{}`'.format(host)) | |
|
219 | ||
|
220 | log.info('Checking LDAP IP access %s', ip) | |
|
221 | s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) | |
|
222 | try: | |
|
223 | s.connect((ip, int(port))) | |
|
224 | s.shutdown(socket.SHUT_RD) | |
|
225 | except Exception: | |
|
226 | raise LdapConnectionError( | |
|
227 | 'Failed to connect to host: `{}:{}`'.format(host, port)) | |
|
228 | ||
|
229 | return '{}:{}'.format(host, port) | |
|
230 | ||
|
231 | port = self.LDAP_SERVER_PORT | |
|
190 | 232 | return ', '.join( |
|
191 | ["{}://{}:{}".format( | |

192 | self.ldap_server_type, host.strip(), self.LDAP_SERVER_PORT) | |
|
233 | ["{}://{}".format( | |
|
234 | self.ldap_server_type, host_resolver(host, port)) | |
|
193 | 235 | for host in self.SERVER_ADDRESSES]) |
|
194 | 236 | |
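The `host_resolver` helper added above resolves each configured LDAP host and TCP-probes the port before any bind is attempted, so unreachable servers fail fast. A self-contained sketch, using `RuntimeError` as a stand-in for the diff's `LdapConnectionError`:

```python
import socket

def host_resolver(host, port, connect_timeout=5):
    """Resolve an LDAP host and TCP-probe the port up front, so broken
    servers fail fast with a clear error instead of hanging later."""
    host = host.strip()
    try:
        ip = socket.gethostbyname(host)
    except Exception:
        raise RuntimeError('Failed to resolve host: `{}`'.format(host))

    probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    probe.settimeout(connect_timeout)
    try:
        probe.connect((ip, int(port)))
        probe.shutdown(socket.SHUT_RD)
    except Exception:
        raise RuntimeError(
            'Failed to connect to host: `{}:{}`'.format(host, port))
    finally:
        probe.close()
    return '{}:{}'.format(host, port)
```

The probe socket is closed in `finally` in this sketch; the diff itself only shuts the read side down, relying on garbage collection for the rest.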
|
195 | 237 | def __init__(self, server, base_dn, port=389, bind_dn='', bind_pass='', |
|
196 | 238 | tls_kind='PLAIN', tls_reqcert='DEMAND', ldap_version=3, |
|
197 | 239 | search_scope='SUBTREE', attr_login='uid', |
|
198 | ldap_filter=''): | |
|
240 | ldap_filter='', timeout=None): | |
|
199 | 241 | if ldap == Missing: |
|
200 | 242 | raise LdapImportError("Missing or incompatible ldap library") |
|
201 | 243 | |
|
202 | 244 | self.debug = False |
|
245 | self.timeout = timeout or 60 * 5 | |
|
203 | 246 | self.ldap_version = ldap_version |
|
204 | 247 | self.ldap_server_type = 'ldap' |
|
205 | 248 | |
@@ -226,34 +269,40 b' class AuthLdap(object):' | |||
|
226 | 269 | self.BASE_DN = safe_str(base_dn) |
|
227 | 270 | self.LDAP_FILTER = safe_str(ldap_filter) |
|
228 | 271 | |
|
229 | def _get_ldap_server(self): | |
|
272 | def _get_ldap_conn(self): | |
|
273 | log.debug('initializing LDAP connection to:%s', self.LDAP_SERVER) | |
|
274 | ||
|
230 | 275 | if self.debug: |
|
231 | 276 | ldap.set_option(ldap.OPT_DEBUG_LEVEL, 255) |
|
277 | ||
|
232 | 278 | if hasattr(ldap, 'OPT_X_TLS_CACERTDIR'): |
|
233 | ldap.set_option(ldap.OPT_X_TLS_CACERTDIR, | |
|
234 | '/etc/openldap/cacerts') | |
|
279 | ldap.set_option(ldap.OPT_X_TLS_CACERTDIR, '/etc/openldap/cacerts') | |
|
280 | if self.TLS_KIND != 'PLAIN': | |
|
281 | ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, self.TLS_REQCERT) | |
|
282 | ||
|
235 | 283 | ldap.set_option(ldap.OPT_REFERRALS, ldap.OPT_OFF) |
|
236 | 284 | ldap.set_option(ldap.OPT_RESTART, ldap.OPT_ON) |
|
237 | ldap.set_option(ldap.OPT_NETWORK_TIMEOUT, 60 * 10) | |
|
238 | ldap.set_option(ldap.OPT_TIMEOUT, 60 * 10) | |
|
239 | 285 | |
|
240 | if self.TLS_KIND != 'PLAIN': | |
|
241 | ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, self.TLS_REQCERT) | |
|
242 | server = ldap.initialize(self.LDAP_SERVER) | |
|
286 | # init connection now | |
|
287 | ldap_conn = ldap.initialize(self.LDAP_SERVER) | |
|
288 | ldap_conn.set_option(ldap.OPT_NETWORK_TIMEOUT, self.timeout) | |
|
289 | ldap_conn.set_option(ldap.OPT_TIMEOUT, self.timeout) | |
|
290 | ldap_conn.timeout = self.timeout | |
|
291 | ||
|
243 | 292 | if self.ldap_version == 2: |
|
244 | server.protocol = ldap.VERSION2 | |
|
293 | ldap_conn.protocol = ldap.VERSION2 | |
|
245 | 294 | else: |
|
246 | server.protocol = ldap.VERSION3 | |
|
295 | ldap_conn.protocol = ldap.VERSION3 | |
|
247 | 296 | |
|
248 | 297 | if self.TLS_KIND == 'START_TLS': |
|
249 | server.start_tls_s() | |
|
298 | ldap_conn.start_tls_s() | |
|
250 | 299 | |
|
251 | 300 | if self.LDAP_BIND_DN and self.LDAP_BIND_PASS: |
|
252 | 301 | log.debug('Trying simple_bind with password and given login DN: %s', |
|
253 | 302 | self.LDAP_BIND_DN) |
|
254 | server.simple_bind_s(self.LDAP_BIND_DN, self.LDAP_BIND_PASS) | |
|
303 | ldap_conn.simple_bind_s(self.LDAP_BIND_DN, self.LDAP_BIND_PASS) | |
|
255 | 304 | |
|
256 | return server | |
|
305 | return ldap_conn | |
|
257 | 306 | |
|
258 | 307 | def get_uid(self, username): |
|
259 | 308 | uid = username |
@@ -295,13 +344,14 b' class AuthLdap(object):' | |||
|
295 | 344 | if "," in username: |
|
296 | 345 | raise LdapUsernameError( |
|
297 | 346 | "invalid character `,` in username: `{}`".format(username)) |
|
347 | ldap_conn = None | |
|
298 | 348 | try: |
|
299 |
|
|
|
349 | ldap_conn = self._get_ldap_conn() | |
|
300 | 350 | filter_ = '(&%s(%s=%s))' % ( |
|
301 | 351 | self.LDAP_FILTER, self.attr_login, username) |
|
302 | 352 | log.debug("Authenticating %r filter %s at %s", self.BASE_DN, |
|
303 | 353 | filter_, self.LDAP_SERVER) |
|
304 |
lobjects = |
|
|
354 | lobjects = ldap_conn.search_ext_s( | |
|
305 | 355 | self.BASE_DN, self.SEARCH_SCOPE, filter_) |
|
306 | 356 | |
|
307 | 357 | if not lobjects: |
@@ -315,7 +365,7 b' class AuthLdap(object):' | |||
|
315 | 365 | continue |
|
316 | 366 | |
|
317 | 367 | user_attrs = self.fetch_attrs_from_simple_bind( |
|
318 |
|
|
|
368 | ldap_conn, dn, username, password) | |
|
319 | 369 | if user_attrs: |
|
320 | 370 | break |
|
321 | 371 | |
@@ -333,6 +383,15 b' class AuthLdap(object):' | |||
|
333 | 383 | raise LdapConnectionError( |
|
334 | 384 | "LDAP can't access authentication " |
|
335 | 385 | "server, org_exc:%s" % org_exc) |
|
386 | finally: | |
|
387 | if ldap_conn: | |
|
388 | log.debug('ldap: connection release') | |
|
389 | try: | |
|
390 | ldap_conn.unbind_s() | |
|
391 | except Exception: | |
|
392 | # for any reason this can raise exception we must catch it | |
|
393 | # to not crash the server | |
|
394 | pass | |
|
336 | 395 | |
|
337 | 396 | return dn, user_attrs |
|
338 | 397 | |
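The `finally` block added above guarantees the LDAP connection is unbound on every path, and swallows unbind errors so a dying connection cannot take the server down with it. A sketch of the same release pattern using a hypothetical fake connection object in place of python-ldap:

```python
class FakeLdapConn:
    """Minimal stand-in for a python-ldap connection object."""
    def __init__(self, fail_search=False):
        self.unbound = False
        self.fail_search = fail_search

    def search_ext_s(self):
        if self.fail_search:
            raise RuntimeError('server went away')
        return [('uid=alice,dc=example,dc=com', {'uid': [b'alice']})]

    def unbind_s(self):
        self.unbound = True

def search_with_release(conn):
    # the connection is released in `finally`, and unbind errors are
    # swallowed so a dying connection cannot crash the server
    try:
        return conn.search_ext_s()
    except RuntimeError:
        return []
    finally:
        try:
            conn.unbind_s()
        except Exception:
            pass
```

Whether the search succeeds or raises, `unbind_s()` runs exactly once.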
@@ -429,6 +488,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
429 | 488 | 'attr_login': settings.get('attr_login'), |
|
430 | 489 | 'ldap_version': 3, |
|
431 | 490 | 'ldap_filter': settings.get('filter'), |
|
491 | 'timeout': settings.get('timeout') | |
|
432 | 492 | } |
|
433 | 493 | |
|
434 | 494 | ldap_attrs = self.try_dynamic_binding(username, password, ldap_args) |
@@ -469,7 +529,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
469 | 529 | 'extern_type': extern_type, |
|
470 | 530 | } |
|
471 | 531 | log.debug('ldap user: %s', user_attrs) |
|
472 | log.info('user %s authenticated correctly', user_attrs['username']) | |
|
532 | log.info('user `%s` authenticated correctly', user_attrs['username']) | |
|
473 | 533 | |
|
474 | 534 | return user_attrs |
|
475 | 535 | |
@@ -479,3 +539,4 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
479 | 539 | except (Exception,): |
|
480 | 540 | log.exception("Other exception") |
|
481 | 541 | return None |
|
542 |
@@ -157,5 +157,5 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
157 | 157 | pass |
|
158 | 158 | |
|
159 | 159 | log.debug("pamuser: %s", user_attrs) |
|
160 | log.info('user %s authenticated correctly' % user_attrs['username']) | |
|
160 | log.info('user `%s` authenticated correctly' % user_attrs['username']) | |
|
161 | 161 | return user_attrs |
@@ -127,14 +127,14 b' class RhodeCodeAuthPlugin(RhodeCodeAuthP' | |||
|
127 | 127 | |
|
128 | 128 | if userobj.username == User.DEFAULT_USER and userobj.active: |
|
129 | 129 | log.info( |
|
130 | 'user %s authenticated correctly as anonymous user', userobj) | |
|
130 | 'user `%s` authenticated correctly as anonymous user', userobj.username) | |
|
131 | 131 | return user_attrs |
|
132 | 132 | |
|
133 | 133 | elif userobj.username == username and password_match: |
|
134 | log.info('user %s authenticated correctly', userobj) | |
|
134 | log.info('user `%s` authenticated correctly', userobj.username) | |
|
135 | 135 | return user_attrs |
|
136 | log.warn("user %s used a wrong password when " | |
|
137 | "authenticating on this plugin", userobj) | |
|
136 | log.warn("user `%s` used a wrong password when " | |
|
137 | "authenticating on this plugin", userobj.username) | |
|
138 | 138 | return None |
|
139 | 139 | else: |
|
140 | 140 | log.warning( |
@@ -137,7 +137,7 b' class RhodeCodeAuthPlugin(RhodeCodeAuthP' | |||
|
137 | 137 | 'user `%s` successfully authenticated via %s', |
|
138 | 138 | user_attrs['username'], self.name) |
|
139 | 139 | return user_attrs |
|
140 | log.warning( | |
|
140 | log.warn( | |
|
141 | 141 | 'user `%s` failed to authenticate via %s, reason: bad or ' |
|
142 | 142 | 'inactive token.', username, self.name) |
|
143 | 143 | else: |
@@ -70,9 +70,11 b' class AuthenticationPluginRegistry(objec' | |||
|
70 | 70 | # Add all enabled and active plugins to the list. We iterate over the |
|
71 | 71 | # auth_plugins setting from DB because it also represents the ordering. |
|
72 | 72 | enabled_plugins = SettingsModel().get_auth_plugins() |
|
73 | raw_settings = SettingsModel().get_all_settings() | |
|
73 | 74 | for plugin_id in enabled_plugins: |
|
74 | 75 | plugin = self.get_plugin(plugin_id) |
|
75 | if plugin is not None and plugin.is_active(): | |
|
76 | if plugin is not None and plugin.is_active( | |
|
77 | plugin_cached_settings=raw_settings): | |
|
76 | 78 | plugins.append(plugin) |
|
77 | 79 | |
|
78 | 80 | # Add the fallback plugin from ini file. |
@@ -63,7 +63,7 b' class AuthnPluginViewBase(BaseAppView):' | |||
|
63 | 63 | for node in schema: |
|
64 | 64 | if node.name not in defaults: |
|
65 | 65 | defaults[node.name] = self.plugin.get_setting_by_name( |
|
66 | node.name, node.default, cache=False) | |
|
66 | node.name, node.default) | |
|
67 | 67 | |
|
68 | 68 | template_context = { |
|
69 | 69 | 'defaults': defaults, |
@@ -145,7 +145,7 b' class AuthSettingsView(BaseAppView):' | |||
|
145 | 145 | |
|
146 | 146 | # Create form default values and fill the form. |
|
147 | 147 | form_defaults = { |
|
148 | 'auth_plugins': ','.join(enabled_plugins) | |
|
148 | 'auth_plugins': ',\n'.join(enabled_plugins) | |
|
149 | 149 | } |
|
150 | 150 | form_defaults.update(defaults) |
|
151 | 151 | html = formencode.htmlfill.render( |
@@ -226,6 +226,7 b' def includeme(config):' | |||
|
226 | 226 | config.include('rhodecode.apps.user_group') |
|
227 | 227 | config.include('rhodecode.apps.search') |
|
228 | 228 | config.include('rhodecode.apps.user_profile') |
|
229 | config.include('rhodecode.apps.user_group_profile') | |
|
229 | 230 | config.include('rhodecode.apps.my_account') |
|
230 | 231 | config.include('rhodecode.apps.svn_support') |
|
231 | 232 | config.include('rhodecode.apps.ssh_support') |
@@ -37,6 +37,7 b' class PullRequestEvent(RepoEvent):' | |||
|
37 | 37 | self.pullrequest = pullrequest |
|
38 | 38 | |
|
39 | 39 | def as_dict(self): |
|
40 | from rhodecode.lib.utils2 import md5_safe | |
|
40 | 41 | from rhodecode.model.pull_request import PullRequestModel |
|
41 | 42 | data = super(PullRequestEvent, self).as_dict() |
|
42 | 43 | |
@@ -46,6 +47,9 b' class PullRequestEvent(RepoEvent):' | |||
|
46 | 47 | repos=[self.pullrequest.source_repo] |
|
47 | 48 | ) |
|
48 | 49 | issues = _issues_as_dict(commits) |
|
50 | # calculate hashes of all commits for unique identifier of commits | |
|
51 | # inside that pull request | |
|
52 | commits_hash = md5_safe(':'.join(x.get('raw_id', '') for x in commits)) | |
|
49 | 53 | |
|
50 | 54 | data.update({ |
|
51 | 55 | 'pullrequest': { |
@@ -56,7 +60,10 b' class PullRequestEvent(RepoEvent):' | |||
|
56 | 60 | self.pullrequest, request=self.request), |
|
57 | 61 | 'permalink_url': PullRequestModel().get_url( |
|
58 | 62 | self.pullrequest, request=self.request, permalink=True), |
|
63 | 'shadow_url': PullRequestModel().get_shadow_clone_url( | |
|
64 | self.pullrequest, request=self.request), | |
|
59 | 65 | 'status': self.pullrequest.calculated_review_status(), |
|
66 | 'commits_uid': commits_hash, | |
|
60 | 67 | 'commits': commits, |
|
61 | 68 | } |
|
62 | 69 | }) |
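The `commits_uid` added to the pull-request event above is an md5 over the colon-joined raw ids of all commits in the PR, so any update that changes the commit set changes the uid. A sketch with a local stand-in for `rhodecode.lib.utils2.md5_safe`:

```python
import hashlib

def md5_safe(value):
    # stand-in for rhodecode.lib.utils2.md5_safe
    return hashlib.md5(value.encode('utf-8')).hexdigest()

def commits_uid(commits):
    """Fingerprint of the commit set carried by a pull request; any
    update that changes the commits produces a different uid."""
    return md5_safe(':'.join(c.get('raw_id', '') for c in commits))
```

Integrations can use the uid as a cheap change detector without diffing commit lists themselves.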
@@ -132,7 +132,8 b' def _commits_as_dict(event, commit_ids, ' | |||
|
132 | 132 | |
|
133 | 133 | missing_commits = set(commit_ids) - set(c['raw_id'] for c in commits) |
|
134 | 134 | if missing_commits: |
|
135 | log.error('missing commits: %s' % ', '.join(missing_commits)) | |
|
135 | log.error('Inconsistent repository state. ' | |
|
136 | 'Missing commits: %s' % ', '.join(missing_commits)) | |
|
136 | 137 | |
|
137 | 138 | return commits |
|
138 | 139 |
@@ -45,6 +45,8 b' integration_type_registry.register_integ' | |||
|
45 | 45 | base.EEIntegration('Jira Issues integration', 'jira')) |
|
46 | 46 | integration_type_registry.register_integration_type( |
|
47 | 47 | base.EEIntegration('Redmine Tracker integration', 'redmine')) |
|
48 | integration_type_registry.register_integration_type( | |
|
49 | base.EEIntegration('Jenkins CI integration', 'jenkins')) | |
|
48 | 50 | |
|
49 | 51 | |
|
50 | 52 | def integrations_event_handler(event): |
@@ -19,14 +19,26 b'' | |||
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | import colander |
|
22 | import string | |
|
23 | import collections | |
|
24 | import logging | |
|
25 | ||
|
26 | from mako import exceptions | |
|
27 | ||
|
22 | 28 | from rhodecode.translation import _ |
|
23 | 29 | |
|
24 | 30 | |
|
31 | log = logging.getLogger(__name__) | |
|
32 | ||
|
33 | ||
|
25 | 34 | class IntegrationTypeBase(object): |
|
26 | 35 | """ Base class for IntegrationType plugins """ |
|
27 | 36 | is_dummy = False |
|
28 | 37 | description = '' |
|
29 | icon = ''' | |
|
38 | ||
|
39 | @classmethod | |
|
40 | def icon(cls): | |
|
41 | return ''' | |
|
30 | 42 | <?xml version="1.0" encoding="UTF-8" standalone="no"?> |
|
31 | 43 | <svg |
|
32 | 44 | xmlns:dc="http://purl.org/dc/elements/1.1/" |
@@ -35,7 +47,7 b' class IntegrationTypeBase(object):' | |||
|
35 | 47 | xmlns:svg="http://www.w3.org/2000/svg" |
|
36 | 48 | xmlns="http://www.w3.org/2000/svg" |
|
37 | 49 | xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" |
|
38 | xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" | |
|
50 | xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" | |
|
39 | 51 | viewBox="0 -256 1792 1792" |
|
40 | 52 | id="svg3025" |
|
41 | 53 | version="1.1" |
@@ -108,3 +120,192 b' class EEIntegration(IntegrationTypeBase)' | |||
|
108 | 120 | def __init__(self, name, key, settings=None): |
|
109 | 121 | self.display_name = name |
|
110 | 122 | self.key = key |
|
123 | super(EEIntegration, self).__init__(settings) | |
|
124 | ||
|
125 | ||
|
126 | # Helpers # | |
|
127 | WEBHOOK_URL_VARS = [ | |
|
128 | ('event_name', 'Unique name of the event type, e.g. pullrequest-update'), | |
|
129 | ('repo_name', 'Full name of the repository'), | |
|
130 | ('repo_type', 'VCS type of repository'), | |
|
131 | ('repo_id', 'Unique id of repository'), | |
|
132 | ('repo_url', 'Repository url'), | |
|
133 | # extra repo fields | |
|
134 | ('extra:<extra_key_name>', 'Extra repo variables, read from its settings.'), | |
|
135 | ||
|
136 | # special attrs below that we handle, using multi-call | |
|
137 | ('branch', 'Name of each branch submitted, if any.'), | |
|
138 | ('commit_id', 'Id of each commit submitted, if any.'), | |
|
139 | ||
|
140 | # pr events vars | |
|
141 | ('pull_request_id', 'Unique ID of the pull request.'), | |
|
142 | ('pull_request_title', 'Title of the pull request.'), | |
|
143 | ('pull_request_url', 'Pull request url.'), | |
|
144 | ('pull_request_shadow_url', 'Pull request shadow repo clone url.'), | |
|
145 | ('pull_request_commits_uid', 'Calculated UID of all commits inside the PR. ' | |
|
146 | 'Changes after PR update'), | |
|
147 | ||
|
148 | # user who triggers the call | |
|
149 | ('username', 'User who triggered the call.'), | |
|
150 | ('user_id', 'User id who triggered the call.'), | |
|
151 | ] | |
|
152 | ||
|
153 | # common vars for url template used for CI plugins. Shared with webhook | |
|
154 | CI_URL_VARS = WEBHOOK_URL_VARS | |
|
155 | ||
|
156 | ||
|
157 | class CommitParsingDataHandler(object): | |
|
158 | ||
|
159 | def aggregate_branch_data(self, branches, commits): | |
|
160 | branch_data = collections.OrderedDict() | |
|
161 | for obj in branches: | |
|
162 | branch_data[obj['name']] = obj | |
|
163 | ||
|
164 | branches_commits = collections.OrderedDict() | |
|
165 | for commit in commits: | |
|
166 | if commit.get('git_ref_change'): | |
|
167 | # special case for GIT that allows creating tags, | |
|
168 | # deleting branches without associated commit | |
|
169 | continue | |
|
170 | commit_branch = commit['branch'] | |
|
171 | ||
|
172 | if commit_branch not in branches_commits: | |
|
173 | _branch = branch_data[commit_branch] \ | |
|
174 | if commit_branch else commit_branch | |
|
175 | branch_commits = {'branch': _branch, | |
|
176 | 'commits': []} | |
|
177 | branches_commits[commit_branch] = branch_commits | |
|
178 | ||
|
179 | branch_commits = branches_commits[commit_branch] | |
|
180 | branch_commits['commits'].append(commit) | |
|
181 | return branches_commits | |
|
182 | ||
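The new `CommitParsingDataHandler.aggregate_branch_data` above groups pushed commits under their branch metadata and skips bare Git ref changes (tag creation, branch deletion). A stdlib-only sketch of the same grouping — the `.get` fallback for unknown branch names is a simplification of the handler's empty-branch handling, and the sample dicts are invented:

```python
from collections import OrderedDict

def aggregate_branch_data(branches, commits):
    # index branch metadata by name, preserving push order
    branch_data = OrderedDict((b['name'], b) for b in branches)

    grouped = OrderedDict()
    for commit in commits:
        if commit.get('git_ref_change'):
            # tag creation / branch deletion carries no commit payload
            continue
        name = commit['branch']
        if name not in grouped:
            grouped[name] = {'branch': branch_data.get(name, name),
                             'commits': []}
        grouped[name]['commits'].append(commit)
    return grouped

branches = [{'name': 'default', 'url': 'http://example.invalid/branches/default'}]
commits = [
    {'branch': 'default', 'raw_id': 'aaa111'},
    {'branch': 'default', 'raw_id': 'bbb222'},
    {'branch': 'default', 'git_ref_change': 'tag_add', 'raw_id': 'ccc333'},
]
grouped = aggregate_branch_data(branches, commits)
```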
|
183 | ||
|
184 | class WebhookDataHandler(CommitParsingDataHandler): | |
|
185 | name = 'webhook' | |
|
186 | ||
|
187 | def __init__(self, template_url, headers): | |
|
188 | self.template_url = template_url | |
|
189 | self.headers = headers | |
|
190 | ||
|
191 | def get_base_parsed_template(self, data): | |
|
192 | """ | |
|
193 | initially parses the passed in template with some common variables | |
|
194 | available on ALL calls | |
|
195 | """ | |
|
196 | # note: make sure to update the `WEBHOOK_URL_VARS` if this changes | |
|
197 | common_vars = { | |
|
198 | 'repo_name': data['repo']['repo_name'], | |
|
199 | 'repo_type': data['repo']['repo_type'], | |
|
200 | 'repo_id': data['repo']['repo_id'], | |
|
201 | 'repo_url': data['repo']['url'], | |
|
202 | 'username': data['actor']['username'], | |
|
203 | 'user_id': data['actor']['user_id'], | |
|
204 | 'event_name': data['name'] | |
|
205 | } | |
|
206 | ||
|
207 | extra_vars = {} | |
|
208 | for extra_key, extra_val in data['repo']['extra_fields'].items(): | |
|
209 | extra_vars['extra__{}'.format(extra_key)] = extra_val | |
|
210 | common_vars.update(extra_vars) | |
|
211 | ||
|
212 | template_url = self.template_url.replace('${extra:', '${extra__') | |
|
213 | return string.Template(template_url).safe_substitute(**common_vars) | |
|
214 | ||
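`get_base_parsed_template` relies on two properties of `string.Template.safe_substitute`: unknown placeholders are left intact for later substitution passes, and `${extra:key}` must first be rewritten to `${extra__key}` because `:` is not valid inside a Template identifier. A small sketch of that behavior (the URL is invented):

```python
import string

template_url = 'https://ci.example.invalid/${repo_name}/${extra:token}/${branch}'

# ':' cannot appear in a string.Template identifier, so rewrite the prefix first
url = template_url.replace('${extra:', '${extra__')
partial = string.Template(url).safe_substitute(repo_name='proj', extra__token='t1')
# ${branch} survives untouched for a later, per-branch substitution pass
```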
|
215 | def repo_push_event_handler(self, event, data): | |
|
216 | url = self.get_base_parsed_template(data) | |
|
217 | url_cals = [] | |
|
218 | ||
|
219 | branches_commits = self.aggregate_branch_data( | |
|
220 | data['push']['branches'], data['push']['commits']) | |
|
221 | if '${branch}' in url: | |
|
222 | # call it multiple times, for each branch if used in variables | |
|
223 | for branch, commit_ids in branches_commits.items(): | |
|
224 | branch_url = string.Template(url).safe_substitute(branch=branch) | |
|
225 | # call further down for each commit if used | |
|
226 | if '${commit_id}' in branch_url: | |
|
227 | for commit_data in commit_ids['commits']: | |
|
228 | commit_id = commit_data['raw_id'] | |
|
229 | commit_url = string.Template(branch_url).safe_substitute( | |
|
230 | commit_id=commit_id) | |
|
231 | # register per-commit call | |
|
232 | log.debug( | |
|
233 | 'register %s call(%s) to url %s', | |
|
234 | self.name, event, commit_url) | |
|
235 | url_cals.append( | |
|
236 | (commit_url, self.headers, data)) | |
|
237 | ||
|
238 | else: | |
|
239 | # register per-branch call | |
|
240 | log.debug( | |
|
241 | 'register %s call(%s) to url %s', | |
|
242 | self.name, event, branch_url) | |
|
243 | url_cals.append( | |
|
244 | (branch_url, self.headers, data)) | |
|
245 | ||
|
246 | else: | |
|
247 | log.debug( | |
|
248 | 'register %s call(%s) to url %s', self.name, event, url) | |
|
249 | url_cals.append((url, self.headers, data)) | |
|
250 | ||
|
251 | return url_cals | |
|
252 | ||
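`repo_push_event_handler` expands one template URL into many registered calls with two nested `safe_substitute` passes: once per branch, then once per commit when `${commit_id}` is used. Stripped of the logging and headers, the control flow looks like this (URL and commit ids invented):

```python
import string

url = 'https://hooks.example.invalid/notify?b=${branch}&c=${commit_id}'
pushed = {'default': ['aaa111', 'bbb222']}  # branch -> pushed commit ids

calls = []
for branch, commit_ids in pushed.items():
    # first pass: resolve ${branch}
    branch_url = string.Template(url).safe_substitute(branch=branch)
    if '${commit_id}' in branch_url:
        # second pass: register one call per commit
        for commit_id in commit_ids:
            calls.append(
                string.Template(branch_url).safe_substitute(commit_id=commit_id))
    else:
        calls.append(branch_url)
```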
|
253 | def repo_create_event_handler(self, event, data): | |
|
254 | url = self.get_base_parsed_template(data) | |
|
255 | log.debug( | |
|
256 | 'register %s call(%s) to url %s', self.name, event, url) | |
|
257 | return [(url, self.headers, data)] | |
|
258 | ||
|
259 | def pull_request_event_handler(self, event, data): | |
|
260 | url = self.get_base_parsed_template(data) | |
|
261 | log.debug( | |
|
262 | 'register %s call(%s) to url %s', self.name, event, url) | |
|
263 | url = string.Template(url).safe_substitute( | |
|
264 | pull_request_id=data['pullrequest']['pull_request_id'], | |
|
265 | pull_request_title=data['pullrequest']['title'], | |
|
266 | pull_request_url=data['pullrequest']['url'], | |
|
267 | pull_request_shadow_url=data['pullrequest']['shadow_url'], | |
|
268 | pull_request_commits_uid=data['pullrequest']['commits_uid'], | |
|
269 | ) | |
|
270 | return [(url, self.headers, data)] | |
|
271 | ||
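For pull-request events the handler performs a single extra substitution pass over the PR-specific variables; `safe_substitute` stringifies non-string values, so a numeric id works directly (values invented):

```python
import string

url_tpl = 'https://hooks.example.invalid/pr/${pull_request_id}?title=${pull_request_title}'
# one pass resolves all PR variables present in the template
url = string.Template(url_tpl).safe_substitute(
    pull_request_id=42,
    pull_request_title='Fix login')
```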
|
272 | def __call__(self, event, data): | |
|
273 | from rhodecode import events | |
|
274 | ||
|
275 | if isinstance(event, events.RepoPushEvent): | |
|
276 | return self.repo_push_event_handler(event, data) | |
|
277 | elif isinstance(event, events.RepoCreateEvent): | |
|
278 | return self.repo_create_event_handler(event, data) | |
|
279 | elif isinstance(event, events.PullRequestEvent): | |
|
280 | return self.pull_request_event_handler(event, data) | |
|
281 | else: | |
|
282 | raise ValueError( | |
|
283 | 'event type `%s` not in supported list: %s' % ( | |
|
284 | event.__class__, events)) | |
|
285 | ||
|
286 | ||
|
287 | def get_auth(settings): | |
|
288 | from requests.auth import HTTPBasicAuth | |
|
289 | username = settings.get('username') | |
|
290 | password = settings.get('password') | |
|
291 | if username and password: | |
|
292 | return HTTPBasicAuth(username, password) | |
|
293 | return None | |
|
294 | ||
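`get_auth` returns an `HTTPBasicAuth` object only when both credentials are present. The header that basic auth ultimately produces can be sketched with the stdlib alone (the function name here is ours, not part of the change):

```python
import base64

def basic_auth_header(username, password):
    # mirrors get_auth's guard: incomplete credentials mean no auth at all
    if not (username and password):
        return None
    token = base64.b64encode(
        '{}:{}'.format(username, password).encode('utf-8'))
    return 'Basic ' + token.decode('ascii')
```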
|
295 | ||
|
296 | def get_web_token(settings): | |
|
297 | return settings['secret_token'] | |
|
298 | ||
|
299 | ||
|
300 | def get_url_vars(url_vars): | |
|
301 | return '\n'.join( | |
|
302 | '{} - {}'.format('${' + key + '}', explanation) | |
|
303 | for key, explanation in url_vars) | |
|
304 | ||
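`get_url_vars` renders the `(key, explanation)` pairs into the help text shown next to the URL field, one `${key} - explanation` line each:

```python
def get_url_vars(url_vars):
    # same body as the helper above; sample pairs are illustrative
    return '\n'.join(
        '{} - {}'.format('${' + key + '}', explanation)
        for key, explanation in url_vars)

sample_vars = [
    ('repo_name', 'Full name of the repository'),
    ('branch', 'Name of each branch submitted, if any.'),
]
help_text = get_url_vars(sample_vars)
```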
|
305 | ||
|
306 | def render_with_traceback(template, *args, **kwargs): | |
|
307 | try: | |
|
308 | return template.render(*args, **kwargs) | |
|
309 | except Exception: | |
|
310 | log.error(exceptions.text_error_template().render()) | |
|
311 | raise |
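`render_with_traceback` wraps every template render so a failure is logged with full detail before the original exception propagates to the caller. A stdlib-only stand-in of the pattern — `log.exception` substitutes for Mako's `text_error_template()` report, and the template object is a dummy:

```python
import logging

log = logging.getLogger(__name__)

def render_with_traceback(template, *args, **kwargs):
    try:
        return template.render(*args, **kwargs)
    except Exception:
        # log the failed render with traceback, then let callers see the error
        log.exception('template rendering failed')
        raise

class BrokenTemplate(object):
    def render(self, **kwargs):
        raise KeyError('missing_var')
```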
@@ -29,7 +29,8 b' from rhodecode import events' | |||
|
29 | 29 | from rhodecode.translation import _ |
|
30 | 30 | from rhodecode.lib.celerylib import run_task |
|
31 | 31 | from rhodecode.lib.celerylib import tasks |
|
32 | from rhodecode.integrations.types.base import IntegrationTypeBase | |

32 | from rhodecode.integrations.types.base import ( | |
|
33 | IntegrationTypeBase, render_with_traceback) | |
|
33 | 34 | |
|
34 | 35 | |
|
35 | 36 | log = logging.getLogger(__name__) |
@@ -127,11 +128,15 b" repo_push_template_html = Template('''" | |||
|
127 | 128 | </td></tr> |
|
128 | 129 | <tr> |
|
129 | 130 | <td style="padding:15px;" valign="top"> |
|
131 | % if data['push']['commits']: | |
|
130 | 132 | % for commit in data['push']['commits']: |
|
131 | 133 | <a href="${commit['url']}">${commit['short_id']}</a> by ${commit['author']} at ${commit['date']} <br/> |
|
132 | 134 | ${commit['message_html']} <br/> |
|
133 | 135 | <br/> |
|
134 | 136 | % endfor |
|
137 | % else: | |
|
138 | No commit data | |
|
139 | % endif | |
|
135 | 140 | </td> |
|
136 | 141 | </tr> |
|
137 | 142 | </table> |
@@ -146,7 +151,34 b" repo_push_template_html = Template('''" | |||
|
146 | 151 | </html> |
|
147 | 152 | ''') |
|
148 | 153 | |
|
149 | email_icon = ''' | |
|
154 | ||
|
155 | class EmailSettingsSchema(colander.Schema): | |
|
156 | @colander.instantiate(validator=colander.Length(min=1)) | |
|
157 | class recipients(colander.SequenceSchema): | |
|
158 | title = _('Recipients') | |
|
159 | description = _('Email addresses to send push events to') | |
|
160 | widget = deform.widget.SequenceWidget(min_len=1) | |
|
161 | ||
|
162 | recipient = colander.SchemaNode( | |
|
163 | colander.String(), | |
|
164 | title=_('Email address'), | |
|
165 | description=_('Email address'), | |
|
166 | default='', | |
|
167 | validator=colander.Email(), | |
|
168 | widget=deform.widget.TextInputWidget( | |
|
169 | placeholder='user@domain.com', | |
|
170 | ), | |
|
171 | ) | |
|
172 | ||
|
173 | ||
|
174 | class EmailIntegrationType(IntegrationTypeBase): | |
|
175 | key = 'email' | |
|
176 | display_name = _('Email') | |
|
177 | description = _('Send repo push summaries to a list of recipients via email') | |
|
178 | ||
|
179 | @classmethod | |
|
180 | def icon(cls): | |
|
181 | return ''' | |
|
150 | 182 | <?xml version="1.0" encoding="UTF-8" standalone="no"?> |
|
151 | 183 | <svg |
|
152 | 184 | xmlns:dc="http://purl.org/dc/elements/1.1/" |
@@ -208,32 +240,6 b" email_icon = '''" | |||
|
208 | 240 | </svg> |
|
209 | 241 | ''' |
|
210 | 242 | |
|
211 | ||
|
212 | class EmailSettingsSchema(colander.Schema): | |
|
213 | @colander.instantiate(validator=colander.Length(min=1)) | |
|
214 | class recipients(colander.SequenceSchema): | |
|
215 | title = _('Recipients') | |
|
216 | description = _('Email addresses to send push events to') | |
|
217 | widget = deform.widget.SequenceWidget(min_len=1) | |
|
218 | ||
|
219 | recipient = colander.SchemaNode( | |
|
220 | colander.String(), | |
|
221 | title=_('Email address'), | |
|
222 | description=_('Email address'), | |
|
223 | default='', | |
|
224 | validator=colander.Email(), | |
|
225 | widget=deform.widget.TextInputWidget( | |
|
226 | placeholder='user@domain.com', | |
|
227 | ), | |
|
228 | ) | |
|
229 | ||
|
230 | ||
|
231 | class EmailIntegrationType(IntegrationTypeBase): | |
|
232 | key = 'email' | |
|
233 | display_name = _('Email') | |
|
234 | description = _('Send repo push summaries to a list of recipients via email') | |
|
235 | icon = email_icon | |
|
236 | ||
|
237 | 243 | def settings_schema(self): |
|
238 | 244 | schema = EmailSettingsSchema() |
|
239 | 245 | return schema |
@@ -276,12 +282,14 b' def repo_push_handler(data, settings):' | |||
|
276 | 282 | branches=', '.join( |
|
277 | 283 | branch['name'] for branch in data['push']['branches'])) |
|
278 | 284 | |
|
279 | email_body_plaintext = repo_push_template_plaintext.render( | |

285 | email_body_plaintext = render_with_traceback( | |
|
286 | repo_push_template_plaintext, | |
|
280 | 287 | data=data, |
|
281 | 288 | subject=subject, |
|
282 | 289 | instance_url=server_url) |
|
283 | 290 | |
|
284 | email_body_html = repo_push_template_html.render( | |
|
291 | email_body_html = render_with_traceback( | |
|
292 | repo_push_template_html, | |
|
285 | 293 | data=data, |
|
286 | 294 | subject=subject, |
|
287 | 295 | instance_url=server_url) |
@@ -24,14 +24,14 b' import logging' | |||
|
24 | 24 | import requests |
|
25 | 25 | import colander |
|
26 | 26 | import textwrap |
|
27 | from collections import OrderedDict | |
|
28 | 27 | from mako.template import Template |
|
29 | 28 | from rhodecode import events |
|
30 | 29 | from rhodecode.translation import _ |
|
31 | 30 | from rhodecode.lib import helpers as h |
|
32 | 31 | from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask |
|
33 | 32 | from rhodecode.lib.colander_utils import strip_whitespace |
|
34 | from rhodecode.integrations.types.base import IntegrationTypeBase | |

33 | from rhodecode.integrations.types.base import ( | |
|
34 | IntegrationTypeBase, CommitParsingDataHandler, render_with_traceback) | |
|
35 | 35 | |
|
36 | 36 | log = logging.getLogger(__name__) |
|
37 | 37 | |
@@ -81,7 +81,11 b" repo_push_template = Template('''" | |||
|
81 | 81 | <ul> |
|
82 | 82 | %for branch, branch_commits in branches_commits.items(): |
|
83 | 83 | <li> |
|
84 | % if branch: | |
|
84 | 85 | <a href="${branch_commits['branch']['url']}">branch: ${branch_commits['branch']['name']}</a> |
|
86 | % else: | |
|
87 | to trunk | |
|
88 | % endif | |
|
85 | 89 | <ul> |
|
86 | 90 | %for commit in branch_commits['commits']: |
|
87 | 91 | <li><a href="${commit['url']}">${commit['short_id']}</a> - ${commit['message_html']}</li> |
@@ -92,12 +96,16 b" repo_push_template = Template('''" | |||
|
92 | 96 | ''') |
|
93 | 97 | |
|
94 | 98 | |
|
95 | class HipchatIntegrationType(IntegrationTypeBase): | |
|
99 | class HipchatIntegrationType(IntegrationTypeBase, CommitParsingDataHandler): | |
|
96 | 100 | key = 'hipchat' |
|
97 | 101 | display_name = _('Hipchat') |
|
98 | 102 | description = _('Send events such as repo pushes and pull requests to ' |
|
99 | 103 | 'your hipchat channel.') |
|
100 | icon = '''<?xml version="1.0" encoding="utf-8"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 1000 1000" enable-background="new 0 0 1000 1000" xml:space="preserve"><g><g transform="translate(0.000000,511.000000) scale(0.100000,-0.100000)"><path fill="#205281" d="M4197.1,4662.4c-1661.5-260.4-3018-1171.6-3682.6-2473.3C219.9,1613.6,100,1120.3,100,462.6c0-1014,376.8-1918.4,1127-2699.4C2326.7-3377.6,3878.5-3898.3,5701-3730.5l486.5,44.5l208.9-123.3c637.2-373.4,1551.8-640.6,2240.4-650.9c304.9-6.9,335.7,0,417.9,75.4c185,174.7,147.3,411.1-89.1,548.1c-315.2,181.6-620,544.7-733.1,870.1l-51.4,157.6l472.7,472.7c349.4,349.4,520.7,551.5,657.7,774.2c784.5,1281.2,784.5,2788.5,0,4052.6c-236.4,376.8-794.8,966-1178.4,1236.7c-572.1,407.7-1264.1,709.1-1993.7,870.1c-267.2,58.2-479.6,75.4-1038,82.2C4714.4,4686.4,4310.2,4679.6,4197.1,4662.4z M5947.6,3740.9c1856.7-380.3,3127.6-1709.4,3127.6-3275c0-1000.3-534.4-1949.2-1466.2-2600.1c-188.4-133.6-287.8-226.1-301.5-284.4c-41.1-157.6,263.8-938.6,397.4-1020.8c20.5-10.3,34.3-44.5,34.3-75.4c0-167.8-811.9,195.3-1363.4,609.8l-181.6,137l-332.3-58.2c-445.3-78.8-1281.2-78.8-1702.6,0C2796-2569.2,1734.1-1832.6,1220.2-801.5C983.8-318.5,905,51.5,929,613.3c27.4,640.6,243.2,1192.1,685.1,1740.3c620,770.8,1661.5,1305.2,2822.8,1452.5C4806.9,3854,5553.7,3819.7,5947.6,3740.9z"/><path fill="#205281" d="M2381.5-345.9c-75.4-106.2-68.5-167.8,34.3-322c332.3-500.2,1010.6-928.4,1760.8-1120.2c417.9-106.2,1226.4-106.2,1644.3,0c712.5,181.6,1270.9,517.3,1685.4,1014C7681-561.7,7715.3-424.7,7616-325.4c-89.1,89.1-167.9,65.1-431.7-133.6c-835.8-630.3-2028-856.4-3086.5-585.8C3683.3-938.6,3142-685,2830.3-448.7C2576.8-253.4,2463.7-229.4,2381.5-345.9z"/></g></g><!-- Svg Vector Icons : http://www.onlinewebfonts.com/icon --></svg>''' | |
|
104 | ||
|
105 | @classmethod | |
|
106 | def icon(cls): | |
|
107 | return '''<?xml version="1.0" encoding="utf-8"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 1000 1000" enable-background="new 0 0 1000 1000" xml:space="preserve"><g><g transform="translate(0.000000,511.000000) scale(0.100000,-0.100000)"><path fill="#205281" d="M4197.1,4662.4c-1661.5-260.4-3018-1171.6-3682.6-2473.3C219.9,1613.6,100,1120.3,100,462.6c0-1014,376.8-1918.4,1127-2699.4C2326.7-3377.6,3878.5-3898.3,5701-3730.5l486.5,44.5l208.9-123.3c637.2-373.4,1551.8-640.6,2240.4-650.9c304.9-6.9,335.7,0,417.9,75.4c185,174.7,147.3,411.1-89.1,548.1c-315.2,181.6-620,544.7-733.1,870.1l-51.4,157.6l472.7,472.7c349.4,349.4,520.7,551.5,657.7,774.2c784.5,1281.2,784.5,2788.5,0,4052.6c-236.4,376.8-794.8,966-1178.4,1236.7c-572.1,407.7-1264.1,709.1-1993.7,870.1c-267.2,58.2-479.6,75.4-1038,82.2C4714.4,4686.4,4310.2,4679.6,4197.1,4662.4z M5947.6,3740.9c1856.7-380.3,3127.6-1709.4,3127.6-3275c0-1000.3-534.4-1949.2-1466.2-2600.1c-188.4-133.6-287.8-226.1-301.5-284.4c-41.1-157.6,263.8-938.6,397.4-1020.8c20.5-10.3,34.3-44.5,34.3-75.4c0-167.8-811.9,195.3-1363.4,609.8l-181.6,137l-332.3-58.2c-445.3-78.8-1281.2-78.8-1702.6,0C2796-2569.2,1734.1-1832.6,1220.2-801.5C983.8-318.5,905,51.5,929,613.3c27.4,640.6,243.2,1192.1,685.1,1740.3c620,770.8,1661.5,1305.2,2822.8,1452.5C4806.9,3854,5553.7,3819.7,5947.6,3740.9z"/><path fill="#205281" d="M2381.5-345.9c-75.4-106.2-68.5-167.8,34.3-322c332.3-500.2,1010.6-928.4,1760.8-1120.2c417.9-106.2,1226.4-106.2,1644.3,0c712.5,181.6,1270.9,517.3,1685.4,1014C7681-561.7,7715.3-424.7,7616-325.4c-89.1,89.1-167.9,65.1-431.7-133.6c-835.8-630.3-2028-856.4-3086.5-585.8C3683.3-938.6,3142-685,2830.3-448.7C2576.8-253.4,2463.7-229.4,2381.5-345.9z"/></g></g><!-- Svg Vector Icons : http://www.onlinewebfonts.com/icon --></svg>''' | |
|
108 | ||
|
101 | 109 | valid_events = [ |
|
102 | 110 | events.PullRequestCloseEvent, |
|
103 | 111 | events.PullRequestMergeEvent, |
@@ -213,20 +221,11 b' class HipchatIntegrationType(Integration' | |||
|
213 | 221 | ) |
|
214 | 222 | |
|
215 | 223 | def format_repo_push_event(self, data): |
|
216 | branch_data = {branch['name']: branch | |
|
217 | for branch in data['push']['branches']} | |
|
224 | branches_commits = self.aggregate_branch_data( | |
|
225 | data['push']['branches'], data['push']['commits']) | |
|
218 | 226 | |
|
219 | branches_commits = OrderedDict() | |
|
220 | for commit in data['push']['commits']: | |
|
221 | if commit['branch'] not in branches_commits: | |
|
222 | branch_commits = {'branch': branch_data[commit['branch']], | |
|
223 | 'commits': []} | |
|
224 | branches_commits[commit['branch']] = branch_commits | |
|
225 | ||
|
226 | branch_commits = branches_commits[commit['branch']] | |
|
227 | branch_commits['commits'].append(commit) | |
|
228 | ||
|
229 | result = repo_push_template.render( | |
|
227 | result = render_with_traceback( | |
|
228 | repo_push_template, | |
|
230 | 229 | data=data, |
|
231 | 230 | branches_commits=branches_commits, |
|
232 | 231 | ) |
@@ -28,14 +28,14 b' import deform' | |||
|
28 | 28 | import requests |
|
29 | 29 | import colander |
|
30 | 30 | from mako.template import Template |
|
31 | from collections import OrderedDict | |
|
32 | 31 | |
|
33 | 32 | from rhodecode import events |
|
34 | 33 | from rhodecode.translation import _ |
|
35 | 34 | from rhodecode.lib import helpers as h |
|
36 | 35 | from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask |
|
37 | 36 | from rhodecode.lib.colander_utils import strip_whitespace |
|
38 | from rhodecode.integrations.types.base import IntegrationTypeBase | |

37 | from rhodecode.integrations.types.base import ( | |
|
38 | IntegrationTypeBase, CommitParsingDataHandler, render_with_traceback) | |
|
39 | 39 | |
|
40 | 40 | log = logging.getLogger(__name__) |
|
41 | 41 | |
@@ -87,12 +87,16 b' class SlackSettingsSchema(colander.Schem' | |||
|
87 | 87 | ) |
|
88 | 88 | |
|
89 | 89 | |
|
90 | class SlackIntegrationType(IntegrationTypeBase): | |
|
90 | class SlackIntegrationType(IntegrationTypeBase, CommitParsingDataHandler): | |
|
91 | 91 | key = 'slack' |
|
92 | 92 | display_name = _('Slack') |
|
93 | 93 | description = _('Send events such as repo pushes and pull requests to ' |
|
94 | 94 | 'your slack channel.') |
|
95 | icon = '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 256" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M165.963541,15.8384262 C162.07318,3.86308197 149.212328,-2.69009836 137.239082,1.20236066 C125.263738,5.09272131 118.710557,17.9535738 122.603016,29.9268197 L181.550164,211.292328 C185.597902,222.478689 197.682361,228.765377 209.282098,225.426885 C221.381246,221.943607 228.756984,209.093246 224.896,197.21023 C224.749115,196.756984 165.963541,15.8384262 165.963541,15.8384262" fill="#DFA22F"></path><path d="M74.6260984,45.515541 C70.7336393,33.5422951 57.8727869,26.9891148 45.899541,30.8794754 C33.9241967,34.7698361 27.3710164,47.6306885 31.2634754,59.6060328 L90.210623,240.971541 C94.2583607,252.157902 106.34282,258.44459 117.942557,255.104 C130.041705,251.62282 137.417443,238.772459 133.556459,226.887344 C133.409574,226.436197 74.6260984,45.515541 74.6260984,45.515541" fill="#3CB187"></path><path d="M240.161574,166.045377 C252.136918,162.155016 258.688,149.294164 254.797639,137.31882 C250.907279,125.345574 238.046426,118.792393 226.07318,122.682754 L44.7076721,181.632 C33.5213115,185.677639 27.234623,197.762098 30.5731148,209.361836 C34.0563934,221.460984 46.9067541,228.836721 58.7897705,224.975738 C59.2430164,224.828852 240.161574,166.045377 240.161574,166.045377" fill="#CE1E5B"></path><path d="M82.507541,217.270557 C94.312918,213.434754 109.528131,208.491016 125.855475,203.186361 C122.019672,191.380984 117.075934,176.163672 111.76918,159.83423 L68.4191475,173.924721 L82.507541,217.270557" fill="#392538"></path><path d="M173.847082,187.591344 C190.235279,182.267803 205.467279,177.31777 217.195016,173.507148 C213.359213,161.70177 208.413377,146.480262 203.106623,130.146623 L159.75659,144.237115 L173.847082,187.591344" fill="#BB242A"></path><path d="M210.484459,74.7058361 C222.457705,70.8154754 229.010885,57.954623 225.120525,45.9792787 
C221.230164,34.0060328 208.369311,27.4528525 196.393967,31.3432131 L15.028459,90.292459 C3.84209836,94.3380984 -2.44459016,106.422557 0.896,118.022295 C4.37718033,130.121443 17.227541,137.49718 29.1126557,133.636197 C29.5638033,133.489311 210.484459,74.7058361 210.484459,74.7058361" fill="#72C5CD"></path><path d="M52.8220328,125.933115 C64.6274098,122.097311 79.8468197,117.151475 96.1762623,111.84682 C90.8527213,95.4565246 85.9026885,80.2245246 82.0920656,68.4946885 L38.731541,82.5872787 L52.8220328,125.933115" fill="#248C73"></path><path d="M144.159475,96.256 C160.551869,90.9303607 175.785967,85.9803279 187.515803,82.1676066 C182.190164,65.7752131 177.240131,50.5390164 173.42741,38.807082 L130.068984,52.8996721 L144.159475,96.256" fill="#62803A"></path></g></svg>''' | |
|
95 | ||
|
96 | @classmethod | |
|
97 | def icon(cls): | |
|
98 | return '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 256" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M165.963541,15.8384262 C162.07318,3.86308197 149.212328,-2.69009836 137.239082,1.20236066 C125.263738,5.09272131 118.710557,17.9535738 122.603016,29.9268197 L181.550164,211.292328 C185.597902,222.478689 197.682361,228.765377 209.282098,225.426885 C221.381246,221.943607 228.756984,209.093246 224.896,197.21023 C224.749115,196.756984 165.963541,15.8384262 165.963541,15.8384262" fill="#DFA22F"></path><path d="M74.6260984,45.515541 C70.7336393,33.5422951 57.8727869,26.9891148 45.899541,30.8794754 C33.9241967,34.7698361 27.3710164,47.6306885 31.2634754,59.6060328 L90.210623,240.971541 C94.2583607,252.157902 106.34282,258.44459 117.942557,255.104 C130.041705,251.62282 137.417443,238.772459 133.556459,226.887344 C133.409574,226.436197 74.6260984,45.515541 74.6260984,45.515541" fill="#3CB187"></path><path d="M240.161574,166.045377 C252.136918,162.155016 258.688,149.294164 254.797639,137.31882 C250.907279,125.345574 238.046426,118.792393 226.07318,122.682754 L44.7076721,181.632 C33.5213115,185.677639 27.234623,197.762098 30.5731148,209.361836 C34.0563934,221.460984 46.9067541,228.836721 58.7897705,224.975738 C59.2430164,224.828852 240.161574,166.045377 240.161574,166.045377" fill="#CE1E5B"></path><path d="M82.507541,217.270557 C94.312918,213.434754 109.528131,208.491016 125.855475,203.186361 C122.019672,191.380984 117.075934,176.163672 111.76918,159.83423 L68.4191475,173.924721 L82.507541,217.270557" fill="#392538"></path><path d="M173.847082,187.591344 C190.235279,182.267803 205.467279,177.31777 217.195016,173.507148 C213.359213,161.70177 208.413377,146.480262 203.106623,130.146623 L159.75659,144.237115 L173.847082,187.591344" fill="#BB242A"></path><path d="M210.484459,74.7058361 C222.457705,70.8154754 229.010885,57.954623 225.120525,45.9792787 
C221.230164,34.0060328 208.369311,27.4528525 196.393967,31.3432131 L15.028459,90.292459 C3.84209836,94.3380984 -2.44459016,106.422557 0.896,118.022295 C4.37718033,130.121443 17.227541,137.49718 29.1126557,133.636197 C29.5638033,133.489311 210.484459,74.7058361 210.484459,74.7058361" fill="#72C5CD"></path><path d="M52.8220328,125.933115 C64.6274098,122.097311 79.8468197,117.151475 96.1762623,111.84682 C90.8527213,95.4565246 85.9026885,80.2245246 82.0920656,68.4946885 L38.731541,82.5872787 L52.8220328,125.933115" fill="#248C73"></path><path d="M144.159475,96.256 C160.551869,90.9303607 175.785967,85.9803279 187.515803,82.1676066 C182.190164,65.7752131 177.240131,50.5390164 173.42741,38.807082 L130.068984,52.8996721 L144.159475,96.256" fill="#62803A"></path></g></svg>''' | |
|
99 | ||
|
96 | 100 | valid_events = [ |
|
97 | 101 | events.PullRequestCloseEvent, |
|
98 | 102 | events.PullRequestMergeEvent, |
@@ -190,32 +194,39 b' class SlackIntegrationType(IntegrationTy' | |||
|
190 | 194 | } |
|
191 | 195 | ] |
|
192 | 196 | |
|
193 | title = Template(textwrap.dedent(r''' | |

197 | template = Template(textwrap.dedent(r''' | |
|
194 | 198 | *${data['actor']['username']}* left ${data['comment']['type']} on pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']}>: |
|
195 | ''')).render(data=data, comment=event.comment) | |
|
199 | ''')) | |
|
200 | title = render_with_traceback( | |
|
201 | template, data=data, comment=event.comment) | |
|
196 | 202 | |
|
197 | text = Template(textwrap.dedent(r''' | |

203 | template = Template(textwrap.dedent(r''' | |
|
198 | 204 | *pull request title*: ${pr_title} |
|
199 | 205 | % if status_text: |
|
200 | 206 | *submitted status*: `${status_text}` |
|
201 | 207 | % endif |
|
202 | 208 | >>> ${comment_text} |
|
203 | ''')).render(comment_text=comment_text, | |
|
209 | ''')) | |
|
210 | text = render_with_traceback( | |
|
211 | template, | |
|
212 | comment_text=comment_text, | |
|
204 | 213 |
|
|
205 | 214 |
|
|
206 | 215 | |
|
207 | 216 | return title, text, fields, overrides |
|
208 | 217 | |
|
209 | 218 | def format_pull_request_review_event(self, event, data): |
|
210 | title = Template(textwrap.dedent(r''' | |

219 | template = Template(textwrap.dedent(r''' | |
|
211 | 220 | *${data['actor']['username']}* changed status of pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']} to `${data['pullrequest']['status']}`>: |
|
212 | ''')).render(data=data) | |
|
221 | ''')) | |
|
222 | title = render_with_traceback(template, data=data) | |
|
213 | 223 | |
|
214 | text = Template(textwrap.dedent(r''' | |

224 | template = Template(textwrap.dedent(r''' | |
|
215 | 225 | *pull request title*: ${pr_title} |
|
216 | ''')).render( | |

217 | pr_title=data['pullrequest']['title'], | |
|
218 | ) | |
|
226 | ''')) | |
|
227 | text = render_with_traceback( | |
|
228 | template, | |
|
229 | pr_title=data['pullrequest']['title']) | |
|
219 | 230 | |
|
220 | 231 | return title, text |
|
221 | 232 | |
@@ -227,50 +238,53 b' class SlackIntegrationType(IntegrationTy' | |||
|
227 | 238 | events.PullRequestCreateEvent: 'created', |
|
228 | 239 | }.get(event.__class__, str(event.__class__)) |
|
229 | 240 | |
|
230 | title = Template(textwrap.dedent(r''' | |

241 | template = Template(textwrap.dedent(r''' | |
|
231 | 242 | *${data['actor']['username']}* `${action}` pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']}>: |
|
232 | ''')).render(data=data, action=action) | |
|
243 | ''')) | |
|
244 | title = render_with_traceback(template, data=data, action=action) | |
|
233 | 245 | |
|
234 | text = Template(textwrap.dedent(r''' | |

246 | template = Template(textwrap.dedent(r''' | |
|
235 | 247 | *pull request title*: ${pr_title} |
|
236 | 248 | %if data['pullrequest']['commits']: |
|
237 | 249 | *commits*: ${len(data['pullrequest']['commits'])} |
|
238 | 250 | %endif |
|
239 | ''')).render( | |

251 | ''')) | |
|
252 | text = render_with_traceback( | |
|
253 | template, | |
|
240 | 254 | pr_title=data['pullrequest']['title'], |
|
241 | data=data | |
|
242 | ) | |
|
255 | data=data) | |
|
243 | 256 | |
|
244 | 257 | return title, text |
|
245 | 258 | |
|
246 | 259 | def format_repo_push_event(self, data): |
|
247 | branch_data = {branch['name']: branch | |
|
248 | for branch in data['push']['branches']} | |
|
260 | ||
|
261 | branches_commits = self.aggregate_branch_data( | |
|
262 | data['push']['branches'], data['push']['commits']) | |
|
249 | 263 | |
|
250 | branches_commits = OrderedDict() | |
|
251 | for commit in data['push']['commits']: | |
|
252 | if commit['branch'] not in branches_commits: | |
|
253 | branch_commits = {'branch': branch_data[commit['branch']], | |
|
254 | 'commits': []} | |
|
255 | branches_commits[commit['branch']] = branch_commits | |
|
256 | ||
|
257 | branch_commits = branches_commits[commit['branch']] | |
|
258 | branch_commits['commits'].append(commit) | |
|
259 | ||
|
260 | title = Template(r''' | |
|
264 | template = Template(r''' | |
|
261 | 265 | *${data['actor']['username']}* pushed to repo <${data['repo']['url']}|${data['repo']['repo_name']}>: |
|
262 | ''').render(data=data) | |
|
266 | ''') | |
|
267 | title = render_with_traceback(template, data=data) | |
|
263 | 268 | |
|
264 | 269 | repo_push_template = Template(textwrap.dedent(r''' |
|
270 | <% | |
|
271 | def branch_text(branch): | |
|
272 | if branch: | |
|
273 | return 'on branch: <{}|{}>'.format(branch_commits['branch']['url'], branch_commits['branch']['name']) | |
|
274 | else: | |
|
275 | ## case for SVN no branch push... | |
|
276 | return 'to trunk' | |
|
277 | %> \ | |
|
265 | 278 | %for branch, branch_commits in branches_commits.items(): |
|
266 | ${len(branch_commits['commits'])} ${'commit' if len(branch_commits['commits']) == 1 else 'commits'} | |

279 | ${len(branch_commits['commits'])} ${'commit' if len(branch_commits['commits']) == 1 else 'commits'} ${branch_text(branch)} | |
|
267 | 280 | %for commit in branch_commits['commits']: |
|
268 | 281 | `<${commit['url']}|${commit['short_id']}>` - ${commit['message_html']|html_to_slack_links} |
|
269 | 282 | %endfor |
|
270 | 283 | %endfor |
|
271 | 284 | ''')) |
|
272 | 285 | |
|
273 | text = repo_push_template.render( | |
|
286 | text = render_with_traceback( | |
|
287 | repo_push_template, | |
|
274 | 288 | data=data, |
|
275 | 289 | branches_commits=branches_commits, |
|
276 | 290 | html_to_slack_links=html_to_slack_links, |
@@ -279,14 +293,16 b' class SlackIntegrationType(IntegrationTy' | |||
|
279 | 293 | return title, text |
|
280 | 294 | |
|
281 | 295 | def format_repo_create_event(self, data): |
|
282 |
t |
|
|
296 | template = Template(r''' | |
|
283 | 297 | *${data['actor']['username']}* created new repository ${data['repo']['repo_name']}: |
|
284 | ''').render(data=data) | |
|
298 | ''') | |
|
299 | title = render_with_traceback(template, data=data) | |
|
285 | 300 | |
|
286 | text = Template(textwrap.dedent(r''' | |
|
301 | template = Template(textwrap.dedent(r''' | |
|
287 | 302 | repo_url: ${data['repo']['url']} |
|
288 | 303 | repo_type: ${data['repo']['repo_type']} |
|
289 | ''')).render(data=data) | |
|
304 | ''')) | |
|
305 | text = render_with_traceback(template, data=data) | |
|
290 | 306 | |
|
291 | 307 | return title, text |
|
292 | 308 |
@@ -19,8 +19,6 b'' | |||
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | from __future__ import unicode_literals |
|
22 | import string | |
|
23 | from collections import OrderedDict | |
|
24 | 22 | |
|
25 | 23 | import deform |
|
26 | 24 | import deform.widget |
@@ -33,149 +31,18 b' from requests.packages.urllib3.util.retr' | |||
|
33 | 31 | import rhodecode |
|
34 | 32 | from rhodecode import events |
|
35 | 33 | from rhodecode.translation import _ |
|
36 | from rhodecode.integrations.types.base import IntegrationTypeBase | |
|
34 | from rhodecode.integrations.types.base import ( | |
|
35 | IntegrationTypeBase, get_auth, get_web_token, get_url_vars, | |
|
36 | WebhookDataHandler, WEBHOOK_URL_VARS) | |
|
37 | 37 | from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask |
|
38 | from rhodecode.model.validation_schema import widgets | |
|
38 | 39 | |
|
39 | 40 | log = logging.getLogger(__name__) |
|
40 | 41 | |
|
41 | 42 | |
|
42 | 43 | # updating this required to update the `common_vars` passed in url calling func |
|
43 | WEBHOOK_URL_VARS = [ | |
|
44 | 'repo_name', | |
|
45 | 'repo_type', | |
|
46 | 'repo_id', | |
|
47 | 'repo_url', | |
|
48 | # extra repo fields | |
|
49 | 'extra:<extra_key_name>', | |
|
50 | 44 | |
|
51 | # special attrs below that we handle, using multi-call | |
|
52 | 'branch', | |
|
53 | 'commit_id', | |
|
54 | ||
|
55 | # pr events vars | |
|
56 | 'pull_request_id', | |
|
57 | 'pull_request_url', | |
|
58 | ||
|
59 | # user who triggers the call | |
|
60 | 'username', | |
|
61 | 'user_id', | |
|
62 | ||
|
63 | ] | |
|
64 | URL_VARS = ', '.join('${' + x + '}' for x in WEBHOOK_URL_VARS) | |
|
65 | ||
|
66 | ||
|
67 | def get_auth(settings): | |
|
68 | from requests.auth import HTTPBasicAuth | |
|
69 | username = settings.get('username') | |
|
70 | password = settings.get('password') | |
|
71 | if username and password: | |
|
72 | return HTTPBasicAuth(username, password) | |
|
73 | return None | |
|
74 | ||
|
75 | ||
|
76 | class WebhookHandler(object): | |
|
77 | def __init__(self, template_url, secret_token, headers): | |
|
78 | self.template_url = template_url | |
|
79 | self.secret_token = secret_token | |
|
80 | self.headers = headers | |
|
81 | ||
|
82 | def get_base_parsed_template(self, data): | |
|
83 | """ | |
|
84 | initially parses the passed in template with some common variables | |
|
85 | available on ALL calls | |
|
86 | """ | |
|
87 | # note: make sure to update the `WEBHOOK_URL_VARS` if this changes | |
|
88 | common_vars = { | |
|
89 | 'repo_name': data['repo']['repo_name'], | |
|
90 | 'repo_type': data['repo']['repo_type'], | |
|
91 | 'repo_id': data['repo']['repo_id'], | |
|
92 | 'repo_url': data['repo']['url'], | |
|
93 | 'username': data['actor']['username'], | |
|
94 | 'user_id': data['actor']['user_id'] | |
|
95 | } | |
|
96 | ||
|
97 | extra_vars = {} | |
|
98 | for extra_key, extra_val in data['repo']['extra_fields'].items(): | |
|
99 | extra_vars['extra__{}'.format(extra_key)] = extra_val | |
|
100 | common_vars.update(extra_vars) | |
|
101 | ||
|
102 | template_url = self.template_url.replace('${extra:', '${extra__') | |
|
103 | return string.Template(template_url).safe_substitute(**common_vars) | |
|
104 | ||
|
105 | def repo_push_event_handler(self, event, data): | |
|
106 | url = self.get_base_parsed_template(data) | |
|
107 | url_cals = [] | |
|
108 | branch_data = OrderedDict() | |
|
109 | for obj in data['push']['branches']: | |
|
110 | branch_data[obj['name']] = obj | |
|
111 | ||
|
112 | branches_commits = OrderedDict() | |
|
113 | for commit in data['push']['commits']: | |
|
114 | if commit.get('git_ref_change'): | |
|
115 | # special case for GIT that allows creating tags, | |
|
116 | # deleting branches without associated commit | |
|
117 | continue | |
|
118 | ||
|
119 | if commit['branch'] not in branches_commits: | |
|
120 | branch_commits = {'branch': branch_data[commit['branch']], | |
|
121 | 'commits': []} | |
|
122 | branches_commits[commit['branch']] = branch_commits | |
|
123 | ||
|
124 | branch_commits = branches_commits[commit['branch']] | |
|
125 | branch_commits['commits'].append(commit) | |
|
126 | ||
|
127 | if '${branch}' in url: | |
|
128 | # call it multiple times, for each branch if used in variables | |
|
129 | for branch, commit_ids in branches_commits.items(): | |
|
130 | branch_url = string.Template(url).safe_substitute(branch=branch) | |
|
131 | # call further down for each commit if used | |
|
132 | if '${commit_id}' in branch_url: | |
|
133 | for commit_data in commit_ids['commits']: | |
|
134 | commit_id = commit_data['raw_id'] | |
|
135 | commit_url = string.Template(branch_url).safe_substitute( | |
|
136 | commit_id=commit_id) | |
|
137 | # register per-commit call | |
|
138 | log.debug( | |
|
139 | 'register webhook call(%s) to url %s', event, commit_url) | |
|
140 | url_cals.append((commit_url, self.secret_token, self.headers, data)) | |
|
141 | ||
|
142 | else: | |
|
143 | # register per-branch call | |
|
144 | log.debug( | |
|
145 | 'register webhook call(%s) to url %s', event, branch_url) | |
|
146 | url_cals.append((branch_url, self.secret_token, self.headers, data)) | |
|
147 | ||
|
148 | else: | |
|
149 | log.debug( | |
|
150 | 'register webhook call(%s) to url %s', event, url) | |
|
151 | url_cals.append((url, self.secret_token, self.headers, data)) | |
|
152 | ||
|
153 | return url_cals | |
|
154 | ||
|
155 | def repo_create_event_handler(self, event, data): | |
|
156 | url = self.get_base_parsed_template(data) | |
|
157 | log.debug( | |
|
158 | 'register webhook call(%s) to url %s', event, url) | |
|
159 | return [(url, self.secret_token, self.headers, data)] | |
|
160 | ||
|
161 | def pull_request_event_handler(self, event, data): | |
|
162 | url = self.get_base_parsed_template(data) | |
|
163 | log.debug( | |
|
164 | 'register webhook call(%s) to url %s', event, url) | |
|
165 | url = string.Template(url).safe_substitute( | |
|
166 | pull_request_id=data['pullrequest']['pull_request_id'], | |
|
167 | pull_request_url=data['pullrequest']['url']) | |
|
168 | return [(url, self.secret_token, self.headers, data)] | |
|
169 | ||
|
170 | def __call__(self, event, data): | |
|
171 | if isinstance(event, events.RepoPushEvent): | |
|
172 | return self.repo_push_event_handler(event, data) | |
|
173 | elif isinstance(event, events.RepoCreateEvent): | |
|
174 | return self.repo_create_event_handler(event, data) | |
|
175 | elif isinstance(event, events.PullRequestEvent): | |
|
176 | return self.pull_request_event_handler(event, data) | |
|
177 | else: | |
|
178 | raise ValueError('event type not supported: %s' % events) | |
|
45 | URL_VARS = get_url_vars(WEBHOOK_URL_VARS) | |
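The removed `repo_push_event_handler` above expands one template URL into multiple calls when `${branch}` or `${commit_id}` appear in it. A self-contained sketch of that fan-out logic, using hypothetical payload data:

```python
# Per-branch / per-commit URL expansion as in the removed handler above
# (template_url and branches_commits are hypothetical stand-ins).
import string

template_url = 'http://example.com/hook?branch=${branch}&commit=${commit_id}'
branches_commits = {
    'master': {'commits': [{'raw_id': 'deadbeef'}, {'raw_id': 'cafebabe'}]},
}

calls = []
url = template_url
if '${branch}' in url:
    # call once per branch if the variable is used
    for branch, branch_data in branches_commits.items():
        branch_url = string.Template(url).safe_substitute(branch=branch)
        if '${commit_id}' in branch_url:
            # and once per commit within that branch
            for commit in branch_data['commits']:
                calls.append(string.Template(branch_url).safe_substitute(
                    commit_id=commit['raw_id']))
        else:
            calls.append(branch_url)
else:
    calls.append(url)

assert len(calls) == 2
```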
|
179 | 46 | |
|
180 | 47 | |
|
181 | 48 | class WebhookSettingsSchema(colander.Schema): |
@@ -183,17 +50,21 b' class WebhookSettingsSchema(colander.Sch' | |||
|
183 | 50 | colander.String(), |
|
184 | 51 | title=_('Webhook URL'), |
|
185 | 52 | description= |
|
186 | _('URL to which Webhook should submit data. Following variables ' | |
|
187 | 'are allowed to be used: {vars}. Some of the variables would ' | |
|
188 | 'trigger multiple calls, like ${{branch}} or ${{commit_id}}. ' | |
|
189 | 'Webhook will be called as many times as unique objects in ' | |
|
190 | 'data in such cases.').format(vars=URL_VARS), | |
|
53 | _('URL to which Webhook should submit data. If used some of the ' | |
|
54 | 'variables would trigger multiple calls, like ${branch} or ' | |
|
55 | '${commit_id}. Webhook will be called as many times as unique ' | |
|
56 | 'objects in data in such cases.'), | |
|
191 | 57 | missing=colander.required, |
|
192 | 58 | required=True, |
|
193 | 59 | validator=colander.url, |
|
194 | widget=deform.widget.TextInputWidget( | |
|
195 | placeholder='https://www.example.com/webhook' | |
|
196 | ), | |
|
60 | widget=widgets.CodeMirrorWidget( | |
|
61 | help_block_collapsable_name='Show url variables', | |
|
62 | help_block_collapsable=( | |
|
63 | 'E.g http://my-serv/trigger_job/${{event_name}}' | |
|
64 | '?PR_ID=${{pull_request_id}}' | |
|
65 | '\nFull list of vars:\n{}'.format(URL_VARS)), | |
|
66 | codemirror_mode='text', | |
|
67 | codemirror_options='{"lineNumbers": false, "lineWrapping": true}'), | |
|
197 | 68 | ) |
|
198 | 69 | secret_token = colander.SchemaNode( |
|
199 | 70 | colander.String(), |
@@ -234,7 +105,7 b' class WebhookSettingsSchema(colander.Sch' | |||
|
234 | 105 | default='', |
|
235 | 106 | missing='', |
|
236 | 107 | widget=deform.widget.TextInputWidget( |
|
237 | placeholder='e.g Authorization' | |
|
108 | placeholder='e.g: Authorization' | |
|
238 | 109 | ), |
|
239 | 110 | ) |
|
240 | 111 | custom_header_val = colander.SchemaNode( |
@@ -244,7 +115,7 b' class WebhookSettingsSchema(colander.Sch' | |||
|
244 | 115 | default='', |
|
245 | 116 | missing='', |
|
246 | 117 | widget=deform.widget.TextInputWidget( |
|
247 | placeholder='e.g.Basic XxXxXx' | |
|
118 | placeholder='e.g. Basic XxXxXx' | |
|
248 | 119 | ), |
|
249 | 120 | ) |
|
250 | 121 | method_type = colander.SchemaNode( |
@@ -264,8 +135,11 b' class WebhookSettingsSchema(colander.Sch' | |||
|
264 | 135 | class WebhookIntegrationType(IntegrationTypeBase): |
|
265 | 136 | key = 'webhook' |
|
266 | 137 | display_name = _('Webhook') |
|
267 | description = _('send JSON data to url endpoint') | |
|
268 | icon = '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 239" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M119.540432,100.502743 C108.930124,118.338815 98.7646301,135.611455 88.3876025,152.753617 C85.7226696,157.154315 84.4040417,160.738531 86.5332204,166.333309 C92.4107024,181.787152 84.1193605,196.825836 68.5350381,200.908244 C53.8383677,204.759349 39.5192953,195.099955 36.6032893,179.365384 C34.0194114,165.437749 44.8274148,151.78491 60.1824106,149.608284 C61.4694072,149.424428 62.7821041,149.402681 64.944891,149.240571 C72.469175,136.623655 80.1773157,123.700312 88.3025935,110.073173 C73.611854,95.4654658 64.8677898,78.3885437 66.803227,57.2292132 C68.1712787,42.2715849 74.0527146,29.3462646 84.8033863,18.7517722 C105.393354,-1.53572199 136.805164,-4.82141828 161.048542,10.7510424 C184.333097,25.7086706 194.996783,54.8450075 185.906752,79.7822957 C179.052655,77.9239597 172.151111,76.049808 164.563565,73.9917997 C167.418285,60.1274266 165.306899,47.6765751 155.95591,37.0109123 C149.777932,29.9690049 141.850349,26.2780332 132.835442,24.9178894 C114.764113,22.1877169 97.0209573,33.7983633 91.7563309,51.5355878 C85.7800012,71.6669027 94.8245623,88.1111998 119.540432,100.502743 L119.540432,100.502743 Z" fill="#C73A63"></path><path d="M149.841194,79.4106285 C157.316054,92.5969067 164.905578,105.982857 172.427885,119.246236 C210.44865,107.483365 239.114472,128.530009 249.398582,151.063322 C261.81978,178.282014 253.328765,210.520191 228.933162,227.312431 C203.893073,244.551464 172.226236,241.605803 150.040866,219.46195 C155.694953,214.729124 161.376716,209.974552 167.44794,204.895759 C189.360489,219.088306 208.525074,218.420096 222.753207,201.614016 C234.885769,187.277151 234.622834,165.900356 222.138374,151.863988 C207.730339,135.66681 188.431321,135.172572 165.103273,150.721309 C155.426087,133.553447 145.58086,116.521995 136.210101,99.2295848 C133.05093,93.4015266 129.561608,90.0209366 122.440622,88.7873178 C110.547271,86.7253555 102.868785,76.5124151 102.408155,65.0698097 C101.955433,53.7537294 108.621719,43.5249733 119.04224,39.5394355 C129.363912,35.5914599 141.476705,38.7783085 148.419765,47.554004 C154.093621,54.7244134 155.896602,62.7943365 152.911402,71.6372484 C152.081082,74.1025091 151.00562,76.4886916 149.841194,79.4106285 L149.841194,79.4106285 Z" fill="#4B4B4B"></path><path d="M167.706921,187.209935 L121.936499,187.209935 C117.54964,205.253587 108.074103,219.821756 91.7464461,229.085759 C79.0544063,236.285822 65.3738898,238.72736 50.8136292,236.376762 C24.0061432,232.053165 2.08568567,207.920497 0.156179306,180.745298 C-2.02835403,149.962159 19.1309765,122.599149 47.3341915,116.452801 C49.2814904,123.524363 51.2485589,130.663141 53.1958579,137.716911 C27.3195169,150.919004 18.3639187,167.553089 25.6054984,188.352614 C31.9811726,206.657224 50.0900643,216.690262 69.7528413,212.809503 C89.8327554,208.847688 99.9567329,192.160226 98.7211371,165.37844 C117.75722,165.37844 136.809118,165.180745 155.847178,165.475311 C163.280522,165.591951 169.019617,164.820939 174.620326,158.267339 C183.840836,147.48306 200.811003,148.455721 210.741239,158.640984 C220.88894,169.049642 220.402609,185.79839 209.663799,195.768166 C199.302587,205.38802 182.933414,204.874012 173.240413,194.508846 C171.247644,192.37176 169.677943,189.835329 167.706921,187.209935 L167.706921,187.209935 Z" fill="#4A4A4A"></path></g></svg>''' | |
|
138 | description = _('send JSON data to a url endpoint') | |
|
139 | ||
|
140 | @classmethod | |
|
141 | def icon(cls): | |
|
142 | return '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 239" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M119.540432,100.502743 C108.930124,118.338815 98.7646301,135.611455 88.3876025,152.753617 C85.7226696,157.154315 84.4040417,160.738531 86.5332204,166.333309 C92.4107024,181.787152 84.1193605,196.825836 68.5350381,200.908244 C53.8383677,204.759349 39.5192953,195.099955 36.6032893,179.365384 C34.0194114,165.437749 44.8274148,151.78491 60.1824106,149.608284 C61.4694072,149.424428 62.7821041,149.402681 64.944891,149.240571 C72.469175,136.623655 80.1773157,123.700312 88.3025935,110.073173 C73.611854,95.4654658 64.8677898,78.3885437 66.803227,57.2292132 C68.1712787,42.2715849 74.0527146,29.3462646 84.8033863,18.7517722 C105.393354,-1.53572199 136.805164,-4.82141828 161.048542,10.7510424 C184.333097,25.7086706 194.996783,54.8450075 185.906752,79.7822957 C179.052655,77.9239597 172.151111,76.049808 164.563565,73.9917997 C167.418285,60.1274266 165.306899,47.6765751 155.95591,37.0109123 C149.777932,29.9690049 141.850349,26.2780332 132.835442,24.9178894 C114.764113,22.1877169 97.0209573,33.7983633 91.7563309,51.5355878 C85.7800012,71.6669027 94.8245623,88.1111998 119.540432,100.502743 L119.540432,100.502743 Z" fill="#C73A63"></path><path d="M149.841194,79.4106285 C157.316054,92.5969067 164.905578,105.982857 172.427885,119.246236 C210.44865,107.483365 239.114472,128.530009 249.398582,151.063322 C261.81978,178.282014 253.328765,210.520191 228.933162,227.312431 C203.893073,244.551464 172.226236,241.605803 150.040866,219.46195 C155.694953,214.729124 161.376716,209.974552 167.44794,204.895759 C189.360489,219.088306 208.525074,218.420096 222.753207,201.614016 C234.885769,187.277151 234.622834,165.900356 222.138374,151.863988 C207.730339,135.66681 188.431321,135.172572 165.103273,150.721309 C155.426087,133.553447 145.58086,116.521995 136.210101,99.2295848 C133.05093,93.4015266 129.561608,90.0209366 122.440622,88.7873178 C110.547271,86.7253555 102.868785,76.5124151 102.408155,65.0698097 C101.955433,53.7537294 108.621719,43.5249733 119.04224,39.5394355 C129.363912,35.5914599 141.476705,38.7783085 148.419765,47.554004 C154.093621,54.7244134 155.896602,62.7943365 152.911402,71.6372484 C152.081082,74.1025091 151.00562,76.4886916 149.841194,79.4106285 L149.841194,79.4106285 Z" fill="#4B4B4B"></path><path d="M167.706921,187.209935 L121.936499,187.209935 C117.54964,205.253587 108.074103,219.821756 91.7464461,229.085759 C79.0544063,236.285822 65.3738898,238.72736 50.8136292,236.376762 C24.0061432,232.053165 2.08568567,207.920497 0.156179306,180.745298 C-2.02835403,149.962159 19.1309765,122.599149 47.3341915,116.452801 C49.2814904,123.524363 51.2485589,130.663141 53.1958579,137.716911 C27.3195169,150.919004 18.3639187,167.553089 25.6054984,188.352614 C31.9811726,206.657224 50.0900643,216.690262 69.7528413,212.809503 C89.8327554,208.847688 99.9567329,192.160226 98.7211371,165.37844 C117.75722,165.37844 136.809118,165.180745 155.847178,165.475311 C163.280522,165.591951 169.019617,164.820939 174.620326,158.267339 C183.840836,147.48306 200.811003,148.455721 210.741239,158.640984 C220.88894,169.049642 220.402609,185.79839 209.663799,195.768166 C199.302587,205.38802 182.933414,204.874012 173.240413,194.508846 C171.247644,192.37176 169.677943,189.835329 167.706921,187.209935 L167.706921,187.209935 Z" fill="#4A4A4A"></path></g></svg>''' | |
|
269 | 143 | |
|
270 | 144 | valid_events = [ |
|
271 | 145 | events.PullRequestCloseEvent, |
@@ -293,8 +167,8 b' class WebhookIntegrationType(Integration' | |||
|
293 | 167 | return schema |
|
294 | 168 | |
|
295 | 169 | def send_event(self, event): |
|
296 | log.debug('handling event %s with Webhook integration %s', | |
|
297 | event.name, self) | |
|
170 | log.debug( | |
|
171 | 'handling event %s with Webhook integration %s', event.name, self) | |
|
298 | 172 | |
|
299 | 173 | if event.__class__ not in self.valid_events: |
|
300 | 174 | log.debug('event not valid: %r' % event) |
@@ -313,8 +187,7 b' class WebhookIntegrationType(Integration' | |||
|
313 | 187 | if head_key and head_val: |
|
314 | 188 | headers = {head_key: head_val} |
|
315 | 189 | |
|
316 | handler = WebhookHandler( | |
|
317 | template_url, self.settings['secret_token'], headers) | |
|
190 | handler = WebhookDataHandler(template_url, headers) | |
|
318 | 191 | |
|
319 | 192 | url_calls = handler(event, data) |
|
320 | 193 | log.debug('webhook: calling following urls: %s', |
@@ -370,7 +243,10 b' def post_to_webhook(url_calls, settings)' | |||
|
370 | 243 | rhodecode.__version__) |
|
371 | 244 | } # updated below with custom ones, allows override |
|
372 | 245 | |
|
373 | for url, token, headers, data in url_calls: | |
|
246 | auth = get_auth(settings) | |
|
247 | token = get_web_token(settings) | |
|
248 | ||
|
249 | for url, headers, data in url_calls: | |
|
374 | 250 | req_session = requests.Session() |
|
375 | 251 | req_session.mount( # retry max N times |
|
376 | 252 | 'http://', requests.adapters.HTTPAdapter(max_retries=retries)) |
@@ -380,7 +256,6 b' def post_to_webhook(url_calls, settings)' | |||
|
380 | 256 | |
|
381 | 257 | headers = headers or {} |
|
382 | 258 | call_headers.update(headers) |
|
383 | auth = get_auth(settings) | |
|
384 | 259 | |
|
385 | 260 | log.debug('calling Webhook with method: %s, and auth:%s', |
|
386 | 261 | call_method, auth) |
@@ -392,4 +267,8 b' def post_to_webhook(url_calls, settings)' | |||
|
392 | 267 | }, headers=call_headers, auth=auth) |
|
393 | 268 | log.debug('Got Webhook response: %s', resp) |
|
394 | 269 | |
|
270 | try: | |
|
395 | 271 | resp.raise_for_status() # raise exception on a failed request |
|
272 | except Exception: | |
|
273 | log.error(resp.text) | |
|
274 | raise |
@@ -99,8 +99,11 b' class CachingQuery(Query):' | |||
|
99 | 99 | |
|
100 | 100 | """ |
|
101 | 101 | if hasattr(self, '_cache_parameters'): |
|
102 | return self.get_value(createfunc=lambda: | |
|
103 | list(Query.__iter__(self))) | |
|
102 | ||
|
103 | def caching_query(): | |
|
104 | return list(Query.__iter__(self)) | |
|
105 | ||
|
106 | return self.get_value(createfunc=caching_query) | |
|
104 | 107 | else: |
|
105 | 108 | return Query.__iter__(self) |
|
106 | 109 |
@@ -28,8 +28,9 b' from pygments.lexers.special import Text' | |||
|
28 | 28 | |
|
29 | 29 | from rhodecode.lib.helpers import ( |
|
30 | 30 | get_lexer_for_filenode, html_escape, get_custom_lexer) |
|
31 | from rhodecode.lib.utils2 import AttributeDict | |
|
31 | from rhodecode.lib.utils2 import AttributeDict, StrictAttributeDict | |
|
32 | 32 | from rhodecode.lib.vcs.nodes import FileNode |
|
33 | from rhodecode.lib.vcs.exceptions import VCSError, NodeDoesNotExistError | |
|
33 | 34 | from rhodecode.lib.diff_match_patch import diff_match_patch |
|
34 | 35 | from rhodecode.lib.diffs import LimitedDiffContainer |
|
35 | 36 | from pygments.lexers import get_lexer_by_name |
@@ -38,7 +39,7 b' plain_text_lexer = get_lexer_by_name(' | |||
|
38 | 39 | 'text', stripall=False, stripnl=False, ensurenl=False) |
|
39 | 40 | |
|
40 | 41 | |
|
41 | log = logging.getLogger() | |
|
42 | log = logging.getLogger(__name__) | |
|
42 | 43 | |
|
43 | 44 | |
|
44 | 45 | def filenode_as_lines_tokens(filenode, lexer=None): |
@@ -351,6 +352,16 b' def tokens_diff(old_tokens, new_tokens, ' | |||
|
351 | 352 | return old_tokens_result, new_tokens_result, similarity |
|
352 | 353 | |
|
353 | 354 | |
|
355 | def diffset_node_getter(commit): | |
|
356 | def get_node(fname): | |
|
357 | try: | |
|
358 | return commit.get_node(fname) | |
|
359 | except NodeDoesNotExistError: | |
|
360 | return None | |
|
361 | ||
|
362 | return get_node | |
|
363 | ||
|
364 | ||
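`diffset_node_getter` above returns a closure bound to one commit; a missing file yields `None` instead of raising. A runnable sketch with a stub commit object in place of the real VCS commit:

```python
# Sketch of diffset_node_getter with a stub commit; the real commit object
# comes from the VCS layer and raises NodeDoesNotExistError the same way.
class NodeDoesNotExistError(Exception):
    pass

class StubCommit(object):
    def __init__(self, files):
        self.files = files

    def get_node(self, fname):
        if fname not in self.files:
            raise NodeDoesNotExistError(fname)
        return self.files[fname]

def diffset_node_getter(commit):
    def get_node(fname):
        try:
            return commit.get_node(fname)
        except NodeDoesNotExistError:
            # deleted/renamed files simply resolve to None
            return None
    return get_node

getter = diffset_node_getter(StubCommit({'setup.py': 'content'}))
assert getter('setup.py') == 'content'
assert getter('missing.txt') is None
```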
|
354 | 365 | class DiffSet(object): |
|
355 | 366 | """ |
|
356 | 367 | An object for parsing the diff result from diffs.DiffProcessor and |
@@ -400,7 +411,12 b' class DiffSet(object):' | |||
|
400 | 411 | for patch in patchset: |
|
401 | 412 | diffset.file_stats[patch['filename']] = patch['stats'] |
|
402 | 413 | filediff = self.render_patch(patch) |
|
403 | filediff.diffset = diffset | |
|
414 | filediff.diffset = StrictAttributeDict(dict( | |
|
415 | source_ref=diffset.source_ref, | |
|
416 | target_ref=diffset.target_ref, | |
|
417 | repo_name=diffset.repo_name, | |
|
418 | source_repo_name=diffset.source_repo_name, | |
|
419 | )) | |
|
404 | 420 | diffset.files.append(filediff) |
|
405 | 421 | diffset.changed_files += 1 |
|
406 | 422 | if not patch['stats']['binary']: |
@@ -510,6 +526,7 b' class DiffSet(object):' | |||
|
510 | 526 | if target_file_path in self.comments_store: |
|
511 | 527 | for lineno, comments in self.comments_store[target_file_path].items(): |
|
512 | 528 | left_comments[lineno] = comments |
|
529 | ||
|
513 | 530 | # left comments are one that we couldn't place in diff lines. |
|
514 | 531 | # could be outdated, or the diff changed and this line is no |
|
515 | 532 | # longer available |
@@ -546,7 +563,7 b' class DiffSet(object):' | |||
|
546 | 563 | |
|
547 | 564 | result.lines.extend( |
|
548 | 565 | self.parse_lines(before, after, source_file, target_file)) |
|
549 | result.unified = self.as_unified(result.lines) | |
|
566 | result.unified = list(self.as_unified(result.lines)) | |
|
550 | 567 | result.sideside = result.lines |
|
551 | 568 | |
|
552 | 569 | return result |
@@ -601,8 +618,9 b' class DiffSet(object):' | |||
|
601 | 618 | original.lineno = before['old_lineno'] |
|
602 | 619 | original.content = before['line'] |
|
603 | 620 | original.action = self.action_to_op(before['action']) |
|
604 | original.comments = self.get_comments_for('old', | |
|
605 | source_file, before['old_lineno']) | |
|
621 | ||
|
622 | original.get_comment_args = ( | |
|
623 | source_file, 'o', before['old_lineno']) | |
|
606 | 624 | |
|
607 | 625 | if after: |
|
608 | 626 | if after['action'] == 'new-no-nl': |
@@ -614,8 +632,9 b' class DiffSet(object):' | |||
|
614 | 632 | modified.lineno = after['new_lineno'] |
|
615 | 633 | modified.content = after['line'] |
|
616 | 634 | modified.action = self.action_to_op(after['action']) |
|
617 | modified.comments = self.get_comments_for('new', | |
|
618 | target_file, after['new_lineno']) | |
|
635 | ||
|
636 | modified.get_comment_args = ( | |
|
637 | target_file, 'n', after['new_lineno']) | |
|
619 | 638 | |
|
620 | 639 | # diff the lines |
|
621 | 640 | if before_tokens and after_tokens: |
@@ -644,23 +663,6 b' class DiffSet(object):' | |||
|
644 | 663 | |
|
645 | 664 | return lines |
|
646 | 665 | |
|
647 | def get_comments_for(self, version, filename, line_number): | |
|
648 | if hasattr(filename, 'unicode_path'): | |
|
649 | filename = filename.unicode_path | |
|
650 | ||
|
651 | if not isinstance(filename, basestring): | |
|
652 | return None | |
|
653 | ||
|
654 | line_key = { | |
|
655 | 'old': 'o', | |
|
656 | 'new': 'n', | |
|
657 | }[version] + str(line_number) | |
|
658 | ||
|
659 | if filename in self.comments_store: | |
|
660 | file_comments = self.comments_store[filename] | |
|
661 | if line_key in file_comments: | |
|
662 | return file_comments.pop(line_key) | |
|
663 | ||
|
664 | 666 | def get_line_tokens(self, line_text, line_number, file=None): |
|
665 | 667 | filenode = None |
|
666 | 668 | filename = None |
@@ -717,25 +719,25 b' class DiffSet(object):' | |||
|
717 | 719 | if line.original.action == ' ': |
|
718 | 720 | yield (line.original.lineno, line.modified.lineno, |
|
719 | 721 | line.original.action, line.original.content, |
|
720 | line.original.comments) | |
|
722 | line.original.get_comment_args) | |
|
721 | 723 | continue |
|
722 | 724 | |
|
723 | 725 | if line.original.action == '-': |
|
724 | 726 | yield (line.original.lineno, None, |
|
725 | 727 | line.original.action, line.original.content, |
|
726 | line.original.comments) | |
|
728 | line.original.get_comment_args) | |
|
727 | 729 | |
|
728 | 730 | if line.modified.action == '+': |
|
729 | 731 | buf.append(( |
|
730 | 732 | None, line.modified.lineno, |
|
731 | 733 | line.modified.action, line.modified.content, |
|
732 | line.modified.comments)) | |
|
734 | line.modified.get_comment_args)) | |
|
733 | 735 | continue |
|
734 | 736 | |
|
735 | 737 | if line.modified: |
|
736 | 738 | yield (None, line.modified.lineno, |
|
737 | 739 | line.modified.action, line.modified.content, |
|
738 | line.modified.comments) | |
|
740 | line.modified.get_comment_args) | |
|
739 | 741 | |
|
740 | 742 | for b in buf: |
|
741 | 743 | yield b |
@@ -126,7 +126,7 b' class DbManage(object):' | |||
|
126 | 126 | log.info("Deleting (%s) cache keys now...", total) |
|
127 | 127 | CacheKey.delete_all_cache() |
|
128 | 128 | |
|
129 | def upgrade(self): | |
|
129 | def upgrade(self, version=None): | |
|
130 | 130 | """ |
|
131 | 131 | Upgrades given database schema to given revision following |
|
132 | 132 | all needed steps, to perform the upgrade |
@@ -157,9 +157,9 b' class DbManage(object):' | |||
|
157 | 157 | db_uri = self.dburi |
|
158 | 158 | |
|
159 | 159 | try: |
|
160 | curr_version = api.db_version(db_uri, repository_path) | |
|
161 | msg = ('Found current database under version ' | |
|
162 | 'control with version %s' % curr_version) | |
|
160 | curr_version = version or api.db_version(db_uri, repository_path) | |
|
161 | msg = ('Found current database db_uri under version ' | |
|
162 | 'control with version {}'.format(curr_version)) | |
|
163 | 163 | |
|
164 | 164 | except (RuntimeError, DatabaseNotControlledError): |
|
165 | 165 | curr_version = 1 |
@@ -23,16 +23,19 b'' | |||
|
23 | 23 | Set of diffing helpers, previously part of vcs |
|
24 | 24 | """ |
|
25 | 25 | |
|
26 | import os | |
|
26 | 27 | import re |
|
28 | import bz2 | |
|
29 | ||
|
27 | 30 | import collections |
|
28 | 31 | import difflib |
|
29 | 32 | import logging |
|
30 | ||
|
33 | import cPickle as pickle | |
|
31 | 34 | from itertools import tee, imap |
|
32 | 35 | |
|
33 | 36 | from rhodecode.lib.vcs.exceptions import VCSError |
|
34 | 37 | from rhodecode.lib.vcs.nodes import FileNode, SubModuleNode |
|
35 | from rhodecode.lib.utils2 import safe_unicode | |
|
38 | from rhodecode.lib.utils2 import safe_unicode, safe_str | |
|
36 | 39 | |
|
37 | 40 | log = logging.getLogger(__name__) |
|
38 | 41 | |
@@ -1129,3 +1132,82 b' class LineNotInDiffException(Exception):' | |||
|
1129 | 1132 | |
|
1130 | 1133 | class DiffLimitExceeded(Exception): |
|
1131 | 1134 | pass |
|
1135 | ||
|
1136 | ||
|
1137 | def cache_diff(cached_diff_file, diff, commits): | |
|
1138 | ||
|
1139 | struct = { | |
|
1140 | 'version': 'v1', | |
|
1141 | 'diff': diff, | |
|
1142 | 'commits': commits | |
|
1143 | } | |
|
1144 | ||
|
1145 | try: | |
|
1146 | with bz2.BZ2File(cached_diff_file, 'wb') as f: | |
|
1147 | pickle.dump(struct, f) | |
|
1148 | log.debug('Saved diff cache under %s', cached_diff_file) | |
|
1149 | except Exception: | |
|
1150 | log.warn('Failed to save cache', exc_info=True) | |
|
1151 | # cleanup file to not store it "damaged" | |
|
1152 | try: | |
|
1153 | os.remove(cached_diff_file) | |
|
1154 | except Exception: | |
|
1155 | log.exception('Failed to cleanup path %s', cached_diff_file) | |
|
1156 | ||
|
1157 | ||
|
1158 | def load_cached_diff(cached_diff_file): | |
|
1159 | ||
|
1160 | default_struct = { | |
|
1161 | 'version': 'v1', | |
|
1162 | 'diff': None, | |
|
1163 | 'commits': None | |
|
1164 | } | |
|
1165 | ||
|
1166 | has_cache = os.path.isfile(cached_diff_file) | |
|
1167 | if not has_cache: | |
|
1168 | return default_struct | |
|
1169 | ||
|
1170 | data = None | |
|
1171 | try: | |
|
1172 | with bz2.BZ2File(cached_diff_file, 'rb') as f: | |
|
1173 | data = pickle.load(f) | |
|
1174 | log.debug('Loaded diff cache from %s', cached_diff_file) | |
|
1175 | except Exception: | |
|
1176 | log.warn('Failed to read diff cache file', exc_info=True) | |
|
1177 | ||
|
1178 | if not data: | |
|
1179 | data = default_struct | |
|
1180 | ||
|
1181 | if not isinstance(data, dict): | |
|
1182 | # old version of data ? | |
|
1183 | data = default_struct | |
|
1184 | ||
|
1185 | return data | |
|
1186 | ||
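`cache_diff` and `load_cached_diff` above store a versioned dict as a bz2-compressed pickle. A round-trip sketch (Python 3 `pickle` here; the diff uses `cPickle` under Python 2):

```python
# Round-trip of the bz2+pickle diff cache format; the path and payload are
# hypothetical, and Python 3 pickle stands in for cPickle.
import bz2
import os
import pickle
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'diff_cache')
struct = {'version': 'v1', 'diff': 'fake-diff-data', 'commits': ['abc123']}

with bz2.BZ2File(path, 'wb') as f:
    pickle.dump(struct, f)

with bz2.BZ2File(path, 'rb') as f:
    loaded = pickle.load(f)

assert loaded == struct
```

Wrapping both ends in try/except, as the diff does, keeps a corrupt or partially-written cache file from breaking diff rendering: a failed write is removed, a failed read falls back to the default struct.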
|
1187 | ||
|
1188 | def generate_diff_cache_key(*args): | |
|
1189 | """ | |
|
1190 | Helper to generate a cache key using arguments | |
|
1191 | """ | |
|
1192 | def arg_mapper(input_param): | |
|
1193 | input_param = safe_str(input_param) | |
|
1194 | # we cannot allow '/' in arguments since it would allow | |
|
1195 | # subdirectory usage | |
|
1196 | input_param = input_param.replace('/', '_') | |
|
1197 | return input_param or None # prevent empty string arguments | |
|
1198 | ||
|
1199 | return '_'.join([ | |
|
1200 | '{}' for i in range(len(args))]).format(*map(arg_mapper, args)) | |
|
1201 | ||
|
1202 | ||
|
1203 | def diff_cache_exist(cache_storage, *args): | |
|
1204 | """ | |
|
1205 | Based on all generated arguments check and return a cache path | |
|
1206 | """ | |
|
1207 | cache_key = generate_diff_cache_key(*args) | |
|
1208 | cache_file_path = os.path.join(cache_storage, cache_key) | |
|
1209 | # prevent path traversal attacks using some param that have e.g '../../' | |
|
1210 | if not os.path.abspath(cache_file_path).startswith(cache_storage): | |
|
1211 | raise ValueError('Final path must be within {}'.format(cache_storage)) | |
|
1212 | ||
|
1213 | return cache_file_path |
@@ -132,6 +132,7 b' class VCSServerUnavailable(HTTPBadGatewa' | |||
|
132 | 132 | 'Incorrect vcs.server=host:port', |
|
133 | 133 | 'Incorrect vcs.server.protocol', |
|
134 | 134 | ] |
|
135 | ||
|
135 | 136 | def __init__(self, message=''): |
|
136 | 137 | self.explanation = 'Could not connect to VCS Server' |
|
137 | 138 | if message: |
@@ -25,6 +25,7 b' Consists of functions to typically be us' | |||
|
25 | 25 | available to Controllers. This module is available to both as 'h'. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | import os | |
|
28 | 29 | import random |
|
29 | 30 | import hashlib |
|
30 | 31 | import StringIO |
@@ -742,18 +743,23 b' short_id = lambda x: x[:12]' | |||
|
742 | 743 | hide_credentials = lambda x: ''.join(credentials_filter(x)) |
|
743 | 744 | |
|
744 | 745 | |
|
746 | import pytz | |
|
747 | import tzlocal | |
|
748 | local_timezone = tzlocal.get_localzone() | |
|
749 | ||
|
750 | ||
|
745 | 751 | def age_component(datetime_iso, value=None, time_is_local=False): |
|
746 | 752 | title = value or format_date(datetime_iso) |
|
747 | 753 | tzinfo = '+00:00' |
|
748 | 754 | |
|
749 | 755 | # detect if we have a timezone info, otherwise, add it |
|
750 | if isinstance(datetime_iso, datetime) and not datetime_iso.tzinfo: | |
|
751 | if time_is_local: | |
|
752 | tzinfo = time.strftime("+%H:%M", | |
|
753 | time.gmtime( | |
|
754 | (datetime.now() - datetime.utcnow()).seconds + 1 | |
|
755 | ) | |
|
756 | ) | |
|
756 | if time_is_local and isinstance(datetime_iso, datetime) and not datetime_iso.tzinfo: | |
|
757 | force_timezone = os.environ.get('RC_TIMEZONE', '') | |
|
758 | if force_timezone: | |
|
759 | force_timezone = pytz.timezone(force_timezone) | |
|
760 | timezone = force_timezone or local_timezone | |
|
761 | offset = timezone.localize(datetime_iso).strftime('%z') | |
|
762 | tzinfo = '{}:{}'.format(offset[:-2], offset[-2:]) | |
|
757 | 763 | |
|
758 | 764 | return literal( |
|
759 | 765 | '<time class="timeago tooltip" ' |
@@ -896,6 +902,13 b' def link_to_user(author, length=0, **kwa' | |||
|
896 | 902 | return escape(display_person) |
|
897 | 903 | |
|
898 | 904 | |
|
905 | def link_to_group(users_group_name, **kwargs): | |
|
906 | return link_to( | |
|
907 | escape(users_group_name), | |
|
908 | route_path('user_group_profile', user_group_name=users_group_name), | |
|
909 | **kwargs) | |
|
910 | ||
|
911 | ||
|
899 | 912 | def person(author, show_attr="username_and_name"): |
|
900 | 913 | user = discover_user(author) |
|
901 | 914 | if user: |
@@ -18,10 +18,13 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 |
import |
|
|
21 | import os | |
|
22 | import time | |
|
22 | 23 | import logging |
|
24 | import tempfile | |
|
23 | 25 | import traceback |
|
24 | 26 | import threading |
|
27 | ||
|
25 | 28 | from BaseHTTPServer import BaseHTTPRequestHandler |
|
26 | 29 | from SocketServer import TCPServer |
|
27 | 30 | |
@@ -30,14 +33,28 b' from rhodecode.model import meta' | |||
|
30 | 33 | from rhodecode.lib.base import bootstrap_request, bootstrap_config |
|
31 | 34 | from rhodecode.lib import hooks_base |
|
32 | 35 | from rhodecode.lib.utils2 import AttributeDict |
|
36 | from rhodecode.lib.ext_json import json | |
|
33 | 37 | |
|
34 | 38 | |
|
35 | 39 | log = logging.getLogger(__name__) |
|
36 | 40 | |
|
37 | 41 | |
|
38 | 42 | class HooksHttpHandler(BaseHTTPRequestHandler): |
|
43 | ||
|
39 | 44 | def do_POST(self): |
|
40 | 45 | method, extras = self._read_request() |
|
46 | txn_id = getattr(self.server, 'txn_id', None) | |
|
47 | if txn_id: | |
|
48 | from rhodecode.lib.caches import compute_key_from_params | |
|
49 | log.debug('Computing TXN_ID based on `%s`:`%s`', | |
|
50 | extras['repository'], extras['txn_id']) | |
|
51 | computed_txn_id = compute_key_from_params( | |
|
52 | extras['repository'], extras['txn_id']) | |
|
53 | if txn_id != computed_txn_id: | |
|
54 | raise Exception( | |
|
55 | 'TXN ID fail: expected {} got {} instead'.format( | |
|
56 | txn_id, computed_txn_id)) | |
|
57 | ||
|
41 | 58 | try: |
|
42 | 59 | result = self._call_hook(method, extras) |
|
43 | 60 | except Exception as e: |
@@ -77,13 +94,14 b' class HooksHttpHandler(BaseHTTPRequestHa' | |||
|
77 | 94 | |
|
78 | 95 | message = format % args |
|
79 | 96 | |
|
80 | # TODO: mikhail: add different log levels support | |
|
81 | 97 | log.debug( |
|
82 | 98 | "%s - - [%s] %s", self.client_address[0], |
|
83 | 99 | self.log_date_time_string(), message) |
|
84 | 100 | |
|
85 | 101 | |
|
86 | 102 | class DummyHooksCallbackDaemon(object): |
|
103 | hooks_uri = '' | |
|
104 | ||
|
87 | 105 | def __init__(self): |
|
88 | 106 | self.hooks_module = Hooks.__module__ |
|
89 | 107 | |
@@ -101,8 +119,8 b' class ThreadedHookCallbackDaemon(object)' | |||
|
101 | 119 | _daemon = None |
|
102 | 120 | _done = False |
|
103 | 121 | |
|
104 | def __init__(self): | |
|
105 | self._prepare() | |
|
122 | def __init__(self, txn_id=None, port=None): | |
|
123 | self._prepare(txn_id=txn_id, port=port) | |
|
106 | 124 | |
|
107 | 125 | def __enter__(self): |
|
108 | 126 | self._run() |
@@ -112,7 +130,7 b' class ThreadedHookCallbackDaemon(object)' | |||
|
112 | 130 | log.debug('Callback daemon exiting now...') |
|
113 | 131 | self._stop() |
|
114 | 132 | |
|
115 | def _prepare(self): | |
|
133 | def _prepare(self, txn_id=None, port=None): | |
|
116 | 134 | raise NotImplementedError() |
|
117 | 135 | |
|
118 | 136 | def _run(self): |
@@ -135,15 +153,18 b' class HttpHooksCallbackDaemon(ThreadedHo' | |||
|
135 | 153 | # request and wastes cpu at all other times. |
|
136 | 154 | POLL_INTERVAL = 0.01 |
|
137 | 155 | |
|
138 | def _prepare(self): | |
|
139 | log.debug("Preparing HTTP callback daemon and registering hook object") | |
|
140 | ||
|
156 | def _prepare(self, txn_id=None, port=None): | |
|
141 | 157 | self._done = False |
|
142 | self._daemon = TCPServer((self.IP_ADDRESS, 0), HooksHttpHandler) | |
|
158 | self._daemon = TCPServer((self.IP_ADDRESS, port or 0), HooksHttpHandler) | |
|
143 | 159 | _, port = self._daemon.server_address |
|
144 | 160 | self.hooks_uri = '{}:{}'.format(self.IP_ADDRESS, port) |
|
161 | self.txn_id = txn_id | |
|
162 | # inject transaction_id for later verification | |
|
163 | self._daemon.txn_id = self.txn_id | |
|
145 | 164 | |
|
146 | log.debug("Hooks uri is: %s", self.hooks_uri) | |
|
165 | log.debug( | |
|
166 | "Preparing HTTP callback daemon at `%s` and registering hook object", | |
|
167 | self.hooks_uri) | |
|
147 | 168 | |
|
148 | 169 | def _run(self): |
|
149 | 170 | log.debug("Running event loop of callback daemon in background thread") |
@@ -160,26 +181,67 b' class HttpHooksCallbackDaemon(ThreadedHo' | |||
|
160 | 181 | self._callback_thread.join() |
|
161 | 182 | self._daemon = None |
|
162 | 183 | self._callback_thread = None |
|
184 | if self.txn_id: | |
|
185 | txn_id_file = get_txn_id_data_path(self.txn_id) | |
|
186 | log.debug('Cleaning up TXN ID %s', txn_id_file) | |
|
187 | if os.path.isfile(txn_id_file): | |
|
188 | os.remove(txn_id_file) | |
|
189 | ||
|
163 | 190 | log.debug("Background thread done.") |
|
164 | 191 | |
|
165 | 192 | |
|
166 | def prepare_callback_daemon(extras, protocol, use_direct_calls): | |
|
167 | callback_daemon = None | |
|
193 | def get_txn_id_data_path(txn_id): | |
|
194 | root = tempfile.gettempdir() | |
|
195 | return os.path.join(root, 'rc_txn_id_{}'.format(txn_id)) | |
|
196 | ||
|
197 | ||
|
198 | def store_txn_id_data(txn_id, data_dict): | |
|
199 | if not txn_id: | |
|
200 | log.warning('Cannot store txn_id because it is empty') | |
|
201 | return | |
|
202 | ||
|
203 | path = get_txn_id_data_path(txn_id) | |
|
204 | try: | |
|
205 | with open(path, 'wb') as f: | |
|
206 | f.write(json.dumps(data_dict)) | |
|
207 | except Exception: | |
|
208 | log.exception('Failed to write txn_id metadata') | |
|
168 | 209 | |
|
210 | ||
|
211 | def get_txn_id_from_store(txn_id): | |
|
212 | """ | |
|
213 | Reads the txn_id from the store and, if present, returns its data for the callback daemon | |
|
214 | """ | |
|
215 | path = get_txn_id_data_path(txn_id) | |
|
216 | try: | |
|
217 | with open(path, 'rb') as f: | |
|
218 | return json.loads(f.read()) | |
|
219 | except Exception: | |
|
220 | return {} | |
|
221 | ||
|
222 | ||
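The three helpers above form a small file-backed store keyed by transaction id. A standalone sketch of the same round trip, using only the standard library (names are illustrative):

```python
import json
import os
import tempfile


def txn_data_path(txn_id):
    # one metadata file per transaction id, kept in the system temp dir
    return os.path.join(tempfile.gettempdir(), 'rc_txn_id_{}'.format(txn_id))


def store_txn_data(txn_id, data_dict):
    with open(txn_data_path(txn_id), 'wb') as f:
        f.write(json.dumps(data_dict).encode('utf-8'))


def read_txn_data(txn_id):
    try:
        with open(txn_data_path(txn_id), 'rb') as f:
            return json.loads(f.read().decode('utf-8'))
    except Exception:
        # a missing or corrupt metadata file degrades to an empty dict,
        # which callers treat as "no port to re-use"
        return {}


store_txn_data('demo-txn', {'port': 54321})
assert read_txn_data('demo-txn') == {'port': 54321}
assert read_txn_data('missing-txn') == {}
os.remove(txn_data_path('demo-txn'))
```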
|
223 | def prepare_callback_daemon(extras, protocol, use_direct_calls, txn_id=None): | |
|
224 | txn_details = get_txn_id_from_store(txn_id) | |
|
225 | port = txn_details.get('port', 0) | |
|
169 | 226 | if use_direct_calls: |
|
170 | 227 | callback_daemon = DummyHooksCallbackDaemon() |
|
171 | 228 | extras['hooks_module'] = callback_daemon.hooks_module |
|
172 | 229 | else: |
|
173 | 230 | if protocol == 'http': |
|
174 | callback_daemon = HttpHooksCallbackDaemon() | |
|
231 | callback_daemon = HttpHooksCallbackDaemon(txn_id=txn_id, port=port) | |
|
175 | 232 | else: |
|
176 | 233 | log.error('Unsupported callback daemon protocol "%s"', protocol) |
|
177 | 234 | raise Exception('Unsupported callback daemon protocol.') |
|
178 | 235 | |
|
179 | 236 |
|
|
180 | 237 |
|
|
238 | extras['time'] = time.time() | |
|
181 | 239 | |
|
182 | log.debug('Prepared a callback daemon: %s', callback_daemon) | |
|
240 | # register txn_id | |
|
241 | extras['txn_id'] = txn_id | |
|
242 | ||
|
243 | log.debug('Prepared a callback daemon: %s at url `%s`', | |
|
244 | callback_daemon.__class__.__name__, callback_daemon.hooks_uri) | |
|
183 | 245 | return callback_daemon, extras |
|
184 | 246 | |
|
185 | 247 |
@@ -18,17 +18,21 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | import base64 | |
|
21 | 22 | import logging |
|
22 | 23 | import urllib |
|
23 | 24 | from urlparse import urljoin |
|
24 | 25 | |
|
25 | ||
|
26 | 26 | import requests |
|
27 | 27 | from webob.exc import HTTPNotAcceptable |
|
28 | 28 | |
|
29 | from rhodecode.lib import caches | |
|
29 | 30 | from rhodecode.lib.middleware import simplevcs |
|
30 | 31 | from rhodecode.lib.utils import is_valid_repo |
|
31 | from rhodecode.lib.utils2 import str2bool | |
|
32 | from rhodecode.lib.utils2 import str2bool, safe_int | |
|
33 | from rhodecode.lib.ext_json import json | |
|
34 | from rhodecode.lib.hooks_daemon import store_txn_id_data | |
|
35 | ||
|
32 | 36 | |
|
33 | 37 | log = logging.getLogger(__name__) |
|
34 | 38 | |
@@ -39,7 +43,6 b' class SimpleSvnApp(object):' | |||
|
39 | 43 | 'transfer-encoding', 'content-length'] |
|
40 | 44 | rc_extras = {} |
|
41 | 45 | |
|
42 | ||
|
43 | 46 | def __init__(self, config): |
|
44 | 47 | self.config = config |
|
45 | 48 | |
@@ -52,9 +55,19 b' class SimpleSvnApp(object):' | |||
|
52 | 55 | # length, then we should transfer the payload in one request. |
|
53 | 56 | if environ['REQUEST_METHOD'] == 'MKCOL' or 'CONTENT_LENGTH' in environ: |
|
54 | 57 | data = data.read() |
|
58 | if data.startswith('(create-txn-with-props'): | |
|
59 | # store on-the-fly our rc_extra using svn revision properties | |
|
60 | # those can be read later on in hooks executed so we have a way | |
|
61 | # to pass in the data into svn hooks | |
|
62 | rc_data = base64.urlsafe_b64encode(json.dumps(self.rc_extras)) | |
|
63 | rc_data_len = len(rc_data) | |
|
64 | # the header defines the data length, followed by the serialized data | |
|
65 | skel = ' rc-scm-extras {} {}'.format(rc_data_len, rc_data) | |
|
66 | data = data[:-2] + skel + '))' | |
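The splice keeps the svn skel body well formed: the original payload ends in `))`, so the code drops those two closing parens, appends the extra `rc-scm-extras` item, and closes again. A hedged sketch of the same transformation (the body string here is a simplified stand-in for a real svn transaction payload):

```python
import base64
import json


def inject_rc_extras(body, rc_extras):
    # serialize the extras and base64-encode them so the payload is safe
    # to embed in the svn skel body
    rc_data = base64.urlsafe_b64encode(
        json.dumps(rc_extras).encode('utf-8')).decode('ascii')
    # item format: name, payload length, payload
    skel = ' rc-scm-extras {} {}'.format(len(rc_data), rc_data)
    # drop the final '))', append our item, then close the skel again
    return body[:-2] + skel + '))'


body = '(create-txn-with-props (3:foo 3:bar))'
patched = inject_rc_extras(body, {'repository': 'demo'})
assert patched.startswith('(create-txn-with-props')
assert 'rc-scm-extras' in patched
assert patched.endswith('))')
```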
|
55 | 67 | |
|
56 | 68 | log.debug('Calling: %s method via `%s`', environ['REQUEST_METHOD'], |
|
57 | 69 | self._get_url(environ['PATH_INFO'])) |
|
70 | ||
|
58 | 71 | response = requests.request( |
|
59 | 72 | environ['REQUEST_METHOD'], self._get_url(environ['PATH_INFO']), |
|
60 | 73 | data=data, headers=request_headers) |
@@ -70,6 +83,14 b' class SimpleSvnApp(object):' | |||
|
70 | 83 | log.debug('got response code: %s', response.status_code) |
|
71 | 84 | |
|
72 | 85 | response_headers = self._get_response_headers(response.headers) |
|
86 | ||
|
87 | if response.headers.get('SVN-Txn-name'): | |
|
88 | svn_tx_id = response.headers.get('SVN-Txn-name') | |
|
89 | txn_id = caches.compute_key_from_params( | |
|
90 | self.config['repository'], svn_tx_id) | |
|
91 | port = safe_int(self.rc_extras['hooks_uri'].split(':')[-1]) | |
|
92 | store_txn_id_data(txn_id, {'port': port}) | |
|
93 | ||
|
73 | 94 | start_response( |
|
74 | 95 | '{} {}'.format(response.status_code, response.reason), |
|
75 | 96 | response_headers) |
@@ -156,6 +177,14 b' class SimpleSvn(simplevcs.SimpleVCS):' | |||
|
156 | 177 | if environ['REQUEST_METHOD'] in self.READ_ONLY_COMMANDS |
|
157 | 178 | else 'push') |
|
158 | 179 | |
|
180 | def _should_use_callback_daemon(self, extras, environ, action): | |
|
181 | # only MERGE command triggers hooks, so we don't want to start | |
|
182 | # hooks server too many times. POST however starts the svn transaction | |
|
183 | # so we also need to run the init of callback daemon of POST | |
|
184 | if environ['REQUEST_METHOD'] in ['MERGE', 'POST']: | |
|
185 | return True | |
|
186 | return False | |
|
187 | ||
|
159 | 188 | def _create_wsgi_app(self, repo_path, repo_name, config): |
|
160 | 189 | if self._is_svn_enabled(): |
|
161 | 190 | return SimpleSvnApp(config) |
@@ -28,10 +28,12 b' import re' | |||
|
28 | 28 | import logging |
|
29 | 29 | import importlib |
|
30 | 30 | from functools import wraps |
|
31 | from StringIO import StringIO | |
|
32 | from lxml import etree | |
|
31 | 33 | |
|
32 | 34 | import time |
|
33 | 35 | from paste.httpheaders import REMOTE_USER, AUTH_TYPE |
|
34 | # TODO(marcink): check if we should use webob.exc here ? | |
|
36 | ||
|
35 | 37 | from pyramid.httpexceptions import ( |
|
36 | 38 | HTTPNotFound, HTTPForbidden, HTTPNotAcceptable, HTTPInternalServerError) |
|
37 | 39 | from zope.cachedescriptors.property import Lazy as LazyProperty |
@@ -43,9 +45,7 b' from rhodecode.lib import caches' | |||
|
43 | 45 | from rhodecode.lib.auth import AuthUser, HasPermissionAnyMiddleware |
|
44 | 46 | from rhodecode.lib.base import ( |
|
45 | 47 | BasicAuth, get_ip_addr, get_user_agent, vcs_operation_context) |
|
46 | from rhodecode.lib.exceptions import ( | |
|
47 | HTTPLockedRC, HTTPRequirementError, UserCreationError, | |
|
48 | NotAllowedToCreateUserError) | |
|
48 | from rhodecode.lib.exceptions import (UserCreationError, NotAllowedToCreateUserError) | |
|
49 | 49 | from rhodecode.lib.hooks_daemon import prepare_callback_daemon |
|
50 | 50 | from rhodecode.lib.middleware import appenlight |
|
51 | 51 | from rhodecode.lib.middleware.utils import scm_app_http |
@@ -53,6 +53,7 b' from rhodecode.lib.utils import is_valid' | |||
|
53 | 53 | from rhodecode.lib.utils2 import safe_str, fix_PATH, str2bool, safe_unicode |
|
54 | 54 | from rhodecode.lib.vcs.conf import settings as vcs_settings |
|
55 | 55 | from rhodecode.lib.vcs.backends import base |
|
56 | ||
|
56 | 57 | from rhodecode.model import meta |
|
57 | 58 | from rhodecode.model.db import User, Repository, PullRequest |
|
58 | 59 | from rhodecode.model.scm import ScmModel |
@@ -62,6 +63,28 b' from rhodecode.model.settings import Set' | |||
|
62 | 63 | log = logging.getLogger(__name__) |
|
63 | 64 | |
|
64 | 65 | |
|
66 | def extract_svn_txn_id(acl_repo_name, data): | |
|
67 | """ | |
|
68 | Helper method for extraction of svn txn_id from submited XML data during | |
|
69 | POST operations | |
|
70 | """ | |
|
71 | try: | |
|
72 | root = etree.fromstring(data) | |
|
73 | pat = re.compile(r'/txn/(?P<txn_id>.*)') | |
|
74 | for el in root: | |
|
75 | if el.tag == '{DAV:}source': | |
|
76 | for sub_el in el: | |
|
77 | if sub_el.tag == '{DAV:}href': | |
|
78 | match = pat.search(sub_el.text) | |
|
79 | if match: | |
|
80 | svn_tx_id = match.groupdict()['txn_id'] | |
|
81 | txn_id = caches.compute_key_from_params( | |
|
82 | acl_repo_name, svn_tx_id) | |
|
83 | return txn_id | |
|
84 | except Exception: | |
|
85 | log.exception('Failed to extract txn_id') | |
|
86 | ||
|
87 | ||
|
65 | 88 | def initialize_generator(factory): |
|
66 | 89 | """ |
|
67 | 90 | Initializes the returned generator by draining its first element. |
@@ -359,8 +382,9 b' class SimpleVCS(object):' | |||
|
359 | 382 | # check if we have SSL required ! if not it's a bad request ! |
|
360 | 383 | require_ssl = str2bool(self.repo_vcs_config.get('web', 'push_ssl')) |
|
361 | 384 | if require_ssl and org_proto == 'http': |
|
362 | log.debug('proto is %s and SSL is required BAD REQUEST !', | |
|
363 | org_proto) | |
|
385 | log.debug( | |
|
386 | 'Bad request: detected protocol is `%s` and ' | |
|
387 | 'SSL/HTTPS is required.', org_proto) | |
|
364 | 388 | return False |
|
365 | 389 | return True |
|
366 | 390 | |
@@ -420,7 +444,8 b' class SimpleVCS(object):' | |||
|
420 | 444 | # Check if the shadow repo actually exists, in case someone refers |
|
421 | 445 | # to it, and it has been deleted because of successful merge. |
|
422 | 446 | if self.is_shadow_repo and not self.is_shadow_repo_dir: |
|
423 | log.debug('Shadow repo detected, and shadow repo dir `%s` is missing', | |
|
447 | log.debug( | |
|
448 | 'Shadow repo detected, and shadow repo dir `%s` is missing', | |
|
424 | 449 |
|
|
425 | 450 | return HTTPNotFound()(environ, start_response) |
|
426 | 451 | |
@@ -436,7 +461,8 b' class SimpleVCS(object):' | |||
|
436 | 461 | anonymous_perm = self._check_permission( |
|
437 | 462 | action, anonymous_user, self.acl_repo_name, ip_addr, |
|
438 | 463 | plugin_id='anonymous_access', |
|
439 |
plugin_cache_active=plugin_cache_active, |
|
|
464 | plugin_cache_active=plugin_cache_active, | |
|
465 | cache_ttl=cache_ttl, | |
|
440 | 466 | ) |
|
441 | 467 | else: |
|
442 | 468 | anonymous_perm = False |
@@ -562,15 +588,25 b' class SimpleVCS(object):' | |||
|
562 | 588 | also handles the locking exceptions which will be triggered when |
|
563 | 589 | the first chunk is produced by the underlying WSGI application. |
|
564 | 590 | """ |
|
565 | callback_daemon, extras = self._prepare_callback_daemon(extras) | |
|
591 | txn_id = '' | |
|
592 | if 'CONTENT_LENGTH' in environ and environ['REQUEST_METHOD'] == 'MERGE': | |
|
593 | # case for SVN, we want to re-use the callback daemon port | |
|
594 | # so we use the txn_id, for this we peek the body, and still save | |
|
595 | # it as wsgi.input | |
|
596 | data = environ['wsgi.input'].read() | |
|
597 | environ['wsgi.input'] = StringIO(data) | |
|
598 | txn_id = extract_svn_txn_id(self.acl_repo_name, data) | |
|
599 | ||
|
600 | callback_daemon, extras = self._prepare_callback_daemon( | |
|
601 | extras, environ, action, txn_id=txn_id) | |
|
602 | log.debug('HOOKS extras is %s', extras) | |
|
603 | ||
|
566 | 604 | config = self._create_config(extras, self.acl_repo_name) |
|
567 | log.debug('HOOKS extras is %s', extras) | |
|
568 | 605 | app = self._create_wsgi_app(repo_path, self.url_repo_name, config) |
|
606 | with callback_daemon: | |
|
569 | 607 | app.rc_extras = extras |
|
570 | 608 | |
|
571 | 609 | try: |
|
572 | with callback_daemon: | |
|
573 | try: | |
|
574 | 610 |
|
|
575 | 611 |
|
|
576 | 612 |
|
@@ -582,28 +618,12 b' class SimpleVCS(object):' | |||
|
582 | 618 |
|
|
583 | 619 |
|
|
584 | 620 | |
|
621 | # iter content | |
|
585 | 622 |
|
|
586 | 623 |
|
|
587 | except Exception as exc: | |
|
588 | # TODO: martinb: Exceptions are only raised in case of the Pyro4 | |
|
589 | # backend. Refactor this except block after dropping Pyro4 support. | |
|
590 | # TODO: johbo: Improve "translating" back the exception. | |
|
591 | if getattr(exc, '_vcs_kind', None) == 'repo_locked': | |
|
592 | exc = HTTPLockedRC(*exc.args) | |
|
593 | _code = rhodecode.CONFIG.get('lock_ret_code') | |
|
594 | log.debug('Repository LOCKED ret code %s!', (_code,)) | |
|
595 | elif getattr(exc, '_vcs_kind', None) == 'requirement': | |
|
596 | log.debug( | |
|
597 | 'Repository requires features unknown to this Mercurial') | |
|
598 | exc = HTTPRequirementError(*exc.args) | |
|
599 | else: | |
|
600 | raise | |
|
601 | 624 | |
|
602 | for chunk in exc(environ, start_response): | |
|
603 | yield chunk | |
|
604 | finally: | |
|
625 | try: | |
|
605 | 626 | # invalidate cache on push |
|
606 | try: | |
|
607 | 627 | if action == 'push': |
|
608 | 628 | self._invalidate_cache(self.url_repo_name) |
|
609 | 629 | finally: |
@@ -631,10 +651,18 b' class SimpleVCS(object):' | |||
|
631 | 651 | """Create a safe config representation.""" |
|
632 | 652 | raise NotImplementedError() |
|
633 | 653 | |
|
634 |
def _ |
|
|
654 | def _should_use_callback_daemon(self, extras, environ, action): | |
|
655 | return True | |
|
656 | ||
|
657 | def _prepare_callback_daemon(self, extras, environ, action, txn_id=None): | |
|
658 | direct_calls = vcs_settings.HOOKS_DIRECT_CALLS | |
|
659 | if not self._should_use_callback_daemon(extras, environ, action): | |
|
660 | # disable callback daemon for actions that don't require it | |
|
661 | direct_calls = True | |
|
662 | ||
|
635 | 663 | return prepare_callback_daemon( |
|
636 | 664 | extras, protocol=vcs_settings.HOOKS_PROTOCOL, |
|
637 | use_direct_calls=vcs_settings.HOOKS_DIRECT_CALLS) | |
|
665 | use_direct_calls=direct_calls, txn_id=txn_id) | |
|
638 | 666 | |
|
639 | 667 | |
|
640 | 668 | def _should_check_locking(query_string): |
@@ -36,7 +36,9 b' def get_app_config(ini_path):' | |||
|
36 | 36 | return appconfig('config:{}'.format(ini_path), relative_to=os.getcwd()) |
|
37 | 37 | |
|
38 | 38 | |
|
39 | def bootstrap(config_uri, request=None, options=None): | |
|
39 | def bootstrap(config_uri, request=None, options=None, env=None): | |
|
40 | if env: | |
|
41 | os.environ.update(env) | |
|
40 | 42 | |
|
41 | 43 | config = get_config(config_uri) |
|
42 | 44 | base_url = 'http://rhodecode.local' |
@@ -95,7 +95,7 b' def command(ini_path, force_yes, user, e' | |||
|
95 | 95 | dbmanage.populate_default_permissions() |
|
96 | 96 | Session().commit() |
|
97 | 97 | |
|
98 | with bootstrap(ini_path) as env: | |
|
98 | with bootstrap(ini_path, env={'RC_CMD_SETUP_RC': '1'}) as env: | |
|
99 | 99 | msg = 'Successfully initialized database, schema and default data.' |
|
100 | 100 | print() |
|
101 | 101 | print('*' * len(msg)) |
@@ -40,7 +40,7 b' def main(ini_path, force_yes):' | |||
|
40 | 40 | def command(ini_path, force_yes): |
|
41 | 41 | pyramid.paster.setup_logging(ini_path) |
|
42 | 42 | |
|
43 | with bootstrap(ini_path) as env: | |
|
43 | with bootstrap(ini_path, env={'RC_CMD_UPGRADE_DB': '1'}) as env: | |
|
44 | 44 | config = env['registry'].settings |
|
45 | 45 | db_uri = config['sqlalchemy.db1.url'] |
|
46 | 46 | options = {} |
@@ -23,8 +23,10 b' import os' | |||
|
23 | 23 | import sys |
|
24 | 24 | import time |
|
25 | 25 | import platform |
|
26 | import collections | |
|
26 | 27 | import pkg_resources |
|
27 | 28 | import logging |
|
29 | import resource | |
|
28 | 30 | |
|
29 | 31 | from pyramid.compat import configparser |
|
30 | 32 | |
@@ -140,6 +142,29 b' def platform_type():' | |||
|
140 | 142 | return SysInfoRes(value=value) |
|
141 | 143 | |
|
142 | 144 | |
|
145 | def ulimit_info(): | |
|
146 | data = collections.OrderedDict([ | |
|
147 | ('cpu time (seconds)', resource.getrlimit(resource.RLIMIT_CPU)), | |
|
148 | ('file size', resource.getrlimit(resource.RLIMIT_FSIZE)), | |
|
149 | ('stack size', resource.getrlimit(resource.RLIMIT_STACK)), | |
|
150 | ('core file size', resource.getrlimit(resource.RLIMIT_CORE)), | |
|
151 | ('address space size', resource.getrlimit(resource.RLIMIT_AS)), | |
|
152 | ('locked in mem size', resource.getrlimit(resource.RLIMIT_MEMLOCK)), | |
|
153 | ('heap size', resource.getrlimit(resource.RLIMIT_DATA)), | |
|
154 | ('rss size', resource.getrlimit(resource.RLIMIT_RSS)), | |
|
155 | ('number of processes', resource.getrlimit(resource.RLIMIT_NPROC)), | |
|
156 | ('open files', resource.getrlimit(resource.RLIMIT_NOFILE)), | |
|
157 | ]) | |
|
158 | ||
|
159 | text = ', '.join('{}:{}'.format(k, v) for k, v in data.items()) | |
|
160 | ||
|
161 | value = { | |
|
162 | 'limits': data, | |
|
163 | 'text': text, | |
|
164 | } | |
|
165 | return SysInfoRes(value=value) | |
|
166 | ||
|
167 | ||
|
143 | 168 | def uptime(): |
|
144 | 169 | from rhodecode.lib.helpers import age, time_to_datetime |
|
145 | 170 | from rhodecode.translation import TranslationString |
@@ -687,6 +712,7 b' def usage_info():' | |||
|
687 | 712 | return SysInfoRes(value=value) |
|
688 | 713 | |
|
689 | 714 | |
|
715 | ||
|
690 | 716 | def get_system_info(environ): |
|
691 | 717 | environ = environ or {} |
|
692 | 718 | return { |
@@ -699,7 +725,7 b' def get_system_info(environ):' | |||
|
699 | 725 | 'platform': SysInfo(platform_type)(), |
|
700 | 726 | 'server': SysInfo(server_info, environ=environ)(), |
|
701 | 727 | 'database': SysInfo(database_info)(), |
|
702 | ||
|
728 | 'ulimit': SysInfo(ulimit_info)(), | |
|
703 | 729 | 'storage': SysInfo(storage)(), |
|
704 | 730 | 'storage_inodes': SysInfo(storage_inodes)(), |
|
705 | 731 | 'storage_archive': SysInfo(storage_archives)(), |
@@ -134,13 +134,17 b' def get_user_group_slug(request):' | |||
|
134 | 134 | elif getattr(request, 'matchdict', None): |
|
135 | 135 | # pyramid |
|
136 | 136 | _user_group = request.matchdict.get('user_group_id') |
|
137 | ||
|
137 | _user_group_name = request.matchdict.get('user_group_name') | |
|
138 | 138 | try: |
|
139 | if _user_group: | |
|
139 | 140 | _user_group = UserGroup.get(_user_group) |
|
141 | elif _user_group_name: | |
|
142 | _user_group = UserGroup.get_by_group_name(_user_group_name) | |
|
143 | ||
|
140 | 144 | if _user_group: |
|
141 | 145 | _user_group = _user_group.users_group_name |
|
142 | 146 | except Exception: |
|
143 | log.exception('Failed to get user group by id') | |
|
147 | log.exception('Failed to get user group by id and name') | |
|
144 | 148 | # catch all failures here |
|
145 | 149 | return None |
|
146 | 150 | |
@@ -352,11 +356,10 b' def config_data_from_db(clear_session=Tr' | |||
|
352 | 356 | |
|
353 | 357 | ui_settings = settings_model.get_ui_settings() |
|
354 | 358 | |
|
359 | ui_data = [] | |
|
355 | 360 | for setting in ui_settings: |
|
356 | 361 | if setting.active: |
|
357 | log.debug( | |
|
358 | 'settings ui from db: [%s] %s=%s', | |
|
359 | setting.section, setting.key, setting.value) | |
|
362 | ui_data.append((setting.section, setting.key, setting.value)) | |
|
360 | 363 | config.append(( |
|
361 | 364 | safe_str(setting.section), safe_str(setting.key), |
|
362 | 365 | safe_str(setting.value))) |
@@ -365,6 +368,9 b' def config_data_from_db(clear_session=Tr' | |||
|
365 | 368 | # handles that |
|
366 | 369 | config.append(( |
|
367 | 370 | safe_str(setting.section), safe_str(setting.key), False)) |
|
371 | log.debug( | |
|
372 | 'settings ui from db: %s', | |
|
373 | ','.join(map(lambda s: '[{}] {}={}'.format(*s), ui_data))) | |
|
368 | 374 | if clear_session: |
|
369 | 375 | meta.Session.remove() |
|
370 | 376 | |
@@ -508,7 +514,6 b' def repo2db_mapper(initial_repo_list, re' | |||
|
508 | 514 | :param remove_obsolete: check for obsolete entries in database |
|
509 | 515 | """ |
|
510 | 516 | from rhodecode.model.repo import RepoModel |
|
511 | from rhodecode.model.scm import ScmModel | |
|
512 | 517 | from rhodecode.model.repo_group import RepoGroupModel |
|
513 | 518 | from rhodecode.model.settings import SettingsModel |
|
514 | 519 | |
@@ -560,9 +565,8 b' def repo2db_mapper(initial_repo_list, re' | |||
|
560 | 565 | |
|
561 | 566 | config = db_repo._config |
|
562 | 567 | config.set('extensions', 'largefiles', '') |
|
563 | ScmModel().install_hooks( | |
|
564 | db_repo.scm_instance(config=config), | |
|
565 | repo_type=db_repo.repo_type) | |
|
568 | repo = db_repo.scm_instance(config=config) | |
|
569 | repo.install_hooks() | |
|
566 | 570 | |
|
567 | 571 | removed = [] |
|
568 | 572 | if remove_obsolete: |
@@ -709,7 +709,19 b' def extract_mentioned_users(s):' | |||
|
709 | 709 | return sorted(list(usrs), key=lambda k: k.lower()) |
|
710 | 710 | |
|
711 | 711 | |
|
712 |
class |
|
|
712 | class AttributeDictBase(dict): | |
|
713 | def __getstate__(self): | |
|
714 | odict = self.__dict__ # get attribute dictionary | |
|
715 | return odict | |
|
716 | ||
|
717 | def __setstate__(self, state): | |
|
718 | self.__dict__ = state | |
|
719 | ||
|
720 | __setattr__ = dict.__setitem__ | |
|
721 | __delattr__ = dict.__delitem__ | |
|
722 | ||
|
723 | ||
|
724 | class StrictAttributeDict(AttributeDictBase): | |
|
713 | 725 | """ |
|
714 | 726 | Strict Version of Attribute dict which raises an Attribute error when |
|
715 | 727 | requested attribute is not set |
@@ -720,15 +732,12 b' class StrictAttributeDict(dict):' | |||
|
720 | 732 | except KeyError: |
|
721 | 733 | raise AttributeError('%s object has no attribute %s' % ( |
|
722 | 734 | self.__class__, attr)) |
|
723 | __setattr__ = dict.__setitem__ | |
|
724 | __delattr__ = dict.__delitem__ | |
|
725 | 735 | |
|
726 | 736 | |
|
727 |
class AttributeDict( |
|
|
737 | class AttributeDict(AttributeDictBase): | |
|
728 | 738 | def __getattr__(self, attr): |
|
729 | 739 | return self.get(attr, None) |
|
730 | __setattr__ = dict.__setitem__ | |
|
731 | __delattr__ = dict.__delitem__ | |
|
740 | ||
|
732 | 741 | |
|
733 | 742 | |
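The new `__getstate__`/`__setstate__` pair on the shared base class is what lets these attribute dicts survive pickling. A minimal replica of `AttributeDict` that exercises both hooks directly (the class body mirrors the diff; the round trip is illustrative):

```python
class AttributeDict(dict):
    # attribute access maps straight onto dict item access
    def __getattr__(self, attr):
        return self.get(attr, None)
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

    def __getstate__(self):
        return self.__dict__  # attribute dictionary used by pickle

    def __setstate__(self, state):
        self.__dict__ = state


d = AttributeDict(repository='demo', port=8080)
assert d.repository == 'demo'
assert d.missing is None  # unknown attributes yield None, not AttributeError

# simulate the pickle protocol hooks directly: capture state, then
# restore it onto a fresh instance carrying the same items
state = d.__getstate__()
fresh = AttributeDict(d)
fresh.__setstate__(state)
assert fresh.port == 8080
```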
|
734 | 743 | def fix_PATH(os_=None): |
@@ -24,9 +24,11 b' Base module for all VCS systems' | |||
|
24 | 24 | |
|
25 | 25 | import collections |
|
26 | 26 | import datetime |
|
27 | import fnmatch | |
|
27 | 28 | import itertools |
|
28 | 29 | import logging |
|
29 | 30 | import os |
|
31 | import re | |
|
30 | 32 | import time |
|
31 | 33 | import warnings |
|
32 | 34 | |
@@ -173,6 +175,7 b' class BaseRepository(object):' | |||
|
173 | 175 | EMPTY_COMMIT_ID = '0' * 40 |
|
174 | 176 | |
|
175 | 177 | path = None |
|
178 | _remote = None | |
|
176 | 179 | |
|
177 | 180 | def __init__(self, repo_path, config=None, create=False, **kwargs): |
|
178 | 181 | """ |
@@ -203,6 +206,12 b' class BaseRepository(object):' | |||
|
203 | 206 | def __ne__(self, other): |
|
204 | 207 | return not self.__eq__(other) |
|
205 | 208 | |
|
209 | def get_create_shadow_cache_pr_path(self, db_repo): | |
|
210 | path = db_repo.cached_diffs_dir | |
|
211 | if not os.path.exists(path): | |
|
212 | os.makedirs(path, 0755) | |
|
213 | return path | |
|
214 | ||
|
206 | 215 | @classmethod |
|
207 | 216 | def get_default_config(cls, default=None): |
|
208 | 217 | config = Config() |
@@ -249,6 +258,20 b' class BaseRepository(object):' | |||
|
249 | 258 | raise NotImplementedError |
|
250 | 259 | |
|
251 | 260 | @LazyProperty |
|
261 | def branches_closed(self): | |
|
262 | """ | |
|
263 | A `dict` which maps tags names to commit ids. | |
|
264 | """ | |
|
265 | raise NotImplementedError | |
|
266 | ||
|
267 | @LazyProperty | |
|
268 | def bookmarks(self): | |
|
269 | """ | |
|
270 | A `dict` which maps tags names to commit ids. | |
|
271 | """ | |
|
272 | raise NotImplementedError | |
|
273 | ||
|
274 | @LazyProperty | |
|
252 | 275 | def tags(self): |
|
253 | 276 | """ |
|
254 | 277 | A `dict` which maps tags names to commit ids. |
@@ -623,6 +646,18 b' class BaseRepository(object):' | |||
|
623 | 646 | warnings.warn("Use in_memory_commit instead", DeprecationWarning) |
|
624 | 647 | return self.in_memory_commit |
|
625 | 648 | |
|
649 | def get_path_permissions(self, username): | |
|
650 | """ | |
|
651 | Returns a path permission checker or None if not supported | |
|
652 | ||
|
653 | :param username: session user name | |
|
654 | :return: an instance of BasePathPermissionChecker or None | |
|
655 | """ | |
|
656 | return None | |
|
657 | ||
|
658 | def install_hooks(self, force=False): | |
|
659 | return self._remote.install_hooks(force) | |
|
660 | ||
|
626 | 661 | |
|
627 | 662 | class BaseCommit(object): |
|
628 | 663 | """ |
@@ -716,9 +751,15 b' class BaseCommit(object):' | |||
|
716 | 751 | 'branch': self.branch |
|
717 | 752 | } |
|
718 | 753 | |
|
754 | def __getstate__(self): | |
|
755 | d = self.__dict__.copy() | |
|
756 | d.pop('_remote', None) | |
|
757 | d.pop('repository', None) | |
|
758 | return d | |
|
759 | ||
|
719 | 760 | def _get_refs(self): |
|
720 | 761 | return { |
|
721 | 'branches': [self.branch], | |
|
762 | 'branches': [self.branch] if self.branch else [], | |
|
722 | 763 | 'bookmarks': getattr(self, 'bookmarks', []), |
|
723 | 764 | 'tags': self.tags |
|
724 | 765 | } |
@@ -1604,3 +1645,66 b' class DiffChunk(object):' | |||
|
1604 | 1645 | self.header = match.groupdict() |
|
1605 | 1646 | self.diff = chunk[match.end():] |
|
1606 | 1647 | self.raw = chunk |
|
1648 | ||
|
1649 | ||
|
1650 | class BasePathPermissionChecker(object): | |
|
1651 | ||
|
1652 | @staticmethod | |
|
1653 | def create_from_patterns(includes, excludes): | |
|
1654 | if includes and '*' in includes and not excludes: | |
|
1655 | return AllPathPermissionChecker() | |
|
1656 | elif excludes and '*' in excludes: | |
|
1657 | return NonePathPermissionChecker() | |
|
1658 | else: | |
|
1659 | return PatternPathPermissionChecker(includes, excludes) | |
|
1660 | ||
|
1661 | @property | |
|
1662 | def has_full_access(self): | |
|
1663 | raise NotImplementedError() | |
|
1664 | ||
|
1665 | def has_access(self, path): | |
|
1666 | raise NotImplementedError() | |
|
1667 | ||
|
1668 | ||
|
1669 | class AllPathPermissionChecker(BasePathPermissionChecker): | |
|
1670 | ||
|
1671 | @property | |
|
1672 | def has_full_access(self): | |
|
1673 | return True | |
|
1674 | ||
|
1675 | def has_access(self, path): | |
|
1676 | return True | |
|
1677 | ||
|
1678 | ||
|
1679 | class NonePathPermissionChecker(BasePathPermissionChecker): | |
|
1680 | ||
|
1681 | @property | |
|
1682 | def has_full_access(self): | |
|
1683 | return False | |
|
1684 | ||
|
1685 | def has_access(self, path): | |
|
1686 | return False | |
|
1687 | ||
|
1688 | ||
|
1689 | class PatternPathPermissionChecker(BasePathPermissionChecker): | |
|
1690 | ||
|
1691 | def __init__(self, includes, excludes): | |
|
1692 | self.includes = includes | |
|
1693 | self.excludes = excludes | |
|
1694 | self.includes_re = [] if not includes else [ | |
|
1695 | re.compile(fnmatch.translate(pattern)) for pattern in includes] | |
|
1696 | self.excludes_re = [] if not excludes else [ | |
|
1697 | re.compile(fnmatch.translate(pattern)) for pattern in excludes] | |
|
1698 | ||
|
1699 | @property | |
|
1700 | def has_full_access(self): | |
|
1701 | return '*' in self.includes and not self.excludes | |
|
1702 | ||
|
1703 | def has_access(self, path): | |
|
1704 | for regex in self.excludes_re: | |
|
1705 | if regex.match(path): | |
|
1706 | return False | |
|
1707 | for regex in self.includes_re: | |
|
1708 | if regex.match(path): | |
|
1709 | return True | |
|
1710 | return False |
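The `PatternPathPermissionChecker` added above compiles `fnmatch`-style globs into anchored regexes, with excludes taking precedence over includes. A self-contained sketch of that matching logic (trimmed to the two regex lists, without the full class hierarchy):

```python
import fnmatch
import re

class PatternPathPermissionChecker(object):
    """Standalone sketch of the checker added in the diff."""

    def __init__(self, includes, excludes):
        # fnmatch.translate turns a glob into a fully anchored regex
        self.includes_re = [
            re.compile(fnmatch.translate(p)) for p in (includes or [])]
        self.excludes_re = [
            re.compile(fnmatch.translate(p)) for p in (excludes or [])]

    def has_access(self, path):
        # excludes win over includes; unmatched paths are denied
        for regex in self.excludes_re:
            if regex.match(path):
                return False
        for regex in self.includes_re:
            if regex.match(path):
                return True
        return False

checker = PatternPathPermissionChecker(['docs/*'], ['docs/secret/*'])
print(checker.has_access('docs/readme.rst'))  # True
print(checker.has_access('docs/secret/key'))  # False
```

Note that `fnmatch`'s `*` matches path separators too, which is why a single `docs/*` covers the whole subtree.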
@@ -28,7 +28,6 b' from itertools import chain' | |||
|
28 | 28 | from StringIO import StringIO |
|
29 | 29 | |
|
30 | 30 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
31 | from pyramid.compat import configparser | |
|
32 | 31 | |
|
33 | 32 | from rhodecode.lib.datelib import utcdate_fromtimestamp |
|
34 | 33 | from rhodecode.lib.utils import safe_unicode, safe_str |
@@ -40,6 +39,7 b' from rhodecode.lib.vcs.nodes import (' | |||
|
40 | 39 | FileNode, DirNode, NodeKind, RootNode, SubModuleNode, |
|
41 | 40 | ChangedFileNodesGenerator, AddedFileNodesGenerator, |
|
42 | 41 | RemovedFileNodesGenerator, LargeFileNode) |
|
42 | from rhodecode.lib.vcs.compat import configparser | |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | class GitCommit(base.BaseCommit): |
@@ -71,8 +71,6 b' class GitRepository(BaseRepository):' | |||
|
71 | 71 | # caches |
|
72 | 72 | self._commit_ids = {} |
|
73 | 73 | |
|
74 | self.bookmarks = {} | |
|
75 | ||
|
76 | 74 | @LazyProperty |
|
77 | 75 | def bare(self): |
|
78 | 76 | return self._remote.bare() |
@@ -323,6 +321,10 b' class GitRepository(BaseRepository):' | |||
|
323 | 321 | return {} |
|
324 | 322 | |
|
325 | 323 | @LazyProperty |
|
324 | def bookmarks(self): | |
|
325 | return {} | |
|
326 | ||
|
327 | @LazyProperty | |
|
326 | 328 | def branches_all(self): |
|
327 | 329 | all_branches = {} |
|
328 | 330 | all_branches.update(self.branches) |
@@ -791,7 +793,7 b' class GitRepository(BaseRepository):' | |||
|
791 | 793 | def _get_shadow_instance(self, shadow_repository_path, enable_hooks=False): |
|
792 | 794 | return GitRepository(shadow_repository_path) |
|
793 | 795 | |
|
794 | def _local_pull(self, repository_path, branch_name): | |
|
796 | def _local_pull(self, repository_path, branch_name, ff_only=True): | |
|
795 | 797 | """ |
|
796 | 798 | Pull a branch from a local repository. |
|
797 | 799 | """ |
@@ -802,7 +804,10 b' class GitRepository(BaseRepository):' | |||
|
802 | 804 | # conflicts with our current branch) |
|
803 | 805 | # Additionally, that option needs to go before --no-tags, otherwise git |
|
804 | 806 | # pull complains about it being an unknown flag. |
|
805 | cmd = ['pull', '--ff-only', '--no-tags', repository_path, branch_name] | |
|
807 | cmd = ['pull'] | |
|
808 | if ff_only: | |
|
809 | cmd.append('--ff-only') | |
|
810 | cmd.extend(['--no-tags', repository_path, branch_name]) | |
|
806 | 811 | self.run_git_command(cmd, fail_on_stderr=False) |
|
807 | 812 | |
|
808 | 813 | def _local_merge(self, merge_message, user_name, user_email, heads): |
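The `_local_pull` change makes `--ff-only` optional while preserving the flag-ordering constraint noted in the comments. A sketch of just the command construction (the `build_pull_cmd` helper is illustrative, not part of the diff):

```python
def build_pull_cmd(repository_path, branch_name, ff_only=True):
    # mirrors the diff: --ff-only is now optional, and --no-tags must
    # come after it, otherwise git pull rejects it as an unknown flag
    cmd = ['pull']
    if ff_only:
        cmd.append('--ff-only')
    cmd.extend(['--no-tags', repository_path, branch_name])
    return cmd

print(build_pull_cmd('/tmp/repo', 'master'))
# ['pull', '--ff-only', '--no-tags', '/tmp/repo', 'master']
```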
@@ -915,6 +920,7 b' class GitRepository(BaseRepository):' | |||
|
915 | 920 | |
|
916 | 921 | pr_branch = shadow_repo._get_new_pr_branch( |
|
917 | 922 | source_ref.name, target_ref.name) |
|
923 | log.debug('using pull-request merge branch: `%s`', pr_branch) | |
|
918 | 924 | shadow_repo._checkout(pr_branch, create=True) |
|
919 | 925 | try: |
|
920 | 926 | shadow_repo._local_fetch(source_repo.path, source_ref.name) |
@@ -21,10 +21,9 b'' | |||
|
21 | 21 | """ |
|
22 | 22 | HG repository module |
|
23 | 23 | """ |
|
24 | ||
|
24 | import os | |
|
25 | 25 | import logging |
|
26 | 26 | import binascii |
|
27 | import os | |
|
28 | 27 | import shutil |
|
29 | 28 | import urllib |
|
30 | 29 | |
@@ -32,19 +31,19 b' from zope.cachedescriptors.property impo' | |||
|
32 | 31 | |
|
33 | 32 | from rhodecode.lib.compat import OrderedDict |
|
34 | 33 | from rhodecode.lib.datelib import ( |
|
35 | date_to_timestamp_plus_offset, utcdate_fromtimestamp, makedate | |
|
36 | date_astimestamp) | |
|
34 | date_to_timestamp_plus_offset, utcdate_fromtimestamp, makedate) | |
|
37 | 35 | from rhodecode.lib.utils import safe_unicode, safe_str |
|
38 | from rhodecode.lib.vcs import connection | |
|
36 | from rhodecode.lib.vcs import connection, exceptions | |
|
39 | 37 | from rhodecode.lib.vcs.backends.base import ( |
|
40 | 38 | BaseRepository, CollectionGenerator, Config, MergeResponse, |
|
41 | MergeFailureReason, Reference) | |
|
39 | MergeFailureReason, Reference, BasePathPermissionChecker) | |
|
42 | 40 | from rhodecode.lib.vcs.backends.hg.commit import MercurialCommit |
|
43 | 41 | from rhodecode.lib.vcs.backends.hg.diff import MercurialDiff |
|
44 | 42 | from rhodecode.lib.vcs.backends.hg.inmemory import MercurialInMemoryCommit |
|
45 | 43 | from rhodecode.lib.vcs.exceptions import ( |
|
46 | 44 | EmptyRepositoryError, RepositoryError, TagAlreadyExistError, |
|
47 | 45 | TagDoesNotExistError, CommitDoesNotExistError, SubrepoMergeError) |
|
46 | from rhodecode.lib.vcs.compat import configparser | |
|
48 | 47 | |
|
49 | 48 | hexlify = binascii.hexlify |
|
50 | 49 | nullid = "\0" * 20 |
@@ -778,6 +777,7 b' class MercurialRepository(BaseRepository' | |||
|
778 | 777 | else: |
|
779 | 778 | merge_possible = True |
|
780 | 779 | |
|
780 | needs_push = False | |
|
781 | 781 | if merge_possible: |
|
782 | 782 | try: |
|
783 | 783 | merge_commit_id, needs_push = shadow_repo._local_merge( |
@@ -891,6 +891,43 b' class MercurialRepository(BaseRepository' | |||
|
891 | 891 | self._remote.bookmark(bookmark, revision=revision) |
|
892 | 892 | self._remote.invalidate_vcs_cache() |
|
893 | 893 | |
|
894 | def get_path_permissions(self, username): | |
|
895 | hgacl_file = os.path.join(self.path, '.hg/hgacl') | |
|
896 | ||
|
897 | def read_patterns(suffix): | |
|
898 | svalue = None | |
|
899 | try: | |
|
900 | svalue = hgacl.get('narrowhgacl', username + suffix) | |
|
901 | except configparser.NoOptionError: | |
|
902 | try: | |
|
903 | svalue = hgacl.get('narrowhgacl', 'default' + suffix) | |
|
904 | except configparser.NoOptionError: | |
|
905 | pass | |
|
906 | if not svalue: | |
|
907 | return None | |
|
908 | result = ['/'] | |
|
909 | for pattern in svalue.split(): | |
|
910 | result.append(pattern) | |
|
911 | if '*' not in pattern and '?' not in pattern: | |
|
912 | result.append(pattern + '/*') | |
|
913 | return result | |
|
914 | ||
|
915 | if os.path.exists(hgacl_file): | |
|
916 | try: | |
|
917 | hgacl = configparser.RawConfigParser() | |
|
918 | hgacl.read(hgacl_file) | |
|
919 | ||
|
920 | includes = read_patterns('.includes') | |
|
921 | excludes = read_patterns('.excludes') | |
|
922 | return BasePathPermissionChecker.create_from_patterns( | |
|
923 | includes, excludes) | |
|
924 | except BaseException as e: | |
|
925 | msg = 'Cannot read ACL settings from {} on {}: {}'.format( | |
|
926 | hgacl_file, self.name, e) | |
|
927 | raise exceptions.RepositoryRequirementError(msg) | |
|
928 | else: | |
|
929 | return None | |
|
930 | ||
|
894 | 931 | |
|
895 | 932 | class MercurialIndexBasedCollectionGenerator(CollectionGenerator): |
|
896 | 933 |
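The `get_path_permissions` hunk above reads per-user include/exclude patterns from `.hg/hgacl`, falling back to `default.*` options when the user has none. A runnable sketch of the pattern-reading logic using the stdlib `configparser` (the diff goes through `rhodecode.lib.vcs.compat`; the hgacl content below is hypothetical):

```python
import configparser

# Hypothetical .hg/hgacl content; the [narrowhgacl] section and the
# "<user>.includes" / "default.includes" option names follow the diff.
HGACL = """\
[narrowhgacl]
alice.includes = docs lib/*
default.includes = README.rst
"""

hgacl = configparser.RawConfigParser()
hgacl.read_string(HGACL)

def read_patterns(username, suffix):
    # per-user option first, then the "default" fallback, as in the diff
    for key in (username + suffix, 'default' + suffix):
        try:
            svalue = hgacl.get('narrowhgacl', key)
        except configparser.NoOptionError:
            continue
        result = ['/']
        for pattern in svalue.split():
            result.append(pattern)
            # a bare directory name also grants access to its contents
            if '*' not in pattern and '?' not in pattern:
                result.append(pattern + '/*')
        return result
    return None

print(read_patterns('alice', '.includes'))
# ['/', 'docs', 'docs/*', 'lib/*']
```

The resulting lists feed `BasePathPermissionChecker.create_from_patterns`, which picks the all/none/pattern checker variant.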
@@ -77,8 +77,6 b' class SubversionRepository(base.BaseRepo' | |||
|
77 | 77 | |
|
78 | 78 | self._init_repo(create, src_url) |
|
79 | 79 | |
|
80 | self.bookmarks = {} | |
|
81 | ||
|
82 | 80 | def _init_repo(self, create, src_url): |
|
83 | 81 | if create and os.path.exists(self.path): |
|
84 | 82 | raise RepositoryError( |
@@ -107,6 +105,10 b' class SubversionRepository(base.BaseRepo' | |||
|
107 | 105 | return {} |
|
108 | 106 | |
|
109 | 107 | @LazyProperty |
|
108 | def bookmarks(self): | |
|
109 | return {} | |
|
110 | ||
|
111 | @LazyProperty | |
|
110 | 112 | def branches_all(self): |
|
111 | 113 | # TODO: johbo: Implement proper branch support |
|
112 | 114 | all_branches = {} |
@@ -197,6 +197,13 b' def _remote_call(url, payload, exception' | |||
|
197 | 197 | response = session.post(url, data=msgpack.packb(payload)) |
|
198 | 198 | except pycurl.error as e: |
|
199 | 199 | raise exceptions.HttpVCSCommunicationError(e) |
|
200 | except Exception as e: | |
|
201 | message = getattr(e, 'message', '') | |
|
202 | if 'Failed to connect' in message: | |
|
203 | # gevent doesn't return proper pycurl errors | |
|
204 | raise exceptions.HttpVCSCommunicationError(e) | |
|
205 | else: | |
|
206 | raise | |
|
200 | 207 | |
|
201 | 208 | if response.status_code >= 400: |
|
202 | 209 | log.error('Call to %s returned non 200 HTTP code: %s', |
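The new `except` branch above works around gevent-patched transports that surface connection failures as generic exceptions instead of `pycurl.error`. A sketch of the message-sniffing fallback (the exception classes here are local stand-ins, not rhodecode's or pycurl's):

```python
class HttpVCSCommunicationError(Exception):
    """Local stand-in for rhodecode's exception type."""

def reraise(exc):
    # under gevent, connection failures may arrive as generic exceptions
    # rather than pycurl.error, so the diff sniffs the message text
    message = getattr(exc, 'message', '')
    if 'Failed to connect' in message:
        raise HttpVCSCommunicationError(exc)
    raise exc

class FakeCurlError(Exception):
    def __init__(self, msg):
        super(FakeCurlError, self).__init__(msg)
        self.message = msg

try:
    reraise(FakeCurlError('Failed to connect to host example.com'))
except HttpVCSCommunicationError as e:
    print('communication error:', e)
```

Anything without the marker text is re-raised unchanged, so unrelated failures keep their original type.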
@@ -1344,6 +1344,15 b' class UserGroup(Base, BaseModel):' | |||
|
1344 | 1344 | except Exception: |
|
1345 | 1345 | log.error(traceback.format_exc()) |
|
1346 | 1346 | |
|
1347 | @classmethod | |
|
1348 | def _load_sync(cls, group_data): | |
|
1349 | if group_data: | |
|
1350 | return group_data.get('extern_type') | |
|
1351 | ||
|
1352 | @property | |
|
1353 | def sync(self): | |
|
1354 | return self._load_sync(self.group_data) | |
|
1355 | ||
|
1347 | 1356 | def __unicode__(self): |
|
1348 | 1357 | return u"<%s('id:%s:%s')>" % (self.__class__.__name__, |
|
1349 | 1358 | self.users_group_id, |
@@ -1453,6 +1462,7 b' class UserGroup(Base, BaseModel):' | |||
|
1453 | 1462 | 'group_description': user_group.user_group_description, |
|
1454 | 1463 | 'active': user_group.users_group_active, |
|
1455 | 1464 | 'owner': user_group.user.username, |
|
1465 | 'sync': user_group.sync, | |
|
1456 | 1466 | 'owner_email': user_group.user.email, |
|
1457 | 1467 | } |
|
1458 | 1468 | |
@@ -1557,6 +1567,9 b' class Repository(Base, BaseModel):' | |||
|
1557 | 1567 | clone_uri = Column( |
|
1558 | 1568 | "clone_uri", EncryptedTextValue(), nullable=True, unique=False, |
|
1559 | 1569 | default=None) |
|
1570 | push_uri = Column( | |
|
1571 | "push_uri", EncryptedTextValue(), nullable=True, unique=False, | |
|
1572 | default=None) | |
|
1560 | 1573 | repo_type = Column( |
|
1561 | 1574 | "repo_type", String(255), nullable=False, unique=False, default=None) |
|
1562 | 1575 | user_id = Column( |
@@ -1846,6 +1859,30 b' class Repository(Base, BaseModel):' | |||
|
1846 | 1859 | .order_by(CacheKey.cache_key)\ |
|
1847 | 1860 | .all() |
|
1848 | 1861 | |
|
1862 | @property | |
|
1863 | def cached_diffs_relative_dir(self): | |
|
1864 | """ | |
|
1865 | Return a relative to the repository store path of cached diffs | |
|
1866 | used for safe display for users, who shouldn't know the absolute store | |
|
1867 | path | |
|
1868 | """ | |
|
1869 | return os.path.join( | |
|
1870 | os.path.dirname(self.repo_name), | |
|
1871 | self.cached_diffs_dir.split(os.path.sep)[-1]) | |
|
1872 | ||
|
1873 | @property | |
|
1874 | def cached_diffs_dir(self): | |
|
1875 | path = self.repo_full_path | |
|
1876 | return os.path.join( | |
|
1877 | os.path.dirname(path), | |
|
1878 | '.__shadow_diff_cache_repo_{}'.format(self.repo_id)) | |
|
1879 | ||
|
1880 | def cached_diffs(self): | |
|
1881 | diff_cache_dir = self.cached_diffs_dir | |
|
1882 | if os.path.isdir(diff_cache_dir): | |
|
1883 | return os.listdir(diff_cache_dir) | |
|
1884 | return [] | |
|
1885 | ||
|
1849 | 1886 | def get_new_name(self, repo_name): |
|
1850 | 1887 | """ |
|
1851 | 1888 | returns new full repository name based on assigned group and new |
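The `cached_diffs_dir` properties above place the diff cache next to the repository and expose only a store-relative path to users. A sketch of the two path computations (assumes a POSIX path separator; the paths are examples):

```python
import os

def cached_diffs_dir(repo_full_path, repo_id):
    # the shadow diff cache lives beside the repository, keyed by repo_id
    return os.path.join(
        os.path.dirname(repo_full_path),
        '.__shadow_diff_cache_repo_{}'.format(repo_id))

def cached_diffs_relative_dir(repo_name, repo_full_path, repo_id):
    # expose only a path relative to the store, so users never learn
    # the absolute storage location
    last_part = cached_diffs_dir(repo_full_path, repo_id).split(os.path.sep)[-1]
    return os.path.join(os.path.dirname(repo_name), last_part)

print(cached_diffs_dir('/storage/group/repo', 42))
print(cached_diffs_relative_dir('group/repo', '/storage/group/repo', 42))
```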
@@ -1943,6 +1980,7 b' class Repository(Base, BaseModel):' | |||
|
1943 | 1980 | 'repo_name': repo.repo_name, |
|
1944 | 1981 | 'repo_type': repo.repo_type, |
|
1945 | 1982 | 'clone_uri': repo.clone_uri or '', |
|
1983 | 'push_uri': repo.push_uri or '', | |
|
1946 | 1984 | 'url': RepoModel().get_url(self), |
|
1947 | 1985 | 'private': repo.private, |
|
1948 | 1986 | 'created_on': repo.created_on, |
@@ -2077,6 +2115,16 b' class Repository(Base, BaseModel):' | |||
|
2077 | 2115 | clone_uri = url_obj.with_password('*****') |
|
2078 | 2116 | return clone_uri |
|
2079 | 2117 | |
|
2118 | @property | |
|
2119 | def push_uri_hidden(self): | |
|
2120 | push_uri = self.push_uri | |
|
2121 | if push_uri: | |
|
2122 | import urlobject | |
|
2123 | url_obj = urlobject.URLObject(cleaned_uri(push_uri)) | |
|
2124 | if url_obj.password: | |
|
2125 | push_uri = url_obj.with_password('*****') | |
|
2126 | return push_uri | |
|
2127 | ||
|
2080 | 2128 | def clone_url(self, **override): |
|
2081 | 2129 | from rhodecode.model.settings import SettingsModel |
|
2082 | 2130 |
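`push_uri_hidden` above mirrors `clone_uri_hidden`: mask the password before the URI is ever displayed. The diff uses the `urlobject` package's `with_password('*****')`; a stdlib-only sketch of the same idea:

```python
from urllib.parse import urlsplit, urlunsplit

def hide_credentials(uri):
    # replace the password component with a fixed mask, keep the rest
    parts = urlsplit(uri)
    if parts.password:
        netloc = '{}:*****@{}'.format(parts.username, parts.hostname)
        if parts.port:
            netloc += ':{}'.format(parts.port)
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)

print(hide_credentials('https://joe:secret@example.com/repo'))
# https://joe:*****@example.com/repo
```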
@@ -375,6 +375,7 b' def ApplicationSettingsForm(localizer):' | |||
|
375 | 375 | |
|
376 | 376 | |
|
377 | 377 | def ApplicationVisualisationForm(localizer): |
|
378 | from rhodecode.model.db import Repository | |
|
378 | 379 | _ = localizer |
|
379 | 380 | |
|
380 | 381 | class _ApplicationVisualisationForm(formencode.Schema): |
@@ -392,8 +393,8 b' def ApplicationVisualisationForm(localiz' | |||
|
392 | 393 | rhodecode_use_gravatar = v.StringBoolean(if_missing=False) |
|
393 | 394 | rhodecode_markup_renderer = v.OneOf(['markdown', 'rst']) |
|
394 | 395 | rhodecode_gravatar_url = v.UnicodeString(min=3) |
|
395 | rhodecode_clone_uri_tmpl = v.UnicodeString( | |
|
396 | rhodecode_clone_uri_ssh_tmpl = v.UnicodeString( | |
|
396 | rhodecode_clone_uri_tmpl = v.UnicodeString(not_empty=False, if_empty=Repository.DEFAULT_CLONE_URI) | |
|
397 | rhodecode_clone_uri_ssh_tmpl = v.UnicodeString(not_empty=False, if_empty=Repository.DEFAULT_CLONE_URI_SSH) | |
|
397 | 398 | rhodecode_support_url = v.UnicodeString() |
|
398 | 399 | rhodecode_show_revision_number = v.StringBoolean(if_missing=False) |
|
399 | 400 | rhodecode_show_sha_length = v.Int(min=4, not_empty=True) |
@@ -429,6 +430,9 b' class _BaseVcsSettingsForm(formencode.Sc' | |||
|
429 | 430 | vcs_svn_proxy_http_requests_enabled = v.StringBoolean(if_missing=False) |
|
430 | 431 | vcs_svn_proxy_http_server_url = v.UnicodeString(strip=True, if_missing=None) |
|
431 | 432 | |
|
433 | # cache | |
|
434 | rhodecode_diff_cache = v.StringBoolean(if_missing=False) | |
|
435 | ||
|
432 | 436 | |
|
433 | 437 | def ApplicationUiSettingsForm(localizer): |
|
434 | 438 | _ = localizer |
@@ -1063,7 +1063,7 b' class PullRequestModel(BaseModel):' | |||
|
1063 | 1063 | repo_name=safe_str(pull_request.target_repo.repo_name), |
|
1064 | 1064 | pull_request_id=pull_request.pull_request_id,) |
|
1065 | 1065 | |
|
1066 | def get_shadow_clone_url(self, pull_request): | |
|
1066 | def get_shadow_clone_url(self, pull_request, request=None): | |
|
1067 | 1067 | """ |
|
1068 | 1068 | Returns qualified url pointing to the shadow repository. If this pull |
|
1069 | 1069 | request is closed there is no shadow repository and ``None`` will be |
@@ -1072,7 +1072,7 b' class PullRequestModel(BaseModel):' | |||
|
1072 | 1072 | if pull_request.is_closed(): |
|
1073 | 1073 | return None |
|
1074 | 1074 | else: |
|
1075 | pr_url = urllib.unquote(self.get_url(pull_request)) | |
|
1075 | pr_url = urllib.unquote(self.get_url(pull_request, request=request)) | |
|
1076 | 1076 | return safe_unicode('{pr_url}/repository'.format(pr_url=pr_url)) |
|
1077 | 1077 | |
|
1078 | 1078 | def notify_reviewers(self, pull_request, reviewers_ids): |
@@ -304,6 +304,7 b' class RepoModel(BaseModel):' | |||
|
304 | 304 | {'k': 'repo_enable_locking', 'strip': True}, |
|
305 | 305 | {'k': 'repo_landing_rev', 'strip': True}, |
|
306 | 306 | {'k': 'clone_uri', 'strip': False}, |
|
307 | {'k': 'push_uri', 'strip': False}, | |
|
307 | 308 | {'k': 'repo_private', 'strip': True}, |
|
308 | 309 | {'k': 'repo_enable_statistics', 'strip': True} |
|
309 | 310 | ) |
@@ -319,6 +320,8 b' class RepoModel(BaseModel):' | |||
|
319 | 320 | defaults[item['k']] = val |
|
320 | 321 | if item['k'] == 'clone_uri': |
|
321 | 322 | defaults['clone_uri_hidden'] = repo_info.clone_uri_hidden |
|
323 | if item['k'] == 'push_uri': | |
|
324 | defaults['push_uri_hidden'] = repo_info.push_uri_hidden | |
|
322 | 325 | |
|
323 | 326 | # fill owner |
|
324 | 327 | if repo_info.user: |
@@ -348,6 +351,7 b' class RepoModel(BaseModel):' | |||
|
348 | 351 | (1, 'repo_enable_locking'), |
|
349 | 352 | (1, 'repo_enable_statistics'), |
|
350 | 353 | (0, 'clone_uri'), |
|
354 | (0, 'push_uri'), | |
|
351 | 355 | (0, 'fork_id') |
|
352 | 356 | ] |
|
353 | 357 | for strip, k in update_keys: |
@@ -854,7 +858,7 b' class RepoModel(BaseModel):' | |||
|
854 | 858 | repo = backend( |
|
855 | 859 | repo_path, config=config, create=True, src_url=clone_uri) |
|
856 | 860 | |
|
857 | ScmModel().install_hooks(repo, repo_type=repo_type) | |
|
861 | repo.install_hooks() | |
|
858 | 862 | |
|
859 | 863 | log.debug('Created repo %s with %s backend', |
|
860 | 864 | safe_unicode(repo_name), safe_unicode(repo_type)) |
@@ -915,6 +919,11 b' class RepoModel(BaseModel):' | |||
|
915 | 919 | if os.path.isdir(rm_path): |
|
916 | 920 | shutil.move(rm_path, os.path.join(self.repos_path, _d)) |
|
917 | 921 | |
|
922 | # finally cleanup diff-cache if it exists | |
|
923 | cached_diffs_dir = repo.cached_diffs_dir | |
|
924 | if os.path.isdir(cached_diffs_dir): | |
|
925 | shutil.rmtree(cached_diffs_dir) | |
|
926 | ||
|
918 | 927 | |
|
919 | 928 | class ReadmeFinder: |
|
920 | 929 | """ |
@@ -397,7 +397,7 b' class ScmModel(BaseModel):' | |||
|
397 | 397 | |
|
398 | 398 | def push_changes(self, repo, username, remote_uri=None): |
|
399 | 399 | dbrepo = self._get_repo(repo) |
|
400 | remote_uri = remote_uri or dbrepo. | |
|
400 | remote_uri = remote_uri or dbrepo.push_uri | |
|
401 | 401 | if not remote_uri: |
|
402 | 402 | raise Exception("This repository doesn't have a clone uri") |
|
403 | 403 | |
@@ -807,116 +807,6 b' class ScmModel(BaseModel):' | |||
|
807 | 807 | |
|
808 | 808 | return choices, hist_l |
|
809 | 809 | |
|
810 | def install_git_hook(self, repo, force_create=False): | |
|
811 | """ | |
|
812 | Creates a rhodecode hook inside a git repository | |
|
813 | ||
|
814 | :param repo: Instance of VCS repo | |
|
815 | :param force_create: Create even if same name hook exists | |
|
816 | """ | |
|
817 | ||
|
818 | loc = os.path.join(repo.path, 'hooks') | |
|
819 | if not repo.bare: | |
|
820 | loc = os.path.join(repo.path, '.git', 'hooks') | |
|
821 | if not os.path.isdir(loc): | |
|
822 | os.makedirs(loc, mode=0777) | |
|
823 | ||
|
824 | tmpl_post = pkg_resources.resource_string( | |
|
825 | 'rhodecode', '/'.join( | |
|
826 | ('config', 'hook_templates', 'git_post_receive.py.tmpl'))) | |
|
827 | tmpl_pre = pkg_resources.resource_string( | |
|
828 | 'rhodecode', '/'.join( | |
|
829 | ('config', 'hook_templates', 'git_pre_receive.py.tmpl'))) | |
|
830 | ||
|
831 | for h_type, tmpl in [('pre', tmpl_pre), ('post', tmpl_post)]: | |
|
832 | _hook_file = os.path.join(loc, '%s-receive' % h_type) | |
|
833 | log.debug('Installing git hook in repo %s', repo) | |
|
834 | _rhodecode_hook = _check_rhodecode_hook(_hook_file) | |
|
835 | ||
|
836 | if _rhodecode_hook or force_create: | |
|
837 | log.debug('writing %s hook file !', h_type) | |
|
838 | try: | |
|
839 | with open(_hook_file, 'wb') as f: | |
|
840 | tmpl = tmpl.replace('_TMPL_', rhodecode.__version__) | |
|
841 | tmpl = tmpl.replace('_ENV_', sys.executable) | |
|
842 | f.write(tmpl) | |
|
843 | os.chmod(_hook_file, 0755) | |
|
844 | except IOError: | |
|
845 | log.exception('error writing hook file %s', _hook_file) | |
|
846 | else: | |
|
847 | log.debug('skipping writing hook file') | |
|
848 | ||
|
849 | def install_svn_hooks(self, repo, force_create=False): | |
|
850 | """ | |
|
851 | Creates rhodecode hooks inside a svn repository | |
|
852 | ||
|
853 | :param repo: Instance of VCS repo | |
|
854 | :param force_create: Create even if same name hook exists | |
|
855 | """ | |
|
856 | hooks_path = os.path.join(repo.path, 'hooks') | |
|
857 | if not os.path.isdir(hooks_path): | |
|
858 | os.makedirs(hooks_path) | |
|
859 | post_commit_tmpl = pkg_resources.resource_string( | |
|
860 | 'rhodecode', '/'.join( | |
|
861 | ('config', 'hook_templates', 'svn_post_commit_hook.py.tmpl'))) | |
|
862 | pre_commit_template = pkg_resources.resource_string( | |
|
863 | 'rhodecode', '/'.join( | |
|
864 | ('config', 'hook_templates', 'svn_pre_commit_hook.py.tmpl'))) | |
|
865 | templates = { | |
|
866 | 'post-commit': post_commit_tmpl, | |
|
867 | 'pre-commit': pre_commit_template | |
|
868 | } | |
|
869 | for filename in templates: | |
|
870 | _hook_file = os.path.join(hooks_path, filename) | |
|
871 | _rhodecode_hook = _check_rhodecode_hook(_hook_file) | |
|
872 | if _rhodecode_hook or force_create: | |
|
873 | log.debug('writing %s hook file !', filename) | |
|
874 | template = templates[filename] | |
|
875 | try: | |
|
876 | with open(_hook_file, 'wb') as f: | |
|
877 | template = template.replace( | |
|
878 | '_TMPL_', rhodecode.__version__) | |
|
879 | template = template.replace('_ENV_', sys.executable) | |
|
880 | f.write(template) | |
|
881 | os.chmod(_hook_file, 0755) | |
|
882 | except IOError: | |
|
883 | log.exception('error writing hook file %s', filename) | |
|
884 | else: | |
|
885 | log.debug('skipping writing hook file') | |
|
886 | ||
|
887 | def install_hooks(self, repo, repo_type): | |
|
888 | if repo_type == 'git': | |
|
889 | self.install_git_hook(repo) | |
|
890 | elif repo_type == 'svn': | |
|
891 | self.install_svn_hooks(repo) | |
|
892 | ||
|
893 | 810 | def get_server_info(self, environ=None): |
|
894 | 811 | server_info = get_system_info(environ) |
|
895 | 812 | return server_info |
|
896 | ||
|
897 | ||
|
898 | def _check_rhodecode_hook(hook_path): | |
|
899 | """ | |
|
900 | Check if the hook was created by RhodeCode | |
|
901 | """ | |
|
902 | if not os.path.exists(hook_path): | |
|
903 | return True | |
|
904 | ||
|
905 | log.debug('hook exists, checking if it is from rhodecode') | |
|
906 | hook_content = _read_hook(hook_path) | |
|
907 | matches = re.search(r'(?:RC_HOOK_VER)\s*=\s*(.*)', hook_content) | |
|
908 | if matches: | |
|
909 | try: | |
|
910 | version = matches.groups()[0] | |
|
911 | log.debug('got %s, it is rhodecode', version) | |
|
912 | return True | |
|
913 | except Exception: | |
|
914 | log.exception("Exception while reading the hook version.") | |
|
915 | ||
|
916 | return False | |
|
917 | ||
|
918 | ||
|
919 | def _read_hook(hook_path): | |
|
920 | with open(hook_path, 'rb') as f: | |
|
921 | content = f.read() | |
|
922 | return content |
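The removed `_check_rhodecode_hook` helper (its job now lives behind `repo.install_hooks()`) decided whether an existing hook file may be overwritten by searching for an `RC_HOOK_VER` marker. A sketch of that check, reduced to the string matching:

```python
import re

def is_rhodecode_hook(hook_content):
    # the removed helper looked for an RC_HOOK_VER = <version> marker
    # to tell RhodeCode-managed hooks apart from user-written ones
    matches = re.search(r'(?:RC_HOOK_VER)\s*=\s*(.*)', hook_content)
    if matches:
        version = matches.groups()[0]
        return True, version
    return False, None

print(is_rhodecode_hook('#!/usr/bin/env python\nRC_HOOK_VER = "4.11.0"\n'))
print(is_rhodecode_hook('#!/bin/sh\necho custom hook'))
```

Only managed hooks are rewritten on upgrade; a hand-written hook without the marker is left alone unless `force` is passed.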
@@ -417,7 +417,9 b' class VcsSettingsModel(object):' | |||
|
417 | 417 | 'hg_use_rebase_for_merging', |
|
418 | 418 | 'hg_close_branch_before_merging', |
|
419 | 419 | 'git_use_rebase_for_merging', |
|
420 | 'git_close_branch_before_merging' | |
|
420 | 'git_close_branch_before_merging', | |
|
421 | 'diff_cache', | |
|
422 | ) | |
|
421 | 423 | |
|
422 | 424 | HOOKS_SETTINGS = ( |
|
423 | 425 | ('hooks', 'changegroup.repo_size'), |
@@ -438,6 +440,7 b' class VcsSettingsModel(object):' | |||
|
438 | 440 | GLOBAL_GIT_SETTINGS = ( |
|
439 | 441 | ('vcs_git_lfs', 'enabled'), |
|
440 | 442 | ('vcs_git_lfs', 'store_location')) |
|
443 | ||
|
441 | 444 | GLOBAL_SVN_SETTINGS = ( |
|
442 | 445 | ('vcs_svn_proxy', 'http_requests_enabled'), |
|
443 | 446 | ('vcs_svn_proxy', 'http_server_url')) |
@@ -571,6 +574,7 b' class VcsSettingsModel(object):' | |||
|
571 | 574 | self._create_or_update_ui( |
|
572 | 575 | self.repo_settings, *phases, value=safe_str(data[phases_key])) |
|
573 | 576 | |
|
577 | ||
|
574 | 578 | def create_or_update_global_hg_settings(self, data): |
|
575 | 579 | largefiles, largefiles_store, phases, hgsubversion, evolve \ |
|
576 | 580 | = self.GLOBAL_HG_SETTINGS |
@@ -663,6 +663,10 b' class UserModel(BaseModel):' | |||
|
663 | 663 | :param api_key: api key to fetch by |
|
664 | 664 | :param username: username to fetch by |
|
665 | 665 | """ |
|
666 | def token_obfuscate(token): | |
|
667 | if token: | |
|
668 | return token[:4] + "****" | |
|
669 | ||
|
666 | 670 | if user_id is None and api_key is None and username is None: |
|
667 | 671 | raise Exception('You need to pass user_id, api_key or username') |
|
668 | 672 | |
@@ -681,7 +685,7 b' class UserModel(BaseModel):' | |||
|
681 | 685 | if not dbuser: |
|
682 | 686 | log.warning( |
|
683 | 687 | 'Unable to lookup user by id:%s api_key:%s username:%s', |
|
684 | user_id, api_key, username) | |
|
688 | user_id, token_obfuscate(api_key), username) | |
|
685 | 689 | return False |
|
686 | 690 | if not dbuser.active: |
|
687 | 691 | log.debug('User `%s:%s` is inactive, skipping fill data', |
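The `token_obfuscate` helper added above keeps only the first four characters of an auth token in log output, enough to correlate log lines without leaking the secret. It is small enough to show standalone:

```python
def token_obfuscate(token):
    # keep a short prefix for log correlation, mask the rest;
    # falsy inputs (None, empty string) fall through to None
    if token:
        return token[:4] + "****"

print(token_obfuscate('deadbeefcafe'))  # dead****
print(token_obfuscate(None))            # None
```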
@@ -80,6 +80,7 b' class UserGroupModel(BaseModel):' | |||
|
80 | 80 | 'updated': [], |
|
81 | 81 | 'deleted': [] |
|
82 | 82 | } |
|
83 | change_obj = user_group.get_api_data() | |
|
83 | 84 | # update permissions |
|
84 | 85 | for member_id, perm, member_type in perm_updates: |
|
85 | 86 | member_id = int(member_id) |
@@ -97,7 +98,9 b' class UserGroupModel(BaseModel):' | |||
|
97 | 98 | self.grant_user_group_permission( |
|
98 | 99 | target_user_group=user_group, user_group=member_id, perm=perm) |
|
99 | 100 | |
|
100 | changes['updated'].append({ | |
|
101 | changes['updated'].append({ | |
|
102 | 'change_obj': change_obj, | |
|
103 | 'type': member_type, 'id': member_id, | |
|
101 | 104 |
|
|
102 | 105 | |
|
103 | 106 | # set new permissions |
@@ -115,7 +118,9 b' class UserGroupModel(BaseModel):' | |||
|
115 | 118 | self.grant_user_group_permission( |
|
116 | 119 | target_user_group=user_group, user_group=member_id, perm=perm) |
|
117 | 120 | |
|
118 | changes['added'].append({ | |
|
121 | changes['added'].append({ | |
|
122 | 'change_obj': change_obj, | |
|
123 | 'type': member_type, 'id': member_id, | |
|
119 | 124 |
|
|
120 | 125 | |
|
121 | 126 | # delete permissions |
@@ -132,8 +137,11 b' class UserGroupModel(BaseModel):' | |||
|
132 | 137 | self.revoke_user_group_permission( |
|
133 | 138 | target_user_group=user_group, user_group=member_id) |
|
134 | 139 | |
|
135 | changes['deleted'].append({ | |
|
140 | changes['deleted'].append({ | |
|
141 | 'change_obj': change_obj, | |
|
142 | 'type': member_type, 'id': member_id, | |
|
136 | 143 |
|
|
144 | ||
|
137 | 145 | return changes |
|
138 | 146 | |
|
139 | 147 | def get(self, user_group_id, cache=False): |
@@ -217,7 +225,7 b' class UserGroupModel(BaseModel):' | |||
|
217 | 225 | members.append(uid) |
|
218 | 226 | return members |
|
219 | 227 | |
|
220 | def update(self, user_group, form_data): | |
|
228 | def update(self, user_group, form_data, group_data=None): | |
|
221 | 229 | user_group = self._get_user_group(user_group) |
|
222 | 230 | if 'users_group_name' in form_data: |
|
223 | 231 | user_group.users_group_name = form_data['users_group_name'] |
@@ -247,6 +255,11 b' class UserGroupModel(BaseModel):' | |||
|
247 | 255 | added_user_ids, removed_user_ids = \ |
|
248 | 256 | self._update_members_from_user_ids(user_group, members_id_list) |
|
249 | 257 | |
|
258 | if group_data: | |
|
259 | new_group_data = {} | |
|
260 | new_group_data.update(group_data) | |
|
261 | user_group.group_data = new_group_data | |
|
262 | ||
|
250 | 263 | self.sa.add(user_group) |
|
251 | 264 | return user_group, added_user_ids, removed_user_ids |
|
252 | 265 | |
@@ -395,10 +408,18 b' class UserGroupModel(BaseModel):' | |||
|
395 | 408 | :param user: Instance of User, user_id or username |
|
396 | 409 | :param perm: Instance of Permission, or permission_name |
|
397 | 410 | """ |
|
411 | changes = { | |
|
412 | 'added': [], | |
|
413 | 'updated': [], | |
|
414 | 'deleted': [] | |
|
415 | } | |
|
398 | 416 | |
|
399 | 417 | user_group = self._get_user_group(user_group) |
|
400 | 418 | user = self._get_user(user) |
|
401 | 419 | permission = self._get_perm(perm) |
|
420 | perm_name = permission.permission_name | |
|
421 | member_id = user.user_id | |
|
422 | member_name = user.username | |
|
402 | 423 | |
|
403 | 424 | # check if we have that permission already |
|
404 | 425 | obj = self.sa.query(UserUserGroupToPerm)\ |
@@ -417,7 +438,12 b' class UserGroupModel(BaseModel):' | |||
|
417 | 438 | 'granted permission: {} to user: {} on usergroup: {}'.format( |
|
418 | 439 | perm, user, user_group), namespace='security.usergroup') |
|
419 | 440 | |
|
420 | return obj | |
|
441 | changes['added'].append({ | |
|
442 | 'change_obj': user_group.get_api_data(), | |
|
443 | 'type': 'user', 'id': member_id, | |
|
444 | 'name': member_name, 'new_perm': perm_name}) | |
|
445 | ||
|
446 | return changes | |
|
421 | 447 | |
|
422 | 448 | def revoke_user_permission(self, user_group, user): |
|
423 | 449 | """ |
@@ -427,9 +453,17 b' class UserGroupModel(BaseModel):' | |||
|
427 | 453 | or users_group name |
|
428 | 454 | :param user: Instance of User, user_id or username |
|
429 | 455 | """ |
|
456 | changes = { | |
|
457 | 'added': [], | |
|
458 | 'updated': [], | |
|
459 | 'deleted': [] | |
|
460 | } | |
|
430 | 461 | |
|
431 | 462 | user_group = self._get_user_group(user_group) |
|
432 | 463 | user = self._get_user(user) |
|
464 | perm_name = 'usergroup.none' | |
|
465 | member_id = user.user_id | |
|
466 | member_name = user.username | |
|
433 | 467 | |
|
434 | 468 | obj = self.sa.query(UserUserGroupToPerm)\ |
|
435 | 469 | .filter(UserUserGroupToPerm.user == user)\ |
@@ -442,6 +476,13 b' class UserGroupModel(BaseModel):' | |||
|
442 | 476 | 'revoked permission from user: {} on usergroup: {}'.format( |
|
443 | 477 | user, user_group), namespace='security.usergroup') |
|
444 | 478 | |
|
479 | changes['deleted'].append({ | |
|
480 | 'change_obj': user_group.get_api_data(), | |
|
481 | 'type': 'user', 'id': member_id, | |
|
482 | 'name': member_name, 'new_perm': perm_name}) | |
|
483 | ||
|
484 | return changes | |
|
485 | ||
|
445 | 486 | def grant_user_group_permission(self, target_user_group, user_group, perm): |
|
446 | 487 | """ |
|
447 | 488 | Grant user group permission for given target_user_group |
@@ -450,9 +491,19 b' class UserGroupModel(BaseModel):' | |||
|
450 | 491 | :param user_group: |
|
451 | 492 | :param perm: |
|
452 | 493 | """ |
|
494 | changes = { | |
|
495 | 'added': [], | |
|
496 | 'updated': [], | |
|
497 | 'deleted': [] | |
|
498 | } | |
|
499 | ||
|
453 | 500 | target_user_group = self._get_user_group(target_user_group) |
|
454 | 501 | user_group = self._get_user_group(user_group) |
|
455 | 502 | permission = self._get_perm(perm) |
|
503 | perm_name = permission.permission_name | |
|
504 | member_id = user_group.users_group_id | |
|
505 | member_name = user_group.users_group_name | |
|
506 | ||
|
456 | 507 | # forbid assigning same user group to itself |
|
457 | 508 | if target_user_group == user_group: |
|
458 | 509 | raise RepoGroupAssignmentError('target repo:%s cannot be ' |
@@ -477,7 +528,12 b' class UserGroupModel(BaseModel):' | |||
|
477 | 528 | perm, user_group, target_user_group), |
|
478 | 529 | namespace='security.usergroup') |
|
479 | 530 | |
|
480 | return obj | |
|
531 | changes['added'].append({ | |
|
532 | 'change_obj': target_user_group.get_api_data(), | |
|
533 | 'type': 'user_group', 'id': member_id, | |
|
534 | 'name': member_name, 'new_perm': perm_name}) | |
|
535 | ||
|
536 | return changes | |
|
481 | 537 | |
|
482 | 538 | def revoke_user_group_permission(self, target_user_group, user_group): |
|
483 | 539 | """ |
@@ -486,8 +542,17 b' class UserGroupModel(BaseModel):' | |||
|
486 | 542 | :param target_user_group: |
|
487 | 543 | :param user_group: |
|
488 | 544 | """ |
|
545 | changes = { | |
|
546 | 'added': [], | |
|
547 | 'updated': [], | |
|
548 | 'deleted': [] | |
|
549 | } | |
|
550 | ||
|
489 | 551 | target_user_group = self._get_user_group(target_user_group) |
|
490 | 552 | user_group = self._get_user_group(user_group) |
|
553 | perm_name = 'usergroup.none' | |
|
554 | member_id = user_group.users_group_id | |
|
555 | member_name = user_group.users_group_name | |
|
491 | 556 | |
|
492 | 557 | obj = self.sa.query(UserGroupUserGroupToPerm)\ |
|
493 | 558 | .filter(UserGroupUserGroupToPerm.target_user_group == target_user_group)\ |
@@ -502,6 +567,13 b' class UserGroupModel(BaseModel):' | |||
|
502 | 567 | user_group, target_user_group), |
|
503 | 568 | namespace='security.repogroup') |
|
504 | 569 | |
|
570 | changes['deleted'].append({ | |
|
571 | 'change_obj': target_user_group.get_api_data(), | |
|
572 | 'type': 'user_group', 'id': member_id, | |
|
573 | 'name': member_name, 'new_perm': perm_name}) | |
|
574 | ||
|
575 | return changes | |
|
576 | ||
|
505 | 577 | def get_perms_summary(self, user_group_id): |
|
506 | 578 | permissions = { |
|
507 | 579 | 'repositories': {}, |
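The grant/revoke helpers above switch from returning the raw permission object to returning a structured `changes` dict (`{'added': [], 'updated': [], 'deleted': []}`) that callers can feed into audit logging. A minimal sketch of that pattern; the helper names here are illustrative, not RhodeCode's actual API:

```python
# Illustrative sketch, not RhodeCode's actual model code.
def make_changes():
    # same shape as the dicts initialized in the diff above
    return {'added': [], 'updated': [], 'deleted': []}

def grant_permission(changes, member_id, member_name, perm_name):
    # record the grant so callers can emit an audit-log entry
    changes['added'].append({
        'type': 'user', 'id': member_id,
        'name': member_name, 'new_perm': perm_name})
    return changes

changes = grant_permission(make_changes(), 1, 'bob', 'usergroup.read')
```

Returning the dict instead of the ORM object lets every caller log added, updated, and deleted members uniformly.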
@@ -66,7 +66,7 b' def deferred_landing_ref_validator(node,' | |||
|
66 | 66 | |
|
67 | 67 | |
|
68 | 68 | @colander.deferred |
|
69 | def deferred_clone_uri_validator(node, kw): | |
|
|
69 | def deferred_sync_uri_validator(node, kw): | |
|
70 | 70 | repo_type = kw.get('repo_type') |
|
71 | 71 | validator = validators.CloneUriValidator(repo_type) |
|
72 | 72 | return validator |
@@ -319,7 +319,13 b' class RepoSchema(colander.MappingSchema)' | |||
|
319 | 319 | |
|
320 | 320 | repo_clone_uri = colander.SchemaNode( |
|
321 | 321 | colander.String(), |
|
322 |
validator=deferred_c |
|
|
322 | validator=deferred_sync_uri_validator, | |
|
323 | preparers=[preparers.strip_preparer], | |
|
324 | missing='') | |
|
325 | ||
|
326 | repo_push_uri = colander.SchemaNode( | |
|
327 | colander.String(), | |
|
328 | validator=deferred_sync_uri_validator, | |
|
323 | 329 | preparers=[preparers.strip_preparer], |
|
324 | 330 | missing='') |
|
325 | 331 | |
@@ -381,7 +387,17 b' class RepoSettingsSchema(RepoSchema):' | |||
|
381 | 387 | repo_clone_uri = colander.SchemaNode( |
|
382 | 388 | colander.String(), |
|
383 | 389 | preparers=[preparers.strip_preparer], |
|
384 |
validator=deferred_c |
|
|
390 | validator=deferred_sync_uri_validator, | |
|
391 | missing='') | |
|
392 | ||
|
393 | repo_push_uri_change = colander.SchemaNode( | |
|
394 | colander.String(), | |
|
395 | missing='NEW') | |
|
396 | ||
|
397 | repo_push_uri = colander.SchemaNode( | |
|
398 | colander.String(), | |
|
399 | preparers=[preparers.strip_preparer], | |
|
400 | validator=deferred_sync_uri_validator, | |
|
385 | 401 | missing='') |
|
386 | 402 | |
|
387 | 403 | def deserialize(self, cstruct): |
@@ -22,10 +22,11 b' import re' | |||
|
22 | 22 | import colander |
|
23 | 23 | |
|
24 | 24 | from rhodecode import forms |
|
25 | from rhodecode.model.db import User | |
|
25 | from rhodecode.model.db import User, UserEmailMap | |
|
26 | 26 | from rhodecode.model.validation_schema import types, validators |
|
27 | 27 | from rhodecode.translation import _ |
|
28 | 28 | from rhodecode.lib.auth import check_password |
|
29 | from rhodecode.lib import helpers as h | |
|
29 | 30 | |
|
30 | 31 | |
|
31 | 32 | @colander.deferred |
@@ -40,6 +41,7 b' def deferred_user_password_validator(nod' | |||
|
40 | 41 | return _user_password_validator |
|
41 | 42 | |
|
42 | 43 | |
|
44 | ||
|
43 | 45 | class ChangePasswordSchema(colander.Schema): |
|
44 | 46 | |
|
45 | 47 | current_password = colander.SchemaNode( |
@@ -123,3 +125,64 b' class UserSchema(colander.Schema):' | |||
|
123 | 125 | |
|
124 | 126 | appstruct = super(UserSchema, self).deserialize(cstruct) |
|
125 | 127 | return appstruct |
|
128 | ||
|
129 | ||
|
130 | @colander.deferred | |
|
131 | def deferred_user_email_in_emails_validator(node, kw): | |
|
132 | return colander.OneOf(kw.get('user_emails')) | |
|
133 | ||
|
134 | ||
|
135 | @colander.deferred | |
|
136 | def deferred_additional_email_validator(node, kw): | |
|
137 | emails = kw.get('user_emails') | |
|
138 | ||
|
139 | def name_validator(node, value): | |
|
140 | if value in emails: | |
|
141 | msg = _('This e-mail address is already taken') | |
|
142 | raise colander.Invalid(node, msg) | |
|
143 | user = User.get_by_email(value, case_insensitive=True) | |
|
144 | if user: | |
|
145 | msg = _(u'This e-mail address is already taken') | |
|
146 | raise colander.Invalid(node, msg) | |
|
147 | c = colander.Email() | |
|
148 | return c(node, value) | |
|
149 | return name_validator | |
|
150 | ||
|
151 | ||
|
152 | @colander.deferred | |
|
153 | def deferred_user_email_in_emails_widget(node, kw): | |
|
154 | import deform.widget | |
|
155 | emails = [(email, email) for email in kw.get('user_emails')] | |
|
156 | return deform.widget.Select2Widget(values=emails) | |
|
157 | ||
|
158 | ||
|
159 | class UserProfileSchema(colander.Schema): | |
|
160 | username = colander.SchemaNode( | |
|
161 | colander.String(), | |
|
162 | validator=deferred_username_validator) | |
|
163 | ||
|
164 | firstname = colander.SchemaNode( | |
|
165 | colander.String(), missing='', title='First name') | |
|
166 | ||
|
167 | lastname = colander.SchemaNode( | |
|
168 | colander.String(), missing='', title='Last name') | |
|
169 | ||
|
170 | email = colander.SchemaNode( | |
|
171 | colander.String(), widget=deferred_user_email_in_emails_widget, | |
|
172 | validator=deferred_user_email_in_emails_validator, | |
|
173 | description=h.literal( | |
|
174 | _('Additional emails can be specified at <a href="{}">extra emails</a> page.').format( | |
|
175 | '/_admin/my_account/emails')), | |
|
176 | ) | |
|
177 | ||
|
178 | ||
|
179 | class AddEmailSchema(colander.Schema): | |
|
180 | current_password = colander.SchemaNode( | |
|
181 | colander.String(), | |
|
182 | missing=colander.required, | |
|
183 | widget=forms.widget.PasswordWidget(redisplay=True), | |
|
184 | validator=deferred_user_password_validator) | |
|
185 | ||
|
186 | email = colander.SchemaNode( | |
|
187 | colander.String(), title='New Email', | |
|
188 | validator=deferred_additional_email_validator) |
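The schema code above relies on colander's deferred validators: the concrete validator is only built when the schema is bound with keyword arguments such as `user_emails`. A dependency-free stand-in sketching that mechanism (this mimics the pattern; it assumes nothing about colander's internals):

```python
# Dependency-free stand-in for colander's deferred-validator pattern.
class Invalid(Exception):
    pass

def deferred(factory):
    # marker decorator, loosely mirroring colander.deferred
    factory.is_deferred = True
    return factory

@deferred
def deferred_additional_email_validator(node, kw):
    emails = kw.get('user_emails', [])

    def validator(node, value):
        if value in emails:
            raise Invalid('This e-mail address is already taken')

    return validator

# binding the schema resolves the deferred into a concrete validator
validator = deferred_additional_email_validator(None, {'user_emails': ['a@b.c']})
```

Deferring construction this way is what lets the same schema class be reused per-request with each user's own email list.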
@@ -18,15 +18,6 b'' | |||
|
18 | 18 | } |
|
19 | 19 | } |
|
20 | 20 | |
|
21 | .compare_view_files { | |
|
22 | ||
|
23 | .diff-container { | |
|
24 | ||
|
25 | .diffblock { | |
|
26 | margin-bottom: 0; | |
|
27 | } | |
|
28 | } | |
|
29 | } | |
|
30 | 21 | |
|
31 | 22 | div.diffblock .sidebyside { |
|
32 | 23 | background: #ffffff; |
@@ -1543,14 +1543,6 b' table.integrations {' | |||
|
1543 | 1543 | } |
|
1544 | 1544 | } |
|
1545 | 1545 | |
|
1546 | .compare_view_files { | |
|
1547 | width: 100%; | |
|
1548 | ||
|
1549 | td { | |
|
1550 | vertical-align: middle; | |
|
1551 | } | |
|
1552 | } | |
|
1553 | ||
|
1554 | 1546 | .compare_view_filepath { |
|
1555 | 1547 | color: @grey1; |
|
1556 | 1548 | } |
@@ -485,58 +485,6 b' table.compare_view_commits {' | |||
|
485 | 485 | } |
|
486 | 486 | } |
|
487 | 487 | |
|
488 | .compare_view_files { | |
|
489 | ||
|
490 | td.td-actions { | |
|
491 | text-align: right; | |
|
492 | } | |
|
493 | ||
|
494 | .flag_status { | |
|
495 | margin: 0 0 0 5px; | |
|
496 | } | |
|
497 | ||
|
498 | td.injected_diff { | |
|
499 | ||
|
500 | .code-difftable { | |
|
501 | border:none; | |
|
502 | } | |
|
503 | ||
|
504 | .diff-container { | |
|
505 | border: @border-thickness solid @border-default-color; | |
|
506 | .border-radius(@border-radius); | |
|
507 | } | |
|
508 | ||
|
509 | div.diffblock { | |
|
510 | border:none; | |
|
511 | } | |
|
512 | ||
|
513 | div.code-body { | |
|
514 | max-width: 1152px; | |
|
515 | } | |
|
516 | } | |
|
517 | ||
|
518 | .rctable { | |
|
519 | ||
|
520 | td { | |
|
521 | padding-top: @space; | |
|
522 | } | |
|
523 | ||
|
524 | &:first-child td { | |
|
525 | padding-top: 0; | |
|
526 | } | |
|
527 | } | |
|
528 | ||
|
529 | .comment-bubble, | |
|
530 | .show_comments { | |
|
531 | float: right; | |
|
532 | visibility: hidden; | |
|
533 | padding: 0 1em 0 0; | |
|
534 | } | |
|
535 | ||
|
536 | .injected_diff { | |
|
537 | padding-bottom: @padding; | |
|
538 | } | |
|
539 | } | |
|
540 | 488 | |
|
541 | 489 | // Gist List |
|
542 | 490 | #gist_list_table { |
@@ -232,6 +232,7 b' function registerRCRoutes() {' | |||
|
232 | 232 | pyroutes.register('repo_edit_toggle_locking', '/%(repo_name)s/settings/toggle_locking', ['repo_name']); |
|
233 | 233 | pyroutes.register('edit_repo_remote', '/%(repo_name)s/settings/remote', ['repo_name']); |
|
234 | 234 | pyroutes.register('edit_repo_remote_pull', '/%(repo_name)s/settings/remote/pull', ['repo_name']); |
|
235 | pyroutes.register('edit_repo_remote_push', '/%(repo_name)s/settings/remote/push', ['repo_name']); | |
|
235 | 236 | pyroutes.register('edit_repo_statistics', '/%(repo_name)s/settings/statistics', ['repo_name']); |
|
236 | 237 | pyroutes.register('edit_repo_statistics_reset', '/%(repo_name)s/settings/statistics/update', ['repo_name']); |
|
237 | 238 | pyroutes.register('edit_repo_issuetracker', '/%(repo_name)s/settings/issue_trackers', ['repo_name']); |
@@ -243,6 +244,7 b' function registerRCRoutes() {' | |||
|
243 | 244 | pyroutes.register('edit_repo_vcs_svn_pattern_delete', '/%(repo_name)s/settings/vcs/svn_pattern/delete', ['repo_name']); |
|
244 | 245 | pyroutes.register('repo_reviewers', '/%(repo_name)s/settings/review/rules', ['repo_name']); |
|
245 | 246 | pyroutes.register('repo_default_reviewers_data', '/%(repo_name)s/settings/review/default-reviewers', ['repo_name']); |
|
247 | pyroutes.register('repo_automation', '/%(repo_name)s/settings/automation', ['repo_name']); | |
|
246 | 248 | pyroutes.register('edit_repo_strip', '/%(repo_name)s/settings/strip', ['repo_name']); |
|
247 | 249 | pyroutes.register('strip_check', '/%(repo_name)s/settings/strip_check', ['repo_name']); |
|
248 | 250 | pyroutes.register('strip_execute', '/%(repo_name)s/settings/strip_execute', ['repo_name']); |
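The pyroutes entries above register URL templates with Python-style `%(name)s` placeholders. A hypothetical sketch of how such a template expands, mirroring the registration calls in the diff:

```python
# Hypothetical expansion of pyroutes-style URL templates.
routes = {}

def register(name, pattern, params):
    routes[name] = (pattern, params)

# same template syntax as the registrations in the diff above
register('repo_automation', '/%(repo_name)s/settings/automation', ['repo_name'])

def url_for(name, **kwargs):
    pattern, params = routes[name]
    # every declared parameter must be supplied
    assert set(params) <= set(kwargs)
    return pattern % kwargs

url = url_for('repo_automation', repo_name='my-repo')
```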
@@ -293,44 +293,27 b' function scrollToElement(element, percen' | |||
|
293 | 293 | }); |
|
294 | 294 | } |
|
295 | 295 | }); |
|
296 | $('.compare_view_files').on( | |
|
297 | 'mouseenter mouseleave', 'tr.line .lineno a',function(event) { | |
|
298 | if (event.type === "mouseenter") { | |
|
299 | $(this).parents('tr.line').addClass('hover'); | |
|
300 | } else { | |
|
301 | $(this).parents('tr.line').removeClass('hover'); | |
|
302 | } | |
|
303 | }); | |
|
304 | 296 | |
|
305 | $('.compare_view_files').on( | |
|
306 | 'mouseenter mouseleave', 'tr.line .add-comment-line a',function(event){ | |
|
307 | if (event.type === "mouseenter") { | |
|
308 | $(this).parents('tr.line').addClass('commenting'); | |
|
309 | } else { | |
|
310 | $(this).parents('tr.line').removeClass('commenting'); | |
|
311 | } | |
|
312 | }); | |
|
313 | ||
|
314 | $('body').on( /* TODO: replace the $('.compare_view_files').on('click') below | |
|
315 | when new diffs are integrated */ | |
|
316 | 'click', '.cb-lineno a', function(event) { | |
|
297 | $('body').on('click', '.cb-lineno a', function(event) { | |
|
317 | 298 | |
|
318 | 299 | function sortNumber(a,b) { |
|
319 | 300 | return a - b; |
|
320 | 301 | } |
|
321 | 302 | |
|
322 |
|
|
|
303 | var lineNo = $(this).data('lineNo'); | |
|
304 | if (lineNo) { | |
|
323 | 305 | |
|
324 | 306 | // on shift, we do a range selection, if we got previous line |
|
325 | var prevLine = $('.cb-line-selected a').attr('data-line-no'); | |
|
|
307 | var prevLine = $('.cb-line-selected a').data('lineNo'); | |
|
326 | 308 | if (event.shiftKey && prevLine !== undefined) { |
|
327 | 309 | var prevLine = parseInt(prevLine); |
|
328 | var nextLine = parseInt($(this).attr('data-line-no')); | |
|
|
310 | var nextLine = parseInt(lineNo); | |
|
329 | 311 | var pos = [prevLine, nextLine].sort(sortNumber); |
|
330 | 312 | var anchor = '#L{0}-{1}'.format(pos[0], pos[1]); |
|
331 | 313 | |
|
332 | 314 | } else { |
|
333 | var nextLine = parseInt($(this).attr('data-line-no')); | |
|
315 | ||
|
316 | var nextLine = parseInt(lineNo); | |
|
334 | 317 | var pos = [nextLine, nextLine]; |
|
335 | 318 | var anchor = '#L{0}'.format(pos[0]); |
|
336 | 319 | |
@@ -352,9 +335,20 b' function scrollToElement(element, percen' | |||
|
352 | 335 | } |
|
353 | 336 | }); |
|
354 | 337 | |
|
338 | ||
|
339 | } else { | |
|
340 | if ($(this).attr('name') !== undefined) { | |
|
341 | // clear selection | |
|
342 | $('td.cb-line-selected').removeClass('cb-line-selected'); | |
|
343 | var aEl = $(this).closest('td'); | |
|
344 | aEl.addClass('cb-line-selected'); | |
|
345 | aEl.next('td').addClass('cb-line-selected'); | |
|
346 | } | |
|
347 | } | |
|
348 | ||
|
355 | 349 |
|
|
356 | 350 |
|
|
357 | if (history.pushState) { | |
|
|
|
351 | if (history.pushState && anchor !== undefined) { | |
|
358 | 352 |
|
|
359 | 353 |
|
|
360 | 354 |
|
@@ -367,39 +361,9 b' function scrollToElement(element, percen' | |||
|
367 | 361 | |
|
368 | 362 |
|
|
369 | 363 |
|
|
370 | } | |
|
364 | ||
|
371 | 365 | }); |
|
372 | 366 | |
|
373 | $('.compare_view_files').on( /* TODO: replace this with .cb function above | |
|
374 | when new diffs are integrated */ | |
|
375 | 'click', 'tr.line .lineno a',function(event) { | |
|
376 | if ($(this).text() != ""){ | |
|
377 | $('tr.line').removeClass('selected'); | |
|
378 | $(this).parents("tr.line").addClass('selected'); | |
|
379 | ||
|
380 | // Replace URL without jumping to it if browser supports. | |
|
381 | // Default otherwise | |
|
382 | if (history.pushState) { | |
|
383 | var new_location = location.href; | |
|
384 | if (location.hash){ | |
|
385 | new_location = new_location.replace(location.hash, ""); | |
|
386 | } | |
|
387 | ||
|
388 | // Make new anchor url | |
|
389 | var new_location = new_location+$(this).attr('href'); | |
|
390 | history.pushState(true, document.title, new_location); | |
|
391 | ||
|
392 | return false; | |
|
393 | } | |
|
394 | } | |
|
395 | }); | |
|
396 | ||
|
397 | $('.compare_view_files').on( | |
|
398 | 'click', 'tr.line .add-comment-line a',function(event) { | |
|
399 | var tr = $(event.currentTarget).parents('tr.line')[0]; | |
|
400 | injectInlineForm(tr); | |
|
401 | return false; | |
|
402 | }); | |
|
403 | 367 | |
|
404 | 368 | $('.collapse_file').on('click', function(e) { |
|
405 | 369 | e.stopPropagation(); |
@@ -475,7 +475,7 b' var CommentsController = function() {' | |||
|
475 | 475 | |
|
476 | 476 | this.getLineNumber = function(node) { |
|
477 | 477 | var $node = $(node); |
|
478 |
var lineNo = $node.closest('td').attr('data-line-n |
|
|
478 | var lineNo = $node.closest('td').attr('data-line-no'); | |
|
479 | 479 | if (lineNo === undefined && $node.data('commentInline')){ |
|
480 | 480 | lineNo = $node.data('commentLineNo') |
|
481 | 481 | } |
@@ -598,6 +598,8 b' var CommentsController = function() {' | |||
|
598 | 598 | this.toggleLineComments = function(node) { |
|
599 | 599 | self.toggleComments(node, true); |
|
600 | 600 | var $node = $(node); |
|
601 | // mark outdated comments as visible before the toggle; | |
|
602 | $(node.closest('tr')).find('.comment-outdated').show(); | |
|
601 | 603 | $node.closest('tr').toggleClass('hide-line-comments'); |
|
602 | 604 | }; |
|
603 | 605 |
@@ -54,11 +54,9 b'' | |||
|
54 | 54 | <div class="textarea text-area editor"> |
|
55 | 55 | ${h.textarea('auth_plugins',cols=23,rows=5,class_="medium")} |
|
56 | 56 | </div> |
|
57 | <p class="help-block"> | |
|
58 | ${_('Add a list of plugins, separated by commas. ' | |
|
59 | 'The order of the plugins is also the order in which ' | |
|
60 | 'RhodeCode Enterprise will try to authenticate a user.')} | |
|
61 | </p> | |
|
57 | <p class="help-block pre-formatting">${_('List of plugins, separated by commas.' | |
|
58 | '\nThe order of the plugins is also the order in which ' | |
|
59 | 'RhodeCode Enterprise will try to authenticate a user.')}</p> | |
|
62 | 60 | </div> |
|
63 | 61 | |
|
64 | 62 | <div class="field"> |
@@ -91,15 +89,19 b'' | |||
|
91 | 89 | <script> |
|
92 | 90 | $('.toggle-plugin').click(function(e){ |
|
93 | 91 | var auth_plugins_input = $('#auth_plugins'); |
|
94 | var notEmpty = function(element, index, array) { | |
|
95 | return (element != ""); | |
|
96 | }; | |
|
97 | var elems = auth_plugins_input.val().split(',').filter(notEmpty); | |
|
92 | var elems = []; | |
|
93 | ||
|
94 | $.each(auth_plugins_input.val().split(',') , function (index, element) { | |
|
95 | if (element !== "") { | |
|
96 | elems.push(element.strip()) | |
|
97 | } | |
|
98 | }); | |
|
99 | ||
|
98 | 100 | var cur_button = e.currentTarget; |
|
99 | 101 | var plugin_id = $(cur_button).attr('plugin_id'); |
|
100 | 102 | if($(cur_button).hasClass('btn-success')){ |
|
101 | 103 | elems.splice(elems.indexOf(plugin_id), 1); |
|
102 | auth_plugins_input.val(elems.join(',')); | |
|
104 | auth_plugins_input.val(elems.join(',\n')); | |
|
103 | 105 | $(cur_button).removeClass('btn-success'); |
|
104 | 106 | cur_button.innerHTML = _gettext('disabled'); |
|
105 | 107 | } |
@@ -107,7 +109,7 b'' | |||
|
107 | 109 | if(elems.indexOf(plugin_id) == -1){ |
|
108 | 110 | elems.push(plugin_id); |
|
109 | 111 | } |
|
110 | auth_plugins_input.val(elems.join(',')); | |
|
112 | auth_plugins_input.val(elems.join(',\n')); | |
|
111 | 113 | $(cur_button).addClass('btn-success'); |
|
112 | 114 | cur_button.innerHTML = _gettext('enabled'); |
|
113 | 115 | } |
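The toggle-plugin script above rebuilds the `auth_plugins` value by splitting on commas, trimming whitespace and newlines, and dropping empty entries. The same normalization expressed in Python:

```python
def normalize_plugins(raw):
    # split on commas, trim whitespace/newlines, drop empty entries
    return [e.strip() for e in raw.split(',') if e.strip()]

elems = normalize_plugins('egg:rhodecode-enterprise-ce#rhodecode,\n egg:x#crowd, ')
```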
@@ -50,40 +50,6 b'' | |||
|
50 | 50 | </h3> |
|
51 | 51 | </div> |
|
52 | 52 | <div class="panel-body"> |
|
53 | <% | |
|
54 | if c.repo: | |
|
55 | home_url = request.route_path('repo_integrations_home', | |
|
56 | repo_name=c.repo.repo_name) | |
|
57 | elif c.repo_group: | |
|
58 | home_url = request.route_path('repo_group_integrations_home', | |
|
59 | repo_group_name=c.repo_group.group_name) | |
|
60 | else: | |
|
61 | home_url = request.route_path('global_integrations_home') | |
|
62 | %> | |
|
63 | ||
|
64 | <a href="${home_url}" class="btn ${not c.current_IntegrationType and 'btn-primary' or ''}">${_('All')}</a> | |
|
65 | ||
|
66 | %for integration_key, IntegrationType in c.available_integrations.items(): | |
|
67 | % if not IntegrationType.is_dummy: | |
|
68 | <% | |
|
69 | if c.repo: | |
|
70 | list_url = request.route_path('repo_integrations_list', | |
|
71 | repo_name=c.repo.repo_name, | |
|
72 | integration=integration_key) | |
|
73 | elif c.repo_group: | |
|
74 | list_url = request.route_path('repo_group_integrations_list', | |
|
75 | repo_group_name=c.repo_group.group_name, | |
|
76 | integration=integration_key) | |
|
77 | else: | |
|
78 | list_url = request.route_path('global_integrations_list', | |
|
79 | integration=integration_key) | |
|
80 | %> | |
|
81 | <a href="${list_url}" | |
|
82 | class="btn ${c.current_IntegrationType and integration_key == c.current_IntegrationType.key and 'btn-primary' or ''}"> | |
|
83 | ${IntegrationType.display_name} | |
|
84 | </a> | |
|
85 | % endif | |
|
86 | %endfor | |
|
87 | 53 | |
|
88 | 54 | <% |
|
89 | 55 | integration_type = c.current_IntegrationType and c.current_IntegrationType.display_name or '' |
@@ -153,7 +119,7 b'' | |||
|
153 | 119 | <td class="td-icon"> |
|
154 | 120 | %if integration.integration_type in c.available_integrations: |
|
155 | 121 | <div class="integration-icon"> |
|
156 | ${c.available_integrations[integration.integration_type].icon|n} | |
|
122 | ${c.available_integrations[integration.integration_type].icon()|n} | |
|
157 | 123 | </div> |
|
158 | 124 | %else: |
|
159 | 125 | ? |
@@ -56,7 +56,7 b'' | |||
|
56 | 56 | <%widgets:panel> |
|
57 | 57 | <h2> |
|
58 | 58 | <div class="integration-icon"> |
|
59 | ${IntegrationObject.icon|n} | |
|
59 | ${IntegrationObject.icon()|n} | |
|
60 | 60 | </div> |
|
61 | 61 | ${IntegrationObject.display_name} |
|
62 | 62 | </h2> |
@@ -48,25 +48,7 b'' | |||
|
48 | 48 | </div> |
|
49 | 49 | |
|
50 | 50 | <div> |
|
51 | ${h.secure_form(h.route_path('my_account_emails_add'), request=request)} | |
|
52 | <div class="form"> | |
|
53 | <!-- fields --> | |
|
54 | <div class="fields"> | |
|
55 | <div class="field"> | |
|
56 | <div class="label"> | |
|
57 | <label for="new_email">${_('New email address')}:</label> | |
|
58 | </div> | |
|
59 | <div class="input"> | |
|
60 | ${h.text('new_email', class_='medium')} | |
|
61 | </div> | |
|
62 | </div> | |
|
63 | <div class="buttons"> | |
|
64 | ${h.submit('save',_('Add'),class_="btn")} | |
|
65 | ${h.reset('reset',_('Reset'),class_="btn")} | |
|
51 | ${c.form.render() | n} | |
|
66 | 52 |
|
|
67 | 53 |
|
|
68 | 54 | </div> |
|
69 | ${h.end_form()} | |
|
70 | </div> | |
|
71 | </div> | |
|
72 | </div> |
@@ -6,7 +6,6 b'' | |||
|
6 | 6 | </div> |
|
7 | 7 | |
|
8 | 8 | <div class="panel-body"> |
|
9 | ${h.secure_form(h.route_path('my_account_update'), class_='form', request=request)} | |
|
10 | 9 | <% readonly = None %> |
|
11 | 10 | <% disabled = "" %> |
|
12 | 11 | |
@@ -24,7 +23,7 b'' | |||
|
24 | 23 | <label for="username">${_('Username')}:</label> |
|
25 | 24 | </div> |
|
26 | 25 | <div class="input"> |
|
27 | ${h.text('username', class_='input-valuedisplay', readonly=readonly)} | |
|
26 | ${c.user.username} | |
|
28 | 27 | </div> |
|
29 | 28 | </div> |
|
30 | 29 | |
@@ -33,7 +32,7 b'' | |||
|
33 | 32 | <label for="name">${_('First Name')}:</label> |
|
34 | 33 | </div> |
|
35 | 34 | <div class="input"> |
|
36 | ${h.text('firstname', class_='input-valuedisplay', readonly=readonly)} | |
|
35 | ${c.user.firstname} | |
|
37 | 36 | </div> |
|
38 | 37 | </div> |
|
39 | 38 | |
@@ -42,7 +41,7 b'' | |||
|
42 | 41 | <label for="lastname">${_('Last Name')}:</label> |
|
43 | 42 | </div> |
|
44 | 43 | <div class="input-valuedisplay"> |
|
45 | ${h.text('lastname', class_='input-valuedisplay', readonly=readonly)} | |
|
44 | ${c.user.lastname} | |
|
46 | 45 | </div> |
|
47 | 46 | </div> |
|
48 | 47 | </div> |
@@ -64,48 +63,7 b'' | |||
|
64 | 63 | %endif |
|
65 | 64 | </div> |
|
66 | 65 | </div> |
|
67 | <div class="field"> | |
|
68 | <div class="label"> | |
|
69 | <label for="username">${_('Username')}:</label> | |
|
70 | </div> | |
|
71 | <div class="input"> | |
|
72 | ${h.text('username', class_='medium%s' % disabled, readonly=readonly)} | |
|
73 | ${h.hidden('extern_name', c.extern_name)} | |
|
74 | ${h.hidden('extern_type', c.extern_type)} | |
|
75 | </div> | |
|
76 | </div> | |
|
77 | <div class="field"> | |
|
78 | <div class="label"> | |
|
79 | <label for="name">${_('First Name')}:</label> | |
|
80 | </div> | |
|
81 | <div class="input"> | |
|
82 | ${h.text('firstname', class_="medium")} | |
|
83 | </div> | |
|
84 | </div> | |
|
85 | ||
|
86 | <div class="field"> | |
|
87 | <div class="label"> | |
|
88 | <label for="lastname">${_('Last Name')}:</label> | |
|
89 | </div> | |
|
90 | <div class="input"> | |
|
91 | ${h.text('lastname', class_="medium")} | |
|
92 | </div> | |
|
93 | </div> | |
|
94 | ||
|
95 | <div class="field"> | |
|
96 | <div class="label"> | |
|
97 | <label for="email">${_('Email')}:</label> | |
|
98 | </div> | |
|
99 | <div class="input"> | |
|
100 | ## we should be able to edit email ! | |
|
101 | ${h.text('email', class_="medium")} | |
|
102 | </div> | |
|
103 | </div> | |
|
104 | ||
|
105 | <div class="buttons"> | |
|
106 | ${h.submit('save', _('Save'), class_="btn")} | |
|
107 | ${h.reset('reset', _('Reset'), class_="btn")} | |
|
108 | </div> | |
|
66 | ${c.form.render()| n} | |
|
109 | 67 | </div> |
|
110 | 68 | </div> |
|
111 | 69 | % endif |
@@ -98,7 +98,7 b'' | |||
|
98 | 98 | ${_user_group.users_group_name} |
|
99 | 99 | </a> |
|
100 | 100 | %else: |
|
101 | ${_user_group.users_group_name} | |
|
101 | ${h.link_to_group(_user_group.users_group_name)} | |
|
102 | 102 | %endif |
|
103 | 103 | </td> |
|
104 | 104 | <td class="td-action"> |
@@ -65,7 +65,7 b'' | |||
|
65 | 65 | </li> |
|
66 | 66 | %if c.rhodecode_db_repo.repo_type != 'svn': |
|
67 | 67 | <li class="${'active' if c.active=='remote' else ''}"> |
|
68 | <a href="${h.route_path('edit_repo_remote', repo_name=c.repo_name)}">${_('Remote')}</a> | |
|
68 | <a href="${h.route_path('edit_repo_remote', repo_name=c.repo_name)}">${_('Remote sync')}</a> | |
|
69 | 69 | </li> |
|
70 | 70 | %endif |
|
71 | 71 | <li class="${'active' if c.active=='statistics' else ''}"> |
@@ -79,6 +79,9 b'' | |||
|
79 | 79 | <a href="${h.route_path('repo_reviewers', repo_name=c.repo_name)}">${_('Reviewer Rules')}</a> |
|
80 | 80 | </li> |
|
81 | 81 | %endif |
|
82 | <li class="${'active' if c.active=='automation' else ''}"> | |
|
83 | <a href="${h.route_path('repo_automation', repo_name=c.repo_name)}">${_('Automation')}</a> | |
|
84 | </li> | |
|
82 | 85 | <li class="${'active' if c.active=='maintenance' else ''}"> |
|
83 | 86 | <a href="${h.route_path('edit_repo_maintenance', repo_name=c.repo_name)}">${_('Maintenance')}</a> |
|
84 | 87 | </li> |
@@ -51,3 +51,26 b'' | |||
|
51 | 51 | </div> |
|
52 | 52 | </div> |
|
53 | 53 | </div> |
|
54 | ||
|
55 | ||
|
56 | <div class="panel panel-default"> | |
|
57 | <div class="panel-heading"> | |
|
58 | <h3 class="panel-title">${_('Diff Caches')}</h3> | |
|
59 | </div> | |
|
60 | <div class="panel-body"> | |
|
61 | <table class="rctable edit_cache"> | |
|
62 | <tr> | |
|
63 | <td>${_('Cached diff name')}:</td> | |
|
64 | <td>${c.rhodecode_db_repo.cached_diffs_relative_dir}</td> | |
|
65 | </tr> | |
|
66 | <tr> | |
|
67 | <td>${_('Cached diff files')}:</td> | |
|
68 | <td>${c.cached_diff_count}</td> | |
|
69 | </tr> | |
|
70 | <tr> | |
|
71 | <td>${_('Cached diff size')}:</td> | |
|
72 | <td>${h.format_byte_size(c.cached_diff_size)}</td> | |
|
73 | </tr> | |
|
74 | </table> | |
|
75 | </div> | |
|
76 | </div> |
@@ -89,7 +89,7 b'' | |||
|
89 | 89 | ${_user_group.users_group_name} |
|
90 | 90 | </a> |
|
91 | 91 | %else: |
|
92 | ${_user_group.users_group_name} | |
|
92 | ${h.link_to_group(_user_group.users_group_name)} | |
|
93 | 93 | %endif |
|
94 | 94 | </td> |
|
95 | 95 | <td class="td-action"> |
@@ -1,16 +1,27 b'' | |||
|
1 | 1 | <div class="panel panel-default"> |
|
2 | 2 | <div class="panel-heading"> |
|
3 | <h3 class="panel-title">${_('Remote')}</h3> | |
|
|
3 | <h3 class="panel-title">${_('Remote Sync')}</h3> | |
|
4 | 4 | </div> |
|
5 | 5 | <div class="panel-body"> |
|
6 | 6 | |
|
7 | <h4>${_('Manually pull changes from external url.')}</h4> | |
|
|
8 | ||
|
9 | %if c.rhodecode_db_repo.clone_uri: | |
|
7 | <h4>${_('Manually pull/push changes from/to external URLs.')}</h4> | |
|
10 | 8 | |
|
11 | ${_('Remote mirror url')}: | |
|
9 | <table> | |
|
10 | <tr> | |
|
11 | <td><div style="min-width: 80px"><strong>${_('Pull url')}</strong></div></td> | |
|
12 | <td> | |
|
13 | % if c.rhodecode_db_repo.clone_uri: | |
|
12 | 14 | <a href="${c.rhodecode_db_repo.clone_uri}">${c.rhodecode_db_repo.clone_uri_hidden}</a> |
|
13 | ||
|
15 | % else: | |
|
16 | ${_('This repository does not have any pull url set.')} | |
|
17 | <a href="${h.route_path('edit_repo', repo_name=c.rhodecode_db_repo.repo_name)}">${_('Set remote url.')}</a> | |
|
18 | % endif | |
|
19 | </td> | |
|
20 | </tr> | |
|
21 | % if c.rhodecode_db_repo.clone_uri: | |
|
22 | <tr> | |
|
23 | <td></td> | |
|
24 | <td> | |
|
14 | 25 | <p> |
|
15 | 26 | ${_('Pull can be automated by such api call. Can be called periodically in crontab etc.')} |
|
16 | 27 | <br/> |
@@ -18,7 +29,11 b'' | |||
|
18 | 29 | ${h.api_call_example(method='pull', args={"repoid": c.rhodecode_db_repo.repo_name})} |
|
19 | 30 | </code> |
|
20 | 31 | </p> |
|
21 | ||
|
32 | </td> | |
|
33 | </tr> | |
|
34 | <tr> | |
|
35 | <td></td> | |
|
36 | <td> | |
|
22 | 37 | ${h.secure_form(h.route_path('edit_repo_remote_pull', repo_name=c.repo_name), request=request)} |
|
23 | 38 | <div class="form"> |
|
24 | 39 | <div class="fields"> |
@@ -26,15 +41,28 b'' | |||
|
26 | 41 | </div> |
|
27 | 42 | </div> |
|
28 | 43 | ${h.end_form()} |
|
29 | %else: | |
|
44 | </td> | |
|
45 | </tr> | |
|
46 | % endif | |
|
30 | 47 | |
|
31 | ${_('This repository does not have any remote mirror url set.')} | |
|
48 | <tr> | |
|
49 | <td><div style="min-width: 80px"><strong>${_('Push url')}</strong></div></td> | |
|
50 | <td> | |
|
51 | % if c.rhodecode_db_repo.push_uri: | |
|
52 | <a href="${c.rhodecode_db_repo.push_uri_hidden}">${c.rhodecode_db_repo.push_uri_hidden}</a> | |
|
53 | % else: | |
|
54 | ${_('This repository does not have any push url set.')} | |
|
32 | 55 | <a href="${h.route_path('edit_repo', repo_name=c.rhodecode_db_repo.repo_name)}">${_('Set remote url.')}</a> |
|
33 | <br/> | |
|
34 | <br/> | |
|
35 | <button class="btn disabled" type="submit" disabled="disabled"> | |
|
36 | ${_('Pull changes from remote location')} | |
|
37 | </button> | |
|
38 | 56 | %endif |
|
57 | </td> | |
|
58 | </tr> | |
|
59 | <tr> | |
|
60 | <td></td> | |
|
61 | <td> | |
|
62 | ${_('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')|n} | |
|
63 | </td> | |
|
64 | </tr> | |
|
65 | ||
|
66 | </table> | |
|
39 | 67 | </div> |
|
40 | 68 | </div> |
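The template above renders `h.api_call_example(method='pull', ...)` so pulls can be automated, for example from crontab. A hedged sketch of the JSON payload such an API call would carry; the field names, token, and call id are placeholders assumed for illustration, not values from the document:

```python
import json

def build_api_payload(method, args, auth_token='<token>', call_id=1):
    # shape of a JSON-RPC style call; field names are an assumption here
    return json.dumps({
        'id': call_id,
        'auth_token': auth_token,
        'method': method,
        'args': args,
    })

payload = build_api_payload('pull', {'repoid': 'my-repo'})
```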
@@ -46,9 +46,10 b'' | |||
|
46 | 46 | </div> |
|
47 | 47 | |
|
48 | 48 | % if c.rhodecode_db_repo.repo_type != 'svn': |
|
49 | <% sync_link = h.literal(h.link_to('remote sync', h.route_path('edit_repo_remote', repo_name=c.repo_name))) %> | |
|
49 | 50 | <div class="field"> |
|
50 | 51 | <div class="label"> |
|
51 | <label for="clone_uri">${_('Remote uri')}:</label> | |
|
52 | <label for="clone_uri">${_('Remote pull uri')}:</label> | |
|
52 | 53 | </div> |
|
53 | 54 | <div class="input"> |
|
54 | 55 | %if c.rhodecode_db_repo.clone_uri: |
@@ -83,14 +84,56 b'' | |||
|
83 | 84 | ${h.hidden('repo_clone_uri_change', 'NEW')} |
|
84 | 85 | %endif |
|
85 | 86 | <p id="alter_clone_uri_help_block" class="help-block"> |
|
86 | <% pull_link = h.literal(h.link_to('remote sync', h.route_path('edit_repo_remote', repo_name=c.repo_name))) %> | |
|
87 | ${_('http[s] url where from repository was imported, this field can used for doing {pull_link}.').format(pull_link=pull_link)|n} <br/> | |
|
87 | ${_('http[s] url where from repository was imported. This field can used for doing {sync_link}.').format(sync_link=sync_link)|n} <br/> | |
|
88 | ${_('This field is stored encrypted inside Database, a format of http://user:password@server.com/repo_name can be used and will be hidden from display.')} | |
|
89 | </p> | |
|
90 | </div> | |
|
91 | </div> | |
|
92 | <div class="field"> | |
|
93 | <div class="label"> | |
|
94 | <label for="push_uri">${_('Remote push uri')}:</label> | |
|
95 | </div> | |
|
96 | <div class="input"> | |
|
97 | %if c.rhodecode_db_repo.push_uri: | |
|
98 | ## display, if we don't have any errors | |
|
99 | % if not c.form['repo_push_uri'].error: | |
|
100 | <div id="push_uri_hidden" class='text-as-placeholder'> | |
|
101 | <span id="push_uri_hidden_value">${c.rhodecode_db_repo.push_uri_hidden}</span> | |
|
102 | <span class="link" id="edit_push_uri"><i class="icon-edit"></i>${_('edit')}</span> | |
|
103 | </div> | |
|
104 | % endif | |
|
105 | ||
|
106 | ## alter field | |
|
107 | <div id="alter_push_uri" style="${'' if c.form['repo_push_uri'].error else 'display: none'}"> | |
|
108 | ${c.form['repo_push_uri'].render(css_class='medium', oid='push_uri', placeholder=_('enter new value, or leave empty to remove'))|n} | |
|
109 | ${c.form.render_error(request, c.form['repo_push_uri'])|n} | |
|
110 | % if c.form['repo_push_uri'].error: | |
|
111 | ## we got an error from form submit, meaning we modify the url | |
|
112 | ${h.hidden('repo_push_uri_change', 'MOD')} | |
|
113 | % else: | |
|
114 | ${h.hidden('repo_push_uri_change', 'OLD')} | |
|
115 | % endif | |
|
116 | ||
|
117 | % if not c.form['repo_push_uri'].error: | |
|
118 | <span class="link" id="cancel_edit_push_uri">${_('cancel')}</span> | |
|
119 | % endif | |
|
120 | ||
|
121 | </div> | |
|
122 | %else: | |
|
123 | ## not set yet, display form to set it | |
|
124 | ${c.form['repo_push_uri'].render(css_class='medium', oid='push_uri')|n} | |
|
125 | ${c.form.render_error(request, c.form['repo_push_uri'])|n} | |
|
126 | ${h.hidden('repo_push_uri_change', 'NEW')} | |
|
127 | %endif | |
|
128 | <p id="alter_push_uri_help_block" class="help-block"> | |
|
129 | ${_('http[s] url to sync data back. This field can be used for doing {sync_link}.').format(sync_link=sync_link)|n} <br/> | |
|
88 | 130 | ${_('This field is stored encrypted inside Database, a format of http://user:password@server.com/repo_name can be used and will be hidden from display.')} |
|
89 | 131 | </p> |
|
90 | 132 | </div> |
|
91 | 133 | </div> |
|
92 | 134 | % else: |
|
93 | 135 | ${h.hidden('repo_clone_uri', '')} |
|
136 | ${h.hidden('repo_push_uri', '')} | |
|
94 | 137 | % endif |
|
95 | 138 | |
|
96 | 139 | <div class="field"> |
@@ -208,15 +251,10 b'' | |||
|
208 | 251 | |
|
209 | 252 | <script> |
|
210 | 253 | $(document).ready(function(){ |
|
211 | var cloneUrl = function() { | |
|
212 | var alterButton = $('#alter_clone_uri'); | |
|
213 | var editButton = $('#edit_clone_uri'); | |
|
214 | var cancelEditButton = $('#cancel_edit_clone_uri'); | |
|
215 | var hiddenUrl = $('#clone_uri_hidden'); | |
|
216 | var hiddenUrlValue = $('#clone_uri_hidden_value'); | |
|
217 | var input = $('#clone_uri'); | |
|
218 | var helpBlock = $('#alter_clone_uri_help_block'); | |
|
219 | var changedFlag = $('#repo_clone_uri_change'); | |
|
254 | var cloneUrl = function ( | |
|
255 | alterButton, editButton, cancelEditButton, | |
|
256 | hiddenUrl, hiddenUrlValue, input, helpBlock, changedFlag) { | |
|
257 | ||
|
220 | 258 | var originalText = helpBlock.html(); |
|
221 | 259 | var obfuscatedUrl = hiddenUrlValue.html(); |
|
222 | 260 | |
@@ -255,7 +293,32 b'' | |||
|
255 | 293 | |
|
256 | 294 | setInitialState(); |
|
257 | 295 | initEvents(); |
|
258 | }() | |
|
296 | }; | |
|
297 | ||
|
298 | ||
|
299 | var alterButton = $('#alter_clone_uri'); | |
|
300 | var editButton = $('#edit_clone_uri'); | |
|
301 | var cancelEditButton = $('#cancel_edit_clone_uri'); | |
|
302 | var hiddenUrl = $('#clone_uri_hidden'); | |
|
303 | var hiddenUrlValue = $('#clone_uri_hidden_value'); | |
|
304 | var input = $('#clone_uri'); | |
|
305 | var helpBlock = $('#alter_clone_uri_help_block'); | |
|
306 | var changedFlag = $('#repo_clone_uri_change'); | |
|
307 | cloneUrl( | |
|
308 | alterButton, editButton, cancelEditButton, hiddenUrl, | |
|
309 | hiddenUrlValue, input, helpBlock, changedFlag); | |
|
310 | ||
|
311 | var alterButton = $('#alter_push_uri'); | |
|
312 | var editButton = $('#edit_push_uri'); | |
|
313 | var cancelEditButton = $('#cancel_edit_push_uri'); | |
|
314 | var hiddenUrl = $('#push_uri_hidden'); | |
|
315 | var hiddenUrlValue = $('#push_uri_hidden_value'); | |
|
316 | var input = $('#push_uri'); | |
|
317 | var helpBlock = $('#alter_push_uri_help_block'); | |
|
318 | var changedFlag = $('#repo_push_uri_change'); | |
|
319 | cloneUrl( | |
|
320 | alterButton, editButton, cancelEditButton, hiddenUrl, | |
|
321 | hiddenUrlValue, input, helpBlock, changedFlag); | |
|
259 | 322 | |
|
260 | 323 | selectMyGroup = function(element) { |
|
261 | 324 | $("#repo_group").val($(element).data('personalGroupId')).trigger("change"); |
@@ -70,6 +70,13 b' autoRefresh = function(value) {' | |||
|
70 | 70 | var loadData = function() { |
|
71 | 71 | currentRequest = $.get(url) |
|
72 | 72 | .done(function(data) { |
|
73 | ||
|
74 | if(data.indexOf('id="processTimeStamp"') === -1){ | |
|
75 | clearInterval(intervalID); | |
|
76 | $('#procList').html('ERROR LOADING DATA. PLEASE REFRESH THE PAGE'); | |
|
77 | return | |
|
78 | } | |
|
79 | ||
|
73 | 80 | currentRequest = null; |
|
74 | 81 | $('#procList').html(data); |
|
75 | 82 | timeagoActivate(); |
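The guard added in this hunk stops the auto-refresh timer when the AJAX payload no longer contains the `id="processTimeStamp"` marker (for example, when a login page comes back after the session expires) instead of rendering whatever arrived. The same validate-before-render idea in Python, as a sketch — the function and callback names here are illustrative, not part of the RhodeCode codebase:

```python
def handle_poll_response(data, marker='id="processTimeStamp"',
                         on_ok=None, on_error=None):
    """Validate a polled HTML fragment before rendering it.

    If the marker identifying a genuine process-list fragment is
    missing, report an error and return False so the caller can
    clear its refresh timer; otherwise hand the data to the renderer.
    """
    if marker not in data:
        if on_error:
            on_error('ERROR LOADING DATA. PLEASE REFRESH THE PAGE')
        return False
    if on_ok:
        on_ok(data)
    return True
```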
@@ -2,12 +2,11 b'' | |||
|
2 | 2 | <table id="procList"> |
|
3 | 3 | <% |
|
4 | 4 | def get_name(proc): |
|
5 | cmd = ' '.join(proc.cmdline()) | |
|
6 | if 'vcsserver.ini' in cmd: | |
|
5 | if 'vcsserver.ini' in proc.cmd: | |
|
7 | 6 | return 'VCSServer' |
|
8 | elif 'rhodecode.ini' in cmd: | |
|
7 | elif 'rhodecode.ini' in proc.cmd: | |
|
9 | 8 | return 'RhodeCode' |
|
10 | return proc.name() | |
|
9 | return proc.name | |
|
11 | 10 | %> |
|
12 | 11 | <tr> |
|
13 | 12 | <td colspan="8"> |
@@ -15,10 +14,8 b'' | |||
|
15 | 14 | </td> |
|
16 | 15 | </tr> |
|
17 | 16 | % for proc in c.gunicorn_processes: |
|
18 | <% mem = proc.memory_info()%> | |
|
19 | <% children = proc.children(recursive=True) %> | |
|
20 | % if children: | |
|
21 | 17 | |
|
18 | % if proc.children: | |
|
22 | 19 | <tr> |
|
23 | 20 | <td> |
|
24 | 21 | <code> |
@@ -28,18 +25,18 b'' | |||
|
28 | 25 | <td> |
|
29 | 26 | <a href="#showCommand" onclick="$('#pid'+${proc.pid}).toggle();return false"> command </a> |
|
30 | 27 | <code id="pid${proc.pid}" style="display: none"> |
|
31 | ${' '.join(proc.cmdline())} | |
|
28 | ${proc.cmd} | |
|
32 | 29 | </code> |
|
33 | 30 | </td> |
|
34 | 31 | <td></td> |
|
35 | 32 | <td> |
|
36 | RSS:${h.format_byte_size_binary(mem.rss)} | |
|
33 | RSS:${h.format_byte_size_binary(proc.mem_rss)} | |
|
37 | 34 | </td> |
|
38 | 35 | <td> |
|
39 | VMS:${h.format_byte_size_binary(mem.vms)} | |
|
36 | VMS:${h.format_byte_size_binary(proc.mem_vms)} | |
|
40 | 37 | </td> |
|
41 | 38 | <td> |
|
42 | AGE: ${h.age_component(h.time_to_utcdatetime(proc.create_time()))} | |
|
39 | AGE: ${h.age_component(h.time_to_utcdatetime(proc.create_time))} | |
|
43 | 40 | </td> |
|
44 | 41 | <td> |
|
45 | 42 | MASTER |
@@ -49,8 +46,7 b'' | |||
|
49 | 46 | </td> |
|
50 | 47 | </tr> |
|
51 | 48 | <% mem_sum = 0 %> |
|
52 | % for proc_child in children: | |
|
53 | <% mem = proc_child.memory_info()%> | |
|
49 | % for proc_child in proc.children: | |
|
54 | 50 | <tr> |
|
55 | 51 | <td> |
|
56 | 52 | <code> |
@@ -60,21 +56,21 b'' | |||
|
60 | 56 | <td> |
|
61 | 57 | <a href="#showCommand" onclick="$('#pid'+${proc_child.pid}).toggle();return false"> command </a> |
|
62 | 58 | <code id="pid${proc_child.pid}" style="display: none"> |
|
63 | ${' '.join(proc_child.cmdline())} | |
|
59 | ${proc_child.cmd} | |
|
64 | 60 | </code> |
|
65 | 61 | </td> |
|
66 | 62 | <td> |
|
67 | CPU: ${proc_child.cpu_percent()} % | |
|
63 | CPU: ${proc_child.cpu_percent} % | |
|
68 | 64 | </td> |
|
69 | 65 | <td> |
|
70 | RSS:${h.format_byte_size_binary(mem.rss)} | |
|
71 | <% mem_sum += mem.rss %> | |
|
66 | RSS:${h.format_byte_size_binary(proc_child.mem_rss)} | |
|
67 | <% mem_sum += proc_child.mem_rss %> | |
|
72 | 68 | </td> |
|
73 | 69 | <td> |
|
74 | VMS:${h.format_byte_size_binary(mem.vms)} | |
|
70 | VMS:${h.format_byte_size_binary(proc_child.mem_vms)} | |
|
75 | 71 | </td> |
|
76 | 72 | <td> |
|
77 | AGE: ${h.age_component(h.time_to_utcdatetime(proc_child.create_time()))} | |
|
73 | AGE: ${h.age_component(h.time_to_utcdatetime(proc_child.create_time))} | |
|
78 | 74 | </td> |
|
79 | 75 | <td> |
|
80 | 76 | <a href="#restartProcess" onclick="restart(this, ${proc_child.pid});return false"> |
@@ -84,7 +80,7 b'' | |||
|
84 | 80 | </tr> |
|
85 | 81 | % endfor |
|
86 | 82 | <tr> |
|
87 | <td colspan="2"><code>| total processes: ${len(children)}</code></td> | |
|
83 | <td colspan="2"><code>| total processes: ${len(proc.children)}</code></td> | |
|
88 | 84 | <td></td> |
|
89 | 85 | <td><strong>RSS:${h.format_byte_size_binary(mem_sum)}</strong></td> |
|
90 | 86 | <td></td> |
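After this change the template reads only plain attributes (`proc.cmd`, `proc.mem_rss`, `proc.mem_vms`, `proc.children`) instead of calling `psutil` methods (`cmdline()`, `memory_info()`, `children(recursive=True)`) at render time. A hedged sketch of the snapshot object such a change implies on the server side — the class name and exact fields are assumptions, not the actual RhodeCode implementation; `proc` only needs the psutil-like methods used below, so a stub works as well as a real `psutil.Process`:

```python
class ProcessSnapshot(object):
    """Plain-value view of a psutil-like process for template rendering."""

    def __init__(self, proc, with_children=True):
        mem = proc.memory_info()
        self.pid = proc.pid
        self.cmd = ' '.join(proc.cmdline())
        self.mem_rss = mem.rss          # resident set size, bytes
        self.mem_vms = mem.vms          # virtual memory size, bytes
        self.cpu_percent = proc.cpu_percent()
        self.create_time = proc.create_time()
        # snapshot children too, so the template never touches psutil
        self.children = []
        if with_children:
            self.children = [ProcessSnapshot(c, with_children=False)
                             for c in proc.children(recursive=True)]
```

Taking the snapshot once in the view also avoids a second pitfall the old template had: every `${...}` call hit the live process, so a worker exiting mid-render could raise.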
@@ -100,7 +100,7 b'' | |||
|
100 | 100 | ${_user_group.users_group_name} |
|
101 | 101 | </a> |
|
102 | 102 | %else: |
|
103 | ${_user_group.users_group_name} | |
|
103 | ${h.link_to_group(_user_group.users_group_name)} | |
|
104 | 104 | %endif |
|
105 | 105 | </td> |
|
106 | 106 | <td class="td-action"> |
@@ -358,6 +358,8 b'' | |||
|
358 | 358 | % if c.rhodecode_user.personal_repo_group: |
|
359 | 359 |
|
|
360 | 360 | % endif |
|
361 | <li>${h.link_to(_(u'Pull Requests'), h.route_path('my_account_pullrequests'))}</li> | |
|
362 | ||
|
361 | 363 | <li class="logout"> |
|
362 | 364 | ${h.secure_form(h.route_path('logout'), request=request)} |
|
363 | 365 | ${h.submit('log_out', _(u'Sign Out'),class_="btn btn-primary")} |
@@ -312,6 +312,20 b'' | |||
|
312 | 312 | </div> |
|
313 | 313 | % endif |
|
314 | 314 | |
|
315 | % if display_globals or repo_type in ['hg', 'git', 'svn']: | |
|
316 | <div class="panel panel-default"> | |
|
317 | <div class="panel-heading" id="vcs-diff-cache-options"> | |
|
318 | <h3 class="panel-title">${_('Diff cache')}<a class="permalink" href="#vcs-diff-cache-options"> ¶</a></h3> | |
|
319 | </div> | |
|
320 | <div class="panel-body"> | |
|
321 | <div class="checkbox"> | |
|
322 | ${h.checkbox('rhodecode_diff_cache' + suffix, 'True', **kwargs)} | |
|
323 | <label for="rhodecode_diff_cache${suffix}">${_('Enable caching of diffs for pull requests and commits')}</label> | |
|
324 | </div> | |
|
325 | </div> | |
|
326 | </div> | |
|
327 | % endif | |
|
328 | ||
|
315 | 329 | % if display_globals or repo_type in ['hg',]: |
|
316 | 330 | <div class="panel panel-default"> |
|
317 | 331 | <div class="panel-heading" id="vcs-pull-requests-options"> |
@@ -213,7 +213,7 b'' | |||
|
213 | 213 | <%namespace name="cbdiffs" file="/codeblocks/diffs.mako"/> |
|
214 | 214 | ${cbdiffs.render_diffset_menu()} |
|
215 | 215 | ${cbdiffs.render_diffset( |
|
216 | c.changes[c.commit.raw_id], commit=c.commit, use_comments=True)} | |
|
216 | c.changes[c.commit.raw_id], commit=c.commit, use_comments=True, inline_comments=c.inline_comments)} | |
|
217 | 217 | </div> |
|
218 | 218 | |
|
219 | 219 | ## template for inline comment form |
@@ -44,13 +44,15 b" return '%s_%s_%i' % (h.safeid(filename)," | |||
|
44 | 44 | |
|
45 | 45 | # special file-comments that were deleted in previous versions |
|
46 | 46 | # it's used for showing outdated comments for deleted files in a PR |
|
47 | deleted_files_comments=None | |
|
47 | deleted_files_comments=None, | |
|
48 | ||
|
49 | # for cache purpose | |
|
50 | inline_comments=None | |
|
48 | 51 | |
|
49 | 52 | )"> |
|
50 | ||
|
51 | 53 | %if use_comments: |
|
52 | 54 | <div id="cb-comments-inline-container-template" class="js-template"> |
|
53 | ${inline_comments_container([])} | |
|
55 | ${inline_comments_container([], inline_comments)} | |
|
54 | 56 | </div> |
|
55 | 57 | <div class="js-template" id="cb-comment-inline-form-template"> |
|
56 | 58 | <div class="comment-inline-form ac"> |
@@ -132,7 +134,9 b' collapse_all = len(diffset.files) > coll' | |||
|
132 | 134 | </h2> |
|
133 | 135 | </div> |
|
134 | 136 | |
|
135 | %if not diffset.files: | |
|
137 | %if diffset.has_hidden_changes: | |
|
138 | <p class="empty_data">${_('Some changes may be hidden')}</p> | |
|
139 | %elif not diffset.files: | |
|
136 | 140 | <p class="empty_data">${_('No files')}</p> |
|
137 | 141 | %endif |
|
138 | 142 | |
@@ -209,9 +213,9 b' collapse_all = len(diffset.files) > coll' | |||
|
209 | 213 | </td> |
|
210 | 214 | </tr> |
|
211 | 215 | %if c.diffmode == 'unified': |
|
212 | ${render_hunk_lines_unified(hunk, use_comments=use_comments)} | |
|
216 | ${render_hunk_lines_unified(hunk, use_comments=use_comments, inline_comments=inline_comments)} | |
|
213 | 217 | %elif c.diffmode == 'sideside': |
|
214 | ${render_hunk_lines_sideside(hunk, use_comments=use_comments)} | |
|
218 | ${render_hunk_lines_sideside(hunk, use_comments=use_comments, inline_comments=inline_comments)} | |
|
215 | 219 | %else: |
|
216 | 220 | <tr class="cb-line"> |
|
217 | 221 | <td>unknown diff mode</td> |
@@ -228,7 +232,7 b' collapse_all = len(diffset.files) > coll' | |||
|
228 | 232 | <td class="cb-lineno cb-context"></td> |
|
229 | 233 | <td class="cb-lineno cb-context"></td> |
|
230 | 234 | <td class="cb-content cb-context"> |
|
231 | ${inline_comments_container(comments)} | |
|
235 | ${inline_comments_container(comments, inline_comments)} | |
|
232 | 236 | </td> |
|
233 | 237 | </tr> |
|
234 | 238 | %elif c.diffmode == 'sideside': |
@@ -237,7 +241,7 b' collapse_all = len(diffset.files) > coll' | |||
|
237 | 241 | <td class="cb-lineno cb-context"></td> |
|
238 | 242 | <td class="cb-content cb-context"> |
|
239 | 243 | % if lineno.startswith('o'): |
|
240 | ${inline_comments_container(comments)} | |
|
244 | ${inline_comments_container(comments, inline_comments)} | |
|
241 | 245 | % endif |
|
242 | 246 | </td> |
|
243 | 247 | |
@@ -245,7 +249,7 b' collapse_all = len(diffset.files) > coll' | |||
|
245 | 249 | <td class="cb-lineno cb-context"></td> |
|
246 | 250 | <td class="cb-content cb-context"> |
|
247 | 251 | % if lineno.startswith('n'): |
|
248 | ${inline_comments_container(comments)} | |
|
252 | ${inline_comments_container(comments, inline_comments)} | |
|
249 | 253 | % endif |
|
250 | 254 | </td> |
|
251 | 255 | </tr> |
@@ -296,7 +300,7 b' collapse_all = len(diffset.files) > coll' | |||
|
296 | 300 | <td class="cb-lineno cb-context"></td> |
|
297 | 301 | <td class="cb-lineno cb-context"></td> |
|
298 | 302 | <td class="cb-content cb-context"> |
|
299 | ${inline_comments_container(comments_dict['comments'])} | |
|
303 | ${inline_comments_container(comments_dict['comments'], inline_comments)} | |
|
300 | 304 | </td> |
|
301 | 305 | </tr> |
|
302 | 306 | %elif c.diffmode == 'sideside': |
@@ -308,7 +312,7 b' collapse_all = len(diffset.files) > coll' | |||
|
308 | 312 | <td class="cb-data cb-context"></td> |
|
309 | 313 | <td class="cb-lineno cb-context"></td> |
|
310 | 314 | <td class="cb-content cb-context"> |
|
311 | ${inline_comments_container(comments_dict['comments'])} | |
|
315 | ${inline_comments_container(comments_dict['comments'], inline_comments)} | |
|
312 | 316 | </td> |
|
313 | 317 | </tr> |
|
314 | 318 | %endif |
@@ -482,12 +486,11 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
482 | 486 | </%def> |
|
483 | 487 | |
|
484 | 488 | |
|
485 | <%def name="inline_comments_container(comments)"> | |
|
489 | <%def name="inline_comments_container(comments, inline_comments)"> | |
|
486 | 490 | <div class="inline-comments"> |
|
487 | 491 | %for comment in comments: |
|
488 | 492 | ${commentblock.comment_block(comment, inline=True)} |
|
489 | 493 | %endfor |
|
490 | ||
|
491 | 494 | % if comments and comments[-1].outdated: |
|
492 | 495 | <span class="btn btn-secondary cb-comment-add-button comment-outdated}" |
|
493 | 496 | style="display: none;}"> |
@@ -503,8 +506,23 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
503 | 506 | </div> |
|
504 | 507 | </%def> |
|
505 | 508 | |
|
509 | <%! | |
|
510 | def get_comments_for(comments, filename, line_version, line_number): | |
|
511 | if hasattr(filename, 'unicode_path'): | |
|
512 | filename = filename.unicode_path | |
|
506 | 513 | |
|
507 | <%def name="render_hunk_lines_sideside(hunk, use_comments=False)"> | |
|
514 | if not isinstance(filename, basestring): | |
|
515 | return None | |
|
516 | ||
|
517 | line_key = '{}{}'.format(line_version, line_number) | |
|
518 | if comments and filename in comments: | |
|
519 | file_comments = comments[filename] | |
|
520 | if line_key in file_comments: | |
|
521 | return file_comments[line_key] | |
|
522 | %> | |
|
523 | ||
|
524 | <%def name="render_hunk_lines_sideside(hunk, use_comments=False, inline_comments=None)"> | |
|
525 | ||
|
508 | 526 | %for i, line in enumerate(hunk.sideside): |
|
509 | 527 | <% |
|
510 | 528 | old_line_anchor, new_line_anchor = None, None |
@@ -516,16 +534,25 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
516 | 534 | |
|
517 | 535 | <tr class="cb-line"> |
|
518 | 536 | <td class="cb-data ${action_class(line.original.action)}" |
|
519 | data-line-number="${line.original.lineno}" | |
|
537 | data-line-no="${line.original.lineno}" | |
|
520 | 538 | > |
|
521 | 539 | <div> |
|
522 | %if line.original.comments: | |
|
523 | <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
540 | <% loc = None %> | |
|
541 | %if line.original.get_comment_args: | |
|
542 | <% loc = get_comments_for(inline_comments, *line.original.get_comment_args) %> | |
|
543 | %endif | |
|
544 | %if loc: | |
|
545 | <% has_outdated = any([x.outdated for x in loc]) %> | |
|
546 | % if has_outdated: | |
|
547 | <i title="${_('comments including outdated')}:${len(loc)}" class="icon-comment_toggle" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
548 | % else: | |
|
549 | <i title="${_('comments')}: ${len(loc)}" class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
550 | % endif | |
|
524 | 551 | %endif |
|
525 | 552 | </div> |
|
526 | 553 | </td> |
|
527 | 554 | <td class="cb-lineno ${action_class(line.original.action)}" |
|
528 | data-line-number="${line.original.lineno}" | |
|
555 | data-line-no="${line.original.lineno}" | |
|
529 | 556 | %if old_line_anchor: |
|
530 | 557 | id="${old_line_anchor}" |
|
531 | 558 | %endif |
@@ -535,27 +562,40 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
535 | 562 | %endif |
|
536 | 563 | </td> |
|
537 | 564 | <td class="cb-content ${action_class(line.original.action)}" |
|
538 | data-line-number="o${line.original.lineno}" | |
|
565 | data-line-no="o${line.original.lineno}" | |
|
539 | 566 | > |
|
540 | 567 | %if use_comments and line.original.lineno: |
|
541 | 568 | ${render_add_comment_button()} |
|
542 | 569 | %endif |
|
543 | 570 | <span class="cb-code">${line.original.action} ${line.original.content or '' | n}</span> |
|
544 | %if use_comments and line.original.lineno and line.original.comments: | |
|
545 | ${inline_comments_container(line.original.comments)} | |
|
571 | ||
|
572 | %if use_comments and line.original.lineno and loc: | |
|
573 | ${inline_comments_container(loc, inline_comments)} | |
|
546 | 574 | %endif |
|
575 | ||
|
547 | 576 | </td> |
|
548 | 577 | <td class="cb-data ${action_class(line.modified.action)}" |
|
549 | data-line-number="${line.modified.lineno}" | |
|
578 | data-line-no="${line.modified.lineno}" | |
|
550 | 579 | > |
|
551 | 580 | <div> |
|
552 | %if line.modified.comments: | |
|
553 | <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
581 | ||
|
582 | %if line.modified.get_comment_args: | |
|
583 | <% lmc = get_comments_for(inline_comments, *line.modified.get_comment_args) %> | |
|
584 | %else: | |
|
585 | <% lmc = None%> | |
|
586 | %endif | |
|
587 | %if lmc: | |
|
588 | <% has_outdated = any([x.outdated for x in lmc]) %> | |
|
589 | % if has_outdated: | |
|
590 | <i title="${_('comments including outdated')}:${len(lmc)}" class="icon-comment_toggle" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
591 | % else: | |
|
592 | <i title="${_('comments')}: ${len(lmc)}" class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
593 | % endif | |
|
554 | 594 | %endif |
|
555 | 595 | </div> |
|
556 | 596 | </td> |
|
557 | 597 | <td class="cb-lineno ${action_class(line.modified.action)}" |
|
558 | data-line-number="${line.modified.lineno}" | |
|
598 | data-line-no="${line.modified.lineno}" | |
|
559 | 599 | %if new_line_anchor: |
|
560 | 600 | id="${new_line_anchor}" |
|
561 | 601 | %endif |
@@ -565,14 +605,14 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
565 | 605 | %endif |
|
566 | 606 | </td> |
|
567 | 607 | <td class="cb-content ${action_class(line.modified.action)}" |
|
568 | data-line-number="n${line.modified.lineno}" | |
|
608 | data-line-no="n${line.modified.lineno}" | |
|
569 | 609 | > |
|
570 | 610 | %if use_comments and line.modified.lineno: |
|
571 | 611 | ${render_add_comment_button()} |
|
572 | 612 | %endif |
|
573 | 613 | <span class="cb-code">${line.modified.action} ${line.modified.content or '' | n}</span> |
|
574 | %if use_comments and line.modified.lineno and line.modified.comments: | |
|
575 | ${inline_comments_container(line.modified.comments)} | |
|
614 | %if use_comments and line.modified.lineno and lmc: | |
|
615 | ${inline_comments_container(lmc, inline_comments)} | |
|
576 | 616 | %endif |
|
577 | 617 | </td> |
|
578 | 618 | </tr> |
@@ -580,8 +620,8 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
580 | 620 | </%def> |
|
581 | 621 | |
|
582 | 622 | |
|
583 | <%def name="render_hunk_lines_unified(hunk, use_comments=False)"> | |
|
584 | %for old_line_no, new_line_no, action, content, comments in hunk.unified: | |
|
623 | <%def name="render_hunk_lines_unified(hunk, use_comments=False, inline_comments=None)"> | |
|
624 | %for old_line_no, new_line_no, action, content, comments_args in hunk.unified: | |
|
585 | 625 | <% |
|
586 | 626 | old_line_anchor, new_line_anchor = None, None |
|
587 | 627 | if old_line_no: |
@@ -592,13 +632,25 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
592 | 632 | <tr class="cb-line"> |
|
593 | 633 | <td class="cb-data ${action_class(action)}"> |
|
594 | 634 | <div> |
|
635 | ||
|
636 | %if comments_args: | |
|
637 | <% comments = get_comments_for(inline_comments, *comments_args) %> | |
|
638 | %else: | |
|
639 | <% comments = None%> | |
|
640 | %endif | |
|
641 | ||
|
595 | 642 | %if comments: |
|
596 | <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
643 | <% has_outdated = any([x.outdated for x in comments]) %> | |
|
644 | % if has_outdated: | |
|
645 | <i title="${_('comments including outdated')}:${len(comments)}" class="icon-comment_toggle" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
646 | % else: | |
|
647 | <i title="${_('comments')}: ${len(comments)}" class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i> | |
|
648 | % endif | |
|
597 | 649 | %endif |
|
598 | 650 | </div> |
|
599 | 651 | </td> |
|
600 | 652 | <td class="cb-lineno ${action_class(action)}" |
|
601 | data-line-number="${old_line_no}" | |
|
653 | data-line-no="${old_line_no}" | |
|
602 | 654 | %if old_line_anchor: |
|
603 | 655 | id="${old_line_anchor}" |
|
604 | 656 | %endif |
@@ -608,7 +660,7 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
608 | 660 | %endif |
|
609 | 661 | </td> |
|
610 | 662 | <td class="cb-lineno ${action_class(action)}" |
|
611 | data-line-number="${new_line_no}" | |
|
663 | data-line-no="${new_line_no}" | |
|
612 | 664 | %if new_line_anchor: |
|
613 | 665 | id="${new_line_anchor}" |
|
614 | 666 | %endif |
@@ -618,14 +670,14 b' from rhodecode.lib.diffs import NEW_FILE' | |||
|
618 | 670 | %endif |
|
619 | 671 | </td> |
|
620 | 672 | <td class="cb-content ${action_class(action)}" |
|
621 | data-line-number="${new_line_no and 'n' or 'o'}${new_line_no or old_line_no}" | |
|
673 | data-line-no="${new_line_no and 'n' or 'o'}${new_line_no or old_line_no}" | |
|
622 | 674 | > |
|
623 | 675 | %if use_comments: |
|
624 | 676 | ${render_add_comment_button()} |
|
625 | 677 | %endif |
|
626 | 678 | <span class="cb-code">${action} ${content or '' | n}</span> |
|
627 | 679 | %if use_comments and comments: |
|
628 | ${inline_comments_container(comments)} | |
|
680 | ${inline_comments_container(comments, inline_comments)} | |
|
629 | 681 | %endif |
|
630 | 682 | </td> |
|
631 | 683 | </tr> |
@@ -282,10 +282,10 b'' | |||
|
282 | 282 | ${base.gravatar_with_user(username, 16)} |
|
283 | 283 | </%def> |
|
284 | 284 | |
|
285 |
<%def name="user_group_name(user_group_ |
|
|
285 | <%def name="user_group_name(user_group_name)"> | |
|
286 | 286 | <div> |
|
287 | <a href="${h.route_path('edit_user_group', user_group_id=user_group_id)}"> | |
|
288 | <i class="icon-group" title="${_('User group')}"></i> ${user_group_name}</a> | |
|
287 | <i class="icon-group" title="${_('User group')}"></i> | |
|
288 | ${h.link_to_group(user_group_name)} | |
|
289 | 289 | </div> |
|
290 | 290 | </%def> |
|
291 | 291 |
@@ -17,6 +17,10 b'' | |||
|
17 | 17 | tag: ${tag} <br/> |
|
18 | 18 | % endfor |
|
19 | 19 | |
|
20 | % if has_hidden_changes: | |
|
21 | Has hidden changes<br/> | |
|
22 | % endif | |
|
23 | ||
|
20 | 24 | commit: <a href="${h.route_url('repo_commit', repo_name=c.rhodecode_db_repo.repo_name, commit_id=commit.raw_id)}">${h.show_id(commit)}</a> |
|
21 | 25 | <pre> |
|
22 | 26 | ${h.urlify_commit_message(commit.message)} |
@@ -29,6 +33,6 b' commit: <a href="${h.route_url(\'repo_com' | |||
|
29 | 33 | % endfor |
|
30 | 34 | |
|
31 | 35 | % if feed_include_diff: |
|
32 | ${diff_processor.as_raw()} | |
|
36 | ${c.path_filter.get_raw_patch(diff_processor)} | |
|
33 | 37 | % endif |
|
34 | 38 | </pre> |
@@ -1,20 +1,26 b'' | |||
|
1 | <div class="checkbox"> | |
|
2 | <input tal:define="name name|field.name; | |
|
1 | <span tal:define="name name|field.name; | |
|
3 | 2 |
|
|
4 | 3 |
|
|
5 | 4 |
|
|
6 |
|
|
|
7 | type="checkbox" | |
|
8 | name="${name}" value="${true_val}" | |
|
5 | oid oid|field.oid; | |
|
6 | help_block help_block|field.widget.help_block|''; | |
|
7 | " | |
|
8 | tal:omit-tag=""> | |
|
9 | ||
|
10 | <div class="checkbox"> | |
|
11 | <input type="checkbox" name="${name}" value="${true_val}" | |
|
9 | 12 |
|
|
10 | 13 |
|
|
11 | 14 |
|
|
12 |
|
|
|
13 | ||
|
15 | style style;" | |
|
16 | /> | |
|
17 | <p tal:condition="help_block" class="help-block">${help_block}</p> | |
|
14 | 18 | <label for="${field.oid}"> |
|
15 | 19 | <span tal:condition="hasattr(field, 'schema') and hasattr(field.schema, 'label')" |
|
16 | 20 | tal:replace="field.schema.label" class="checkbox-label" > |
|
17 | 21 | </span> |
|
18 | 22 | |
|
19 | 23 | </label> |
|
20 | </div> No newline at end of file | |
|
24 | </div> | |
|
25 | ||
|
26 | </span> No newline at end of file |
@@ -5,6 +5,8 b'' | |||
|
5 | 5 | name name|field.name; |
|
6 | 6 | style style|field.widget.style; |
|
7 | 7 | help_block help_block|field.widget.help_block|''; |
|
8 | help_block_collapsable_name help_block_collapsable_name|field.widget.help_block_collapsable_name|''; | |
|
9 | help_block_collapsable help_block_collapsable|field.widget.help_block_collapsable|''; | |
|
8 | 10 | codemirror_options codemirror_options|field.widget.codemirror_options|{}; |
|
9 | 11 | codemirror_mode codemirror_mode|field.widget.codemirror_mode|'' |
|
10 | 12 | "> |
@@ -17,6 +19,9 b'' | |||
|
17 | 19 | name="${name}">${cstruct}</textarea> |
|
18 | 20 | |
|
19 | 21 | <p tal:condition="help_block" class="help-block">${help_block}</p> |
|
22 | <span tal:condition="help_block_collapsable" class="help-block pre-formatting"><a href="#showVars" onclick="$('#help_block_${oid}').toggle(); return false">${help_block_collapsable_name}</a> | |
|
23 | <p id="help_block_${oid}" style="display: none">${help_block_collapsable}</p> | |
|
24 | </span> | |
|
20 | 25 | <script type="text/javascript"> |
|
21 | 26 | deform.addCallback( |
|
22 | 27 | '${oid}', |
@@ -8,11 +8,14 b'' | |||
|
8 | 8 | help_block help_block|field.widget.help_block|''; |
|
9 | 9 | " |
|
10 | 10 | tal:omit-tag=""> |
|
11 | ||
|
11 | 12 |
|
|
13 | id="${oid}" | |
|
14 | placeholder="${placeholder}" | |
|
15 | ||
|
12 | 16 |
|
|
13 | 17 |
|
|
14 | placeholder="${placeholder}" | |
|
15 | id="${oid}"/> | |
|
18 | /> | |
|
16 | 19 | |
|
17 | 20 |
|
|
18 | 21 |
|
@@ -23,4 +26,5 b'' | |||
|
23 | 26 |
|
|
24 | 27 |
|
|
25 | 28 |
|
|
29 | ||
|
26 | 30 | </span> No newline at end of file |
@@ -570,7 +570,8 b'' | |||
|
570 | 570 | c.diffset, use_comments=True, |
|
571 | 571 | collapse_when_files_over=30, |
|
572 | 572 | disable_new_comments=not c.allowed_to_comment, |
|
573 | deleted_files_comments=c.deleted_files_comments)} | |
|
573 | deleted_files_comments=c.deleted_files_comments, | |
|
574 | inline_comments=c.inline_comments)} | |
|
574 | 575 | </div> |
|
575 | 576 | % else: |
|
576 | 577 | ## skipping commits we need to clear the view for missing commits |
@@ -22,6 +22,8 b'' | |||
|
22 | 22 | - Mercurial largefiles |
|
23 | 23 | - Git LFS |
|
24 | 24 | </pre> |
|
25 | <br/> | |
|
26 | Requirement error: ${c.repository_requirements_missing.get('error')} | |
|
25 | 27 | </div> |
|
26 | 28 | |
|
27 | 29 | </%def> |
@@ -1,14 +1,14 b'' | |||
|
1 | 1 | <%inherit file="/base/base.mako"/> |
|
2 | 2 | |
|
3 | 3 | <%def name="title()"> |
|
4 | ${c.user.username} | |
|
4 | ${_('User')}: ${c.user.username} | |
|
5 | 5 | %if c.rhodecode_name: |
|
6 | 6 | · ${h.branding(c.rhodecode_name)} |
|
7 | 7 | %endif |
|
8 | 8 | </%def> |
|
9 | 9 | |
|
10 | 10 | <%def name="breadcrumbs_links()"> |
|
11 | ${c.user.username} | |
|
11 | ${_('User')}: ${c.user.username} | |
|
12 | 12 | </%def> |
|
13 | 13 | |
|
14 | 14 | <%def name="menu_bar_nav()"> |
@@ -2,7 +2,7 b'' | |||
|
2 | 2 | |
|
3 | 3 | <div class="panel panel-default user-profile"> |
|
4 | 4 | <div class="panel-heading"> |
|
5 | <h3 class="panel-title">${_('Profile')}</h3> | |
|
5 | <h3 class="panel-title">${_('User Profile')}</h3> | |
|
6 | 6 | %if h.HasPermissionAny('hg.admin')(): |
|
7 | 7 | ${h.link_to(_('Edit'), h.route_path('user_edit', user_id=c.user.user_id), class_='panel-edit')} |
|
8 | 8 | %endif |
@@ -41,7 +41,7 b' log = logging.getLogger(__name__)' | |||
|
41 | 41 | |
|
42 | 42 | __all__ = [ |
|
43 | 43 | 'get_new_dir', 'TestController', |
|
44 | 'link_to', 'ldap_lib_installed', 'clear_all_caches', | |
|
44 | 'link_to', 'clear_all_caches', | |
|
45 | 45 | 'assert_session_flash', 'login_user', 'no_newline_id_generator', |
|
46 | 46 | 'TESTS_TMP_PATH', 'HG_REPO', 'GIT_REPO', 'SVN_REPO', |
|
47 | 47 | 'NEW_HG_REPO', 'NEW_GIT_REPO', |
@@ -95,17 +95,6 b" TEST_HG_REPO_PULL = jn(TESTS_TMP_PATH, '" | |||
|
95 | 95 | TEST_REPO_PREFIX = 'vcs-test' |
|
96 | 96 | |
|
97 | 97 | |
|
98 | # skip ldap tests if LDAP lib is not installed | |
|
99 | ldap_lib_installed = False | |
|
100 | try: | |
|
101 | import ldap | |
|
102 | ldap_lib_installed = True | |
|
103 | except ImportError: | |
|
104 | ldap = None | |
|
105 | # means that python-ldap is not installed | |
|
106 | pass | |
|
107 | ||
|
108 | ||
|
109 | 98 | def clear_all_caches(): |
|
110 | 99 | from beaker.cache import cache_managers |
|
111 | 100 | for _cache in cache_managers.values(): |
@@ -100,7 +100,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
100 | 100 | } |
|
101 | 101 | |
|
102 | 102 | log.debug('EXTERNAL user: \n%s' % formatted_json(user_attrs)) |
|
103 | log.info('user %s authenticated correctly' % user_attrs['username']) | |
|
103 | log.info('user `%s` authenticated correctly' % user_attrs['username']) | |
|
104 | 104 | |
|
105 | 105 | return user_attrs |
|
106 | 106 |
@@ -86,7 +86,8 b' class DBBackend(object):' | |||
|
86 | 86 | _store = os.path.dirname(os.path.abspath(__file__)) |
|
87 | 87 | _type = None |
|
88 | 88 | _base_ini_config = [{'app:main': {'vcs.start_server': 'false', |
|
89 | 'startup.import_repos': 'false'}}] | |
|
89 | 'startup.import_repos': 'false', | |
|
90 | 'is_test': 'False'}}] | |
|
90 | 91 | _db_url = [{'app:main': {'sqlalchemy.db1.url': ''}}] |
|
91 | 92 | _base_db_name = 'rhodecode_test_db_backend' |
|
92 | 93 | |
@@ -152,6 +153,9 b' class DBBackend(object):' | |||
|
152 | 153 | print(self.stderr) |
|
153 | 154 | raise AssertionError('non 0 retcode:{}'.format(self.p.returncode)) |
|
154 | 155 | |
|
156 | def assert_correct_output(self, stdout, version): | |
|
157 | assert 'UPGRADE FOR STEP {} COMPLETED'.format(version) in stdout | |
|
158 | ||
|
155 | 159 | def setup_rhodecode_db(self, ini_params=None, env=None): |
|
156 | 160 | if not ini_params: |
|
157 | 161 | ini_params = self._base_ini_config |
@@ -167,7 +171,7 b' class DBBackend(object):' | |||
|
167 | 171 | if not os.path.isdir(self._repos_git_lfs_store): |
|
168 | 172 | os.makedirs(self._repos_git_lfs_store) |
|
169 | 173 | |
|
170 | self.execute( | |
|
174 | return self.execute( | |
|
171 | 175 | "rc-setup-app {0} --user=marcink " |
|
172 | 176 | "--email=marcin@rhodeocode.com --password={1} " |
|
173 | 177 | "--repos={2} --force-yes".format( |
@@ -183,7 +187,8 b' class DBBackend(object):' | |||
|
183 | 187 | with test_ini as ini_file: |
|
184 | 188 | if not os.path.isdir(self._repos_location): |
|
185 | 189 | os.makedirs(self._repos_location) |
|
186 | self.execute( | |
|
190 | ||
|
191 | return self.execute( | |
|
187 | 192 | "rc-upgrade-db {0} --force-yes".format(ini_file)) |
|
188 | 193 | |
|
189 | 194 | def setup_db(self): |
@@ -226,12 +231,11 b' class SQLiteDBBackend(DBBackend):' | |||
|
226 | 231 | |
|
227 | 232 | def import_dump(self, dumpname): |
|
228 | 233 | dump = os.path.join(self.fixture_store, dumpname) |
|
229 | shutil.copy( | |
|
230 | dump, | |
|
231 | os.path.join(self._basetemp, '{0.db_name}.sqlite'.format(self))) | |
|
234 | target = os.path.join(self._basetemp, '{0.db_name}.sqlite'.format(self)) | |
|
235 | return self.execute('cp -v {} {}'.format(dump, target)) | |
|
232 | 236 | |
|
233 | 237 | def teardown_db(self): |
|
234 | self.execute("rm -rf {}.sqlite".format( | |
|
238 | return self.execute("rm -rf {}.sqlite".format( | |
|
235 | 239 | os.path.join(self._basetemp, self.db_name))) |
|
236 | 240 | |
|
237 | 241 | |
@@ -246,16 +250,16 b' class MySQLDBBackend(DBBackend):' | |||
|
246 | 250 | # mysqldump -uroot -pqweqwe $TEST_DB_NAME |
|
247 | 251 | self._db_url = [{'app:main': { |
|
248 | 252 | 'sqlalchemy.db1.url': self.connection_string}}] |
|
249 | self.execute("mysql -v -u{} -p{} -e 'create database '{}';'".format( | |
|
253 | return self.execute("mysql -v -u{} -p{} -e 'create database '{}';'".format( | |
|
250 | 254 | self.user, self.password, self.db_name)) |
|
251 | 255 | |
|
252 | 256 | def import_dump(self, dumpname): |
|
253 | 257 | dump = os.path.join(self.fixture_store, dumpname) |
|
254 | self.execute("mysql -u{} -p{} {} < {}".format( | |
|
258 | return self.execute("mysql -u{} -p{} {} < {}".format( | |
|
255 | 259 | self.user, self.password, self.db_name, dump)) |
|
256 | 260 | |
|
257 | 261 | def teardown_db(self): |
|
258 | self.execute("mysql -v -u{} -p{} -e 'drop database '{}';'".format( | |
|
262 | return self.execute("mysql -v -u{} -p{} -e 'drop database '{}';'".format( | |
|
259 | 263 | self.user, self.password, self.db_name)) |
|
260 | 264 | |
|
261 | 265 | |
@@ -271,18 +275,18 b' class PostgresDBBackend(DBBackend):' | |||
|
271 | 275 | self._db_url = [{'app:main': { |
|
272 | 276 | 'sqlalchemy.db1.url': |
|
273 | 277 | self.connection_string}}] |
|
274 | self.execute("PGPASSWORD={} psql -U {} -h localhost " | |
|
278 | return self.execute("PGPASSWORD={} psql -U {} -h localhost " | |
|
275 | 279 | "-c 'create database '{}';'".format( |
|
276 | 280 | self.password, self.user, self.db_name)) |
|
277 | 281 | |
|
278 | 282 | def teardown_db(self): |
|
279 | self.execute("PGPASSWORD={} psql -U {} -h localhost " | |
|
283 | return self.execute("PGPASSWORD={} psql -U {} -h localhost " | |
|
280 | 284 | "-c 'drop database if exists '{}';'".format( |
|
281 | 285 | self.password, self.user, self.db_name)) |
|
282 | 286 | |
|
283 | 287 | def import_dump(self, dumpname): |
|
284 | 288 | dump = os.path.join(self.fixture_store, dumpname) |
|
285 | self.execute( | |
|
289 | return self.execute( | |
|
286 | 290 | "PGPASSWORD={} psql -U {} -h localhost -d {} -1 " |
|
287 | 291 | "-f {}".format( |
|
288 | 292 | self.password, self.user, self.db_name, dump)) |
@@ -59,5 +59,7 b' def _run_migration_test(db_backend, dump' | |||
|
59 | 59 | db_backend.assert_returncode_success() |
|
60 | 60 | |
|
61 | 61 | db_backend.import_dump(dumpname) |
|
62 | db_backend.upgrade_database() | |
|
62 | stdout, stderr = db_backend.upgrade_database() | |
|
63 | ||
|
64 | db_backend.assert_correct_output(stdout+stderr, version='16') | |
|
63 | 65 | db_backend.assert_returncode_success() |
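The test now unpacks `stdout, stderr` from `upgrade_database()` and checks the combined output for the per-step completion marker. The `assert_correct_output` helper added in the first hunk reduces to:

```python
def assert_correct_output(stdout, version):
    # The migration prints one marker per completed step; require it
    # in the combined stdout+stderr of the upgrade run.
    assert 'UPGRADE FOR STEP {} COMPLETED'.format(version) in stdout

# usage, with stand-in output for illustration
stdout, stderr = 'UPGRADE FOR STEP 16 COMPLETED\n', ''
assert_correct_output(stdout + stderr, version='16')
```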
@@ -30,7 +30,7 b' import shutil' | |||
|
30 | 30 | import configobj |
|
31 | 31 | |
|
32 | 32 | from rhodecode.tests import * |
|
33 | from rhodecode.model.db import Repository, User, RepoGroup, UserGroup, Gist | |
|
33 | from rhodecode.model.db import Repository, User, RepoGroup, UserGroup, Gist, UserEmailMap | |
|
34 | 34 | from rhodecode.model.meta import Session |
|
35 | 35 | from rhodecode.model.repo import RepoModel |
|
36 | 36 | from rhodecode.model.user import UserModel |
@@ -125,6 +125,7 b' class Fixture(object):' | |||
|
125 | 125 | 'repo_name': None, |
|
126 | 126 | 'repo_type': 'hg', |
|
127 | 127 | 'clone_uri': '', |
|
128 | 'push_uri': '', | |
|
128 | 129 | 'repo_group': '-1', |
|
129 | 130 | 'repo_description': 'DESC', |
|
130 | 131 | 'repo_private': False, |
@@ -274,6 +275,13 b' class Fixture(object):' | |||
|
274 | 275 | UserModel().delete(userid) |
|
275 | 276 | Session().commit() |
|
276 | 277 | |
|
278 | def create_additional_user_email(self, user, email): | |
|
279 | uem = UserEmailMap() | |
|
280 | uem.user = user | |
|
281 | uem.email = email | |
|
282 | Session().add(uem) | |
|
283 | return uem | |
|
284 | ||
|
277 | 285 | def destroy_users(self, userid_iter): |
|
278 | 286 | for user_id in userid_iter: |
|
279 | 287 | if User.get_by_username(user_id): |
@@ -22,12 +22,13 b' import pytest' | |||
|
22 | 22 | |
|
23 | 23 | from rhodecode import events |
|
24 | 24 | from rhodecode.lib.utils2 import AttributeDict |
|
25 | from rhodecode.integrations.types.webhook import WebhookHandler | |
|
25 | from rhodecode.integrations.types.webhook import WebhookDataHandler | |
|
26 | 26 | |
|
27 | 27 | |
|
28 | 28 | @pytest.fixture |
|
29 | 29 | def base_data(): |
|
30 | 30 | return { |
|
31 | 'name': 'event', | |
|
31 | 32 | 'repo': { |
|
32 | 33 | 'repo_name': 'foo', |
|
33 | 34 | 'repo_type': 'hg', |
@@ -44,31 +45,40 b' def base_data():' | |||
|
44 | 45 | |
|
45 | 46 | def test_webhook_parse_url_invalid_event(): |
|
46 | 47 | template_url = 'http://server.com/${repo_name}/build' |
|
47 | handler = WebhookHandler( | |
|
48 | template_url, 'secret_token', {'exmaple-header': 'header-values'}) |
|
|
48 | handler = WebhookDataHandler( | |
|
49 | template_url, {'exmaple-header': 'header-values'}) | |
|
50 | event = events.RepoDeleteEvent('') | |
|
49 | 51 | with pytest.raises(ValueError) as err: |
|
50 | handler(event, {}) |
|
|
51 | assert str(err.value).startswith('event type not supported') | |
|
52 | handler(event, {}) | |
|
53 | ||
|
54 | err = str(err.value) | |
|
55 | assert err.startswith( | |
|
56 | 'event type `%s` not in supported list' % event.__class__) | |
|
52 | 57 | |
|
53 | 58 | |
|
54 | 59 | @pytest.mark.parametrize('template,expected_urls', [ |
|
55 | ('http://server.com/${repo_name}/build', ['http://server.com/foo/build']), |
|
|
56 | ('http://server.com/${repo_name}/${repo_type}', ['http://server.com/foo/hg']), | |
|
57 | ('http://${server}.com/${repo_name}/${repo_id}', ['http://${server}.com/foo/12']), |
|
|
58 | ('http://server.com/${branch}/build', ['http://server.com/${branch}/build']), | |
|
60 | ('http://server.com/${repo_name}/build', | |
|
61 | ['http://server.com/foo/build']), | |
|
62 | ('http://server.com/${repo_name}/${repo_type}', | |
|
63 | ['http://server.com/foo/hg']), | |
|
64 | ('http://${server}.com/${repo_name}/${repo_id}', | |
|
65 | ['http://${server}.com/foo/12']), | |
|
66 | ('http://server.com/${branch}/build', | |
|
67 | ['http://server.com/${branch}/build']), | |
|
59 | 68 | ]) |
|
60 | 69 | def test_webook_parse_url_for_create_event(base_data, template, expected_urls): |
|
61 | 70 | headers = {'exmaple-header': 'header-values'} |
|
62 | handler = WebhookHandler( | |
|
63 | template, 'secret_token', headers) | |
|
71 | handler = WebhookDataHandler(template, headers) | |
|
64 | 72 | urls = handler(events.RepoCreateEvent(''), base_data) |
|
65 | 73 | assert urls == [ |
|
66 | (url, 'secret_token', headers, base_data) for url in expected_urls] |
|
|
74 | (url, headers, base_data) for url in expected_urls] | |
|
67 | 75 | |
|
68 | 76 | |
|
69 | 77 | @pytest.mark.parametrize('template,expected_urls', [ |
|
70 | ('http://server.com/${repo_name}/${pull_request_id}', ['http://server.com/foo/999']), |
|
|
71 | ('http://server.com/${repo_name}/${pull_request_url}', ['http://server.com/foo/http://pr-url.com']), | |
|
78 | ('http://server.com/${repo_name}/${pull_request_id}', | |
|
79 | ['http://server.com/foo/999']), | |
|
80 | ('http://server.com/${repo_name}/${pull_request_url}', | |
|
81 | ['http://server.com/foo/http://pr-url.com']), | |
|
72 | 82 | ]) |
|
73 | 83 | def test_webook_parse_url_for_pull_request_event( |
|
74 | 84 | base_data, template, expected_urls): |
@@ -76,20 +86,24 b' def test_webook_parse_url_for_pull_reque' | |||
|
76 | 86 | base_data['pullrequest'] = { |
|
77 | 87 | 'pull_request_id': 999, |
|
78 | 88 | 'url': 'http://pr-url.com', |
|
89 | 'title': 'example-pr-title', | |
|
90 | 'commits_uid': 'abcdefg1234', | |
|
91 | 'shadow_url': 'http://pr-url.com/repository' | |
|
79 | 92 | } |
|
80 | 93 | headers = {'exmaple-header': 'header-values'} |
|
81 | handler = WebhookHandler( | |
|
82 | template, 'secret_token', headers) | |
|
94 | handler = WebhookDataHandler(template, headers) | |
|
83 | 95 | urls = handler(events.PullRequestCreateEvent( |
|
84 | 96 | AttributeDict({'target_repo': 'foo'})), base_data) |
|
85 | 97 | assert urls == [ |
|
86 | (url, 'secret_token', headers, base_data) for url in expected_urls] |
|
|
98 | (url, headers, base_data) for url in expected_urls] | |
|
87 | 99 | |
|
88 | 100 | |
|
89 | 101 | @pytest.mark.parametrize('template,expected_urls', [ |
|
90 | ('http://server.com/${branch}/build', ['http://server.com/stable/build', |
|
|
102 | ('http://server.com/${branch}/build', | |
|
103 | ['http://server.com/stable/build', | |
|
91 | 104 |
|
|
92 | ('http://server.com/${branch}/${commit_id}', ['http://server.com/stable/stable-xxx', |
|
|
105 | ('http://server.com/${branch}/${commit_id}', | |
|
106 | ['http://server.com/stable/stable-xxx', | |
|
93 | 107 |
|
|
94 | 108 |
|
|
95 | 109 |
|
@@ -104,8 +118,7 b' def test_webook_parse_url_for_push_event' | |||
|
104 | 118 | {'branch': 'dev', 'raw_id': 'dev-yyy'}] |
|
105 | 119 | } |
|
106 | 120 | headers = {'exmaple-header': 'header-values'} |
|
107 | handler = WebhookHandler( | |
|
108 | template, 'secret_token', headers) | |
|
121 | handler = WebhookDataHandler(template, headers) | |
|
109 | 122 | urls = handler(repo_push_event, base_data) |
|
110 | 123 | assert urls == [ |
|
111 | (url, 'secret_token', headers, base_data) for url in expected_urls] |
|
|
124 | (url, headers, base_data) for url in expected_urls] |
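The parametrized cases above exercise `${...}` placeholder expansion, where unknown variables such as `${branch}` survive untouched for events that carry no branch data. That behaviour can be sketched with `string.Template` (`expand_webhook_url` is a hypothetical helper, not the actual `WebhookDataHandler` implementation):

```python
from string import Template

def expand_webhook_url(template_url, variables):
    # safe_substitute leaves unknown placeholders intact instead of
    # raising KeyError, matching the expected_urls in the tests above.
    return Template(template_url).safe_substitute(**variables)

url = expand_webhook_url(
    'http://server.com/${repo_name}/build', {'repo_name': 'foo'})
```

For push events the real handler additionally expands one URL per pushed branch and commit, which is why a single template can yield several `(url, headers, data)` tuples.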
@@ -377,34 +377,6 b' class TestGenerateVcsResponse(object):' | |||
|
377 | 377 | list(result) |
|
378 | 378 | assert self.was_cache_invalidated() |
|
379 | 379 | |
|
380 | @mock.patch('rhodecode.lib.middleware.simplevcs.HTTPLockedRC') | |
|
381 | def test_handles_locking_exception(self, http_locked_rc): | |
|
382 | result = self.call_controller_with_response_body( | |
|
383 | self.raise_result_iter(vcs_kind='repo_locked')) | |
|
384 | assert not http_locked_rc.called | |
|
385 | # Consume the result | |
|
386 | list(result) | |
|
387 | assert http_locked_rc.called | |
|
388 | ||
|
389 | @mock.patch('rhodecode.lib.middleware.simplevcs.HTTPRequirementError') | |
|
390 | def test_handles_requirement_exception(self, http_requirement): | |
|
391 | result = self.call_controller_with_response_body( | |
|
392 | self.raise_result_iter(vcs_kind='requirement')) | |
|
393 | assert not http_requirement.called | |
|
394 | # Consume the result | |
|
395 | list(result) | |
|
396 | assert http_requirement.called | |
|
397 | ||
|
398 | @mock.patch('rhodecode.lib.middleware.simplevcs.HTTPLockedRC') | |
|
399 | def test_handles_locking_exception_in_app_call(self, http_locked_rc): | |
|
400 | app_factory_patcher = mock.patch.object( | |
|
401 | StubVCSController, '_create_wsgi_app') | |
|
402 | with app_factory_patcher as app_factory: | |
|
403 | app_factory().side_effect = self.vcs_exception() | |
|
404 | result = self.call_controller_with_response_body(['a']) | |
|
405 | list(result) | |
|
406 | assert http_locked_rc.called | |
|
407 | ||
|
408 | 380 | def test_raises_unknown_exceptions(self): |
|
409 | 381 | result = self.call_controller_with_response_body( |
|
410 | 382 | self.raise_result_iter(vcs_kind='unknown')) |
@@ -412,7 +384,7 b' class TestGenerateVcsResponse(object):' | |||
|
412 | 384 | list(result) |
|
413 | 385 | |
|
414 | 386 | def test_prepare_callback_daemon_is_called(self): |
|
415 | def side_effect(extras): | |
|
387 | def side_effect(extras, environ, action, txn_id=None): | |
|
416 | 388 | return DummyHooksCallbackDaemon(), extras |
|
417 | 389 | |
|
418 | 390 | prepare_patcher = mock.patch.object( |
@@ -489,10 +461,11 b' class TestPrepareHooksDaemon(object):' | |||
|
489 | 461 | return_value=(daemon, expected_extras)) |
|
490 | 462 | with prepare_patcher as prepare_mock: |
|
491 | 463 | callback_daemon, extras = controller._prepare_callback_daemon( |
|
492 | expected_extras.copy()) | |
|
464 | expected_extras.copy(), {}, 'push') | |
|
493 | 465 | prepare_mock.assert_called_once_with( |
|
494 | 466 | expected_extras, |
|
495 | 467 | protocol=app_settings['vcs.hooks.protocol'], |
|
468 | txn_id=None, | |
|
496 | 469 | use_direct_calls=app_settings['vcs.hooks.direct_calls']) |
|
497 | 470 | |
|
498 | 471 | assert callback_daemon == daemon |
@@ -179,10 +179,12 b' class TestHttpHooksCallbackDaemon(object' | |||
|
179 | 179 | daemon = hooks_daemon.HttpHooksCallbackDaemon() |
|
180 | 180 | assert daemon._daemon == tcp_server |
|
181 | 181 | |
|
182 | _, port = tcp_server.server_address | |
|
183 | expected_uri = '{}:{}'.format(daemon.IP_ADDRESS, port) | |
|
184 | msg = 'Preparing HTTP callback daemon at `{}` and ' \ | |
|
185 | 'registering hook object'.format(expected_uri) | |
|
182 | 186 | assert_message_in_log( |
|
183 | caplog.records, | |
|
184 | 'Preparing HTTP callback daemon and registering hook object', | |
|
185 | levelno=logging.DEBUG, module='hooks_daemon') | |
|
187 | caplog.records, msg, levelno=logging.DEBUG, module='hooks_daemon') | |
|
186 | 188 | |
|
187 | 189 | def test_prepare_inits_hooks_uri_and_logs_it( |
|
188 | 190 | self, tcp_server, caplog): |
@@ -193,8 +195,10 b' class TestHttpHooksCallbackDaemon(object' | |||
|
193 | 195 | expected_uri = '{}:{}'.format(daemon.IP_ADDRESS, port) |
|
194 | 196 | assert daemon.hooks_uri == expected_uri |
|
195 | 197 | |
|
198 | msg = 'Preparing HTTP callback daemon at `{}` and ' \ | |
|
199 | 'registering hook object'.format(expected_uri) | |
|
196 | 200 | assert_message_in_log( |
|
197 | caplog.records, 'Preparing HTTP callback daemon and registering hook object', |
|
|
201 | caplog.records, msg, | |
|
198 | 202 | levelno=logging.DEBUG, module='hooks_daemon') |
|
199 | 203 | |
|
200 | 204 | def test_run_creates_a_thread(self, tcp_server): |
@@ -263,7 +267,8 b' class TestPrepareHooksDaemon(object):' | |||
|
263 | 267 | expected_extras.copy(), protocol=protocol, use_direct_calls=True) |
|
264 | 268 | assert isinstance(callback, hooks_daemon.DummyHooksCallbackDaemon) |
|
265 | 269 | expected_extras['hooks_module'] = 'rhodecode.lib.hooks_daemon' |
|
266 | assert extras == expected_extras | |
|
270 | expected_extras['time'] = extras['time'] | |
|
271 | assert 'extra1' in extras | |
|
267 | 272 | |
|
268 | 273 | @pytest.mark.parametrize('protocol, expected_class', ( |
|
269 | 274 | ('http', hooks_daemon.HttpHooksCallbackDaemon), |
@@ -272,12 +277,15 b' class TestPrepareHooksDaemon(object):' | |||
|
272 | 277 | self, protocol, expected_class): |
|
273 | 278 | expected_extras = { |
|
274 | 279 | 'extra1': 'value1', |
|
280 | 'txn_id': 'txnid2', | |
|
275 | 281 | 'hooks_protocol': protocol.lower() |
|
276 | 282 | } |
|
277 | 283 | callback, extras = hooks_daemon.prepare_callback_daemon( |
|
278 | expected_extras.copy(), protocol=protocol, use_direct_calls=False) |
|
|
284 | expected_extras.copy(), protocol=protocol, use_direct_calls=False, | |
|
285 | txn_id='txnid2') | |
|
279 | 286 | assert isinstance(callback, expected_class) |
|
280 |
|
|
|
287 | extras.pop('hooks_uri') | |
|
288 | expected_extras['time'] = extras['time'] | |
|
281 | 289 | assert extras == expected_extras |
|
282 | 290 | |
|
283 | 291 | @pytest.mark.parametrize('protocol', ( |
@@ -245,9 +245,6 b' def test_repo2db_mapper_enables_largefil' | |||
|
245 | 245 | repo = backend.create_repo() |
|
246 | 246 | repo_list = {repo.repo_name: 'test'} |
|
247 | 247 | with mock.patch('rhodecode.model.db.Repository.scm_instance') as scm_mock: |
|
248 | with mock.patch.multiple('rhodecode.model.scm.ScmModel', | |
|
249 | install_git_hook=mock.DEFAULT, | |
|
250 | install_svn_hooks=mock.DEFAULT): | |
|
251 | 248 |
|
|
252 | 249 |
|
|
253 | 250 |
|
@@ -257,10 +254,7 b' def test_repo2db_mapper_enables_largefil' | |||
|
257 | 254 | def test_repo2db_mapper_installs_hooks_for_repos_in_db(backend): |
|
258 | 255 | repo = backend.create_repo() |
|
259 | 256 | repo_list = {repo.repo_name: 'test'} |
|
260 | with mock.patch.object(ScmModel, 'install_hooks') as install_hooks_mock: | |
|
261 | 257 |
|
|
262 | install_hooks_mock.assert_called_once_with( | |
|
263 | repo.scm_instance(), repo_type=backend.alias) | |
|
264 | 258 | |
|
265 | 259 | |
|
266 | 260 | @pytest.mark.backends("git", "svn") |
@@ -269,11 +263,7 b' def test_repo2db_mapper_installs_hooks_f' | |||
|
269 | 263 | RepoModel().delete(repo, fs_remove=False) |
|
270 | 264 | meta.Session().commit() |
|
271 | 265 | repo_list = {repo.repo_name: repo.scm_instance()} |
|
272 | with mock.patch.object(ScmModel, 'install_hooks') as install_hooks_mock: | |
|
273 | 266 |
|
|
274 | assert install_hooks_mock.call_count == 1 | |
|
275 | install_hooks_args, _ = install_hooks_mock.call_args | |
|
276 | assert install_hooks_args[0].name == repo.repo_name | |
|
277 | 267 | |
|
278 | 268 | |
|
279 | 269 | class TestPasswordChanged(object): |
@@ -44,6 +44,7 b' GENERAL_FORM_DATA = {' | |||
|
44 | 44 | 'rhodecode_hg_close_branch_before_merging': True, |
|
45 | 45 | 'rhodecode_git_use_rebase_for_merging': True, |
|
46 | 46 | 'rhodecode_git_close_branch_before_merging': True, |
|
47 | 'rhodecode_diff_cache': True, | |
|
47 | 48 | } |
|
48 | 49 | |
|
49 | 50 |
@@ -18,10 +18,11 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | import os | |
|
22 | import mock | |
|
23 | import pytest | |
|
21 | 24 | import tempfile |
|
22 | 25 | |
|
23 | import mock | |
|
24 | import pytest | |
|
25 | 26 | |
|
26 | 27 | from rhodecode.lib.exceptions import AttachedForksError |
|
27 | 28 | from rhodecode.lib.utils import make_db_config |
@@ -119,22 +120,22 b' class TestRepoModel(object):' | |||
|
119 | 120 | |
|
120 | 121 | @pytest.mark.backends("git", "svn") |
|
121 | 122 | def test_create_filesystem_repo_installs_hooks(self, tmpdir, backend): |
|
122 | hook_methods = { | |
|
123 | 'git': 'install_git_hook', | |
|
124 | 'svn': 'install_svn_hooks' | |
|
125 | } | |
|
126 | 123 | repo = backend.create_repo() |
|
127 | 124 | repo_name = repo.repo_name |
|
128 | 125 | model = RepoModel() |
|
129 | 126 | repo_location = tempfile.mkdtemp() |
|
130 | 127 | model.repos_path = repo_location |
|
131 | method = hook_methods[backend.alias] | |
|
132 | with mock.patch.object(ScmModel, method) as hooks_mock: | |
|
133 | model._create_filesystem_repo( | |
|
128 | repo = model._create_filesystem_repo( | |
|
134 | 129 |
|
|
135 | assert hooks_mock.call_count == 1 | |
|
136 | hook_args, hook_kwargs = hooks_mock.call_args | |
|
137 | assert hook_args[0].name == repo_name | |
|
130 | ||
|
131 | hooks = { | |
|
132 | 'svn': ('pre-commit', 'post-commit'), | |
|
133 | 'git': ('pre-receive', 'post-receive'), | |
|
134 | } | |
|
135 | for hook in hooks[backend.alias]: | |
|
136 | with open(os.path.join(repo.path, 'hooks', hook)) as f: | |
|
137 | data = f.read() | |
|
138 | assert 'RC_HOOK_VER' in data | |
|
138 | 139 | |
|
139 | 140 | @pytest.mark.parametrize("use_global_config, repo_name_passed", [ |
|
140 | 141 | (True, False), |
@@ -194,143 +194,3 b' def test_get_non_unicode_reference(backe' | |||
|
194 | 194 | u'tag:Ad\xc4\xb1n\xc4\xb1'] |
|
195 | 195 | |
|
196 | 196 | assert choices == valid_choices |
|
197 | ||
|
198 | ||
|
199 | class TestInstallSvnHooks(object): | |
|
200 | HOOK_FILES = ('pre-commit', 'post-commit') | |
|
201 | ||
|
202 | def test_new_hooks_are_created(self, backend_svn): | |
|
203 | model = scm.ScmModel() | |
|
204 | repo = backend_svn.create_repo() | |
|
205 | vcs_repo = repo.scm_instance() | |
|
206 | model.install_svn_hooks(vcs_repo) | |
|
207 | ||
|
208 | hooks_path = os.path.join(vcs_repo.path, 'hooks') | |
|
209 | assert os.path.isdir(hooks_path) | |
|
210 | for file_name in self.HOOK_FILES: | |
|
211 | file_path = os.path.join(hooks_path, file_name) | |
|
212 | self._check_hook_file_mode(file_path) | |
|
213 | self._check_hook_file_content(file_path) | |
|
214 | ||
|
215 | def test_rc_hooks_are_replaced(self, backend_svn): | |
|
216 | model = scm.ScmModel() | |
|
217 | repo = backend_svn.create_repo() | |
|
218 | vcs_repo = repo.scm_instance() | |
|
219 | hooks_path = os.path.join(vcs_repo.path, 'hooks') | |
|
220 | file_paths = [os.path.join(hooks_path, f) for f in self.HOOK_FILES] | |
|
221 | ||
|
222 | for file_path in file_paths: | |
|
223 | self._create_fake_hook( | |
|
224 | file_path, content="RC_HOOK_VER = 'abcde'\n") | |
|
225 | ||
|
226 | model.install_svn_hooks(vcs_repo) | |
|
227 | ||
|
228 | for file_path in file_paths: | |
|
229 | self._check_hook_file_content(file_path) | |
|
230 | ||
|
231 | def test_non_rc_hooks_are_not_replaced_without_force_create( | |
|
232 | self, backend_svn): | |
|
233 | model = scm.ScmModel() | |
|
234 | repo = backend_svn.create_repo() | |
|
235 | vcs_repo = repo.scm_instance() | |
|
236 | hooks_path = os.path.join(vcs_repo.path, 'hooks') | |
|
237 | file_paths = [os.path.join(hooks_path, f) for f in self.HOOK_FILES] | |
|
238 | non_rc_content = "exit 0\n" | |
|
239 | ||
|
240 | for file_path in file_paths: | |
|
241 | self._create_fake_hook(file_path, content=non_rc_content) | |
|
242 | ||
|
243 | model.install_svn_hooks(vcs_repo) | |
|
244 | ||
|
245 | for file_path in file_paths: | |
|
246 | with open(file_path, 'rt') as hook_file: | |
|
247 | content = hook_file.read() | |
|
248 | assert content == non_rc_content | |
|
249 | ||
|
250 | def test_non_rc_hooks_are_replaced_with_force_create(self, backend_svn): | |
|
251 | model = scm.ScmModel() | |
|
252 | repo = backend_svn.create_repo() | |
|
253 | vcs_repo = repo.scm_instance() | |
|
254 | hooks_path = os.path.join(vcs_repo.path, 'hooks') | |
|
255 | file_paths = [os.path.join(hooks_path, f) for f in self.HOOK_FILES] | |
|
256 | non_rc_content = "exit 0\n" | |
|
257 | ||
|
258 | for file_path in file_paths: | |
|
259 | self._create_fake_hook(file_path, content=non_rc_content) | |
|
260 | ||
|
261 | model.install_svn_hooks(vcs_repo, force_create=True) | |
|
262 | ||
|
263 | for file_path in file_paths: | |
|
264 | self._check_hook_file_content(file_path) | |
|
265 | ||
|
266 | def _check_hook_file_mode(self, file_path): | |
|
267 | assert os.path.exists(file_path) | |
|
268 | stat_info = os.stat(file_path) | |
|
269 | ||
|
270 | file_mode = stat.S_IMODE(stat_info.st_mode) | |
|
271 | expected_mode = int('755', 8) | |
|
272 | assert expected_mode == file_mode | |
|
273 | ||
|
274 | def _check_hook_file_content(self, file_path): | |
|
275 | with open(file_path, 'rt') as hook_file: | |
|
276 | content = hook_file.read() | |
|
277 | ||
|
278 | expected_env = '#!{}'.format(sys.executable) | |
|
279 | expected_rc_version = "\nRC_HOOK_VER = '{}'\n".format( | |
|
280 | rhodecode.__version__) | |
|
281 | assert content.strip().startswith(expected_env) | |
|
282 | assert expected_rc_version in content | |
|
283 | ||
|
284 | def _create_fake_hook(self, file_path, content): | |
|
285 | with open(file_path, 'w') as hook_file: | |
|
286 | hook_file.write(content) | |
|
287 | ||
|
288 | ||
|
289 | class TestCheckRhodecodeHook(object): | |
|
290 | ||
|
291 | @patch('os.path.exists', Mock(return_value=False)) | |
|
292 | def test_returns_true_when_no_hook_found(self): | |
|
293 | result = scm._check_rhodecode_hook('/tmp/fake_hook_file.py') | |
|
294 | assert result | |
|
295 | ||
|
296 | @pytest.mark.parametrize("file_content, expected_result", [ | |
|
297 | ("RC_HOOK_VER = '3.3.3'\n", True), | |
|
298 | ("RC_HOOK = '3.3.3'\n", False), | |
|
299 | ], ids=no_newline_id_generator) | |
|
300 | @patch('os.path.exists', Mock(return_value=True)) | |
|
301 | def test_signatures(self, file_content, expected_result): | |
|
302 | hook_content_patcher = patch.object( | |
|
303 | scm, '_read_hook', return_value=file_content) | |
|
304 | with hook_content_patcher: | |
|
305 | result = scm._check_rhodecode_hook('/tmp/fake_hook_file.py') | |
|
306 | ||
|
307 | assert result is expected_result | |
|
308 | ||
|
309 | ||
|
310 | class TestInstallHooks(object): | |
|
311 | def test_hooks_are_installed_for_git_repo(self, backend_git): | |
|
312 | repo = backend_git.create_repo() | |
|
313 | model = scm.ScmModel() | |
|
314 | scm_repo = repo.scm_instance() | |
|
315 | with patch.object(model, 'install_git_hook') as hooks_mock: | |
|
316 | model.install_hooks(scm_repo, repo_type='git') | |
|
317 | hooks_mock.assert_called_once_with(scm_repo) | |
|
318 | ||
|
319 | def test_hooks_are_installed_for_svn_repo(self, backend_svn): | |
|
320 | repo = backend_svn.create_repo() | |
|
321 | scm_repo = repo.scm_instance() | |
|
322 | model = scm.ScmModel() | |
|
323 | with patch.object(scm.ScmModel, 'install_svn_hooks') as hooks_mock: | |
|
324 | model.install_hooks(scm_repo, repo_type='svn') | |
|
325 | hooks_mock.assert_called_once_with(scm_repo) | |
|
326 | ||
|
327 | @pytest.mark.parametrize('hook_method', [ | |
|
328 | 'install_svn_hooks', | |
|
329 | 'install_git_hook']) | |
|
330 | def test_mercurial_doesnt_trigger_hooks(self, backend_hg, hook_method): | |
|
331 | repo = backend_hg.create_repo() | |
|
332 | scm_repo = repo.scm_instance() | |
|
333 | model = scm.ScmModel() | |
|
334 | with patch.object(scm.ScmModel, hook_method) as hooks_mock: | |
|
335 | model.install_hooks(scm_repo, repo_type='hg') | |
|
336 | assert hooks_mock.call_count == 0 |
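The removed `_check_hook_file_mode` helper compared only the permission bits of an installed hook file against mode `0755`. The check it performed, restated standalone:

```python
import os
import stat
import tempfile

def hook_is_executable(file_path):
    # Mask off the file-type bits and compare the permission bits
    # alone, as stat.S_IMODE did in the removed _check_hook_file_mode.
    return stat.S_IMODE(os.stat(file_path).st_mode) == 0o755

# create a throwaway hook file and mark it executable
hook_path = os.path.join(tempfile.mkdtemp(), 'pre-commit')
with open(hook_path, 'w') as f:
    f.write('#!/bin/sh\nexit 0\n')
os.chmod(hook_path, 0o755)
```

The replacement test in `test_create_filesystem_repo_installs_hooks` drops the mode check and instead reads the hook files for the `RC_HOOK_VER` marker.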
@@ -24,8 +24,7 b' import pytest' | |||
|
24 | 24 | |
|
25 | 25 | from rhodecode.tests import ( |
|
26 | 26 | HG_REPO, TEST_USER_REGULAR2_EMAIL, TEST_USER_REGULAR2_LOGIN, |
|
27 | TEST_USER_REGULAR2_PASS, TEST_USER_ADMIN_LOGIN, TESTS_TMP_PATH, |
|
|
28 | ldap_lib_installed) | |
|
27 | TEST_USER_REGULAR2_PASS, TEST_USER_ADMIN_LOGIN, TESTS_TMP_PATH) | |
|
29 | 28 | |
|
30 | 29 | from rhodecode.model import validators as v |
|
31 | 30 | from rhodecode.model.user_group import UserGroupModel |
@@ -1178,6 +1178,10 b' class UserUtility(object):' | |||
|
1178 | 1178 | self.user_ids.append(user.user_id) |
|
1179 | 1179 | return user |
|
1180 | 1180 | |
|
1181 | def create_additional_user_email(self, user, email): | |
|
1182 | uem = self.fixture.create_additional_user_email(user=user, email=email) | |
|
1183 | return uem | |
|
1184 | ||
|
1181 | 1185 | def create_user_with_group(self): |
|
1182 | 1186 | user = self.create_user() |
|
1183 | 1187 | user_group = self.create_user_group(members=[user]) |
@@ -1706,7 +1710,10 b' def StubIntegrationType():' | |||
|
1706 | 1710 | key = 'test' |
|
1707 | 1711 | display_name = 'Test integration type' |
|
1708 | 1712 | description = 'A test integration type for testing' |
|
1709 | icon = 'test_icon_html_image' | |
|
1713 | ||
|
1714 | @classmethod | |
|
1715 | def icon(cls): | |
|
1716 | return 'test_icon_html_image' | |
|
1710 | 1717 | |
|
1711 | 1718 | def __init__(self, settings): |
|
1712 | 1719 | super(_StubIntegrationType, self).__init__(settings) |
@@ -154,9 +154,6 b' force_https = false' | |||
|
154 | 154 | ## use Strict-Transport-Security headers |
|
155 | 155 | use_htsts = false |
|
156 | 156 | |
|
157 | ## number of commits stats will parse on each iteration | |
|
158 | commit_parse_limit = 25 | |
|
159 | ||
|
160 | 157 | ## git rev filter option, --all is the default filter, if you need to |
|
161 | 158 | ## hide all refs in changelog switch this to --branches --tags |
|
162 | 159 | git_rev_filter = --all |
@@ -21,41 +21,31 b'' | |||
|
21 | 21 | """ |
|
22 | 22 | Tests for main module's methods. |
|
23 | 23 | """ |
|
24 | ||
|
24 | import os | |
|
25 | import tempfile | |
|
26 | import shutil | |
|
25 | 27 | import mock |
|
26 | import os | |
|
27 | import shutil | |
|
28 | import tempfile | |
|
29 | ||
|
30 | 28 | import pytest |
|
31 | 29 | |
|
32 | 30 | from rhodecode.lib.vcs import VCSError, get_backend, get_vcs_instance |
|
33 | from rhodecode.lib.vcs.backends.hg import MercurialRepository | |
|
34 | from rhodecode.tests import TEST_HG_REPO, TEST_GIT_REPO | |
|
35 | 31 | |
|
36 | 32 | |
|
37 | 33 | pytestmark = pytest.mark.usefixtures("baseapp") |
|
38 | 34 | |
|
39 | 35 | |
|
40 | def test_get_backend(): | |
|
41 | hg = get_backend('hg') | |
|
42 | assert hg == MercurialRepository | |
|
36 | def test_get_backend(backend): | |
|
37 | repo_class = get_backend(backend.alias) | |
|
38 | assert repo_class == backend.repo.scm_instance().__class__ | |
|
43 | 39 | |
|
44 | 40 | |
|
45 | def test_alias_detect_hg(): |
|
|
46 | alias = 'hg' | |
|
47 | path = TEST_HG_REPO | |
|
48 | backend = get_backend(alias) | |
|
49 | repo = backend(path) | |
|
50 | assert 'hg' == repo.alias | |
|
41 | def test_alias_detect(backend): | |
|
42 | alias = backend.alias | |
|
43 | path = backend.repo.scm_instance().path | |
|
51 | 44 | |
|
45 | new_backend = get_backend(alias) | |
|
46 | repo = new_backend(path) | |
|
52 | 47 | |
|
53 | def test_alias_detect_git(): | |
|
54 | alias = 'git' | |
|
55 | path = TEST_GIT_REPO | |
|
56 | backend = get_backend(alias) | |
|
57 | repo = backend(path) | |
|
58 | assert 'git' == repo.alias | |
|
48 | assert alias == repo.alias | |
|
59 | 49 | |
|
60 | 50 | |
|
61 | 51 | def test_wrong_alias(): |
@@ -73,56 +63,13 b' def test_get_vcs_instance_by_path(vcs_re' | |||
|
73 | 63 | assert repo.name == vcs_repo.name |
|
74 | 64 | |
|
75 | 65 | |
|
76 | @mock.patch('rhodecode.lib.vcs.backends.get_scm') | |
|
77 | @mock.patch('rhodecode.lib.vcs.backends.get_backend') | |
|
78 | def test_get_vcs_instance_by_path_args_passed( | |
|
79 | get_backend_mock, get_scm_mock): | |
|
80 | """ | |
|
81 | Test that the arguments passed to ``get_vcs_instance_by_path`` are | |
|
82 | forewarded to the vcs backend class. | |
|
83 | """ | |
|
84 | backend = mock.MagicMock() | |
|
85 | get_backend_mock.return_value = backend | |
|
86 | args = ['these-are-test-args', 0, True, None] | |
|
87 | get_vcs_instance(TEST_HG_REPO, *args) | |
|
88 | ||
|
89 | backend.assert_called_with(*args, repo_path=TEST_HG_REPO) | |
|
90 | ||
|
91 | ||
|
92 | @mock.patch('rhodecode.lib.vcs.backends.get_scm') | |
|
93 | @mock.patch('rhodecode.lib.vcs.backends.get_backend') | |
|
94 | def test_get_vcs_instance_by_path_kwargs_passed( | |
|
95 | get_backend_mock, get_scm_mock): | |
|
96 | """ | |
|
97 | Test that the keyword arguments passed to ``get_vcs_instance_by_path`` are | |
|
98 | forewarded to the vcs backend class. | |
|
99 | """ | |
|
100 | backend = mock.MagicMock() | |
|
101 | get_backend_mock.return_value = backend | |
|
102 | kwargs = { | |
|
103 | 'foo': 'these-are-test-args', | |
|
104 | 'bar': 0, | |
|
105 | 'baz': True, | |
|
106 | 'foobar': None | |
|
107 | } | |
|
108 | get_vcs_instance(TEST_HG_REPO, **kwargs) | |
|
109 | ||
|
110 | backend.assert_called_with(repo_path=TEST_HG_REPO, **kwargs) | |
|
111 | ||
|
112 | ||
|
113 | def test_get_vcs_instance_by_path_err(request): | |
|
66 | def test_get_vcs_instance_by_path_empty_dir(request, tmpdir): | |
|
114 | 67 | """ |
|
115 | 68 | Test that ``get_vcs_instance_by_path`` returns None if a path is passed |
|
116 | 69 | to an empty directory. |
|
117 | 70 | """ |
|
118 | empty_dir = tempfile.mkdtemp(prefix='pytest-empty-dir-') | |
|
119 | ||
|
120 | def fin(): | |
|
121 | shutil.rmtree(empty_dir) | |
|
122 | request.addfinalizer(fin) | |
|
123 | ||
|
71 | empty_dir = str(tmpdir) | |
|
124 | 72 | repo = get_vcs_instance(empty_dir) |
|
125 | ||
|
126 | 73 | assert repo is None |
|
127 | 74 | |
|
128 | 75 | |
@@ -142,3 +89,42 @@ def test_get_vcs_instance_by_path_multip
     repo = get_vcs_instance(empty_dir)
 
     assert repo is None
+
+
+@mock.patch('rhodecode.lib.vcs.backends.get_scm')
+@mock.patch('rhodecode.lib.vcs.backends.get_backend')
+def test_get_vcs_instance_by_path_args_passed(
+        get_backend_mock, get_scm_mock, tmpdir, vcs_repo):
+    """
+    Test that the arguments passed to ``get_vcs_instance_by_path`` are
+    forwarded to the vcs backend class.
+    """
+    backend = mock.MagicMock()
+    get_backend_mock.return_value = backend
+    args = ['these-are-test-args', 0, True, None]
+    repo = vcs_repo.path
+    get_vcs_instance(repo, *args)
+
+    backend.assert_called_with(*args, repo_path=repo)
+
+
+@mock.patch('rhodecode.lib.vcs.backends.get_scm')
+@mock.patch('rhodecode.lib.vcs.backends.get_backend')
+def test_get_vcs_instance_by_path_kwargs_passed(
+        get_backend_mock, get_scm_mock, vcs_repo):
+    """
+    Test that the keyword arguments passed to ``get_vcs_instance_by_path`` are
+    forwarded to the vcs backend class.
+    """
+    backend = mock.MagicMock()
+    get_backend_mock.return_value = backend
+    kwargs = {
+        'foo': 'these-are-test-args',
+        'bar': 0,
+        'baz': True,
+        'foobar': None
+    }
+    repo = vcs_repo.path
+    get_vcs_instance(repo, **kwargs)
+
+    backend.assert_called_with(repo_path=repo, **kwargs)
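The two new tests above use a common mocking pattern: patch the backend lookup, call the function under test, and then assert that the mock received exactly the forwarded arguments. A self-contained sketch of that pattern (the `get_vcs_instance` body here is a simplified stand-in for the real RhodeCode implementation, which is not shown in this diff):

```python
from unittest import mock


def get_backend():
    """Stand-in for the patched backend lookup; the real one lives in
    rhodecode.lib.vcs.backends."""
    raise NotImplementedError


def get_vcs_instance(repo_path, *args, **kwargs):
    """Simplified stand-in: resolve a backend class and instantiate it,
    forwarding every extra argument plus repo_path."""
    backend = get_backend()
    return backend(*args, repo_path=repo_path, **kwargs)


with mock.patch(__name__ + '.get_backend') as get_backend_mock:
    backend = mock.MagicMock()
    get_backend_mock.return_value = backend
    get_vcs_instance('/tmp/repo', 'x', 0, flag=True)
    # The MagicMock recorded the call, so we can assert the exact
    # forwarded signature, as the tests in the hunk do:
    backend.assert_called_with('x', 0, repo_path='/tmp/repo', flag=True)
```

Because the backend is a `MagicMock`, no real repository is touched; the assertion only checks that the arguments pass through unchanged.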
@@ -106,9 +106,10 @@ def rc_web_server_config_factory(testini
     Configuration file used for the fixture `rc_web_server`.
     """
 
-    def factory(vcsserver_port):
+    def factory(rcweb_port, vcsserver_port):
         custom_params = [
             {'handler_console': {'level': 'DEBUG'}},
+            {'server:main': {'port': rcweb_port}},
             {'app:main': {'vcs.server': 'localhost:%s' % vcsserver_port}}
         ]
         custom_params.extend(rc_web_server_config_modification)
@@ -123,6 +124,8 @@ def rc_web_server(
     """
    Run the web server as a subprocess. with it's own instance of vcsserver
     """
+    rcweb_port = available_port_factory()
+    print('Using rcweb ops test port {}'.format(rcweb_port))
 
     vcsserver_port = available_port_factory()
     print('Using vcsserver ops test port {}'.format(vcsserver_port))
@@ -138,6 +141,7 @@ def rc_web_server(
 
     rc_log = os.path.join(tempfile.gettempdir(), 'rc_op_web.log')
     rc_web_server_config = rc_web_server_config_factory(
+        rcweb_port=rcweb_port,
         vcsserver_port=vcsserver_port)
     server = RcWebServer(rc_web_server_config, log_file=rc_log)
     server.start()
(5 deleted files: no content shown)