release: Merge default into stable for release preparation
marcink -
r2693:660bcebf merge stable
@@ -0,0 +1,80 b''
1 .. _svn-path-permissions:
2
3 |svn| Enabling Path Permissions
4 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
5
6 Because |RCEE| uses the standard Apache mod_dav_svn module, we can take advantage of the
7 authz configuration to protect paths and branches.
8
9
10 Configuring RhodeCode
11 =====================
12
13
14 1. To configure path-based permissions, we first need to use a customized
15 mod_dav_svn.conf.
16
17 Open the :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file
18 and find the `svn.proxy.config_template` setting. Set a new path to read
19 the template from. For example:
20
21 .. code-block:: ini
22
23 svn.proxy.config_template = /home/ubuntu/rhodecode/custom_mod_dav_svn.conf.mako
24
25
26 2. Create the file, for example at `/home/ubuntu/rhodecode/custom_mod_dav_svn.conf.mako`.
27 You can download one from:
28
29 `<https://code.rhodecode.com/rhodecode-enterprise-ce/files/default/rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako/>`_
30
31 3. Add an `AuthzSVNReposRelativeAccessFile` directive (if it does not yet exist) in order
32 to read the path auth file.
33
34 An example modified config section enabling reading the authz file relative
35 to the repository path, i.e. located in `/storage_dir/repo_name/conf/authz`:
36
37 .. code-block:: text
38
39
40 # snip ...
41
42 # use specific SVN conf/authz file for each repository
43 AuthzSVNReposRelativeAccessFile authz
44
45 Allow from all
46 # snip ...
47
48 .. note::
49
50 The `AuthzSVNReposRelativeAccessFile` should go above the `Allow from all`
51 directive.
52
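As an illustration only, a fuller section of the generated configuration might
look like the sketch below. The exact content comes from the mako template and
can differ; the `<Location>` path and `SVNParentPath` value here are assumptions,
not taken from this changeset. The point is only the relative ordering of the
two directives mentioned in the note above.

.. code-block:: text

    # hypothetical generated section, shown for directive order only
    <Location "/">
        DAV svn
        SVNParentPath /storage_dir
        SVNListParentPath On

        # use specific SVN conf/authz file for each repository
        AuthzSVNReposRelativeAccessFile authz

        Allow from all
    </Location>
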
53
54 4. Restart RhodeCode, go to
55 the :menuselection:`Admin --> Settings --> VCS` page, and
56 click :guilabel:`Generate Apache Config`.
57 This generates a new configuration that enables reading
58 the authz file. You can verify the change by checking the generated
59 mod_dav_svn.conf file, which is included in your Apache configuration.
60
61 5. Specify new rules in the repository authz configuration by editing
62 the file :file:`repo_name/conf/authz`. For example, we specify that
63 only admin is allowed to push to the develop branch:
64
65 .. code-block:: ini
66
67 [/branches/develop]
68 * = r
69 admin = rw
70
71
72 For more examples see:
73 `<https://svn.apache.org/repos/asf/subversion/trunk/subversion/mod_authz_svn/INSTALL/>`_
74
75 These rules also work for paths, so not only branches but any path
76 inside the repository can be specified, as in the sketch below.
77
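As a sketch only (the path and rule below are hypothetical, not taken from
this changeset), a path rule restricting a single directory could look like
this:

.. code-block:: ini

    # hypothetical: hide a sub-directory from everyone except admin
    [/trunk/secret-module]
    * =
    admin = rw
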
78 6. Reload Apache. If everything is configured correctly, commits that violate the
79 specified rules will be rejected.
80
@@ -0,0 +1,48 b''
1 .. _views-ref:
2
3 views
4 =====
5
6 push (EE only)
7 --------------
8
9 .. py:function:: push(apiuser, repoid, remote_uri=<Optional:None>)
10
11 Triggers a push on the given repository to a remote location. You
12 can use this to keep remote repositories up-to-date.
13
14 This command can only be run using an |authtoken| with admin
15 rights to the specified repository. For more information,
16 see :ref:`config-token-ref`.
17
18 This command takes the following options:
19
20 :param apiuser: This is filled automatically from the |authtoken|.
21 :type apiuser: AuthUser
22 :param repoid: The repository name or repository ID.
23 :type repoid: str or int
24 :param remote_uri: Optional remote URI to pass in for push
25 :type remote_uri: str
26
27 Example output:
28
29 .. code-block:: bash
30
31 id : <id_given_in_input>
32 result : {
33 "msg": "Pushed to url `<remote_url>` on repo `<repository name>`"
34 "repository": "<repository name>"
35 }
36 error : null
37
38 Example error output:
39
40 .. code-block:: bash
41
42 id : <id_given_in_input>
43 result : null
44 error : {
45 "Unable to push changes to `<remote_url>`"
46 }
47
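As an illustration only (the host, endpoint and token below are assumptions,
not taken from this changeset), invoking this method through the JSON-RPC API
could look like:

.. code-block:: bash

    # hypothetical call; substitute your own host, auth token and repository
    curl -s -X POST https://rhodecode.example.com/_admin/api \
        -H "Content-Type: application/json" \
        -d '{"id": 1,
             "auth_token": "<auth_token_with_admin_rights>",
             "method": "push",
             "args": {"repoid": "my-repo",
                      "remote_uri": "https://remote.example.com/my-repo"}}'
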
48
@@ -0,0 +1,112 b''
1 .. _config-ldap-groups-ref:
2
3 LDAP/AD With User Groups Sync
4 -----------------------------
5
6 |RCM| supports LDAP (Lightweight Directory Access Protocol) or
7 AD (Active Directory) authentication.
8 All LDAP versions are supported, with the following |RCM| plugins managing each:
9
10 * For LDAP/AD with user group sync use ``LDAP + User Groups (egg:rhodecode-enterprise-ee#ldap_group)``
11
12 RhodeCode reads all data defined in the plugin and creates corresponding
13 accounts in the local database after receiving data from LDAP. This is done on
14 every user log-in, including operations like pushing/pulling/checkout.
15 In addition, group membership is read from LDAP and the following operations are performed:
16
17 - automatic addition of the user to the matching |RCM| user groups
18 - automatic removal of the user from any other |RCM| user groups not specified in LDAP.
19 The removal is done *only* on groups that are marked to be synced from LDAP.
20 This setting can be changed in the advanced settings of a user group
21 - automatic creation of user groups if they do not yet exist in |RCM|
22 - marking the user as a super-admin if they are a member of any admin group defined in the plugin settings
23
24 This plugin is available only in EE Edition.
25
26 .. important::
27
28 The email used with your |RCE| super-admin account needs to match the email
29 address attached to your admin profile in LDAP. This is because
30 within |RCE| the user email needs to be unique, and multiple users
31 cannot share an email account.
32
33 Likewise, if as an admin you also have a user account, the email address
34 attached to the user account needs to be different.
35
36
37 LDAP Configuration Steps
38 ^^^^^^^^^^^^^^^^^^^^^^^^
39
40 To configure |LDAP|, use the following steps:
41
42 1. From the |RCM| interface, select
43 :menuselection:`Admin --> Authentication`
44 2. Enable the `LDAP + User Groups` plugin and select :guilabel:`Save`
45 3. Select the :guilabel:`Enabled` check box in the plugin configuration section
46 4. Add the required LDAP information and :guilabel:`Save`, for more details,
47 see :ref:`config-ldap-groups-examples`
48
49 For a more detailed description of LDAP objects, see :ref:`ldap-gloss-ref`:
50
51 .. _config-ldap-groups-examples:
52
53 Example LDAP configuration
54 ^^^^^^^^^^^^^^^^^^^^^^^^^^
55 .. code-block:: bash
56
57 # Auth Cache TTL. Defines the caching time, in seconds, for authentication to offload the LDAP server.
58 # This means that the cached result will be kept for 3600 seconds before contacting the LDAP server again to verify the user access
59 3600
60 # Host; a comma separated format can optionally be used to specify more than one server
61 https://ldap1.server.com/ldap-admin/,https://ldap2.server.com/ldap-admin/
62 # Default LDAP Port, use 636 for LDAPS
63 389
64 # Account, used for SimpleBind if the LDAP server requires authentication
65 e.g. admin@server.com
66 # Password used for simple bind
67 ldap-user-password
68 # LDAP connection security
69 LDAPS
70 # Certificate checks level
71 DEMAND
72 # Base DN
73 cn=Rufus Magillacuddy,ou=users,dc=rhodecode,dc=com
74 # User Search Base
75 ou=groups,ou=users
76 # LDAP search filter to narrow the results
77 (objectClass=person)
78 # LDAP search scope
79 SUBTREE
80 # Login attribute
81 sAMAccountName
82 # First Name Attribute to read
83 givenName
84 # Last Name Attribute to read
85 sn
86 # Email Attribute to read email address from
87 mail
88 # group extraction method
89 rfc2307bis
90 # Group search base
91 ou=RC-Groups
92 # Group Name Attribute, field to read the group name from
93 sAMAccountName
94 # User Member of Attribute, field in which groups are stored
95 memberOf
96 # LDAP Group Search Filter, allows narrowing the results
97
98 # Admin Groups. Comma separated list of groups. If a user is a member of
99 # any of those, they will be marked as super-admin in RhodeCode
100 admins, management
101
102
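As an illustration only (reusing the Base DN example above; the group DNs and
email are hypothetical), a directory entry matched by the `(objectClass=person)`
filter could look like the following. The `memberOf` values are what the plugin
reads to sync the user's |RCM| user groups:

.. code-block:: text

    dn: cn=Rufus Magillacuddy,ou=users,dc=rhodecode,dc=com
    objectClass: person
    sAMAccountName: rmagillacuddy
    givenName: Rufus
    sn: Magillacuddy
    mail: rufus@example.com
    memberOf: cn=developers,ou=RC-Groups,dc=rhodecode,dc=com
    memberOf: cn=admins,ou=RC-Groups,dc=rhodecode,dc=com
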
103 Below is an example setup that can be used with Active Directory and LDAP groups.
104
105 .. image:: ../images/ldap-groups-example.png
106 :alt: LDAP/AD setup example
107 :scale: 50 %
108
109 .. toctree::
110
111 ldap-active-directory
112 ldap-authentication No newline at end of file
1 NO CONTENT: new file 100644, binary diff hidden
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,139 b''
1 |RCE| 4.12.0 |RNS|
2 ------------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2018-04-24
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - Svn: added support for the RhodeCode integration framework. All integrations, like
14 Slack, email, and Jenkins, now also fully work for SVN.
15 - Integrations: added a new dedicated Jenkins integration with support for
16 CSRF authentication. Available in EE edition only.
17 - Automation: added new bi-directional remote sync. RhodeCode instances can now
18 automatically push to or pull from remote locations. This feature is powered
19 by the Scheduler introduced in the 4.11 release, which must be enabled for this feature to work.
20 Available in EE edition only.
21 - Mercurial: path-based permissions. RhodeCode can now use Mercurial's narrowhg
22 to implement path-based permissions. All permissions are read from .hg/hgacl.
23 Thanks to the great contribution from Sandu Turcan.
24 - VCS: added new diff caches. Available as an option under vcs settings.
25 Diff caches work on pull requests or individual commits for greater
26 performance and reduced memory usage. This feature increases the speed of large
27 pull requests significantly. In addition, for pull requests it allows
28 showing old closed pull requests even if commits from the source were removed,
29 further enhancing auditing capabilities.
30 - Audit: added a few new audit log entries, especially around changing permissions.
31 - LDAP: added connection pinning and a timeout option to the LDAP plugin. This should
32 prevent problems when the connection to LDAP is not stable, which caused RhodeCode
33 instances to freeze waiting on LDAP connections.
34 - User groups: expose public user group profiles. Allows other team members to see
35 the members of a user group, if they have the proper permissions.
36 - UI: show pull request page in quick nav menu on my account for quicker access.
37 - UI: hidden/outdated comments now have visible markers next to line numbers.
38 This allows access to them without showing all hidden comments.
39
40
41 General
42 ^^^^^^^
43
44 - Ssh: show the conflicting fingerprint when adding an already existing key.
45 This helps to track why adding a key failed.
46 - System info: added ulimit to system info. Hitting any of those limits causes
47 lots of problems, which is why it's important to show this.
48 - Repository settings: add hidden view to force re-install hooks.
49 Available under /{repo_name}/settings/advanced/hooks
50 - Integrations: Webhook now handles response errors and shows the response for
51 easier debugging.
52 - Cli: speed up CLI execution start by skipping auth plugin search/registry.
53 - SVN: added an example in the docs on how to enable path-based permissions.
54 - LDAP: enable connection recycling on LDAP plugin.
55 - Auth plugins: use a nicer visual display of auth plugins that
56 highlights that the order of enabled plugins matters.
57 - Events: expose shadow repo build url.
58 - Events: expose pull request title and uid in event data.
59 - API: enable setting sync flag for user groups on create/edit.
60 - API: update pull method to allow specifying the remote url
61 - Logging: improved consistency of auth plugins logs.
62 - Logging: improved log message when SSL is required
63 - Dependencies: bumped mercurial to 4.4 series
64 - Dependencies: bumped zope.cachedescriptors==4.3.1
65 - Dependencies: bumped zope.deprecation==4.3.0
66 - Dependencies: bumped zope.event==4.3.0
67 - Dependencies: bumped zope.interface==4.4.3
68 - Dependencies: bumped graphviz 0.8.2
69 - Dependencies: bumped to ipaddress 0.1.19
70 - Dependencies: bumped pyexpect to 4.3.1
71 - Dependencies: bumped ws4py to 0.4.3
72 - Dependencies: bumped bleach to 2.1.2
73 - Dependencies: bumped html5lib 1.0.1
74 - Dependencies: bumped greenlet to 0.4.13
75 - Dependencies: bumped markdown to 2.6.11
76 - Dependencies: bumped psutil to 5.4.3
77 - Dependencies: bumped beaker to 1.9.1
78 - Dependencies: bumped alembic to 0.6.8 release.
79 - Dependencies: bumped supervisor to 3.3.4
80 - Dependencies: bumped pyexpect to 4.4.0 and scandir to 1.7
81 - Dependencies: bumped appenlight client to 0.6.25
82 - Dependencies: don't require full mysql lib for the db driver.
83 Reduces installation package size by around 100MB.
84
85
86 Security
87 ^^^^^^^^
88
89 - My account: changing the email in my account now requires providing the user's
90 access password. This applies only to RhodeCode built-in accounts.
91 Prevents adding a recovery email by unauthorized users who gain
92 access to a user's logged-in session.
93 - Logging: fix leaking of tokens to logging.
94 - General: serialize the repo name in repo checks to prevent potential
95 html injections by providing a malformed url.
96
97
98 Performance
99 ^^^^^^^^^^^
100
101 - Diffs: don't use recursive diffset attachment in diffs, since it makes
102 this structure much harder to garbage collect. Reduces memory usage.
103 - Diff cache: added caching for better performance of large pull requests.
104
105
106 Fixes
107 ^^^^^
108
109 - Age helper: fix issues with proper timezone detection for certain timezones.
110 Fixes wrong age display in a few cases.
111 - API: added audit logs for user group related calls that were
112 accidentally missing.
113 - Diffs: fix and improve line selections and anchor links.
114 - Pull requests: fixed cases where default expected refs are closed or unavailable.
115 For Mercurial with a closed default branch, a compare across forks could fail.
116 - Core: properly report 502 errors for gevent and gunicorn.
117 Gevent with Gunicorn doesn't raise normal pycurl errors.
118 - Auth plugins: fixed problem with cache of settings in multi-worker mode.
119 The previous implementation had a bug that cached the settings in each class,
120 which prevented refreshing updated settings in multi-worker mode.
121 Only a restart of RhodeCode loaded the new settings.
122 - Audit logs: properly handle query syntax in the search field.
123 - Repositories: better handling of missing requirements errors for repositories.
124 - API: fixed problems with repository fork/create using celery backend.
125 - VCS settings: added a missing flash message on validation errors to prevent
126 overlooking some field input validation problems.
127
128
129 Upgrade notes
130 ^^^^^^^^^^^^^
131
132 - This release adds support for SVN hooks. This required lots of changes to how we
133 handle the SVN protocol. We did thorough tests for SVN compatibility.
134 Please be advised to check the behaviour of SVN repositories during this update.
135
136 - Diff caches are turned off by default for backward compatibility. However, we recommend
137 turning them on, either individually for bigger repositories or globally for every repository.
138 This setting can be found in admin > settings > vcs, or repository > settings > vcs.
139
@@ -0,0 +1,20 b''
1 diff -rup Beaker-1.9.1-orig/beaker/container.py Beaker-1.9.1/beaker/container.py
2 --- Beaker-1.9.1-orig/beaker/container.py 2018-04-10 10:23:04.000000000 +0200
3 +++ Beaker-1.9.1/beaker/container.py 2018-04-10 10:23:34.000000000 +0200
4 @@ -353,13 +353,13 @@ class Value(object):
5 debug("get_value returning old value while new one is created")
6 return value
7 else:
8 - debug("lock_creatfunc (didnt wait)")
9 + debug("lock_creatfunc `%s` (didnt wait)", self.createfunc.__name__)
10 has_createlock = True
11
12 if not has_createlock:
13 - debug("lock_createfunc (waiting)")
14 + debug("lock_createfunc `%s` (waiting)", self.createfunc.__name__)
15 creation_lock.acquire()
16 - debug("lock_createfunc (waited)")
17 + debug("lock_createfunc `%s` (waited)", self.createfunc.__name__)
18
19 try:
20 # see if someone created the value already
@@ -0,0 +1,46 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2018 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import logging
22
23 from pyramid.view import view_config
24
25 from rhodecode.apps._base import RepoAppView
26 from rhodecode.apps.repository.utils import get_default_reviewers_data
27 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
28
29 log = logging.getLogger(__name__)
30
31
32 class RepoAutomationView(RepoAppView):
33 def load_default_context(self):
34 c = self._get_local_tmpl_context()
35 return c
36
37 @LoginRequired()
38 @HasRepoPermissionAnyDecorator('repository.admin')
39 @view_config(
40 route_name='repo_automation', request_method='GET',
41 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
42 def repo_automation(self):
43 c = self.load_default_context()
44 c.active = 'automation'
45
46 return self._get_template_context(c)
@@ -0,0 +1,27 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2018 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21
22 def includeme(config):
23 config.add_route(
24 name='user_group_profile',
25 pattern='/_profile_user_group/{user_group_name}')
26 # Scan module for configuration decorators.
27 config.scan('.views', ignore='.tests')
@@ -0,0 +1,19 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2018 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
@@ -0,0 +1,76 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2010-2018 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 from rhodecode.model.user_group import UserGroupModel
21 from rhodecode.tests import (
22 TestController, TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
23 from rhodecode.tests.fixture import Fixture
24 from rhodecode.tests.utils import AssertResponse
25
26 fixture = Fixture()
27
28
29 def route_path(name, **kwargs):
30 return '/_profile_user_group/{user_group_name}'.format(**kwargs)
31
32
33 class TestUsersController(TestController):
34
35 def test_user_group_profile(self, user_util):
36 self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
37 user, usergroup = user_util.create_user_with_group()
38
39 response = self.app.get(route_path('profile_user_group', user_group_name=usergroup.users_group_name))
40 response.mustcontain(usergroup.users_group_name)
41 response.mustcontain(user.username)
42
43 def test_user_can_check_own_group(self, user_util):
44 user = user_util.create_user(
45 TEST_USER_REGULAR_LOGIN, password=TEST_USER_REGULAR_PASS, email='testme@rhodecode.org')
46 self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
47 usergroup = user_util.create_user_group(owner=user)
48 response = self.app.get(route_path('profile_user_group', user_group_name=usergroup.users_group_name))
49 response.mustcontain(usergroup.users_group_name)
50 response.mustcontain(user.username)
51
52 def test_user_can_not_check_other_group(self, user_util):
53 self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
54 user_group = user_util.create_user_group()
55 UserGroupModel().grant_user_permission(user_group, self._get_logged_user(), 'usergroup.none')
56 response = self.app.get(route_path('profile_user_group', user_group_name=user_group.users_group_name), status=404)
57 assert response.status_code == 404
58
59 def test_another_user_can_check_if_he_is_in_group(self, user_util):
60 self.log_user(TEST_USER_REGULAR_LOGIN, TEST_USER_REGULAR_PASS)
61 user = user_util.create_user(
62 'test-my-user', password='qweqwe', email='testme@rhodecode.org')
63 user_group = user_util.create_user_group()
64 UserGroupModel().add_user_to_group(user_group, user)
65 UserGroupModel().grant_user_permission(user_group, self._get_logged_user(), 'usergroup.read')
66 response = self.app.get(route_path('profile_user_group', user_group_name=user_group.users_group_name))
67 response.mustcontain(user_group.users_group_name)
68 response.mustcontain(user.username)
69
70 def test_with_anonymous_user(self, user_util):
71 user = user_util.create_user(
72 'test-my-user', password='qweqwe', email='testme@rhodecode.org')
73 user_group = user_util.create_user_group()
74 UserGroupModel().add_user_to_group(user_group, user)
75 response = self.app.get(route_path('profile_user_group', user_group_name=user_group.users_group_name), status=302)
76 assert response.status_code == 302 No newline at end of file
@@ -0,0 +1,53 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2018 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import logging
22
23 from pyramid.httpexceptions import HTTPNotFound
24 from pyramid.view import view_config
25
26 from rhodecode.apps._base import BaseAppView
27 from rhodecode.lib.auth import HasUserGroupPermissionAnyDecorator, LoginRequired, NotAnonymous
28 from rhodecode.model.db import UserGroup, User
29
30
31 log = logging.getLogger(__name__)
32
33
34 class UserGroupProfileView(BaseAppView):
35
36 @LoginRequired()
37 @NotAnonymous()
38 @HasUserGroupPermissionAnyDecorator('usergroup.read', 'usergroup.write', 'usergroup.admin',)
39 @view_config(
40 route_name='user_group_profile', request_method='GET',
41 renderer='rhodecode:templates/user_group/user_group.mako')
42 def user_group_profile(self):
43 c = self._get_local_tmpl_context()
44 c.active = 'profile'
45 self.db_user_group_name = self.request.matchdict.get('user_group_name')
46 c.user_group = UserGroup().get_by_group_name(self.db_user_group_name)
47 if not c.user_group:
48 raise HTTPNotFound()
49 group_members_obj = sorted((x.user for x in c.user_group.members),
50 key=lambda u: u.username.lower())
51 c.group_members = group_members_obj
52 c.anonymous = self._rhodecode_user.username == User.DEFAULT_USER
53 return self._get_template_context(c)
@@ -0,0 +1,39 b''
1 import logging
2
3 from sqlalchemy import *
4
5 from rhodecode.model import meta
6 from rhodecode.lib.dbmigrate.versions import _reset_base, notify
7
8 log = logging.getLogger(__name__)
9
10
11 def upgrade(migrate_engine):
12 """
13 Upgrade operations go here.
14 Don't create your own engine; bind migrate_engine to your metadata
15 """
16 _reset_base(migrate_engine)
17 from rhodecode.lib.dbmigrate.schema import db_4_11_0_0 as db
18
19 repository_table = db.Repository.__table__
20
21 push_uri = Column(
22 "push_uri", db.EncryptedTextValue(), nullable=True, unique=False,
23 default=None)
24
25 push_uri.create(table=repository_table)
26
27 # issue fixups
28 fixups(db, meta.Session)
29
30
31 def downgrade(migrate_engine):
32 meta = MetaData()
33 meta.bind = migrate_engine
34
35
36 def fixups(models, _SESSION):
37 pass
38
39
@@ -0,0 +1,1 b''
1 from pyramid.compat import configparser No newline at end of file
1 NO CONTENT: new file 100644, binary diff hidden
NO CONTENT: new file 100644, binary diff hidden
1 NO CONTENT: new file 100644, binary diff hidden
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,9 b''
1 <div class="panel panel-default">
2 <div class="panel-heading">
3 <h3 class="panel-title">${_('Repo Automation')}</h3>
4 </div>
5 <div class="panel-body">
6 <h4>${_('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')|n}</h4>
7 <img style="width: 100%; height: 100%" src="${h.asset('images/ee_features/repo_automation.png')}"/>
8 </div>
9 </div>
@@ -0,0 +1,9 b''
1 <div class="panel panel-default">
2 <div class="panel-heading">
3 <h3 class="panel-title">${_('Admin Automation')}</h3>
4 </div>
5 <div class="panel-body">
6 <h4>${_('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')|n}</h4>
7 <img style="width: 100%; height: 100%" src="${h.asset('images/ee_features/admin_automation.png')}"/>
8 </div>
9 </div>
@@ -0,0 +1,27 b''
1 <span tal:define="name name|field.name;
2 true_val true_val|field.widget.true_val;
3 css_class css_class|field.widget.css_class;
4 style style|field.widget.style;
5 oid oid|field.oid;
6 help_block help_block|field.widget.help_block|'';
7 "
8 tal:omit-tag="">
9
10 <div class="checkbox">
11 <input type="checkbox" name="${name}" value="${true_val}"
12 readonly="readonly" disabled="disabled"
13 id="${oid}"
14 tal:attributes="checked cstruct == true_val;
15 class css_class;
16 style style;"
17 />
18 <p tal:condition="help_block" class="help-block">${help_block}</p>
19 <label for="${field.oid}">
20 <span tal:condition="hasattr(field, 'schema') and hasattr(field.schema, 'label')"
21 tal:replace="field.schema.label" class="checkbox-label" >
22 </span>
23
24 </label>
25 </div>
26
27 </span> No newline at end of file
@@ -0,0 +1,70 b''
1 <%namespace name="base" file="/base/base.mako"/>
2
3 <div class="panel panel-default user-profile">
4 <div class="panel-heading">
5 <h3 class="panel-title">${_('User group profile')}</h3>
6 %if h.HasPermissionAny('hg.admin')():
7 ${h.link_to(_('Edit'), h.route_path('edit_user_group', user_group_id=c.user_group.users_group_id), class_='panel-edit')}
8 %endif
9 </div>
10
11 <div class="panel-body user-profile-content">
12
13 <div class="fieldset">
14 <div class="left-label">
15 ${_('Group Name')}:
16 </div>
17 <div class="right-content">
18 ${c.user_group.users_group_name}
19 </div>
20 </div>
21 <div class="fieldset">
22 <div class="left-label">
23 ${_('Owner')}:
24 </div>
25 <div class="group_member">
26 ${base.gravatar(c.user_group.user.email, 16)}
27 <span class="username user">${h.link_to_user(c.user_group.user)}</span>
28
29 </div>
30 </div>
31 <div class="fieldset">
32 <div class="left-label">
33 ${_('Active')}:
34 </div>
35 <div class="right-content">
36 ${c.user_group.users_group_active}
37 </div>
38 </div>
39 % if not c.anonymous:
40 <div class="fieldset">
41 <div class="left-label">
42 ${_('Members')}:
43 </div>
44 <div class="right-content">
45 <table id="group_members_placeholder" class="rctable group_members">
46 <th>${_('Username')}</th>
47 % if c.group_members:
48 % for user in c.group_members:
49 <tr>
50 <td id="member_user_${user.user_id}" class="td-author">
51 <div class="group_member">
52 ${base.gravatar(user.email, 16)}
53 <span class="username user">${h.link_to(h.person(user), h.route_path('user_edit',user_id=user.user_id))}</span>
54 <input type="hidden" name="__start__" value="member:mapping">
55 <input type="hidden" name="member_user_id" value="${user.user_id}">
56 <input type="hidden" name="type" value="existing" id="member_${user.user_id}">
57 <input type="hidden" name="__end__" value="member:mapping">
58 </div>
59 </td>
60 </tr>
61 % endfor
62 % else:
63 <tr><td colspan="2">${_('No members yet')}</td></tr>
64 % endif
65 </table>
66 </div>
67 </div>
68 % endif
69 </div>
70 </div> No newline at end of file
@@ -0,0 +1,46 b''
1 <%inherit file="/base/base.mako"/>
2
3 <%def name="title()">
4 ${_('User group')}: ${c.user_group.users_group_name}
5 %if c.rhodecode_name:
6 &middot; ${h.branding(c.rhodecode_name)}
7 %endif
8 </%def>
9
10 <%def name="breadcrumbs_links()">
11 ${_('User group')}: ${c.user_group.users_group_name}
12 </%def>
13
14 <%def name="menu_bar_nav()">
15 ${self.menu_items(active='my_account')}
16 </%def>
17
18 <%def name="main()">
19 <div class="box">
20 <div class="title">
21 ${self.breadcrumbs()}
22 </div>
23
24 <div class="sidebar-col-wrapper scw-small">
25 ##main
26 <div class="sidebar">
27 <ul class="nav nav-pills nav-stacked">
28 <li class="${'active' if c.active=='profile' else ''}">
29 <a href="${h.route_path('user_group_profile', user_group_name=c.user_group.users_group_name)}">${_('User Group Profile')}</a></li>
30 ## These placeholders are here only for styling purposes. For every new item added to the list, you should remove one placeholder
31 <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li>
32 <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li>
33 <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li>
34 <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li>
35 <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li>
36 <li class="placeholder"><a href="#" style="visibility: hidden;">placeholder</a></li>
37 </ul>
38 </div>
39
40 <div class="main-content-full-width">
41 <%include file="/user_group/${c.active}.mako"/>
42 </div>
43 </div>
44 </div>
45
46 </%def>
@@ -1,5 +1,5 b''
1 [bumpversion]
1 [bumpversion]
2 current_version = 4.11.6
2 current_version = 4.12.0
3 message = release: Bump version {current_version} to {new_version}
3 message = release: Bump version {current_version} to {new_version}
4
4
5 [bumpversion:file:rhodecode/VERSION]
5 [bumpversion:file:rhodecode/VERSION]
@@ -5,25 +5,20 b' done = false'
5 done = true
5 done = true
6
6
7 [task:rc_tools_pinned]
7 [task:rc_tools_pinned]
8 done = true
9
8
10 [task:fixes_on_stable]
9 [task:fixes_on_stable]
11 done = true
12
10
13 [task:pip2nix_generated]
11 [task:pip2nix_generated]
14 done = true
15
12
16 [task:changelog_updated]
13 [task:changelog_updated]
17 done = true
18
14
19 [task:generate_api_docs]
15 [task:generate_api_docs]
20 done = true
16
17 [task:updated_translation]
21
18
22 [release]
19 [release]
23 state = prepared
20 state = in_progress
24 version = 4.11.6
21 version = 4.12.0
25
26 [task:updated_translation]
27
22
28 [task:generate_js_routes]
23 [task:generate_js_routes]
29
24
@@ -16,9 +16,6 b' recursive-include configs *'
16 # translations
16 # translations
17 recursive-include rhodecode/i18n *
17 recursive-include rhodecode/i18n *
18
18
19 # hook templates
20 recursive-include rhodecode/config/hook_templates *
21
22 # non-python core stuff
19 # non-python core stuff
23 recursive-include rhodecode *.cfg
20 recursive-include rhodecode *.cfg
24 recursive-include rhodecode *.json
21 recursive-include rhodecode *.json
@@ -190,9 +190,6 b' force_https = false'
190 ## use Strict-Transport-Security headers
190 ## use Strict-Transport-Security headers
191 use_htsts = false
191 use_htsts = false
192
192
193 ## number of commits stats will parse on each iteration
194 commit_parse_limit = 25
195
196 ## git rev filter option, --all is the default filter, if you need to
193 ## git rev filter option, --all is the default filter, if you need to
197 ## hide all refs in changelog switch this to --branches --tags
194 ## hide all refs in changelog switch this to --branches --tags
198 git_rev_filter = --branches --tags
195 git_rev_filter = --branches --tags
@@ -708,12 +705,12 b' formatter = color_formatter_sql'
708
705
709 [formatter_generic]
706 [formatter_generic]
710 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
707 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
711 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
708 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
712 datefmt = %Y-%m-%d %H:%M:%S
709 datefmt = %Y-%m-%d %H:%M:%S
713
710
714 [formatter_color_formatter]
711 [formatter_color_formatter]
715 class = rhodecode.lib.logging_formatter.ColorFormatter
712 class = rhodecode.lib.logging_formatter.ColorFormatter
716 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
713 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
717 datefmt = %Y-%m-%d %H:%M:%S
714 datefmt = %Y-%m-%d %H:%M:%S
718
715
719 [formatter_color_formatter_sql]
716 [formatter_color_formatter_sql]
@@ -55,12 +55,13 b' keepalive = 2'
55
55
56 # SERVER MECHANICS
56 # SERVER MECHANICS
57 # None == system temp dir
57 # None == system temp dir
58 # worker_tmp_dir is recommended to be set to some tmpfs
58 worker_tmp_dir = None
59 worker_tmp_dir = None
59 tmp_upload_dir = None
60 tmp_upload_dir = None
60
61
61 # Custom log format
62 # Custom log format
62 access_log_format = (
63 access_log_format = (
63 '%(t)s GNCRN %(p)-8s %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"')
64 '%(t)s [%(p)-8s] GNCRN %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"')
64
65
65 # self adjust workers based on CPU count
66 # self adjust workers based on CPU count
66 # workers = multiprocessing.cpu_count() * 2 + 1
67 # workers = multiprocessing.cpu_count() * 2 + 1
@@ -119,16 +120,16 b' def child_exit(server, worker):'
119
120
120
121
121 def pre_request(worker, req):
122 def pre_request(worker, req):
122 return
123 worker.start_time = time.time()
123 worker.log.debug("[<%-10s>] PRE WORKER: %s %s",
124 worker.log.debug(
124 worker.pid, req.method, req.path)
125 "GNCRN PRE WORKER [cnt:%s]: %s %s", worker.nr, req.method, req.path)
125
126
126
127
127 def post_request(worker, req, environ, resp):
128 def post_request(worker, req, environ, resp):
128 return
129 total_time = time.time() - worker.start_time
129 worker.log.debug("[<%-10s>] POST WORKER: %s %s resp: %s", worker.pid,
130 worker.log.debug(
130 req.method, req.path, resp.status_code)
131 "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.3fs",
131
132 worker.nr, req.method, req.path, resp.status_code, total_time)
132
133
133
134
134 class RhodeCodeLogger(Logger):
135 class RhodeCodeLogger(Logger):
@@ -165,9 +165,6 b' force_https = false'
165 ## use Strict-Transport-Security headers
165 ## use Strict-Transport-Security headers
166 use_htsts = false
166 use_htsts = false
167
167
168 ## number of commits stats will parse on each iteration
169 commit_parse_limit = 25
170
171 ## git rev filter option, --all is the default filter, if you need to
168 ## git rev filter option, --all is the default filter, if you need to
172 ## hide all refs in changelog switch this to --branches --tags
169 ## hide all refs in changelog switch this to --branches --tags
173 git_rev_filter = --branches --tags
170 git_rev_filter = --branches --tags
@@ -678,12 +675,12 b' formatter = generic'
678
675
679 [formatter_generic]
676 [formatter_generic]
680 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
677 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
681 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
678 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
682 datefmt = %Y-%m-%d %H:%M:%S
679 datefmt = %Y-%m-%d %H:%M:%S
683
680
684 [formatter_color_formatter]
681 [formatter_color_formatter]
685 class = rhodecode.lib.logging_formatter.ColorFormatter
682 class = rhodecode.lib.logging_formatter.ColorFormatter
686 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
683 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
687 datefmt = %Y-%m-%d %H:%M:%S
684 datefmt = %Y-%m-%d %H:%M:%S
688
685
689 [formatter_color_formatter_sql]
686 [formatter_color_formatter_sql]
@@ -1,4 +1,4 b''
1 .. _sec-your-server:
1 .. _sec-sophos-umc:
2
2
3 Securing Your Server via Sophos UTM 9
3 Securing Your Server via Sophos UTM 9
4 -------------------------------------
4 -------------------------------------
@@ -19,6 +19,7 b' The following are the most common system'
19 config-files-overview
19 config-files-overview
20 vcs-server
20 vcs-server
21 svn-http
21 svn-http
22 svn-path-permissions
22 gunicorn-ssl-support
23 gunicorn-ssl-support
23 apache-config
24 apache-config
24 nginx-config
25 nginx-config
@@ -196,6 +196,7 b' are not required in args.'
196 .. --- API DEFS MARKER ---
196 .. --- API DEFS MARKER ---
197 .. toctree::
197 .. toctree::
198
198
199 methods/views
199 methods/license-methods
200 methods/license-methods
200 methods/deprecated-methods
201 methods/deprecated-methods
201 methods/gist-methods
202 methods/gist-methods
@@ -66,7 +66,7 b' comment_commit'
66 create_repo
66 create_repo
67 -----------
67 -----------
68
68
69 .. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>)
69 .. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>)
70
70
71 Creates a repository.
71 Creates a repository.
72
72
@@ -95,6 +95,8 b' create_repo'
95 :type private: bool
95 :type private: bool
96 :param clone_uri: set clone_uri
96 :param clone_uri: set clone_uri
97 :type clone_uri: str
97 :type clone_uri: str
98 :param push_uri: set push_uri
99 :type push_uri: str
98 :param landing_rev: <rev_type>:<rev>
100 :param landing_rev: <rev_type>:<rev>
99 :type landing_rev: str
101 :type landing_rev: str
100 :param enable_locking:
102 :param enable_locking:
@@ -789,7 +791,7 b' maintenance'
789 pull
791 pull
790 ----
792 ----
791
793
792 .. py:function:: pull(apiuser, repoid)
794 .. py:function:: pull(apiuser, repoid, remote_uri=<Optional:None>)
793
795
794 Triggers a pull on the given repository from a remote location. You
796 Triggers a pull on the given repository from a remote location. You
795 can use this to keep remote repositories up-to-date.
797 can use this to keep remote repositories up-to-date.
@@ -804,6 +806,8 b' pull'
804 :type apiuser: AuthUser
806 :type apiuser: AuthUser
805 :param repoid: The repository name or repository ID.
807 :param repoid: The repository name or repository ID.
806 :type repoid: str or int
808 :type repoid: str or int
809 :param remote_uri: Optional remote URI to pass in for pull
810 :type remote_uri: str
807
811
808 Example output:
812 Example output:
809
813
@@ -811,7 +815,7 b' pull'
811
815
812 id : <id_given_in_input>
816 id : <id_given_in_input>
813 result : {
817 result : {
814 "msg": "Pulled from `<repository name>`"
818 "msg": "Pulled from url `<remote_url>` on repo `<repository name>`"
815 "repository": "<repository name>"
819 "repository": "<repository name>"
816 }
820 }
817 error : null
821 error : null
@@ -823,7 +827,7 b' pull'
823 id : <id_given_in_input>
827 id : <id_given_in_input>
824 result : null
828 result : null
825 error : {
829 error : {
826 "Unable to pull changes from `<reponame>`"
830 "Unable to push changes from `<remote_url>`"
827 }
831 }
828
832
829
833
@@ -976,7 +980,7 b' strip'
976 update_repo
980 update_repo
977 -----------
981 -----------
978
982
979 .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>)
983 .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>)
980
984
981 Updates a repository with the given information.
985 Updates a repository with the given information.
982
986
@@ -51,7 +51,7 b' add_user_to_user_group'
51 create_user_group
51 create_user_group
52 -----------------
52 -----------------
53
53
54 .. py:function:: create_user_group(apiuser, group_name, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, active=<Optional:True>)
54 .. py:function:: create_user_group(apiuser, group_name, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, active=<Optional:True>, sync=<Optional:None>)
55
55
56 Creates a new user group.
56 Creates a new user group.
57
57
@@ -71,6 +71,9 b' create_user_group'
71 :type owner: Optional(str or int)
71 :type owner: Optional(str or int)
72 :param active: Set this group as active.
72 :param active: Set this group as active.
73 :type active: Optional(``True`` | ``False``)
73 :type active: Optional(``True`` | ``False``)
74 :param sync: Set enabled or disabled the automatically sync from
75 external authentication types like ldap.
76 :type sync: Optional(``True`` | ``False``)
74
77
75 Example output:
78 Example output:
76
79
@@ -368,7 +371,7 b' revoke_user_permission_from_user_group'
368 update_user_group
371 update_user_group
369 -----------------
372 -----------------
370
373
371 .. py:function:: update_user_group(apiuser, usergroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:None>, active=<Optional:True>)
374 .. py:function:: update_user_group(apiuser, usergroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:None>, active=<Optional:True>, sync=<Optional:None>)
372
375
373 Updates the specified `user group` with the details provided.
376 Updates the specified `user group` with the details provided.
374
377
@@ -387,6 +390,9 b' update_user_group'
387 :type owner: Optional(str or int)
390 :type owner: Optional(str or int)
388 :param active: Set the group as active.
391 :param active: Set the group as active.
389 :type active: Optional(``True`` | ``False``)
392 :type active: Optional(``True`` | ``False``)
393 :param sync: Set enabled or disabled the automatically sync from
394 external authentication types like ldap.
395 :type sync: Optional(``True`` | ``False``)
390
396
391 Example output:
397 Example output:
392
398
1 NO CONTENT: file renamed from docs/auth/crowd-auth.rst to docs/auth/auth-crowd.rst
NO CONTENT: file renamed from docs/auth/crowd-auth.rst to docs/auth/auth-crowd.rst
@@ -1,14 +1,17 b''
1 .. _config-ldap-ref:
1 .. _config-ldap-ref:
2
2
3 LDAP
3 LDAP/AD
4 ----
4 -------
5
5
6 |RCM| supports LDAP (Lightweight Directory Access Protocol) or
6 |RCM| supports LDAP (Lightweight Directory Access Protocol) or
7 AD (active Directory) authentication.
7 AD (active Directory) authentication.
8 All LDAP versions are supported, with the following |RCM| plugins managing each:
8 All LDAP versions are supported, with the following |RCM| plugins managing each:
9
9
10 * For LDAPv3 use ``LDAP (egg:rhodecode-enterprise-ce#ldap)``
10 * For LDAP or Active Directory use ``LDAP (egg:rhodecode-enterprise-ce#ldap)``
11 * For LDAPv3 with user group sync use ``LDAP + User Groups (egg:rhodecode-enterprise-ee#ldap_group)``
11
12 RhodeCode reads all data defined from plugin and creates corresponding
13 accounts on local database after receiving data from LDAP. This is done on
14 every user log-in including operations like pushing/pulling/checkout.
12
15
13
16
14 .. important::
17 .. important::
@@ -21,6 +24,7 b' All LDAP versions are supported, with th'
21 Likewise, if as an admin you also have a user account, the email address
24 Likewise, if as an admin you also have a user account, the email address
22 attached to the user account needs to be different.
25 attached to the user account needs to be different.
23
26
27
24 LDAP Configuration Steps
28 LDAP Configuration Steps
25 ^^^^^^^^^^^^^^^^^^^^^^^^
29 ^^^^^^^^^^^^^^^^^^^^^^^^
26
30
@@ -28,7 +32,7 b' To configure |LDAP|, use the following s'
28
32
29 1. From the |RCM| interface, select
33 1. From the |RCM| interface, select
30 :menuselection:`Admin --> Authentication`
34 :menuselection:`Admin --> Authentication`
31 2. Enable the required plugin and select :guilabel:`Save`
35 2. Enable the ldap plugin and select :guilabel:`Save`
32 3. Select the :guilabel:`Enabled` check box in the plugin configuration section
36 3. Select the :guilabel:`Enabled` check box in the plugin configuration section
33 4. Add the required LDAP information and :guilabel:`Save`, for more details,
37 4. Add the required LDAP information and :guilabel:`Save`, for more details,
34 see :ref:`config-ldap-examples`
38 see :ref:`config-ldap-examples`
@@ -41,15 +45,16 b' Example LDAP configuration'
41 ^^^^^^^^^^^^^^^^^^^^^^^^^^
45 ^^^^^^^^^^^^^^^^^^^^^^^^^^
42 .. code-block:: bash
46 .. code-block:: bash
43
47
44 # Auth Cache TTL
48 # Auth Cache TTL, Defines the caching for authentication to offload LDAP server.
49 # This means that cache result will be saved for 3600 before contacting LDAP server to verify the user access
45 3600
50 3600
46 # Host
51 # Host, comma seperated format is optionally possible to specify more than 1 server
47 https://ldap1.server.com/ldap-admin/,https://ldap2.server.com/ldap-admin/
52 https://ldap1.server.com/ldap-admin/,https://ldap2.server.com/ldap-admin/
48 # Port
53 # Default LDAP Port, use 689 for LDAPS
49 389
54 389
50 # Account
55 # Account, used for SimpleBind if LDAP server requires an authentication
51 cn=admin,dc=rhodecode,dc=com
56 e.g admin@server.com
52 # Password
57 # Password used for simple bind
53 ldap-user-password
58 ldap-user-password
54 # LDAP connection security
59 # LDAP connection security
55 LDAPS
60 LDAPS
@@ -57,32 +62,26 b' Example LDAP configuration'
57 DEMAND
62 DEMAND
58 # Base DN
63 # Base DN
59 cn=Rufus Magillacuddy,ou=users,dc=rhodecode,dc=com
64 cn=Rufus Magillacuddy,ou=users,dc=rhodecode,dc=com
60 # User Search Base
65 # LDAP search filter to narrow the results
61 ou=groups,ou=users
62 # LDAP search filter
63 (objectClass=person)
66 (objectClass=person)
64 # LDAP search scope
67 # LDAP search scope
65 SUBTREE
68 SUBTREE
66 # Login attribute
69 # Login attribute
67 rmagillacuddy
70 sAMAccountName
68 # First Name Attribute
71 # First Name Attribute to read
69 Rufus
72 givenName
70 # Last Name Attribute
73 # Last Name Attribute to read
71 Magillacuddy
74 sn
72 # Email Attribute
75 # Email Attribute to read email address from
73 LDAP-Registered@email.ac
76 mail
74 # User Member of Attribute
77
75 Organizational Role
78
76 # Group search base
79 Below is example setup that can be used with Active Directory/LDAP server.
77 cn=users,ou=groups,dc=rhodecode,dc=com
80
78 # LDAP Group Search Filter
81 .. image:: ../images/ldap-example.png
79 (objectclass=posixGroup)
82 :alt: LDAP/AD setup example
80 # Group Name Attribute
83 :scale: 50 %
81 users
84
82 # Group Member Of Attribute
83 cn
84 # Admin Groups
85 admin,devops,qa
86
85
87 .. toctree::
86 .. toctree::
88
87
1 NO CONTENT: file renamed from docs/auth/pam-auth.rst to docs/auth/auth-pam.rst
NO CONTENT: file renamed from docs/auth/pam-auth.rst to docs/auth/auth-pam.rst
1 NO CONTENT: file renamed from docs/auth/token-auth.rst to docs/auth/auth-token.rst
NO CONTENT: file renamed from docs/auth/token-auth.rst to docs/auth/auth-token.rst
@@ -3,35 +3,30 b''
3 Authentication Options
3 Authentication Options
4 ======================
4 ======================
5
5
6 |RCE| provides a built in authentication plugin
6 |RCE| provides a built in authentication against its own database. This is
7 ``rhodecode.lib.auth_rhodecode``. This is enabled by default and accessed
7 implemented using ``rhodecode.lib.auth_rhodecode`` plugin. This plugin is
8 through the administrative interface. Additionally,
8 enabled by default.
9 |RCE| provides a Pluggable Authentication System (PAS). This gives the
9 Additionally, |RCE| provides a Pluggable Authentication System. This gives the
10 administrator greater control over how users authenticate with the system.
10 administrator greater control over how users authenticate with the system.
11
11
12 .. important::
12 .. important::
13
13
14 You can disable the built in |RCM| authentication plugin
14 You can disable the built in |RCM| authentication plugin
15 ``rhodecode.lib.auth_rhodecode`` and force all authentication to go
15 ``rhodecode.lib.auth_rhodecode`` and force all authentication to go
16 through your authentication plugin. However, if you do this,
16 through your authentication plugin of choice, e.g. LDAP only.
17 and your external authentication tools fails, you will be unable to
17 However, if you do this, and your external authentication tools fails,
18 access |RCM|.
18 you will be unable to access |RCM|.
19
19
20 |RCM| comes with the following user authentication management plugins:
20 |RCM| comes with the following user authentication management plugins:
21
21
22 .. only:: latex
23
24 * :ref:`config-ldap-ref`
25 * :ref:`config-pam-ref`
26 * :ref:`config-crowd-ref`
27 * :ref:`config-token-ref`
28
22
29 .. toctree::
23 .. toctree::
30
24
31 ldap-config-steps
25 auth-ldap
32 crowd-auth
26 auth-ldap-groups
33 pam-auth
27 auth-crowd
34 token-auth
28 auth-pam
29 auth-token
35 ssh-connection
30 ssh-connection
36
31
37
32
@@ -34,7 +34,7 b' import common'
34 extensions = [
34 extensions = [
35 'sphinx.ext.intersphinx',
35 'sphinx.ext.intersphinx',
36 'sphinx.ext.todo',
36 'sphinx.ext.todo',
37 'sphinx.ext.pngmath'
37 'sphinx.ext.imgmath'
38 ]
38 ]
39
39
40 intersphinx_mapping = {
40 intersphinx_mapping = {
@@ -104,7 +104,6 b' exclude_patterns = ['
104
104
105 # Other RST files
105 # Other RST files
106 'admin/rhodecode-backup.rst',
106 'admin/rhodecode-backup.rst',
107 'auth/ldap-configuration-example.rst',
108 'issue-trackers/redmine.rst',
107 'issue-trackers/redmine.rst',
109 'known-issues/error-msg-guide.rst',
108 'known-issues/error-msg-guide.rst',
110 'tutorials/docs-build.rst',
109 'tutorials/docs-build.rst',
@@ -34,6 +34,13 b' following commands:'
34
34
35 Update your channels frequently by running ``nix-channel --update``.
35 Update your channels frequently by running ``nix-channel --update``.
36
36
37 .. note::
38
39 To uninstall nix run the following:
40
41 remove the . "$HOME/.nix-profile/etc/profile.d/nix.sh" line in your ~/.profile or ~/.bash_profile
42 rm -rf $HOME/{.nix-channels,.nix-defexpr,.nix-profile,.config/nixpkgs}
43 sudo rm -rf /nix
37
44
38 Switch nix to the latest STABLE channel
45 Switch nix to the latest STABLE channel
39 ---------------------------------------
46 ---------------------------------------
@@ -169,17 +169,6 b' let'
169 };
169 };
170 };
170 };
171
171
172 setuptools = buildPythonPackage {
173 name = "setuptools-36.6.0";
174 buildInputs = [];
175 doCheck = false;
176 propagatedBuildInputs = [];
177 src = fetchurl {
178 url = "https://pypi.python.org/packages/45/29/8814bf414e7cd1031e1a3c8a4169218376e284ea2553cc0822a6ea1c2d78/setuptools-36.6.0.zip";
179 md5 = "74663b15117d9a2cc5295d76011e6fd1";
180 };
181 };
182
183 six = buildPythonPackage {
172 six = buildPythonPackage {
184 name = "six-1.11.0";
173 name = "six-1.11.0";
185 buildInputs = [];
174 buildInputs = [];
@@ -259,6 +248,9 b' let'
259
248
260
249
261 };
250 };
251 # Avoid that setuptools is replaced, this leads to trouble
252 # with buildPythonPackage.
253 setuptools = pkgs.python27Packages.setuptools;
262
254
263 in python.buildEnv.override {
255 in python.buildEnv.override {
264 inherit python;
256 inherit python;
@@ -9,6 +9,7 b' Release Notes'
9 .. toctree::
9 .. toctree::
10 :maxdepth: 1
10 :maxdepth: 1
11
11
12 release-notes-4.12.0.rst
12 release-notes-4.11.6.rst
13 release-notes-4.11.6.rst
13 release-notes-4.11.5.rst
14 release-notes-4.11.5.rst
14 release-notes-4.11.4.rst
15 release-notes-4.11.4.rst
@@ -88,7 +88,7 b' 2. Once downloaded to your computer, tra'
88
88
89 The |RCC| installer is now on your server, and you can read the full
89 The |RCC| installer is now on your server, and you can read the full
90 instructions here
90 instructions here
91 :ref:`Install RhodeCode Control <control:rcc-install-ref>`,
91 :ref:`Install RhodeCode Control <control:rcc-linux-ref>` ,
92 but below is the example shortcut.
92 but below is the example shortcut.
93
93
94 .. code-block:: bash
94 .. code-block:: bash
@@ -25,6 +25,12 b' self: super: {'
25 };
25 };
26 });
26 });
27
27
28 Beaker = super.Beaker.override (attrs: {
29 patches = [
30 ./patch-beaker-lock-func-debug.diff
31 ];
32 });
33
28 future = super.future.override (attrs: {
34 future = super.future.override (attrs: {
29 meta = {
35 meta = {
30 license = [ pkgs.lib.licenses.mit ];
36 license = [ pkgs.lib.licenses.mit ];
@@ -82,7 +88,7 b' self: super: {'
82 pkgs.openssl
88 pkgs.openssl
83 ];
89 ];
84 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
90 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
85 pkgs.mysql.lib
91 pkgs.libmysql
86 pkgs.zlib
92 pkgs.zlib
87 ];
93 ];
88 });
94 });
@@ -16,13 +16,13 b''
16 };
16 };
17 };
17 };
18 Beaker = super.buildPythonPackage {
18 Beaker = super.buildPythonPackage {
19 name = "Beaker-1.9.0";
19 name = "Beaker-1.9.1";
20 buildInputs = with self; [];
20 buildInputs = with self; [];
21 doCheck = false;
21 doCheck = false;
22 propagatedBuildInputs = with self; [funcsigs];
22 propagatedBuildInputs = with self; [funcsigs];
23 src = fetchurl {
23 src = fetchurl {
24 url = "https://pypi.python.org/packages/93/b2/12de6937b06e9615dbb3cb3a1c9af17f133f435bdef59f4ad42032b6eb49/Beaker-1.9.0.tar.gz";
24 url = "https://pypi.python.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
25 md5 = "38b3fcdfa24faf97c6cf66991eb54e9c";
25 md5 = "46fda0a164e2b0d24ccbda51a2310301";
26 };
26 };
27 meta = {
27 meta = {
28 license = [ pkgs.lib.licenses.bsdOriginal ];
28 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -94,13 +94,13 b''
94 };
94 };
95 };
95 };
96 Markdown = super.buildPythonPackage {
96 Markdown = super.buildPythonPackage {
97 name = "Markdown-2.6.9";
97 name = "Markdown-2.6.11";
98 buildInputs = with self; [];
98 buildInputs = with self; [];
99 doCheck = false;
99 doCheck = false;
100 propagatedBuildInputs = with self; [];
100 propagatedBuildInputs = with self; [];
101 src = fetchurl {
101 src = fetchurl {
102 url = "https://pypi.python.org/packages/29/82/dfe242bcfd9eec0e7bf93a80a8f8d8515a95b980c44f5c0b45606397a423/Markdown-2.6.9.tar.gz";
102 url = "https://pypi.python.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
103 md5 = "56547d362a9abcf30955b8950b08b5e3";
103 md5 = "a67c1b2914f7d74eeede2ebe0fdae470";
104 };
104 };
105 meta = {
105 meta = {
106 license = [ pkgs.lib.licenses.bsdOriginal ];
106 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -315,13 +315,13 b''
315 };
315 };
316 };
316 };
317 alembic = super.buildPythonPackage {
317 alembic = super.buildPythonPackage {
318 name = "alembic-0.9.6";
318 name = "alembic-0.9.8";
319 buildInputs = with self; [];
319 buildInputs = with self; [];
320 doCheck = false;
320 doCheck = false;
321 propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor python-dateutil];
321 propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor python-dateutil];
322 src = fetchurl {
322 src = fetchurl {
323 url = "https://pypi.python.org/packages/bf/b3/b28ea715824f8455635ece3c12f59d5d205f98cc378858e414e3aa6ebdbc/alembic-0.9.6.tar.gz";
323 url = "https://pypi.python.org/packages/a1/95/2252783859df9ec76b9a25d968c2827ed75a43ba34c6e8d38f87a5c0fb26/alembic-0.9.8.tar.gz";
324 md5 = "fcb096bccc87c8770bd07a04606cb989";
324 md5 = "5cfef58641c9a94d4a5d547e951a7dda";
325 };
325 };
326 meta = {
326 meta = {
327 license = [ pkgs.lib.licenses.mit ];
327 license = [ pkgs.lib.licenses.mit ];
@@ -341,13 +341,13 b''
341 };
341 };
342 };
342 };
343 appenlight-client = super.buildPythonPackage {
343 appenlight-client = super.buildPythonPackage {
344 name = "appenlight-client-0.6.22";
344 name = "appenlight-client-0.6.25";
345 buildInputs = with self; [];
345 buildInputs = with self; [];
346 doCheck = false;
346 doCheck = false;
347 propagatedBuildInputs = with self; [WebOb requests six];
347 propagatedBuildInputs = with self; [WebOb requests six];
348 src = fetchurl {
348 src = fetchurl {
349 url = "https://pypi.python.org/packages/73/37/0a64460fa9670b17c061adc433bc8be5079cba21e8b3a92d824adccb12bc/appenlight_client-0.6.22.tar.gz";
349 url = "https://pypi.python.org/packages/fa/44/2911ef85ea4f4fe65058fd22959d8dad598fab6a3c84e5bcb569d15c8783/appenlight_client-0.6.25.tar.gz";
350 md5 = "641afc114a9a3b3af4f75b11c70968ee";
350 md5 = "76dd2f9d42659fae8f290982078dc880";
351 };
351 };
352 meta = {
352 meta = {
353 license = [ pkgs.lib.licenses.bsdOriginal ];
353 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -406,13 +406,13 b''
406 };
406 };
407 };
407 };
408 bleach = super.buildPythonPackage {
408 bleach = super.buildPythonPackage {
409 name = "bleach-2.1.1";
409 name = "bleach-2.1.2";
410 buildInputs = with self; [];
410 buildInputs = with self; [];
411 doCheck = false;
411 doCheck = false;
412 propagatedBuildInputs = with self; [six html5lib];
412 propagatedBuildInputs = with self; [six html5lib];
413 src = fetchurl {
413 src = fetchurl {
414 url = "https://pypi.python.org/packages/d4/3f/d517089af35b01bb9bc4eac5ea04bae342b37a5e9abbb27b7c3ce0eae070/bleach-2.1.1.tar.gz";
414 url = "https://pypi.python.org/packages/b3/5f/0da670d30d3ffbc57cc97fa82947f81bbe3eab8d441e2d42e661f215baf2/bleach-2.1.2.tar.gz";
415 md5 = "7c5dfb1d66ea979b5a465afb12c82ec4";
415 md5 = "d0b14ae43a437ee0c650e04c6063eedd";
416 };
416 };
417 meta = {
417 meta = {
418 license = [ pkgs.lib.licenses.asl20 ];
418 license = [ pkgs.lib.licenses.asl20 ];
@@ -710,8 +710,8 b''
710 doCheck = false;
710 doCheck = false;
711 propagatedBuildInputs = with self; [];
711 propagatedBuildInputs = with self; [];
712 src = fetchurl {
712 src = fetchurl {
713 url = "https://pypi.python.org/packages/5e/1a/0aa2c8195a204a9f51284018562dea77e25511f02fe924fac202fc012172/functools32-3.2.3-2.zip";
713 url = "https://pypi.python.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
714 md5 = "d55232eb132ec779e6893c902a0bc5ad";
714 md5 = "09f24ffd9af9f6cd0f63cb9f4e23d4b2";
715 };
715 };
716 meta = {
716 meta = {
717 license = [ pkgs.lib.licenses.psfl ];
717 license = [ pkgs.lib.licenses.psfl ];
@@ -783,26 +783,26 b''
783 };
783 };
784 };
784 };
785 graphviz = super.buildPythonPackage {
785 graphviz = super.buildPythonPackage {
786 name = "graphviz-0.8.1";
786 name = "graphviz-0.8.2";
787 buildInputs = with self; [];
787 buildInputs = with self; [];
788 doCheck = false;
788 doCheck = false;
789 propagatedBuildInputs = with self; [];
789 propagatedBuildInputs = with self; [];
790 src = fetchurl {
790 src = fetchurl {
791 url = "https://pypi.python.org/packages/a9/a6/ee6721349489a2da6eedd3dba124f2b5ac15ee1e0a7bd4d3cfdc4fff0327/graphviz-0.8.1.zip";
791 url = "https://pypi.python.org/packages/fa/d1/63b62dee9e55368f60b5ea445e6afb361bb47e692fc27553f3672e16efb8/graphviz-0.8.2.zip";
792 md5 = "88d8efa88c02a735b3659fe0feaf0b96";
792 md5 = "50866e780f43e1cb0d073c70424fcaff";
793 };
793 };
794 meta = {
794 meta = {
795 license = [ pkgs.lib.licenses.mit ];
795 license = [ pkgs.lib.licenses.mit ];
796 };
796 };
797 };
797 };
798 greenlet = super.buildPythonPackage {
798 greenlet = super.buildPythonPackage {
799 name = "greenlet-0.4.12";
799 name = "greenlet-0.4.13";
800 buildInputs = with self; [];
800 buildInputs = with self; [];
801 doCheck = false;
801 doCheck = false;
802 propagatedBuildInputs = with self; [];
802 propagatedBuildInputs = with self; [];
803 src = fetchurl {
803 src = fetchurl {
804 url = "https://pypi.python.org/packages/be/76/82af375d98724054b7e273b5d9369346937324f9bcc20980b45b068ef0b0/greenlet-0.4.12.tar.gz";
804 url = "https://pypi.python.org/packages/13/de/ba92335e9e76040ca7274224942282a80d54f85e342a5e33c5277c7f87eb/greenlet-0.4.13.tar.gz";
805 md5 = "e8637647d58a26c4a1f51ca393e53c00";
805 md5 = "6e0b9dd5385f81d478451ec8ed1d62b3";
806 };
806 };
807 meta = {
807 meta = {
808 license = [ pkgs.lib.licenses.mit ];
808 license = [ pkgs.lib.licenses.mit ];
@@ -822,13 +822,13 b''
822 };
822 };
823 };
823 };
824 html5lib = super.buildPythonPackage {
824 html5lib = super.buildPythonPackage {
825 name = "html5lib-1.0b10";
825 name = "html5lib-1.0.1";
826 buildInputs = with self; [];
826 buildInputs = with self; [];
827 doCheck = false;
827 doCheck = false;
828 propagatedBuildInputs = with self; [six webencodings setuptools];
828 propagatedBuildInputs = with self; [six webencodings];
829 src = fetchurl {
829 src = fetchurl {
830 url = "https://pypi.python.org/packages/97/16/982214624095c1420c75f3bd295d9e658794aafb95fc075823de107e0ae4/html5lib-1.0b10.tar.gz";
830 url = "https://pypi.python.org/packages/85/3e/cf449cf1b5004e87510b9368e7a5f1acd8831c2d6691edd3c62a0823f98f/html5lib-1.0.1.tar.gz";
831 md5 = "5ada1243b7a863624b2f35245b2186e9";
831 md5 = "942a0688d6bdf20d087c9805c40182ad";
832 };
832 };
833 meta = {
833 meta = {
834 license = [ pkgs.lib.licenses.mit ];
834 license = [ pkgs.lib.licenses.mit ];
@@ -874,13 +874,13 b''
874 };
874 };
875 };
875 };
876 ipaddress = super.buildPythonPackage {
876 ipaddress = super.buildPythonPackage {
877 name = "ipaddress-1.0.18";
877 name = "ipaddress-1.0.19";
878 buildInputs = with self; [];
878 buildInputs = with self; [];
879 doCheck = false;
879 doCheck = false;
880 propagatedBuildInputs = with self; [];
880 propagatedBuildInputs = with self; [];
881 src = fetchurl {
881 src = fetchurl {
882 url = "https://pypi.python.org/packages/4e/13/774faf38b445d0b3a844b65747175b2e0500164b7c28d78e34987a5bfe06/ipaddress-1.0.18.tar.gz";
882 url = "https://pypi.python.org/packages/f0/ba/860a4a3e283456d6b7e2ab39ce5cf11a3490ee1a363652ac50abf9f0f5df/ipaddress-1.0.19.tar.gz";
883 md5 = "310c2dfd64eb6f0df44aa8c59f2334a7";
883 md5 = "d0687efaf93a32476d81e90ba0609c57";
884 };
884 };
885 meta = {
885 meta = {
886 license = [ pkgs.lib.licenses.psfl ];
886 license = [ pkgs.lib.licenses.psfl ];
@@ -1048,8 +1048,8 b''
1048 doCheck = false;
1048 doCheck = false;
1049 propagatedBuildInputs = with self; [];
1049 propagatedBuildInputs = with self; [];
1050 src = fetchurl {
1050 src = fetchurl {
1051 url = "https://pypi.python.org/packages/15/45/30273ee91feb60dabb8fbb2da7868520525f02cf910279b3047182feed80/mock-1.0.1.zip";
1051 url = "https://pypi.python.org/packages/a2/52/7edcd94f0afb721a2d559a5b9aae8af4f8f2c79bc63fdbe8a8a6c9b23bbe/mock-1.0.1.tar.gz";
1052 md5 = "869f08d003c289a97c1a6610faf5e913";
1052 md5 = "c3971991738caa55ec7c356bbc154ee2";
1053 };
1053 };
1054 meta = {
1054 meta = {
1055 license = [ pkgs.lib.licenses.bsdOriginal ];
1055 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -1160,13 +1160,13 b''
1160 };
1160 };
1161 };
1161 };
1162 pexpect = super.buildPythonPackage {
1162 pexpect = super.buildPythonPackage {
1163 name = "pexpect-4.3.0";
1163 name = "pexpect-4.4.0";
1164 buildInputs = with self; [];
1164 buildInputs = with self; [];
1165 doCheck = false;
1165 doCheck = false;
1166 propagatedBuildInputs = with self; [ptyprocess];
1166 propagatedBuildInputs = with self; [ptyprocess];
1167 src = fetchurl {
1167 src = fetchurl {
1168 url = "https://pypi.python.org/packages/f8/44/5466c30e49762bb92e442bbdf4472d6904608d211258eb3198a11f0309a4/pexpect-4.3.0.tar.gz";
1168 url = "https://pypi.python.org/packages/fa/c3/60c0cbf96f242d0b47a82e9ca634dcd6dcb043832cf05e17540812e1c707/pexpect-4.4.0.tar.gz";
1169 md5 = "047a486dcd26134b74f2e67046bb61a0";
1169 md5 = "e9b07f0765df8245ac72201d757baaef";
1170 };
1170 };
1171 meta = {
1171 meta = {
1172 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1172 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
@@ -1225,13 +1225,13 b''
1225 };
1225 };
1226 };
1226 };
1227 psutil = super.buildPythonPackage {
1227 psutil = super.buildPythonPackage {
1228 name = "psutil-5.4.0";
1228 name = "psutil-5.4.3";
1229 buildInputs = with self; [];
1229 buildInputs = with self; [];
1230 doCheck = false;
1230 doCheck = false;
1231 propagatedBuildInputs = with self; [];
1231 propagatedBuildInputs = with self; [];
1232 src = fetchurl {
1232 src = fetchurl {
1233 url = "https://pypi.python.org/packages/8d/96/1fc6468be91521192861966c40bd73fdf8b065eae6d82dd0f870b9825a65/psutil-5.4.0.tar.gz";
1233 url = "https://pypi.python.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz";
1234 md5 = "01af6219b1e8fcfd53603023967713bf";
1234 md5 = "3b291833dbea631db9d271aa602a169a";
1235 };
1235 };
1236 meta = {
1236 meta = {
1237 license = [ pkgs.lib.licenses.bsdOriginal ];
1237 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -1360,8 +1360,8 b''
1360 doCheck = false;
1360 doCheck = false;
1361 propagatedBuildInputs = with self; [];
1361 propagatedBuildInputs = with self; [];
1362 src = fetchurl {
1362 src = fetchurl {
1363 url = "https://pypi.python.org/packages/2e/26/e8fb5b4256a5f5036be7ce115ef8db8d06bc537becfbdc46c6af008314ee/pyparsing-1.5.7.zip";
1363 url = "https://pypi.python.org/packages/6f/2c/47457771c02a8ff0f302b695e094ec309e30452232bd79198ee94fda689f/pyparsing-1.5.7.tar.gz";
1364 md5 = "b86854857a368d6ccb4d5b6e76d0637f";
1364 md5 = "9be0fcdcc595199c646ab317c1d9a709";
1365 };
1365 };
1366 meta = {
1366 meta = {
1367 license = [ pkgs.lib.licenses.mit ];
1367 license = [ pkgs.lib.licenses.mit ];
@@ -1602,13 +1602,13 b''
1602 };
1602 };
1603 };
1603 };
1604 pytz = super.buildPythonPackage {
1604 pytz = super.buildPythonPackage {
1605 name = "pytz-2017.3";
1605 name = "pytz-2018.3";
1606 buildInputs = with self; [];
1606 buildInputs = with self; [];
1607 doCheck = false;
1607 doCheck = false;
1608 propagatedBuildInputs = with self; [];
1608 propagatedBuildInputs = with self; [];
1609 src = fetchurl {
1609 src = fetchurl {
1610 url = "https://pypi.python.org/packages/60/88/d3152c234da4b2a1f7a989f89609ea488225eaea015bc16fbde2b3fdfefa/pytz-2017.3.zip";
1610 url = "https://pypi.python.org/packages/1b/50/4cdc62fc0753595fc16c8f722a89740f487c6e5670c644eb8983946777be/pytz-2018.3.tar.gz";
1611 md5 = "7006b56c0d68a162d9fe57d4249c3171";
1611 md5 = "abb07c09c79f78d7c04f222a550c99ef";
1612 };
1612 };
1613 meta = {
1613 meta = {
1614 license = [ pkgs.lib.licenses.mit ];
1614 license = [ pkgs.lib.licenses.mit ];
@@ -1680,10 +1680,10 b''
1680 };
1680 };
1681 };
1681 };
1682 rhodecode-enterprise-ce = super.buildPythonPackage {
1682 rhodecode-enterprise-ce = super.buildPythonPackage {
1683 name = "rhodecode-enterprise-ce-4.11.6";
1683 name = "rhodecode-enterprise-ce-4.12.0";
1684 buildInputs = with self; [pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage configobj];
1684 buildInputs = with self; [pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage configobj];
1685 doCheck = true;
1685 doCheck = true;
1686 propagatedBuildInputs = with self; [setuptools-scm amqp authomatic Babel Beaker celery Chameleon channelstream click colander configobj cssselect decorator deform docutils dogpile.cache dogpile.core ecdsa FormEncode future futures gnureadline infrae.cache iso8601 itsdangerous Jinja2 billiard kombu lxml Mako Markdown MarkupSafe msgpack-python MySQL-python objgraph packaging Paste PasteDeploy PasteScript pathlib2 peppercorn psutil psycopg2 py-bcrypt pycrypto pycurl pyflakes pygments-markdown-lexer Pygments pyparsing pyramid-beaker pyramid-debugtoolbar pyramid-jinja2 pyramid-mako pyramid pysqlite python-dateutil python-ldap python-memcached python-pam pytz pyzmq py-gfm recaptcha-client redis repoze.lru requests Routes setproctitle simplejson six SQLAlchemy sshpubkeys subprocess32 supervisor Tempita translationstring trollius urllib3 URLObject venusian WebError WebHelpers2 WebHelpers WebOb Whoosh wsgiref zope.cachedescriptors zope.deprecation zope.event zope.interface nbconvert bleach nbformat jupyter-client alembic invoke bumpversion transifex-client gevent greenlet gunicorn waitress uWSGI ipdb ipython CProfileV bottle rhodecode-tools appenlight-client pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage];
1686 propagatedBuildInputs = with self; [setuptools-scm amqp authomatic Babel Beaker celery Chameleon channelstream click colander configobj cssselect decorator deform docutils dogpile.cache dogpile.core ecdsa FormEncode future futures gnureadline infrae.cache iso8601 itsdangerous Jinja2 billiard kombu lxml Mako Markdown MarkupSafe msgpack-python MySQL-python objgraph packaging Paste PasteDeploy PasteScript pathlib2 peppercorn psutil psycopg2 py-bcrypt pycrypto pycurl pyflakes pygments-markdown-lexer Pygments pyparsing pyramid-beaker pyramid-debugtoolbar pyramid-jinja2 pyramid-mako pyramid pysqlite python-dateutil python-ldap python-memcached python-pam pytz tzlocal pyzmq py-gfm recaptcha-client redis repoze.lru requests Routes setproctitle simplejson six SQLAlchemy sshpubkeys subprocess32 supervisor Tempita translationstring trollius urllib3 URLObject venusian WebError WebHelpers2 WebHelpers WebOb Whoosh wsgiref zope.cachedescriptors zope.deprecation zope.event zope.interface nbconvert bleach nbformat jupyter-client alembic invoke bumpversion transifex-client gevent greenlet gunicorn waitress uWSGI ipdb ipython CProfileV bottle rhodecode-tools appenlight-client pytest py pytest-cov pytest-sugar pytest-runner pytest-catchlog pytest-profiling gprof2dot pytest-timeout mock WebTest cov-core coverage];
1687 src = ./.;
1687 src = ./.;
1688 meta = {
1688 meta = {
1689 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
1689 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
@@ -1703,13 +1703,13 b''
1703 };
1703 };
1704 };
1704 };
1705 scandir = super.buildPythonPackage {
1705 scandir = super.buildPythonPackage {
1706 name = "scandir-1.6";
1706 name = "scandir-1.7";
1707 buildInputs = with self; [];
1707 buildInputs = with self; [];
1708 doCheck = false;
1708 doCheck = false;
1709 propagatedBuildInputs = with self; [];
1709 propagatedBuildInputs = with self; [];
1710 src = fetchurl {
1710 src = fetchurl {
1711 url = "https://pypi.python.org/packages/77/3f/916f524f50ee65e3f465a280d2851bd63685250fddb3020c212b3977664d/scandir-1.6.tar.gz";
1711 url = "https://pypi.python.org/packages/13/bb/e541b74230bbf7a20a3949a2ee6631be299378a784f5445aa5d0047c192b/scandir-1.7.tar.gz";
1712 md5 = "0180ddb97c96cbb2d4f25d2ae11c64ac";
1712 md5 = "037e5f24d1a0e78b17faca72dea9555f";
1713 };
1713 };
1714 meta = {
1714 meta = {
1715 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
1715 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
@@ -1820,13 +1820,13 b''
1820 };
1820 };
1821 };
1821 };
1822 supervisor = super.buildPythonPackage {
1822 supervisor = super.buildPythonPackage {
1823 name = "supervisor-3.3.3";
1823 name = "supervisor-3.3.4";
1824 buildInputs = with self; [];
1824 buildInputs = with self; [];
1825 doCheck = false;
1825 doCheck = false;
1826 propagatedBuildInputs = with self; [meld3];
1826 propagatedBuildInputs = with self; [meld3];
1827 src = fetchurl {
1827 src = fetchurl {
1828 url = "https://pypi.python.org/packages/31/7e/788fc6566211e77c395ea272058eb71299c65cc5e55b6214d479c6c2ec9a/supervisor-3.3.3.tar.gz";
1828 url = "https://pypi.python.org/packages/44/60/698e54b4a4a9b956b2d709b4b7b676119c833d811d53ee2500f1b5e96dc3/supervisor-3.3.4.tar.gz";
1829 md5 = "0fe86dfec4e5c5d98324d24c4cf944bd";
1829 md5 = "f1814d71d820ddfa8c86d46a72314cec";
1830 };
1830 };
1831 meta = {
1831 meta = {
1832 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1832 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
@@ -1910,6 +1910,19 b''
1910 license = [ pkgs.lib.licenses.asl20 ];
1910 license = [ pkgs.lib.licenses.asl20 ];
1911 };
1911 };
1912 };
1912 };
1913 tzlocal = super.buildPythonPackage {
1914 name = "tzlocal-1.5.1";
1915 buildInputs = with self; [];
1916 doCheck = false;
1917 propagatedBuildInputs = with self; [pytz];
1918 src = fetchurl {
1919 url = "https://pypi.python.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
1920 md5 = "4553be891efa0812c4adfb0c6e818eec";
1921 };
1922 meta = {
1923 license = [ pkgs.lib.licenses.mit ];
1924 };
1925 };
1913 uWSGI = super.buildPythonPackage {
1926 uWSGI = super.buildPythonPackage {
1914 name = "uWSGI-2.0.15";
1927 name = "uWSGI-2.0.15";
1915 buildInputs = with self; [];
1928 buildInputs = with self; [];
@@ -2002,13 +2015,13 b''
2002 };
2015 };
2003 };
2016 };
2004 ws4py = super.buildPythonPackage {
2017 ws4py = super.buildPythonPackage {
2005 name = "ws4py-0.4.2";
2018 name = "ws4py-0.4.3";
2006 buildInputs = with self; [];
2019 buildInputs = with self; [];
2007 doCheck = false;
2020 doCheck = false;
2008 propagatedBuildInputs = with self; [];
2021 propagatedBuildInputs = with self; [];
2009 src = fetchurl {
2022 src = fetchurl {
2010 url = "https://pypi.python.org/packages/b8/98/a90f1d96ffcb15dfc220af524ce23e0a5881258dafa197673357ce1683dd/ws4py-0.4.2.tar.gz";
2023 url = "https://pypi.python.org/packages/fa/a1/33c43a4304ac3b4dc81deb93cbd329de9297dd034d75c47cce64fda806bc/ws4py-0.4.3.tar.gz";
2011 md5 = "f0603ae376707a58d205bd87a67758a2";
2024 md5 = "d5834cf7d3965bb0da31bbb02bd8513a";
2012 };
2025 };
2013 meta = {
2026 meta = {
2014 license = [ pkgs.lib.licenses.bsdOriginal ];
2027 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -2028,52 +2041,52 b''
2028 };
2041 };
2029 };
2042 };
2030 zope.cachedescriptors = super.buildPythonPackage {
2043 zope.cachedescriptors = super.buildPythonPackage {
2031 name = "zope.cachedescriptors-4.0.0";
2044 name = "zope.cachedescriptors-4.3.1";
2032 buildInputs = with self; [];
2045 buildInputs = with self; [];
2033 doCheck = false;
2046 doCheck = false;
2034 propagatedBuildInputs = with self; [setuptools];
2047 propagatedBuildInputs = with self; [setuptools];
2035 src = fetchurl {
2048 src = fetchurl {
2036 url = "https://pypi.python.org/packages/40/33/694b6644c37f28553f4b9f20b3c3a20fb709a22574dff20b5bdffb09ecd5/zope.cachedescriptors-4.0.0.tar.gz";
2049 url = "https://pypi.python.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2037 md5 = "8d308de8c936792c8e758058fcb7d0f0";
2050 md5 = "42f3693f43bc93f3b1eb86940f58acf3";
2038 };
2051 };
2039 meta = {
2052 meta = {
2040 license = [ pkgs.lib.licenses.zpt21 ];
2053 license = [ pkgs.lib.licenses.zpt21 ];
2041 };
2054 };
2042 };
2055 };
2043 zope.deprecation = super.buildPythonPackage {
2056 zope.deprecation = super.buildPythonPackage {
2044 name = "zope.deprecation-4.1.2";
2057 name = "zope.deprecation-4.3.0";
2045 buildInputs = with self; [];
2058 buildInputs = with self; [];
2046 doCheck = false;
2059 doCheck = false;
2047 propagatedBuildInputs = with self; [setuptools];
2060 propagatedBuildInputs = with self; [setuptools];
2048 src = fetchurl {
2061 src = fetchurl {
2049 url = "https://pypi.python.org/packages/c1/d3/3919492d5e57d8dd01b36f30b34fc8404a30577392b1eb817c303499ad20/zope.deprecation-4.1.2.tar.gz";
2062 url = "https://pypi.python.org/packages/a1/18/2dc5e6bfe64fdc3b79411b67464c55bb0b43b127051a20f7f492ab767758/zope.deprecation-4.3.0.tar.gz";
2050 md5 = "e9a663ded58f4f9f7881beb56cae2782";
2063 md5 = "2166b2cb7e0e96a21104e6f8f9b696bb";
2051 };
2064 };
2052 meta = {
2065 meta = {
2053 license = [ pkgs.lib.licenses.zpt21 ];
2066 license = [ pkgs.lib.licenses.zpt21 ];
2054 };
2067 };
2055 };
2068 };
2056 zope.event = super.buildPythonPackage {
2069 zope.event = super.buildPythonPackage {
2057 name = "zope.event-4.0.3";
2070 name = "zope.event-4.3.0";
2058 buildInputs = with self; [];
2071 buildInputs = with self; [];
2059 doCheck = false;
2072 doCheck = false;
2060 propagatedBuildInputs = with self; [setuptools];
2073 propagatedBuildInputs = with self; [setuptools];
2061 src = fetchurl {
2074 src = fetchurl {
2062 url = "https://pypi.python.org/packages/c1/29/91ba884d7d6d96691df592e9e9c2bfa57a47040ec1ff47eff18c85137152/zope.event-4.0.3.tar.gz";
2075 url = "https://pypi.python.org/packages/9e/d0/54ba59f19a0635f6591b74be259cf6fbf67e73f4edda27b5cd0cf4d26efa/zope.event-4.3.0.tar.gz";
2063 md5 = "9a3780916332b18b8b85f522bcc3e249";
2076 md5 = "8ca737960741c6fd112972f3313303bd";
2064 };
2077 };
2065 meta = {
2078 meta = {
2066 license = [ pkgs.lib.licenses.zpt21 ];
2079 license = [ pkgs.lib.licenses.zpt21 ];
2067 };
2080 };
2068 };
2081 };
2069 zope.interface = super.buildPythonPackage {
2082 zope.interface = super.buildPythonPackage {
2070 name = "zope.interface-4.1.3";
2083 name = "zope.interface-4.4.3";
2071 buildInputs = with self; [];
2084 buildInputs = with self; [];
2072 doCheck = false;
2085 doCheck = false;
2073 propagatedBuildInputs = with self; [setuptools];
2086 propagatedBuildInputs = with self; [setuptools];
2074 src = fetchurl {
2087 src = fetchurl {
2075 url = "https://pypi.python.org/packages/9d/81/2509ca3c6f59080123c1a8a97125eb48414022618cec0e64eb1313727bfe/zope.interface-4.1.3.tar.gz";
2088 url = "https://pypi.python.org/packages/bd/d2/25349ed41f9dcff7b3baf87bd88a4c82396cf6e02f1f42bb68657a3132af/zope.interface-4.4.3.tar.gz";
2076 md5 = "9ae3d24c0c7415deb249dd1a132f0f79";
2089 md5 = "8700a4f527c1203b34b10c2b4e7a6912";
2077 };
2090 };
2078 meta = {
2091 meta = {
2079 license = [ pkgs.lib.licenses.zpt21 ];
2092 license = [ pkgs.lib.licenses.zpt21 ];
@@ -5,7 +5,7 b' setuptools-scm==1.15.6'
5 amqp==2.2.2
5 amqp==2.2.2
6 authomatic==0.1.0.post1
6 authomatic==0.1.0.post1
7 Babel==1.3
7 Babel==1.3
8 Beaker==1.9.0
8 Beaker==1.9.1
9 celery==4.1.0
9 celery==4.1.0
10 Chameleon==2.24
10 Chameleon==2.24
11 channelstream==0.5.2
11 channelstream==0.5.2
@@ -31,7 +31,7 b' billiard==3.5.0.3'
31 kombu==4.1.0
31 kombu==4.1.0
32 lxml==3.7.3
32 lxml==3.7.3
33 Mako==1.0.7
33 Mako==1.0.7
34 Markdown==2.6.9
34 Markdown==2.6.11
35 MarkupSafe==1.0.0
35 MarkupSafe==1.0.0
36 msgpack-python==0.4.8
36 msgpack-python==0.4.8
37 MySQL-python==1.2.5
37 MySQL-python==1.2.5
@@ -42,7 +42,7 b' PasteDeploy==1.5.2'
42 PasteScript==2.0.2
42 PasteScript==2.0.2
43 pathlib2==2.3.0
43 pathlib2==2.3.0
44 peppercorn==0.5
44 peppercorn==0.5
45 psutil==5.4.0
45 psutil==5.4.3
46 psycopg2==2.7.3.2
46 psycopg2==2.7.3.2
47 py-bcrypt==0.4
47 py-bcrypt==0.4
48 pycrypto==2.6.1
48 pycrypto==2.6.1
@@ -61,7 +61,8 b' python-dateutil'
61 python-ldap==2.4.45
61 python-ldap==2.4.45
62 python-memcached==1.58
62 python-memcached==1.58
63 python-pam==1.8.2
63 python-pam==1.8.2
64 pytz==2017.3
64 pytz==2018.3
65 tzlocal==1.5.1
65 pyzmq==14.6.0
66 pyzmq==14.6.0
66 py-gfm==0.1.3
67 py-gfm==0.1.3
67 recaptcha-client==1.0.6
68 recaptcha-client==1.0.6
@@ -75,7 +76,7 b' six==1.11.0'
75 SQLAlchemy==1.1.15
76 SQLAlchemy==1.1.15
76 sshpubkeys==2.2.0
77 sshpubkeys==2.2.0
77 subprocess32==3.2.7
78 subprocess32==3.2.7
78 supervisor==3.3.3
79 supervisor==3.3.4
79 Tempita==0.5.2
80 Tempita==0.5.2
80 translationstring==1.3
81 translationstring==1.3
81 trollius==1.0.4
82 trollius==1.0.4
@@ -88,29 +89,29 b' WebHelpers==1.3'
88 WebOb==1.7.4
89 WebOb==1.7.4
89 Whoosh==2.7.4
90 Whoosh==2.7.4
90 wsgiref==0.1.2
91 wsgiref==0.1.2
91 zope.cachedescriptors==4.0.0
92 zope.cachedescriptors==4.3.1
92 zope.deprecation==4.1.2
93 zope.deprecation==4.3.0
93 zope.event==4.0.3
94 zope.event==4.3.0
94 zope.interface==4.1.3
95 zope.interface==4.4.3
95
96
96
97
97 # IPYTHON RENDERING
98 # IPYTHON RENDERING
98 # entrypoints backport, pypi version doesn't support egg installs
99 # entrypoints backport, pypi version doesn't support egg installs
99 https://code.rhodecode.com/upstream/entrypoints/archive/96e6d645684e1af3d7df5b5272f3fe85a546b233.tar.gz?md5=7db37771aea9ac9fefe093e5d6987313#egg=entrypoints==0.2.2.rhodecode-upstream1
100 https://code.rhodecode.com/upstream/entrypoints/archive/96e6d645684e1af3d7df5b5272f3fe85a546b233.tar.gz?md5=7db37771aea9ac9fefe093e5d6987313#egg=entrypoints==0.2.2.rhodecode-upstream1
100 nbconvert==5.3.1
101 nbconvert==5.3.1
101 bleach==2.1.1
102 bleach==2.1.2
102 nbformat==4.4.0
103 nbformat==4.4.0
103 jupyter_client==5.0.0
104 jupyter_client==5.0.0
104
105
105 ## cli tools
106 ## cli tools
106 alembic==0.9.6
107 alembic==0.9.8
107 invoke==0.13.0
108 invoke==0.13.0
108 bumpversion==0.5.3
109 bumpversion==0.5.3
109 transifex-client==0.12.5
110 transifex-client==0.12.5
110
111
111 ## http servers
112 ## http servers
112 gevent==1.2.2
113 gevent==1.2.2
113 greenlet==0.4.12
114 greenlet==0.4.13
114 gunicorn==19.7.1
115 gunicorn==19.7.1
115 waitress==1.1.0
116 waitress==1.1.0
116 uWSGI==2.0.15
117 uWSGI==2.0.15
@@ -125,7 +126,7 b' bottle==0.12.13'
125 https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.14.1.tar.gz?md5=0b9c2caad160b68889f8172ea54af7b2#egg=rhodecode-tools==0.14.1
126 https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.14.1.tar.gz?md5=0b9c2caad160b68889f8172ea54af7b2#egg=rhodecode-tools==0.14.1
126
127
127 ## appenlight
128 ## appenlight
128 appenlight-client==0.6.22
129 appenlight-client==0.6.25
129
130
130 ## test related requirements
131 ## test related requirements
131 -r requirements_test.txt
132 -r requirements_test.txt
@@ -1,1 +1,1 b''
1 4.11.6 No newline at end of file
1 4.12.0 No newline at end of file
@@ -51,7 +51,7 b' PYRAMID_SETTINGS = {}'
51 EXTENSIONS = {}
51 EXTENSIONS = {}
52
52
53 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
53 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
54 __dbversion__ = 85 # defines current db version for migrations
54 __dbversion__ = 86 # defines current db version for migrations
55 __platform__ = platform.system()
55 __platform__ = platform.system()
56 __license__ = 'AGPLv3, and Commercial License'
56 __license__ = 'AGPLv3, and Commercial License'
57 __author__ = 'RhodeCode GmbH'
57 __author__ = 'RhodeCode GmbH'
@@ -42,7 +42,8 b' class TestGetRepoChangeset(object):'
42 if details == 'full':
42 if details == 'full':
43 assert result['refs']['bookmarks'] == getattr(
43 assert result['refs']['bookmarks'] == getattr(
44 commit, 'bookmarks', [])
44 commit, 'bookmarks', [])
45 assert result['refs']['branches'] == [commit.branch]
45 branches = [commit.branch] if commit.branch else []
46 assert result['refs']['branches'] == branches
46 assert result['refs']['tags'] == commit.tags
47 assert result['refs']['tags'] == commit.tags
47
48
48 @pytest.mark.parametrize("details", ['basic', 'extended', 'full'])
49 @pytest.mark.parametrize("details", ['basic', 'extended', 'full'])
@@ -33,12 +33,14 b' class TestPull(object):'
33 def test_api_pull(self, backend):
33 def test_api_pull(self, backend):
34 r = backend.create_repo()
34 r = backend.create_repo()
35 repo_name = r.repo_name
35 repo_name = r.repo_name
36 r.clone_uri = os.path.join(TESTS_TMP_PATH, backend.repo_name)
36 clone_uri = os.path.join(TESTS_TMP_PATH, backend.repo_name)
37 r.clone_uri = clone_uri
37
38
38 id_, params = build_data(self.apikey, 'pull', repoid=repo_name,)
39 id_, params = build_data(self.apikey, 'pull', repoid=repo_name,)
39 response = api_call(self.app, params)
40 response = api_call(self.app, params)
40
41 msg = 'Pulled from url `%s` on repo `%s`' % (
41 expected = {'msg': 'Pulled from `%s`' % (repo_name,),
42 clone_uri, repo_name)
43 expected = {'msg': msg,
42 'repository': repo_name}
44 'repository': repo_name}
43 assert_ok(id_, expected, given=response.body)
45 assert_ok(id_, expected, given=response.body)
44
46
@@ -47,5 +49,5 b' class TestPull(object):'
47 self.apikey, 'pull', repoid=backend.repo_name)
49 self.apikey, 'pull', repoid=backend.repo_name)
48 response = api_call(self.app, params)
50 response = api_call(self.app, params)
49
51
50 expected = 'Unable to pull changes from `%s`' % (backend.repo_name,)
52 expected = 'Unable to pull changes from `None`'
51 assert_error(id_, expected, given=response.body)
53 assert_error(id_, expected, given=response.body)
@@ -56,6 +56,9 b' class TestApiUpdateRepo(object):'
56 ({'clone_uri': ''},
56 ({'clone_uri': ''},
57 {'clone_uri': ''}),
57 {'clone_uri': ''}),
58
58
59 ({'push_uri': ''},
60 {'push_uri': ''}),
61
59 ({'landing_rev': 'rev:tip'},
62 ({'landing_rev': 'rev:tip'},
60 {'landing_rev': ['rev', 'tip']}),
63 {'landing_rev': ['rev', 'tip']}),
61
64
@@ -36,7 +36,9 b' class TestUpdateUserGroup(object):'
36 # ('owner', {'owner': TEST_USER_REGULAR_LOGIN}),
36 # ('owner', {'owner': TEST_USER_REGULAR_LOGIN}),
37 ('owner_email', {'owner_email': TEST_USER_ADMIN_EMAIL}),
37 ('owner_email', {'owner_email': TEST_USER_ADMIN_EMAIL}),
38 ('active', {'active': False}),
38 ('active', {'active': False}),
39 ('active', {'active': True})
39 ('active', {'active': True}),
40 ('sync', {'sync': False}),
41 ('sync', {'sync': True})
40 ])
42 ])
41 def test_api_update_user_group(self, changing_attr, updates, user_util):
43 def test_api_update_user_group(self, changing_attr, updates, user_util):
42 user_group = user_util.create_user_group()
44 user_group = user_util.create_user_group()
@@ -49,6 +51,12 b' class TestUpdateUserGroup(object):'
49 **updates)
51 **updates)
50 response = api_call(self.app, params)
52 response = api_call(self.app, params)
51
53
54 # special case for sync
55 if changing_attr == 'sync' and updates['sync'] is False:
56 expected_api_data['sync'] = None
57 elif changing_attr == 'sync' and updates['sync'] is True:
58 expected_api_data['sync'] = 'manual_api'
59
52 expected = {
60 expected = {
53 'msg': 'updated user group ID:%s %s' % (
61 'msg': 'updated user group ID:%s %s' % (
54 user_group.users_group_id, user_group.users_group_name),
62 user_group.users_group_id, user_group.users_group_name),
@@ -63,7 +71,9 b' class TestUpdateUserGroup(object):'
63 # ('owner', {'owner': TEST_USER_REGULAR_LOGIN}),
71 # ('owner', {'owner': TEST_USER_REGULAR_LOGIN}),
64 ('owner_email', {'owner_email': TEST_USER_ADMIN_EMAIL}),
72 ('owner_email', {'owner_email': TEST_USER_ADMIN_EMAIL}),
65 ('active', {'active': False}),
73 ('active', {'active': False}),
66 ('active', {'active': True})
74 ('active', {'active': True}),
75 ('sync', {'sync': False}),
76 ('sync', {'sync': True})
67 ])
77 ])
68 def test_api_update_user_group_regular_user(
78 def test_api_update_user_group_regular_user(
69 self, changing_attr, updates, user_util):
79 self, changing_attr, updates, user_util):
@@ -72,7 +82,6 b' class TestUpdateUserGroup(object):'
72 expected_api_data = user_group.get_api_data()
82 expected_api_data = user_group.get_api_data()
73 expected_api_data.update(updates)
83 expected_api_data.update(updates)
74
84
75
76 # grant permission to this user
85 # grant permission to this user
77 user = UserModel().get_by_username(self.TEST_USER_LOGIN)
86 user = UserModel().get_by_username(self.TEST_USER_LOGIN)
78
87
@@ -82,6 +91,12 b' class TestUpdateUserGroup(object):'
82 self.apikey_regular, 'update_user_group',
91 self.apikey_regular, 'update_user_group',
83 usergroupid=group_name, **updates)
92 usergroupid=group_name, **updates)
84 response = api_call(self.app, params)
93 response = api_call(self.app, params)
94 # special case for sync
95 if changing_attr == 'sync' and updates['sync'] is False:
96 expected_api_data['sync'] = None
97 elif changing_attr == 'sync' and updates['sync'] is True:
98 expected_api_data['sync'] = 'manual_api'
99
85 expected = {
100 expected = {
86 'msg': 'updated user group ID:%s %s' % (
101 'msg': 'updated user group ID:%s %s' % (
87 user_group.users_group_id, user_group.users_group_name),
102 user_group.users_group_id, user_group.users_group_name),
@@ -563,6 +563,7 b' def create_repo('
563 description=Optional(''),
563 description=Optional(''),
564 private=Optional(False),
564 private=Optional(False),
565 clone_uri=Optional(None),
565 clone_uri=Optional(None),
566 push_uri=Optional(None),
566 landing_rev=Optional('rev:tip'),
567 landing_rev=Optional('rev:tip'),
567 enable_statistics=Optional(False),
568 enable_statistics=Optional(False),
568 enable_locking=Optional(False),
569 enable_locking=Optional(False),
@@ -596,6 +597,8 b' def create_repo('
596 :type private: bool
597 :type private: bool
597 :param clone_uri: set clone_uri
598 :param clone_uri: set clone_uri
598 :type clone_uri: str
599 :type clone_uri: str
600 :param push_uri: set push_uri
601 :type push_uri: str
599 :param landing_rev: <rev_type>:<rev>
602 :param landing_rev: <rev_type>:<rev>
600 :type landing_rev: str
603 :type landing_rev: str
601 :param enable_locking:
604 :param enable_locking:
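The lines added above introduce a ``push_uri`` counterpart to ``clone_uri`` on ``create_repo`` (the same option is wired into ``update_repo`` further below). A minimal sketch of the ``args`` payload such a call could carry follows; the repository name, type, and URLs are placeholders, not values taken from this changeset.

.. code-block:: python

    # Hypothetical JSON-RPC ``args`` for create_repo using the new push_uri
    # field next to the existing clone_uri; all values are placeholders.
    create_repo_args = {
        'repo_name': 'mirrors/my-mirror',
        'repo_type': 'git',
        'clone_uri': 'https://example.com/upstream.git',   # where pulls come from
        'push_uri': 'https://example.com/downstream.git',  # where pushes go to
    }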
@@ -639,6 +642,7 b' def create_repo('
639 description = Optional.extract(description)
642 description = Optional.extract(description)
640 copy_permissions = Optional.extract(copy_permissions)
643 copy_permissions = Optional.extract(copy_permissions)
641 clone_uri = Optional.extract(clone_uri)
644 clone_uri = Optional.extract(clone_uri)
645 push_uri = Optional.extract(push_uri)
642 landing_commit_ref = Optional.extract(landing_rev)
646 landing_commit_ref = Optional.extract(landing_rev)
643
647
644 defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
648 defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
@@ -667,6 +671,7 b' def create_repo('
667 repo_description=description,
671 repo_description=description,
668 repo_landing_commit_ref=landing_commit_ref,
672 repo_landing_commit_ref=landing_commit_ref,
669 repo_clone_uri=clone_uri,
673 repo_clone_uri=clone_uri,
674 repo_push_uri=push_uri,
670 repo_private=private,
675 repo_private=private,
671 repo_copy_permissions=copy_permissions,
676 repo_copy_permissions=copy_permissions,
672 repo_enable_statistics=enable_statistics,
677 repo_enable_statistics=enable_statistics,
@@ -685,6 +690,7 b' def create_repo('
685 'repo_description': schema_data['repo_description'],
690 'repo_description': schema_data['repo_description'],
686 'repo_private': schema_data['repo_private'],
691 'repo_private': schema_data['repo_private'],
687 'clone_uri': schema_data['repo_clone_uri'],
692 'clone_uri': schema_data['repo_clone_uri'],
693 'push_uri': schema_data['repo_push_uri'],
688 'repo_landing_rev': schema_data['repo_landing_commit_ref'],
694 'repo_landing_rev': schema_data['repo_landing_commit_ref'],
689 'enable_statistics': schema_data['repo_enable_statistics'],
695 'enable_statistics': schema_data['repo_enable_statistics'],
690 'enable_locking': schema_data['repo_enable_locking'],
696 'enable_locking': schema_data['repo_enable_locking'],
@@ -692,7 +698,7 b' def create_repo('
692 'repo_copy_permissions': schema_data['repo_copy_permissions'],
698 'repo_copy_permissions': schema_data['repo_copy_permissions'],
693 }
699 }
694
700
695 task = RepoModel().create(form_data=data, cur_user=owner)
701 task = RepoModel().create(form_data=data, cur_user=owner.user_id)
696 task_id = get_task_id(task)
702 task_id = get_task_id(task)
697 # no commit, it's done in RepoModel, or async via celery
703 # no commit, it's done in RepoModel, or async via celery
698 return {
704 return {
@@ -800,7 +806,8 b' def remove_field_from_repo(request, apiu'
800 def update_repo(
806 def update_repo(
801 request, apiuser, repoid, repo_name=Optional(None),
807 request, apiuser, repoid, repo_name=Optional(None),
802 owner=Optional(OAttr('apiuser')), description=Optional(''),
808 owner=Optional(OAttr('apiuser')), description=Optional(''),
803 private=Optional(False), clone_uri=Optional(None),
809 private=Optional(False),
810 clone_uri=Optional(None), push_uri=Optional(None),
804 landing_rev=Optional('rev:tip'), fork_of=Optional(None),
811 landing_rev=Optional('rev:tip'), fork_of=Optional(None),
805 enable_statistics=Optional(False),
812 enable_statistics=Optional(False),
806 enable_locking=Optional(False),
813 enable_locking=Optional(False),
@@ -877,6 +884,9 b' def update_repo('
877 clone_uri=clone_uri
884 clone_uri=clone_uri
878 if not isinstance(clone_uri, Optional) else repo.clone_uri,
885 if not isinstance(clone_uri, Optional) else repo.clone_uri,
879
886
887 push_uri=push_uri
888 if not isinstance(push_uri, Optional) else repo.push_uri,
889
880 repo_landing_rev=landing_rev
890 repo_landing_rev=landing_rev
881 if not isinstance(landing_rev, Optional) else repo._landing_revision,
891 if not isinstance(landing_rev, Optional) else repo._landing_revision,
882
892
@@ -910,6 +920,7 b' def update_repo('
910 repo_owner=updates['user'],
920 repo_owner=updates['user'],
911 repo_description=updates['repo_description'],
921 repo_description=updates['repo_description'],
912 repo_clone_uri=updates['clone_uri'],
922 repo_clone_uri=updates['clone_uri'],
923 repo_push_uri=updates['push_uri'],
913 repo_fork_of=updates['fork_id'],
924 repo_fork_of=updates['fork_id'],
914 repo_private=updates['repo_private'],
925 repo_private=updates['repo_private'],
915 repo_landing_commit_ref=updates['repo_landing_rev'],
926 repo_landing_commit_ref=updates['repo_landing_rev'],
@@ -928,6 +939,7 b' def update_repo('
928 repo_description=schema_data['repo_description'],
939 repo_description=schema_data['repo_description'],
929 repo_private=schema_data['repo_private'],
940 repo_private=schema_data['repo_private'],
930 clone_uri=schema_data['repo_clone_uri'],
941 clone_uri=schema_data['repo_clone_uri'],
942 push_uri=schema_data['repo_push_uri'],
931 repo_landing_rev=schema_data['repo_landing_commit_ref'],
943 repo_landing_rev=schema_data['repo_landing_commit_ref'],
932 repo_enable_statistics=schema_data['repo_enable_statistics'],
944 repo_enable_statistics=schema_data['repo_enable_statistics'],
933 repo_enable_locking=schema_data['repo_enable_locking'],
945 repo_enable_locking=schema_data['repo_enable_locking'],
@@ -1084,7 +1096,7 b' def fork_repo(request, apiuser, repoid, '
1084 'landing_rev': schema_data['repo_landing_commit_ref'],
1096 'landing_rev': schema_data['repo_landing_commit_ref'],
1085 }
1097 }
1086
1098
1087 task = RepoModel().create_fork(data, cur_user=owner)
1099 task = RepoModel().create_fork(data, cur_user=owner.user_id)
1088 # no commit, it's done in RepoModel, or async via celery
1100 # no commit, it's done in RepoModel, or async via celery
1089 task_id = get_task_id(task)
1101 task_id = get_task_id(task)
1090
1102
@@ -1749,7 +1761,7 b' def revoke_user_group_permission(request'
1749
1761
1750
1762
1751 @jsonrpc_method()
1763 @jsonrpc_method()
1752 def pull(request, apiuser, repoid):
1764 def pull(request, apiuser, repoid, remote_uri=Optional(None)):
1753 """
1765 """
1754 Triggers a pull on the given repository from a remote location. You
1766 Triggers a pull on the given repository from a remote location. You
1755 can use this to keep remote repositories up-to-date.
1767 can use this to keep remote repositories up-to-date.
@@ -1764,6 +1776,8 b' def pull(request, apiuser, repoid):'
1764 :type apiuser: AuthUser
1776 :type apiuser: AuthUser
1765 :param repoid: The repository name or repository ID.
1777 :param repoid: The repository name or repository ID.
1766 :type repoid: str or int
1778 :type repoid: str or int
1779 :param remote_uri: Optional remote URI to pass in for pull
1780 :type remote_uri: str
1767
1781
1768 Example output:
1782 Example output:
1769
1783
@@ -1771,7 +1785,7 b' def pull(request, apiuser, repoid):'
1771
1785
1772 id : <id_given_in_input>
1786 id : <id_given_in_input>
1773 result : {
1787 result : {
1774 "msg": "Pulled from `<repository name>`"
1788 "msg": "Pulled from url `<remote_url>` on repo `<repository name>`"
1775 "repository": "<repository name>"
1789 "repository": "<repository name>"
1776 }
1790 }
1777 error : null
1791 error : null
@@ -1783,27 +1797,31 b' def pull(request, apiuser, repoid):'
1783 id : <id_given_in_input>
1797 id : <id_given_in_input>
1784 result : null
1798 result : null
1785 error : {
1799 error : {
1786 "Unable to pull changes from `<reponame>`"
1800 "Unable to pull changes from `<remote_url>`"
1787 }
1801 }
1788
1802
1789 """
1803 """
1790
1804
1791 repo = get_repo_or_error(repoid)
1805 repo = get_repo_or_error(repoid)
1806 remote_uri = Optional.extract(remote_uri)
1807 remote_uri_display = remote_uri or repo.clone_uri_hidden
1792 if not has_superadmin_permission(apiuser):
1808 if not has_superadmin_permission(apiuser):
1793 _perms = ('repository.admin',)
1809 _perms = ('repository.admin',)
1794 validate_repo_permissions(apiuser, repoid, repo, _perms)
1810 validate_repo_permissions(apiuser, repoid, repo, _perms)
1795
1811
1796 try:
1812 try:
1797 ScmModel().pull_changes(repo.repo_name, apiuser.username)
1813 ScmModel().pull_changes(
1814 repo.repo_name, apiuser.username, remote_uri=remote_uri)
1798 return {
1815 return {
1799 'msg': 'Pulled from `%s`' % repo.repo_name,
1816 'msg': 'Pulled from url `%s` on repo `%s`' % (
1817 remote_uri_display, repo.repo_name),
1800 'repository': repo.repo_name
1818 'repository': repo.repo_name
1801 }
1819 }
1802 except Exception:
1820 except Exception:
1803 log.exception("Exception occurred while trying to "
1821 log.exception("Exception occurred while trying to "
1804 "pull changes from remote location")
1822 "pull changes from remote location")
1805 raise JSONRPCError(
1823 raise JSONRPCError(
1806 'Unable to pull changes from `%s`' % repo.repo_name
1824 'Unable to pull changes from `%s`' % remote_uri_display
1807 )
1825 )
1808
1826
1809
1827
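With the changes above, ``pull`` accepts an optional ``remote_uri`` and echoes it back in the result message. The sketch below shows one way to exercise that over JSON-RPC; the server URL, auth token, repository name, and remote URI are assumptions for illustration, only the method and argument names come from the code above.

.. code-block:: python

    # Minimal sketch of calling the extended ``pull`` method over JSON-RPC.
    # Server URL, auth token, repository and remote URI are placeholders.
    import json
    import requests

    API_URL = 'https://rhodecode.example.com/_admin/api'  # hypothetical server
    AUTH_TOKEN = 'secret-admin-token'                      # hypothetical token

    payload = {
        'id': 1,
        'auth_token': AUTH_TOKEN,
        'method': 'pull',
        'args': {
            'repoid': 'my-repo',                           # name or numeric ID
            'remote_uri': 'https://example.com/upstream',  # optional remote
        },
    }

    response = requests.post(
        API_URL, data=json.dumps(payload),
        headers={'Content-Type': 'application/json'})
    print(response.json())
    # expected shape on success:
    # {"id": 1, "result": {"msg": "Pulled from url `...` on repo `my-repo`",
    #                      "repository": "my-repo"}, "error": null}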
@@ -168,7 +168,8 b' def get_user_groups(request, apiuser):'
168 @jsonrpc_method()
168 @jsonrpc_method()
169 def create_user_group(
169 def create_user_group(
170 request, apiuser, group_name, description=Optional(''),
170 request, apiuser, group_name, description=Optional(''),
171 owner=Optional(OAttr('apiuser')), active=Optional(True)):
171 owner=Optional(OAttr('apiuser')), active=Optional(True),
172 sync=Optional(None)):
172 """
173 """
173 Creates a new user group.
174 Creates a new user group.
174
175
@@ -188,6 +189,9 b' def create_user_group('
188 :type owner: Optional(str or int)
189 :type owner: Optional(str or int)
189 :param active: Set this group as active.
190 :param active: Set this group as active.
190 :type active: Optional(``True`` | ``False``)
191 :type active: Optional(``True`` | ``False``)
192 :param sync: Enable or disable automatic synchronization of this group
193    from external authentication sources such as LDAP.
194 :type sync: Optional(``True`` | ``False``)
191
195
192 Example output:
196 Example output:
193
197
@@ -227,6 +231,15 b' def create_user_group('
227 owner = get_user_or_error(owner)
231 owner = get_user_or_error(owner)
228 active = Optional.extract(active)
232 active = Optional.extract(active)
229 description = Optional.extract(description)
233 description = Optional.extract(description)
234 sync = Optional.extract(sync)
235
236 # set group_data based on the sync option
237 group_data = None
238 if sync:
239 group_data = {
240 'extern_type': 'manual_api',
241 'extern_type_set_by': apiuser.username
242 }
230
243
231 schema = user_group_schema.UserGroupSchema().bind(
244 schema = user_group_schema.UserGroupSchema().bind(
232 # user caller
245 # user caller
@@ -246,7 +259,7 b' def create_user_group('
246 name=schema_data['user_group_name'],
259 name=schema_data['user_group_name'],
247 description=schema_data['user_group_description'],
260 description=schema_data['user_group_description'],
248 owner=owner,
261 owner=owner,
249 active=schema_data['user_group_active'])
262 active=schema_data['user_group_active'], group_data=group_data)
250 Session().flush()
263 Session().flush()
251 creation_data = user_group.get_api_data()
264 creation_data = user_group.get_api_data()
252 audit_logger.store_api(
265 audit_logger.store_api(
@@ -265,7 +278,7 b' def create_user_group('
265 @jsonrpc_method()
278 @jsonrpc_method()
266 def update_user_group(request, apiuser, usergroupid, group_name=Optional(''),
279 def update_user_group(request, apiuser, usergroupid, group_name=Optional(''),
267 description=Optional(''), owner=Optional(None),
280 description=Optional(''), owner=Optional(None),
268 active=Optional(True)):
281 active=Optional(True), sync=Optional(None)):
269 """
282 """
270 Updates the specified `user group` with the details provided.
283 Updates the specified `user group` with the details provided.
271
284
@@ -284,6 +297,9 b' def update_user_group(request, apiuser, '
284 :type owner: Optional(str or int)
297 :type owner: Optional(str or int)
285 :param active: Set the group as active.
298 :param active: Set the group as active.
286 :type active: Optional(``True`` | ``False``)
299 :type active: Optional(``True`` | ``False``)
300 :param sync: Enable or disable automatic synchronization of this group
301    from external authentication sources such as LDAP.
302 :type sync: Optional(``True`` | ``False``)
287
303
288 Example output:
304 Example output:
289
305
@@ -329,8 +345,21 b' def update_user_group(request, apiuser, '
329 store_update(updates, description, 'user_group_description')
345 store_update(updates, description, 'user_group_description')
330 store_update(updates, owner, 'user')
346 store_update(updates, owner, 'user')
331 store_update(updates, active, 'users_group_active')
347 store_update(updates, active, 'users_group_active')
348
349 sync = Optional.extract(sync)
350 group_data = None
351 if sync is True:
352 group_data = {
353 'extern_type': 'manual_api',
354 'extern_type_set_by': apiuser.username
355 }
356 if sync is False:
357 group_data = user_group.group_data
358 if group_data and "extern_type" in group_data:
359 del group_data["extern_type"]
360
332 try:
361 try:
333 UserGroupModel().update(user_group, updates)
362 UserGroupModel().update(user_group, updates, group_data=group_data)
334 audit_logger.store_api(
363 audit_logger.store_api(
335 'user_group.edit', action_data={'old_data': old_data},
364 'user_group.edit', action_data={'old_data': old_data},
336 user=apiuser)
365 user=apiuser)
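The ``sync`` flag wired into ``create_user_group`` and ``update_user_group`` above only adds or removes the ``extern_type``/``extern_type_set_by`` markers in ``group_data``. A rough sketch of the corresponding call arguments follows; the group name and description are made up, only the parameter keys come from the changed code.

.. code-block:: python

    # Hypothetical argument sets for the extended user-group API calls.
    create_user_group_args = {
        'group_name': 'developers',
        'description': 'manually managed group',
        'sync': True,    # stores extern_type='manual_api' in group_data
    }

    update_user_group_args = {
        'usergroupid': 'developers',
        'sync': False,   # removes the extern_type marker from group_data
    }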
@@ -609,8 +638,18 b' def grant_user_permission_to_user_group('
609 perm = get_perm_or_error(perm, prefix='usergroup.')
638 perm = get_perm_or_error(perm, prefix='usergroup.')
610
639
611 try:
640 try:
612 UserGroupModel().grant_user_permission(
641 changes = UserGroupModel().grant_user_permission(
613 user_group=user_group, user=user, perm=perm)
642 user_group=user_group, user=user, perm=perm)
643
644 action_data = {
645 'added': changes['added'],
646 'updated': changes['updated'],
647 'deleted': changes['deleted'],
648 }
649 audit_logger.store_api(
650 'user_group.edit.permissions', action_data=action_data,
651 user=apiuser)
652
614 Session().commit()
653 Session().commit()
615 return {
654 return {
616 'msg':
655 'msg':
@@ -669,8 +708,17 b' def revoke_user_permission_from_user_gro'
669 user = get_user_or_error(userid)
708 user = get_user_or_error(userid)
670
709
671 try:
710 try:
672 UserGroupModel().revoke_user_permission(
711 changes = UserGroupModel().revoke_user_permission(
673 user_group=user_group, user=user)
712 user_group=user_group, user=user)
713 action_data = {
714 'added': changes['added'],
715 'updated': changes['updated'],
716 'deleted': changes['deleted'],
717 }
718 audit_logger.store_api(
719 'user_group.edit.permissions', action_data=action_data,
720 user=apiuser)
721
674 Session().commit()
722 Session().commit()
675 return {
723 return {
676 'msg': 'Revoked perm for user: `%s` in user group: `%s`' % (
724 'msg': 'Revoked perm for user: `%s` in user group: `%s`' % (
@@ -735,11 +783,20 b' def grant_user_group_permission_to_user_'
735 'user group `%s` does not exist' % (sourceusergroupid,))
783 'user group `%s` does not exist' % (sourceusergroupid,))
736
784
737 try:
785 try:
738 UserGroupModel().grant_user_group_permission(
786 changes = UserGroupModel().grant_user_group_permission(
739 target_user_group=target_user_group,
787 target_user_group=target_user_group,
740 user_group=user_group, perm=perm)
788 user_group=user_group, perm=perm)
789
790 action_data = {
791 'added': changes['added'],
792 'updated': changes['updated'],
793 'deleted': changes['deleted'],
794 }
795 audit_logger.store_api(
796 'user_group.edit.permissions', action_data=action_data,
797 user=apiuser)
798
741 Session().commit()
799 Session().commit()
742
743 return {
800 return {
744 'msg': 'Granted perm: `%s` for user group: `%s` '
801 'msg': 'Granted perm: `%s` for user group: `%s` '
745 'in user group: `%s`' % (
802 'in user group: `%s`' % (
@@ -806,8 +863,17 b' def revoke_user_group_permission_from_us'
806 'user group `%s` does not exist' % (sourceusergroupid,))
863 'user group `%s` does not exist' % (sourceusergroupid,))
807
864
808 try:
865 try:
809 UserGroupModel().revoke_user_group_permission(
866 changes = UserGroupModel().revoke_user_group_permission(
810 target_user_group=target_user_group, user_group=user_group)
867 target_user_group=target_user_group, user_group=user_group)
868 action_data = {
869 'added': changes['added'],
870 'updated': changes['updated'],
871 'deleted': changes['deleted'],
872 }
873 audit_logger.store_api(
874 'user_group.edit.permissions', action_data=action_data,
875 user=apiuser)
876
811 Session().commit()
877 Session().commit()
812
878
813 return {
879 return {
@@ -22,9 +22,9 b' import time'
22 import logging
22 import logging
23 import operator
23 import operator
24
24
25 from pyramid.httpexceptions import HTTPFound
25 from pyramid.httpexceptions import HTTPFound, HTTPForbidden
26
26
27 from rhodecode.lib import helpers as h
27 from rhodecode.lib import helpers as h, diffs
28 from rhodecode.lib.utils2 import StrictAttributeDict, safe_int, datetime_to_time
28 from rhodecode.lib.utils2 import StrictAttributeDict, safe_int, datetime_to_time
29 from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
29 from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
30 from rhodecode.model import repo
30 from rhodecode.model import repo
@@ -33,6 +33,7 b' from rhodecode.model import user_group'
33 from rhodecode.model import user
33 from rhodecode.model import user
34 from rhodecode.model.db import User
34 from rhodecode.model.db import User
35 from rhodecode.model.scm import ScmModel
35 from rhodecode.model.scm import ScmModel
36 from rhodecode.model.settings import VcsSettingsModel
36
37
37 log = logging.getLogger(__name__)
38 log = logging.getLogger(__name__)
38
39
@@ -204,28 +205,47 b' class RepoAppView(BaseAppView):'
204 c.rhodecode_db_repo = self.db_repo
205 c.rhodecode_db_repo = self.db_repo
205 c.repo_name = self.db_repo_name
206 c.repo_name = self.db_repo_name
206 c.repository_pull_requests = self.db_repo_pull_requests
207 c.repository_pull_requests = self.db_repo_pull_requests
208 self.path_filter = PathFilter(None)
207
209
208 c.repository_requirements_missing = False
210 c.repository_requirements_missing = {}
209 try:
211 try:
210 self.rhodecode_vcs_repo = self.db_repo.scm_instance()
212 self.rhodecode_vcs_repo = self.db_repo.scm_instance()
213 if self.rhodecode_vcs_repo:
214 path_perms = self.rhodecode_vcs_repo.get_path_permissions(
215 c.auth_user.username)
216 self.path_filter = PathFilter(path_perms)
211 except RepositoryRequirementError as e:
217 except RepositoryRequirementError as e:
212 c.repository_requirements_missing = True
218 c.repository_requirements_missing = {'error': str(e)}
213 self._handle_missing_requirements(e)
219 self._handle_missing_requirements(e)
214 self.rhodecode_vcs_repo = None
220 self.rhodecode_vcs_repo = None
215
221
216 if (not c.repository_requirements_missing
222 c.path_filter = self.path_filter # used by atom_feed_entry.mako
217 and self.rhodecode_vcs_repo is None):
223
224 if self.rhodecode_vcs_repo is None:
218 # unable to fetch this repo as vcs instance, report back to user
225 # unable to fetch this repo as vcs instance, report back to user
219 h.flash(_(
226 h.flash(_(
220 "The repository `%(repo_name)s` cannot be loaded in filesystem. "
227 "The repository `%(repo_name)s` cannot be loaded in filesystem. "
221 "Please check if it exist, or is not damaged.") %
228 "Please check if it exist, or is not damaged.") %
222 {'repo_name': c.repo_name},
229 {'repo_name': c.repo_name},
223 category='error', ignore_duplicate=True)
230 category='error', ignore_duplicate=True)
224 raise HTTPFound(h.route_path('home'))
231 if c.repository_requirements_missing:
232 route = self.request.matched_route.name
233 if route.startswith(('edit_repo', 'repo_summary')):
234 # allow summary and edit repo on missing requirements
235 return c
236
237 raise HTTPFound(
238 h.route_path('repo_summary', repo_name=self.db_repo_name))
239
240 else: # redirect if we don't show missing requirements
241 raise HTTPFound(h.route_path('home'))
225
242
226 return c
243 return c
227
244
228 def _get_f_path(self, matchdict, default=None):
245 def _get_f_path_unchecked(self, matchdict, default=None):
246 """
247 Should only be used by redirects, everything else should call _get_f_path
248 """
229 f_path = matchdict.get('f_path')
249 f_path = matchdict.get('f_path')
230 if f_path:
250 if f_path:
231 # fix for multiple initial slashes that causes errors for GIT
251 # fix for multiple initial slashes that causes errors for GIT
@@ -233,6 +253,63 b' class RepoAppView(BaseAppView):'
233
253
234 return default
254 return default
235
255
256 def _get_f_path(self, matchdict, default=None):
257 f_path_match = self._get_f_path_unchecked(matchdict, default)
258 return self.path_filter.assert_path_permissions(f_path_match)
259
260 def _get_general_setting(self, target_repo, settings_key, default=False):
261 settings_model = VcsSettingsModel(repo=target_repo)
262 settings = settings_model.get_general_settings()
263 return settings.get(settings_key, default)
264
265
266 class PathFilter(object):
267
268 # Expects an instance of BasePathPermissionChecker or None
269 def __init__(self, permission_checker):
270 self.permission_checker = permission_checker
271
272 def assert_path_permissions(self, path):
273 if path and self.permission_checker and not self.permission_checker.has_access(path):
274 raise HTTPForbidden()
275 return path
276
277 def filter_patchset(self, patchset):
278 if not self.permission_checker or not patchset:
279 return patchset, False
280 had_filtered = False
281 filtered_patchset = []
282 for patch in patchset:
283 filename = patch.get('filename', None)
284 if not filename or self.permission_checker.has_access(filename):
285 filtered_patchset.append(patch)
286 else:
287 had_filtered = True
288 if had_filtered:
289 if isinstance(patchset, diffs.LimitedDiffContainer):
290 filtered_patchset = diffs.LimitedDiffContainer(patchset.diff_limit, patchset.cur_diff_size, filtered_patchset)
291 return filtered_patchset, True
292 else:
293 return patchset, False
294
295 def render_patchset_filtered(self, diffset, patchset, source_ref=None, target_ref=None):
296 filtered_patchset, has_hidden_changes = self.filter_patchset(patchset)
297 result = diffset.render_patchset(filtered_patchset, source_ref=source_ref, target_ref=target_ref)
298 result.has_hidden_changes = has_hidden_changes
299 return result
300
301 def get_raw_patch(self, diff_processor):
302 if self.permission_checker is None:
303 return diff_processor.as_raw()
304 elif self.permission_checker.has_full_access:
305 return diff_processor.as_raw()
306 else:
307 return '# Repository has user-specific filters, raw patch generation is disabled.'
308
309 @property
310 def is_enabled(self):
311 return self.permission_checker is not None
312
236
313
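
For orientation, a minimal sketch of how the new `PathFilter` is meant to be driven, assuming it is importable from `rhodecode.apps._base`. The `DocsOnlyChecker` class below is a made-up stand-in for a `BasePathPermissionChecker` implementation and is not part of this changeset; only the `has_access()` / `has_full_access` members used above are assumed.

.. code-block:: python

    from pyramid.httpexceptions import HTTPForbidden
    from rhodecode.apps._base import PathFilter  # assumed import location


    class DocsOnlyChecker(object):
        """Hypothetical checker: the user may only see paths under docs/."""
        has_full_access = False

        def has_access(self, path):
            return path.startswith('docs/')


    path_filter = PathFilter(DocsOnlyChecker())

    # f_path checks: allowed paths pass through, hidden paths raise 403
    assert path_filter.assert_path_permissions('docs/index.rst') == 'docs/index.rst'
    try:
        path_filter.assert_path_permissions('secrets/key.pem')
    except HTTPForbidden:
        pass  # hidden path

    # patchsets: patches touching hidden paths are dropped, and the caller
    # is told that something was filtered out
    patchset = [{'filename': 'docs/index.rst'}, {'filename': 'secrets/key.pem'}]
    filtered, has_hidden = path_filter.filter_patchset(patchset)
    assert [p['filename'] for p in filtered] == ['docs/index.rst'] and has_hidden

    # without a checker the filter is a transparent no-op
    assert not PathFilter(None).is_enabled
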
237 class RepoGroupAppView(BaseAppView):
314 class RepoGroupAppView(BaseAppView):
238 def __init__(self, context, request):
315 def __init__(self, context, request):
@@ -169,6 +169,11 b' def admin_routes(config):'
169 name='admin_settings_labs_update',
169 name='admin_settings_labs_update',
170 pattern='/settings/labs/update')
170 pattern='/settings/labs/update')
171
171
172 # Automation EE feature
173 config.add_route(
174 'admin_settings_automation',
175 pattern=ADMIN_PREFIX + '/settings/automation')
176
172 # global permissions
177 # global permissions
173
178
174 config.add_route(
179 config.add_route(
@@ -95,7 +95,8 b' class NavigationRegistry(object):'
95 'admin_settings_sessions'),
95 'admin_settings_sessions'),
96 NavEntry('open_source', _('Open Source Licenses'),
96 NavEntry('open_source', _('Open Source Licenses'),
97 'admin_settings_open_source'),
97 'admin_settings_open_source'),
98
98 NavEntry('automation', _('Automation'),
99 'admin_settings_automation')
99 ]
100 ]
100
101
101 _labs_entry = NavEntry('labs', _('Labs'),
102 _labs_entry = NavEntry('labs', _('Labs'),
@@ -86,6 +86,7 b' class TestAuthSettingsView(object):'
86
86
87 'host': 'dc.example.com',
87 'host': 'dc.example.com',
88 'port': '999',
88 'port': '999',
89 'timeout': 3600,
89 'tls_kind': 'PLAIN',
90 'tls_kind': 'PLAIN',
90 'tls_reqcert': 'NEVER',
91 'tls_reqcert': 'NEVER',
91
92
@@ -67,6 +67,7 b' class TestAdminUsersSshKeysView(TestCont'
67 'I4fG8+hBHzpeFxUGvSGNtXPUbwaAY8j/oHYrTpMgkj6pUEFsiKfC5zPq' \
67 'I4fG8+hBHzpeFxUGvSGNtXPUbwaAY8j/oHYrTpMgkj6pUEFsiKfC5zPq' \
68 'PFR5HyKTCHW0nFUJnZsbyFT5hMiF/hZkJc9A0ZbdSvJwCRQ/g3bmdL ' \
68 'PFR5HyKTCHW0nFUJnZsbyFT5hMiF/hZkJc9A0ZbdSvJwCRQ/g3bmdL ' \
69 'your_email@example.com'
69 'your_email@example.com'
70 FINGERPRINT = 'MD5:01:4f:ad:29:22:6e:01:37:c9:d2:52:26:52:b0:2d:93'
70
71
71 def test_ssh_keys_default_user(self):
72 def test_ssh_keys_default_user(self):
72 self.log_user()
73 self.log_user()
@@ -111,9 +112,11 b' class TestAdminUsersSshKeysView(TestCont'
111 route_path('edit_user_ssh_keys_add', user_id=user_id),
112 route_path('edit_user_ssh_keys_add', user_id=user_id),
112 {'description': desc, 'key_data': key_data,
113 {'description': desc, 'key_data': key_data,
113 'csrf_token': self.csrf_token})
114 'csrf_token': self.csrf_token})
115
116 err = 'Such key with fingerprint `{}` already exists, ' \
117 'please use a different one'.format(self.FINGERPRINT)
114 assert_session_flash(response, 'An error occurred during ssh key '
118 assert_session_flash(response, 'An error occurred during ssh key '
115 'saving: Such key already exists, '
119 'saving: {}'.format(err))
116 'please use a different one')
117
120
118 def test_add_ssh_key(self, user_util):
121 def test_add_ssh_key(self, user_util):
119 self.log_user()
122 self.log_user()
@@ -28,7 +28,7 b' from rhodecode.apps._base import BaseApp'
28 from rhodecode.apps.admin.navigation import navigation_list
28 from rhodecode.apps.admin.navigation import navigation_list
29 from rhodecode.lib.auth import (
29 from rhodecode.lib.auth import (
30 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
30 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
31 from rhodecode.lib.utils2 import safe_int
31 from rhodecode.lib.utils2 import safe_int, StrictAttributeDict
32
32
33 log = logging.getLogger(__name__)
33 log = logging.getLogger(__name__)
34
34
@@ -36,8 +36,40 b' log = logging.getLogger(__name__)'
36 class AdminProcessManagementView(BaseAppView):
36 class AdminProcessManagementView(BaseAppView):
37 def load_default_context(self):
37 def load_default_context(self):
38 c = self._get_local_tmpl_context()
38 c = self._get_local_tmpl_context()
39 return c
39
40
40 return c
41 def _format_proc(self, proc, with_children=False):
42 try:
43 mem = proc.memory_info()
44 proc_formatted = StrictAttributeDict({
45 'pid': proc.pid,
46 'name': proc.name(),
47 'mem_rss': mem.rss,
48 'mem_vms': mem.vms,
49 'cpu_percent': proc.cpu_percent(),
50 'create_time': proc.create_time(),
51 'cmd': ' '.join(proc.cmdline()),
52 })
53
54 if with_children:
55 proc_formatted.update({
56 'children': [self._format_proc(x)
57 for x in proc.children(recursive=True)]
58 })
59 except Exception:
60 log.exception('Failed to load proc')
61 proc_formatted = None
62 return proc_formatted
63
64 def get_processes(self):
65 proc_list = []
66 for p in psutil.process_iter():
67 if 'gunicorn' in p.name():
68 proc = self._format_proc(p, with_children=True)
69 if proc:
70 proc_list.append(proc)
71
72 return proc_list
41
73
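
The new `get_processes()` helper replaces the lazy generator expression with fully materialised per-process data. As a generic, standalone illustration of the `psutil` calls it relies on (this is not RhodeCode code; `list_gunicorn_processes` is a made-up name):

.. code-block:: python

    import psutil


    def list_gunicorn_processes():
        found = []
        for proc in psutil.process_iter():
            try:
                if 'gunicorn' not in proc.name():
                    continue
                mem = proc.memory_info()
                found.append({
                    'pid': proc.pid,
                    'rss_mb': mem.rss / 1024 / 1024,
                    'cpu_percent': proc.cpu_percent(),
                    'cmd': ' '.join(proc.cmdline()),
                    'children': [c.pid for c in proc.children(recursive=True)],
                })
            except (psutil.NoSuchProcess, psutil.AccessDenied):
                # processes can vanish or become inaccessible mid-iteration
                continue
        return found
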
42 @LoginRequired()
74 @LoginRequired()
43 @HasPermissionAllDecorator('hg.admin')
75 @HasPermissionAllDecorator('hg.admin')
@@ -50,8 +82,7 b' class AdminProcessManagementView(BaseApp'
50
82
51 c.active = 'process_management'
83 c.active = 'process_management'
52 c.navlist = navigation_list(self.request)
84 c.navlist = navigation_list(self.request)
53 c.gunicorn_processes = (
85 c.gunicorn_processes = self.get_processes()
54 p for p in psutil.process_iter() if 'gunicorn' in p.name())
55 return self._get_template_context(c)
86 return self._get_template_context(c)
56
87
57 @LoginRequired()
88 @LoginRequired()
@@ -62,8 +93,7 b' class AdminProcessManagementView(BaseApp'
62 def process_management_data(self):
93 def process_management_data(self):
63 _ = self.request.translate
94 _ = self.request.translate
64 c = self.load_default_context()
95 c = self.load_default_context()
65 c.gunicorn_processes = (
96 c.gunicorn_processes = self.get_processes()
66 p for p in psutil.process_iter() if 'gunicorn' in p.name())
67 return self._get_template_context(c)
97 return self._get_template_context(c)
68
98
69 @LoginRequired()
99 @LoginRequired()
@@ -75,9 +105,10 b' class AdminProcessManagementView(BaseApp'
75 def process_management_signal(self):
105 def process_management_signal(self):
76 pids = self.request.json.get('pids', [])
106 pids = self.request.json.get('pids', [])
77 result = []
107 result = []
108
78 def on_terminate(proc):
109 def on_terminate(proc):
79 msg = "process `PID:{}` terminated with exit code {}".format(
110 msg = "process `PID:{}` terminated with exit code {}".format(
80 proc.pid, proc.returncode)
111 proc.pid, proc.returncode or 0)
81 result.append(msg)
112 result.append(msg)
82
113
83 procs = []
114 procs = []
@@ -91,15 +122,22 b' class AdminProcessManagementView(BaseApp'
91
122
92 children = proc.children(recursive=True)
123 children = proc.children(recursive=True)
93 if children:
124 if children:
94 print('Wont kill Master Process')
125 log.warning("Won't kill Master Process")
95 else:
126 else:
96 procs.append(proc)
127 procs.append(proc)
97
128
98 for p in procs:
129 for p in procs:
99 p.terminate()
130 try:
131 p.terminate()
132 except psutil.AccessDenied as e:
133 log.warning('Access denied: {}'.format(e))
134
100 gone, alive = psutil.wait_procs(procs, timeout=10, callback=on_terminate)
135 gone, alive = psutil.wait_procs(procs, timeout=10, callback=on_terminate)
101 for p in alive:
136 for p in alive:
102 p.kill()
137 try:
138 p.kill()
139 except psutil.AccessDenied as e:
140 log.warning('Access denied: {}'.format(e))
103
141
104 return {'result': result}
142 return {'result': result}
105
143
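
The signal endpoint above now tolerates `psutil.AccessDenied` and reports a zero exit code for cleanly terminated workers. The underlying terminate-then-kill sequence, sketched standalone (the `stop_processes` name is illustrative; selecting the processes is assumed to happen elsewhere):

.. code-block:: python

    import psutil


    def stop_processes(procs, timeout=10):
        """Gracefully terminate `procs`, force-killing anything still alive."""
        results = []

        def on_terminate(proc):
            results.append('PID:{} exited with code {}'.format(
                proc.pid, proc.returncode or 0))

        for proc in procs:
            try:
                proc.terminate()  # SIGTERM first
            except psutil.AccessDenied:
                results.append('PID:{} access denied'.format(proc.pid))

        gone, alive = psutil.wait_procs(procs, timeout=timeout,
                                        callback=on_terminate)
        for proc in alive:
            try:
                proc.kill()  # SIGKILL as a last resort
            except psutil.AccessDenied:
                results.append('PID:{} access denied'.format(proc.pid))
        return results
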
@@ -314,6 +314,9 b' class AdminSettingsView(BaseAppView):'
314 try:
314 try:
315 form_result = application_form.to_python(dict(self.request.POST))
315 form_result = application_form.to_python(dict(self.request.POST))
316 except formencode.Invalid as errors:
316 except formencode.Invalid as errors:
317 h.flash(
318 _("Some form inputs contain invalid data."),
319 category='error')
317 data = render('rhodecode:templates/admin/settings/settings.mako',
320 data = render('rhodecode:templates/admin/settings/settings.mako',
318 self._get_template_context(c), self.request)
321 self._get_template_context(c), self.request)
319 html = formencode.htmlfill.render(
322 html = formencode.htmlfill.render(
@@ -386,6 +389,9 b' class AdminSettingsView(BaseAppView):'
386 try:
389 try:
387 form_result = application_form.to_python(dict(self.request.POST))
390 form_result = application_form.to_python(dict(self.request.POST))
388 except formencode.Invalid as errors:
391 except formencode.Invalid as errors:
392 h.flash(
393 _("Some form inputs contain invalid data."),
394 category='error')
389 data = render('rhodecode:templates/admin/settings/settings.mako',
395 data = render('rhodecode:templates/admin/settings/settings.mako',
390 self._get_template_context(c), self.request)
396 self._get_template_context(c), self.request)
391 html = formencode.htmlfill.render(
397 html = formencode.htmlfill.render(
@@ -669,6 +675,17 b' class AdminSettingsView(BaseAppView):'
669 @LoginRequired()
675 @LoginRequired()
670 @HasPermissionAllDecorator('hg.admin')
676 @HasPermissionAllDecorator('hg.admin')
671 @view_config(
677 @view_config(
678 route_name='admin_settings_automation', request_method='GET',
679 renderer='rhodecode:templates/admin/settings/settings.mako')
680 def settings_automation(self):
681 c = self.load_default_context()
682 c.active = 'automation'
683
684 return self._get_template_context(c)
685
686 @LoginRequired()
687 @HasPermissionAllDecorator('hg.admin')
688 @view_config(
672 route_name='admin_settings_labs', request_method='GET',
689 route_name='admin_settings_labs', request_method='GET',
673 renderer='rhodecode:templates/admin/settings/settings.mako')
690 renderer='rhodecode:templates/admin/settings/settings.mako')
674 def settings_labs(self):
691 def settings_labs(self):
@@ -705,7 +722,7 b' class AdminSettingsView(BaseAppView):'
705 form_result = application_form.to_python(dict(self.request.POST))
722 form_result = application_form.to_python(dict(self.request.POST))
706 except formencode.Invalid as errors:
723 except formencode.Invalid as errors:
707 h.flash(
724 h.flash(
708 _('Some form inputs contain invalid data.'),
725 _("Some form inputs contain invalid data."),
709 category='error')
726 category='error')
710 data = render('rhodecode:templates/admin/settings/settings.mako',
727 data = render('rhodecode:templates/admin/settings/settings.mako',
711 self._get_template_context(c), self.request)
728 self._get_template_context(c), self.request)
@@ -123,6 +123,9 b' class AdminSystemInfoSettingsView(BaseAp'
123 (_('Uptime'), val('uptime')['text'], state('uptime')),
123 (_('Uptime'), val('uptime')['text'], state('uptime')),
124 ('', '', ''), # spacer
124 ('', '', ''), # spacer
125
125
126 # ulimit
127 (_('Ulimit'), val('ulimit')['text'], state('ulimit')),
128
126 # Repo storage
129 # Repo storage
127 (_('Storage location'), val('storage')['path'], state('storage')),
130 (_('Storage location'), val('storage')['path'], state('storage')),
128 (_('Storage info'), val('storage')['text'], state('storage')),
131 (_('Storage info'), val('storage')['text'], state('storage')),
@@ -88,8 +88,8 b' class AdminUserGroupsView(BaseAppView, D'
88 _render = self.request.get_partial_renderer(
88 _render = self.request.get_partial_renderer(
89 'rhodecode:templates/data_table/_dt_elements.mako')
89 'rhodecode:templates/data_table/_dt_elements.mako')
90
90
91 def user_group_name(user_group_id, user_group_name):
91 def user_group_name(user_group_name):
92 return _render("user_group_name", user_group_id, user_group_name)
92 return _render("user_group_name", user_group_name)
93
93
94 def user_group_actions(user_group_id, user_group_name):
94 def user_group_actions(user_group_id, user_group_name):
95 return _render("user_group_actions", user_group_id, user_group_name)
95 return _render("user_group_actions", user_group_id, user_group_name)
@@ -153,15 +153,14 b' class AdminUserGroupsView(BaseAppView, D'
153 user_groups_data = []
153 user_groups_data = []
154 for user_gr in auth_user_group_list:
154 for user_gr in auth_user_group_list:
155 user_groups_data.append({
155 user_groups_data.append({
156 "users_group_name": user_group_name(
156 "users_group_name": user_group_name(user_gr.users_group_name),
157 user_gr.users_group_id, h.escape(user_gr.users_group_name)),
158 "name_raw": h.escape(user_gr.users_group_name),
157 "name_raw": h.escape(user_gr.users_group_name),
159 "description": h.escape(user_gr.user_group_description),
158 "description": h.escape(user_gr.user_group_description),
160 "members": user_gr.member_count,
159 "members": user_gr.member_count,
161 # NOTE(marcink): because of advanced query we
160 # NOTE(marcink): because of advanced query we
162 # need to load it like that
161 # need to load it like that
163 "sync": UserGroup._load_group_data(
162 "sync": UserGroup._load_sync(
164 user_gr.group_data).get('extern_type'),
163 UserGroup._load_group_data(user_gr.group_data)),
165 "active": h.bool2icon(user_gr.users_group_active),
164 "active": h.bool2icon(user_gr.users_group_active),
166 "owner": user_profile(user_gr.User.username),
165 "owner": user_profile(user_gr.User.username),
167 "action": user_group_actions(
166 "action": user_group_actions(
@@ -817,6 +817,7 b' class UsersView(UserAppView):'
817 key_data = self.request.POST.get('key_data')
817 key_data = self.request.POST.get('key_data')
818 description = self.request.POST.get('description')
818 description = self.request.POST.get('description')
819
819
820 fingerprint = 'unknown'
820 try:
821 try:
821 if not key_data:
822 if not key_data:
822 raise ValueError('Please add a valid public key')
823 raise ValueError('Please add a valid public key')
@@ -841,8 +842,9 b' class UsersView(UserAppView):'
841
842
842 except IntegrityError:
843 except IntegrityError:
843 log.exception("Exception during ssh key saving")
844 log.exception("Exception during ssh key saving")
844 h.flash(_('An error occurred during ssh key saving: {}').format(
845 err = 'Such key with fingerprint `{}` already exists, ' \
845 'Such key already exists, please use a different one'),
846 'please use a different one'.format(fingerprint)
847 h.flash(_('An error occurred during ssh key saving: {}').format(err),
846 category='error')
848 category='error')
847 except Exception as e:
849 except Exception as e:
848 log.exception("Exception during ssh key saving")
850 log.exception("Exception during ssh key saving")
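
The reworked error message interpolates the offending key's MD5 fingerprint (compare the `FINGERPRINT` constants in the updated tests). The changeset does not show how RhodeCode computes it, but the `MD5:xx:yy:...` form seen in the tests can in general be derived from an OpenSSH public-key line as below (`md5_fingerprint` is an illustrative helper, not the project's implementation):

.. code-block:: python

    import base64
    import hashlib


    def md5_fingerprint(key_data):
        # 'ssh-rsa AAAAB3... comment' -> decode the base64 blob and hash it
        blob = base64.b64decode(key_data.split()[1])
        digest = hashlib.md5(blob).hexdigest()
        return 'MD5:' + ':'.join(
            digest[i:i + 2] for i in range(0, len(digest), 2))
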
@@ -109,7 +109,7 b' class TestMyAccountEdit(TestController):'
109 # ('extern_name', {'extern_name': None}),
109 # ('extern_name', {'extern_name': None}),
110 ('active', {'active': False}),
110 ('active', {'active': False}),
111 ('active', {'active': True}),
111 ('active', {'active': True}),
112 ('email', {'email': 'some@email.com'}),
112 ('email', {'email': u'some@email.com'}),
113 ])
113 ])
114 def test_my_account_update(self, name, attrs, user_util):
114 def test_my_account_update(self, name, attrs, user_util):
115 usr = user_util.create_user(password='qweqwe')
115 usr = user_util.create_user(password='qweqwe')
@@ -120,13 +120,17 b' class TestMyAccountEdit(TestController):'
120
120
121 params.update({'password_confirmation': ''})
121 params.update({'password_confirmation': ''})
122 params.update({'new_password': ''})
122 params.update({'new_password': ''})
123 params.update({'extern_type': 'rhodecode'})
123 params.update({'extern_type': u'rhodecode'})
124 params.update({'extern_name': 'rhodecode'})
124 params.update({'extern_name': u'rhodecode'})
125 params.update({'csrf_token': self.csrf_token})
125 params.update({'csrf_token': self.csrf_token})
126
126
127 params.update(attrs)
127 params.update(attrs)
128 # my account page cannot set language param yet, only for admins
128 # my account page cannot set language param yet, only for admins
129 del params['language']
129 del params['language']
130 if name == 'email':
131 uem = user_util.create_additional_user_email(usr, attrs['email'])
132 email_before = User.get(user_id).email
133
130 response = self.app.post(route_path('my_account_update'), params)
134 response = self.app.post(route_path('my_account_update'), params)
131
135
132 assert_session_flash(
136 assert_session_flash(
@@ -146,7 +150,7 b' class TestMyAccountEdit(TestController):'
146 params['language'] = updated_params['language']
150 params['language'] = updated_params['language']
147
151
148 if name == 'email':
152 if name == 'email':
149 params['emails'] = [attrs['email']]
153 params['emails'] = [attrs['email'], email_before]
150 if name == 'extern_type':
154 if name == 'extern_type':
151 # cannot update this via form, expected value is original one
155 # cannot update this via form, expected value is original one
152 params['extern_type'] = "rhodecode"
156 params['extern_type'] = "rhodecode"
@@ -162,10 +166,10 b' class TestMyAccountEdit(TestController):'
162
166
163 assert params == updated_params
167 assert params == updated_params
164
168
165 def test_my_account_update_err_email_exists(self):
169 def test_my_account_update_err_email_not_exists_in_emails(self):
166 self.log_user()
170 self.log_user()
167
171
168 new_email = 'test_regular@mail.com' # already existing email
172 new_email = 'test_regular@mail.com' # not in emails
169 params = {
173 params = {
170 'username': 'test_admin',
174 'username': 'test_admin',
171 'new_password': 'test12',
175 'new_password': 'test12',
@@ -179,7 +183,7 b' class TestMyAccountEdit(TestController):'
179 response = self.app.post(route_path('my_account_update'),
183 response = self.app.post(route_path('my_account_update'),
180 params=params)
184 params=params)
181
185
182 response.mustcontain('This e-mail address is already taken')
186 response.mustcontain('"test_regular@mail.com" is not one of test_admin@mail.com')
183
187
184 def test_my_account_update_bad_email_address(self):
188 def test_my_account_update_bad_email_address(self):
185 self.log_user('test_regular2', 'test12')
189 self.log_user('test_regular2', 'test12')
@@ -197,7 +201,4 b' class TestMyAccountEdit(TestController):'
197 response = self.app.post(route_path('my_account_update'),
201 response = self.app.post(route_path('my_account_update'),
198 params=params)
202 params=params)
199
203
200 response.mustcontain('An email address must contain a single @')
204 response.mustcontain('"newmail.pl" is not one of test_regular2@mail.com')
201 msg = u'Username "%(username)s" already exists'
202 msg = h.html_escape(msg % {'username': 'test_admin'})
203 response.mustcontain(u"%s" % msg)
@@ -24,7 +24,7 b' from rhodecode.apps._base import ADMIN_P'
24 from rhodecode.model.db import User, UserEmailMap
24 from rhodecode.model.db import User, UserEmailMap
25 from rhodecode.tests import (
25 from rhodecode.tests import (
26 TestController, TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_EMAIL,
26 TestController, TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_EMAIL,
27 assert_session_flash)
27 assert_session_flash, TEST_USER_REGULAR_PASS)
28 from rhodecode.tests.fixture import Fixture
28 from rhodecode.tests.fixture import Fixture
29
29
30 fixture = Fixture()
30 fixture = Fixture()
@@ -47,30 +47,14 b' class TestMyAccountEmails(TestController'
47 response = self.app.get(route_path('my_account_emails'))
47 response = self.app.get(route_path('my_account_emails'))
48 response.mustcontain('No additional emails specified')
48 response.mustcontain('No additional emails specified')
49
49
50 def test_my_account_my_emails_add_existing_email(self):
51 self.log_user()
52 response = self.app.get(route_path('my_account_emails'))
53 response.mustcontain('No additional emails specified')
54 response = self.app.post(route_path('my_account_emails_add'),
55 {'new_email': TEST_USER_REGULAR_EMAIL,
56 'csrf_token': self.csrf_token})
57 assert_session_flash(response, 'This e-mail address is already taken')
58
59 def test_my_account_my_emails_add_mising_email_in_form(self):
60 self.log_user()
61 response = self.app.get(route_path('my_account_emails'))
62 response.mustcontain('No additional emails specified')
63 response = self.app.post(route_path('my_account_emails_add'),
64 {'csrf_token': self.csrf_token})
65 assert_session_flash(response, 'Please enter an email address')
66
67 def test_my_account_my_emails_add_remove(self):
50 def test_my_account_my_emails_add_remove(self):
68 self.log_user()
51 self.log_user()
69 response = self.app.get(route_path('my_account_emails'))
52 response = self.app.get(route_path('my_account_emails'))
70 response.mustcontain('No additional emails specified')
53 response.mustcontain('No additional emails specified')
71
54
72 response = self.app.post(route_path('my_account_emails_add'),
55 response = self.app.post(route_path('my_account_emails_add'),
73 {'new_email': 'foo@barz.com',
56 {'email': 'foo@barz.com',
57 'current_password': TEST_USER_REGULAR_PASS,
74 'csrf_token': self.csrf_token})
58 'csrf_token': self.csrf_token})
75
59
76 response = self.app.get(route_path('my_account_emails'))
60 response = self.app.get(route_path('my_account_emails'))
@@ -66,6 +66,7 b' class TestMyAccountSshKeysView(TestContr'
66 'I4fG8+hBHzpeFxUGvSGNtXPUbwaAY8j/oHYrTpMgkj6pUEFsiKfC5zPq' \
66 'I4fG8+hBHzpeFxUGvSGNtXPUbwaAY8j/oHYrTpMgkj6pUEFsiKfC5zPq' \
67 'PFR5HyKTCHW0nFUJnZsbyFT5hMiF/hZkJc9A0ZbdSvJwCRQ/g3bmdL ' \
67 'PFR5HyKTCHW0nFUJnZsbyFT5hMiF/hZkJc9A0ZbdSvJwCRQ/g3bmdL ' \
68 'your_email@example.com'
68 'your_email@example.com'
69 FINGERPRINT = 'MD5:01:4f:ad:29:22:6e:01:37:c9:d2:52:26:52:b0:2d:93'
69
70
70 def test_add_ssh_key_error(self, user_util):
71 def test_add_ssh_key_error(self, user_util):
71 user = user_util.create_user(password='qweqwe')
72 user = user_util.create_user(password='qweqwe')
@@ -100,9 +101,11 b' class TestMyAccountSshKeysView(TestContr'
100 route_path('my_account_ssh_keys_add'),
101 route_path('my_account_ssh_keys_add'),
101 {'description': desc, 'key_data': key_data,
102 {'description': desc, 'key_data': key_data,
102 'csrf_token': self.csrf_token})
103 'csrf_token': self.csrf_token})
104
105 err = 'Such key with fingerprint `{}` already exists, ' \
106 'please use a different one'.format(self.FINGERPRINT)
103 assert_session_flash(response, 'An error occurred during ssh key '
107 assert_session_flash(response, 'An error occurred during ssh key '
104 'saving: Such key already exists, '
108 'saving: {}'.format(err))
105 'please use a different one')
106
109
107 def test_add_ssh_key(self, user_util):
110 def test_add_ssh_key(self, user_util):
108 user = user_util.create_user(password='qweqwe')
111 user = user_util.create_user(password='qweqwe')
@@ -232,40 +232,59 b' class MyAccountView(BaseAppView, DataGri'
232
232
233 c.user_email_map = UserEmailMap.query()\
233 c.user_email_map = UserEmailMap.query()\
234 .filter(UserEmailMap.user == c.user).all()
234 .filter(UserEmailMap.user == c.user).all()
235
236 schema = user_schema.AddEmailSchema().bind(
237 username=c.user.username, user_emails=c.user.emails)
238
239 form = forms.RcForm(schema,
240 action=h.route_path('my_account_emails_add'),
241 buttons=(forms.buttons.save, forms.buttons.reset))
242
243 c.form = form
235 return self._get_template_context(c)
244 return self._get_template_context(c)
236
245
237 @LoginRequired()
246 @LoginRequired()
238 @NotAnonymous()
247 @NotAnonymous()
239 @CSRFRequired()
248 @CSRFRequired()
240 @view_config(
249 @view_config(
241 route_name='my_account_emails_add', request_method='POST')
250 route_name='my_account_emails_add', request_method='POST',
251 renderer='rhodecode:templates/admin/my_account/my_account.mako')
242 def my_account_emails_add(self):
252 def my_account_emails_add(self):
243 _ = self.request.translate
253 _ = self.request.translate
244 c = self.load_default_context()
254 c = self.load_default_context()
255 c.active = 'emails'
245
256
246 email = self.request.POST.get('new_email')
257 schema = user_schema.AddEmailSchema().bind(
258 username=c.user.username, user_emails=c.user.emails)
247
259
260 form = forms.RcForm(
261 schema, action=h.route_path('my_account_emails_add'),
262 buttons=(forms.buttons.save, forms.buttons.reset))
263
264 controls = self.request.POST.items()
248 try:
265 try:
249 form = UserExtraEmailForm(self.request.translate)()
266 valid_data = form.validate(controls)
250 data = form.to_python({'email': email})
267 UserModel().add_extra_email(c.user.user_id, valid_data['email'])
251 email = data['email']
252
253 UserModel().add_extra_email(c.user.user_id, email)
254 audit_logger.store_web(
268 audit_logger.store_web(
255 'user.edit.email.add', action_data={
269 'user.edit.email.add', action_data={
256 'data': {'email': email, 'user': 'self'}},
270 'data': {'email': valid_data['email'], 'user': 'self'}},
257 user=self._rhodecode_user,)
271 user=self._rhodecode_user,)
258
259 Session().commit()
272 Session().commit()
260 h.flash(_("Added new email address `%s` for user account") % email,
261 category='success')
262 except formencode.Invalid as error:
273 except formencode.Invalid as error:
263 h.flash(h.escape(error.error_dict['email']), category='error')
274 h.flash(h.escape(error.error_dict['email']), category='error')
275 except forms.ValidationFailure as e:
276 c.user_email_map = UserEmailMap.query() \
277 .filter(UserEmailMap.user == c.user).all()
278 c.form = e
279 return self._get_template_context(c)
264 except Exception:
280 except Exception:
265 log.exception("Exception in my_account_emails")
281 log.exception("Exception adding email")
266 h.flash(_('An error occurred during email saving'),
282 h.flash(_('An error occurred while adding the email'),
267 category='error')
283 category='error')
268 return HTTPFound(h.route_path('my_account_emails'))
284 else:
285 h.flash(_("Successfully added email"), category='success')
286
287 raise HTTPFound(self.request.route_path('my_account_emails'))
269
288
270 @LoginRequired()
289 @LoginRequired()
271 @NotAnonymous()
290 @NotAnonymous()
@@ -414,22 +433,23 b' class MyAccountView(BaseAppView, DataGri'
414 def my_account_edit(self):
433 def my_account_edit(self):
415 c = self.load_default_context()
434 c = self.load_default_context()
416 c.active = 'profile_edit'
435 c.active = 'profile_edit'
417
418 c.perm_user = c.auth_user
419 c.extern_type = c.user.extern_type
436 c.extern_type = c.user.extern_type
420 c.extern_name = c.user.extern_name
437 c.extern_name = c.user.extern_name
421
438
422 defaults = c.user.get_dict()
439 schema = user_schema.UserProfileSchema().bind(
440 username=c.user.username, user_emails=c.user.emails)
441 appstruct = {
442 'username': c.user.username,
443 'email': c.user.email,
444 'firstname': c.user.firstname,
445 'lastname': c.user.lastname,
446 }
447 c.form = forms.RcForm(
448 schema, appstruct=appstruct,
449 action=h.route_path('my_account_update'),
450 buttons=(forms.buttons.save, forms.buttons.reset))
423
451
424 data = render('rhodecode:templates/admin/my_account/my_account.mako',
452 return self._get_template_context(c)
425 self._get_template_context(c), self.request)
426 html = formencode.htmlfill.render(
427 data,
428 defaults=defaults,
429 encoding="UTF-8",
430 force_defaults=False
431 )
432 return Response(html)
433
453
434 @LoginRequired()
454 @LoginRequired()
435 @NotAnonymous()
455 @NotAnonymous()
@@ -442,55 +462,40 b' class MyAccountView(BaseAppView, DataGri'
442 _ = self.request.translate
462 _ = self.request.translate
443 c = self.load_default_context()
463 c = self.load_default_context()
444 c.active = 'profile_edit'
464 c.active = 'profile_edit'
445
446 c.perm_user = c.auth_user
465 c.perm_user = c.auth_user
447 c.extern_type = c.user.extern_type
466 c.extern_type = c.user.extern_type
448 c.extern_name = c.user.extern_name
467 c.extern_name = c.user.extern_name
449
468
450 _form = UserForm(self.request.translate, edit=True,
469 schema = user_schema.UserProfileSchema().bind(
451 old_data={'user_id': self._rhodecode_user.user_id,
470 username=c.user.username, user_emails=c.user.emails)
452 'email': self._rhodecode_user.email})()
471 form = forms.RcForm(
453 form_result = {}
472 schema, buttons=(forms.buttons.save, forms.buttons.reset))
473
474 controls = self.request.POST.items()
454 try:
475 try:
455 post_data = dict(self.request.POST)
476 valid_data = form.validate(controls)
456 post_data['new_password'] = ''
457 post_data['password_confirmation'] = ''
458 form_result = _form.to_python(post_data)
459 # skip updating those attrs for my account
460 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
477 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
461 'new_password', 'password_confirmation']
478 'new_password', 'password_confirmation']
462 # TODO: plugin should define if username can be updated
463 if c.extern_type != "rhodecode":
479 if c.extern_type != "rhodecode":
464 # forbid updating username for external accounts
480 # forbid updating username for external accounts
465 skip_attrs.append('username')
481 skip_attrs.append('username')
466
482 old_email = c.user.email
467 UserModel().update_user(
483 UserModel().update_user(
468 self._rhodecode_user.user_id, skip_attrs=skip_attrs,
484 self._rhodecode_user.user_id, skip_attrs=skip_attrs,
469 **form_result)
485 **valid_data)
470 h.flash(_('Your account was updated successfully'),
486 if old_email != valid_data['email']:
471 category='success')
487 old = UserEmailMap.query() \
488 .filter(UserEmailMap.user == c.user).filter(UserEmailMap.email == valid_data['email']).first()
489 old.email = old_email
490 h.flash(_('Your account was updated successfully'), category='success')
472 Session().commit()
491 Session().commit()
473
492 except forms.ValidationFailure as e:
474 except formencode.Invalid as errors:
493 c.form = e
475 data = render(
494 return self._get_template_context(c)
476 'rhodecode:templates/admin/my_account/my_account.mako',
477 self._get_template_context(c), self.request)
478
479 html = formencode.htmlfill.render(
480 data,
481 defaults=errors.value,
482 errors=errors.error_dict or {},
483 prefix_error=False,
484 encoding="UTF-8",
485 force_defaults=False)
486 return Response(html)
487
488 except Exception:
495 except Exception:
489 log.exception("Exception updating user")
496 log.exception("Exception updating user")
490 h.flash(_('Error occurred during update of user %s')
497 h.flash(_('An error occurred during the user update'),
491 % form_result.get('username'), category='error')
498 category='error')
492 raise HTTPFound(h.route_path('my_account_profile'))
493
494 raise HTTPFound(h.route_path('my_account_profile'))
499 raise HTTPFound(h.route_path('my_account_profile'))
495
500
496 def _get_pull_requests_list(self, statuses):
501 def _get_pull_requests_list(self, statuses):
@@ -89,7 +89,7 b' class MyAccountSshKeysView(BaseAppView, '
89 user_data = c.user.get_api_data()
89 user_data = c.user.get_api_data()
90 key_data = self.request.POST.get('key_data')
90 key_data = self.request.POST.get('key_data')
91 description = self.request.POST.get('description')
91 description = self.request.POST.get('description')
92
92 fingerprint = 'unknown'
93 try:
93 try:
94 if not key_data:
94 if not key_data:
95 raise ValueError('Please add a valid public key')
95 raise ValueError('Please add a valid public key')
@@ -114,8 +114,9 b' class MyAccountSshKeysView(BaseAppView, '
114
114
115 except IntegrityError:
115 except IntegrityError:
116 log.exception("Exception during ssh key saving")
116 log.exception("Exception during ssh key saving")
117 h.flash(_('An error occurred during ssh key saving: {}').format(
117 err = 'Such key with fingerprint `{}` already exists, ' \
118 'Such key already exists, please use a different one'),
118 'please use a different one'.format(fingerprint)
119 h.flash(_('An error occurred during ssh key saving: {}').format(err),
119 category='error')
120 category='error')
120 except Exception as e:
121 except Exception as e:
121 log.exception("Exception during ssh key saving")
122 log.exception("Exception during ssh key saving")
@@ -331,6 +331,10 b' def includeme(config):'
331 name='edit_repo_advanced_fork',
331 name='edit_repo_advanced_fork',
332 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
332 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
333
333
334 config.add_route(
335 name='edit_repo_advanced_hooks',
336 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
337
334 # Caches
338 # Caches
335 config.add_route(
339 config.add_route(
336 name='edit_repo_caches',
340 name='edit_repo_caches',
@@ -373,6 +377,9 b' def includeme(config):'
373 config.add_route(
377 config.add_route(
374 name='edit_repo_remote_pull',
378 name='edit_repo_remote_pull',
375 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
379 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
380 config.add_route(
381 name='edit_repo_remote_push',
382 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
376
383
377 # Statistics
384 # Statistics
378 config.add_route(
385 config.add_route(
@@ -418,6 +425,11 b' def includeme(config):'
418 name='repo_default_reviewers_data',
425 name='repo_default_reviewers_data',
419 pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
426 pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
420
427
428 # Repo Automation (EE feature)
429 config.add_route(
430 name='repo_automation',
431 pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True)
432
421 # Strip
433 # Strip
422 config.add_route(
434 config.add_route(
423 name='edit_repo_strip',
435 name='edit_repo_strip',
@@ -234,7 +234,8 b' class TestSummaryView(object):'
234 Repository, 'scm_instance', side_effect=RepositoryRequirementError)
234 Repository, 'scm_instance', side_effect=RepositoryRequirementError)
235
235
236 with scm_patcher:
236 with scm_patcher:
237 response = self.app.get(route_path('repo_summary', repo_name=repo_name))
237 response = self.app.get(
238 route_path('repo_summary', repo_name=repo_name))
238 assert_response = AssertResponse(response)
239 assert_response = AssertResponse(response)
239 assert_response.element_contains(
240 assert_response.element_contains(
240 '.main .alert-warning strong', 'Missing requirements')
241 '.main .alert-warning strong', 'Missing requirements')
@@ -18,6 +18,7 b''
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import os
21 import logging
22 import logging
22
23
23 from pyramid.httpexceptions import HTTPFound
24 from pyramid.httpexceptions import HTTPFound
@@ -27,6 +28,7 b' from rhodecode.apps._base import RepoApp'
27 from rhodecode.lib.auth import (
28 from rhodecode.lib.auth import (
28 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
29 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
29 from rhodecode.lib import helpers as h
30 from rhodecode.lib import helpers as h
31 from rhodecode.lib import system_info
30 from rhodecode.model.meta import Session
32 from rhodecode.model.meta import Session
31 from rhodecode.model.scm import ScmModel
33 from rhodecode.model.scm import ScmModel
32
34
@@ -36,8 +38,6 b' log = logging.getLogger(__name__)'
36 class RepoCachesView(RepoAppView):
38 class RepoCachesView(RepoAppView):
37 def load_default_context(self):
39 def load_default_context(self):
38 c = self._get_local_tmpl_context()
40 c = self._get_local_tmpl_context()
39
40
41 return c
41 return c
42
42
43 @LoginRequired()
43 @LoginRequired()
@@ -48,6 +48,11 b' class RepoCachesView(RepoAppView):'
48 def repo_caches(self):
48 def repo_caches(self):
49 c = self.load_default_context()
49 c = self.load_default_context()
50 c.active = 'caches'
50 c.active = 'caches'
51 cached_diffs_dir = c.rhodecode_db_repo.cached_diffs_dir
52 c.cached_diff_count = len(c.rhodecode_db_repo.cached_diffs())
53 c.cached_diff_size = 0
54 if os.path.isdir(cached_diffs_dir):
55 c.cached_diff_size = system_info.get_storage_size(cached_diffs_dir)
51
56
52 return self._get_template_context(c)
57 return self._get_template_context(c)
53
58
@@ -34,17 +34,18 b' from rhodecode.lib.auth import ('
34 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
34 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
35
35
36 from rhodecode.lib.compat import OrderedDict
36 from rhodecode.lib.compat import OrderedDict
37 from rhodecode.lib.diffs import cache_diff, load_cached_diff, diff_cache_exist
37 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError
38 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError
38 import rhodecode.lib.helpers as h
39 import rhodecode.lib.helpers as h
39 from rhodecode.lib.utils2 import safe_unicode
40 from rhodecode.lib.utils2 import safe_unicode, str2bool
40 from rhodecode.lib.vcs.backends.base import EmptyCommit
41 from rhodecode.lib.vcs.backends.base import EmptyCommit
41 from rhodecode.lib.vcs.exceptions import (
42 from rhodecode.lib.vcs.exceptions import (
42 RepositoryError, CommitDoesNotExistError, NodeDoesNotExistError)
43 RepositoryError, CommitDoesNotExistError)
43 from rhodecode.model.db import ChangesetComment, ChangesetStatus
44 from rhodecode.model.db import ChangesetComment, ChangesetStatus
44 from rhodecode.model.changeset_status import ChangesetStatusModel
45 from rhodecode.model.changeset_status import ChangesetStatusModel
45 from rhodecode.model.comment import CommentsModel
46 from rhodecode.model.comment import CommentsModel
46 from rhodecode.model.meta import Session
47 from rhodecode.model.meta import Session
47
48 from rhodecode.model.settings import VcsSettingsModel
48
49
49 log = logging.getLogger(__name__)
50 log = logging.getLogger(__name__)
50
51
@@ -152,6 +153,12 b' class RepoCommitsView(RepoAppView):'
152
153
153 return c
154 return c
154
155
156 def _is_diff_cache_enabled(self, target_repo):
157 caching_enabled = self._get_general_setting(
158 target_repo, 'rhodecode_diff_cache')
159 log.debug('Diff caching enabled: %s', caching_enabled)
160 return caching_enabled
161
155 def _commit(self, commit_id_range, method):
162 def _commit(self, commit_id_range, method):
156 _ = self.request.translate
163 _ = self.request.translate
157 c = self.load_default_context()
164 c = self.load_default_context()
@@ -240,45 +247,65 b' class RepoCommitsView(RepoAppView):'
240 commit2 = commit
247 commit2 = commit
241 commit1 = commit.parents[0] if commit.parents else EmptyCommit()
248 commit1 = commit.parents[0] if commit.parents else EmptyCommit()
242
249
243 _diff = self.rhodecode_vcs_repo.get_diff(
244 commit1, commit2,
245 ignore_whitespace=ign_whitespace_lcl, context=context_lcl)
246 diff_processor = diffs.DiffProcessor(
247 _diff, format='newdiff', diff_limit=diff_limit,
248 file_limit=file_limit, show_full_diff=c.fulldiff)
249
250 commit_changes = OrderedDict()
251 if method == 'show':
250 if method == 'show':
252 _parsed = diff_processor.prepare()
253 c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer)
254
255 _parsed = diff_processor.prepare()
256
257 def _node_getter(commit):
258 def get_node(fname):
259 try:
260 return commit.get_node(fname)
261 except NodeDoesNotExistError:
262 return None
263 return get_node
264
265 inline_comments = CommentsModel().get_inline_comments(
251 inline_comments = CommentsModel().get_inline_comments(
266 self.db_repo.repo_id, revision=commit.raw_id)
252 self.db_repo.repo_id, revision=commit.raw_id)
267 c.inline_cnt = CommentsModel().get_inline_comments_count(
253 c.inline_cnt = CommentsModel().get_inline_comments_count(
268 inline_comments)
254 inline_comments)
255 c.inline_comments = inline_comments
269
256
270 diffset = codeblocks.DiffSet(
257 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
271 repo_name=self.db_repo_name,
258 self.db_repo)
272 source_node_getter=_node_getter(commit1),
259 cache_file_path = diff_cache_exist(
273 target_node_getter=_node_getter(commit2),
260 cache_path, 'diff', commit.raw_id,
274 comments=inline_comments)
261 ign_whitespace_lcl, context_lcl, c.fulldiff)
275 diffset = diffset.render_patchset(
262
276 _parsed, commit1.raw_id, commit2.raw_id)
263 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
264 force_recache = str2bool(self.request.GET.get('force_recache'))
265
266 cached_diff = None
267 if caching_enabled:
268 cached_diff = load_cached_diff(cache_file_path)
277
269
270 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
271 if not force_recache and has_proper_diff_cache:
272 diffset = cached_diff['diff']
273 else:
274 vcs_diff = self.rhodecode_vcs_repo.get_diff(
275 commit1, commit2,
276 ignore_whitespace=ign_whitespace_lcl,
277 context=context_lcl)
278
279 diff_processor = diffs.DiffProcessor(
280 vcs_diff, format='newdiff', diff_limit=diff_limit,
281 file_limit=file_limit, show_full_diff=c.fulldiff)
282
283 _parsed = diff_processor.prepare()
284
285 diffset = codeblocks.DiffSet(
286 repo_name=self.db_repo_name,
287 source_node_getter=codeblocks.diffset_node_getter(commit1),
288 target_node_getter=codeblocks.diffset_node_getter(commit2))
289
290 diffset = self.path_filter.render_patchset_filtered(
291 diffset, _parsed, commit1.raw_id, commit2.raw_id)
292
293 # save cached diff
294 if caching_enabled:
295 cache_diff(cache_file_path, diffset, None)
296
297 c.limited_diff = diffset.limited_diff
278 c.changes[commit.raw_id] = diffset
298 c.changes[commit.raw_id] = diffset
279 else:
299 else:
300 # TODO(marcink): no cache usage here...
301 _diff = self.rhodecode_vcs_repo.get_diff(
302 commit1, commit2,
303 ignore_whitespace=ign_whitespace_lcl, context=context_lcl)
304 diff_processor = diffs.DiffProcessor(
305 _diff, format='newdiff', diff_limit=diff_limit,
306 file_limit=file_limit, show_full_diff=c.fulldiff)
280 # downloads/raw we only need RAW diff nothing else
307 # downloads/raw we only need RAW diff nothing else
281 diff = diff_processor.as_raw()
308 diff = self.path_filter.get_raw_patch(diff_processor)
282 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
309 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
283
310
284 # sort comments by how they were generated
311 # sort comments by how they were generated
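
The `show` branch now consults a per-commit diff cache via `diff_cache_exist`, `load_cached_diff` and `cache_diff` from `rhodecode.lib.diffs` before recomputing and re-rendering the diff, and `force_recache` bypasses it. The internals of those helpers are not part of this changeset; the general compute-unless-cached flow they implement looks roughly like this sketch, which uses a plain pickle file as a stand-in:

.. code-block:: python

    import os
    import pickle


    def get_diff_cached(cache_file_path, compute_diff, force_recache=False):
        if not force_recache and os.path.isfile(cache_file_path):
            with open(cache_file_path, 'rb') as f:
                cached = pickle.load(f)
            if cached.get('diff') is not None:
                return cached['diff']

        diff = compute_diff()  # expensive: ask the VCS backend and parse
        with open(cache_file_path, 'wb') as f:
            pickle.dump({'diff': diff}, f)
        return diff
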
@@ -295,22 +295,13 b' class RepoCompareView(RepoAppView):'
295 file_limit=file_limit, show_full_diff=c.fulldiff)
295 file_limit=file_limit, show_full_diff=c.fulldiff)
296 _parsed = diff_processor.prepare()
296 _parsed = diff_processor.prepare()
297
297
298 def _node_getter(commit):
299 """ Returns a function that returns a node for a commit or None """
300 def get_node(fname):
301 try:
302 return commit.get_node(fname)
303 except NodeDoesNotExistError:
304 return None
305 return get_node
306
307 diffset = codeblocks.DiffSet(
298 diffset = codeblocks.DiffSet(
308 repo_name=source_repo.repo_name,
299 repo_name=source_repo.repo_name,
309 source_node_getter=_node_getter(source_commit),
300 source_node_getter=codeblocks.diffset_node_getter(source_commit),
310 target_node_getter=_node_getter(target_commit),
301 target_node_getter=codeblocks.diffset_node_getter(target_commit),
311 )
302 )
312 c.diffset = diffset.render_patchset(
303 c.diffset = self.path_filter.render_patchset_filtered(
313 _parsed, source_ref, target_ref)
304 diffset, _parsed, source_ref, target_ref)
314
305
315 c.preview_mode = merge
306 c.preview_mode = merge
316 c.source_commit = source_commit
307 c.source_commit = source_commit
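
Both the commits and compare views drop their private `_node_getter` closures in favour of the shared `codeblocks.diffset_node_getter`. Its implementation is not shown in this changeset, but judging by the helper removed above it is presumably equivalent to:

.. code-block:: python

    from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError


    def diffset_node_getter(commit):
        """Return a callable fetching a file node from `commit`, or None
        when the path does not exist in that commit (added/deleted files)."""
        def get_node(fname):
            try:
                return commit.get_node(fname)
            except NodeDoesNotExistError:
                return None
        return get_node
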
@@ -90,13 +90,15 b' class RepoFeedView(RepoAppView):'
90 _renderer = self.request.get_partial_renderer(
90 _renderer = self.request.get_partial_renderer(
91 'rhodecode:templates/feed/atom_feed_entry.mako')
91 'rhodecode:templates/feed/atom_feed_entry.mako')
92 diff_processor, parsed_diff, limited_diff = self._changes(commit)
92 diff_processor, parsed_diff, limited_diff = self._changes(commit)
93 filtered_parsed_diff, has_hidden_changes = self.path_filter.filter_patchset(parsed_diff)
93 return _renderer(
94 return _renderer(
94 'body',
95 'body',
95 commit=commit,
96 commit=commit,
96 parsed_diff=parsed_diff,
97 parsed_diff=filtered_parsed_diff,
97 limited_diff=limited_diff,
98 limited_diff=limited_diff,
98 feed_include_diff=self.feed_include_diff,
99 feed_include_diff=self.feed_include_diff,
99 diff_processor=diff_processor,
100 diff_processor=diff_processor,
101 has_hidden_changes=has_hidden_changes
100 )
102 )
101
103
102 def _set_timezone(self, date, tzinfo=pytz.utc):
104 def _set_timezone(self, date, tzinfo=pytz.utc):
@@ -122,8 +124,7 b' class RepoFeedView(RepoAppView):'
122 """
124 """
123 self.load_default_context()
125 self.load_default_context()
124
126
125 @cache_region('long_term')
127 def _generate_feed():
126 def _generate_feed(cache_key):
127 feed = Atom1Feed(
128 feed = Atom1Feed(
128 title=self.title % self.db_repo_name,
129 title=self.title % self.db_repo_name,
129 link=h.route_url('repo_summary', repo_name=self.db_repo_name),
130 link=h.route_url('repo_summary', repo_name=self.db_repo_name),
@@ -146,12 +147,18 b' class RepoFeedView(RepoAppView):'
146
147
147 return feed.mime_type, feed.writeString('utf-8')
148 return feed.mime_type, feed.writeString('utf-8')
148
149
149 invalidator_context = CacheKey.repo_context_cache(
150 @cache_region('long_term')
150 _generate_feed, self.db_repo_name, CacheKey.CACHE_TYPE_ATOM)
151 def _generate_feed_and_cache(cache_key):
152 return _generate_feed()
151
153
152 with invalidator_context as context:
154 if self.path_filter.is_enabled:
153 context.invalidate()
155 invalidator_context = CacheKey.repo_context_cache(
154 mime_type, feed = context.compute()
156 _generate_feed_and_cache, self.db_repo_name, CacheKey.CACHE_TYPE_ATOM)
157 with invalidator_context as context:
158 context.invalidate()
159 mime_type, feed = context.compute()
160 else:
161 mime_type, feed = _generate_feed()
155
162
156 response = Response(feed)
163 response = Response(feed)
157 response.content_type = mime_type
164 response.content_type = mime_type
@@ -169,8 +176,7 b' class RepoFeedView(RepoAppView):'
169 """
176 """
170 self.load_default_context()
177 self.load_default_context()
171
178
172 @cache_region('long_term')
179 def _generate_feed():
173 def _generate_feed(cache_key):
174 feed = Rss201rev2Feed(
180 feed = Rss201rev2Feed(
175 title=self.title % self.db_repo_name,
181 title=self.title % self.db_repo_name,
176 link=h.route_url('repo_summary', repo_name=self.db_repo_name),
182 link=h.route_url('repo_summary', repo_name=self.db_repo_name),
@@ -193,12 +199,19 b' class RepoFeedView(RepoAppView):'
193
199
194 return feed.mime_type, feed.writeString('utf-8')
200 return feed.mime_type, feed.writeString('utf-8')
195
201
196 invalidator_context = CacheKey.repo_context_cache(
202 @cache_region('long_term')
197 _generate_feed, self.db_repo_name, CacheKey.CACHE_TYPE_RSS)
203 def _generate_feed_and_cache(cache_key):
204 return _generate_feed()
198
205
199 with invalidator_context as context:
206 if self.path_filter.is_enabled:
200 context.invalidate()
207 invalidator_context = CacheKey.repo_context_cache(
201 mime_type, feed = context.compute()
208 _generate_feed_and_cache, self.db_repo_name, CacheKey.CACHE_TYPE_RSS)
209
210 with invalidator_context as context:
211 context.invalidate()
212 mime_type, feed = context.compute()
213 else:
214 mime_type, feed = _generate_feed()
202
215
203 response = Response(feed)
216 response = Response(feed)
204 response.content_type = mime_type
217 response.content_type = mime_type
@@ -426,7 +426,7 b' class RepoFilesView(RepoAppView):'
426 context=line_context)
426 context=line_context)
427 diff = diffs.DiffProcessor(_diff, format='gitdiff')
427 diff = diffs.DiffProcessor(_diff, format='gitdiff')
428
428
429 response = Response(diff.as_raw())
429 response = Response(self.path_filter.get_raw_patch(diff))
430 response.content_type = 'text/plain'
430 response.content_type = 'text/plain'
431 response.content_disposition = (
431 response.content_disposition = (
432 'attachment; filename=%s_%s_vs_%s.diff' % (f_path, diff1, diff2)
432 'attachment; filename=%s_%s_vs_%s.diff' % (f_path, diff1, diff2)
@@ -442,7 +442,7 b' class RepoFilesView(RepoAppView):'
442 context=line_context)
442 context=line_context)
443 diff = diffs.DiffProcessor(_diff, format='gitdiff')
443 diff = diffs.DiffProcessor(_diff, format='gitdiff')
444
444
445 response = Response(diff.as_raw())
445 response = Response(self.path_filter.get_raw_patch(diff))
446 response.content_type = 'text/plain'
446 response.content_type = 'text/plain'
447 charset = self._get_default_encoding(c)
447 charset = self._get_default_encoding(c)
448 if charset:
448 if charset:
@@ -462,7 +462,7 b' class RepoFilesView(RepoAppView):'
462 """
462 """
463 Kept only to make OLD links work
463 Kept only to make OLD links work
464 """
464 """
465 f_path = self._get_f_path(self.request.matchdict)
465 f_path = self._get_f_path_unchecked(self.request.matchdict)
466 diff1 = self.request.GET.get('diff1', '')
466 diff1 = self.request.GET.get('diff1', '')
467 diff2 = self.request.GET.get('diff2', '')
467 diff2 = self.request.GET.get('diff2', '')
468
468
@@ -34,6 +34,7 b' from rhodecode.apps._base import RepoApp'
34
34
35 from rhodecode.lib import helpers as h, diffs, codeblocks, channelstream
35 from rhodecode.lib import helpers as h, diffs, codeblocks, channelstream
36 from rhodecode.lib.base import vcs_operation_context
36 from rhodecode.lib.base import vcs_operation_context
37 from rhodecode.lib.diffs import load_cached_diff, cache_diff, diff_cache_exist
37 from rhodecode.lib.ext_json import json
38 from rhodecode.lib.ext_json import json
38 from rhodecode.lib.auth import (
39 from rhodecode.lib.auth import (
39 LoginRequired, HasRepoPermissionAny, HasRepoPermissionAnyDecorator,
40 LoginRequired, HasRepoPermissionAny, HasRepoPermissionAnyDecorator,
@@ -41,7 +42,7 b' from rhodecode.lib.auth import ('
41 from rhodecode.lib.utils2 import str2bool, safe_str, safe_unicode
42 from rhodecode.lib.utils2 import str2bool, safe_str, safe_unicode
42 from rhodecode.lib.vcs.backends.base import EmptyCommit, UpdateFailureReason
43 from rhodecode.lib.vcs.backends.base import EmptyCommit, UpdateFailureReason
43 from rhodecode.lib.vcs.exceptions import (CommitDoesNotExistError,
44 from rhodecode.lib.vcs.exceptions import (CommitDoesNotExistError,
44 RepositoryRequirementError, NodeDoesNotExistError, EmptyRepositoryError)
45 RepositoryRequirementError, EmptyRepositoryError)
45 from rhodecode.model.changeset_status import ChangesetStatusModel
46 from rhodecode.model.changeset_status import ChangesetStatusModel
46 from rhodecode.model.comment import CommentsModel
47 from rhodecode.model.comment import CommentsModel
47 from rhodecode.model.db import (func, or_, PullRequest, PullRequestVersion,
48 from rhodecode.model.db import (func, or_, PullRequest, PullRequestVersion,
@@ -201,10 +202,16 b' class RepoPullRequestsView(RepoAppView, '
201
202
202 return data
203 return data
203
204
205 def _is_diff_cache_enabled(self, target_repo):
206 caching_enabled = self._get_general_setting(
207 target_repo, 'rhodecode_diff_cache')
208 log.debug('Diff caching enabled: %s', caching_enabled)
209 return caching_enabled
210
204 def _get_diffset(self, source_repo_name, source_repo,
211 def _get_diffset(self, source_repo_name, source_repo,
205 source_ref_id, target_ref_id,
212 source_ref_id, target_ref_id,
206 target_commit, source_commit, diff_limit, fulldiff,
213 target_commit, source_commit, diff_limit, file_limit,
207 file_limit, display_inline_comments):
214 fulldiff):
208
215
209 vcs_diff = PullRequestModel().get_diff(
216 vcs_diff = PullRequestModel().get_diff(
210 source_repo, source_ref_id, target_ref_id)
217 source_repo, source_ref_id, target_ref_id)
@@ -215,24 +222,14 b' class RepoPullRequestsView(RepoAppView, '
215
222
216 _parsed = diff_processor.prepare()
223 _parsed = diff_processor.prepare()
217
224
218 def _node_getter(commit):
219 def get_node(fname):
220 try:
221 return commit.get_node(fname)
222 except NodeDoesNotExistError:
223 return None
224
225 return get_node
226
227 diffset = codeblocks.DiffSet(
225 diffset = codeblocks.DiffSet(
228 repo_name=self.db_repo_name,
226 repo_name=self.db_repo_name,
229 source_repo_name=source_repo_name,
227 source_repo_name=source_repo_name,
230 source_node_getter=_node_getter(target_commit),
228 source_node_getter=codeblocks.diffset_node_getter(target_commit),
231 target_node_getter=_node_getter(source_commit),
229 target_node_getter=codeblocks.diffset_node_getter(source_commit),
232 comments=display_inline_comments
233 )
230 )
234 diffset = diffset.render_patchset(
231 diffset = self.path_filter.render_patchset_filtered(
235 _parsed, target_commit.raw_id, source_commit.raw_id)
232 diffset, _parsed, target_commit.raw_id, source_commit.raw_id)
236
233
237 return diffset
234 return diffset
238
235
@@ -443,42 +440,54 b' class RepoPullRequestsView(RepoAppView, '
443 commits_source_repo = source_scm
440 commits_source_repo = source_scm
444
441
445 c.commits_source_repo = commits_source_repo
442 c.commits_source_repo = commits_source_repo
446 commit_cache = {}
447 try:
448 pre_load = ["author", "branch", "date", "message"]
449 show_revs = pull_request_at_ver.revisions
450 for rev in show_revs:
451 comm = commits_source_repo.get_commit(
452 commit_id=rev, pre_load=pre_load)
453 c.commit_ranges.append(comm)
454 commit_cache[comm.raw_id] = comm
455
456 # Order here matters, we first need to get target, and then
457 # the source
458 target_commit = commits_source_repo.get_commit(
459 commit_id=safe_str(target_ref_id))
460
461 source_commit = commits_source_repo.get_commit(
462 commit_id=safe_str(source_ref_id))
463
464 except CommitDoesNotExistError:
465 log.warning(
466 'Failed to get commit from `{}` repo'.format(
467 commits_source_repo), exc_info=True)
468 except RepositoryRequirementError:
469 log.warning(
470 'Failed to get all required data from repo', exc_info=True)
471 c.missing_requirements = True
472
473 c.ancestor = None # set it to None, to hide it from PR view
443 c.ancestor = None # set it to None, to hide it from PR view
474
444
475 try:
445 # empty version means latest, so we keep this to prevent
476 ancestor_id = source_scm.get_common_ancestor(
446 # double caching
477 source_commit.raw_id, target_commit.raw_id, target_scm)
447 version_normalized = version or 'latest'
478 c.ancestor_commit = source_scm.get_commit(ancestor_id)
448 from_version_normalized = from_version or 'latest'
479 except Exception:
449
480 c.ancestor_commit = None
450 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
451 target_repo)
452 cache_file_path = diff_cache_exist(
453 cache_path, 'pull_request', pull_request_id, version_normalized,
454 from_version_normalized, source_ref_id, target_ref_id, c.fulldiff)
455
456 caching_enabled = self._is_diff_cache_enabled(c.target_repo)
457 force_recache = str2bool(self.request.GET.get('force_recache'))
458
459 cached_diff = None
460 if caching_enabled:
461 cached_diff = load_cached_diff(cache_file_path)
481
462
463 has_proper_commit_cache = (
464 cached_diff and cached_diff.get('commits')
465 and len(cached_diff.get('commits', [])) == 5
466 and cached_diff.get('commits')[0]
467 and cached_diff.get('commits')[3])
468 if not force_recache and has_proper_commit_cache:
469 diff_commit_cache = \
470 (ancestor_commit, commit_cache, missing_requirements,
471 source_commit, target_commit) = cached_diff['commits']
472 else:
473 diff_commit_cache = \
474 (ancestor_commit, commit_cache, missing_requirements,
475 source_commit, target_commit) = self.get_commits(
476 commits_source_repo,
477 pull_request_at_ver,
478 source_commit,
479 source_ref_id,
480 source_scm,
481 target_commit,
482 target_ref_id,
483 target_scm)
484
485 # register our commit range
486 for comm in commit_cache.values():
487 c.commit_ranges.append(comm)
488
489 c.missing_requirements = missing_requirements
490 c.ancestor_commit = ancestor_commit
482 c.statuses = source_repo.statuses(
491 c.statuses = source_repo.statuses(
483 [x.raw_id for x in c.commit_ranges])
492 [x.raw_id for x in c.commit_ranges])
484
493
@@ -500,12 +509,23 b' class RepoPullRequestsView(RepoAppView, '

             c.missing_commits = True
         else:
+            c.inline_comments = display_inline_comments

-            c.diffset = self._get_diffset(
-                c.source_repo.repo_name, commits_source_repo,
-                source_ref_id, target_ref_id,
-                target_commit, source_commit,
-                diff_limit, c.fulldiff, file_limit, display_inline_comments)
+            has_proper_diff_cache = cached_diff and cached_diff.get('commits')
+            if not force_recache and has_proper_diff_cache:
+                c.diffset = cached_diff['diff']
+                (ancestor_commit, commit_cache, missing_requirements,
+                 source_commit, target_commit) = cached_diff['commits']
+            else:
+                c.diffset = self._get_diffset(
+                    c.source_repo.repo_name, commits_source_repo,
+                    source_ref_id, target_ref_id,
+                    target_commit, source_commit,
+                    diff_limit, file_limit, c.fulldiff)
+
+                # save cached diff
+                if caching_enabled:
+                    cache_diff(cache_file_path, c.diffset, diff_commit_cache)

         c.limited_diff = c.diffset.limited_diff

@@ -568,7 +588,6 b' class RepoPullRequestsView(RepoAppView, '
         if self._rhodecode_user.user_id in allowed_reviewers:
             for co in general_comments:
                 if co.author.user_id == self._rhodecode_user.user_id:
-                    # each comment has a status change
                     status = co.status_change
                     if status:
                         _ver_pr = status[0].comment.pull_request_version_id
@@ -576,6 +595,43 b' class RepoPullRequestsView(RepoAppView, '

         return self._get_template_context(c)

+    def get_commits(
+            self, commits_source_repo, pull_request_at_ver, source_commit,
+            source_ref_id, source_scm, target_commit, target_ref_id, target_scm):
+        commit_cache = collections.OrderedDict()
+        missing_requirements = False
+        try:
+            pre_load = ["author", "branch", "date", "message"]
+            show_revs = pull_request_at_ver.revisions
+            for rev in show_revs:
+                comm = commits_source_repo.get_commit(
+                    commit_id=rev, pre_load=pre_load)
+                commit_cache[comm.raw_id] = comm
+
+            # Order here matters, we first need to get target, and then
+            # the source
+            target_commit = commits_source_repo.get_commit(
+                commit_id=safe_str(target_ref_id))
+
+            source_commit = commits_source_repo.get_commit(
+                commit_id=safe_str(source_ref_id))
+        except CommitDoesNotExistError:
+            log.warning(
+                'Failed to get commit from `{}` repo'.format(
+                    commits_source_repo), exc_info=True)
+        except RepositoryRequirementError:
+            log.warning(
+                'Failed to get all required data from repo', exc_info=True)
+            missing_requirements = True
+        ancestor_commit = None
+        try:
+            ancestor_id = source_scm.get_common_ancestor(
+                source_commit.raw_id, target_commit.raw_id, target_scm)
+            ancestor_commit = source_scm.get_commit(ancestor_id)
+        except Exception:
+            ancestor_commit = None
+        return ancestor_commit, commit_cache, missing_requirements, source_commit, target_commit
+
     def assure_not_empty_repo(self):
         _ = self.request.translate

@@ -138,15 +138,19 b' class RepoSettingsView(RepoAppView):'
             repo_description=schema_data['repo_description'],
             repo_private=schema_data['repo_private'],
             clone_uri=schema_data['repo_clone_uri'],
+            push_uri=schema_data['repo_push_uri'],
             repo_landing_rev=schema_data['repo_landing_commit_ref'],
             repo_enable_statistics=schema_data['repo_enable_statistics'],
             repo_enable_locking=schema_data['repo_enable_locking'],
             repo_enable_downloads=schema_data['repo_enable_downloads'],
         )
-        # detect if CLONE URI changed, if we get OLD means we keep old values
+        # detect if SYNC URI changed, if we get OLD means we keep old values
         if schema_data['repo_clone_uri_change'] == 'OLD':
             validated_updates['clone_uri'] = self.db_repo.clone_uri

+        if schema_data['repo_push_uri_change'] == 'OLD':
+            validated_updates['push_uri'] = self.db_repo.push_uri
+
         # use the new full name for redirect
         new_repo_name = schema_data['repo_group']['repo_name_with_group']

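The `'OLD'` marker sent by the settings form means "keep whatever is stored", so clone and push URIs holding credentials are not overwritten with masked values. A small illustration of that sentinel handling, with field names taken from the hunk but the logic simplified:

.. code-block:: python

    # Illustrative only: an 'OLD' marker means "keep the stored value" for
    # secret-bearing fields such as clone_uri / push_uri. The dict layout is
    # an assumption for the sketch, not the actual form schema.
    def apply_uri_updates(schema_data, stored):
        updates = {
            'clone_uri': schema_data['repo_clone_uri'],
            'push_uri': schema_data['repo_push_uri'],
        }
        # when the form sends the 'OLD' sentinel, fall back to the DB value
        if schema_data.get('repo_clone_uri_change') == 'OLD':
            updates['clone_uri'] = stored['clone_uri']
        if schema_data.get('repo_push_uri_change') == 'OLD':
            updates['push_uri'] = stored['push_uri']
        return updates


    print(apply_uri_updates(
        {'repo_clone_uri': 'https://user:secret@example.com/repo',
         'repo_push_uri': 'https://example.com/mirror',
         'repo_clone_uri_change': 'OLD',
         'repo_push_uri_change': 'NEW'},
        {'clone_uri': 'https://stored.example.com/repo',
         'push_uri': 'https://stored.example.com/mirror'}))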
@@ -43,8 +43,6 b' class RepoSettingsView(RepoAppView):'

     def load_default_context(self):
         c = self._get_local_tmpl_context()
-
-
         return c

     @LoginRequired()
@@ -231,3 +229,19 b' class RepoSettingsView(RepoAppView):'

         raise HTTPFound(
             h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
+
+    @LoginRequired()
+    @HasRepoPermissionAnyDecorator('repository.admin')
+    @view_config(
+        route_name='edit_repo_advanced_hooks', request_method='GET',
+        renderer='rhodecode:templates/admin/repos/repo_edit.mako')
+    def edit_advanced_install_hooks(self):
+        """
+        Install Hooks for repository
+        """
+        _ = self.request.translate
+        self.load_default_context()
+        self.rhodecode_vcs_repo.install_hooks(force=True)
+        h.flash(_('installed hooks repository'), category='success')
+        raise HTTPFound(
+            h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
@@ -35,8 +35,6 b' log = logging.getLogger(__name__)'
 class RepoSettingsRemoteView(RepoAppView):
     def load_default_context(self):
         c = self._get_local_tmpl_context()
-
-
         return c

     @LoginRequired()
@@ -117,6 +117,7 b' class SubversionTunnelWrapper(object):'
                 message=self._svn_string(message)))
         self.remove_configs()
         self.process.kill()
+        return 1

     def interrupt(self, signum, frame):
         self.fail("Exited by timeout")
@@ -171,7 +172,7 b' class SubversionTunnelWrapper(object):'

         first_response = self.get_first_client_response()
         if not first_response:
-            self.fail("Repository name cannot be extracted")
+            return self.fail("Repository name cannot be extracted")

         url_parts = urlparse.urlparse(first_response['url'])
         self.server.repo_name = url_parts.path.strip('/')
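Both Subversion tunnel changes above are about propagating a failure exit code: `fail()` now returns `1`, and callers can simply `return self.fail(...)`. A stripped-down sketch of that pattern (class and method bodies are stand-ins, not the real wrapper):

.. code-block:: python

    # Sketch of why fail() returning an exit code matters: run() can return
    # self.fail(...) directly and the wrapper process exits non-zero.
    import sys


    class TunnelWrapperSketch(object):
        def fail(self, message):
            sys.stderr.write('ERROR: %s\n' % message)
            return 1

        def run(self):
            first_response = None  # pretend the client sent nothing
            if not first_response:
                return self.fail("Repository name cannot be extracted")
            return 0


    if __name__ == '__main__':
        sys.exit(TunnelWrapperSketch().run())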
@@ -68,7 +68,7 b' def main(ini_path, mode, user, user_id, '
             'of this script.')
     connection_info = os.environ.get('SSH_CONNECTION', '')

-    with bootstrap(ini_path) as env:
+    with bootstrap(ini_path, env={'RC_CMD_SSH_WRAPPER': '1'}) as env:
         try:
             ssh_wrapper = SshWrapper(
                 command, connection_info, mode,
@@ -103,7 +103,7 b' class TestSubversionServer(object):'
                 return_value=0):
             with mock.patch.object(
                     SubversionTunnelWrapper, 'command',
-                    return_value='date'):
+                    return_value=['date']):

                 exit_code = server.run()
                 # SVN has this differently configured, and we get in our mock env
@@ -115,7 +115,7 b' class TestSubversionServer(object):'
         from rhodecode.apps.ssh_support.lib.backends.svn import SubversionTunnelWrapper
         with mock.patch.object(
                 SubversionTunnelWrapper, 'command',
-                return_value='date'):
+                return_value=['date']):
             with mock.patch.object(
                     SubversionTunnelWrapper, 'get_first_client_response',
                     return_value=None):
@@ -64,6 +64,9 b' RequestHeader edit Destination ^https: h'
   SVNParentPath "${parent_path_root|n}"
   SVNListParentPath ${"On" if svn_list_parent_path else "Off"|n}

+  # use specific SVN conf/authz file for each repository
+  #AuthzSVNReposRelativeAccessFile authz
+
   Allow from all
   Order allow,deny
 </Location>
@@ -82,6 +85,9 b' RequestHeader edit Destination ^https: h'
   SVNParentPath "${parent_path|n}"
   SVNListParentPath ${"On" if svn_list_parent_path else "Off"|n}

+  # use specific SVN conf/authz file for each repository
+  #AuthzSVNReposRelativeAccessFile authz
+
   Allow from all
   Order allow,deny
 </Location>
@@ -18,6 +18,7 b''
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/

+import os
 import logging
 import importlib

@@ -71,9 +72,12 b' def _discover_legacy_plugins(config, pre'
     setting in database which are using the specified prefix. Normally 'py:' is
     used for the legacy plugins.
     """
-    auth_plugins = SettingsModel().get_setting_by_name('auth_plugins')
-    enabled_plugins = auth_plugins.app_settings_value
-    legacy_plugins = [id_ for id_ in enabled_plugins if id_.startswith(prefix)]
+    try:
+        auth_plugins = SettingsModel().get_setting_by_name('auth_plugins')
+        enabled_plugins = auth_plugins.app_settings_value
+        legacy_plugins = [id_ for id_ in enabled_plugins if id_.startswith(prefix)]
+    except Exception:
+        legacy_plugins = []

     for plugin_id in legacy_plugins:
         log.debug('Legacy plugin discovered: "%s"', plugin_id)
@@ -117,6 +121,11 b' def includeme(config):'
         route_name='auth_home',
         context=AuthnRootResource)

+    for key in ['RC_CMD_SETUP_RC', 'RC_CMD_UPGRADE_DB', 'RC_CMD_SSH_WRAPPER']:
+        if os.environ.get(key):
+            # skip this heavy step below on certain CLI commands
+            return
+
     # Auto discover authentication plugins and include their configuration.
     _discover_plugins(config)
     _discover_legacy_plugins(config)
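Together with the `RC_CMD_SSH_WRAPPER` variable set by the SSH wrapper bootstrap above, this guard lets lightweight CLI invocations skip the expensive plugin discovery. A minimal sketch of the same environment-variable check in isolation:

.. code-block:: python

    # Sketch of the CLI guard pattern: RC_CMD_* environment variables mark
    # lightweight command invocations, so heavy discovery is skipped.
    import os


    def includeme_sketch(discovery_steps):
        for key in ['RC_CMD_SETUP_RC', 'RC_CMD_UPGRADE_DB', 'RC_CMD_SSH_WRAPPER']:
            if os.environ.get(key):
                # running under a CLI command, skip the heavy steps entirely
                return []
        return [step() for step in discovery_steps]


    os.environ['RC_CMD_SSH_WRAPPER'] = '1'
    print(includeme_sketch([lambda: 'discover_plugins']))  # -> []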
@@ -171,11 +171,6 b' class RhodeCodeAuthPluginBase(object):'
             db_type = '{}.encrypted'.format(db_type)
         return db_type

-    @LazyProperty
-    def plugin_settings(self):
-        settings = SettingsModel().get_all_settings()
-        return settings
-
     def is_enabled(self):
         """
         Returns true if this plugin is enabled. An enabled plugin can be
@@ -185,12 +180,13 b' class RhodeCodeAuthPluginBase(object):'
         auth_plugins = SettingsModel().get_auth_plugins()
         return self.get_id() in auth_plugins

-    def is_active(self):
+    def is_active(self, plugin_cached_settings=None):
         """
         Returns true if the plugin is activated. An activated plugin is
         consulted during authentication, assumed it is also enabled.
         """
-        return self.get_setting_by_name('enabled')
+        return self.get_setting_by_name(
+            'enabled', plugin_cached_settings=plugin_cached_settings)

     def get_id(self):
         """
@@ -210,13 +206,24 b' class RhodeCodeAuthPluginBase(object):'
         """
         return AuthnPluginSettingsSchemaBase()

-    def get_setting_by_name(self, name, default=None, cache=True):
+    def get_settings(self):
+        """
+        Returns the plugin settings as dictionary.
+        """
+        settings = {}
+        raw_settings = SettingsModel().get_all_settings()
+        for node in self.get_settings_schema():
+            settings[node.name] = self.get_setting_by_name(
+                node.name, plugin_cached_settings=raw_settings)
+        return settings
+
+    def get_setting_by_name(self, name, default=None, plugin_cached_settings=None):
         """
         Returns a plugin setting by name.
         """
         full_name = 'rhodecode_{}'.format(self._get_setting_full_name(name))
-        if cache:
-            plugin_settings = self.plugin_settings
+        if plugin_cached_settings:
+            plugin_settings = plugin_cached_settings
         else:
             plugin_settings = SettingsModel().get_all_settings()

@@ -235,15 +242,6 b' class RhodeCodeAuthPluginBase(object):'
                 full_name, value, type_)
             return db_setting.app_settings_value

-    def get_settings(self):
-        """
-        Returns the plugin settings as dictionary.
-        """
-        settings = {}
-        for node in self.get_settings_schema():
-            settings[node.name] = self.get_setting_by_name(node.name)
-        return settings
-
     def log_safe_settings(self, settings):
         """
         returns a log safe representation of settings, without any secrets
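The refactor above replaces the per-plugin `plugin_settings` cache with an explicit `plugin_cached_settings` argument: callers fetch all settings once and hand the dict to every lookup. A simplified sketch of that hand-off, with a plain dict standing in for `SettingsModel().get_all_settings()`:

.. code-block:: python

    # Sketch of the cached-settings hand-off: fetch once, pass the dict into
    # every lookup instead of re-querying per setting. Setting names are
    # illustrative.
    RAW_SETTINGS = {
        'rhodecode_auth_ldap_enabled': True,
        'rhodecode_auth_ldap_timeout': 300,
    }


    def get_setting_by_name(name, prefix='auth_ldap', plugin_cached_settings=None):
        settings = plugin_cached_settings
        if settings is None:
            # stand-in for SettingsModel().get_all_settings()
            settings = dict(RAW_SETTINGS)
        return settings.get('rhodecode_{}_{}'.format(prefix, name))


    # one fetch, many lookups
    cached = dict(RAW_SETTINGS)
    print(get_setting_by_name('enabled', plugin_cached_settings=cached))
    print(get_setting_by_name('timeout', plugin_cached_settings=cached))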
@@ -625,12 +623,13 b' def authenticate(username, password, env'
                       'headers plugin, skipping...', plugin.get_id())
             continue

+        log.debug('Trying authentication using ** %s **', plugin.get_id())
+
         # load plugin settings from RhodeCode database
         plugin_settings = plugin.get_settings()
         plugin_sanitized_settings = plugin.log_safe_settings(plugin_settings)
-        log.debug('Plugin settings:%s', plugin_sanitized_settings)
+        log.debug('Plugin `%s` settings:%s', plugin.get_id(), plugin_sanitized_settings)

-        log.debug('Trying authentication using ** %s **', plugin.get_id())
         # use plugin's method of user extraction.
         user = plugin.get_user(username, environ=environ,
                                settings=plugin_settings)
@@ -684,7 +683,8 b' def authenticate(username, password, env'
                 environ=environ or {})

         if plugin_cache_active:
-            log.debug('Trying to fetch cached auth by `...%s`', _password_hash[:6])
+            log.debug('Trying to fetch cached auth by pwd hash `...%s`',
+                      _password_hash[:6])
             plugin_user = cache_manager.get(
                 _password_hash, createfunc=auth_func)
         else:
@@ -281,5 +281,5 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
             if group in user_attrs["groups"]:
                 user_attrs["admin"] = True
         log.debug("Final crowd user object: \n%s" % (formatted_json(user_attrs)))
-        log.info('user %s authenticated correctly' % user_attrs['username'])
+        log.info('user `%s` authenticated correctly' % user_attrs['username'])
         return user_attrs
@@ -163,5 +163,5 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
             'extern_type': extern_type,
         }

-        log.info('user %s authenticated correctly' % user_attrs['username'])
+        log.info('user `%s` authenticated correctly' % user_attrs['username'])
         return user_attrs
@@ -22,10 +22,12 b''
 RhodeCode authentication plugin for LDAP
 """

-
+import socket
+import re
 import colander
 import logging
 import traceback
+import string

 from rhodecode.translation import _
 from rhodecode.authentication.base import (
@@ -50,6 +52,9 b' except ImportError:'
     ldap = Missing


+class LdapError(Exception):
+    pass
+
 def plugin_factory(plugin_id, *args, **kwds):
     """
     Factory function that is called during plugin discovery.
@@ -86,6 +91,16 b' class LdapSettingsSchema(AuthnPluginSett'
         title=_('Port'),
         validator=colander.Range(min=0, max=65536),
         widget='int')
+
+    timeout = colander.SchemaNode(
+        colander.Int(),
+        default=60 * 5,
+        description=_('Timeout for LDAP connection'),
+        preparer=strip_whitespace,
+        title=_('Connection timeout'),
+        validator=colander.Range(min=1),
+        widget='int')
+
     dn_user = colander.SchemaNode(
         colander.String(),
         default='',
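The new `timeout` node behaves like any other colander integer node: string input is coerced to `int` and values below the minimum are rejected. A standalone sketch, assuming the `colander` package is installed:

.. code-block:: python

    # Standalone sketch of how the new `timeout` node validates input.
    import colander

    timeout = colander.SchemaNode(
        colander.Int(),
        default=60 * 5,
        validator=colander.Range(min=1),
        name='timeout')

    print(timeout.deserialize('300'))   # -> 300
    try:
        timeout.deserialize('0')        # below the minimum of 1
    except colander.Invalid as exc:
        print(exc.asdict())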
@@ -187,19 +202,47 b' class LdapSettingsSchema(AuthnPluginSett'
 class AuthLdap(object):

     def _build_servers(self):
+        def host_resolver(host, port):
+            """
+            Main work for this function is to prevent ldap connection issues,
+            and detect them early using a "greenified" sockets
+            """
+            host = host.strip()
+
+            log.info('Resolving LDAP host %s', host)
+            try:
+                ip = socket.gethostbyname(host)
+                log.info('Got LDAP server %s ip %s', host, ip)
+            except Exception:
+                raise LdapConnectionError(
+                    'Failed to resolve host: `{}`'.format(host))
+
+            log.info('Checking LDAP IP access %s', ip)
+            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+            try:
+                s.connect((ip, int(port)))
+                s.shutdown(socket.SHUT_RD)
+            except Exception:
+                raise LdapConnectionError(
+                    'Failed to connect to host: `{}:{}`'.format(host, port))
+
+            return '{}:{}'.format(host, port)
+
+        port = self.LDAP_SERVER_PORT
         return ', '.join(
-            ["{}://{}:{}".format(
-                self.ldap_server_type, host.strip(), self.LDAP_SERVER_PORT)
+            ["{}://{}".format(
+                self.ldap_server_type, host_resolver(host, port))
             for host in self.SERVER_ADDRESSES])

     def __init__(self, server, base_dn, port=389, bind_dn='', bind_pass='',
                  tls_kind='PLAIN', tls_reqcert='DEMAND', ldap_version=3,
                  search_scope='SUBTREE', attr_login='uid',
-                 ldap_filter=''):
+                 ldap_filter='', timeout=None):
         if ldap == Missing:
             raise LdapImportError("Missing or incompatible ldap library")

         self.debug = False
+        self.timeout = timeout or 60 * 5
         self.ldap_version = ldap_version
         self.ldap_server_type = 'ldap'

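`host_resolver()` fails fast by resolving the LDAP host name and opening a short-lived TCP connection before the LDAP library ever sees the server. The same resolve-then-probe check in isolation, using only the standard `socket` module:

.. code-block:: python

    # Standalone sketch of the resolve-then-probe check: resolve the name
    # first, then open a short-lived TCP connection so a dead host fails
    # fast instead of hanging inside the LDAP library.
    import socket


    def probe_host(host, port, timeout=5):
        host = host.strip()
        try:
            ip = socket.gethostbyname(host)
        except socket.error:
            raise RuntimeError('Failed to resolve host: `{}`'.format(host))

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(timeout)
        try:
            s.connect((ip, int(port)))
            s.shutdown(socket.SHUT_RD)
        except socket.error:
            raise RuntimeError(
                'Failed to connect to host: `{}:{}`'.format(host, port))
        finally:
            s.close()
        return '{}:{}'.format(host, port)


    if __name__ == '__main__':
        try:
            print(probe_host('localhost', 80))
        except RuntimeError as exc:
            print(exc)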
@@ -226,34 +269,40 b' class AuthLdap(object):'
         self.BASE_DN = safe_str(base_dn)
         self.LDAP_FILTER = safe_str(ldap_filter)

-    def _get_ldap_server(self):
+    def _get_ldap_conn(self):
+        log.debug('initializing LDAP connection to:%s', self.LDAP_SERVER)
+
         if self.debug:
             ldap.set_option(ldap.OPT_DEBUG_LEVEL, 255)
+
         if hasattr(ldap, 'OPT_X_TLS_CACERTDIR'):
-            ldap.set_option(ldap.OPT_X_TLS_CACERTDIR,
-                            '/etc/openldap/cacerts')
+            ldap.set_option(ldap.OPT_X_TLS_CACERTDIR, '/etc/openldap/cacerts')
+        if self.TLS_KIND != 'PLAIN':
+            ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, self.TLS_REQCERT)
+
         ldap.set_option(ldap.OPT_REFERRALS, ldap.OPT_OFF)
         ldap.set_option(ldap.OPT_RESTART, ldap.OPT_ON)
-        ldap.set_option(ldap.OPT_NETWORK_TIMEOUT, 60 * 10)
-        ldap.set_option(ldap.OPT_TIMEOUT, 60 * 10)

-        if self.TLS_KIND != 'PLAIN':
-            ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, self.TLS_REQCERT)
-        server = ldap.initialize(self.LDAP_SERVER)
+        # init connection now
+        ldap_conn = ldap.initialize(self.LDAP_SERVER)
+        ldap_conn.set_option(ldap.OPT_NETWORK_TIMEOUT, self.timeout)
+        ldap_conn.set_option(ldap.OPT_TIMEOUT, self.timeout)
+        ldap_conn.timeout = self.timeout
+
         if self.ldap_version == 2:
-            server.protocol = ldap.VERSION2
+            ldap_conn.protocol = ldap.VERSION2
         else:
-            server.protocol = ldap.VERSION3
+            ldap_conn.protocol = ldap.VERSION3

         if self.TLS_KIND == 'START_TLS':
-            server.start_tls_s()
+            ldap_conn.start_tls_s()

         if self.LDAP_BIND_DN and self.LDAP_BIND_PASS:
             log.debug('Trying simple_bind with password and given login DN: %s',
                       self.LDAP_BIND_DN)
-            server.simple_bind_s(self.LDAP_BIND_DN, self.LDAP_BIND_PASS)
+            ldap_conn.simple_bind_s(self.LDAP_BIND_DN, self.LDAP_BIND_PASS)

-        return server
+        return ldap_conn

     def get_uid(self, username):
         uid = username
@@ -295,13 +344,14 b' class AuthLdap(object):'
         if "," in username:
             raise LdapUsernameError(
                 "invalid character `,` in username: `{}`".format(username))
+        ldap_conn = None
         try:
-            server = self._get_ldap_server()
+            ldap_conn = self._get_ldap_conn()
             filter_ = '(&%s(%s=%s))' % (
                 self.LDAP_FILTER, self.attr_login, username)
             log.debug("Authenticating %r filter %s at %s", self.BASE_DN,
                       filter_, self.LDAP_SERVER)
-            lobjects = server.search_ext_s(
+            lobjects = ldap_conn.search_ext_s(
                 self.BASE_DN, self.SEARCH_SCOPE, filter_)

             if not lobjects:
@@ -315,7 +365,7 b' class AuthLdap(object):'
                     continue

                 user_attrs = self.fetch_attrs_from_simple_bind(
-                    server, dn, username, password)
+                    ldap_conn, dn, username, password)
                 if user_attrs:
                     break

@@ -333,6 +383,15 b' class AuthLdap(object):'
             raise LdapConnectionError(
                 "LDAP can't access authentication "
                 "server, org_exc:%s" % org_exc)
+        finally:
+            if ldap_conn:
+                log.debug('ldap: connection release')
+                try:
+                    ldap_conn.unbind_s()
+                except Exception:
+                    # for any reason this can raise exception we must catch it
+                    # to not crush the server
+                    pass

         return dn, user_attrs

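The `finally` block guarantees the LDAP connection is released even when the search or bind fails. A condensed sketch of the connect/search/unbind life cycle, assuming the `python-ldap` package; the URI, credentials and filter are placeholders:

.. code-block:: python

    # Sketch of the connect/search/unbind life cycle, assuming python-ldap.
    import ldap


    def search_user(uri, bind_dn, bind_pass, base_dn, filter_, timeout=300):
        ldap_conn = None
        try:
            ldap_conn = ldap.initialize(uri)
            ldap_conn.set_option(ldap.OPT_NETWORK_TIMEOUT, timeout)
            ldap_conn.set_option(ldap.OPT_TIMEOUT, timeout)
            ldap_conn.simple_bind_s(bind_dn, bind_pass)
            return ldap_conn.search_s(base_dn, ldap.SCOPE_SUBTREE, filter_)
        finally:
            if ldap_conn:
                try:
                    # always release the connection, a failed unbind must not
                    # mask the original error
                    ldap_conn.unbind_s()
                except Exception:
                    pass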
@@ -429,6 +488,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
             'attr_login': settings.get('attr_login'),
             'ldap_version': 3,
             'ldap_filter': settings.get('filter'),
+            'timeout': settings.get('timeout')
         }

         ldap_attrs = self.try_dynamic_binding(username, password, ldap_args)
@@ -469,7 +529,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
             'extern_type': extern_type,
         }
         log.debug('ldap user: %s', user_attrs)
-        log.info('user %s authenticated correctly', user_attrs['username'])
+        log.info('user `%s` authenticated correctly', user_attrs['username'])

         return user_attrs

@@ -479,3 +539,4 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
         except (Exception,):
             log.exception("Other exception")
             return None
+
@@ -157,5 +157,5 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
             pass

         log.debug("pamuser: %s", user_attrs)
-        log.info('user %s authenticated correctly' % user_attrs['username'])
+        log.info('user `%s` authenticated correctly' % user_attrs['username'])
         return user_attrs
@@ -127,14 +127,14 b' class RhodeCodeAuthPlugin(RhodeCodeAuthP'

         if userobj.username == User.DEFAULT_USER and userobj.active:
             log.info(
-                'user %s authenticated correctly as anonymous user', userobj)
+                'user `%s` authenticated correctly as anonymous user', userobj.username)
             return user_attrs

         elif userobj.username == username and password_match:
-            log.info('user %s authenticated correctly', userobj)
+            log.info('user `%s` authenticated correctly', userobj.username)
             return user_attrs
-        log.info("user %s had a bad password when "
-                 "authenticating on this plugin", userobj)
+        log.warn("user `%s` used a wrong password when "
+                 "authenticating on this plugin", userobj.username)
         return None
     else:
         log.warning(
@@ -137,7 +137,7 b' class RhodeCodeAuthPlugin(RhodeCodeAuthP'
                 'user `%s` successfully authenticated via %s',
                 user_attrs['username'], self.name)
             return user_attrs
-        log.error(
+        log.warn(
             'user `%s` failed to authenticate via %s, reason: bad or '
             'inactive token.', username, self.name)
     else:
@@ -70,9 +70,11 b' class AuthenticationPluginRegistry(objec'
         # Add all enabled and active plugins to the list. We iterate over the
         # auth_plugins setting from DB because it also represents the ordering.
         enabled_plugins = SettingsModel().get_auth_plugins()
+        raw_settings = SettingsModel().get_all_settings()
         for plugin_id in enabled_plugins:
             plugin = self.get_plugin(plugin_id)
-            if plugin is not None and plugin.is_active():
+            if plugin is not None and plugin.is_active(
+                    plugin_cached_settings=raw_settings):
                 plugins.append(plugin)

         # Add the fallback plugin from ini file.
@@ -63,7 +63,7 b' class AuthnPluginViewBase(BaseAppView):'
         for node in schema:
             if node.name not in defaults:
                 defaults[node.name] = self.plugin.get_setting_by_name(
-                    node.name, node.default, cache=False)
+                    node.name, node.default)

         template_context = {
             'defaults': defaults,
@@ -145,7 +145,7 b' class AuthSettingsView(BaseAppView):'

         # Create form default values and fill the form.
         form_defaults = {
-            'auth_plugins': ','.join(enabled_plugins)
+            'auth_plugins': ',\n'.join(enabled_plugins)
         }
         form_defaults.update(defaults)
         html = formencode.htmlfill.render(
@@ -226,6 +226,7 b' def includeme(config):'
     config.include('rhodecode.apps.user_group')
     config.include('rhodecode.apps.search')
     config.include('rhodecode.apps.user_profile')
+    config.include('rhodecode.apps.user_group_profile')
     config.include('rhodecode.apps.my_account')
     config.include('rhodecode.apps.svn_support')
     config.include('rhodecode.apps.ssh_support')
@@ -37,6 +37,7 b' class PullRequestEvent(RepoEvent):'
         self.pullrequest = pullrequest

     def as_dict(self):
+        from rhodecode.lib.utils2 import md5_safe
         from rhodecode.model.pull_request import PullRequestModel
         data = super(PullRequestEvent, self).as_dict()

@@ -46,6 +47,9 b' class PullRequestEvent(RepoEvent):'
             repos=[self.pullrequest.source_repo]
         )
         issues = _issues_as_dict(commits)
+        # calculate hashes of all commits for unique identifier of commits
+        # inside that pull request
+        commits_hash = md5_safe(':'.join(x.get('raw_id', '') for x in commits))

         data.update({
             'pullrequest': {
@@ -56,7 +60,10 b' class PullRequestEvent(RepoEvent):'
                     self.pullrequest, request=self.request),
                 'permalink_url': PullRequestModel().get_url(
                     self.pullrequest, request=self.request, permalink=True),
+                'shadow_url': PullRequestModel().get_shadow_clone_url(
+                    self.pullrequest, request=self.request),
                 'status': self.pullrequest.calculated_review_status(),
+                'commits_uid': commits_hash,
                 'commits': commits,
             }
         })
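`commits_uid` gives webhook consumers a stable fingerprint of the commit set inside the pull request; it changes whenever the PR is updated with different commits. The same idea with `hashlib` standing in for RhodeCode's `md5_safe`:

.. code-block:: python

    # Derive a stable uid from the commits of a pull request: hash the
    # joined raw ids, so the uid changes whenever the commit set changes.
    import hashlib

    commits = [{'raw_id': 'a' * 40}, {'raw_id': 'b' * 40}]
    commits_uid = hashlib.md5(
        ':'.join(x.get('raw_id', '') for x in commits).encode('utf8')).hexdigest()
    print(commits_uid)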
@@ -132,7 +132,8 b' def _commits_as_dict(event, commit_ids, '

     missing_commits = set(commit_ids) - set(c['raw_id'] for c in commits)
     if missing_commits:
-        log.error('missing commits: %s' % ', '.join(missing_commits))
+        log.error('Inconsistent repository state. '
+                  'Missing commits: %s' % ', '.join(missing_commits))

     return commits

@@ -45,6 +45,8 b' integration_type_registry.register_integ'
     base.EEIntegration('Jira Issues integration', 'jira'))
 integration_type_registry.register_integration_type(
     base.EEIntegration('Redmine Tracker integration', 'redmine'))
+integration_type_registry.register_integration_type(
+    base.EEIntegration('Jenkins CI integration', 'jenkins'))


 def integrations_event_handler(event):
@@ -19,74 +19,86 b''
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import colander
21 import colander
22 import string
23 import collections
24 import logging
25
26 from mako import exceptions
27
22 from rhodecode.translation import _
28 from rhodecode.translation import _
23
29
24
30
31 log = logging.getLogger(__name__)
32
33
25 class IntegrationTypeBase(object):
34 class IntegrationTypeBase(object):
26 """ Base class for IntegrationType plugins """
35 """ Base class for IntegrationType plugins """
27 is_dummy = False
36 is_dummy = False
28 description = ''
37 description = ''
29 icon = '''
38
30 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
39 @classmethod
31 <svg
40 def icon(cls):
32 xmlns:dc="http://purl.org/dc/elements/1.1/"
41 return '''
33 xmlns:cc="http://creativecommons.org/ns#"
42 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
34 xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
43 <svg
35 xmlns:svg="http://www.w3.org/2000/svg"
44 xmlns:dc="http://purl.org/dc/elements/1.1/"
36 xmlns="http://www.w3.org/2000/svg"
45 xmlns:cc="http://creativecommons.org/ns#"
37 xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
46 xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
38 xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
47 xmlns:svg="http://www.w3.org/2000/svg"
39 viewBox="0 -256 1792 1792"
48 xmlns="http://www.w3.org/2000/svg"
40 id="svg3025"
49 xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
41 version="1.1"
50 xmlns:inkscape="http://setwww.inkscape.org/namespaces/inkscape"
42 inkscape:version="0.48.3.1 r9886"
51 viewBox="0 -256 1792 1792"
43 width="100%"
52 id="svg3025"
44 height="100%"
53 version="1.1"
45 sodipodi:docname="cog_font_awesome.svg">
54 inkscape:version="0.48.3.1 r9886"
46 <metadata
55 width="100%"
47 id="metadata3035">
56 height="100%"
48 <rdf:RDF>
57 sodipodi:docname="cog_font_awesome.svg">
49 <cc:Work
58 <metadata
50 rdf:about="">
59 id="metadata3035">
51 <dc:format>image/svg+xml</dc:format>
60 <rdf:RDF>
52 <dc:type
61 <cc:Work
53 rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
62 rdf:about="">
54 </cc:Work>
63 <dc:format>image/svg+xml</dc:format>
55 </rdf:RDF>
64 <dc:type
56 </metadata>
65 rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
57 <defs
66 </cc:Work>
58 id="defs3033" />
67 </rdf:RDF>
59 <sodipodi:namedview
68 </metadata>
60 pagecolor="#ffffff"
69 <defs
61 bordercolor="#666666"
70 id="defs3033" />
62 borderopacity="1"
71 <sodipodi:namedview
63 objecttolerance="10"
72 pagecolor="#ffffff"
64 gridtolerance="10"
73 bordercolor="#666666"
65 guidetolerance="10"
74 borderopacity="1"
66 inkscape:pageopacity="0"
75 objecttolerance="10"
67 inkscape:pageshadow="2"
76 gridtolerance="10"
68 inkscape:window-width="640"
77 guidetolerance="10"
69 inkscape:window-height="480"
78 inkscape:pageopacity="0"
70 id="namedview3031"
79 inkscape:pageshadow="2"
71 showgrid="false"
80 inkscape:window-width="640"
72 inkscape:zoom="0.13169643"
81 inkscape:window-height="480"
73 inkscape:cx="896"
82 id="namedview3031"
74 inkscape:cy="896"
83 showgrid="false"
75 inkscape:window-x="0"
84 inkscape:zoom="0.13169643"
76 inkscape:window-y="25"
85 inkscape:cx="896"
77 inkscape:window-maximized="0"
86 inkscape:cy="896"
78 inkscape:current-layer="svg3025" />
87 inkscape:window-x="0"
79 <g
88 inkscape:window-y="25"
80 transform="matrix(1,0,0,-1,121.49153,1285.4237)"
89 inkscape:window-maximized="0"
81 id="g3027">
90 inkscape:current-layer="svg3025" />
82 <path
91 <g
83 d="m 1024,640 q 0,106 -75,181 -75,75 -181,75 -106,0 -181,-75 -75,-75 -75,-181 0,-106 75,-181 75,-75 181,-75 106,0 181,75 75,75 75,181 z m 512,109 V 527 q 0,-12 -8,-23 -8,-11 -20,-13 l -185,-28 q -19,-54 -39,-91 35,-50 107,-138 10,-12 10,-25 0,-13 -9,-23 -27,-37 -99,-108 -72,-71 -94,-71 -12,0 -26,9 l -138,108 q -44,-23 -91,-38 -16,-136 -29,-186 -7,-28 -36,-28 H 657 q -14,0 -24.5,8.5 Q 622,-111 621,-98 L 593,86 q -49,16 -90,37 L 362,16 Q 352,7 337,7 323,7 312,18 186,132 147,186 q -7,10 -7,23 0,12 8,23 15,21 51,66.5 36,45.5 54,70.5 -27,50 -41,99 L 29,495 Q 16,497 8,507.5 0,518 0,531 v 222 q 0,12 8,23 8,11 19,13 l 186,28 q 14,46 39,92 -40,57 -107,138 -10,12 -10,24 0,10 9,23 26,36 98.5,107.5 72.5,71.5 94.5,71.5 13,0 26,-10 l 138,-107 q 44,23 91,38 16,136 29,186 7,28 36,28 h 222 q 14,0 24.5,-8.5 Q 914,1391 915,1378 l 28,-184 q 49,-16 90,-37 l 142,107 q 9,9 24,9 13,0 25,-10 129,-119 165,-170 7,-8 7,-22 0,-12 -8,-23 -15,-21 -51,-66.5 -36,-45.5 -54,-70.5 26,-50 41,-98 l 183,-28 q 13,-2 21,-12.5 8,-10.5 8,-23.5 z"
92 transform="matrix(1,0,0,-1,121.49153,1285.4237)"
84 id="path3029"
93 id="g3027">
85 inkscape:connector-curvature="0"
94 <path
86 style="fill:currentColor" />
95 d="m 1024,640 q 0,106 -75,181 -75,75 -181,75 -106,0 -181,-75 -75,-75 -75,-181 0,-106 75,-181 75,-75 181,-75 106,0 181,75 75,75 75,181 z m 512,109 V 527 q 0,-12 -8,-23 -8,-11 -20,-13 l -185,-28 q -19,-54 -39,-91 35,-50 107,-138 10,-12 10,-25 0,-13 -9,-23 -27,-37 -99,-108 -72,-71 -94,-71 -12,0 -26,9 l -138,108 q -44,-23 -91,-38 -16,-136 -29,-186 -7,-28 -36,-28 H 657 q -14,0 -24.5,8.5 Q 622,-111 621,-98 L 593,86 q -49,16 -90,37 L 362,16 Q 352,7 337,7 323,7 312,18 186,132 147,186 q -7,10 -7,23 0,12 8,23 15,21 51,66.5 36,45.5 54,70.5 -27,50 -41,99 L 29,495 Q 16,497 8,507.5 0,518 0,531 v 222 q 0,12 8,23 8,11 19,13 l 186,28 q 14,46 39,92 -40,57 -107,138 -10,12 -10,24 0,10 9,23 26,36 98.5,107.5 72.5,71.5 94.5,71.5 13,0 26,-10 l 138,-107 q 44,23 91,38 16,136 29,186 7,28 36,28 h 222 q 14,0 24.5,-8.5 Q 914,1391 915,1378 l 28,-184 q 49,-16 90,-37 l 142,107 q 9,9 24,9 13,0 25,-10 129,-119 165,-170 7,-8 7,-22 0,-12 -8,-23 -15,-21 -51,-66.5 -36,-45.5 -54,-70.5 26,-50 41,-98 l 183,-28 q 13,-2 21,-12.5 8,-10.5 8,-23.5 z"
87 </g>
96 id="path3029"
88 </svg>
97 inkscape:connector-curvature="0"
89 '''
98 style="fill:currentColor" />
99 </g>
100 </svg>
101 '''
90
102
91 def __init__(self, settings):
103 def __init__(self, settings):
92 """
104 """
@@ -108,3 +120,192 b' class EEIntegration(IntegrationTypeBase)'
108 def __init__(self, name, key, settings=None):
120 def __init__(self, name, key, settings=None):
109 self.display_name = name
121 self.display_name = name
110 self.key = key
122 self.key = key
123 super(EEIntegration, self).__init__(settings)
124
125
126 # Helpers #
127 WEBHOOK_URL_VARS = [
128 ('event_name', 'Unique name of the event type, e.g pullrequest-update'),
129 ('repo_name', 'Full name of the repository'),
130 ('repo_type', 'VCS type of repository'),
131 ('repo_id', 'Unique id of repository'),
132 ('repo_url', 'Repository url'),
133 # extra repo fields
134 ('extra:<extra_key_name>', 'Extra repo variables, read from its settings.'),
135
136 # special attrs below that we handle, using multi-call
137 ('branch', 'Name of each brach submitted, if any.'),
138 ('commit_id', 'Id of each commit submitted, if any.'),
139
140 # pr events vars
141 ('pull_request_id', 'Unique ID of the pull request.'),
142 ('pull_request_title', 'Title of the pull request.'),
143 ('pull_request_url', 'Pull request url.'),
144 ('pull_request_shadow_url', 'Pull request shadow repo clone url.'),
145 ('pull_request_commits_uid', 'Calculated UID of all commits inside the PR. '
146 'Changes after PR update'),
147
148 # user who triggers the call
149 ('username', 'User who triggered the call.'),
150 ('user_id', 'User id who triggered the call.'),
151 ]
152
153 # common vars for url template used for CI plugins. Shared with webhook
154 CI_URL_VARS = WEBHOOK_URL_VARS
155
156
157 class CommitParsingDataHandler(object):
158
159 def aggregate_branch_data(self, branches, commits):
160 branch_data = collections.OrderedDict()
161 for obj in branches:
162 branch_data[obj['name']] = obj
163
164 branches_commits = collections.OrderedDict()
165 for commit in commits:
166 if commit.get('git_ref_change'):
167 # special case for GIT that allows creating tags,
168 # deleting branches without associated commit
169 continue
170 commit_branch = commit['branch']
171
172 if commit_branch not in branches_commits:
173 _branch = branch_data[commit_branch] \
174 if commit_branch else commit_branch
175 branch_commits = {'branch': _branch,
176 'commits': []}
177 branches_commits[commit_branch] = branch_commits
178
179 branch_commits = branches_commits[commit_branch]
180 branch_commits['commits'].append(commit)
181 return branches_commits
182
183
184 class WebhookDataHandler(CommitParsingDataHandler):
185 name = 'webhook'
186
187 def __init__(self, template_url, headers):
188 self.template_url = template_url
189 self.headers = headers
190
191 def get_base_parsed_template(self, data):
192 """
193 initially parses the passed in template with some common variables
194 available on ALL calls
195 """
196 # note: make sure to update the `WEBHOOK_URL_VARS` if this changes
197 common_vars = {
198 'repo_name': data['repo']['repo_name'],
199 'repo_type': data['repo']['repo_type'],
200 'repo_id': data['repo']['repo_id'],
201 'repo_url': data['repo']['url'],
202 'username': data['actor']['username'],
203 'user_id': data['actor']['user_id'],
204 'event_name': data['name']
205 }
206
207 extra_vars = {}
208 for extra_key, extra_val in data['repo']['extra_fields'].items():
209 extra_vars['extra__{}'.format(extra_key)] = extra_val
210 common_vars.update(extra_vars)
211
212 template_url = self.template_url.replace('${extra:', '${extra__')
213 return string.Template(template_url).safe_substitute(**common_vars)
214
215 def repo_push_event_handler(self, event, data):
216 url = self.get_base_parsed_template(data)
217 url_cals = []
218
219 branches_commits = self.aggregate_branch_data(
220 data['push']['branches'], data['push']['commits'])
221 if '${branch}' in url:
222 # call it multiple times, for each branch if used in variables
223 for branch, commit_ids in branches_commits.items():
224 branch_url = string.Template(url).safe_substitute(branch=branch)
225 # call further down for each commit if used
226 if '${commit_id}' in branch_url:
227 for commit_data in commit_ids['commits']:
228 commit_id = commit_data['raw_id']
229 commit_url = string.Template(branch_url).safe_substitute(
230 commit_id=commit_id)
231 # register per-commit call
232 log.debug(
233 'register %s call(%s) to url %s',
234 self.name, event, commit_url)
235 url_cals.append(
236 (commit_url, self.headers, data))
237
238 else:
239 # register per-branch call
240 log.debug(
241 'register %s call(%s) to url %s',
242 self.name, event, branch_url)
243 url_cals.append(
244 (branch_url, self.headers, data))
245
246 else:
247 log.debug(
248 'register %s call(%s) to url %s', self.name, event, url)
249 url_cals.append((url, self.headers, data))
250
251 return url_cals
252
253 def repo_create_event_handler(self, event, data):
254 url = self.get_base_parsed_template(data)
255 log.debug(
256 'register %s call(%s) to url %s', self.name, event, url)
257 return [(url, self.headers, data)]
258
259 def pull_request_event_handler(self, event, data):
260 url = self.get_base_parsed_template(data)
261 log.debug(
262 'register %s call(%s) to url %s', self.name, event, url)
263 url = string.Template(url).safe_substitute(
264 pull_request_id=data['pullrequest']['pull_request_id'],
265 pull_request_title=data['pullrequest']['title'],
266 pull_request_url=data['pullrequest']['url'],
267 pull_request_shadow_url=data['pullrequest']['shadow_url'],
268 pull_request_commits_uid=data['pullrequest']['commits_uid'],
269 )
270 return [(url, self.headers, data)]
271
272 def __call__(self, event, data):
273 from rhodecode import events
274
275 if isinstance(event, events.RepoPushEvent):
276 return self.repo_push_event_handler(event, data)
277 elif isinstance(event, events.RepoCreateEvent):
278 return self.repo_create_event_handler(event, data)
279 elif isinstance(event, events.PullRequestEvent):
280 return self.pull_request_event_handler(event, data)
281 else:
282 raise ValueError(
283 'event type `%s` not in supported list: %s' % (
284 event.__class__, events))
285
286
287 def get_auth(settings):
288 from requests.auth import HTTPBasicAuth
289 username = settings.get('username')
290 password = settings.get('password')
291 if username and password:
292 return HTTPBasicAuth(username, password)
293 return None
294
295
296 def get_web_token(settings):
297 return settings['secret_token']
298
299
300 def get_url_vars(url_vars):
301 return '\n'.join(
302 '{} - {}'.format('${' + key + '}', explanation)
303 for key, explanation in url_vars)
304
305
306 def render_with_traceback(template, *args, **kwargs):
307 try:
308 return template.render(*args, **kwargs)
309 except Exception:
310 log.error(exceptions.text_error_template().render())
311 raise
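`WebhookDataHandler.get_base_parsed_template()` relies on `string.Template.safe_substitute()`, which leaves unknown `${...}` variables untouched, so a URL can be filled in several passes: common repository variables first, then per-branch and per-commit values. A small standalone illustration (the URL is an example only):

.. code-block:: python

    # ${var} substitution as used by WebhookDataHandler: unknown variables
    # survive safe_substitute(), so the URL can be completed in later passes.
    import string

    template_url = 'https://ci.example.com/job/${repo_name}/build?sha=${commit_id}'

    url = string.Template(template_url).safe_substitute(repo_name='my/repo')
    print(url)  # ${commit_id} is still present

    url = string.Template(url).safe_substitute(commit_id='deadbeef')
    print(url)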
@@ -29,7 +29,8 b' from rhodecode import events'
 from rhodecode.translation import _
 from rhodecode.lib.celerylib import run_task
 from rhodecode.lib.celerylib import tasks
-from rhodecode.integrations.types.base import IntegrationTypeBase
+from rhodecode.integrations.types.base import (
+    IntegrationTypeBase, render_with_traceback)


 log = logging.getLogger(__name__)
@@ -127,11 +128,15 b" repo_push_template_html = Template('''"
     </td></tr>
     <tr>
     <td style="padding:15px;" valign="top">
-        % for commit in data['push']['commits']:
-        <a href="${commit['url']}">${commit['short_id']}</a> by ${commit['author']} at ${commit['date']} <br/>
-        ${commit['message_html']} <br/>
-        <br/>
-        % endfor
+        % if data['push']['commits']:
+        % for commit in data['push']['commits']:
+        <a href="${commit['url']}">${commit['short_id']}</a> by ${commit['author']} at ${commit['date']} <br/>
+        ${commit['message_html']} <br/>
+        <br/>
+        % endfor
+        % else:
+        No commit data
+        % endif
     </td>
     </tr>
     </table>
@@ -146,7 +151,34 b" repo_push_template_html = Template('''"
 </html>
 ''')

-email_icon = '''
+
+class EmailSettingsSchema(colander.Schema):
+    @colander.instantiate(validator=colander.Length(min=1))
+    class recipients(colander.SequenceSchema):
+        title = _('Recipients')
+        description = _('Email addresses to send push events to')
+        widget = deform.widget.SequenceWidget(min_len=1)
+
+        recipient = colander.SchemaNode(
+            colander.String(),
+            title=_('Email address'),
+            description=_('Email address'),
+            default='',
+            validator=colander.Email(),
+            widget=deform.widget.TextInputWidget(
+                placeholder='user@domain.com',
+            ),
+        )
+
+
+class EmailIntegrationType(IntegrationTypeBase):
+    key = 'email'
+    display_name = _('Email')
+    description = _('Send repo push summaries to a list of recipients via email')
+
+    @classmethod
+    def icon(cls):
+        return '''
 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
 <svg
    xmlns:dc="http://purl.org/dc/elements/1.1/"
@@ -208,32 +240,6 b" email_icon = '''"
 </svg>
 '''

-
-class EmailSettingsSchema(colander.Schema):
-    @colander.instantiate(validator=colander.Length(min=1))
-    class recipients(colander.SequenceSchema):
-        title = _('Recipients')
-        description = _('Email addresses to send push events to')
-        widget = deform.widget.SequenceWidget(min_len=1)
-
-        recipient = colander.SchemaNode(
-            colander.String(),
-            title=_('Email address'),
-            description=_('Email address'),
-            default='',
-            validator=colander.Email(),
-            widget=deform.widget.TextInputWidget(
-                placeholder='user@domain.com',
-            ),
-        )
-
-
-class EmailIntegrationType(IntegrationTypeBase):
-    key = 'email'
-    display_name = _('Email')
-    description = _('Send repo push summaries to a list of recipients via email')
-    icon = email_icon
-
     def settings_schema(self):
         schema = EmailSettingsSchema()
         return schema
@@ -276,12 +282,14 b' def repo_push_handler(data, settings):'
             branches=', '.join(
                 branch['name'] for branch in data['push']['branches']))

-    email_body_plaintext = repo_push_template_plaintext.render(
+    email_body_plaintext = render_with_traceback(
+        repo_push_template_plaintext,
         data=data,
         subject=subject,
         instance_url=server_url)

-    email_body_html = repo_push_template_html.render(
+    email_body_html = render_with_traceback(
+        repo_push_template_html,
         data=data,
         subject=subject,
         instance_url=server_url)
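`render_with_traceback()` (added in `integrations/types/base.py` above) wraps mako rendering so a template error is logged with mako's readable traceback before being re-raised. A self-contained version with a trivial usage example:

.. code-block:: python

    # Log the readable mako traceback on template errors, then re-raise.
    import logging

    from mako import exceptions
    from mako.template import Template

    log = logging.getLogger(__name__)


    def render_with_traceback(template, *args, **kwargs):
        try:
            return template.render(*args, **kwargs)
        except Exception:
            log.error(exceptions.text_error_template().render())
            raise


    print(render_with_traceback(Template('Hello ${name}'), name='world'))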
@@ -24,14 +24,14 b' import logging'
24 import requests
24 import requests
25 import colander
25 import colander
26 import textwrap
26 import textwrap
27 from collections import OrderedDict
28 from mako.template import Template
27 from mako.template import Template
29 from rhodecode import events
28 from rhodecode import events
30 from rhodecode.translation import _
29 from rhodecode.translation import _
31 from rhodecode.lib import helpers as h
30 from rhodecode.lib import helpers as h
32 from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask
31 from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask
33 from rhodecode.lib.colander_utils import strip_whitespace
32 from rhodecode.lib.colander_utils import strip_whitespace
34 from rhodecode.integrations.types.base import IntegrationTypeBase
33 from rhodecode.integrations.types.base import (
34 IntegrationTypeBase, CommitParsingDataHandler, render_with_traceback)
35
35
36 log = logging.getLogger(__name__)
36 log = logging.getLogger(__name__)
37
37
@@ -81,23 +81,31 b" repo_push_template = Template('''"
81 <ul>
81 <ul>
82 %for branch, branch_commits in branches_commits.items():
82 %for branch, branch_commits in branches_commits.items():
83 <li>
83 <li>
84 % if branch:
84 <a href="${branch_commits['branch']['url']}">branch: ${branch_commits['branch']['name']}</a>
85 <a href="${branch_commits['branch']['url']}">branch: ${branch_commits['branch']['name']}</a>
86 % else:
87 to trunk
88 % endif
85 <ul>
89 <ul>
86 %for commit in branch_commits['commits']:
90 % for commit in branch_commits['commits']:
87 <li><a href="${commit['url']}">${commit['short_id']}</a> - ${commit['message_html']}</li>
91 <li><a href="${commit['url']}">${commit['short_id']}</a> - ${commit['message_html']}</li>
88 %endfor
92 % endfor
89 </ul>
93 </ul>
90 </li>
94 </li>
91 %endfor
95 %endfor
92 ''')
96 ''')
93
97
94
98
95 class HipchatIntegrationType(IntegrationTypeBase):
99 class HipchatIntegrationType(IntegrationTypeBase, CommitParsingDataHandler):
96 key = 'hipchat'
100 key = 'hipchat'
97 display_name = _('Hipchat')
101 display_name = _('Hipchat')
98 description = _('Send events such as repo pushes and pull requests to '
102 description = _('Send events such as repo pushes and pull requests to '
99 'your hipchat channel.')
103 'your hipchat channel.')
100 icon = '''<?xml version="1.0" encoding="utf-8"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 1000 1000" enable-background="new 0 0 1000 1000" xml:space="preserve"><g><g transform="translate(0.000000,511.000000) scale(0.100000,-0.100000)"><path fill="#205281" d="M4197.1,4662.4c-1661.5-260.4-3018-1171.6-3682.6-2473.3C219.9,1613.6,100,1120.3,100,462.6c0-1014,376.8-1918.4,1127-2699.4C2326.7-3377.6,3878.5-3898.3,5701-3730.5l486.5,44.5l208.9-123.3c637.2-373.4,1551.8-640.6,2240.4-650.9c304.9-6.9,335.7,0,417.9,75.4c185,174.7,147.3,411.1-89.1,548.1c-315.2,181.6-620,544.7-733.1,870.1l-51.4,157.6l472.7,472.7c349.4,349.4,520.7,551.5,657.7,774.2c784.5,1281.2,784.5,2788.5,0,4052.6c-236.4,376.8-794.8,966-1178.4,1236.7c-572.1,407.7-1264.1,709.1-1993.7,870.1c-267.2,58.2-479.6,75.4-1038,82.2C4714.4,4686.4,4310.2,4679.6,4197.1,4662.4z M5947.6,3740.9c1856.7-380.3,3127.6-1709.4,3127.6-3275c0-1000.3-534.4-1949.2-1466.2-2600.1c-188.4-133.6-287.8-226.1-301.5-284.4c-41.1-157.6,263.8-938.6,397.4-1020.8c20.5-10.3,34.3-44.5,34.3-75.4c0-167.8-811.9,195.3-1363.4,609.8l-181.6,137l-332.3-58.2c-445.3-78.8-1281.2-78.8-1702.6,0C2796-2569.2,1734.1-1832.6,1220.2-801.5C983.8-318.5,905,51.5,929,613.3c27.4,640.6,243.2,1192.1,685.1,1740.3c620,770.8,1661.5,1305.2,2822.8,1452.5C4806.9,3854,5553.7,3819.7,5947.6,3740.9z"/><path fill="#205281" d="M2381.5-345.9c-75.4-106.2-68.5-167.8,34.3-322c332.3-500.2,1010.6-928.4,1760.8-1120.2c417.9-106.2,1226.4-106.2,1644.3,0c712.5,181.6,1270.9,517.3,1685.4,1014C7681-561.7,7715.3-424.7,7616-325.4c-89.1,89.1-167.9,65.1-431.7-133.6c-835.8-630.3-2028-856.4-3086.5-585.8C3683.3-938.6,3142-685,2830.3-448.7C2576.8-253.4,2463.7-229.4,2381.5-345.9z"/></g></g><!-- Svg Vector Icons : http://www.onlinewebfonts.com/icon --></svg>'''
104
105 @classmethod
106 def icon(cls):
107 return '''<?xml version="1.0" encoding="utf-8"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 1000 1000" enable-background="new 0 0 1000 1000" xml:space="preserve"><g><g transform="translate(0.000000,511.000000) scale(0.100000,-0.100000)"><path fill="#205281" d="M4197.1,4662.4c-1661.5-260.4-3018-1171.6-3682.6-2473.3C219.9,1613.6,100,1120.3,100,462.6c0-1014,376.8-1918.4,1127-2699.4C2326.7-3377.6,3878.5-3898.3,5701-3730.5l486.5,44.5l208.9-123.3c637.2-373.4,1551.8-640.6,2240.4-650.9c304.9-6.9,335.7,0,417.9,75.4c185,174.7,147.3,411.1-89.1,548.1c-315.2,181.6-620,544.7-733.1,870.1l-51.4,157.6l472.7,472.7c349.4,349.4,520.7,551.5,657.7,774.2c784.5,1281.2,784.5,2788.5,0,4052.6c-236.4,376.8-794.8,966-1178.4,1236.7c-572.1,407.7-1264.1,709.1-1993.7,870.1c-267.2,58.2-479.6,75.4-1038,82.2C4714.4,4686.4,4310.2,4679.6,4197.1,4662.4z M5947.6,3740.9c1856.7-380.3,3127.6-1709.4,3127.6-3275c0-1000.3-534.4-1949.2-1466.2-2600.1c-188.4-133.6-287.8-226.1-301.5-284.4c-41.1-157.6,263.8-938.6,397.4-1020.8c20.5-10.3,34.3-44.5,34.3-75.4c0-167.8-811.9,195.3-1363.4,609.8l-181.6,137l-332.3-58.2c-445.3-78.8-1281.2-78.8-1702.6,0C2796-2569.2,1734.1-1832.6,1220.2-801.5C983.8-318.5,905,51.5,929,613.3c27.4,640.6,243.2,1192.1,685.1,1740.3c620,770.8,1661.5,1305.2,2822.8,1452.5C4806.9,3854,5553.7,3819.7,5947.6,3740.9z"/><path fill="#205281" d="M2381.5-345.9c-75.4-106.2-68.5-167.8,34.3-322c332.3-500.2,1010.6-928.4,1760.8-1120.2c417.9-106.2,1226.4-106.2,1644.3,0c712.5,181.6,1270.9,517.3,1685.4,1014C7681-561.7,7715.3-424.7,7616-325.4c-89.1,89.1-167.9,65.1-431.7-133.6c-835.8-630.3-2028-856.4-3086.5-585.8C3683.3-938.6,3142-685,2830.3-448.7C2576.8-253.4,2463.7-229.4,2381.5-345.9z"/></g></g><!-- Svg Vector Icons : http://www.onlinewebfonts.com/icon --></svg>'''
108
101 valid_events = [
109 valid_events = [
102 events.PullRequestCloseEvent,
110 events.PullRequestCloseEvent,
103 events.PullRequestMergeEvent,
111 events.PullRequestMergeEvent,
@@ -213,20 +221,11 b' class HipchatIntegrationType(Integration'
213 )
221 )
214
222
215 def format_repo_push_event(self, data):
223 def format_repo_push_event(self, data):
216 branch_data = {branch['name']: branch
224 branches_commits = self.aggregate_branch_data(
217 for branch in data['push']['branches']}
225 data['push']['branches'], data['push']['commits'])
218
226
219 branches_commits = OrderedDict()
227 result = render_with_traceback(
220 for commit in data['push']['commits']:
228 repo_push_template,
221 if commit['branch'] not in branches_commits:
222 branch_commits = {'branch': branch_data[commit['branch']],
223 'commits': []}
224 branches_commits[commit['branch']] = branch_commits
225
226 branch_commits = branches_commits[commit['branch']]
227 branch_commits['commits'].append(commit)
228
229 result = repo_push_template.render(
230 data=data,
229 data=data,
231 branches_commits=branches_commits,
230 branches_commits=branches_commits,
232 )
231 )
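
Note: the hunk above swaps HipChat's inline commit-grouping loop for a shared `CommitParsingDataHandler.aggregate_branch_data` helper. The helper's implementation is not part of this changeset; a minimal sketch, reconstructed from the removed inline code and assuming branch-less SVN pushes map to a `None` branch, could look like this:

.. code-block:: python

    from collections import OrderedDict

    def aggregate_branch_data(branches, commits):
        # index branch metadata by name, as the removed inline code did
        branch_data = {branch['name']: branch for branch in branches}

        branches_commits = OrderedDict()
        for commit in commits:
            branch_name = commit.get('branch')
            if branch_name not in branches_commits:
                branches_commits[branch_name] = {
                    'branch': branch_data.get(branch_name),  # None for SVN trunk pushes
                    'commits': []}
            branches_commits[branch_name]['commits'].append(commit)
        return branches_commits
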
@@ -28,14 +28,14 b' import deform'
28 import requests
28 import requests
29 import colander
29 import colander
30 from mako.template import Template
30 from mako.template import Template
31 from collections import OrderedDict
32
31
33 from rhodecode import events
32 from rhodecode import events
34 from rhodecode.translation import _
33 from rhodecode.translation import _
35 from rhodecode.lib import helpers as h
34 from rhodecode.lib import helpers as h
36 from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask
35 from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask
37 from rhodecode.lib.colander_utils import strip_whitespace
36 from rhodecode.lib.colander_utils import strip_whitespace
38 from rhodecode.integrations.types.base import IntegrationTypeBase
37 from rhodecode.integrations.types.base import (
38 IntegrationTypeBase, CommitParsingDataHandler, render_with_traceback)
39
39
40 log = logging.getLogger(__name__)
40 log = logging.getLogger(__name__)
41
41
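
Note: `render_with_traceback` is imported from `rhodecode.integrations.types.base` but not defined in this changeset. A plausible sketch, assuming it only wraps `Template.render` so that Mako's formatted traceback is logged before re-raising:

.. code-block:: python

    import logging

    from mako import exceptions

    log = logging.getLogger(__name__)

    def render_with_traceback(template, *args, **kwargs):
        try:
            return template.render(*args, **kwargs)
        except Exception:
            # log the mako-rendered traceback, which points at the failing template line
            log.error(exceptions.text_error_template().render())
            raise
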
@@ -87,12 +87,16 b' class SlackSettingsSchema(colander.Schem'
87 )
87 )
88
88
89
89
90 class SlackIntegrationType(IntegrationTypeBase):
90 class SlackIntegrationType(IntegrationTypeBase, CommitParsingDataHandler):
91 key = 'slack'
91 key = 'slack'
92 display_name = _('Slack')
92 display_name = _('Slack')
93 description = _('Send events such as repo pushes and pull requests to '
93 description = _('Send events such as repo pushes and pull requests to '
94 'your slack channel.')
94 'your slack channel.')
95 icon = '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 256" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M165.963541,15.8384262 C162.07318,3.86308197 149.212328,-2.69009836 137.239082,1.20236066 C125.263738,5.09272131 118.710557,17.9535738 122.603016,29.9268197 L181.550164,211.292328 C185.597902,222.478689 197.682361,228.765377 209.282098,225.426885 C221.381246,221.943607 228.756984,209.093246 224.896,197.21023 C224.749115,196.756984 165.963541,15.8384262 165.963541,15.8384262" fill="#DFA22F"></path><path d="M74.6260984,45.515541 C70.7336393,33.5422951 57.8727869,26.9891148 45.899541,30.8794754 C33.9241967,34.7698361 27.3710164,47.6306885 31.2634754,59.6060328 L90.210623,240.971541 C94.2583607,252.157902 106.34282,258.44459 117.942557,255.104 C130.041705,251.62282 137.417443,238.772459 133.556459,226.887344 C133.409574,226.436197 74.6260984,45.515541 74.6260984,45.515541" fill="#3CB187"></path><path d="M240.161574,166.045377 C252.136918,162.155016 258.688,149.294164 254.797639,137.31882 C250.907279,125.345574 238.046426,118.792393 226.07318,122.682754 L44.7076721,181.632 C33.5213115,185.677639 27.234623,197.762098 30.5731148,209.361836 C34.0563934,221.460984 46.9067541,228.836721 58.7897705,224.975738 C59.2430164,224.828852 240.161574,166.045377 240.161574,166.045377" fill="#CE1E5B"></path><path d="M82.507541,217.270557 C94.312918,213.434754 109.528131,208.491016 125.855475,203.186361 C122.019672,191.380984 117.075934,176.163672 111.76918,159.83423 L68.4191475,173.924721 L82.507541,217.270557" fill="#392538"></path><path d="M173.847082,187.591344 C190.235279,182.267803 205.467279,177.31777 217.195016,173.507148 C213.359213,161.70177 208.413377,146.480262 203.106623,130.146623 L159.75659,144.237115 L173.847082,187.591344" fill="#BB242A"></path><path d="M210.484459,74.7058361 C222.457705,70.8154754 229.010885,57.954623 225.120525,45.9792787 C221.230164,34.0060328 208.369311,27.4528525 196.393967,31.3432131 L15.028459,90.292459 C3.84209836,94.3380984 -2.44459016,106.422557 0.896,118.022295 C4.37718033,130.121443 17.227541,137.49718 29.1126557,133.636197 C29.5638033,133.489311 210.484459,74.7058361 210.484459,74.7058361" fill="#72C5CD"></path><path d="M52.8220328,125.933115 C64.6274098,122.097311 79.8468197,117.151475 96.1762623,111.84682 C90.8527213,95.4565246 85.9026885,80.2245246 82.0920656,68.4946885 L38.731541,82.5872787 L52.8220328,125.933115" fill="#248C73"></path><path d="M144.159475,96.256 C160.551869,90.9303607 175.785967,85.9803279 187.515803,82.1676066 C182.190164,65.7752131 177.240131,50.5390164 173.42741,38.807082 L130.068984,52.8996721 L144.159475,96.256" fill="#62803A"></path></g></svg>'''
95
96 @classmethod
97 def icon(cls):
98 return '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 256" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M165.963541,15.8384262 C162.07318,3.86308197 149.212328,-2.69009836 137.239082,1.20236066 C125.263738,5.09272131 118.710557,17.9535738 122.603016,29.9268197 L181.550164,211.292328 C185.597902,222.478689 197.682361,228.765377 209.282098,225.426885 C221.381246,221.943607 228.756984,209.093246 224.896,197.21023 C224.749115,196.756984 165.963541,15.8384262 165.963541,15.8384262" fill="#DFA22F"></path><path d="M74.6260984,45.515541 C70.7336393,33.5422951 57.8727869,26.9891148 45.899541,30.8794754 C33.9241967,34.7698361 27.3710164,47.6306885 31.2634754,59.6060328 L90.210623,240.971541 C94.2583607,252.157902 106.34282,258.44459 117.942557,255.104 C130.041705,251.62282 137.417443,238.772459 133.556459,226.887344 C133.409574,226.436197 74.6260984,45.515541 74.6260984,45.515541" fill="#3CB187"></path><path d="M240.161574,166.045377 C252.136918,162.155016 258.688,149.294164 254.797639,137.31882 C250.907279,125.345574 238.046426,118.792393 226.07318,122.682754 L44.7076721,181.632 C33.5213115,185.677639 27.234623,197.762098 30.5731148,209.361836 C34.0563934,221.460984 46.9067541,228.836721 58.7897705,224.975738 C59.2430164,224.828852 240.161574,166.045377 240.161574,166.045377" fill="#CE1E5B"></path><path d="M82.507541,217.270557 C94.312918,213.434754 109.528131,208.491016 125.855475,203.186361 C122.019672,191.380984 117.075934,176.163672 111.76918,159.83423 L68.4191475,173.924721 L82.507541,217.270557" fill="#392538"></path><path d="M173.847082,187.591344 C190.235279,182.267803 205.467279,177.31777 217.195016,173.507148 C213.359213,161.70177 208.413377,146.480262 203.106623,130.146623 L159.75659,144.237115 L173.847082,187.591344" fill="#BB242A"></path><path d="M210.484459,74.7058361 C222.457705,70.8154754 229.010885,57.954623 225.120525,45.9792787 C221.230164,34.0060328 208.369311,27.4528525 196.393967,31.3432131 L15.028459,90.292459 C3.84209836,94.3380984 -2.44459016,106.422557 0.896,118.022295 C4.37718033,130.121443 17.227541,137.49718 29.1126557,133.636197 C29.5638033,133.489311 210.484459,74.7058361 210.484459,74.7058361" fill="#72C5CD"></path><path d="M52.8220328,125.933115 C64.6274098,122.097311 79.8468197,117.151475 96.1762623,111.84682 C90.8527213,95.4565246 85.9026885,80.2245246 82.0920656,68.4946885 L38.731541,82.5872787 L52.8220328,125.933115" fill="#248C73"></path><path d="M144.159475,96.256 C160.551869,90.9303607 175.785967,85.9803279 187.515803,82.1676066 C182.190164,65.7752131 177.240131,50.5390164 173.42741,38.807082 L130.068984,52.8996721 L144.159475,96.256" fill="#62803A"></path></g></svg>'''
99
96 valid_events = [
100 valid_events = [
97 events.PullRequestCloseEvent,
101 events.PullRequestCloseEvent,
98 events.PullRequestMergeEvent,
102 events.PullRequestMergeEvent,
@@ -190,32 +194,39 b' class SlackIntegrationType(IntegrationTy'
190 }
194 }
191 ]
195 ]
192
196
193 title = Template(textwrap.dedent(r'''
197 template = Template(textwrap.dedent(r'''
194 *${data['actor']['username']}* left ${data['comment']['type']} on pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']}>:
198 *${data['actor']['username']}* left ${data['comment']['type']} on pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']}>:
195 ''')).render(data=data, comment=event.comment)
199 '''))
200 title = render_with_traceback(
201 template, data=data, comment=event.comment)
196
202
197 text = Template(textwrap.dedent(r'''
203 template = Template(textwrap.dedent(r'''
198 *pull request title*: ${pr_title}
204 *pull request title*: ${pr_title}
199 % if status_text:
205 % if status_text:
200 *submitted status*: `${status_text}`
206 *submitted status*: `${status_text}`
201 % endif
207 % endif
202 >>> ${comment_text}
208 >>> ${comment_text}
203 ''')).render(comment_text=comment_text,
209 '''))
204 pr_title=data['pullrequest']['title'],
210 text = render_with_traceback(
205 status_text=status_text)
211 template,
212 comment_text=comment_text,
213 pr_title=data['pullrequest']['title'],
214 status_text=status_text)
206
215
207 return title, text, fields, overrides
216 return title, text, fields, overrides
208
217
209 def format_pull_request_review_event(self, event, data):
218 def format_pull_request_review_event(self, event, data):
210 title = Template(textwrap.dedent(r'''
219 template = Template(textwrap.dedent(r'''
211 *${data['actor']['username']}* changed status of pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']} to `${data['pullrequest']['status']}`>:
220 *${data['actor']['username']}* changed status of pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']} to `${data['pullrequest']['status']}`>:
212 ''')).render(data=data)
221 '''))
222 title = render_with_traceback(template, data=data)
213
223
214 text = Template(textwrap.dedent(r'''
224 template = Template(textwrap.dedent(r'''
215 *pull request title*: ${pr_title}
225 *pull request title*: ${pr_title}
216 ''')).render(
226 '''))
217 pr_title=data['pullrequest']['title'],
227 text = render_with_traceback(
218 )
228 template,
229 pr_title=data['pullrequest']['title'])
219
230
220 return title, text
231 return title, text
221
232
@@ -227,50 +238,53 b' class SlackIntegrationType(IntegrationTy'
227 events.PullRequestCreateEvent: 'created',
238 events.PullRequestCreateEvent: 'created',
228 }.get(event.__class__, str(event.__class__))
239 }.get(event.__class__, str(event.__class__))
229
240
230 title = Template(textwrap.dedent(r'''
241 template = Template(textwrap.dedent(r'''
231 *${data['actor']['username']}* `${action}` pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']}>:
242 *${data['actor']['username']}* `${action}` pull request <${data['pullrequest']['url']}|#${data['pullrequest']['pull_request_id']}>:
232 ''')).render(data=data, action=action)
243 '''))
244 title = render_with_traceback(template, data=data, action=action)
233
245
234 text = Template(textwrap.dedent(r'''
246 template = Template(textwrap.dedent(r'''
235 *pull request title*: ${pr_title}
247 *pull request title*: ${pr_title}
236 %if data['pullrequest']['commits']:
248 %if data['pullrequest']['commits']:
237 *commits*: ${len(data['pullrequest']['commits'])}
249 *commits*: ${len(data['pullrequest']['commits'])}
238 %endif
250 %endif
239 ''')).render(
251 '''))
252 text = render_with_traceback(
253 template,
240 pr_title=data['pullrequest']['title'],
254 pr_title=data['pullrequest']['title'],
241 data=data
255 data=data)
242 )
243
256
244 return title, text
257 return title, text
245
258
246 def format_repo_push_event(self, data):
259 def format_repo_push_event(self, data):
247 branch_data = {branch['name']: branch
260
248 for branch in data['push']['branches']}
261 branches_commits = self.aggregate_branch_data(
262 data['push']['branches'], data['push']['commits'])
249
263
250 branches_commits = OrderedDict()
264 template = Template(r'''
251 for commit in data['push']['commits']:
252 if commit['branch'] not in branches_commits:
253 branch_commits = {'branch': branch_data[commit['branch']],
254 'commits': []}
255 branches_commits[commit['branch']] = branch_commits
256
257 branch_commits = branches_commits[commit['branch']]
258 branch_commits['commits'].append(commit)
259
260 title = Template(r'''
261 *${data['actor']['username']}* pushed to repo <${data['repo']['url']}|${data['repo']['repo_name']}>:
265 *${data['actor']['username']}* pushed to repo <${data['repo']['url']}|${data['repo']['repo_name']}>:
262 ''').render(data=data)
266 ''')
267 title = render_with_traceback(template, data=data)
263
268
264 repo_push_template = Template(textwrap.dedent(r'''
269 repo_push_template = Template(textwrap.dedent(r'''
265 %for branch, branch_commits in branches_commits.items():
270 <%
266 ${len(branch_commits['commits'])} ${'commit' if len(branch_commits['commits']) == 1 else 'commits'} on branch: <${branch_commits['branch']['url']}|${branch_commits['branch']['name']}>
271 def branch_text(branch):
267 %for commit in branch_commits['commits']:
272 if branch:
273 return 'on branch: <{}|{}>'.format(branch_commits['branch']['url'], branch_commits['branch']['name'])
274 else:
275 ## case for SVN no branch push...
276 return 'to trunk'
277 %> \
278 % for branch, branch_commits in branches_commits.items():
279 ${len(branch_commits['commits'])} ${'commit' if len(branch_commits['commits']) == 1 else 'commits'} ${branch_text(branch)}
280 % for commit in branch_commits['commits']:
268 `<${commit['url']}|${commit['short_id']}>` - ${commit['message_html']|html_to_slack_links}
281 `<${commit['url']}|${commit['short_id']}>` - ${commit['message_html']|html_to_slack_links}
269 %endfor
282 % endfor
270 %endfor
283 % endfor
271 '''))
284 '''))
272
285
273 text = repo_push_template.render(
286 text = render_with_traceback(
287 repo_push_template,
274 data=data,
288 data=data,
275 branches_commits=branches_commits,
289 branches_commits=branches_commits,
276 html_to_slack_links=html_to_slack_links,
290 html_to_slack_links=html_to_slack_links,
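
Note: `html_to_slack_links` is passed into `repo_push_template` above as a Mako filter but is defined elsewhere in this module and not shown here. A hedged sketch, assuming it only rewrites HTML anchors into Slack's `<url|text>` link syntax:

.. code-block:: python

    import re

    def html_to_slack_links(message):
        # turn `<a href="http://x">label</a>` into Slack's `<http://x|label>` form
        return re.sub(r'<a\s+href="(.+?)".*?>(.+?)</a>', r'<\1|\2>', message)

    # html_to_slack_links('see <a href="http://rc/c/abc">abc</a>') == 'see <http://rc/c/abc|abc>'
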
@@ -279,14 +293,16 b' class SlackIntegrationType(IntegrationTy'
279 return title, text
293 return title, text
280
294
281 def format_repo_create_event(self, data):
295 def format_repo_create_event(self, data):
282 title = Template(r'''
296 template = Template(r'''
283 *${data['actor']['username']}* created new repository ${data['repo']['repo_name']}:
297 *${data['actor']['username']}* created new repository ${data['repo']['repo_name']}:
284 ''').render(data=data)
298 ''')
299 title = render_with_traceback(template, data=data)
285
300
286 text = Template(textwrap.dedent(r'''
301 template = Template(textwrap.dedent(r'''
287 repo_url: ${data['repo']['url']}
302 repo_url: ${data['repo']['url']}
288 repo_type: ${data['repo']['repo_type']}
303 repo_type: ${data['repo']['repo_type']}
289 ''')).render(data=data)
304 '''))
305 text = render_with_traceback(template, data=data)
290
306
291 return title, text
307 return title, text
292
308
@@ -19,8 +19,6 b''
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 from __future__ import unicode_literals
21 from __future__ import unicode_literals
22 import string
23 from collections import OrderedDict
24
22
25 import deform
23 import deform
26 import deform.widget
24 import deform.widget
@@ -33,149 +31,18 b' from requests.packages.urllib3.util.retr'
33 import rhodecode
31 import rhodecode
34 from rhodecode import events
32 from rhodecode import events
35 from rhodecode.translation import _
33 from rhodecode.translation import _
36 from rhodecode.integrations.types.base import IntegrationTypeBase
34 from rhodecode.integrations.types.base import (
35 IntegrationTypeBase, get_auth, get_web_token, get_url_vars,
36 WebhookDataHandler, WEBHOOK_URL_VARS)
37 from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask
37 from rhodecode.lib.celerylib import run_task, async_task, RequestContextTask
38 from rhodecode.model.validation_schema import widgets
38
39
39 log = logging.getLogger(__name__)
40 log = logging.getLogger(__name__)
40
41
41
42
42 # updating this required to update the `common_vars` passed in url calling func
43 # updating this required to update the `common_vars` passed in url calling func
43 WEBHOOK_URL_VARS = [
44 'repo_name',
45 'repo_type',
46 'repo_id',
47 'repo_url',
48 # extra repo fields
49 'extra:<extra_key_name>',
50
44
51 # special attrs below that we handle, using multi-call
45 URL_VARS = get_url_vars(WEBHOOK_URL_VARS)
52 'branch',
53 'commit_id',
54
55 # pr events vars
56 'pull_request_id',
57 'pull_request_url',
58
59 # user who triggers the call
60 'username',
61 'user_id',
62
63 ]
64 URL_VARS = ', '.join('${' + x + '}' for x in WEBHOOK_URL_VARS)
65
66
67 def get_auth(settings):
68 from requests.auth import HTTPBasicAuth
69 username = settings.get('username')
70 password = settings.get('password')
71 if username and password:
72 return HTTPBasicAuth(username, password)
73 return None
74
75
76 class WebhookHandler(object):
77 def __init__(self, template_url, secret_token, headers):
78 self.template_url = template_url
79 self.secret_token = secret_token
80 self.headers = headers
81
82 def get_base_parsed_template(self, data):
83 """
84 initially parses the passed in template with some common variables
85 available on ALL calls
86 """
87 # note: make sure to update the `WEBHOOK_URL_VARS` if this changes
88 common_vars = {
89 'repo_name': data['repo']['repo_name'],
90 'repo_type': data['repo']['repo_type'],
91 'repo_id': data['repo']['repo_id'],
92 'repo_url': data['repo']['url'],
93 'username': data['actor']['username'],
94 'user_id': data['actor']['user_id']
95 }
96
97 extra_vars = {}
98 for extra_key, extra_val in data['repo']['extra_fields'].items():
99 extra_vars['extra__{}'.format(extra_key)] = extra_val
100 common_vars.update(extra_vars)
101
102 template_url = self.template_url.replace('${extra:', '${extra__')
103 return string.Template(template_url).safe_substitute(**common_vars)
104
105 def repo_push_event_handler(self, event, data):
106 url = self.get_base_parsed_template(data)
107 url_cals = []
108 branch_data = OrderedDict()
109 for obj in data['push']['branches']:
110 branch_data[obj['name']] = obj
111
112 branches_commits = OrderedDict()
113 for commit in data['push']['commits']:
114 if commit.get('git_ref_change'):
115 # special case for GIT that allows creating tags,
116 # deleting branches without associated commit
117 continue
118
119 if commit['branch'] not in branches_commits:
120 branch_commits = {'branch': branch_data[commit['branch']],
121 'commits': []}
122 branches_commits[commit['branch']] = branch_commits
123
124 branch_commits = branches_commits[commit['branch']]
125 branch_commits['commits'].append(commit)
126
127 if '${branch}' in url:
128 # call it multiple times, for each branch if used in variables
129 for branch, commit_ids in branches_commits.items():
130 branch_url = string.Template(url).safe_substitute(branch=branch)
131 # call further down for each commit if used
132 if '${commit_id}' in branch_url:
133 for commit_data in commit_ids['commits']:
134 commit_id = commit_data['raw_id']
135 commit_url = string.Template(branch_url).safe_substitute(
136 commit_id=commit_id)
137 # register per-commit call
138 log.debug(
139 'register webhook call(%s) to url %s', event, commit_url)
140 url_cals.append((commit_url, self.secret_token, self.headers, data))
141
142 else:
143 # register per-branch call
144 log.debug(
145 'register webhook call(%s) to url %s', event, branch_url)
146 url_cals.append((branch_url, self.secret_token, self.headers, data))
147
148 else:
149 log.debug(
150 'register webhook call(%s) to url %s', event, url)
151 url_cals.append((url, self.secret_token, self.headers, data))
152
153 return url_cals
154
155 def repo_create_event_handler(self, event, data):
156 url = self.get_base_parsed_template(data)
157 log.debug(
158 'register webhook call(%s) to url %s', event, url)
159 return [(url, self.secret_token, self.headers, data)]
160
161 def pull_request_event_handler(self, event, data):
162 url = self.get_base_parsed_template(data)
163 log.debug(
164 'register webhook call(%s) to url %s', event, url)
165 url = string.Template(url).safe_substitute(
166 pull_request_id=data['pullrequest']['pull_request_id'],
167 pull_request_url=data['pullrequest']['url'])
168 return [(url, self.secret_token, self.headers, data)]
169
170 def __call__(self, event, data):
171 if isinstance(event, events.RepoPushEvent):
172 return self.repo_push_event_handler(event, data)
173 elif isinstance(event, events.RepoCreateEvent):
174 return self.repo_create_event_handler(event, data)
175 elif isinstance(event, events.PullRequestEvent):
176 return self.pull_request_event_handler(event, data)
177 else:
178 raise ValueError('event type not supported: %s' % events)
179
46
180
47
181 class WebhookSettingsSchema(colander.Schema):
48 class WebhookSettingsSchema(colander.Schema):
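
Note: the removed `WebhookHandler.repo_push_event_handler` above (now provided by `WebhookDataHandler` from the base module) is what makes `${branch}` and `${commit_id}` fan out into multiple webhook calls. A condensed, illustrative restatement of that expansion, with a made-up CI URL:

.. code-block:: python

    import string

    def expand_push_urls(template_url, branches_commits):
        # one URL per branch, and per commit when ${commit_id} is used,
        # mirroring the removed repo_push_event_handler above
        urls = []
        for branch, branch_commits in branches_commits.items():
            branch_url = string.Template(template_url).safe_substitute(branch=branch)
            if '${commit_id}' in branch_url:
                for commit in branch_commits['commits']:
                    urls.append(string.Template(branch_url).safe_substitute(
                        commit_id=commit['raw_id']))
            else:
                urls.append(branch_url)
        return urls

    # expand_push_urls(
    #     'http://ci.example.com/build?B=${branch}&C=${commit_id}',
    #     {'default': {'commits': [{'raw_id': 'abc'}, {'raw_id': 'def'}]}})
    # -> ['http://ci.example.com/build?B=default&C=abc',
    #     'http://ci.example.com/build?B=default&C=def']
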
@@ -183,17 +50,21 b' class WebhookSettingsSchema(colander.Sch'
183 colander.String(),
50 colander.String(),
184 title=_('Webhook URL'),
51 title=_('Webhook URL'),
185 description=
52 description=
186 _('URL to which Webhook should submit data. Following variables '
53 _('URL to which Webhook should submit data. If used some of the '
187 'are allowed to be used: {vars}. Some of the variables would '
54 'variables would trigger multiple calls, like ${branch} or '
188 'trigger multiple calls, like ${{branch}} or ${{commit_id}}. '
55 '${commit_id}. Webhook will be called as many times as unique '
189 'Webhook will be called as many times as unique objects in '
56 'objects in data in such cases.'),
190 'data in such cases.').format(vars=URL_VARS),
191 missing=colander.required,
57 missing=colander.required,
192 required=True,
58 required=True,
193 validator=colander.url,
59 validator=colander.url,
194 widget=deform.widget.TextInputWidget(
60 widget=widgets.CodeMirrorWidget(
195 placeholder='https://www.example.com/webhook'
61 help_block_collapsable_name='Show url variables',
196 ),
62 help_block_collapsable=(
63 'E.g http://my-serv/trigger_job/${{event_name}}'
64 '?PR_ID=${{pull_request_id}}'
65 '\nFull list of vars:\n{}'.format(URL_VARS)),
66 codemirror_mode='text',
67 codemirror_options='{"lineNumbers": false, "lineWrapping": true}'),
197 )
68 )
198 secret_token = colander.SchemaNode(
69 secret_token = colander.SchemaNode(
199 colander.String(),
70 colander.String(),
@@ -234,7 +105,7 b' class WebhookSettingsSchema(colander.Sch'
234 default='',
105 default='',
235 missing='',
106 missing='',
236 widget=deform.widget.TextInputWidget(
107 widget=deform.widget.TextInputWidget(
237 placeholder='e.g.Authorization'
108 placeholder='e.g: Authorization'
238 ),
109 ),
239 )
110 )
240 custom_header_val = colander.SchemaNode(
111 custom_header_val = colander.SchemaNode(
@@ -244,7 +115,7 b' class WebhookSettingsSchema(colander.Sch'
244 default='',
115 default='',
245 missing='',
116 missing='',
246 widget=deform.widget.TextInputWidget(
117 widget=deform.widget.TextInputWidget(
247 placeholder='e.g. RcLogin auth=xxxx'
118 placeholder='e.g. Basic XxXxXx'
248 ),
119 ),
249 )
120 )
250 method_type = colander.SchemaNode(
121 method_type = colander.SchemaNode(
@@ -264,8 +135,11 b' class WebhookSettingsSchema(colander.Sch'
264 class WebhookIntegrationType(IntegrationTypeBase):
135 class WebhookIntegrationType(IntegrationTypeBase):
265 key = 'webhook'
136 key = 'webhook'
266 display_name = _('Webhook')
137 display_name = _('Webhook')
267 description = _('Post json events to a Webhook endpoint')
138 description = _('send JSON data to a url endpoint')
268 icon = '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 239" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M119.540432,100.502743 C108.930124,118.338815 98.7646301,135.611455 88.3876025,152.753617 C85.7226696,157.154315 84.4040417,160.738531 86.5332204,166.333309 C92.4107024,181.787152 84.1193605,196.825836 68.5350381,200.908244 C53.8383677,204.759349 39.5192953,195.099955 36.6032893,179.365384 C34.0194114,165.437749 44.8274148,151.78491 60.1824106,149.608284 C61.4694072,149.424428 62.7821041,149.402681 64.944891,149.240571 C72.469175,136.623655 80.1773157,123.700312 88.3025935,110.073173 C73.611854,95.4654658 64.8677898,78.3885437 66.803227,57.2292132 C68.1712787,42.2715849 74.0527146,29.3462646 84.8033863,18.7517722 C105.393354,-1.53572199 136.805164,-4.82141828 161.048542,10.7510424 C184.333097,25.7086706 194.996783,54.8450075 185.906752,79.7822957 C179.052655,77.9239597 172.151111,76.049808 164.563565,73.9917997 C167.418285,60.1274266 165.306899,47.6765751 155.95591,37.0109123 C149.777932,29.9690049 141.850349,26.2780332 132.835442,24.9178894 C114.764113,22.1877169 97.0209573,33.7983633 91.7563309,51.5355878 C85.7800012,71.6669027 94.8245623,88.1111998 119.540432,100.502743 L119.540432,100.502743 Z" fill="#C73A63"></path><path d="M149.841194,79.4106285 C157.316054,92.5969067 164.905578,105.982857 172.427885,119.246236 C210.44865,107.483365 239.114472,128.530009 249.398582,151.063322 C261.81978,178.282014 253.328765,210.520191 228.933162,227.312431 C203.893073,244.551464 172.226236,241.605803 150.040866,219.46195 C155.694953,214.729124 161.376716,209.974552 167.44794,204.895759 C189.360489,219.088306 208.525074,218.420096 222.753207,201.614016 C234.885769,187.277151 234.622834,165.900356 222.138374,151.863988 C207.730339,135.66681 188.431321,135.172572 165.103273,150.721309 C155.426087,133.553447 145.58086,116.521995 136.210101,99.2295848 C133.05093,93.4015266 129.561608,90.0209366 122.440622,88.7873178 C110.547271,86.7253555 102.868785,76.5124151 102.408155,65.0698097 C101.955433,53.7537294 108.621719,43.5249733 119.04224,39.5394355 C129.363912,35.5914599 141.476705,38.7783085 148.419765,47.554004 C154.093621,54.7244134 155.896602,62.7943365 152.911402,71.6372484 C152.081082,74.1025091 151.00562,76.4886916 149.841194,79.4106285 L149.841194,79.4106285 Z" fill="#4B4B4B"></path><path d="M167.706921,187.209935 L121.936499,187.209935 C117.54964,205.253587 108.074103,219.821756 91.7464461,229.085759 C79.0544063,236.285822 65.3738898,238.72736 50.8136292,236.376762 C24.0061432,232.053165 2.08568567,207.920497 0.156179306,180.745298 C-2.02835403,149.962159 19.1309765,122.599149 47.3341915,116.452801 C49.2814904,123.524363 51.2485589,130.663141 53.1958579,137.716911 C27.3195169,150.919004 18.3639187,167.553089 25.6054984,188.352614 C31.9811726,206.657224 50.0900643,216.690262 69.7528413,212.809503 C89.8327554,208.847688 99.9567329,192.160226 98.7211371,165.37844 C117.75722,165.37844 136.809118,165.180745 155.847178,165.475311 C163.280522,165.591951 169.019617,164.820939 174.620326,158.267339 C183.840836,147.48306 200.811003,148.455721 210.741239,158.640984 C220.88894,169.049642 220.402609,185.79839 209.663799,195.768166 C199.302587,205.38802 182.933414,204.874012 173.240413,194.508846 C171.247644,192.37176 169.677943,189.835329 167.706921,187.209935 L167.706921,187.209935 Z" fill="#4A4A4A"></path></g></svg>'''
139
140 @classmethod
141 def icon(cls):
142 return '''<?xml version="1.0" encoding="UTF-8" standalone="no"?><svg viewBox="0 0 256 239" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" preserveAspectRatio="xMidYMid"><g><path d="M119.540432,100.502743 C108.930124,118.338815 98.7646301,135.611455 88.3876025,152.753617 C85.7226696,157.154315 84.4040417,160.738531 86.5332204,166.333309 C92.4107024,181.787152 84.1193605,196.825836 68.5350381,200.908244 C53.8383677,204.759349 39.5192953,195.099955 36.6032893,179.365384 C34.0194114,165.437749 44.8274148,151.78491 60.1824106,149.608284 C61.4694072,149.424428 62.7821041,149.402681 64.944891,149.240571 C72.469175,136.623655 80.1773157,123.700312 88.3025935,110.073173 C73.611854,95.4654658 64.8677898,78.3885437 66.803227,57.2292132 C68.1712787,42.2715849 74.0527146,29.3462646 84.8033863,18.7517722 C105.393354,-1.53572199 136.805164,-4.82141828 161.048542,10.7510424 C184.333097,25.7086706 194.996783,54.8450075 185.906752,79.7822957 C179.052655,77.9239597 172.151111,76.049808 164.563565,73.9917997 C167.418285,60.1274266 165.306899,47.6765751 155.95591,37.0109123 C149.777932,29.9690049 141.850349,26.2780332 132.835442,24.9178894 C114.764113,22.1877169 97.0209573,33.7983633 91.7563309,51.5355878 C85.7800012,71.6669027 94.8245623,88.1111998 119.540432,100.502743 L119.540432,100.502743 Z" fill="#C73A63"></path><path d="M149.841194,79.4106285 C157.316054,92.5969067 164.905578,105.982857 172.427885,119.246236 C210.44865,107.483365 239.114472,128.530009 249.398582,151.063322 C261.81978,178.282014 253.328765,210.520191 228.933162,227.312431 C203.893073,244.551464 172.226236,241.605803 150.040866,219.46195 C155.694953,214.729124 161.376716,209.974552 167.44794,204.895759 C189.360489,219.088306 208.525074,218.420096 222.753207,201.614016 C234.885769,187.277151 234.622834,165.900356 222.138374,151.863988 C207.730339,135.66681 188.431321,135.172572 165.103273,150.721309 C155.426087,133.553447 145.58086,116.521995 136.210101,99.2295848 C133.05093,93.4015266 129.561608,90.0209366 122.440622,88.7873178 C110.547271,86.7253555 102.868785,76.5124151 102.408155,65.0698097 C101.955433,53.7537294 108.621719,43.5249733 119.04224,39.5394355 C129.363912,35.5914599 141.476705,38.7783085 148.419765,47.554004 C154.093621,54.7244134 155.896602,62.7943365 152.911402,71.6372484 C152.081082,74.1025091 151.00562,76.4886916 149.841194,79.4106285 L149.841194,79.4106285 Z" fill="#4B4B4B"></path><path d="M167.706921,187.209935 L121.936499,187.209935 C117.54964,205.253587 108.074103,219.821756 91.7464461,229.085759 C79.0544063,236.285822 65.3738898,238.72736 50.8136292,236.376762 C24.0061432,232.053165 2.08568567,207.920497 0.156179306,180.745298 C-2.02835403,149.962159 19.1309765,122.599149 47.3341915,116.452801 C49.2814904,123.524363 51.2485589,130.663141 53.1958579,137.716911 C27.3195169,150.919004 18.3639187,167.553089 25.6054984,188.352614 C31.9811726,206.657224 50.0900643,216.690262 69.7528413,212.809503 C89.8327554,208.847688 99.9567329,192.160226 98.7211371,165.37844 C117.75722,165.37844 136.809118,165.180745 155.847178,165.475311 C163.280522,165.591951 169.019617,164.820939 174.620326,158.267339 C183.840836,147.48306 200.811003,148.455721 210.741239,158.640984 C220.88894,169.049642 220.402609,185.79839 209.663799,195.768166 C199.302587,205.38802 182.933414,204.874012 173.240413,194.508846 C171.247644,192.37176 169.677943,189.835329 167.706921,187.209935 L167.706921,187.209935 Z" fill="#4A4A4A"></path></g></svg>'''
269
143
270 valid_events = [
144 valid_events = [
271 events.PullRequestCloseEvent,
145 events.PullRequestCloseEvent,
@@ -293,8 +167,8 b' class WebhookIntegrationType(Integration'
293 return schema
167 return schema
294
168
295 def send_event(self, event):
169 def send_event(self, event):
296 log.debug('handling event %s with Webhook integration %s',
170 log.debug(
297 event.name, self)
171 'handling event %s with Webhook integration %s', event.name, self)
298
172
299 if event.__class__ not in self.valid_events:
173 if event.__class__ not in self.valid_events:
300 log.debug('event not valid: %r' % event)
174 log.debug('event not valid: %r' % event)
@@ -313,8 +187,7 b' class WebhookIntegrationType(Integration'
313 if head_key and head_val:
187 if head_key and head_val:
314 headers = {head_key: head_val}
188 headers = {head_key: head_val}
315
189
316 handler = WebhookHandler(
190 handler = WebhookDataHandler(template_url, headers)
317 template_url, self.settings['secret_token'], headers)
318
191
319 url_calls = handler(event, data)
192 url_calls = handler(event, data)
320 log.debug('webhook: calling following urls: %s',
193 log.debug('webhook: calling following urls: %s',
@@ -370,7 +243,10 b' def post_to_webhook(url_calls, settings)'
370 rhodecode.__version__)
243 rhodecode.__version__)
371 } # updated below with custom ones, allows override
244 } # updated below with custom ones, allows override
372
245
373 for url, token, headers, data in url_calls:
246 auth = get_auth(settings)
247 token = get_web_token(settings)
248
249 for url, headers, data in url_calls:
374 req_session = requests.Session()
250 req_session = requests.Session()
375 req_session.mount( # retry max N times
251 req_session.mount( # retry max N times
376 'http://', requests.adapters.HTTPAdapter(max_retries=retries))
252 'http://', requests.adapters.HTTPAdapter(max_retries=retries))
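
Note: the retry wiring above relies on `requests`' mountable transport adapters. A standalone sketch of the same pattern (the retry count and backoff here are illustrative, not the values `post_to_webhook` actually uses):

.. code-block:: python

    import requests
    from requests.adapters import HTTPAdapter
    from requests.packages.urllib3.util.retry import Retry

    retries = Retry(total=3, backoff_factor=0.5)

    session = requests.Session()
    # failed connections are retried transparently for both schemes
    session.mount('http://', HTTPAdapter(max_retries=retries))
    session.mount('https://', HTTPAdapter(max_retries=retries))
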
@@ -380,7 +256,6 b' def post_to_webhook(url_calls, settings)'
380
256
381 headers = headers or {}
257 headers = headers or {}
382 call_headers.update(headers)
258 call_headers.update(headers)
383 auth = get_auth(settings)
384
259
385 log.debug('calling Webhook with method: %s, and auth:%s',
260 log.debug('calling Webhook with method: %s, and auth:%s',
386 call_method, auth)
261 call_method, auth)
@@ -392,4 +267,8 b' def post_to_webhook(url_calls, settings)'
392 }, headers=call_headers, auth=auth)
267 }, headers=call_headers, auth=auth)
393 log.debug('Got Webhook response: %s', resp)
268 log.debug('Got Webhook response: %s', resp)
394
269
395 resp.raise_for_status() # raise exception on a failed request
270 try:
271 resp.raise_for_status() # raise exception on a failed request
272 except Exception:
273 log.error(resp.text)
274 raise
@@ -99,8 +99,11 b' class CachingQuery(Query):'
99
99
100 """
100 """
101 if hasattr(self, '_cache_parameters'):
101 if hasattr(self, '_cache_parameters'):
102 return self.get_value(createfunc=lambda:
102
103 list(Query.__iter__(self)))
103 def caching_query():
104 return list(Query.__iter__(self))
105
106 return self.get_value(createfunc=caching_query)
104 else:
107 else:
105 return Query.__iter__(self)
108 return Query.__iter__(self)
106
109
@@ -28,8 +28,9 b' from pygments.lexers.special import Text'
28
28
29 from rhodecode.lib.helpers import (
29 from rhodecode.lib.helpers import (
30 get_lexer_for_filenode, html_escape, get_custom_lexer)
30 get_lexer_for_filenode, html_escape, get_custom_lexer)
31 from rhodecode.lib.utils2 import AttributeDict
31 from rhodecode.lib.utils2 import AttributeDict, StrictAttributeDict
32 from rhodecode.lib.vcs.nodes import FileNode
32 from rhodecode.lib.vcs.nodes import FileNode
33 from rhodecode.lib.vcs.exceptions import VCSError, NodeDoesNotExistError
33 from rhodecode.lib.diff_match_patch import diff_match_patch
34 from rhodecode.lib.diff_match_patch import diff_match_patch
34 from rhodecode.lib.diffs import LimitedDiffContainer
35 from rhodecode.lib.diffs import LimitedDiffContainer
35 from pygments.lexers import get_lexer_by_name
36 from pygments.lexers import get_lexer_by_name
@@ -38,7 +39,7 b' plain_text_lexer = get_lexer_by_name('
38 'text', stripall=False, stripnl=False, ensurenl=False)
39 'text', stripall=False, stripnl=False, ensurenl=False)
39
40
40
41
41 log = logging.getLogger()
42 log = logging.getLogger(__name__)
42
43
43
44
44 def filenode_as_lines_tokens(filenode, lexer=None):
45 def filenode_as_lines_tokens(filenode, lexer=None):
@@ -351,6 +352,16 b' def tokens_diff(old_tokens, new_tokens, '
351 return old_tokens_result, new_tokens_result, similarity
352 return old_tokens_result, new_tokens_result, similarity
352
353
353
354
355 def diffset_node_getter(commit):
356 def get_node(fname):
357 try:
358 return commit.get_node(fname)
359 except NodeDoesNotExistError:
360 return None
361
362 return get_node
363
364
354 class DiffSet(object):
365 class DiffSet(object):
355 """
366 """
356 An object for parsing the diff result from diffs.DiffProcessor and
367 An object for parsing the diff result from diffs.DiffProcessor and
@@ -400,7 +411,12 b' class DiffSet(object):'
400 for patch in patchset:
411 for patch in patchset:
401 diffset.file_stats[patch['filename']] = patch['stats']
412 diffset.file_stats[patch['filename']] = patch['stats']
402 filediff = self.render_patch(patch)
413 filediff = self.render_patch(patch)
403 filediff.diffset = diffset
414 filediff.diffset = StrictAttributeDict(dict(
415 source_ref=diffset.source_ref,
416 target_ref=diffset.target_ref,
417 repo_name=diffset.repo_name,
418 source_repo_name=diffset.source_repo_name,
419 ))
404 diffset.files.append(filediff)
420 diffset.files.append(filediff)
405 diffset.changed_files += 1
421 diffset.changed_files += 1
406 if not patch['stats']['binary']:
422 if not patch['stats']['binary']:
@@ -510,6 +526,7 b' class DiffSet(object):'
510 if target_file_path in self.comments_store:
526 if target_file_path in self.comments_store:
511 for lineno, comments in self.comments_store[target_file_path].items():
527 for lineno, comments in self.comments_store[target_file_path].items():
512 left_comments[lineno] = comments
528 left_comments[lineno] = comments
529
513 # left comments are one that we couldn't place in diff lines.
530 # left comments are one that we couldn't place in diff lines.
514 # could be outdated, or the diff changed and this line is no
531 # could be outdated, or the diff changed and this line is no
515 # longer available
532 # longer available
@@ -546,7 +563,7 b' class DiffSet(object):'
546
563
547 result.lines.extend(
564 result.lines.extend(
548 self.parse_lines(before, after, source_file, target_file))
565 self.parse_lines(before, after, source_file, target_file))
549 result.unified = self.as_unified(result.lines)
566 result.unified = list(self.as_unified(result.lines))
550 result.sideside = result.lines
567 result.sideside = result.lines
551
568
552 return result
569 return result
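
Note: wrapping `as_unified(...)` in `list()` matters because `as_unified` appears to be a generator (see the per-line `yield`s in a later hunk); a generator is exhausted after one pass, while the materialized list can be iterated again. A minimal illustration:

.. code-block:: python

    def numbers():
        yield 1
        yield 2

    gen = numbers()
    assert list(gen) == [1, 2]
    assert list(gen) == []        # a generator is empty on the second pass

    unified = list(numbers())     # materialized once...
    assert list(unified) == [1, 2] == list(unified)   # ...iterable any number of times
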
@@ -601,8 +618,9 b' class DiffSet(object):'
601 original.lineno = before['old_lineno']
618 original.lineno = before['old_lineno']
602 original.content = before['line']
619 original.content = before['line']
603 original.action = self.action_to_op(before['action'])
620 original.action = self.action_to_op(before['action'])
604 original.comments = self.get_comments_for('old',
621
605 source_file, before['old_lineno'])
622 original.get_comment_args = (
623 source_file, 'o', before['old_lineno'])
606
624
607 if after:
625 if after:
608 if after['action'] == 'new-no-nl':
626 if after['action'] == 'new-no-nl':
@@ -614,8 +632,9 b' class DiffSet(object):'
614 modified.lineno = after['new_lineno']
632 modified.lineno = after['new_lineno']
615 modified.content = after['line']
633 modified.content = after['line']
616 modified.action = self.action_to_op(after['action'])
634 modified.action = self.action_to_op(after['action'])
617 modified.comments = self.get_comments_for('new',
635
618 target_file, after['new_lineno'])
636 modified.get_comment_args = (
637 target_file, 'n', after['new_lineno'])
619
638
620 # diff the lines
639 # diff the lines
621 if before_tokens and after_tokens:
640 if before_tokens and after_tokens:
@@ -644,23 +663,6 b' class DiffSet(object):'
644
663
645 return lines
664 return lines
646
665
647 def get_comments_for(self, version, filename, line_number):
648 if hasattr(filename, 'unicode_path'):
649 filename = filename.unicode_path
650
651 if not isinstance(filename, basestring):
652 return None
653
654 line_key = {
655 'old': 'o',
656 'new': 'n',
657 }[version] + str(line_number)
658
659 if filename in self.comments_store:
660 file_comments = self.comments_store[filename]
661 if line_key in file_comments:
662 return file_comments.pop(line_key)
663
664 def get_line_tokens(self, line_text, line_number, file=None):
666 def get_line_tokens(self, line_text, line_number, file=None):
665 filenode = None
667 filenode = None
666 filename = None
668 filename = None
@@ -717,25 +719,25 b' class DiffSet(object):'
717 if line.original.action == ' ':
719 if line.original.action == ' ':
718 yield (line.original.lineno, line.modified.lineno,
720 yield (line.original.lineno, line.modified.lineno,
719 line.original.action, line.original.content,
721 line.original.action, line.original.content,
720 line.original.comments)
722 line.original.get_comment_args)
721 continue
723 continue
722
724
723 if line.original.action == '-':
725 if line.original.action == '-':
724 yield (line.original.lineno, None,
726 yield (line.original.lineno, None,
725 line.original.action, line.original.content,
727 line.original.action, line.original.content,
726 line.original.comments)
728 line.original.get_comment_args)
727
729
728 if line.modified.action == '+':
730 if line.modified.action == '+':
729 buf.append((
731 buf.append((
730 None, line.modified.lineno,
732 None, line.modified.lineno,
731 line.modified.action, line.modified.content,
733 line.modified.action, line.modified.content,
732 line.modified.comments))
734 line.modified.get_comment_args))
733 continue
735 continue
734
736
735 if line.modified:
737 if line.modified:
736 yield (None, line.modified.lineno,
738 yield (None, line.modified.lineno,
737 line.modified.action, line.modified.content,
739 line.modified.action, line.modified.content,
738 line.modified.comments)
740 line.modified.get_comment_args)
739
741
740 for b in buf:
742 for b in buf:
741 yield b
743 yield b
@@ -126,7 +126,7 b' class DbManage(object):'
126 log.info("Deleting (%s) cache keys now...", total)
126 log.info("Deleting (%s) cache keys now...", total)
127 CacheKey.delete_all_cache()
127 CacheKey.delete_all_cache()
128
128
129 def upgrade(self):
129 def upgrade(self, version=None):
130 """
130 """
131 Upgrades given database schema to given revision following
131 Upgrades given database schema to given revision following
132 all needed steps, to perform the upgrade
132 all needed steps, to perform the upgrade
@@ -157,9 +157,9 b' class DbManage(object):'
157 db_uri = self.dburi
157 db_uri = self.dburi
158
158
159 try:
159 try:
160 curr_version = api.db_version(db_uri, repository_path)
160 curr_version = version or api.db_version(db_uri, repository_path)
161 msg = ('Found current database under version '
161 msg = ('Found current database db_uri under version '
162 'control with version %s' % curr_version)
162 'control with version {}'.format(curr_version))
163
163
164 except (RuntimeError, DatabaseNotControlledError):
164 except (RuntimeError, DatabaseNotControlledError):
165 curr_version = 1
165 curr_version = 1
@@ -23,16 +23,19 b''
23 Set of diffing helpers, previously part of vcs
23 Set of diffing helpers, previously part of vcs
24 """
24 """
25
25
26 import os
26 import re
27 import re
28 import bz2
29
27 import collections
30 import collections
28 import difflib
31 import difflib
29 import logging
32 import logging
30
33 import cPickle as pickle
31 from itertools import tee, imap
34 from itertools import tee, imap
32
35
33 from rhodecode.lib.vcs.exceptions import VCSError
36 from rhodecode.lib.vcs.exceptions import VCSError
34 from rhodecode.lib.vcs.nodes import FileNode, SubModuleNode
37 from rhodecode.lib.vcs.nodes import FileNode, SubModuleNode
35 from rhodecode.lib.utils2 import safe_unicode
38 from rhodecode.lib.utils2 import safe_unicode, safe_str
36
39
37 log = logging.getLogger(__name__)
40 log = logging.getLogger(__name__)
38
41
@@ -1129,3 +1132,82 b' class LineNotInDiffException(Exception):'
1129
1132
1130 class DiffLimitExceeded(Exception):
1133 class DiffLimitExceeded(Exception):
1131 pass
1134 pass
1135
1136
1137 def cache_diff(cached_diff_file, diff, commits):
1138
1139 struct = {
1140 'version': 'v1',
1141 'diff': diff,
1142 'commits': commits
1143 }
1144
1145 try:
1146 with bz2.BZ2File(cached_diff_file, 'wb') as f:
1147 pickle.dump(struct, f)
1148 log.debug('Saved diff cache under %s', cached_diff_file)
1149 except Exception:
1150 log.warn('Failed to save cache', exc_info=True)
1151 # cleanup file to not store it "damaged"
1152 try:
1153 os.remove(cached_diff_file)
1154 except Exception:
1155 log.exception('Failed to cleanup path %s', cached_diff_file)
1156
1157
1158 def load_cached_diff(cached_diff_file):
1159
1160 default_struct = {
1161 'version': 'v1',
1162 'diff': None,
1163 'commits': None
1164 }
1165
1166 has_cache = os.path.isfile(cached_diff_file)
1167 if not has_cache:
1168 return default_struct
1169
1170 data = None
1171 try:
1172 with bz2.BZ2File(cached_diff_file, 'rb') as f:
1173 data = pickle.load(f)
1174 log.debug('Loaded diff cache from %s', cached_diff_file)
1175 except Exception:
1176 log.warn('Failed to read diff cache file', exc_info=True)
1177
1178 if not data:
1179 data = default_struct
1180
1181 if not isinstance(data, dict):
1182 # old version of data ?
1183 data = default_struct
1184
1185 return data
1186
1187
1188 def generate_diff_cache_key(*args):
1189 """
1190 Helper to generate a cache key using arguments
1191 """
1192 def arg_mapper(input_param):
1193 input_param = safe_str(input_param)
1194 # we cannot allow '/' in arguments since it would allow
1195 # subdirectory usage
1196 input_param = input_param.replace('/', '_')
1197 return input_param or None # prevent empty string arguments
1198
1199 return '_'.join([
1200 '{}' for i in range(len(args))]).format(*map(arg_mapper, args))
1201
1202
1203 def diff_cache_exist(cache_storage, *args):
1204 """
1205 Based on all generated arguments check and return a cache path
1206 """
1207 cache_key = generate_diff_cache_key(*args)
1208 cache_file_path = os.path.join(cache_storage, cache_key)
1209 # prevent path traversal attacks using some param that have e.g '../../'
1210 if not os.path.abspath(cache_file_path).startswith(cache_storage):
1211 raise ValueError('Final path must be within {}'.format(cache_storage))
1212
1213 return cache_file_path
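
Note: taken together, the helpers added above form a small file-based cache for rendered diffs. An illustrative round trip using those helpers (the storage directory, key parts, and `compute_diff_somehow` placeholder are made up for the example):

.. code-block:: python

    import tempfile

    cache_storage = tempfile.mkdtemp(prefix='rc_diff_cache_')

    # builds "<storage>/<key>" and refuses keys that would escape the storage dir
    cached_diff_file = diff_cache_exist(
        cache_storage, 'diff', 'source@abc123', 'target@def456', 'context-3')

    cached = load_cached_diff(cached_diff_file)
    if cached['diff'] is None:
        diff, commits = compute_diff_somehow()       # placeholder for the real diff call
        cache_diff(cached_diff_file, diff, commits)  # bz2 + pickle, best effort
    else:
        diff, commits = cached['diff'], cached['commits']
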
@@ -132,6 +132,7 b' class VCSServerUnavailable(HTTPBadGatewa'
132 'Incorrect vcs.server=host:port',
132 'Incorrect vcs.server=host:port',
133 'Incorrect vcs.server.protocol',
133 'Incorrect vcs.server.protocol',
134 ]
134 ]
135
135 def __init__(self, message=''):
136 def __init__(self, message=''):
136 self.explanation = 'Could not connect to VCS Server'
137 self.explanation = 'Could not connect to VCS Server'
137 if message:
138 if message:
@@ -25,6 +25,7 b' Consists of functions to typically be us'
25 available to Controllers. This module is available to both as 'h'.
25 available to Controllers. This module is available to both as 'h'.
26 """
26 """
27
27
28 import os
28 import random
29 import random
29 import hashlib
30 import hashlib
30 import StringIO
31 import StringIO
@@ -742,18 +743,23 b' short_id = lambda x: x[:12]'
742 hide_credentials = lambda x: ''.join(credentials_filter(x))
743 hide_credentials = lambda x: ''.join(credentials_filter(x))
743
744
744
745
746 import pytz
747 import tzlocal
748 local_timezone = tzlocal.get_localzone()
749
750
745 def age_component(datetime_iso, value=None, time_is_local=False):
751 def age_component(datetime_iso, value=None, time_is_local=False):
746 title = value or format_date(datetime_iso)
752 title = value or format_date(datetime_iso)
747 tzinfo = '+00:00'
753 tzinfo = '+00:00'
748
754
749 # detect if we have a timezone info, otherwise, add it
755 # detect if we have a timezone info, otherwise, add it
750 if isinstance(datetime_iso, datetime) and not datetime_iso.tzinfo:
756 if time_is_local and isinstance(datetime_iso, datetime) and not datetime_iso.tzinfo:
751 if time_is_local:
757 force_timezone = os.environ.get('RC_TIMEZONE', '')
752 tzinfo = time.strftime("+%H:%M",
758 if force_timezone:
753 time.gmtime(
759 force_timezone = pytz.timezone(force_timezone)
754 (datetime.now() - datetime.utcnow()).seconds + 1
760 timezone = force_timezone or local_timezone
755 )
761 offset = timezone.localize(datetime_iso).strftime('%z')
756 )
762 tzinfo = '{}:{}'.format(offset[:-2], offset[-2:])
757
763
758 return literal(
764 return literal(
759 '<time class="timeago tooltip" '
765 '<time class="timeago tooltip" '
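
Note: the rewritten `age_component` derives the `+HH:MM` suffix from either a forced `RC_TIMEZONE` environment variable or the server's local zone. The same logic as a standalone helper, restated for clarity and assuming `pytz`/`tzlocal` behave as used above:

.. code-block:: python

    import os
    from datetime import datetime

    import pytz
    import tzlocal

    def utc_offset_label(naive_dt):
        forced = os.environ.get('RC_TIMEZONE', '')
        timezone = pytz.timezone(forced) if forced else tzlocal.get_localzone()
        offset = timezone.localize(naive_dt).strftime('%z')   # e.g. '+0100'
        return '{}:{}'.format(offset[:-2], offset[-2:])        # e.g. '+01:00'

    # utc_offset_label(datetime(2018, 3, 1, 12, 0)) -> '+01:00' when the zone is Europe/Berlin
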
@@ -896,6 +902,13 b' def link_to_user(author, length=0, **kwa'
896 return escape(display_person)
902 return escape(display_person)
897
903
898
904
905 def link_to_group(users_group_name, **kwargs):
906 return link_to(
907 escape(users_group_name),
908 route_path('user_group_profile', user_group_name=users_group_name),
909 **kwargs)
910
911
899 def person(author, show_attr="username_and_name"):
912 def person(author, show_attr="username_and_name"):
900 user = discover_user(author)
913 user = discover_user(author)
901 if user:
914 if user:
@@ -18,10 +18,13 b''
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import json
21 import os
22 import time
22 import logging
23 import logging
24 import tempfile
23 import traceback
25 import traceback
24 import threading
26 import threading
27
25 from BaseHTTPServer import BaseHTTPRequestHandler
28 from BaseHTTPServer import BaseHTTPRequestHandler
26 from SocketServer import TCPServer
29 from SocketServer import TCPServer
27
30
@@ -30,14 +33,28 b' from rhodecode.model import meta'
30 from rhodecode.lib.base import bootstrap_request, bootstrap_config
33 from rhodecode.lib.base import bootstrap_request, bootstrap_config
31 from rhodecode.lib import hooks_base
34 from rhodecode.lib import hooks_base
32 from rhodecode.lib.utils2 import AttributeDict
35 from rhodecode.lib.utils2 import AttributeDict
36 from rhodecode.lib.ext_json import json
33
37
34
38
35 log = logging.getLogger(__name__)
39 log = logging.getLogger(__name__)
36
40
37
41
38 class HooksHttpHandler(BaseHTTPRequestHandler):
42 class HooksHttpHandler(BaseHTTPRequestHandler):
43
39 def do_POST(self):
44 def do_POST(self):
40 method, extras = self._read_request()
45 method, extras = self._read_request()
46 txn_id = getattr(self.server, 'txn_id', None)
47 if txn_id:
48 from rhodecode.lib.caches import compute_key_from_params
49 log.debug('Computing TXN_ID based on `%s`:`%s`',
50 extras['repository'], extras['txn_id'])
51 computed_txn_id = compute_key_from_params(
52 extras['repository'], extras['txn_id'])
53 if txn_id != computed_txn_id:
54 raise Exception(
55 'TXN ID fail: expected {} got {} instead'.format(
56 txn_id, computed_txn_id))
57
41 try:
58 try:
42 result = self._call_hook(method, extras)
59 result = self._call_hook(method, extras)
43 except Exception as e:
60 except Exception as e:
@@ -77,13 +94,14 b' class HooksHttpHandler(BaseHTTPRequestHa'
77
94
78 message = format % args
95 message = format % args
79
96
80 # TODO: mikhail: add different log levels support
81 log.debug(
97 log.debug(
82 "%s - - [%s] %s", self.client_address[0],
98 "%s - - [%s] %s", self.client_address[0],
83 self.log_date_time_string(), message)
99 self.log_date_time_string(), message)
84
100
85
101
86 class DummyHooksCallbackDaemon(object):
102 class DummyHooksCallbackDaemon(object):
103 hooks_uri = ''
104
87 def __init__(self):
105 def __init__(self):
88 self.hooks_module = Hooks.__module__
106 self.hooks_module = Hooks.__module__
89
107
@@ -101,8 +119,8 b' class ThreadedHookCallbackDaemon(object)'
101 _daemon = None
119 _daemon = None
102 _done = False
120 _done = False
103
121
104 def __init__(self):
122 def __init__(self, txn_id=None, port=None):
105 self._prepare()
123 self._prepare(txn_id=txn_id, port=port)
106
124
107 def __enter__(self):
125 def __enter__(self):
108 self._run()
126 self._run()
@@ -112,7 +130,7 b' class ThreadedHookCallbackDaemon(object)'
112 log.debug('Callback daemon exiting now...')
130 log.debug('Callback daemon exiting now...')
113 self._stop()
131 self._stop()
114
132
115 def _prepare(self):
133 def _prepare(self, txn_id=None, port=None):
116 raise NotImplementedError()
134 raise NotImplementedError()
117
135
118 def _run(self):
136 def _run(self):
@@ -135,15 +153,18 b' class HttpHooksCallbackDaemon(ThreadedHo'
135 # request and wastes cpu at all other times.
153 # request and wastes cpu at all other times.
136 POLL_INTERVAL = 0.01
154 POLL_INTERVAL = 0.01
137
155
138 def _prepare(self):
156 def _prepare(self, txn_id=None, port=None):
139 log.debug("Preparing HTTP callback daemon and registering hook object")
140
141 self._done = False
157 self._done = False
142 self._daemon = TCPServer((self.IP_ADDRESS, 0), HooksHttpHandler)
158 self._daemon = TCPServer((self.IP_ADDRESS, port or 0), HooksHttpHandler)
143 _, port = self._daemon.server_address
159 _, port = self._daemon.server_address
144 self.hooks_uri = '{}:{}'.format(self.IP_ADDRESS, port)
160 self.hooks_uri = '{}:{}'.format(self.IP_ADDRESS, port)
161 self.txn_id = txn_id
162 # inject transaction_id for later verification
163 self._daemon.txn_id = self.txn_id
145
164
146 log.debug("Hooks uri is: %s", self.hooks_uri)
165 log.debug(
166 "Preparing HTTP callback daemon at `%s` and registering hook object",
167 self.hooks_uri)
147
168
148 def _run(self):
169 def _run(self):
149 log.debug("Running event loop of callback daemon in background thread")
170 log.debug("Running event loop of callback daemon in background thread")
@@ -160,26 +181,67 b' class HttpHooksCallbackDaemon(ThreadedHo'
160 self._callback_thread.join()
181 self._callback_thread.join()
161 self._daemon = None
182 self._daemon = None
162 self._callback_thread = None
183 self._callback_thread = None
184 if self.txn_id:
185 txn_id_file = get_txn_id_data_path(self.txn_id)
186 log.debug('Cleaning up TXN ID %s', txn_id_file)
187 if os.path.isfile(txn_id_file):
188 os.remove(txn_id_file)
189
163 log.debug("Background thread done.")
190 log.debug("Background thread done.")
164
191
165
192
166 def prepare_callback_daemon(extras, protocol, use_direct_calls):
193 def get_txn_id_data_path(txn_id):
167 callback_daemon = None
194 root = tempfile.gettempdir()
195 return os.path.join(root, 'rc_txn_id_{}'.format(txn_id))
196
197
198 def store_txn_id_data(txn_id, data_dict):
199 if not txn_id:
200 log.warning('Cannot store txn_id because it is empty')
201 return
202
203 path = get_txn_id_data_path(txn_id)
204 try:
205 with open(path, 'wb') as f:
206 f.write(json.dumps(data_dict))
207 except Exception:
208 log.exception('Failed to write txn_id metadata')
168
209
210
211 def get_txn_id_from_store(txn_id):
212 """
213 Reads txn_id from store and if present returns the data for callback manager
214 """
215 path = get_txn_id_data_path(txn_id)
216 try:
217 with open(path, 'rb') as f:
218 return json.loads(f.read())
219 except Exception:
220 return {}
221
222
223 def prepare_callback_daemon(extras, protocol, use_direct_calls, txn_id=None):
224 txn_details = get_txn_id_from_store(txn_id)
225 port = txn_details.get('port', 0)
169 if use_direct_calls:
226 if use_direct_calls:
170 callback_daemon = DummyHooksCallbackDaemon()
227 callback_daemon = DummyHooksCallbackDaemon()
171 extras['hooks_module'] = callback_daemon.hooks_module
228 extras['hooks_module'] = callback_daemon.hooks_module
172 else:
229 else:
173 if protocol == 'http':
230 if protocol == 'http':
174 callback_daemon = HttpHooksCallbackDaemon()
231 callback_daemon = HttpHooksCallbackDaemon(txn_id=txn_id, port=port)
175 else:
232 else:
176 log.error('Unsupported callback daemon protocol "%s"', protocol)
233 log.error('Unsupported callback daemon protocol "%s"', protocol)
177 raise Exception('Unsupported callback daemon protocol.')
234 raise Exception('Unsupported callback daemon protocol.')
178
235
179 extras['hooks_uri'] = callback_daemon.hooks_uri
236 extras['hooks_uri'] = callback_daemon.hooks_uri
180 extras['hooks_protocol'] = protocol
237 extras['hooks_protocol'] = protocol
238 extras['time'] = time.time()
181
239
182 log.debug('Prepared a callback daemon: %s', callback_daemon)
240 # register txn_id
241 extras['txn_id'] = txn_id
242
243 log.debug('Prepared a callback daemon: %s at url `%s`',
244 callback_daemon.__class__.__name__, callback_daemon.hooks_uri)
183 return callback_daemon, extras
245 return callback_daemon, extras
184
246
185
247
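
Note: the new `store_txn_id_data`/`get_txn_id_from_store` pair shares a TCP port between the SVN request that opens a transaction and the hooks callback daemon that later serves it. An illustrative round trip of the helpers defined above (the id and port values are made up):

.. code-block:: python

    txn_id = 'deadbeefcafe1234'                    # made-up transaction key
    store_txn_id_data(txn_id, {'port': 54321})     # written while handling the SVN request

    details = get_txn_id_from_store(txn_id)        # read back in prepare_callback_daemon
    port = details.get('port', 0)                  # the HTTP hooks daemon binds to this port
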
@@ -18,17 +18,21 b''
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import base64
21 import logging
22 import logging
22 import urllib
23 import urllib
23 from urlparse import urljoin
24 from urlparse import urljoin
24
25
25
26 import requests
26 import requests
27 from webob.exc import HTTPNotAcceptable
27 from webob.exc import HTTPNotAcceptable
28
28
29 from rhodecode.lib import caches
29 from rhodecode.lib.middleware import simplevcs
30 from rhodecode.lib.middleware import simplevcs
30 from rhodecode.lib.utils import is_valid_repo
31 from rhodecode.lib.utils import is_valid_repo
31 from rhodecode.lib.utils2 import str2bool
32 from rhodecode.lib.utils2 import str2bool, safe_int
33 from rhodecode.lib.ext_json import json
34 from rhodecode.lib.hooks_daemon import store_txn_id_data
35
32
36
33 log = logging.getLogger(__name__)
37 log = logging.getLogger(__name__)
34
38
@@ -39,7 +43,6 b' class SimpleSvnApp(object):'
39 'transfer-encoding', 'content-length']
43 'transfer-encoding', 'content-length']
40 rc_extras = {}
44 rc_extras = {}
41
45
42
43 def __init__(self, config):
46 def __init__(self, config):
44 self.config = config
47 self.config = config
45
48
@@ -52,9 +55,19 b' class SimpleSvnApp(object):'
52 # length, then we should transfer the payload in one request.
55 # length, then we should transfer the payload in one request.
53 if environ['REQUEST_METHOD'] == 'MKCOL' or 'CONTENT_LENGTH' in environ:
56 if environ['REQUEST_METHOD'] == 'MKCOL' or 'CONTENT_LENGTH' in environ:
54 data = data.read()
57 data = data.read()
58 if data.startswith('(create-txn-with-props'):
59 # store our rc_extras on the fly using svn revision properties;
60 # these can be read later in the hooks that get executed, giving us
61 # a way to pass the data into svn hooks
62 rc_data = base64.urlsafe_b64encode(json.dumps(self.rc_extras))
63 rc_data_len = len(rc_data)
64 # the header defines the data length, followed by the serialized data
65 skel = ' rc-scm-extras {} {}'.format(rc_data_len, rc_data)
66 data = data[:-2] + skel + '))'
55
67
56 log.debug('Calling: %s method via `%s`', environ['REQUEST_METHOD'],
68 log.debug('Calling: %s method via `%s`', environ['REQUEST_METHOD'],
57 self._get_url(environ['PATH_INFO']))
69 self._get_url(environ['PATH_INFO']))
70
58 response = requests.request(
71 response = requests.request(
59 environ['REQUEST_METHOD'], self._get_url(environ['PATH_INFO']),
72 environ['REQUEST_METHOD'], self._get_url(environ['PATH_INFO']),
60 data=data, headers=request_headers)
73 data=data, headers=request_headers)
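# Standalone sketch of the '(create-txn-with-props' splice above: rc_extras
# are JSON-serialized, urlsafe-base64 encoded and appended as an extra
# 'rc-scm-extras' revision property before the closing '))', so the hooks
# can recover them later. The toy request body below is invented.
import base64
import json

def append_rc_extras(body, rc_extras):
    rc_data = base64.urlsafe_b64encode(json.dumps(rc_extras).encode('utf-8'))
    skel = ' rc-scm-extras {} {}'.format(len(rc_data), rc_data.decode('ascii'))
    # drop the trailing '))' and re-close the skel with the extra property
    return body[:-2] + skel + '))'

body = '(create-txn-with-props (svn:log 5 hello))'
print(append_rc_extras(body, {'username': 'admin', 'action': 'push'}))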
@@ -70,6 +83,14 b' class SimpleSvnApp(object):'
70 log.debug('got response code: %s', response.status_code)
83 log.debug('got response code: %s', response.status_code)
71
84
72 response_headers = self._get_response_headers(response.headers)
85 response_headers = self._get_response_headers(response.headers)
86
87 if response.headers.get('SVN-Txn-name'):
88 svn_tx_id = response.headers.get('SVN-Txn-name')
89 txn_id = caches.compute_key_from_params(
90 self.config['repository'], svn_tx_id)
91 port = safe_int(self.rc_extras['hooks_uri'].split(':')[-1])
92 store_txn_id_data(txn_id, {'port': port})
93
73 start_response(
94 start_response(
74 '{} {}'.format(response.status_code, response.reason),
95 '{} {}'.format(response.status_code, response.reason),
75 response_headers)
96 response_headers)
@@ -156,6 +177,14 b' class SimpleSvn(simplevcs.SimpleVCS):'
156 if environ['REQUEST_METHOD'] in self.READ_ONLY_COMMANDS
177 if environ['REQUEST_METHOD'] in self.READ_ONLY_COMMANDS
157 else 'push')
178 else 'push')
158
179
180 def _should_use_callback_daemon(self, extras, environ, action):
181 # only the MERGE command triggers hooks, so we don't want to start the
182 # hooks server too many times. POST, however, starts the svn transaction,
183 # so we also need to initialize the callback daemon for POST
184 if environ['REQUEST_METHOD'] in ['MERGE', 'POST']:
185 return True
186 return False
187
159 def _create_wsgi_app(self, repo_path, repo_name, config):
188 def _create_wsgi_app(self, repo_path, repo_name, config):
160 if self._is_svn_enabled():
189 if self._is_svn_enabled():
161 return SimpleSvnApp(config)
190 return SimpleSvnApp(config)
@@ -28,10 +28,12 b' import re'
28 import logging
28 import logging
29 import importlib
29 import importlib
30 from functools import wraps
30 from functools import wraps
31 from StringIO import StringIO
32 from lxml import etree
31
33
32 import time
34 import time
33 from paste.httpheaders import REMOTE_USER, AUTH_TYPE
35 from paste.httpheaders import REMOTE_USER, AUTH_TYPE
34 # TODO(marcink): check if we should use webob.exc here ?
36
35 from pyramid.httpexceptions import (
37 from pyramid.httpexceptions import (
36 HTTPNotFound, HTTPForbidden, HTTPNotAcceptable, HTTPInternalServerError)
38 HTTPNotFound, HTTPForbidden, HTTPNotAcceptable, HTTPInternalServerError)
37 from zope.cachedescriptors.property import Lazy as LazyProperty
39 from zope.cachedescriptors.property import Lazy as LazyProperty
@@ -43,9 +45,7 b' from rhodecode.lib import caches'
43 from rhodecode.lib.auth import AuthUser, HasPermissionAnyMiddleware
45 from rhodecode.lib.auth import AuthUser, HasPermissionAnyMiddleware
44 from rhodecode.lib.base import (
46 from rhodecode.lib.base import (
45 BasicAuth, get_ip_addr, get_user_agent, vcs_operation_context)
47 BasicAuth, get_ip_addr, get_user_agent, vcs_operation_context)
46 from rhodecode.lib.exceptions import (
48 from rhodecode.lib.exceptions import (UserCreationError, NotAllowedToCreateUserError)
47 HTTPLockedRC, HTTPRequirementError, UserCreationError,
48 NotAllowedToCreateUserError)
49 from rhodecode.lib.hooks_daemon import prepare_callback_daemon
49 from rhodecode.lib.hooks_daemon import prepare_callback_daemon
50 from rhodecode.lib.middleware import appenlight
50 from rhodecode.lib.middleware import appenlight
51 from rhodecode.lib.middleware.utils import scm_app_http
51 from rhodecode.lib.middleware.utils import scm_app_http
@@ -53,6 +53,7 b' from rhodecode.lib.utils import is_valid'
53 from rhodecode.lib.utils2 import safe_str, fix_PATH, str2bool, safe_unicode
53 from rhodecode.lib.utils2 import safe_str, fix_PATH, str2bool, safe_unicode
54 from rhodecode.lib.vcs.conf import settings as vcs_settings
54 from rhodecode.lib.vcs.conf import settings as vcs_settings
55 from rhodecode.lib.vcs.backends import base
55 from rhodecode.lib.vcs.backends import base
56
56 from rhodecode.model import meta
57 from rhodecode.model import meta
57 from rhodecode.model.db import User, Repository, PullRequest
58 from rhodecode.model.db import User, Repository, PullRequest
58 from rhodecode.model.scm import ScmModel
59 from rhodecode.model.scm import ScmModel
@@ -62,6 +63,28 b' from rhodecode.model.settings import Set'
62 log = logging.getLogger(__name__)
63 log = logging.getLogger(__name__)
63
64
64
65
66 def extract_svn_txn_id(acl_repo_name, data):
67 """
68 Helper method for extracting the svn txn_id from XML data submitted
69 during POST operations
70 """
71 try:
72 root = etree.fromstring(data)
73 pat = re.compile(r'/txn/(?P<txn_id>.*)')
74 for el in root:
75 if el.tag == '{DAV:}source':
76 for sub_el in el:
77 if sub_el.tag == '{DAV:}href':
78 match = pat.search(sub_el.text)
79 if match:
80 svn_tx_id = match.groupdict()['txn_id']
81 txn_id = caches.compute_key_from_params(
82 acl_repo_name, svn_tx_id)
83 return txn_id
84 except Exception:
85 log.exception('Failed to extract txn_id')
86
87
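# Standalone sketch of what extract_svn_txn_id() parses: an svn MERGE body is
# a DAV XML document whose <D:source><D:href> points at the transaction URL,
# ending in /txn/<txn_id>. The sample body is hand-made, and stdlib
# ElementTree is used here instead of lxml to keep the sketch dependency-free.
import re
import xml.etree.ElementTree as etree

SAMPLE_MERGE_BODY = (
    '<D:merge xmlns:D="DAV:">'
    '<D:source><D:href>/repo/!svn/txn/1234-abcd</D:href></D:source>'
    '</D:merge>'
)

def extract_txn_id(data):
    root = etree.fromstring(data)
    pat = re.compile(r'/txn/(?P<txn_id>.*)')
    for href in root.iter('{DAV:}href'):
        match = pat.search(href.text or '')
        if match:
            return match.group('txn_id')

print(extract_txn_id(SAMPLE_MERGE_BODY))  # -> 1234-abcd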
65 def initialize_generator(factory):
88 def initialize_generator(factory):
66 """
89 """
67 Initializes the returned generator by draining its first element.
90 Initializes the returned generator by draining its first element.
@@ -359,8 +382,9 b' class SimpleVCS(object):'
359 # check if we have SSL required ! if not it's a bad request !
382 # check if we have SSL required ! if not it's a bad request !
360 require_ssl = str2bool(self.repo_vcs_config.get('web', 'push_ssl'))
383 require_ssl = str2bool(self.repo_vcs_config.get('web', 'push_ssl'))
361 if require_ssl and org_proto == 'http':
384 if require_ssl and org_proto == 'http':
362 log.debug('proto is %s and SSL is required BAD REQUEST !',
385 log.debug(
363 org_proto)
386 'Bad request: detected protocol is `%s` and '
387 'SSL/HTTPS is required.', org_proto)
364 return False
388 return False
365 return True
389 return True
366
390
@@ -420,8 +444,9 b' class SimpleVCS(object):'
420 # Check if the shadow repo actually exists, in case someone refers
444 # Check if the shadow repo actually exists, in case someone refers
421 # to it, and it has been deleted because of successful merge.
445 # to it, and it has been deleted because of successful merge.
422 if self.is_shadow_repo and not self.is_shadow_repo_dir:
446 if self.is_shadow_repo and not self.is_shadow_repo_dir:
423 log.debug('Shadow repo detected, and shadow repo dir `%s` is missing',
447 log.debug(
424 self.is_shadow_repo_dir)
448 'Shadow repo detected, and shadow repo dir `%s` is missing',
449 self.is_shadow_repo_dir)
425 return HTTPNotFound()(environ, start_response)
450 return HTTPNotFound()(environ, start_response)
426
451
427 # ======================================================================
452 # ======================================================================
@@ -436,7 +461,8 b' class SimpleVCS(object):'
436 anonymous_perm = self._check_permission(
461 anonymous_perm = self._check_permission(
437 action, anonymous_user, self.acl_repo_name, ip_addr,
462 action, anonymous_user, self.acl_repo_name, ip_addr,
438 plugin_id='anonymous_access',
463 plugin_id='anonymous_access',
439 plugin_cache_active=plugin_cache_active, cache_ttl=cache_ttl,
464 plugin_cache_active=plugin_cache_active,
465 cache_ttl=cache_ttl,
440 )
466 )
441 else:
467 else:
442 anonymous_perm = False
468 anonymous_perm = False
@@ -562,48 +588,42 b' class SimpleVCS(object):'
562 also handles the locking exceptions which will be triggered when
588 also handles the locking exceptions which will be triggered when
563 the first chunk is produced by the underlying WSGI application.
589 the first chunk is produced by the underlying WSGI application.
564 """
590 """
565 callback_daemon, extras = self._prepare_callback_daemon(extras)
591 txn_id = ''
566 config = self._create_config(extras, self.acl_repo_name)
592 if 'CONTENT_LENGTH' in environ and environ['REQUEST_METHOD'] == 'MERGE':
567 log.debug('HOOKS extras is %s', extras)
593 # SVN case: we want to re-use the callback daemon port, so we use the
568 app = self._create_wsgi_app(repo_path, self.url_repo_name, config)
594 # txn_id; for this we peek at the request body and then restore it
569 app.rc_extras = extras
595 # as wsgi.input
596 data = environ['wsgi.input'].read()
597 environ['wsgi.input'] = StringIO(data)
598 txn_id = extract_svn_txn_id(self.acl_repo_name, data)
570
599
571 try:
600 callback_daemon, extras = self._prepare_callback_daemon(
572 with callback_daemon:
601 extras, environ, action, txn_id=txn_id)
573 try:
602 log.debug('HOOKS extras is %s', extras)
574 response = app(environ, start_response)
603
575 finally:
604 config = self._create_config(extras, self.acl_repo_name)
576 # This statement works together with the decorator
605 app = self._create_wsgi_app(repo_path, self.url_repo_name, config)
577 # "initialize_generator" above. The decorator ensures that
606 with callback_daemon:
578 # we hit the first yield statement before the generator is
607 app.rc_extras = extras
579 # returned back to the WSGI server. This is needed to
580 # ensure that the call to "app" above triggers the
581 # needed callback to "start_response" before the
582 # generator is actually used.
583 yield "__init__"
584
608
585 for chunk in response:
609 try:
586 yield chunk
610 response = app(environ, start_response)
587 except Exception as exc:
611 finally:
588 # TODO: martinb: Exceptions are only raised in case of the Pyro4
612 # This statement works together with the decorator
589 # backend. Refactor this except block after dropping Pyro4 support.
613 # "initialize_generator" above. The decorator ensures that
590 # TODO: johbo: Improve "translating" back the exception.
614 # we hit the first yield statement before the generator is
591 if getattr(exc, '_vcs_kind', None) == 'repo_locked':
615 # returned back to the WSGI server. This is needed to
592 exc = HTTPLockedRC(*exc.args)
616 # ensure that the call to "app" above triggers the
593 _code = rhodecode.CONFIG.get('lock_ret_code')
617 # needed callback to "start_response" before the
594 log.debug('Repository LOCKED ret code %s!', (_code,))
618 # generator is actually used.
595 elif getattr(exc, '_vcs_kind', None) == 'requirement':
619 yield "__init__"
596 log.debug(
597 'Repository requires features unknown to this Mercurial')
598 exc = HTTPRequirementError(*exc.args)
599 else:
600 raise
601
620
602 for chunk in exc(environ, start_response):
621 # iter content
622 for chunk in response:
603 yield chunk
623 yield chunk
604 finally:
624
605 # invalidate cache on push
606 try:
625 try:
626 # invalidate cache on push
607 if action == 'push':
627 if action == 'push':
608 self._invalidate_cache(self.url_repo_name)
628 self._invalidate_cache(self.url_repo_name)
609 finally:
629 finally:
@@ -631,10 +651,18 b' class SimpleVCS(object):'
631 """Create a safe config representation."""
651 """Create a safe config representation."""
632 raise NotImplementedError()
652 raise NotImplementedError()
633
653
634 def _prepare_callback_daemon(self, extras):
654 def _should_use_callback_daemon(self, extras, environ, action):
655 return True
656
657 def _prepare_callback_daemon(self, extras, environ, action, txn_id=None):
658 direct_calls = vcs_settings.HOOKS_DIRECT_CALLS
659 if not self._should_use_callback_daemon(extras, environ, action):
660 # disable callback daemon for actions that don't require it
661 direct_calls = True
662
635 return prepare_callback_daemon(
663 return prepare_callback_daemon(
636 extras, protocol=vcs_settings.HOOKS_PROTOCOL,
664 extras, protocol=vcs_settings.HOOKS_PROTOCOL,
637 use_direct_calls=vcs_settings.HOOKS_DIRECT_CALLS)
665 use_direct_calls=direct_calls, txn_id=txn_id)
638
666
639
667
640 def _should_check_locking(query_string):
668 def _should_check_locking(query_string):
@@ -36,7 +36,9 b' def get_app_config(ini_path):'
36 return appconfig('config:{}'.format(ini_path), relative_to=os.getcwd())
36 return appconfig('config:{}'.format(ini_path), relative_to=os.getcwd())
37
37
38
38
39 def bootstrap(config_uri, request=None, options=None):
39 def bootstrap(config_uri, request=None, options=None, env=None):
40 if env:
41 os.environ.update(env)
40
42
41 config = get_config(config_uri)
43 config = get_config(config_uri)
42 base_url = 'http://rhodecode.local'
44 base_url = 'http://rhodecode.local'
@@ -95,7 +95,7 b' def command(ini_path, force_yes, user, e'
95 dbmanage.populate_default_permissions()
95 dbmanage.populate_default_permissions()
96 Session().commit()
96 Session().commit()
97
97
98 with bootstrap(ini_path) as env:
98 with bootstrap(ini_path, env={'RC_CMD_SETUP_RC': '1'}) as env:
99 msg = 'Successfully initialized database, schema and default data.'
99 msg = 'Successfully initialized database, schema and default data.'
100 print()
100 print()
101 print('*' * len(msg))
101 print('*' * len(msg))
@@ -40,7 +40,7 b' def main(ini_path, force_yes):'
40 def command(ini_path, force_yes):
40 def command(ini_path, force_yes):
41 pyramid.paster.setup_logging(ini_path)
41 pyramid.paster.setup_logging(ini_path)
42
42
43 with bootstrap(ini_path) as env:
43 with bootstrap(ini_path, env={'RC_CMD_UPGRADE_DB': '1'}) as env:
44 config = env['registry'].settings
44 config = env['registry'].settings
45 db_uri = config['sqlalchemy.db1.url']
45 db_uri = config['sqlalchemy.db1.url']
46 options = {}
46 options = {}
@@ -23,8 +23,10 b' import os'
23 import sys
23 import sys
24 import time
24 import time
25 import platform
25 import platform
26 import collections
26 import pkg_resources
27 import pkg_resources
27 import logging
28 import logging
29 import resource
28
30
29 from pyramid.compat import configparser
31 from pyramid.compat import configparser
30
32
@@ -140,6 +142,29 b' def platform_type():'
140 return SysInfoRes(value=value)
142 return SysInfoRes(value=value)
141
143
142
144
145 def ulimit_info():
146 data = collections.OrderedDict([
147 ('cpu time (seconds)', resource.getrlimit(resource.RLIMIT_CPU)),
148 ('file size', resource.getrlimit(resource.RLIMIT_FSIZE)),
149 ('stack size', resource.getrlimit(resource.RLIMIT_STACK)),
150 ('core file size', resource.getrlimit(resource.RLIMIT_CORE)),
151 ('address space size', resource.getrlimit(resource.RLIMIT_AS)),
152 ('locked in mem size', resource.getrlimit(resource.RLIMIT_MEMLOCK)),
153 ('heap size', resource.getrlimit(resource.RLIMIT_DATA)),
154 ('rss size', resource.getrlimit(resource.RLIMIT_RSS)),
155 ('number of processes', resource.getrlimit(resource.RLIMIT_NPROC)),
156 ('open files', resource.getrlimit(resource.RLIMIT_NOFILE)),
157 ])
158
159 text = ', '.join('{}:{}'.format(k, v) for k, v in data.items())
160
161 value = {
162 'limits': data,
163 'text': text,
164 }
165 return SysInfoRes(value=value)
166
167
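# Minimal sketch of the data ulimit_info() collects: resource.getrlimit()
# returns a (soft, hard) tuple per limit, with resource.RLIM_INFINITY meaning
# "unlimited". Unix-only, like the code above.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print('open files: soft={} hard={}'.format(
    soft, 'unlimited' if hard == resource.RLIM_INFINITY else hard))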
143 def uptime():
168 def uptime():
144 from rhodecode.lib.helpers import age, time_to_datetime
169 from rhodecode.lib.helpers import age, time_to_datetime
145 from rhodecode.translation import TranslationString
170 from rhodecode.translation import TranslationString
@@ -687,6 +712,7 b' def usage_info():'
687 return SysInfoRes(value=value)
712 return SysInfoRes(value=value)
688
713
689
714
715
690 def get_system_info(environ):
716 def get_system_info(environ):
691 environ = environ or {}
717 environ = environ or {}
692 return {
718 return {
@@ -699,7 +725,7 b' def get_system_info(environ):'
699 'platform': SysInfo(platform_type)(),
725 'platform': SysInfo(platform_type)(),
700 'server': SysInfo(server_info, environ=environ)(),
726 'server': SysInfo(server_info, environ=environ)(),
701 'database': SysInfo(database_info)(),
727 'database': SysInfo(database_info)(),
702
728 'ulimit': SysInfo(ulimit_info)(),
703 'storage': SysInfo(storage)(),
729 'storage': SysInfo(storage)(),
704 'storage_inodes': SysInfo(storage_inodes)(),
730 'storage_inodes': SysInfo(storage_inodes)(),
705 'storage_archive': SysInfo(storage_archives)(),
731 'storage_archive': SysInfo(storage_archives)(),
@@ -134,13 +134,17 b' def get_user_group_slug(request):'
134 elif getattr(request, 'matchdict', None):
134 elif getattr(request, 'matchdict', None):
135 # pyramid
135 # pyramid
136 _user_group = request.matchdict.get('user_group_id')
136 _user_group = request.matchdict.get('user_group_id')
137
137 _user_group_name = request.matchdict.get('user_group_name')
138 try:
138 try:
139 _user_group = UserGroup.get(_user_group)
139 if _user_group:
140 _user_group = UserGroup.get(_user_group)
141 elif _user_group_name:
142 _user_group = UserGroup.get_by_group_name(_user_group_name)
143
140 if _user_group:
144 if _user_group:
141 _user_group = _user_group.users_group_name
145 _user_group = _user_group.users_group_name
142 except Exception:
146 except Exception:
143 log.exception('Failed to get user group by id')
147 log.exception('Failed to get user group by id or name')
144 # catch all failures here
148 # catch all failures here
145 return None
149 return None
146
150
@@ -352,11 +356,10 b' def config_data_from_db(clear_session=Tr'
352
356
353 ui_settings = settings_model.get_ui_settings()
357 ui_settings = settings_model.get_ui_settings()
354
358
359 ui_data = []
355 for setting in ui_settings:
360 for setting in ui_settings:
356 if setting.active:
361 if setting.active:
357 log.debug(
362 ui_data.append((setting.section, setting.key, setting.value))
358 'settings ui from db: [%s] %s=%s',
359 setting.section, setting.key, setting.value)
360 config.append((
363 config.append((
361 safe_str(setting.section), safe_str(setting.key),
364 safe_str(setting.section), safe_str(setting.key),
362 safe_str(setting.value)))
365 safe_str(setting.value)))
@@ -365,6 +368,9 b' def config_data_from_db(clear_session=Tr'
365 # handles that
368 # handles that
366 config.append((
369 config.append((
367 safe_str(setting.section), safe_str(setting.key), False))
370 safe_str(setting.section), safe_str(setting.key), False))
371 log.debug(
372 'settings ui from db: %s',
373 ','.join(map(lambda s: '[{}] {}={}'.format(*s), ui_data)))
368 if clear_session:
374 if clear_session:
369 meta.Session.remove()
375 meta.Session.remove()
370
376
@@ -508,7 +514,6 b' def repo2db_mapper(initial_repo_list, re'
508 :param remove_obsolete: check for obsolete entries in database
514 :param remove_obsolete: check for obsolete entries in database
509 """
515 """
510 from rhodecode.model.repo import RepoModel
516 from rhodecode.model.repo import RepoModel
511 from rhodecode.model.scm import ScmModel
512 from rhodecode.model.repo_group import RepoGroupModel
517 from rhodecode.model.repo_group import RepoGroupModel
513 from rhodecode.model.settings import SettingsModel
518 from rhodecode.model.settings import SettingsModel
514
519
@@ -560,9 +565,8 b' def repo2db_mapper(initial_repo_list, re'
560
565
561 config = db_repo._config
566 config = db_repo._config
562 config.set('extensions', 'largefiles', '')
567 config.set('extensions', 'largefiles', '')
563 ScmModel().install_hooks(
568 repo = db_repo.scm_instance(config=config)
564 db_repo.scm_instance(config=config),
569 repo.install_hooks()
565 repo_type=db_repo.repo_type)
566
570
567 removed = []
571 removed = []
568 if remove_obsolete:
572 if remove_obsolete:
@@ -709,7 +709,19 b' def extract_mentioned_users(s):'
709 return sorted(list(usrs), key=lambda k: k.lower())
709 return sorted(list(usrs), key=lambda k: k.lower())
710
710
711
711
712 class StrictAttributeDict(dict):
712 class AttributeDictBase(dict):
713 def __getstate__(self):
714 odict = self.__dict__ # get attribute dictionary
715 return odict
716
717 def __setstate__(self, state):
718 self.__dict__ = state
719
720 __setattr__ = dict.__setitem__
721 __delattr__ = dict.__delitem__
722
723
724 class StrictAttributeDict(AttributeDictBase):
713 """
725 """
714 Strict Version of Attribute dict which raises an Attribute error when
726 Strict Version of Attribute dict which raises an Attribute error when
715 requested attribute is not set
727 requested attribute is not set
@@ -720,15 +732,12 b' class StrictAttributeDict(dict):'
720 except KeyError:
732 except KeyError:
721 raise AttributeError('%s object has no attribute %s' % (
733 raise AttributeError('%s object has no attribute %s' % (
722 self.__class__, attr))
734 self.__class__, attr))
723 __setattr__ = dict.__setitem__
724 __delattr__ = dict.__delitem__
725
735
726
736
727 class AttributeDict(dict):
737 class AttributeDict(AttributeDictBase):
728 def __getattr__(self, attr):
738 def __getattr__(self, attr):
729 return self.get(attr, None)
739 return self.get(attr, None)
730 __setattr__ = dict.__setitem__
740
731 __delattr__ = dict.__delitem__
732
741
733
742
734 def fix_PATH(os_=None):
743 def fix_PATH(os_=None):
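# Standalone sketch of why __getstate__/__setstate__ were pulled into
# AttributeDictBase above: the attribute-style dicts become safely picklable.
# This is a minimal re-implementation for illustration only.
import pickle

class AttributeDict(dict):
    def __getattr__(self, attr):
        return self.get(attr, None)
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

    def __getstate__(self):
        return self.__dict__

    def __setstate__(self, state):
        self.__dict__ = state

data = AttributeDict(repo_name='example', private=True)
restored = pickle.loads(pickle.dumps(data))
print('{} {}'.format(restored.repo_name, restored.private))  # -> example True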
@@ -24,9 +24,11 b' Base module for all VCS systems'
24
24
25 import collections
25 import collections
26 import datetime
26 import datetime
27 import fnmatch
27 import itertools
28 import itertools
28 import logging
29 import logging
29 import os
30 import os
31 import re
30 import time
32 import time
31 import warnings
33 import warnings
32
34
@@ -173,6 +175,7 b' class BaseRepository(object):'
173 EMPTY_COMMIT_ID = '0' * 40
175 EMPTY_COMMIT_ID = '0' * 40
174
176
175 path = None
177 path = None
178 _remote = None
176
179
177 def __init__(self, repo_path, config=None, create=False, **kwargs):
180 def __init__(self, repo_path, config=None, create=False, **kwargs):
178 """
181 """
@@ -203,6 +206,12 b' class BaseRepository(object):'
203 def __ne__(self, other):
206 def __ne__(self, other):
204 return not self.__eq__(other)
207 return not self.__eq__(other)
205
208
209 def get_create_shadow_cache_pr_path(self, db_repo):
210 path = db_repo.cached_diffs_dir
211 if not os.path.exists(path):
212 os.makedirs(path, 0755)
213 return path
214
206 @classmethod
215 @classmethod
207 def get_default_config(cls, default=None):
216 def get_default_config(cls, default=None):
208 config = Config()
217 config = Config()
@@ -249,6 +258,20 b' class BaseRepository(object):'
249 raise NotImplementedError
258 raise NotImplementedError
250
259
251 @LazyProperty
260 @LazyProperty
261 def branches_closed(self):
262 """
263 A `dict` which maps closed branch names to commit ids.
264 """
265 raise NotImplementedError
266
267 @LazyProperty
268 def bookmarks(self):
269 """
270 A `dict` which maps bookmark names to commit ids.
271 """
272 raise NotImplementedError
273
274 @LazyProperty
252 def tags(self):
275 def tags(self):
253 """
276 """
254 A `dict` which maps tag names to commit ids.
277 A `dict` which maps tag names to commit ids.
@@ -623,6 +646,18 b' class BaseRepository(object):'
623 warnings.warn("Use in_memory_commit instead", DeprecationWarning)
646 warnings.warn("Use in_memory_commit instead", DeprecationWarning)
624 return self.in_memory_commit
647 return self.in_memory_commit
625
648
649 def get_path_permissions(self, username):
650 """
651 Returns a path permission checker or None if not supported
652
653 :param username: session user name
654 :return: an instance of BasePathPermissionChecker or None
655 """
656 return None
657
658 def install_hooks(self, force=False):
659 return self._remote.install_hooks(force)
660
626
661
627 class BaseCommit(object):
662 class BaseCommit(object):
628 """
663 """
@@ -716,9 +751,15 b' class BaseCommit(object):'
716 'branch': self.branch
751 'branch': self.branch
717 }
752 }
718
753
754 def __getstate__(self):
755 d = self.__dict__.copy()
756 d.pop('_remote', None)
757 d.pop('repository', None)
758 return d
759
719 def _get_refs(self):
760 def _get_refs(self):
720 return {
761 return {
721 'branches': [self.branch],
762 'branches': [self.branch] if self.branch else [],
722 'bookmarks': getattr(self, 'bookmarks', []),
763 'bookmarks': getattr(self, 'bookmarks', []),
723 'tags': self.tags
764 'tags': self.tags
724 }
765 }
@@ -1604,3 +1645,66 b' class DiffChunk(object):'
1604 self.header = match.groupdict()
1645 self.header = match.groupdict()
1605 self.diff = chunk[match.end():]
1646 self.diff = chunk[match.end():]
1606 self.raw = chunk
1647 self.raw = chunk
1648
1649
1650 class BasePathPermissionChecker(object):
1651
1652 @staticmethod
1653 def create_from_patterns(includes, excludes):
1654 if includes and '*' in includes and not excludes:
1655 return AllPathPermissionChecker()
1656 elif excludes and '*' in excludes:
1657 return NonePathPermissionChecker()
1658 else:
1659 return PatternPathPermissionChecker(includes, excludes)
1660
1661 @property
1662 def has_full_access(self):
1663 raise NotImplementedError()
1664
1665 def has_access(self, path):
1666 raise NotImplementedError()
1667
1668
1669 class AllPathPermissionChecker(BasePathPermissionChecker):
1670
1671 @property
1672 def has_full_access(self):
1673 return True
1674
1675 def has_access(self, path):
1676 return True
1677
1678
1679 class NonePathPermissionChecker(BasePathPermissionChecker):
1680
1681 @property
1682 def has_full_access(self):
1683 return False
1684
1685 def has_access(self, path):
1686 return False
1687
1688
1689 class PatternPathPermissionChecker(BasePathPermissionChecker):
1690
1691 def __init__(self, includes, excludes):
1692 self.includes = includes
1693 self.excludes = excludes
1694 self.includes_re = [] if not includes else [
1695 re.compile(fnmatch.translate(pattern)) for pattern in includes]
1696 self.excludes_re = [] if not excludes else [
1697 re.compile(fnmatch.translate(pattern)) for pattern in excludes]
1698
1699 @property
1700 def has_full_access(self):
1701 return '*' in self.includes and not self.excludes
1702
1703 def has_access(self, path):
1704 for regex in self.excludes_re:
1705 if regex.match(path):
1706 return False
1707 for regex in self.includes_re:
1708 if regex.match(path):
1709 return True
1710 return False
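# Standalone sketch of the include/exclude resolution performed by
# PatternPathPermissionChecker above: excludes win, then includes, otherwise
# access is denied. The patterns and paths below are made up.
import fnmatch
import re

includes = ['docs/*', 'src/*']
excludes = ['src/secret/*']
includes_re = [re.compile(fnmatch.translate(p)) for p in includes]
excludes_re = [re.compile(fnmatch.translate(p)) for p in excludes]

def has_access(path):
    if any(rx.match(path) for rx in excludes_re):
        return False
    return any(rx.match(path) for rx in includes_re)

for p in ('docs/index.rst', 'src/secret/key.pem', 'README'):
    print('{}: {}'.format(p, has_access(p)))
# docs/index.rst: True, src/secret/key.pem: False, README: False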
@@ -28,7 +28,6 b' from itertools import chain'
28 from StringIO import StringIO
28 from StringIO import StringIO
29
29
30 from zope.cachedescriptors.property import Lazy as LazyProperty
30 from zope.cachedescriptors.property import Lazy as LazyProperty
31 from pyramid.compat import configparser
32
31
33 from rhodecode.lib.datelib import utcdate_fromtimestamp
32 from rhodecode.lib.datelib import utcdate_fromtimestamp
34 from rhodecode.lib.utils import safe_unicode, safe_str
33 from rhodecode.lib.utils import safe_unicode, safe_str
@@ -40,6 +39,7 b' from rhodecode.lib.vcs.nodes import ('
40 FileNode, DirNode, NodeKind, RootNode, SubModuleNode,
39 FileNode, DirNode, NodeKind, RootNode, SubModuleNode,
41 ChangedFileNodesGenerator, AddedFileNodesGenerator,
40 ChangedFileNodesGenerator, AddedFileNodesGenerator,
42 RemovedFileNodesGenerator, LargeFileNode)
41 RemovedFileNodesGenerator, LargeFileNode)
42 from rhodecode.lib.vcs.compat import configparser
43
43
44
44
45 class GitCommit(base.BaseCommit):
45 class GitCommit(base.BaseCommit):
@@ -71,8 +71,6 b' class GitRepository(BaseRepository):'
71 # caches
71 # caches
72 self._commit_ids = {}
72 self._commit_ids = {}
73
73
74 self.bookmarks = {}
75
76 @LazyProperty
74 @LazyProperty
77 def bare(self):
75 def bare(self):
78 return self._remote.bare()
76 return self._remote.bare()
@@ -323,6 +321,10 b' class GitRepository(BaseRepository):'
323 return {}
321 return {}
324
322
325 @LazyProperty
323 @LazyProperty
324 def bookmarks(self):
325 return {}
326
327 @LazyProperty
326 def branches_all(self):
328 def branches_all(self):
327 all_branches = {}
329 all_branches = {}
328 all_branches.update(self.branches)
330 all_branches.update(self.branches)
@@ -791,7 +793,7 b' class GitRepository(BaseRepository):'
791 def _get_shadow_instance(self, shadow_repository_path, enable_hooks=False):
793 def _get_shadow_instance(self, shadow_repository_path, enable_hooks=False):
792 return GitRepository(shadow_repository_path)
794 return GitRepository(shadow_repository_path)
793
795
794 def _local_pull(self, repository_path, branch_name):
796 def _local_pull(self, repository_path, branch_name, ff_only=True):
795 """
797 """
796 Pull a branch from a local repository.
798 Pull a branch from a local repository.
797 """
799 """
@@ -802,7 +804,10 b' class GitRepository(BaseRepository):'
802 # conflicts with our current branch)
804 # conflicts with our current branch)
803 # Additionally, that option needs to go before --no-tags, otherwise git
805 # Additionally, that option needs to go before --no-tags, otherwise git
804 # pull complains about it being an unknown flag.
806 # pull complains about it being an unknown flag.
805 cmd = ['pull', '--ff-only', '--no-tags', repository_path, branch_name]
807 cmd = ['pull']
808 if ff_only:
809 cmd.append('--ff-only')
810 cmd.extend(['--no-tags', repository_path, branch_name])
806 self.run_git_command(cmd, fail_on_stderr=False)
811 self.run_git_command(cmd, fail_on_stderr=False)
807
812
808 def _local_merge(self, merge_message, user_name, user_email, heads):
813 def _local_merge(self, merge_message, user_name, user_email, heads):
@@ -915,6 +920,7 b' class GitRepository(BaseRepository):'
915
920
916 pr_branch = shadow_repo._get_new_pr_branch(
921 pr_branch = shadow_repo._get_new_pr_branch(
917 source_ref.name, target_ref.name)
922 source_ref.name, target_ref.name)
923 log.debug('using pull-request merge branch: `%s`', pr_branch)
918 shadow_repo._checkout(pr_branch, create=True)
924 shadow_repo._checkout(pr_branch, create=True)
919 try:
925 try:
920 shadow_repo._local_fetch(source_repo.path, source_ref.name)
926 shadow_repo._local_fetch(source_repo.path, source_ref.name)
@@ -21,10 +21,9 b''
21 """
21 """
22 HG repository module
22 HG repository module
23 """
23 """
24
24 import os
25 import logging
25 import logging
26 import binascii
26 import binascii
27 import os
28 import shutil
27 import shutil
29 import urllib
28 import urllib
30
29
@@ -32,19 +31,19 b' from zope.cachedescriptors.property impo'
32
31
33 from rhodecode.lib.compat import OrderedDict
32 from rhodecode.lib.compat import OrderedDict
34 from rhodecode.lib.datelib import (
33 from rhodecode.lib.datelib import (
35 date_to_timestamp_plus_offset, utcdate_fromtimestamp, makedate,
34 date_to_timestamp_plus_offset, utcdate_fromtimestamp, makedate)
36 date_astimestamp)
37 from rhodecode.lib.utils import safe_unicode, safe_str
35 from rhodecode.lib.utils import safe_unicode, safe_str
38 from rhodecode.lib.vcs import connection
36 from rhodecode.lib.vcs import connection, exceptions
39 from rhodecode.lib.vcs.backends.base import (
37 from rhodecode.lib.vcs.backends.base import (
40 BaseRepository, CollectionGenerator, Config, MergeResponse,
38 BaseRepository, CollectionGenerator, Config, MergeResponse,
41 MergeFailureReason, Reference)
39 MergeFailureReason, Reference, BasePathPermissionChecker)
42 from rhodecode.lib.vcs.backends.hg.commit import MercurialCommit
40 from rhodecode.lib.vcs.backends.hg.commit import MercurialCommit
43 from rhodecode.lib.vcs.backends.hg.diff import MercurialDiff
41 from rhodecode.lib.vcs.backends.hg.diff import MercurialDiff
44 from rhodecode.lib.vcs.backends.hg.inmemory import MercurialInMemoryCommit
42 from rhodecode.lib.vcs.backends.hg.inmemory import MercurialInMemoryCommit
45 from rhodecode.lib.vcs.exceptions import (
43 from rhodecode.lib.vcs.exceptions import (
46 EmptyRepositoryError, RepositoryError, TagAlreadyExistError,
44 EmptyRepositoryError, RepositoryError, TagAlreadyExistError,
47 TagDoesNotExistError, CommitDoesNotExistError, SubrepoMergeError)
45 TagDoesNotExistError, CommitDoesNotExistError, SubrepoMergeError)
46 from rhodecode.lib.vcs.compat import configparser
48
47
49 hexlify = binascii.hexlify
48 hexlify = binascii.hexlify
50 nullid = "\0" * 20
49 nullid = "\0" * 20
@@ -778,6 +777,7 b' class MercurialRepository(BaseRepository'
778 else:
777 else:
779 merge_possible = True
778 merge_possible = True
780
779
780 needs_push = False
781 if merge_possible:
781 if merge_possible:
782 try:
782 try:
783 merge_commit_id, needs_push = shadow_repo._local_merge(
783 merge_commit_id, needs_push = shadow_repo._local_merge(
@@ -891,6 +891,43 b' class MercurialRepository(BaseRepository'
891 self._remote.bookmark(bookmark, revision=revision)
891 self._remote.bookmark(bookmark, revision=revision)
892 self._remote.invalidate_vcs_cache()
892 self._remote.invalidate_vcs_cache()
893
893
894 def get_path_permissions(self, username):
895 hgacl_file = os.path.join(self.path, '.hg/hgacl')
896
897 def read_patterns(suffix):
898 svalue = None
899 try:
900 svalue = hgacl.get('narrowhgacl', username + suffix)
901 except configparser.NoOptionError:
902 try:
903 svalue = hgacl.get('narrowhgacl', 'default' + suffix)
904 except configparser.NoOptionError:
905 pass
906 if not svalue:
907 return None
908 result = ['/']
909 for pattern in svalue.split():
910 result.append(pattern)
911 if '*' not in pattern and '?' not in pattern:
912 result.append(pattern + '/*')
913 return result
914
915 if os.path.exists(hgacl_file):
916 try:
917 hgacl = configparser.RawConfigParser()
918 hgacl.read(hgacl_file)
919
920 includes = read_patterns('.includes')
921 excludes = read_patterns('.excludes')
922 return BasePathPermissionChecker.create_from_patterns(
923 includes, excludes)
924 except BaseException as e:
925 msg = 'Cannot read ACL settings from {} on {}: {}'.format(
926 hgacl_file, self.name, e)
927 raise exceptions.RepositoryRequirementError(msg)
928 else:
929 return None
930
894
931
895 class MercurialIndexBasedCollectionGenerator(CollectionGenerator):
932 class MercurialIndexBasedCollectionGenerator(CollectionGenerator):
896
933
@@ -77,8 +77,6 b' class SubversionRepository(base.BaseRepo'
77
77
78 self._init_repo(create, src_url)
78 self._init_repo(create, src_url)
79
79
80 self.bookmarks = {}
81
82 def _init_repo(self, create, src_url):
80 def _init_repo(self, create, src_url):
83 if create and os.path.exists(self.path):
81 if create and os.path.exists(self.path):
84 raise RepositoryError(
82 raise RepositoryError(
@@ -107,6 +105,10 b' class SubversionRepository(base.BaseRepo'
107 return {}
105 return {}
108
106
109 @LazyProperty
107 @LazyProperty
108 def bookmarks(self):
109 return {}
110
111 @LazyProperty
110 def branches_all(self):
112 def branches_all(self):
111 # TODO: johbo: Implement proper branch support
113 # TODO: johbo: Implement proper branch support
112 all_branches = {}
114 all_branches = {}
@@ -197,6 +197,13 b' def _remote_call(url, payload, exception'
197 response = session.post(url, data=msgpack.packb(payload))
197 response = session.post(url, data=msgpack.packb(payload))
198 except pycurl.error as e:
198 except pycurl.error as e:
199 raise exceptions.HttpVCSCommunicationError(e)
199 raise exceptions.HttpVCSCommunicationError(e)
200 except Exception as e:
201 message = getattr(e, 'message', '')
202 if 'Failed to connect' in message:
203 # gevent doesn't return proper pycurl errors
204 raise exceptions.HttpVCSCommunicationError(e)
205 else:
206 raise
200
207
201 if response.status_code >= 400:
208 if response.status_code >= 400:
202 log.error('Call to %s returned non 200 HTTP code: %s',
209 log.error('Call to %s returned non 200 HTTP code: %s',
@@ -1344,6 +1344,15 b' class UserGroup(Base, BaseModel):'
1344 except Exception:
1344 except Exception:
1345 log.error(traceback.format_exc())
1345 log.error(traceback.format_exc())
1346
1346
1347 @classmethod
1348 def _load_sync(cls, group_data):
1349 if group_data:
1350 return group_data.get('extern_type')
1351
1352 @property
1353 def sync(self):
1354 return self._load_sync(self.group_data)
1355
1347 def __unicode__(self):
1356 def __unicode__(self):
1348 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
1357 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
1349 self.users_group_id,
1358 self.users_group_id,
@@ -1453,6 +1462,7 b' class UserGroup(Base, BaseModel):'
1453 'group_description': user_group.user_group_description,
1462 'group_description': user_group.user_group_description,
1454 'active': user_group.users_group_active,
1463 'active': user_group.users_group_active,
1455 'owner': user_group.user.username,
1464 'owner': user_group.user.username,
1465 'sync': user_group.sync,
1456 'owner_email': user_group.user.email,
1466 'owner_email': user_group.user.email,
1457 }
1467 }
1458
1468
@@ -1557,6 +1567,9 b' class Repository(Base, BaseModel):'
1557 clone_uri = Column(
1567 clone_uri = Column(
1558 "clone_uri", EncryptedTextValue(), nullable=True, unique=False,
1568 "clone_uri", EncryptedTextValue(), nullable=True, unique=False,
1559 default=None)
1569 default=None)
1570 push_uri = Column(
1571 "push_uri", EncryptedTextValue(), nullable=True, unique=False,
1572 default=None)
1560 repo_type = Column(
1573 repo_type = Column(
1561 "repo_type", String(255), nullable=False, unique=False, default=None)
1574 "repo_type", String(255), nullable=False, unique=False, default=None)
1562 user_id = Column(
1575 user_id = Column(
@@ -1846,6 +1859,30 b' class Repository(Base, BaseModel):'
1846 .order_by(CacheKey.cache_key)\
1859 .order_by(CacheKey.cache_key)\
1847 .all()
1860 .all()
1848
1861
1862 @property
1863 def cached_diffs_relative_dir(self):
1864 """
1865 Return the cached diffs path relative to the repository store. This is
1866 used for safe display to users, who shouldn't know the absolute store
1867 path.
1868 """
1869 return os.path.join(
1870 os.path.dirname(self.repo_name),
1871 self.cached_diffs_dir.split(os.path.sep)[-1])
1872
1873 @property
1874 def cached_diffs_dir(self):
1875 path = self.repo_full_path
1876 return os.path.join(
1877 os.path.dirname(path),
1878 '.__shadow_diff_cache_repo_{}'.format(self.repo_id))
1879
1880 def cached_diffs(self):
1881 diff_cache_dir = self.cached_diffs_dir
1882 if os.path.isdir(diff_cache_dir):
1883 return os.listdir(diff_cache_dir)
1884 return []
1885
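# Standalone sketch of the cached-diffs path scheme added above: the cache
# directory sits next to the repository as '.__shadow_diff_cache_repo_<id>',
# and the relative form keeps only the group plus the directory name so the
# absolute store path is never exposed. Paths and the id below are invented.
import os

repo_full_path = '/srv/rhodecode/repos/mygroup/myrepo'
repo_name = 'mygroup/myrepo'
repo_id = 7

cached_diffs_dir = os.path.join(
    os.path.dirname(repo_full_path),
    '.__shadow_diff_cache_repo_{}'.format(repo_id))
cached_diffs_relative_dir = os.path.join(
    os.path.dirname(repo_name),
    cached_diffs_dir.split(os.path.sep)[-1])

print(cached_diffs_dir)           # /srv/rhodecode/repos/mygroup/.__shadow_diff_cache_repo_7
print(cached_diffs_relative_dir)  # mygroup/.__shadow_diff_cache_repo_7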
1849 def get_new_name(self, repo_name):
1886 def get_new_name(self, repo_name):
1850 """
1887 """
1851 returns new full repository name based on assigned group and new name
1888 returns new full repository name based on assigned group and new name
@@ -1943,6 +1980,7 b' class Repository(Base, BaseModel):'
1943 'repo_name': repo.repo_name,
1980 'repo_name': repo.repo_name,
1944 'repo_type': repo.repo_type,
1981 'repo_type': repo.repo_type,
1945 'clone_uri': repo.clone_uri or '',
1982 'clone_uri': repo.clone_uri or '',
1983 'push_uri': repo.push_uri or '',
1946 'url': RepoModel().get_url(self),
1984 'url': RepoModel().get_url(self),
1947 'private': repo.private,
1985 'private': repo.private,
1948 'created_on': repo.created_on,
1986 'created_on': repo.created_on,
@@ -2077,6 +2115,16 b' class Repository(Base, BaseModel):'
2077 clone_uri = url_obj.with_password('*****')
2115 clone_uri = url_obj.with_password('*****')
2078 return clone_uri
2116 return clone_uri
2079
2117
2118 @property
2119 def push_uri_hidden(self):
2120 push_uri = self.push_uri
2121 if push_uri:
2122 import urlobject
2123 url_obj = urlobject.URLObject(cleaned_uri(push_uri))
2124 if url_obj.password:
2125 push_uri = url_obj.with_password('*****')
2126 return push_uri
2127
2080 def clone_url(self, **override):
2128 def clone_url(self, **override):
2081 from rhodecode.model.settings import SettingsModel
2129 from rhodecode.model.settings import SettingsModel
2082
2130
@@ -375,6 +375,7 b' def ApplicationSettingsForm(localizer):'
375
375
376
376
377 def ApplicationVisualisationForm(localizer):
377 def ApplicationVisualisationForm(localizer):
378 from rhodecode.model.db import Repository
378 _ = localizer
379 _ = localizer
379
380
380 class _ApplicationVisualisationForm(formencode.Schema):
381 class _ApplicationVisualisationForm(formencode.Schema):
@@ -392,8 +393,8 b' def ApplicationVisualisationForm(localiz'
392 rhodecode_use_gravatar = v.StringBoolean(if_missing=False)
393 rhodecode_use_gravatar = v.StringBoolean(if_missing=False)
393 rhodecode_markup_renderer = v.OneOf(['markdown', 'rst'])
394 rhodecode_markup_renderer = v.OneOf(['markdown', 'rst'])
394 rhodecode_gravatar_url = v.UnicodeString(min=3)
395 rhodecode_gravatar_url = v.UnicodeString(min=3)
395 rhodecode_clone_uri_tmpl = v.UnicodeString(min=3)
396 rhodecode_clone_uri_tmpl = v.UnicodeString(not_empty=False, if_empty=Repository.DEFAULT_CLONE_URI)
396 rhodecode_clone_uri_ssh_tmpl = v.UnicodeString(min=3)
397 rhodecode_clone_uri_ssh_tmpl = v.UnicodeString(not_empty=False, if_empty=Repository.DEFAULT_CLONE_URI_SSH)
397 rhodecode_support_url = v.UnicodeString()
398 rhodecode_support_url = v.UnicodeString()
398 rhodecode_show_revision_number = v.StringBoolean(if_missing=False)
399 rhodecode_show_revision_number = v.StringBoolean(if_missing=False)
399 rhodecode_show_sha_length = v.Int(min=4, not_empty=True)
400 rhodecode_show_sha_length = v.Int(min=4, not_empty=True)
@@ -429,6 +430,9 b' class _BaseVcsSettingsForm(formencode.Sc'
429 vcs_svn_proxy_http_requests_enabled = v.StringBoolean(if_missing=False)
430 vcs_svn_proxy_http_requests_enabled = v.StringBoolean(if_missing=False)
430 vcs_svn_proxy_http_server_url = v.UnicodeString(strip=True, if_missing=None)
431 vcs_svn_proxy_http_server_url = v.UnicodeString(strip=True, if_missing=None)
431
432
433 # cache
434 rhodecode_diff_cache = v.StringBoolean(if_missing=False)
435
432
436
433 def ApplicationUiSettingsForm(localizer):
437 def ApplicationUiSettingsForm(localizer):
434 _ = localizer
438 _ = localizer
@@ -1063,7 +1063,7 b' class PullRequestModel(BaseModel):'
1063 repo_name=safe_str(pull_request.target_repo.repo_name),
1063 repo_name=safe_str(pull_request.target_repo.repo_name),
1064 pull_request_id=pull_request.pull_request_id,)
1064 pull_request_id=pull_request.pull_request_id,)
1065
1065
1066 def get_shadow_clone_url(self, pull_request):
1066 def get_shadow_clone_url(self, pull_request, request=None):
1067 """
1067 """
1068 Returns qualified url pointing to the shadow repository. If this pull
1068 Returns qualified url pointing to the shadow repository. If this pull
1069 request is closed there is no shadow repository and ``None`` will be
1069 request is closed there is no shadow repository and ``None`` will be
@@ -1072,7 +1072,7 b' class PullRequestModel(BaseModel):'
1072 if pull_request.is_closed():
1072 if pull_request.is_closed():
1073 return None
1073 return None
1074 else:
1074 else:
1075 pr_url = urllib.unquote(self.get_url(pull_request))
1075 pr_url = urllib.unquote(self.get_url(pull_request, request=request))
1076 return safe_unicode('{pr_url}/repository'.format(pr_url=pr_url))
1076 return safe_unicode('{pr_url}/repository'.format(pr_url=pr_url))
1077
1077
1078 def notify_reviewers(self, pull_request, reviewers_ids):
1078 def notify_reviewers(self, pull_request, reviewers_ids):
@@ -304,6 +304,7 b' class RepoModel(BaseModel):'
304 {'k': 'repo_enable_locking', 'strip': True},
304 {'k': 'repo_enable_locking', 'strip': True},
305 {'k': 'repo_landing_rev', 'strip': True},
305 {'k': 'repo_landing_rev', 'strip': True},
306 {'k': 'clone_uri', 'strip': False},
306 {'k': 'clone_uri', 'strip': False},
307 {'k': 'push_uri', 'strip': False},
307 {'k': 'repo_private', 'strip': True},
308 {'k': 'repo_private', 'strip': True},
308 {'k': 'repo_enable_statistics', 'strip': True}
309 {'k': 'repo_enable_statistics', 'strip': True}
309 )
310 )
@@ -319,6 +320,8 b' class RepoModel(BaseModel):'
319 defaults[item['k']] = val
320 defaults[item['k']] = val
320 if item['k'] == 'clone_uri':
321 if item['k'] == 'clone_uri':
321 defaults['clone_uri_hidden'] = repo_info.clone_uri_hidden
322 defaults['clone_uri_hidden'] = repo_info.clone_uri_hidden
323 if item['k'] == 'push_uri':
324 defaults['push_uri_hidden'] = repo_info.push_uri_hidden
322
325
323 # fill owner
326 # fill owner
324 if repo_info.user:
327 if repo_info.user:
@@ -348,6 +351,7 b' class RepoModel(BaseModel):'
348 (1, 'repo_enable_locking'),
351 (1, 'repo_enable_locking'),
349 (1, 'repo_enable_statistics'),
352 (1, 'repo_enable_statistics'),
350 (0, 'clone_uri'),
353 (0, 'clone_uri'),
354 (0, 'push_uri'),
351 (0, 'fork_id')
355 (0, 'fork_id')
352 ]
356 ]
353 for strip, k in update_keys:
357 for strip, k in update_keys:
@@ -854,7 +858,7 b' class RepoModel(BaseModel):'
854 repo = backend(
858 repo = backend(
855 repo_path, config=config, create=True, src_url=clone_uri)
859 repo_path, config=config, create=True, src_url=clone_uri)
856
860
857 ScmModel().install_hooks(repo, repo_type=repo_type)
861 repo.install_hooks()
858
862
859 log.debug('Created repo %s with %s backend',
863 log.debug('Created repo %s with %s backend',
860 safe_unicode(repo_name), safe_unicode(repo_type))
864 safe_unicode(repo_name), safe_unicode(repo_type))
@@ -915,6 +919,11 b' class RepoModel(BaseModel):'
915 if os.path.isdir(rm_path):
919 if os.path.isdir(rm_path):
916 shutil.move(rm_path, os.path.join(self.repos_path, _d))
920 shutil.move(rm_path, os.path.join(self.repos_path, _d))
917
921
922 # finally cleanup diff-cache if it exists
923 cached_diffs_dir = repo.cached_diffs_dir
924 if os.path.isdir(cached_diffs_dir):
925 shutil.rmtree(cached_diffs_dir)
926
918
927
919 class ReadmeFinder:
928 class ReadmeFinder:
920 """
929 """
@@ -397,7 +397,7 b' class ScmModel(BaseModel):'
397
397
398 def push_changes(self, repo, username, remote_uri=None):
398 def push_changes(self, repo, username, remote_uri=None):
399 dbrepo = self._get_repo(repo)
399 dbrepo = self._get_repo(repo)
400 remote_uri = remote_uri or dbrepo.clone_uri
400 remote_uri = remote_uri or dbrepo.push_uri
401 if not remote_uri:
401 if not remote_uri:
402 raise Exception("This repository doesn't have a clone uri")
402 raise Exception("This repository doesn't have a clone uri")
403
403
@@ -807,116 +807,6 b' class ScmModel(BaseModel):'
807
807
808 return choices, hist_l
808 return choices, hist_l
809
809
810 def install_git_hook(self, repo, force_create=False):
811 """
812 Creates a rhodecode hook inside a git repository
813
814 :param repo: Instance of VCS repo
815 :param force_create: Create even if same name hook exists
816 """
817
818 loc = os.path.join(repo.path, 'hooks')
819 if not repo.bare:
820 loc = os.path.join(repo.path, '.git', 'hooks')
821 if not os.path.isdir(loc):
822 os.makedirs(loc, mode=0777)
823
824 tmpl_post = pkg_resources.resource_string(
825 'rhodecode', '/'.join(
826 ('config', 'hook_templates', 'git_post_receive.py.tmpl')))
827 tmpl_pre = pkg_resources.resource_string(
828 'rhodecode', '/'.join(
829 ('config', 'hook_templates', 'git_pre_receive.py.tmpl')))
830
831 for h_type, tmpl in [('pre', tmpl_pre), ('post', tmpl_post)]:
832 _hook_file = os.path.join(loc, '%s-receive' % h_type)
833 log.debug('Installing git hook in repo %s', repo)
834 _rhodecode_hook = _check_rhodecode_hook(_hook_file)
835
836 if _rhodecode_hook or force_create:
837 log.debug('writing %s hook file !', h_type)
838 try:
839 with open(_hook_file, 'wb') as f:
840 tmpl = tmpl.replace('_TMPL_', rhodecode.__version__)
841 tmpl = tmpl.replace('_ENV_', sys.executable)
842 f.write(tmpl)
843 os.chmod(_hook_file, 0755)
844 except IOError:
845 log.exception('error writing hook file %s', _hook_file)
846 else:
847 log.debug('skipping writing hook file')
848
849 def install_svn_hooks(self, repo, force_create=False):
850 """
851 Creates rhodecode hooks inside a svn repository
852
853 :param repo: Instance of VCS repo
854 :param force_create: Create even if same name hook exists
855 """
856 hooks_path = os.path.join(repo.path, 'hooks')
857 if not os.path.isdir(hooks_path):
858 os.makedirs(hooks_path)
859 post_commit_tmpl = pkg_resources.resource_string(
860 'rhodecode', '/'.join(
861 ('config', 'hook_templates', 'svn_post_commit_hook.py.tmpl')))
862 pre_commit_template = pkg_resources.resource_string(
863 'rhodecode', '/'.join(
864 ('config', 'hook_templates', 'svn_pre_commit_hook.py.tmpl')))
865 templates = {
866 'post-commit': post_commit_tmpl,
867 'pre-commit': pre_commit_template
868 }
869 for filename in templates:
870 _hook_file = os.path.join(hooks_path, filename)
871 _rhodecode_hook = _check_rhodecode_hook(_hook_file)
872 if _rhodecode_hook or force_create:
873 log.debug('writing %s hook file !', filename)
874 template = templates[filename]
875 try:
876 with open(_hook_file, 'wb') as f:
877 template = template.replace(
878 '_TMPL_', rhodecode.__version__)
879 template = template.replace('_ENV_', sys.executable)
880 f.write(template)
881 os.chmod(_hook_file, 0755)
882 except IOError:
883 log.exception('error writing hook file %s', filename)
884 else:
885 log.debug('skipping writing hook file')
886
887 def install_hooks(self, repo, repo_type):
888 if repo_type == 'git':
889 self.install_git_hook(repo)
890 elif repo_type == 'svn':
891 self.install_svn_hooks(repo)
892
893 def get_server_info(self, environ=None):
810 def get_server_info(self, environ=None):
894 server_info = get_system_info(environ)
811 server_info = get_system_info(environ)
895 return server_info
812 return server_info
896
897
898 def _check_rhodecode_hook(hook_path):
899 """
900 Check if the hook was created by RhodeCode
901 """
902 if not os.path.exists(hook_path):
903 return True
904
905 log.debug('hook exists, checking if it is from rhodecode')
906 hook_content = _read_hook(hook_path)
907 matches = re.search(r'(?:RC_HOOK_VER)\s*=\s*(.*)', hook_content)
908 if matches:
909 try:
910 version = matches.groups()[0]
911 log.debug('got %s, it is rhodecode', version)
912 return True
913 except Exception:
914 log.exception("Exception while reading the hook version.")
915
916 return False
917
918
919 def _read_hook(hook_path):
920 with open(hook_path, 'rb') as f:
921 content = f.read()
922 return content
@@ -417,7 +417,9 b' class VcsSettingsModel(object):'
417 'hg_use_rebase_for_merging',
417 'hg_use_rebase_for_merging',
418 'hg_close_branch_before_merging',
418 'hg_close_branch_before_merging',
419 'git_use_rebase_for_merging',
419 'git_use_rebase_for_merging',
420 'git_close_branch_before_merging')
420 'git_close_branch_before_merging',
421 'diff_cache',
422 )
421
423
422 HOOKS_SETTINGS = (
424 HOOKS_SETTINGS = (
423 ('hooks', 'changegroup.repo_size'),
425 ('hooks', 'changegroup.repo_size'),
@@ -438,6 +440,7 b' class VcsSettingsModel(object):'
438 GLOBAL_GIT_SETTINGS = (
440 GLOBAL_GIT_SETTINGS = (
439 ('vcs_git_lfs', 'enabled'),
441 ('vcs_git_lfs', 'enabled'),
440 ('vcs_git_lfs', 'store_location'))
442 ('vcs_git_lfs', 'store_location'))
443
441 GLOBAL_SVN_SETTINGS = (
444 GLOBAL_SVN_SETTINGS = (
442 ('vcs_svn_proxy', 'http_requests_enabled'),
445 ('vcs_svn_proxy', 'http_requests_enabled'),
443 ('vcs_svn_proxy', 'http_server_url'))
446 ('vcs_svn_proxy', 'http_server_url'))
@@ -571,6 +574,7 b' class VcsSettingsModel(object):'
571 self._create_or_update_ui(
574 self._create_or_update_ui(
572 self.repo_settings, *phases, value=safe_str(data[phases_key]))
575 self.repo_settings, *phases, value=safe_str(data[phases_key]))
573
576
577
574 def create_or_update_global_hg_settings(self, data):
578 def create_or_update_global_hg_settings(self, data):
575 largefiles, largefiles_store, phases, hgsubversion, evolve \
579 largefiles, largefiles_store, phases, hgsubversion, evolve \
576 = self.GLOBAL_HG_SETTINGS
580 = self.GLOBAL_HG_SETTINGS
@@ -663,6 +663,10 b' class UserModel(BaseModel):'
663 :param api_key: api key to fetch by
663 :param api_key: api key to fetch by
664 :param username: username to fetch by
664 :param username: username to fetch by
665 """
665 """
666 def token_obfuscate(token):
667 if token:
668 return token[:4] + "****"
669
666 if user_id is None and api_key is None and username is None:
670 if user_id is None and api_key is None and username is None:
667 raise Exception('You need to pass user_id, api_key or username')
671 raise Exception('You need to pass user_id, api_key or username')
668
672
@@ -681,7 +685,7 b' class UserModel(BaseModel):'
681 if not dbuser:
685 if not dbuser:
682 log.warning(
686 log.warning(
683 'Unable to lookup user by id:%s api_key:%s username:%s',
687 'Unable to lookup user by id:%s api_key:%s username:%s',
684 user_id, api_key, username)
688 user_id, token_obfuscate(api_key), username)
685 return False
689 return False
686 if not dbuser.active:
690 if not dbuser.active:
687 log.debug('User `%s:%s` is inactive, skipping fill data',
691 log.debug('User `%s:%s` is inactive, skipping fill data',
@@ -80,6 +80,7 b' class UserGroupModel(BaseModel):'
80 'updated': [],
80 'updated': [],
81 'deleted': []
81 'deleted': []
82 }
82 }
83 change_obj = user_group.get_api_data()
83 # update permissions
84 # update permissions
84 for member_id, perm, member_type in perm_updates:
85 for member_id, perm, member_type in perm_updates:
85 member_id = int(member_id)
86 member_id = int(member_id)
@@ -97,8 +98,10 b' class UserGroupModel(BaseModel):'
97 self.grant_user_group_permission(
98 self.grant_user_group_permission(
98 target_user_group=user_group, user_group=member_id, perm=perm)
99 target_user_group=user_group, user_group=member_id, perm=perm)
99
100
100 changes['updated'].append({'type': member_type, 'id': member_id,
101 changes['updated'].append({
101 'name': member_name, 'new_perm': perm})
102 'change_obj': change_obj,
103 'type': member_type, 'id': member_id,
104 'name': member_name, 'new_perm': perm})
102
105
103 # set new permissions
106 # set new permissions
104 for member_id, perm, member_type in perm_additions:
107 for member_id, perm, member_type in perm_additions:
@@ -115,8 +118,10 b' class UserGroupModel(BaseModel):'
115 self.grant_user_group_permission(
118 self.grant_user_group_permission(
116 target_user_group=user_group, user_group=member_id, perm=perm)
119 target_user_group=user_group, user_group=member_id, perm=perm)
117
120
118 changes['added'].append({'type': member_type, 'id': member_id,
121 changes['added'].append({
119 'name': member_name, 'new_perm': perm})
122 'change_obj': change_obj,
123 'type': member_type, 'id': member_id,
124 'name': member_name, 'new_perm': perm})
120
125
121 # delete permissions
126 # delete permissions
122 for member_id, perm, member_type in perm_deletions:
127 for member_id, perm, member_type in perm_deletions:
@@ -132,8 +137,11 b' class UserGroupModel(BaseModel):'
132 self.revoke_user_group_permission(
137 self.revoke_user_group_permission(
133 target_user_group=user_group, user_group=member_id)
138 target_user_group=user_group, user_group=member_id)
134
139
135 changes['deleted'].append({'type': member_type, 'id': member_id,
140 changes['deleted'].append({
136 'name': member_name, 'new_perm': perm})
141 'change_obj': change_obj,
142 'type': member_type, 'id': member_id,
143 'name': member_name, 'new_perm': perm})
144
137 return changes
145 return changes
138
146
139 def get(self, user_group_id, cache=False):
147 def get(self, user_group_id, cache=False):
@@ -217,7 +225,7 b' class UserGroupModel(BaseModel):'
217 members.append(uid)
225 members.append(uid)
218 return members
226 return members
219
227
220 def update(self, user_group, form_data):
228 def update(self, user_group, form_data, group_data=None):
221 user_group = self._get_user_group(user_group)
229 user_group = self._get_user_group(user_group)
222 if 'users_group_name' in form_data:
230 if 'users_group_name' in form_data:
223 user_group.users_group_name = form_data['users_group_name']
231 user_group.users_group_name = form_data['users_group_name']
@@ -247,6 +255,11 b' class UserGroupModel(BaseModel):'
247 added_user_ids, removed_user_ids = \
255 added_user_ids, removed_user_ids = \
248 self._update_members_from_user_ids(user_group, members_id_list)
256 self._update_members_from_user_ids(user_group, members_id_list)
249
257
258 if group_data:
259 new_group_data = {}
260 new_group_data.update(group_data)
261 user_group.group_data = new_group_data
262
250 self.sa.add(user_group)
263 self.sa.add(user_group)
251 return user_group, added_user_ids, removed_user_ids
264 return user_group, added_user_ids, removed_user_ids
252
265
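`update()` now takes an optional `group_data` dict and assigns a freshly built copy to `user_group.group_data` rather than mutating the stored value in place. A likely reason, though the hunk does not state it, is that reassigning the attribute is what makes a serialized/JSON column register as changed in SQLAlchemy, which does not track in-place mutation of such values by default. The pattern in isolation:

.. code-block:: python

    # Sketch of the copy-then-assign pattern used for serialized columns.
    def set_group_data(user_group, group_data=None):
        if group_data:
            new_group_data = {}
            new_group_data.update(group_data)
            # attribute assignment (not dict mutation) marks the column dirty
            user_group.group_data = new_group_data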
@@ -395,10 +408,18 b' class UserGroupModel(BaseModel):'
395 :param user: Instance of User, user_id or username
408 :param user: Instance of User, user_id or username
396 :param perm: Instance of Permission, or permission_name
409 :param perm: Instance of Permission, or permission_name
397 """
410 """
411 changes = {
412 'added': [],
413 'updated': [],
414 'deleted': []
415 }
398
416
399 user_group = self._get_user_group(user_group)
417 user_group = self._get_user_group(user_group)
400 user = self._get_user(user)
418 user = self._get_user(user)
401 permission = self._get_perm(perm)
419 permission = self._get_perm(perm)
420 perm_name = permission.permission_name
421 member_id = user.user_id
422 member_name = user.username
402
423
403 # check if we have that permission already
424 # check if we have that permission already
404 obj = self.sa.query(UserUserGroupToPerm)\
425 obj = self.sa.query(UserUserGroupToPerm)\
@@ -417,7 +438,12 b' class UserGroupModel(BaseModel):'
417 'granted permission: {} to user: {} on usergroup: {}'.format(
438 'granted permission: {} to user: {} on usergroup: {}'.format(
418 perm, user, user_group), namespace='security.usergroup')
439 perm, user, user_group), namespace='security.usergroup')
419
440
420 return obj
441 changes['added'].append({
442 'change_obj': user_group.get_api_data(),
443 'type': 'user', 'id': member_id,
444 'name': member_name, 'new_perm': perm_name})
445
446 return changes
421
447
422 def revoke_user_permission(self, user_group, user):
448 def revoke_user_permission(self, user_group, user):
423 """
449 """
@@ -427,9 +453,17 b' class UserGroupModel(BaseModel):'
427 or users_group name
453 or users_group name
428 :param user: Instance of User, user_id or username
454 :param user: Instance of User, user_id or username
429 """
455 """
456 changes = {
457 'added': [],
458 'updated': [],
459 'deleted': []
460 }
430
461
431 user_group = self._get_user_group(user_group)
462 user_group = self._get_user_group(user_group)
432 user = self._get_user(user)
463 user = self._get_user(user)
464 perm_name = 'usergroup.none'
465 member_id = user.user_id
466 member_name = user.username
433
467
434 obj = self.sa.query(UserUserGroupToPerm)\
468 obj = self.sa.query(UserUserGroupToPerm)\
435 .filter(UserUserGroupToPerm.user == user)\
469 .filter(UserUserGroupToPerm.user == user)\
@@ -442,6 +476,13 b' class UserGroupModel(BaseModel):'
442 'revoked permission from user: {} on usergroup: {}'.format(
476 'revoked permission from user: {} on usergroup: {}'.format(
443 user, user_group), namespace='security.usergroup')
477 user, user_group), namespace='security.usergroup')
444
478
479 changes['deleted'].append({
480 'change_obj': user_group.get_api_data(),
481 'type': 'user', 'id': member_id,
482 'name': member_name, 'new_perm': perm_name})
483
484 return changes
485
445 def grant_user_group_permission(self, target_user_group, user_group, perm):
486 def grant_user_group_permission(self, target_user_group, user_group, perm):
446 """
487 """
447 Grant user group permission for given target_user_group
488 Grant user group permission for given target_user_group
@@ -450,9 +491,19 b' class UserGroupModel(BaseModel):'
450 :param user_group:
491 :param user_group:
451 :param perm:
492 :param perm:
452 """
493 """
494 changes = {
495 'added': [],
496 'updated': [],
497 'deleted': []
498 }
499
453 target_user_group = self._get_user_group(target_user_group)
500 target_user_group = self._get_user_group(target_user_group)
454 user_group = self._get_user_group(user_group)
501 user_group = self._get_user_group(user_group)
455 permission = self._get_perm(perm)
502 permission = self._get_perm(perm)
503 perm_name = permission.permission_name
504 member_id = user_group.users_group_id
505 member_name = user_group.users_group_name
506
456 # forbid assigning same user group to itself
507 # forbid assigning same user group to itself
457 if target_user_group == user_group:
508 if target_user_group == user_group:
458 raise RepoGroupAssignmentError('target repo:%s cannot be '
509 raise RepoGroupAssignmentError('target repo:%s cannot be '
@@ -477,7 +528,12 b' class UserGroupModel(BaseModel):'
477 perm, user_group, target_user_group),
528 perm, user_group, target_user_group),
478 namespace='security.usergroup')
529 namespace='security.usergroup')
479
530
480 return obj
531 changes['added'].append({
532 'change_obj': target_user_group.get_api_data(),
533 'type': 'user_group', 'id': member_id,
534 'name': member_name, 'new_perm': perm_name})
535
536 return changes
481
537
482 def revoke_user_group_permission(self, target_user_group, user_group):
538 def revoke_user_group_permission(self, target_user_group, user_group):
483 """
539 """
@@ -486,8 +542,17 b' class UserGroupModel(BaseModel):'
486 :param target_user_group:
542 :param target_user_group:
487 :param user_group:
543 :param user_group:
488 """
544 """
545 changes = {
546 'added': [],
547 'updated': [],
548 'deleted': []
549 }
550
489 target_user_group = self._get_user_group(target_user_group)
551 target_user_group = self._get_user_group(target_user_group)
490 user_group = self._get_user_group(user_group)
552 user_group = self._get_user_group(user_group)
553 perm_name = 'usergroup.none'
554 member_id = user_group.users_group_id
555 member_name = user_group.users_group_name
491
556
492 obj = self.sa.query(UserGroupUserGroupToPerm)\
557 obj = self.sa.query(UserGroupUserGroupToPerm)\
493 .filter(UserGroupUserGroupToPerm.target_user_group == target_user_group)\
558 .filter(UserGroupUserGroupToPerm.target_user_group == target_user_group)\
@@ -502,6 +567,13 b' class UserGroupModel(BaseModel):'
502 user_group, target_user_group),
567 user_group, target_user_group),
503 namespace='security.repogroup')
568 namespace='security.repogroup')
504
569
570 changes['deleted'].append({
571 'change_obj': target_user_group.get_api_data(),
572 'type': 'user_group', 'id': member_id,
573 'name': member_name, 'new_perm': perm_name})
574
575 return changes
576
505 def get_perms_summary(self, user_group_id):
577 def get_perms_summary(self, user_group_id):
506 permissions = {
578 permissions = {
507 'repositories': {},
579 'repositories': {},
@@ -66,7 +66,7 b' def deferred_landing_ref_validator(node,'
66
66
67
67
68 @colander.deferred
68 @colander.deferred
69 def deferred_clone_uri_validator(node, kw):
69 def deferred_sync_uri_validator(node, kw):
70 repo_type = kw.get('repo_type')
70 repo_type = kw.get('repo_type')
71 validator = validators.CloneUriValidator(repo_type)
71 validator = validators.CloneUriValidator(repo_type)
72 return validator
72 return validator
@@ -319,7 +319,13 b' class RepoSchema(colander.MappingSchema)'
319
319
320 repo_clone_uri = colander.SchemaNode(
320 repo_clone_uri = colander.SchemaNode(
321 colander.String(),
321 colander.String(),
322 validator=deferred_clone_uri_validator,
322 validator=deferred_sync_uri_validator,
323 preparers=[preparers.strip_preparer],
324 missing='')
325
326 repo_push_uri = colander.SchemaNode(
327 colander.String(),
328 validator=deferred_sync_uri_validator,
323 preparers=[preparers.strip_preparer],
329 preparers=[preparers.strip_preparer],
324 missing='')
330 missing='')
325
331
@@ -381,7 +387,17 b' class RepoSettingsSchema(RepoSchema):'
381 repo_clone_uri = colander.SchemaNode(
387 repo_clone_uri = colander.SchemaNode(
382 colander.String(),
388 colander.String(),
383 preparers=[preparers.strip_preparer],
389 preparers=[preparers.strip_preparer],
384 validator=deferred_clone_uri_validator,
390 validator=deferred_sync_uri_validator,
391 missing='')
392
393 repo_push_uri_change = colander.SchemaNode(
394 colander.String(),
395 missing='NEW')
396
397 repo_push_uri = colander.SchemaNode(
398 colander.String(),
399 preparers=[preparers.strip_preparer],
400 validator=deferred_sync_uri_validator,
385 missing='')
401 missing='')
386
402
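With the rename to `deferred_sync_uri_validator`, one deferred validator backs both the pull (`repo_clone_uri`) and the new push (`repo_push_uri`) fields, and `repo_push_uri_change` mirrors the existing `repo_clone_uri_change` flag. A trimmed, self-contained sketch of the same wiring; the class and the URI check below are illustrative, not the project's real validator:

.. code-block:: python

    import colander


    @colander.deferred
    def deferred_sync_uri_validator(node, kw):
        repo_type = kw.get('repo_type')  # available for per-VCS rules

        def validator(node, value):
            # resolved at bind() time; here we only require an http(s) scheme
            if value and not value.startswith(('http://', 'https://')):
                raise colander.Invalid(node, 'URI must start with http:// or https://')
        return validator


    class SyncUriSchema(colander.MappingSchema):
        repo_clone_uri = colander.SchemaNode(
            colander.String(), validator=deferred_sync_uri_validator, missing='')
        repo_push_uri = colander.SchemaNode(
            colander.String(), validator=deferred_sync_uri_validator, missing='')
        repo_push_uri_change = colander.SchemaNode(
            colander.String(), missing='NEW')


    schema = SyncUriSchema().bind(repo_type='git')
    data = schema.deserialize({'repo_push_uri': 'https://example.com/repo'})
    # data == {'repo_clone_uri': '', 'repo_push_uri': 'https://example.com/repo',
    #          'repo_push_uri_change': 'NEW'}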
387 def deserialize(self, cstruct):
403 def deserialize(self, cstruct):
@@ -22,10 +22,11 b' import re'
22 import colander
22 import colander
23
23
24 from rhodecode import forms
24 from rhodecode import forms
25 from rhodecode.model.db import User
25 from rhodecode.model.db import User, UserEmailMap
26 from rhodecode.model.validation_schema import types, validators
26 from rhodecode.model.validation_schema import types, validators
27 from rhodecode.translation import _
27 from rhodecode.translation import _
28 from rhodecode.lib.auth import check_password
28 from rhodecode.lib.auth import check_password
29 from rhodecode.lib import helpers as h
29
30
30
31
31 @colander.deferred
32 @colander.deferred
@@ -40,6 +41,7 b' def deferred_user_password_validator(nod'
40 return _user_password_validator
41 return _user_password_validator
41
42
42
43
44
43 class ChangePasswordSchema(colander.Schema):
45 class ChangePasswordSchema(colander.Schema):
44
46
45 current_password = colander.SchemaNode(
47 current_password = colander.SchemaNode(
@@ -123,3 +125,64 b' class UserSchema(colander.Schema):'
123
125
124 appstruct = super(UserSchema, self).deserialize(cstruct)
126 appstruct = super(UserSchema, self).deserialize(cstruct)
125 return appstruct
127 return appstruct
128
129
130 @colander.deferred
131 def deferred_user_email_in_emails_validator(node, kw):
132 return colander.OneOf(kw.get('user_emails'))
133
134
135 @colander.deferred
136 def deferred_additional_email_validator(node, kw):
137 emails = kw.get('user_emails')
138
139 def name_validator(node, value):
140 if value in emails:
141 msg = _('This e-mail address is already taken')
142 raise colander.Invalid(node, msg)
143 user = User.get_by_email(value, case_insensitive=True)
144 if user:
145 msg = _(u'This e-mail address is already taken')
146 raise colander.Invalid(node, msg)
147 c = colander.Email()
148 return c(node, value)
149 return name_validator
150
151
152 @colander.deferred
153 def deferred_user_email_in_emails_widget(node, kw):
154 import deform.widget
155 emails = [(email, email) for email in kw.get('user_emails')]
156 return deform.widget.Select2Widget(values=emails)
157
158
159 class UserProfileSchema(colander.Schema):
160 username = colander.SchemaNode(
161 colander.String(),
162 validator=deferred_username_validator)
163
164 firstname = colander.SchemaNode(
165 colander.String(), missing='', title='First name')
166
167 lastname = colander.SchemaNode(
168 colander.String(), missing='', title='Last name')
169
170 email = colander.SchemaNode(
171 colander.String(), widget=deferred_user_email_in_emails_widget,
172 validator=deferred_user_email_in_emails_validator,
173 description=h.literal(
174 _('Additional emails can be specified on the <a href="{}">extra emails</a> page.').format(

175 '/_admin/my_account/emails')),
176 )
177
178
179 class AddEmailSchema(colander.Schema):
180 current_password = colander.SchemaNode(
181 colander.String(),
182 missing=colander.required,
183 widget=forms.widget.PasswordWidget(redisplay=True),
184 validator=deferred_user_password_validator)
185
186 email = colander.SchemaNode(
187 colander.String(), title='New Email',
188 validator=deferred_additional_email_validator)
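Both new schemas resolve their deferred validators and widgets at `bind()` time from keyword arguments such as `user_emails`. A small standalone sketch of that mechanism, using a OneOf check in the spirit of `deferred_user_email_in_emails_validator`; the schema and variable names here are illustrative:

.. code-block:: python

    import colander


    @colander.deferred
    def deferred_email_choice_validator(node, kw):
        # the allowed choices are only known once the schema is bound
        return colander.OneOf(kw.get('user_emails', []))


    class ProfileEmailSchema(colander.MappingSchema):
        email = colander.SchemaNode(
            colander.String(), validator=deferred_email_choice_validator)


    schema = ProfileEmailSchema().bind(
        user_emails=['jane@example.com', 'jane.work@example.com'])

    schema.deserialize({'email': 'jane@example.com'})       # accepted
    try:
        schema.deserialize({'email': 'other@example.com'})  # rejected by OneOf
    except colander.Invalid as err:
        print(err.asdict())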
@@ -18,15 +18,6 b''
18 }
18 }
19 }
19 }
20
20
21 .compare_view_files {
22
23 .diff-container {
24
25 .diffblock {
26 margin-bottom: 0;
27 }
28 }
29 }
30
21
31 div.diffblock .sidebyside {
22 div.diffblock .sidebyside {
32 background: #ffffff;
23 background: #ffffff;
@@ -1543,14 +1543,6 b' table.integrations {'
1543 }
1543 }
1544 }
1544 }
1545
1545
1546 .compare_view_files {
1547 width: 100%;
1548
1549 td {
1550 vertical-align: middle;
1551 }
1552 }
1553
1554 .compare_view_filepath {
1546 .compare_view_filepath {
1555 color: @grey1;
1547 color: @grey1;
1556 }
1548 }
@@ -485,58 +485,6 b' table.compare_view_commits {'
485 }
485 }
486 }
486 }
487
487
488 .compare_view_files {
489
490 td.td-actions {
491 text-align: right;
492 }
493
494 .flag_status {
495 margin: 0 0 0 5px;
496 }
497
498 td.injected_diff {
499
500 .code-difftable {
501 border:none;
502 }
503
504 .diff-container {
505 border: @border-thickness solid @border-default-color;
506 .border-radius(@border-radius);
507 }
508
509 div.diffblock {
510 border:none;
511 }
512
513 div.code-body {
514 max-width: 1152px;
515 }
516 }
517
518 .rctable {
519
520 td {
521 padding-top: @space;
522 }
523
524 &:first-child td {
525 padding-top: 0;
526 }
527 }
528
529 .comment-bubble,
530 .show_comments {
531 float: right;
532 visibility: hidden;
533 padding: 0 1em 0 0;
534 }
535
536 .injected_diff {
537 padding-bottom: @padding;
538 }
539 }
540
488
541 // Gist List
489 // Gist List
542 #gist_list_table {
490 #gist_list_table {
@@ -232,6 +232,7 b' function registerRCRoutes() {'
232 pyroutes.register('repo_edit_toggle_locking', '/%(repo_name)s/settings/toggle_locking', ['repo_name']);
232 pyroutes.register('repo_edit_toggle_locking', '/%(repo_name)s/settings/toggle_locking', ['repo_name']);
233 pyroutes.register('edit_repo_remote', '/%(repo_name)s/settings/remote', ['repo_name']);
233 pyroutes.register('edit_repo_remote', '/%(repo_name)s/settings/remote', ['repo_name']);
234 pyroutes.register('edit_repo_remote_pull', '/%(repo_name)s/settings/remote/pull', ['repo_name']);
234 pyroutes.register('edit_repo_remote_pull', '/%(repo_name)s/settings/remote/pull', ['repo_name']);
235 pyroutes.register('edit_repo_remote_push', '/%(repo_name)s/settings/remote/push', ['repo_name']);
235 pyroutes.register('edit_repo_statistics', '/%(repo_name)s/settings/statistics', ['repo_name']);
236 pyroutes.register('edit_repo_statistics', '/%(repo_name)s/settings/statistics', ['repo_name']);
236 pyroutes.register('edit_repo_statistics_reset', '/%(repo_name)s/settings/statistics/update', ['repo_name']);
237 pyroutes.register('edit_repo_statistics_reset', '/%(repo_name)s/settings/statistics/update', ['repo_name']);
237 pyroutes.register('edit_repo_issuetracker', '/%(repo_name)s/settings/issue_trackers', ['repo_name']);
238 pyroutes.register('edit_repo_issuetracker', '/%(repo_name)s/settings/issue_trackers', ['repo_name']);
@@ -243,6 +244,7 b' function registerRCRoutes() {'
243 pyroutes.register('edit_repo_vcs_svn_pattern_delete', '/%(repo_name)s/settings/vcs/svn_pattern/delete', ['repo_name']);
244 pyroutes.register('edit_repo_vcs_svn_pattern_delete', '/%(repo_name)s/settings/vcs/svn_pattern/delete', ['repo_name']);
244 pyroutes.register('repo_reviewers', '/%(repo_name)s/settings/review/rules', ['repo_name']);
245 pyroutes.register('repo_reviewers', '/%(repo_name)s/settings/review/rules', ['repo_name']);
245 pyroutes.register('repo_default_reviewers_data', '/%(repo_name)s/settings/review/default-reviewers', ['repo_name']);
246 pyroutes.register('repo_default_reviewers_data', '/%(repo_name)s/settings/review/default-reviewers', ['repo_name']);
247 pyroutes.register('repo_automation', '/%(repo_name)s/settings/automation', ['repo_name']);
246 pyroutes.register('edit_repo_strip', '/%(repo_name)s/settings/strip', ['repo_name']);
248 pyroutes.register('edit_repo_strip', '/%(repo_name)s/settings/strip', ['repo_name']);
247 pyroutes.register('strip_check', '/%(repo_name)s/settings/strip_check', ['repo_name']);
249 pyroutes.register('strip_check', '/%(repo_name)s/settings/strip_check', ['repo_name']);
248 pyroutes.register('strip_execute', '/%(repo_name)s/settings/strip_execute', ['repo_name']);
250 pyroutes.register('strip_execute', '/%(repo_name)s/settings/strip_execute', ['repo_name']);
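The two new client-side entries, `edit_repo_remote_push` and `repo_automation`, are only useful if matching routes exist on the server; the pyroutes table is generated from the registered Pyramid routes. A hedged sketch of what the corresponding server-side registration could look like (plain Pyramid API; the project's actual patterns and predicates may differ):

.. code-block:: python

    # Illustrative Pyramid route registration mirrored by the pyroutes entries above.
    def includeme(config):
        config.add_route(
            'edit_repo_remote_push', '/{repo_name}/settings/remote/push')
        config.add_route(
            'repo_automation', '/{repo_name}/settings/automation')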
@@ -293,44 +293,27 b' function scrollToElement(element, percen'
293 });
293 });
294 }
294 }
295 });
295 });
296 $('.compare_view_files').on(
297 'mouseenter mouseleave', 'tr.line .lineno a',function(event) {
298 if (event.type === "mouseenter") {
299 $(this).parents('tr.line').addClass('hover');
300 } else {
301 $(this).parents('tr.line').removeClass('hover');
302 }
303 });
304
296
305 $('.compare_view_files').on(
297 $('body').on('click', '.cb-lineno a', function(event) {
306 'mouseenter mouseleave', 'tr.line .add-comment-line a',function(event){
307 if (event.type === "mouseenter") {
308 $(this).parents('tr.line').addClass('commenting');
309 } else {
310 $(this).parents('tr.line').removeClass('commenting');
311 }
312 });
313
314 $('body').on( /* TODO: replace the $('.compare_view_files').on('click') below
315 when new diffs are integrated */
316 'click', '.cb-lineno a', function(event) {
317
298
318 function sortNumber(a,b) {
299 function sortNumber(a,b) {
319 return a - b;
300 return a - b;
320 }
301 }
321
302
322 if ($(this).attr('data-line-no') !== "") {
303 var lineNo = $(this).data('lineNo');
304 if (lineNo) {
323
305
324 // on shift, we do a range selection, if we got previous line
306 // on shift, we do a range selection, if we got previous line
325 var prevLine = $('.cb-line-selected a').attr('data-line-no');
307 var prevLine = $('.cb-line-selected a').data('lineNo');
326 if (event.shiftKey && prevLine !== undefined) {
308 if (event.shiftKey && prevLine !== undefined) {
327 var prevLine = parseInt(prevLine);
309 var prevLine = parseInt(prevLine);
328 var nextLine = parseInt($(this).attr('data-line-no'));
310 var nextLine = parseInt(lineNo);
329 var pos = [prevLine, nextLine].sort(sortNumber);
311 var pos = [prevLine, nextLine].sort(sortNumber);
330 var anchor = '#L{0}-{1}'.format(pos[0], pos[1]);
312 var anchor = '#L{0}-{1}'.format(pos[0], pos[1]);
331
313
332 } else {
314 } else {
333 var nextLine = parseInt($(this).attr('data-line-no'));
315
316 var nextLine = parseInt(lineNo);
334 var pos = [nextLine, nextLine];
317 var pos = [nextLine, nextLine];
335 var anchor = '#L{0}'.format(pos[0]);
318 var anchor = '#L{0}'.format(pos[0]);
336
319
@@ -352,54 +335,35 b' function scrollToElement(element, percen'
352 }
335 }
353 });
336 });
354
337
355 // Replace URL without jumping to it if browser supports.
356 // Default otherwise
357 if (history.pushState) {
358 var new_location = location.href.rstrip('#');
359 if (location.hash) {
360 // location without hash
361 new_location = new_location.replace(location.hash, "");
362 }
363
338
364 // Make new anchor url
339 } else {
365 new_location = new_location + anchor;
340 if ($(this).attr('name') !== undefined) {
366 history.pushState(true, document.title, new_location);
341 // clear selection
367
342 $('td.cb-line-selected').removeClass('cb-line-selected');
368 return false;
343 var aEl = $(this).closest('td');
344 aEl.addClass('cb-line-selected');
345 aEl.next('td').addClass('cb-line-selected');
369 }
346 }
370 }
347 }
348
349 // Replace URL without jumping to it if browser supports.
350 // Default otherwise
351 if (history.pushState && anchor !== undefined) {
352 var new_location = location.href.rstrip('#');
353 if (location.hash) {
354 // location without hash
355 new_location = new_location.replace(location.hash, "");
356 }
357
358 // Make new anchor url
359 new_location = new_location + anchor;
360 history.pushState(true, document.title, new_location);
361
362 return false;
363 }
364
371 });
365 });
372
366
373 $('.compare_view_files').on( /* TODO: replace this with .cb function above
374 when new diffs are integrated */
375 'click', 'tr.line .lineno a',function(event) {
376 if ($(this).text() != ""){
377 $('tr.line').removeClass('selected');
378 $(this).parents("tr.line").addClass('selected');
379
380 // Replace URL without jumping to it if browser supports.
381 // Default otherwise
382 if (history.pushState) {
383 var new_location = location.href;
384 if (location.hash){
385 new_location = new_location.replace(location.hash, "");
386 }
387
388 // Make new anchor url
389 var new_location = new_location+$(this).attr('href');
390 history.pushState(true, document.title, new_location);
391
392 return false;
393 }
394 }
395 });
396
397 $('.compare_view_files').on(
398 'click', 'tr.line .add-comment-line a',function(event) {
399 var tr = $(event.currentTarget).parents('tr.line')[0];
400 injectInlineForm(tr);
401 return false;
402 });
403
367
404 $('.collapse_file').on('click', function(e) {
368 $('.collapse_file').on('click', function(e) {
405 e.stopPropagation();
369 e.stopPropagation();
@@ -475,7 +475,7 b' var CommentsController = function() {'
475
475
476 this.getLineNumber = function(node) {
476 this.getLineNumber = function(node) {
477 var $node = $(node);
477 var $node = $(node);
478 var lineNo = $node.closest('td').attr('data-line-number');
478 var lineNo = $node.closest('td').attr('data-line-no');
479 if (lineNo === undefined && $node.data('commentInline')){
479 if (lineNo === undefined && $node.data('commentInline')){
480 lineNo = $node.data('commentLineNo')
480 lineNo = $node.data('commentLineNo')
481 }
481 }
@@ -598,6 +598,8 b' var CommentsController = function() {'
598 this.toggleLineComments = function(node) {
598 this.toggleLineComments = function(node) {
599 self.toggleComments(node, true);
599 self.toggleComments(node, true);
600 var $node = $(node);
600 var $node = $(node);
601 // mark outdated comments as visible before the toggle;
602 $(node.closest('tr')).find('.comment-outdated').show();
601 $node.closest('tr').toggleClass('hide-line-comments');
603 $node.closest('tr').toggleClass('hide-line-comments');
602 };
604 };
603
605
@@ -54,11 +54,9 b''
54 <div class="textarea text-area editor">
54 <div class="textarea text-area editor">
55 ${h.textarea('auth_plugins',cols=23,rows=5,class_="medium")}
55 ${h.textarea('auth_plugins',cols=23,rows=5,class_="medium")}
56 </div>
56 </div>
57 <p class="help-block">
57 <p class="help-block pre-formatting">${_('List of plugins, separated by commas.'
58 ${_('Add a list of plugins, separated by commas. '
58 '\nThe order of the plugins is also the order in which '
59 'The order of the plugins is also the order in which '
59 'RhodeCode Enterprise will try to authenticate a user.')}</p>
60 'RhodeCode Enterprise will try to authenticate a user.')}
61 </p>
62 </div>
60 </div>
63
61
64 <div class="field">
62 <div class="field">
@@ -91,15 +89,19 b''
91 <script>
89 <script>
92 $('.toggle-plugin').click(function(e){
90 $('.toggle-plugin').click(function(e){
93 var auth_plugins_input = $('#auth_plugins');
91 var auth_plugins_input = $('#auth_plugins');
94 var notEmpty = function(element, index, array) {
92 var elems = [];
95 return (element != "");
93
96 };
94 $.each(auth_plugins_input.val().split(',') , function (index, element) {
97 var elems = auth_plugins_input.val().split(',').filter(notEmpty);
95 if (element !== "") {
96 elems.push(element.strip())
97 }
98 });
99
98 var cur_button = e.currentTarget;
100 var cur_button = e.currentTarget;
99 var plugin_id = $(cur_button).attr('plugin_id');
101 var plugin_id = $(cur_button).attr('plugin_id');
100 if($(cur_button).hasClass('btn-success')){
102 if($(cur_button).hasClass('btn-success')){
101 elems.splice(elems.indexOf(plugin_id), 1);
103 elems.splice(elems.indexOf(plugin_id), 1);
102 auth_plugins_input.val(elems.join(','));
104 auth_plugins_input.val(elems.join(',\n'));
103 $(cur_button).removeClass('btn-success');
105 $(cur_button).removeClass('btn-success');
104 cur_button.innerHTML = _gettext('disabled');
106 cur_button.innerHTML = _gettext('disabled');
105 }
107 }
@@ -107,7 +109,7 b''
107 if(elems.indexOf(plugin_id) == -1){
109 if(elems.indexOf(plugin_id) == -1){
108 elems.push(plugin_id);
110 elems.push(plugin_id);
109 }
111 }
110 auth_plugins_input.val(elems.join(','));
112 auth_plugins_input.val(elems.join(',\n'));
111 $(cur_button).addClass('btn-success');
113 $(cur_button).addClass('btn-success');
112 cur_button.innerHTML = _gettext('enabled');
114 cur_button.innerHTML = _gettext('enabled');
113 }
115 }
@@ -50,40 +50,6 b''
50 </h3>
50 </h3>
51 </div>
51 </div>
52 <div class="panel-body">
52 <div class="panel-body">
53 <%
54 if c.repo:
55 home_url = request.route_path('repo_integrations_home',
56 repo_name=c.repo.repo_name)
57 elif c.repo_group:
58 home_url = request.route_path('repo_group_integrations_home',
59 repo_group_name=c.repo_group.group_name)
60 else:
61 home_url = request.route_path('global_integrations_home')
62 %>
63
64 <a href="${home_url}" class="btn ${not c.current_IntegrationType and 'btn-primary' or ''}">${_('All')}</a>
65
66 %for integration_key, IntegrationType in c.available_integrations.items():
67 % if not IntegrationType.is_dummy:
68 <%
69 if c.repo:
70 list_url = request.route_path('repo_integrations_list',
71 repo_name=c.repo.repo_name,
72 integration=integration_key)
73 elif c.repo_group:
74 list_url = request.route_path('repo_group_integrations_list',
75 repo_group_name=c.repo_group.group_name,
76 integration=integration_key)
77 else:
78 list_url = request.route_path('global_integrations_list',
79 integration=integration_key)
80 %>
81 <a href="${list_url}"
82 class="btn ${c.current_IntegrationType and integration_key == c.current_IntegrationType.key and 'btn-primary' or ''}">
83 ${IntegrationType.display_name}
84 </a>
85 % endif
86 %endfor
87
53
88 <%
54 <%
89 integration_type = c.current_IntegrationType and c.current_IntegrationType.display_name or ''
55 integration_type = c.current_IntegrationType and c.current_IntegrationType.display_name or ''
@@ -153,7 +119,7 b''
153 <td class="td-icon">
119 <td class="td-icon">
154 %if integration.integration_type in c.available_integrations:
120 %if integration.integration_type in c.available_integrations:
155 <div class="integration-icon">
121 <div class="integration-icon">
156 ${c.available_integrations[integration.integration_type].icon|n}
122 ${c.available_integrations[integration.integration_type].icon()|n}
157 </div>
123 </div>
158 %else:
124 %else:
159 ?
125 ?
@@ -56,7 +56,7 b''
56 <%widgets:panel>
56 <%widgets:panel>
57 <h2>
57 <h2>
58 <div class="integration-icon">
58 <div class="integration-icon">
59 ${IntegrationObject.icon|n}
59 ${IntegrationObject.icon()|n}
60 </div>
60 </div>
61 ${IntegrationObject.display_name}
61 ${IntegrationObject.display_name}
62 </h2>
62 </h2>
@@ -48,25 +48,7 b''
48 </div>
48 </div>
49
49
50 <div>
50 <div>
51 ${h.secure_form(h.route_path('my_account_emails_add'), request=request)}
51 ${c.form.render() | n}
52 <div class="form">
53 <!-- fields -->
54 <div class="fields">
55 <div class="field">
56 <div class="label">
57 <label for="new_email">${_('New email address')}:</label>
58 </div>
59 <div class="input">
60 ${h.text('new_email', class_='medium')}
61 </div>
62 </div>
63 <div class="buttons">
64 ${h.submit('save',_('Add'),class_="btn")}
65 ${h.reset('reset',_('Reset'),class_="btn")}
66 </div>
67 </div>
68 </div>
69 ${h.end_form()}
70 </div>
52 </div>
71 </div>
53 </div>
72 </div>
54 </div>
@@ -6,11 +6,10 b''
6 </div>
6 </div>
7
7
8 <div class="panel-body">
8 <div class="panel-body">
9 ${h.secure_form(h.route_path('my_account_update'), class_='form', request=request)}
10 <% readonly = None %>
9 <% readonly = None %>
11 <% disabled = "" %>
10 <% disabled = "" %>
12
11
13 % if c.extern_type != 'rhodecode':
12 %if c.extern_type != 'rhodecode':
14 <% readonly = "readonly" %>
13 <% readonly = "readonly" %>
15 <% disabled = "disabled" %>
14 <% disabled = "disabled" %>
16 <div class="infoform">
15 <div class="infoform">
@@ -24,7 +23,7 b''
24 <label for="username">${_('Username')}:</label>
23 <label for="username">${_('Username')}:</label>
25 </div>
24 </div>
26 <div class="input">
25 <div class="input">
27 ${h.text('username', class_='input-valuedisplay', readonly=readonly)}
26 ${c.user.username}
28 </div>
27 </div>
29 </div>
28 </div>
30
29
@@ -33,7 +32,7 b''
33 <label for="name">${_('First Name')}:</label>
32 <label for="name">${_('First Name')}:</label>
34 </div>
33 </div>
35 <div class="input">
34 <div class="input">
36 ${h.text('firstname', class_='input-valuedisplay', readonly=readonly)}
35 ${c.user.firstname}
37 </div>
36 </div>
38 </div>
37 </div>
39
38
@@ -42,7 +41,7 b''
42 <label for="lastname">${_('Last Name')}:</label>
41 <label for="lastname">${_('Last Name')}:</label>
43 </div>
42 </div>
44 <div class="input-valuedisplay">
43 <div class="input-valuedisplay">
45 ${h.text('lastname', class_='input-valuedisplay', readonly=readonly)}
44 ${c.user.lastname}
46 </div>
45 </div>
47 </div>
46 </div>
48 </div>
47 </div>
@@ -64,48 +63,7 b''
64 %endif
63 %endif
65 </div>
64 </div>
66 </div>
65 </div>
67 <div class="field">
66 ${c.form.render()| n}
68 <div class="label">
69 <label for="username">${_('Username')}:</label>
70 </div>
71 <div class="input">
72 ${h.text('username', class_='medium%s' % disabled, readonly=readonly)}
73 ${h.hidden('extern_name', c.extern_name)}
74 ${h.hidden('extern_type', c.extern_type)}
75 </div>
76 </div>
77 <div class="field">
78 <div class="label">
79 <label for="name">${_('First Name')}:</label>
80 </div>
81 <div class="input">
82 ${h.text('firstname', class_="medium")}
83 </div>
84 </div>
85
86 <div class="field">
87 <div class="label">
88 <label for="lastname">${_('Last Name')}:</label>
89 </div>
90 <div class="input">
91 ${h.text('lastname', class_="medium")}
92 </div>
93 </div>
94
95 <div class="field">
96 <div class="label">
97 <label for="email">${_('Email')}:</label>
98 </div>
99 <div class="input">
100 ## we should be able to edit email !
101 ${h.text('email', class_="medium")}
102 </div>
103 </div>
104
105 <div class="buttons">
106 ${h.submit('save', _('Save'), class_="btn")}
107 ${h.reset('reset', _('Reset'), class_="btn")}
108 </div>
109 </div>
67 </div>
110 </div>
68 </div>
111 % endif
69 % endif
@@ -98,7 +98,7 b''
98 ${_user_group.users_group_name}
98 ${_user_group.users_group_name}
99 </a>
99 </a>
100 %else:
100 %else:
101 ${_user_group.users_group_name}
101 ${h.link_to_group(_user_group.users_group_name)}
102 %endif
102 %endif
103 </td>
103 </td>
104 <td class="td-action">
104 <td class="td-action">
@@ -65,7 +65,7 b''
65 </li>
65 </li>
66 %if c.rhodecode_db_repo.repo_type != 'svn':
66 %if c.rhodecode_db_repo.repo_type != 'svn':
67 <li class="${'active' if c.active=='remote' else ''}">
67 <li class="${'active' if c.active=='remote' else ''}">
68 <a href="${h.route_path('edit_repo_remote', repo_name=c.repo_name)}">${_('Remote')}</a>
68 <a href="${h.route_path('edit_repo_remote', repo_name=c.repo_name)}">${_('Remote sync')}</a>
69 </li>
69 </li>
70 %endif
70 %endif
71 <li class="${'active' if c.active=='statistics' else ''}">
71 <li class="${'active' if c.active=='statistics' else ''}">
@@ -79,6 +79,9 b''
79 <a href="${h.route_path('repo_reviewers', repo_name=c.repo_name)}">${_('Reviewer Rules')}</a>
79 <a href="${h.route_path('repo_reviewers', repo_name=c.repo_name)}">${_('Reviewer Rules')}</a>
80 </li>
80 </li>
81 %endif
81 %endif
82 <li class="${'active' if c.active=='automation' else ''}">
83 <a href="${h.route_path('repo_automation', repo_name=c.repo_name)}">${_('Automation')}</a>
84 </li>
82 <li class="${'active' if c.active=='maintenance' else ''}">
85 <li class="${'active' if c.active=='maintenance' else ''}">
83 <a href="${h.route_path('edit_repo_maintenance', repo_name=c.repo_name)}">${_('Maintenance')}</a>
86 <a href="${h.route_path('edit_repo_maintenance', repo_name=c.repo_name)}">${_('Maintenance')}</a>
84 </li>
87 </li>
@@ -51,3 +51,26 b''
51 </div>
51 </div>
52 </div>
52 </div>
53 </div>
53 </div>
54
55
56 <div class="panel panel-default">
57 <div class="panel-heading">
58 <h3 class="panel-title">${_('Diff Caches')}</h3>
59 </div>
60 <div class="panel-body">
61 <table class="rctable edit_cache">
62 <tr>
63 <td>${_('Cached diff name')}:</td>
64 <td>${c.rhodecode_db_repo.cached_diffs_relative_dir}</td>
65 </tr>
66 <tr>
67 <td>${_('Cached diff files')}:</td>
68 <td>${c.cached_diff_count}</td>
69 </tr>
70 <tr>
71 <td>${_('Cached diff size')}:</td>
72 <td>${h.format_byte_size(c.cached_diff_size)}</td>
73 </tr>
74 </table>
75 </div>
76 </div>
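The new Diff Caches panel displays the relative cache directory plus a file count and total size formatted with `h.format_byte_size`. How `c.cached_diff_count` and `c.cached_diff_size` are computed is not visible in this changeset; one plausible way to derive them is to walk the repository's diff-cache directory:

.. code-block:: python

    import os


    def cached_diff_stats(cache_dir):
        """Return (file_count, total_bytes) for everything under cache_dir."""
        count, size = 0, 0
        if not os.path.isdir(cache_dir):
            return count, size
        for root, _dirs, files in os.walk(cache_dir):
            for name in files:
                count += 1
                size += os.path.getsize(os.path.join(root, name))
        return count, size

    # count, size = cached_diff_stats('/storage/repo_name/.cached-diffs')  # path is illustrative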
@@ -89,7 +89,7 b''
89 ${_user_group.users_group_name}
89 ${_user_group.users_group_name}
90 </a>
90 </a>
91 %else:
91 %else:
92 ${_user_group.users_group_name}
92 ${h.link_to_group(_user_group.users_group_name)}
93 %endif
93 %endif
94 </td>
94 </td>
95 <td class="td-action">
95 <td class="td-action">
@@ -1,40 +1,68 b''
1 <div class="panel panel-default">
1 <div class="panel panel-default">
2 <div class="panel-heading">
2 <div class="panel-heading">
3 <h3 class="panel-title">${_('Remote url')}</h3>
3 <h3 class="panel-title">${_('Remote Sync')}</h3>
4 </div>
4 </div>
5 <div class="panel-body">
5 <div class="panel-body">
6
6
7 <h4>${_('Manually pull changes from external repository.')}</h4>
7 <h4>${_('Manually pull/push changes from/to external URLs.')}</h4>
8
9 %if c.rhodecode_db_repo.clone_uri:
10
11 ${_('Remote mirror url')}:
12 <a href="${c.rhodecode_db_repo.clone_uri}">${c.rhodecode_db_repo.clone_uri_hidden}</a>
13
14 <p>
15 ${_('Pull can be automated by such api call. Can be called periodically in crontab etc.')}
16 <br/>
17 <code>
18 ${h.api_call_example(method='pull', args={"repoid": c.rhodecode_db_repo.repo_name})}
19 </code>
20 </p>
21
8
22 ${h.secure_form(h.route_path('edit_repo_remote_pull', repo_name=c.repo_name), request=request)}
9 <table>
23 <div class="form">
10 <tr>
24 <div class="fields">
11 <td><div style="min-width: 80px"><strong>${_('Pull url')}</strong></div></td>
25 ${h.submit('remote_pull_%s' % c.rhodecode_db_repo.repo_name,_('Pull changes from remote location'),class_="btn btn-small",onclick="return confirm('"+_('Confirm to pull changes from remote side')+"');")}
12 <td>
26 </div>
13 % if c.rhodecode_db_repo.clone_uri:
27 </div>
14 <a href="${c.rhodecode_db_repo.clone_uri}">${c.rhodecode_db_repo.clone_uri_hidden}</a>
28 ${h.end_form()}
15 % else:
29 %else:
16 ${_('This repository does not have any pull url set.')}
17 <a href="${h.route_path('edit_repo', repo_name=c.rhodecode_db_repo.repo_name)}">${_('Set remote url.')}</a>
18 % endif
19 </td>
20 </tr>
21 % if c.rhodecode_db_repo.clone_uri:
22 <tr>
23 <td></td>
24 <td>
25 <p>
26 ${_('Pull can be automated by such api call. Can be called periodically in crontab etc.')}
27 <br/>
28 <code>
29 ${h.api_call_example(method='pull', args={"repoid": c.rhodecode_db_repo.repo_name})}
30 </code>
31 </p>
32 </td>
33 </tr>
34 <tr>
35 <td></td>
36 <td>
37 ${h.secure_form(h.route_path('edit_repo_remote_pull', repo_name=c.repo_name), request=request)}
38 <div class="form">
39 <div class="fields">
40 ${h.submit('remote_pull_%s' % c.rhodecode_db_repo.repo_name,_('Pull changes from remote location'),class_="btn btn-small",onclick="return confirm('"+_('Confirm to pull changes from remote side')+"');")}
41 </div>
42 </div>
43 ${h.end_form()}
44 </td>
45 </tr>
46 % endif
30
47
31 ${_('This repository does not have any remote mirror url set.')}
48 <tr>
32 <a href="${h.route_path('edit_repo', repo_name=c.rhodecode_db_repo.repo_name)}">${_('Set remote url.')}</a>
49 <td><div style="min-width: 80px"><strong>${_('Push url')}</strong></div></td>
33 <br/>
50 <td>
34 <br/>
51 % if c.rhodecode_db_repo.push_uri:
35 <button class="btn disabled" type="submit" disabled="disabled">
52 <a href="${c.rhodecode_db_repo.push_uri_hidden}">${c.rhodecode_db_repo.push_uri_hidden}</a>
36 ${_('Pull changes from remote location')}
53 % else:
37 </button>
54 ${_('This repository does not have any push url set.')}
38 %endif
55 <a href="${h.route_path('edit_repo', repo_name=c.rhodecode_db_repo.repo_name)}">${_('Set remote url.')}</a>
56 % endif
57 </td>
58 </tr>
59 <tr>
60 <td></td>
61 <td>
62 ${_('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')|n}
63 </td>
64 </tr>
65
66 </table>
39 </div>
67 </div>
40 </div>
68 </div>
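The pull side of the panel renders `h.api_call_example(method='pull', ...)` so the sync can be scripted, for example from cron. A hedged sketch of such a call against the JSON-RPC API; the server URL, token and repository name are placeholders, and the exact call shown by the UI should be preferred:

.. code-block:: python

    import json
    import urllib2  # Python 2, matching this codebase

    api_url = 'https://rhodecode.example.com/_admin/api'  # placeholder
    payload = {
        'id': 1,
        'auth_token': '<token with admin rights to the repository>',
        'method': 'pull',
        'args': {'repoid': 'repo_name'},
    }
    request = urllib2.Request(
        api_url, json.dumps(payload), {'Content-Type': 'application/json'})
    response = json.loads(urllib2.urlopen(request).read())
    print(response.get('result'))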
@@ -46,9 +46,10 b''
46 </div>
46 </div>
47
47
48 % if c.rhodecode_db_repo.repo_type != 'svn':
48 % if c.rhodecode_db_repo.repo_type != 'svn':
49 <% sync_link = h.literal(h.link_to('remote sync', h.route_path('edit_repo_remote', repo_name=c.repo_name))) %>
49 <div class="field">
50 <div class="field">
50 <div class="label">
51 <div class="label">
51 <label for="clone_uri">${_('Remote uri')}:</label>
52 <label for="clone_uri">${_('Remote pull uri')}:</label>
52 </div>
53 </div>
53 <div class="input">
54 <div class="input">
54 %if c.rhodecode_db_repo.clone_uri:
55 %if c.rhodecode_db_repo.clone_uri:
@@ -83,14 +84,56 b''
83 ${h.hidden('repo_clone_uri_change', 'NEW')}
84 ${h.hidden('repo_clone_uri_change', 'NEW')}
84 %endif
85 %endif
85 <p id="alter_clone_uri_help_block" class="help-block">
86 <p id="alter_clone_uri_help_block" class="help-block">
86 <% pull_link = h.literal(h.link_to('remote sync', h.route_path('edit_repo_remote', repo_name=c.repo_name))) %>
87 ${_('http[s] url the repository was imported from. This field can be used for doing {sync_link}.').format(sync_link=sync_link)|n} <br/>
87 ${_('http[s] url where from repository was imported, this field can used for doing {pull_link}.').format(pull_link=pull_link)|n} <br/>
88 ${_('This field is stored encrypted inside Database, a format of http://user:password@server.com/repo_name can be used and will be hidden from display.')}
89 </p>
90 </div>
91 </div>
92 <div class="field">
93 <div class="label">
94 <label for="push_uri">${_('Remote push uri')}:</label>
95 </div>
96 <div class="input">
97 %if c.rhodecode_db_repo.push_uri:
98 ## display, if we don't have any errors
99 % if not c.form['repo_push_uri'].error:
100 <div id="push_uri_hidden" class='text-as-placeholder'>
101 <span id="push_uri_hidden_value">${c.rhodecode_db_repo.push_uri_hidden}</span>
102 <span class="link" id="edit_push_uri"><i class="icon-edit"></i>${_('edit')}</span>
103 </div>
104 % endif
105
106 ## alter field
107 <div id="alter_push_uri" style="${'' if c.form['repo_push_uri'].error else 'display: none'}">
108 ${c.form['repo_push_uri'].render(css_class='medium', oid='push_uri', placeholder=_('enter new value, or leave empty to remove'))|n}
109 ${c.form.render_error(request, c.form['repo_push_uri'])|n}
110 % if c.form['repo_push_uri'].error:
111 ## we got an error from the form submit, which means we modify the url
112 ${h.hidden('repo_push_uri_change', 'MOD')}
113 % else:
114 ${h.hidden('repo_push_uri_change', 'OLD')}
115 % endif
116
117 % if not c.form['repo_push_uri'].error:
118 <span class="link" id="cancel_edit_push_uri">${_('cancel')}</span>
119 % endif
120
121 </div>
122 %else:
123 ## not set yet, display form to set it
124 ${c.form['repo_push_uri'].render(css_class='medium', oid='push_uri')|n}
125 ${c.form.render_error(request, c.form['repo_push_uri'])|n}
126 ${h.hidden('repo_push_uri_change', 'NEW')}
127 %endif
128 <p id="alter_push_uri_help_block" class="help-block">
129 ${_('http[s] url to sync data back. This field can be used for doing {sync_link}.').format(sync_link=sync_link)|n} <br/>
88 ${_('This field is stored encrypted inside Database, a format of http://user:password@server.com/repo_name can be used and will be hidden from display.')}
130 ${_('This field is stored encrypted inside Database, a format of http://user:password@server.com/repo_name can be used and will be hidden from display.')}
89 </p>
131 </p>
90 </div>
132 </div>
91 </div>
133 </div>
92 % else:
134 % else:
93 ${h.hidden('repo_clone_uri', '')}
135 ${h.hidden('repo_clone_uri', '')}
136 ${h.hidden('repo_push_uri', '')}
94 % endif
137 % endif
95
138
96 <div class="field">
139 <div class="field">
@@ -207,16 +250,11 b''
207 </div>
250 </div>
208
251
209 <script>
252 <script>
210 $(document).ready(function(){
253 $(document).ready(function () {
211 var cloneUrl = function() {
254 var cloneUrl = function (
212 var alterButton = $('#alter_clone_uri');
255 alterButton, editButton, cancelEditButton,
213 var editButton = $('#edit_clone_uri');
256 hiddenUrl, hiddenUrlValue, input, helpBlock, changedFlag) {
214 var cancelEditButton = $('#cancel_edit_clone_uri');
257
215 var hiddenUrl = $('#clone_uri_hidden');
216 var hiddenUrlValue = $('#clone_uri_hidden_value');
217 var input = $('#clone_uri');
218 var helpBlock = $('#alter_clone_uri_help_block');
219 var changedFlag = $('#repo_clone_uri_change');
220 var originalText = helpBlock.html();
258 var originalText = helpBlock.html();
221 var obfuscatedUrl = hiddenUrlValue.html();
259 var obfuscatedUrl = hiddenUrlValue.html();
222
260
@@ -255,7 +293,32 b''
255
293
256 setInitialState();
294 setInitialState();
257 initEvents();
295 initEvents();
258 }();
296 };
297
298
299 var alterButton = $('#alter_clone_uri');
300 var editButton = $('#edit_clone_uri');
301 var cancelEditButton = $('#cancel_edit_clone_uri');
302 var hiddenUrl = $('#clone_uri_hidden');
303 var hiddenUrlValue = $('#clone_uri_hidden_value');
304 var input = $('#clone_uri');
305 var helpBlock = $('#alter_clone_uri_help_block');
306 var changedFlag = $('#repo_clone_uri_change');
307 cloneUrl(
308 alterButton, editButton, cancelEditButton, hiddenUrl,
309 hiddenUrlValue, input, helpBlock, changedFlag);
310
311 var alterButton = $('#alter_push_uri');
312 var editButton = $('#edit_push_uri');
313 var cancelEditButton = $('#cancel_edit_push_uri');
314 var hiddenUrl = $('#push_uri_hidden');
315 var hiddenUrlValue = $('#push_uri_hidden_value');
316 var input = $('#push_uri');
317 var helpBlock = $('#alter_push_uri_help_block');
318 var changedFlag = $('#repo_push_uri_change');
319 cloneUrl(
320 alterButton, editButton, cancelEditButton, hiddenUrl,
321 hiddenUrlValue, input, helpBlock, changedFlag);
259
322
260 selectMyGroup = function(element) {
323 selectMyGroup = function(element) {
261 $("#repo_group").val($(element).data('personalGroupId')).trigger("change");
324 $("#repo_group").val($(element).data('personalGroupId')).trigger("change");
@@ -70,6 +70,13 b' autoRefresh = function(value) {'
70 var loadData = function() {
70 var loadData = function() {
71 currentRequest = $.get(url)
71 currentRequest = $.get(url)
72 .done(function(data) {
72 .done(function(data) {
73
74 if(data.indexOf('id="processTimeStamp"') === -1){
75 clearInterval(intervalID);
76 $('#procList').html('ERROR LOADING DATA. PLEASE REFRESH THE PAGE');
77 return
78 }
79
73 currentRequest = null;
80 currentRequest = null;
74 $('#procList').html(data);
81 $('#procList').html(data);
75 timeagoActivate();
82 timeagoActivate();
@@ -2,12 +2,11 b''
2 <table id="procList">
2 <table id="procList">
3 <%
3 <%
4 def get_name(proc):
4 def get_name(proc):
5 cmd = ' '.join(proc.cmdline())
5 if 'vcsserver.ini' in proc.cmd:
6 if 'vcsserver.ini' in cmd:
7 return 'VCSServer'
6 return 'VCSServer'
8 elif 'rhodecode.ini' in cmd:
7 elif 'rhodecode.ini' in proc.cmd:
9 return 'RhodeCode'
8 return 'RhodeCode'
10 return proc.name()
9 return proc.name
11 %>
10 %>
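The template now reads plain attributes (`proc.cmd`, `proc.name`, `proc.mem_rss`, `proc.mem_vms`, `proc.cpu_percent`, `proc.create_time`, `proc.children`) instead of calling psutil methods inline, which implies the process data is flattened once in the view and handed to Mako as simple objects. A sketch of such a pre-processing step; the wrapper class and function names are made up:

.. code-block:: python

    import psutil


    class ProcessInfo(object):
        """Snapshot of one process exposing only the fields the template reads."""

        def __init__(self, proc, with_children=False):
            with proc.oneshot():  # batch the psutil reads for this process
                mem = proc.memory_info()
                self.pid = proc.pid
                self.name = proc.name()
                self.cmd = ' '.join(proc.cmdline())
                self.mem_rss = mem.rss
                self.mem_vms = mem.vms
                self.cpu_percent = proc.cpu_percent()
                self.create_time = proc.create_time()
            self.children = []
            if with_children:
                self.children = [
                    ProcessInfo(child) for child in proc.children(recursive=True)]


    def gather_gunicorn_processes(master_pids):
        return [ProcessInfo(psutil.Process(pid), with_children=True)
                for pid in master_pids]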
12 <tr>
11 <tr>
13 <td colspan="8">
12 <td colspan="8">
@@ -15,10 +14,8 b''
15 </td>
14 </td>
16 </tr>
15 </tr>
17 % for proc in c.gunicorn_processes:
16 % for proc in c.gunicorn_processes:
18 <% mem = proc.memory_info()%>
19 <% children = proc.children(recursive=True) %>
20 % if children:
21
17
18 % if proc.children:
22 <tr>
19 <tr>
23 <td>
20 <td>
24 <code>
21 <code>
@@ -28,18 +25,18 b''
28 <td>
25 <td>
29 <a href="#showCommand" onclick="$('#pid'+${proc.pid}).toggle();return false"> command </a>
26 <a href="#showCommand" onclick="$('#pid'+${proc.pid}).toggle();return false"> command </a>
30 <code id="pid${proc.pid}" style="display: none">
27 <code id="pid${proc.pid}" style="display: none">
31 ${' '.join(proc.cmdline())}
28 ${proc.cmd}
32 </code>
29 </code>
33 </td>
30 </td>
34 <td></td>
31 <td></td>
35 <td>
32 <td>
36 RSS:${h.format_byte_size_binary(mem.rss)}
33 RSS:${h.format_byte_size_binary(proc.mem_rss)}
37 </td>
34 </td>
38 <td>
35 <td>
39 VMS:${h.format_byte_size_binary(mem.vms)}
36 VMS:${h.format_byte_size_binary(proc.mem_vms)}
40 </td>
37 </td>
41 <td>
38 <td>
42 AGE: ${h.age_component(h.time_to_utcdatetime(proc.create_time()))}
39 AGE: ${h.age_component(h.time_to_utcdatetime(proc.create_time))}
43 </td>
40 </td>
44 <td>
41 <td>
45 MASTER
42 MASTER
@@ -49,8 +46,7 b''
49 </td>
46 </td>
50 </tr>
47 </tr>
51 <% mem_sum = 0 %>
48 <% mem_sum = 0 %>
52 % for proc_child in children:
49 % for proc_child in proc.children:
53 <% mem = proc_child.memory_info()%>
54 <tr>
50 <tr>
55 <td>
51 <td>
56 <code>
52 <code>
@@ -60,21 +56,21 b''
60 <td>
56 <td>
61 <a href="#showCommand" onclick="$('#pid'+${proc_child.pid}).toggle();return false"> command </a>
57 <a href="#showCommand" onclick="$('#pid'+${proc_child.pid}).toggle();return false"> command </a>
62 <code id="pid${proc_child.pid}" style="display: none">
58 <code id="pid${proc_child.pid}" style="display: none">
63 ${' '.join(proc_child.cmdline())}
59 ${proc_child.cmd}
64 </code>
60 </code>
65 </td>
61 </td>
66 <td>
62 <td>
67 CPU: ${proc_child.cpu_percent()} %
63 CPU: ${proc_child.cpu_percent} %
68 </td>
64 </td>
69 <td>
65 <td>
70 RSS:${h.format_byte_size_binary(mem.rss)}
66 RSS:${h.format_byte_size_binary(proc_child.mem_rss)}
71 <% mem_sum += mem.rss %>
67 <% mem_sum += proc_child.mem_rss %>
72 </td>
68 </td>
73 <td>
69 <td>
74 VMS:${h.format_byte_size_binary(mem.vms)}
70 VMS:${h.format_byte_size_binary(proc_child.mem_vms)}
75 </td>
71 </td>
76 <td>
72 <td>
77 AGE: ${h.age_component(h.time_to_utcdatetime(proc_child.create_time()))}
73 AGE: ${h.age_component(h.time_to_utcdatetime(proc_child.create_time))}
78 </td>
74 </td>
79 <td>
75 <td>
80 <a href="#restartProcess" onclick="restart(this, ${proc_child.pid});return false">
76 <a href="#restartProcess" onclick="restart(this, ${proc_child.pid});return false">
@@ -84,7 +80,7 b''
84 </tr>
80 </tr>
85 % endfor
81 % endfor
86 <tr>
82 <tr>
87 <td colspan="2"><code>| total processes: ${len(children)}</code></td>
83 <td colspan="2"><code>| total processes: ${len(proc.children)}</code></td>
88 <td></td>
84 <td></td>
89 <td><strong>RSS:${h.format_byte_size_binary(mem_sum)}</strong></td>
85 <td><strong>RSS:${h.format_byte_size_binary(mem_sum)}</strong></td>
90 <td></td>
86 <td></td>
@@ -100,7 +100,7 b''
100 ${_user_group.users_group_name}
100 ${_user_group.users_group_name}
101 </a>
101 </a>
102 %else:
102 %else:
103 ${_user_group.users_group_name}
103 ${h.link_to_group(_user_group.users_group_name)}
104 %endif
104 %endif
105 </td>
105 </td>
106 <td class="td-action">
106 <td class="td-action">
@@ -356,8 +356,10 b''
356 <ol class="links">
356 <ol class="links">
357 <li>${h.link_to(_(u'My account'),h.route_path('my_account_profile'))}</li>
357 <li>${h.link_to(_(u'My account'),h.route_path('my_account_profile'))}</li>
358 % if c.rhodecode_user.personal_repo_group:
358 % if c.rhodecode_user.personal_repo_group:
359 <li>${h.link_to(_(u'My personal group'), h.route_path('repo_group_home', repo_group_name=c.rhodecode_user.personal_repo_group.group_name))}</li>
359 <li>${h.link_to(_(u'My personal group'), h.route_path('repo_group_home', repo_group_name=c.rhodecode_user.personal_repo_group.group_name))}</li>
360 % endif
360 % endif
361 <li>${h.link_to(_(u'Pull Requests'), h.route_path('my_account_pullrequests'))}</li>
362
361 <li class="logout">
363 <li class="logout">
362 ${h.secure_form(h.route_path('logout'), request=request)}
364 ${h.secure_form(h.route_path('logout'), request=request)}
363 ${h.submit('log_out', _(u'Sign Out'),class_="btn btn-primary")}
365 ${h.submit('log_out', _(u'Sign Out'),class_="btn btn-primary")}
@@ -312,6 +312,20 b''
312 </div>
312 </div>
313 % endif
313 % endif
314
314
315 % if display_globals or repo_type in ['hg', 'git', 'svn']:
316 <div class="panel panel-default">
317 <div class="panel-heading" id="vcs-pull-requests-options">
318 <h3 class="panel-title">${_('Diff cache')}<a class="permalink" href="#vcs-pull-requests-options"></a></h3>
319 </div>
320 <div class="panel-body">
321 <div class="checkbox">
322 ${h.checkbox('rhodecode_diff_cache' + suffix, 'True', **kwargs)}
323 <label for="rhodecode_diff_cache${suffix}">${_('Enable caching of diffs for pull requests and commits')}</label>
324 </div>
325 </div>
326 </div>
327 % endif
328
315 % if display_globals or repo_type in ['hg',]:
329 % if display_globals or repo_type in ['hg',]:
316 <div class="panel panel-default">
330 <div class="panel panel-default">
317 <div class="panel-heading" id="vcs-pull-requests-options">
331 <div class="panel-heading" id="vcs-pull-requests-options">
@@ -213,7 +213,7 b''
213 <%namespace name="cbdiffs" file="/codeblocks/diffs.mako"/>
213 <%namespace name="cbdiffs" file="/codeblocks/diffs.mako"/>
214 ${cbdiffs.render_diffset_menu()}
214 ${cbdiffs.render_diffset_menu()}
215 ${cbdiffs.render_diffset(
215 ${cbdiffs.render_diffset(
216 c.changes[c.commit.raw_id], commit=c.commit, use_comments=True)}
216 c.changes[c.commit.raw_id], commit=c.commit, use_comments=True,inline_comments=c.inline_comments )}
217 </div>
217 </div>
218
218
219 ## template for inline comment form
219 ## template for inline comment form
@@ -44,13 +44,15 b" return '%s_%s_%i' % (h.safeid(filename),"
44
44
45 # special file-comments that were deleted in previous versions
45 # special file-comments that were deleted in previous versions
46 # it's used for showing outdated comments for deleted files in a PR
46 # it's used for showing outdated comments for deleted files in a PR
47 deleted_files_comments=None
47 deleted_files_comments=None,
48
49 # for caching purposes
50 inline_comments=None
48
51
49 )">
52 )">
50
51 %if use_comments:
53 %if use_comments:
52 <div id="cb-comments-inline-container-template" class="js-template">
54 <div id="cb-comments-inline-container-template" class="js-template">
53 ${inline_comments_container([])}
55 ${inline_comments_container([], inline_comments)}
54 </div>
56 </div>
55 <div class="js-template" id="cb-comment-inline-form-template">
57 <div class="js-template" id="cb-comment-inline-form-template">
56 <div class="comment-inline-form ac">
58 <div class="comment-inline-form ac">
@@ -132,7 +134,9 b' collapse_all = len(diffset.files) > coll'
132 </h2>
134 </h2>
133 </div>
135 </div>
134
136
135 %if not diffset.files:
137 %if diffset.has_hidden_changes:
138 <p class="empty_data">${_('Some changes may be hidden')}</p>
139 %elif not diffset.files:
136 <p class="empty_data">${_('No files')}</p>
140 <p class="empty_data">${_('No files')}</p>
137 %endif
141 %endif
138
142
@@ -209,9 +213,9 b' collapse_all = len(diffset.files) > coll'
209 </td>
213 </td>
210 </tr>
214 </tr>
211 %if c.diffmode == 'unified':
215 %if c.diffmode == 'unified':
212 ${render_hunk_lines_unified(hunk, use_comments=use_comments)}
216 ${render_hunk_lines_unified(hunk, use_comments=use_comments, inline_comments=inline_comments)}
213 %elif c.diffmode == 'sideside':
217 %elif c.diffmode == 'sideside':
214 ${render_hunk_lines_sideside(hunk, use_comments=use_comments)}
218 ${render_hunk_lines_sideside(hunk, use_comments=use_comments, inline_comments=inline_comments)}
215 %else:
219 %else:
216 <tr class="cb-line">
220 <tr class="cb-line">
217 <td>unknown diff mode</td>
221 <td>unknown diff mode</td>
@@ -228,7 +232,7 b' collapse_all = len(diffset.files) > coll'
228 <td class="cb-lineno cb-context"></td>
232 <td class="cb-lineno cb-context"></td>
229 <td class="cb-lineno cb-context"></td>
233 <td class="cb-lineno cb-context"></td>
230 <td class="cb-content cb-context">
234 <td class="cb-content cb-context">
231 ${inline_comments_container(comments)}
235 ${inline_comments_container(comments, inline_comments)}
232 </td>
236 </td>
233 </tr>
237 </tr>
234 %elif c.diffmode == 'sideside':
238 %elif c.diffmode == 'sideside':
@@ -237,7 +241,7 b' collapse_all = len(diffset.files) > coll'
237 <td class="cb-lineno cb-context"></td>
241 <td class="cb-lineno cb-context"></td>
238 <td class="cb-content cb-context">
242 <td class="cb-content cb-context">
239 % if lineno.startswith('o'):
243 % if lineno.startswith('o'):
240 ${inline_comments_container(comments)}
244 ${inline_comments_container(comments, inline_comments)}
241 % endif
245 % endif
242 </td>
246 </td>
243
247
@@ -245,7 +249,7 b' collapse_all = len(diffset.files) > coll'
245 <td class="cb-lineno cb-context"></td>
249 <td class="cb-lineno cb-context"></td>
246 <td class="cb-content cb-context">
250 <td class="cb-content cb-context">
247 % if lineno.startswith('n'):
251 % if lineno.startswith('n'):
248 ${inline_comments_container(comments)}
252 ${inline_comments_container(comments, inline_comments)}
249 % endif
253 % endif
250 </td>
254 </td>
251 </tr>
255 </tr>
@@ -296,7 +300,7 b' collapse_all = len(diffset.files) > coll'
296 <td class="cb-lineno cb-context"></td>
300 <td class="cb-lineno cb-context"></td>
297 <td class="cb-lineno cb-context"></td>
301 <td class="cb-lineno cb-context"></td>
298 <td class="cb-content cb-context">
302 <td class="cb-content cb-context">
299 ${inline_comments_container(comments_dict['comments'])}
303 ${inline_comments_container(comments_dict['comments'], inline_comments)}
300 </td>
304 </td>
301 </tr>
305 </tr>
302 %elif c.diffmode == 'sideside':
306 %elif c.diffmode == 'sideside':
@@ -308,7 +312,7 b' collapse_all = len(diffset.files) > coll'
308 <td class="cb-data cb-context"></td>
312 <td class="cb-data cb-context"></td>
309 <td class="cb-lineno cb-context"></td>
313 <td class="cb-lineno cb-context"></td>
310 <td class="cb-content cb-context">
314 <td class="cb-content cb-context">
311 ${inline_comments_container(comments_dict['comments'])}
315 ${inline_comments_container(comments_dict['comments'], inline_comments)}
312 </td>
316 </td>
313 </tr>
317 </tr>
314 %endif
318 %endif
@@ -482,12 +486,11 b' from rhodecode.lib.diffs import NEW_FILE'
482 </%def>
486 </%def>
483
487
484
488
485 <%def name="inline_comments_container(comments)">
489 <%def name="inline_comments_container(comments, inline_comments)">
486 <div class="inline-comments">
490 <div class="inline-comments">
487 %for comment in comments:
491 %for comment in comments:
488 ${commentblock.comment_block(comment, inline=True)}
492 ${commentblock.comment_block(comment, inline=True)}
489 %endfor
493 %endfor
490
491 % if comments and comments[-1].outdated:
494 % if comments and comments[-1].outdated:
492 <span class="btn btn-secondary cb-comment-add-button comment-outdated}"
495 <span class="btn btn-secondary cb-comment-add-button comment-outdated}"
493 style="display: none;}">
496 style="display: none;}">
@@ -503,8 +506,23 b' from rhodecode.lib.diffs import NEW_FILE'
503 </div>
506 </div>
504 </%def>
507 </%def>
505
508
509 <%!
510 def get_comments_for(comments, filename, line_version, line_number):
511 if hasattr(filename, 'unicode_path'):
512 filename = filename.unicode_path
506
513
507 <%def name="render_hunk_lines_sideside(hunk, use_comments=False)">
514 if not isinstance(filename, basestring):
515 return None
516
517 line_key = '{}{}'.format(line_version, line_number)
518 if comments and filename in comments:
519 file_comments = comments[filename]
520 if line_key in file_comments:
521 return file_comments[line_key]
522 %>
523
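`get_comments_for` expects `inline_comments` to be a nested mapping keyed first by file path and then by a version-prefixed line key ('o' for the original side, 'n' for the modified side), so cached diffsets can be rendered without comments being attached to each line object. A small sketch of building and querying such an index; the comment attribute names are stand-ins:

.. code-block:: python

    def build_inline_comments_index(comments):
        """Group comments into {file_path: {'<o|n><lineno>': [comments]}}."""
        index = {}
        for comment in comments:
            # each comment is assumed to expose f_path and a line id like 'n12'
            index.setdefault(comment.f_path, {}) \
                 .setdefault(comment.line_no, []) \
                 .append(comment)
        return index

    # comments for line 12 on the new side of setup.py, mirroring get_comments_for():
    # build_inline_comments_index(all_comments).get('setup.py', {}).get('n12')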
524 <%def name="render_hunk_lines_sideside(hunk, use_comments=False, inline_comments=None)">
525
508 %for i, line in enumerate(hunk.sideside):
526 %for i, line in enumerate(hunk.sideside):
509 <%
527 <%
510 old_line_anchor, new_line_anchor = None, None
528 old_line_anchor, new_line_anchor = None, None
@@ -516,16 +534,25 b' from rhodecode.lib.diffs import NEW_FILE'
516
534
517 <tr class="cb-line">
535 <tr class="cb-line">
518 <td class="cb-data ${action_class(line.original.action)}"
536 <td class="cb-data ${action_class(line.original.action)}"
519 data-line-number="${line.original.lineno}"
537 data-line-no="${line.original.lineno}"
520 >
538 >
521 <div>
539 <div>
522 %if line.original.comments:
540 <% loc = None %>
523 <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
541 %if line.original.get_comment_args:
542 <% loc = get_comments_for(inline_comments, *line.original.get_comment_args) %>
543 %endif
544 %if loc:
545 <% has_outdated = any([x.outdated for x in loc]) %>
546 % if has_outdated:
547 <i title="${_('comments including outdated')}: ${len(loc)}" class="icon-comment_toggle" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
548 % else:
549 <i title="${_('comments')}: ${len(loc)}" class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
550 % endif
524 %endif
551 %endif
525 </div>
552 </div>
526 </td>
553 </td>
527 <td class="cb-lineno ${action_class(line.original.action)}"
554 <td class="cb-lineno ${action_class(line.original.action)}"
528 data-line-number="${line.original.lineno}"
555 data-line-no="${line.original.lineno}"
529 %if old_line_anchor:
556 %if old_line_anchor:
530 id="${old_line_anchor}"
557 id="${old_line_anchor}"
531 %endif
558 %endif
@@ -535,27 +562,40 b' from rhodecode.lib.diffs import NEW_FILE'
535 %endif
562 %endif
536 </td>
563 </td>
537 <td class="cb-content ${action_class(line.original.action)}"
564 <td class="cb-content ${action_class(line.original.action)}"
538 data-line-number="o${line.original.lineno}"
565 data-line-no="o${line.original.lineno}"
539 >
566 >
540 %if use_comments and line.original.lineno:
567 %if use_comments and line.original.lineno:
541 ${render_add_comment_button()}
568 ${render_add_comment_button()}
542 %endif
569 %endif
543 <span class="cb-code">${line.original.action} ${line.original.content or '' | n}</span>
570 <span class="cb-code">${line.original.action} ${line.original.content or '' | n}</span>
544 %if use_comments and line.original.lineno and line.original.comments:
571
545 ${inline_comments_container(line.original.comments)}
572 %if use_comments and line.original.lineno and loc:
573 ${inline_comments_container(loc, inline_comments)}
546 %endif
574 %endif
575
547 </td>
576 </td>
548 <td class="cb-data ${action_class(line.modified.action)}"
577 <td class="cb-data ${action_class(line.modified.action)}"
549 data-line-number="${line.modified.lineno}"
578 data-line-no="${line.modified.lineno}"
550 >
579 >
551 <div>
580 <div>
552 %if line.modified.comments:
581
553 <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
582 %if line.modified.get_comment_args:
583 <% lmc = get_comments_for(inline_comments, *line.modified.get_comment_args) %>
584 %else:
585 <% lmc = None%>
586 %endif
587 %if lmc:
588 <% has_outdated = any([x.outdated for x in lmc]) %>
589 % if has_outdated:
590 <i title="${_('comments including outdated')}:${len(lmc)}" class="icon-comment_toggle" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
591 % else:
592 <i title="${_('comments')}: ${len(lmc)}" class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
593 % endif
554 %endif
594 %endif
555 </div>
595 </div>
556 </td>
596 </td>
557 <td class="cb-lineno ${action_class(line.modified.action)}"
597 <td class="cb-lineno ${action_class(line.modified.action)}"
558 data-line-number="${line.modified.lineno}"
598 data-line-no="${line.modified.lineno}"
559 %if new_line_anchor:
599 %if new_line_anchor:
560 id="${new_line_anchor}"
600 id="${new_line_anchor}"
561 %endif
601 %endif
@@ -565,14 +605,14 b' from rhodecode.lib.diffs import NEW_FILE'
565 %endif
605 %endif
566 </td>
606 </td>
567 <td class="cb-content ${action_class(line.modified.action)}"
607 <td class="cb-content ${action_class(line.modified.action)}"
568 data-line-number="n${line.modified.lineno}"
608 data-line-no="n${line.modified.lineno}"
569 >
609 >
570 %if use_comments and line.modified.lineno:
610 %if use_comments and line.modified.lineno:
571 ${render_add_comment_button()}
611 ${render_add_comment_button()}
572 %endif
612 %endif
573 <span class="cb-code">${line.modified.action} ${line.modified.content or '' | n}</span>
613 <span class="cb-code">${line.modified.action} ${line.modified.content or '' | n}</span>
574 %if use_comments and line.modified.lineno and line.modified.comments:
614 %if use_comments and line.modified.lineno and lmc:
575 ${inline_comments_container(line.modified.comments)}
615 ${inline_comments_container(lmc, inline_comments)}
576 %endif
616 %endif
577 </td>
617 </td>
578 </tr>
618 </tr>
@@ -580,8 +620,8 b' from rhodecode.lib.diffs import NEW_FILE'
580 </%def>
620 </%def>
581
621
582
622
583 <%def name="render_hunk_lines_unified(hunk, use_comments=False)">
623 <%def name="render_hunk_lines_unified(hunk, use_comments=False, inline_comments=None)">
584 %for old_line_no, new_line_no, action, content, comments in hunk.unified:
624 %for old_line_no, new_line_no, action, content, comments_args in hunk.unified:
585 <%
625 <%
586 old_line_anchor, new_line_anchor = None, None
626 old_line_anchor, new_line_anchor = None, None
587 if old_line_no:
627 if old_line_no:
@@ -592,13 +632,25 b' from rhodecode.lib.diffs import NEW_FILE'
592 <tr class="cb-line">
632 <tr class="cb-line">
593 <td class="cb-data ${action_class(action)}">
633 <td class="cb-data ${action_class(action)}">
594 <div>
634 <div>
595 %if comments:
635
596 <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
636 %if comments_args:
637 <% comments = get_comments_for(inline_comments, *comments_args) %>
638 %else:
639 <% comments = None%>
597 %endif
640 %endif
641
642 % if comments:
643 <% has_outdated = any([x.outdated for x in comments]) %>
644 % if has_outdated:
645 <i title="${_('comments including outdated')}:${len(comments)}" class="icon-comment_toggle" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
646 % else:
647 <i title="${_('comments')}: ${len(comments)}" class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
648 % endif
649 % endif
598 </div>
650 </div>
599 </td>
651 </td>
600 <td class="cb-lineno ${action_class(action)}"
652 <td class="cb-lineno ${action_class(action)}"
601 data-line-number="${old_line_no}"
653 data-line-no="${old_line_no}"
602 %if old_line_anchor:
654 %if old_line_anchor:
603 id="${old_line_anchor}"
655 id="${old_line_anchor}"
604 %endif
656 %endif
@@ -608,7 +660,7 b' from rhodecode.lib.diffs import NEW_FILE'
608 %endif
660 %endif
609 </td>
661 </td>
610 <td class="cb-lineno ${action_class(action)}"
662 <td class="cb-lineno ${action_class(action)}"
611 data-line-number="${new_line_no}"
663 data-line-no="${new_line_no}"
612 %if new_line_anchor:
664 %if new_line_anchor:
613 id="${new_line_anchor}"
665 id="${new_line_anchor}"
614 %endif
666 %endif
@@ -618,14 +670,14 b' from rhodecode.lib.diffs import NEW_FILE'
618 %endif
670 %endif
619 </td>
671 </td>
620 <td class="cb-content ${action_class(action)}"
672 <td class="cb-content ${action_class(action)}"
621 data-line-number="${new_line_no and 'n' or 'o'}${new_line_no or old_line_no}"
673 data-line-no="${new_line_no and 'n' or 'o'}${new_line_no or old_line_no}"
622 >
674 >
623 %if use_comments:
675 %if use_comments:
624 ${render_add_comment_button()}
676 ${render_add_comment_button()}
625 %endif
677 %endif
626 <span class="cb-code">${action} ${content or '' | n}</span>
678 <span class="cb-code">${action} ${content or '' | n}</span>
627 %if use_comments and comments:
679 %if use_comments and comments:
628 ${inline_comments_container(comments)}
680 ${inline_comments_container(comments, inline_comments)}
629 %endif
681 %endif
630 </td>
682 </td>
631 </tr>
683 </tr>
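
The hunk above moves inline-comment lookup off the per-line ``line.original.comments`` / ``line.modified.comments`` attributes and onto a pre-built ``inline_comments`` mapping queried through the new ``get_comments_for`` helper. A minimal sketch of that lookup, assuming the mapping shape ``{filename: {'<version><line_no>': [comments]}}`` and that the version prefixes mirror the ``o``/``n`` markers used in the ``data-line-no`` attributes (the exact values produced by ``get_comment_args`` are not shown here):

.. code-block:: python

    # Illustrative sketch only -- mirrors the get_comments_for helper added in
    # the hunk above; the 'o<line>' / 'n<line>' key format is an assumption.
    def get_comments_for(comments, filename, line_version, line_number):
        # some callers pass file objects; fall back to their path
        if hasattr(filename, 'unicode_path'):
            filename = filename.unicode_path

        if not isinstance(filename, str):  # `basestring` in the Python 2 original
            return None

        line_key = '{}{}'.format(line_version, line_number)
        if comments and filename in comments:
            file_comments = comments[filename]
            if line_key in file_comments:
                return file_comments[line_key]


    # example: comments attached to the new (right-hand) line 12 of setup.py
    inline_comments = {'setup.py': {'n12': ['looks good', 'please add a test']}}
    print(get_comments_for(inline_comments, 'setup.py', 'n', 12))
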
@@ -282,10 +282,10 b''
282 ${base.gravatar_with_user(username, 16)}
282 ${base.gravatar_with_user(username, 16)}
283 </%def>
283 </%def>
284
284
285 <%def name="user_group_name(user_group_id, user_group_name)">
285 <%def name="user_group_name(user_group_name)">
286 <div>
286 <div>
287 <a href="${h.route_path('edit_user_group', user_group_id=user_group_id)}">
287 <i class="icon-group" title="${_('User group')}"></i>
288 <i class="icon-group" title="${_('User group')}"></i> ${user_group_name}</a>
288 ${h.link_to_group(user_group_name)}
289 </div>
289 </div>
290 </%def>
290 </%def>
291
291
@@ -17,6 +17,10 b''
17 tag: ${tag} <br/>
17 tag: ${tag} <br/>
18 % endfor
18 % endfor
19
19
20 % if has_hidden_changes:
21 Has hidden changes<br/>
22 % endif
23
20 commit: <a href="${h.route_url('repo_commit', repo_name=c.rhodecode_db_repo.repo_name, commit_id=commit.raw_id)}">${h.show_id(commit)}</a>
24 commit: <a href="${h.route_url('repo_commit', repo_name=c.rhodecode_db_repo.repo_name, commit_id=commit.raw_id)}">${h.show_id(commit)}</a>
21 <pre>
25 <pre>
22 ${h.urlify_commit_message(commit.message)}
26 ${h.urlify_commit_message(commit.message)}
@@ -29,6 +33,6 b' commit: <a href="${h.route_url(\'repo_com'
29 % endfor
33 % endfor
30
34
31 % if feed_include_diff:
35 % if feed_include_diff:
32 ${diff_processor.as_raw()}
36 ${c.path_filter.get_raw_patch(diff_processor)}
33 % endif
37 % endif
34 </pre>
38 </pre>
@@ -1,20 +1,26 b''
1 <span tal:define="name name|field.name;
2 true_val true_val|field.widget.true_val;
3 css_class css_class|field.widget.css_class;
4 style style|field.widget.style;
5 oid oid|field.oid;
6 help_block help_block|field.widget.help_block|'';
7 "
8 tal:omit-tag="">
9
1 <div class="checkbox">
10 <div class="checkbox">
2 <input tal:define="name name|field.name;
11 <input type="checkbox" name="${name}" value="${true_val}"
3 true_val true_val|field.widget.true_val;
12 id="${oid}"
4 css_class css_class|field.widget.css_class;
13 tal:attributes="checked cstruct == true_val;
5 style style|field.widget.style;
14 class css_class;
6 oid oid|field.oid"
15 style style;"
7 type="checkbox"
16 />
8 name="${name}" value="${true_val}"
17 <p tal:condition="help_block" class="help-block">${help_block}</p>
9 id="${oid}"
10 tal:attributes="checked cstruct == true_val;
11 class css_class;
12 style style;" />
13
14 <label for="${field.oid}">
18 <label for="${field.oid}">
15 <span tal:condition="hasattr(field, 'schema') and hasattr(field.schema, 'label')"
19 <span tal:condition="hasattr(field, 'schema') and hasattr(field.schema, 'label')"
16 tal:replace="field.schema.label" class="checkbox-label" >
20 tal:replace="field.schema.label" class="checkbox-label" >
17 </span>
21 </span>
18
22
19 </label>
23 </label>
20 </div> No newline at end of file
24 </div>
25
26 </span> No newline at end of file
@@ -5,6 +5,8 b''
5 name name|field.name;
5 name name|field.name;
6 style style|field.widget.style;
6 style style|field.widget.style;
7 help_block help_block|field.widget.help_block|'';
7 help_block help_block|field.widget.help_block|'';
8 help_block_collapsable_name help_block_collapsable_name|field.widget.help_block_collapsable_name|'';
9 help_block_collapsable help_block_collapsable|field.widget.help_block_collapsable|'';
8 codemirror_options codemirror_options|field.widget.codemirror_options|{};
10 codemirror_options codemirror_options|field.widget.codemirror_options|{};
9 codemirror_mode codemirror_mode|field.widget.codemirror_mode|''
11 codemirror_mode codemirror_mode|field.widget.codemirror_mode|''
10 ">
12 ">
@@ -17,6 +19,9 b''
17 name="${name}">${cstruct}</textarea>
19 name="${name}">${cstruct}</textarea>
18
20
19 <p tal:condition="help_block" class="help-block">${help_block}</p>
21 <p tal:condition="help_block" class="help-block">${help_block}</p>
22 <span tal:condition="help_block_collapsable" class="help-block pre-formatting"><a href="#showVars" onclick="$('#help_block_${oid}').toggle(); return false">${help_block_collapsable_name}</a>
23 <p id="help_block_${oid}" style="display: none">${help_block_collapsable}</p>
24 </span>
20 <script type="text/javascript">
25 <script type="text/javascript">
21 deform.addCallback(
26 deform.addCallback(
22 '${oid}',
27 '${oid}',
@@ -6,21 +6,25 b''
6 mask_placeholder mask_placeholder|field.widget.mask_placeholder;
6 mask_placeholder mask_placeholder|field.widget.mask_placeholder;
7 style style|field.widget.style;
7 style style|field.widget.style;
8 help_block help_block|field.widget.help_block|'';
8 help_block help_block|field.widget.help_block|'';
9 "
9 "
10 tal:omit-tag="">
10 tal:omit-tag="">
11 <input type="text" name="${name}" value="${cstruct}"
11
12 tal:attributes="class string: form-control ${css_class or ''};
12 <input type="text" name="${name}" value="${cstruct}"
13 style style"
13 id="${oid}"
14 placeholder="${placeholder}"
14 placeholder="${placeholder}"
15 id="${oid}"/>
15
16 tal:attributes="class string: form-control ${css_class or ''};
17 style style"
18 />
16
19
17 <p tal:condition="help_block" class="help-block">${help_block}</p>
20 <p tal:condition="help_block" class="help-block">${help_block}</p>
18 <script tal:condition="mask" type="text/javascript">
21 <script tal:condition="mask" type="text/javascript">
19 deform.addCallback(
22 deform.addCallback(
20 '${oid}',
23 '${oid}',
21 function (oid) {
24 function (oid) {
22 $("#" + oid).mask("${mask}",
25 $("#" + oid).mask("${mask}",
23 {placeholder:"${mask_placeholder}"});
26 {placeholder:"${mask_placeholder}"});
24 });
27 });
25 </script>
28 </script>
29
26 </span> No newline at end of file
30 </span>
@@ -570,7 +570,8 b''
570 c.diffset, use_comments=True,
570 c.diffset, use_comments=True,
571 collapse_when_files_over=30,
571 collapse_when_files_over=30,
572 disable_new_comments=not c.allowed_to_comment,
572 disable_new_comments=not c.allowed_to_comment,
573 deleted_files_comments=c.deleted_files_comments)}
573 deleted_files_comments=c.deleted_files_comments,
574 inline_comments=c.inline_comments)}
574 </div>
575 </div>
575 % else:
576 % else:
576 ## skipping commits we need to clear the view for missing commits
577 ## skipping commits we need to clear the view for missing commits
@@ -22,6 +22,8 b''
22 - Mercurial largefiles
22 - Mercurial largefiles
23 - Git LFS
23 - Git LFS
24 </pre>
24 </pre>
25 <br/>
26 Requirement error: ${c.repository_requirements_missing.get('error')}
25 </div>
27 </div>
26
28
27 </%def>
29 </%def>
@@ -1,14 +1,14 b''
1 <%inherit file="/base/base.mako"/>
1 <%inherit file="/base/base.mako"/>
2
2
3 <%def name="title()">
3 <%def name="title()">
4 ${c.user.username}
4 ${_('User')}: ${c.user.username}
5 %if c.rhodecode_name:
5 %if c.rhodecode_name:
6 &middot; ${h.branding(c.rhodecode_name)}
6 &middot; ${h.branding(c.rhodecode_name)}
7 %endif
7 %endif
8 </%def>
8 </%def>
9
9
10 <%def name="breadcrumbs_links()">
10 <%def name="breadcrumbs_links()">
11 ${c.user.username}
11 ${_('User')}: ${c.user.username}
12 </%def>
12 </%def>
13
13
14 <%def name="menu_bar_nav()">
14 <%def name="menu_bar_nav()">
@@ -2,7 +2,7 b''
2
2
3 <div class="panel panel-default user-profile">
3 <div class="panel panel-default user-profile">
4 <div class="panel-heading">
4 <div class="panel-heading">
5 <h3 class="panel-title">${_('Profile')}</h3>
5 <h3 class="panel-title">${_('User Profile')}</h3>
6 %if h.HasPermissionAny('hg.admin')():
6 %if h.HasPermissionAny('hg.admin')():
7 ${h.link_to(_('Edit'), h.route_path('user_edit', user_id=c.user.user_id), class_='panel-edit')}
7 ${h.link_to(_('Edit'), h.route_path('user_edit', user_id=c.user.user_id), class_='panel-edit')}
8 %endif
8 %endif
@@ -41,7 +41,7 b' log = logging.getLogger(__name__)'
41
41
42 __all__ = [
42 __all__ = [
43 'get_new_dir', 'TestController',
43 'get_new_dir', 'TestController',
44 'link_to', 'ldap_lib_installed', 'clear_all_caches',
44 'link_to', 'clear_all_caches',
45 'assert_session_flash', 'login_user', 'no_newline_id_generator',
45 'assert_session_flash', 'login_user', 'no_newline_id_generator',
46 'TESTS_TMP_PATH', 'HG_REPO', 'GIT_REPO', 'SVN_REPO',
46 'TESTS_TMP_PATH', 'HG_REPO', 'GIT_REPO', 'SVN_REPO',
47 'NEW_HG_REPO', 'NEW_GIT_REPO',
47 'NEW_HG_REPO', 'NEW_GIT_REPO',
@@ -95,17 +95,6 b" TEST_HG_REPO_PULL = jn(TESTS_TMP_PATH, '"
95 TEST_REPO_PREFIX = 'vcs-test'
95 TEST_REPO_PREFIX = 'vcs-test'
96
96
97
97
98 # skip ldap tests if LDAP lib is not installed
99 ldap_lib_installed = False
100 try:
101 import ldap
102 ldap_lib_installed = True
103 except ImportError:
104 ldap = None
105 # means that python-ldap is not installed
106 pass
107
108
109 def clear_all_caches():
98 def clear_all_caches():
110 from beaker.cache import cache_managers
99 from beaker.cache import cache_managers
111 for _cache in cache_managers.values():
100 for _cache in cache_managers.values():
@@ -100,7 +100,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter'
100 }
100 }
101
101
102 log.debug('EXTERNAL user: \n%s' % formatted_json(user_attrs))
102 log.debug('EXTERNAL user: \n%s' % formatted_json(user_attrs))
103 log.info('user %s authenticated correctly' % user_attrs['username'])
103 log.info('user `%s` authenticated correctly' % user_attrs['username'])
104
104
105 return user_attrs
105 return user_attrs
106
106
@@ -86,7 +86,8 b' class DBBackend(object):'
86 _store = os.path.dirname(os.path.abspath(__file__))
86 _store = os.path.dirname(os.path.abspath(__file__))
87 _type = None
87 _type = None
88 _base_ini_config = [{'app:main': {'vcs.start_server': 'false',
88 _base_ini_config = [{'app:main': {'vcs.start_server': 'false',
89 'startup.import_repos': 'false'}}]
89 'startup.import_repos': 'false',
90 'is_test': 'False'}}]
90 _db_url = [{'app:main': {'sqlalchemy.db1.url': ''}}]
91 _db_url = [{'app:main': {'sqlalchemy.db1.url': ''}}]
91 _base_db_name = 'rhodecode_test_db_backend'
92 _base_db_name = 'rhodecode_test_db_backend'
92
93
@@ -152,6 +153,9 b' class DBBackend(object):'
152 print(self.stderr)
153 print(self.stderr)
153 raise AssertionError('non 0 retcode:{}'.format(self.p.returncode))
154 raise AssertionError('non 0 retcode:{}'.format(self.p.returncode))
154
155
156 def assert_correct_output(self, stdout, version):
157 assert 'UPGRADE FOR STEP {} COMPLETED'.format(version) in stdout
158
155 def setup_rhodecode_db(self, ini_params=None, env=None):
159 def setup_rhodecode_db(self, ini_params=None, env=None):
156 if not ini_params:
160 if not ini_params:
157 ini_params = self._base_ini_config
161 ini_params = self._base_ini_config
@@ -167,7 +171,7 b' class DBBackend(object):'
167 if not os.path.isdir(self._repos_git_lfs_store):
171 if not os.path.isdir(self._repos_git_lfs_store):
168 os.makedirs(self._repos_git_lfs_store)
172 os.makedirs(self._repos_git_lfs_store)
169
173
170 self.execute(
174 return self.execute(
171 "rc-setup-app {0} --user=marcink "
175 "rc-setup-app {0} --user=marcink "
172 "--email=marcin@rhodeocode.com --password={1} "
176 "--email=marcin@rhodeocode.com --password={1} "
173 "--repos={2} --force-yes".format(
177 "--repos={2} --force-yes".format(
@@ -183,7 +187,8 b' class DBBackend(object):'
183 with test_ini as ini_file:
187 with test_ini as ini_file:
184 if not os.path.isdir(self._repos_location):
188 if not os.path.isdir(self._repos_location):
185 os.makedirs(self._repos_location)
189 os.makedirs(self._repos_location)
186 self.execute(
190
191 return self.execute(
187 "rc-upgrade-db {0} --force-yes".format(ini_file))
192 "rc-upgrade-db {0} --force-yes".format(ini_file))
188
193
189 def setup_db(self):
194 def setup_db(self):
@@ -226,12 +231,11 b' class SQLiteDBBackend(DBBackend):'
226
231
227 def import_dump(self, dumpname):
232 def import_dump(self, dumpname):
228 dump = os.path.join(self.fixture_store, dumpname)
233 dump = os.path.join(self.fixture_store, dumpname)
229 shutil.copy(
234 target = os.path.join(self._basetemp, '{0.db_name}.sqlite'.format(self))
230 dump,
235 return self.execute('cp -v {} {}'.format(dump, target))
231 os.path.join(self._basetemp, '{0.db_name}.sqlite'.format(self)))
232
236
233 def teardown_db(self):
237 def teardown_db(self):
234 self.execute("rm -rf {}.sqlite".format(
238 return self.execute("rm -rf {}.sqlite".format(
235 os.path.join(self._basetemp, self.db_name)))
239 os.path.join(self._basetemp, self.db_name)))
236
240
237
241
@@ -246,16 +250,16 b' class MySQLDBBackend(DBBackend):'
246 # mysqldump -uroot -pqweqwe $TEST_DB_NAME
250 # mysqldump -uroot -pqweqwe $TEST_DB_NAME
247 self._db_url = [{'app:main': {
251 self._db_url = [{'app:main': {
248 'sqlalchemy.db1.url': self.connection_string}}]
252 'sqlalchemy.db1.url': self.connection_string}}]
249 self.execute("mysql -v -u{} -p{} -e 'create database '{}';'".format(
253 return self.execute("mysql -v -u{} -p{} -e 'create database '{}';'".format(
250 self.user, self.password, self.db_name))
254 self.user, self.password, self.db_name))
251
255
252 def import_dump(self, dumpname):
256 def import_dump(self, dumpname):
253 dump = os.path.join(self.fixture_store, dumpname)
257 dump = os.path.join(self.fixture_store, dumpname)
254 self.execute("mysql -u{} -p{} {} < {}".format(
258 return self.execute("mysql -u{} -p{} {} < {}".format(
255 self.user, self.password, self.db_name, dump))
259 self.user, self.password, self.db_name, dump))
256
260
257 def teardown_db(self):
261 def teardown_db(self):
258 self.execute("mysql -v -u{} -p{} -e 'drop database '{}';'".format(
262 return self.execute("mysql -v -u{} -p{} -e 'drop database '{}';'".format(
259 self.user, self.password, self.db_name))
263 self.user, self.password, self.db_name))
260
264
261
265
@@ -271,18 +275,18 b' class PostgresDBBackend(DBBackend):'
271 self._db_url = [{'app:main': {
275 self._db_url = [{'app:main': {
272 'sqlalchemy.db1.url':
276 'sqlalchemy.db1.url':
273 self.connection_string}}]
277 self.connection_string}}]
274 self.execute("PGPASSWORD={} psql -U {} -h localhost "
278 return self.execute("PGPASSWORD={} psql -U {} -h localhost "
275 "-c 'create database '{}';'".format(
279 "-c 'create database '{}';'".format(
276 self.password, self.user, self.db_name))
280 self.password, self.user, self.db_name))
277
281
278 def teardown_db(self):
282 def teardown_db(self):
279 self.execute("PGPASSWORD={} psql -U {} -h localhost "
283 return self.execute("PGPASSWORD={} psql -U {} -h localhost "
280 "-c 'drop database if exists '{}';'".format(
284 "-c 'drop database if exists '{}';'".format(
281 self.password, self.user, self.db_name))
285 self.password, self.user, self.db_name))
282
286
283 def import_dump(self, dumpname):
287 def import_dump(self, dumpname):
284 dump = os.path.join(self.fixture_store, dumpname)
288 dump = os.path.join(self.fixture_store, dumpname)
285 self.execute(
289 return self.execute(
286 "PGPASSWORD={} psql -U {} -h localhost -d {} -1 "
290 "PGPASSWORD={} psql -U {} -h localhost -d {} -1 "
287 "-f {}".format(
291 "-f {}".format(
288 self.password, self.user, self.db_name, dump))
292 self.password, self.user, self.db_name, dump))
@@ -59,5 +59,7 b' def _run_migration_test(db_backend, dump'
59 db_backend.assert_returncode_success()
59 db_backend.assert_returncode_success()
60
60
61 db_backend.import_dump(dumpname)
61 db_backend.import_dump(dumpname)
62 db_backend.upgrade_database()
62 stdout, stderr = db_backend.upgrade_database()
63
64 db_backend.assert_correct_output(stdout+stderr, version='16')
63 db_backend.assert_returncode_success()
65 db_backend.assert_returncode_success()
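
The backend changes above make ``setup_db``, ``import_dump``, ``upgrade_database`` and ``teardown_db`` return the result of ``execute()``, so the migration test can assert on the command's textual output in addition to its return code. A small sketch of that check, with the assertion text taken from the ``assert_correct_output`` helper shown in the diff:

.. code-block:: python

    # Sketch of the output check enabled by returning (stdout, stderr) from the
    # DB backend helpers; db_backend.upgrade_database() itself is not shown here.
    def assert_correct_output(stdout, version):
        assert 'UPGRADE FOR STEP {} COMPLETED'.format(version) in stdout


    # usage with a sample captured output line
    sample_output = 'UPGRADE FOR STEP 16 COMPLETED'
    assert_correct_output(sample_output, version='16')
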
@@ -30,7 +30,7 b' import shutil'
30 import configobj
30 import configobj
31
31
32 from rhodecode.tests import *
32 from rhodecode.tests import *
33 from rhodecode.model.db import Repository, User, RepoGroup, UserGroup, Gist
33 from rhodecode.model.db import Repository, User, RepoGroup, UserGroup, Gist, UserEmailMap
34 from rhodecode.model.meta import Session
34 from rhodecode.model.meta import Session
35 from rhodecode.model.repo import RepoModel
35 from rhodecode.model.repo import RepoModel
36 from rhodecode.model.user import UserModel
36 from rhodecode.model.user import UserModel
@@ -125,6 +125,7 b' class Fixture(object):'
125 'repo_name': None,
125 'repo_name': None,
126 'repo_type': 'hg',
126 'repo_type': 'hg',
127 'clone_uri': '',
127 'clone_uri': '',
128 'push_uri': '',
128 'repo_group': '-1',
129 'repo_group': '-1',
129 'repo_description': 'DESC',
130 'repo_description': 'DESC',
130 'repo_private': False,
131 'repo_private': False,
@@ -274,6 +275,13 b' class Fixture(object):'
274 UserModel().delete(userid)
275 UserModel().delete(userid)
275 Session().commit()
276 Session().commit()
276
277
278 def create_additional_user_email(self, user, email):
279 uem = UserEmailMap()
280 uem.user = user
281 uem.email = email
282 Session().add(uem)
283 return uem
284
277 def destroy_users(self, userid_iter):
285 def destroy_users(self, userid_iter):
278 for user_id in userid_iter:
286 for user_id in userid_iter:
279 if User.get_by_username(user_id):
287 if User.get_by_username(user_id):
@@ -22,12 +22,13 b' import pytest'
22
22
23 from rhodecode import events
23 from rhodecode import events
24 from rhodecode.lib.utils2 import AttributeDict
24 from rhodecode.lib.utils2 import AttributeDict
25 from rhodecode.integrations.types.webhook import WebhookHandler
25 from rhodecode.integrations.types.webhook import WebhookDataHandler
26
26
27
27
28 @pytest.fixture
28 @pytest.fixture
29 def base_data():
29 def base_data():
30 return {
30 return {
31 'name': 'event',
31 'repo': {
32 'repo': {
32 'repo_name': 'foo',
33 'repo_name': 'foo',
33 'repo_type': 'hg',
34 'repo_type': 'hg',
@@ -44,31 +45,40 b' def base_data():'
44
45
45 def test_webhook_parse_url_invalid_event():
46 def test_webhook_parse_url_invalid_event():
46 template_url = 'http://server.com/${repo_name}/build'
47 template_url = 'http://server.com/${repo_name}/build'
47 handler = WebhookHandler(
48 handler = WebhookDataHandler(
48 template_url, 'secret_token', {'exmaple-header':'header-values'})
49 template_url, {'exmaple-header': 'header-values'})
50 event = events.RepoDeleteEvent('')
49 with pytest.raises(ValueError) as err:
51 with pytest.raises(ValueError) as err:
50 handler(events.RepoDeleteEvent(''), {})
52 handler(event, {})
51 assert str(err.value).startswith('event type not supported')
53
54 err = str(err.value)
55 assert err.startswith(
56 'event type `%s` not in supported list' % event.__class__)
52
57
53
58
54 @pytest.mark.parametrize('template,expected_urls', [
59 @pytest.mark.parametrize('template,expected_urls', [
55 ('http://server.com/${repo_name}/build', ['http://server.com/foo/build']),
60 ('http://server.com/${repo_name}/build',
56 ('http://server.com/${repo_name}/${repo_type}', ['http://server.com/foo/hg']),
61 ['http://server.com/foo/build']),
57 ('http://${server}.com/${repo_name}/${repo_id}', ['http://${server}.com/foo/12']),
62 ('http://server.com/${repo_name}/${repo_type}',
58 ('http://server.com/${branch}/build', ['http://server.com/${branch}/build']),
63 ['http://server.com/foo/hg']),
64 ('http://${server}.com/${repo_name}/${repo_id}',
65 ['http://${server}.com/foo/12']),
66 ('http://server.com/${branch}/build',
67 ['http://server.com/${branch}/build']),
59 ])
68 ])
60 def test_webook_parse_url_for_create_event(base_data, template, expected_urls):
69 def test_webook_parse_url_for_create_event(base_data, template, expected_urls):
61 headers = {'exmaple-header': 'header-values'}
70 headers = {'exmaple-header': 'header-values'}
62 handler = WebhookHandler(
71 handler = WebhookDataHandler(template, headers)
63 template, 'secret_token', headers)
64 urls = handler(events.RepoCreateEvent(''), base_data)
72 urls = handler(events.RepoCreateEvent(''), base_data)
65 assert urls == [
73 assert urls == [
66 (url, 'secret_token', headers, base_data) for url in expected_urls]
74 (url, headers, base_data) for url in expected_urls]
67
75
68
76
69 @pytest.mark.parametrize('template,expected_urls', [
77 @pytest.mark.parametrize('template,expected_urls', [
70 ('http://server.com/${repo_name}/${pull_request_id}', ['http://server.com/foo/999']),
78 ('http://server.com/${repo_name}/${pull_request_id}',
71 ('http://server.com/${repo_name}/${pull_request_url}', ['http://server.com/foo/http://pr-url.com']),
79 ['http://server.com/foo/999']),
80 ('http://server.com/${repo_name}/${pull_request_url}',
81 ['http://server.com/foo/http://pr-url.com']),
72 ])
82 ])
73 def test_webook_parse_url_for_pull_request_event(
83 def test_webook_parse_url_for_pull_request_event(
74 base_data, template, expected_urls):
84 base_data, template, expected_urls):
@@ -76,23 +86,27 b' def test_webook_parse_url_for_pull_reque'
76 base_data['pullrequest'] = {
86 base_data['pullrequest'] = {
77 'pull_request_id': 999,
87 'pull_request_id': 999,
78 'url': 'http://pr-url.com',
88 'url': 'http://pr-url.com',
89 'title': 'example-pr-title',
90 'commits_uid': 'abcdefg1234',
91 'shadow_url': 'http://pr-url.com/repository'
79 }
92 }
80 headers = {'exmaple-header': 'header-values'}
93 headers = {'exmaple-header': 'header-values'}
81 handler = WebhookHandler(
94 handler = WebhookDataHandler(template, headers)
82 template, 'secret_token', headers)
83 urls = handler(events.PullRequestCreateEvent(
95 urls = handler(events.PullRequestCreateEvent(
84 AttributeDict({'target_repo': 'foo'})), base_data)
96 AttributeDict({'target_repo': 'foo'})), base_data)
85 assert urls == [
97 assert urls == [
86 (url, 'secret_token', headers, base_data) for url in expected_urls]
98 (url, headers, base_data) for url in expected_urls]
87
99
88
100
89 @pytest.mark.parametrize('template,expected_urls', [
101 @pytest.mark.parametrize('template,expected_urls', [
90 ('http://server.com/${branch}/build', ['http://server.com/stable/build',
102 ('http://server.com/${branch}/build',
91 'http://server.com/dev/build']),
103 ['http://server.com/stable/build',
92 ('http://server.com/${branch}/${commit_id}', ['http://server.com/stable/stable-xxx',
104 'http://server.com/dev/build']),
93 'http://server.com/stable/stable-yyy',
105 ('http://server.com/${branch}/${commit_id}',
94 'http://server.com/dev/dev-xxx',
106 ['http://server.com/stable/stable-xxx',
95 'http://server.com/dev/dev-yyy']),
107 'http://server.com/stable/stable-yyy',
108 'http://server.com/dev/dev-xxx',
109 'http://server.com/dev/dev-yyy']),
96 ])
110 ])
97 def test_webook_parse_url_for_push_event(
111 def test_webook_parse_url_for_push_event(
98 baseapp, repo_push_event, base_data, template, expected_urls):
112 baseapp, repo_push_event, base_data, template, expected_urls):
@@ -104,8 +118,7 b' def test_webook_parse_url_for_push_event'
104 {'branch': 'dev', 'raw_id': 'dev-yyy'}]
118 {'branch': 'dev', 'raw_id': 'dev-yyy'}]
105 }
119 }
106 headers = {'exmaple-header': 'header-values'}
120 headers = {'exmaple-header': 'header-values'}
107 handler = WebhookHandler(
121 handler = WebhookDataHandler(template, headers)
108 template, 'secret_token', headers)
109 urls = handler(repo_push_event, base_data)
122 urls = handler(repo_push_event, base_data)
110 assert urls == [
123 assert urls == [
111 (url, 'secret_token', headers, base_data) for url in expected_urls]
124 (url, headers, base_data) for url in expected_urls]
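
The rewritten tests above pin down the observable contract of the renamed ``WebhookDataHandler``: it is built from a URL template plus headers only, returns ``(url, headers, data)`` tuples, substitutes the ``${...}`` variables it knows about, leaves unknown ones (such as ``${server}``) untouched, and fans out per branch/commit for push events. A rough behavioural model of that expansion using ``string.Template`` -- not the real handler code, and the helper name and signature are illustrative:

.. code-block:: python

    # Rough model of the URL expansion the tests above assert on.
    from string import Template


    def expand_push_urls(template_url, repo, commit_ids):
        base = {
            'repo_name': repo['repo_name'],
            'repo_type': repo['repo_type'],
            'repo_id': repo['repo_id'],
        }
        urls = []
        for commit in commit_ids:
            vars_ = dict(base, branch=commit['branch'], commit_id=commit['raw_id'])
            # safe_substitute leaves unknown ${...} variables untouched,
            # matching e.g. 'http://${server}.com/...' in the tests
            url = Template(template_url).safe_substitute(**vars_)
            if url not in urls:
                urls.append(url)
        return urls


    repo = {'repo_name': 'foo', 'repo_type': 'hg', 'repo_id': 12}
    commits = [{'branch': 'stable', 'raw_id': 'stable-xxx'},
               {'branch': 'dev', 'raw_id': 'dev-yyy'}]
    print(expand_push_urls('http://server.com/${branch}/build', repo, commits))
    # -> ['http://server.com/stable/build', 'http://server.com/dev/build']
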
@@ -377,34 +377,6 b' class TestGenerateVcsResponse(object):'
377 list(result)
377 list(result)
378 assert self.was_cache_invalidated()
378 assert self.was_cache_invalidated()
379
379
380 @mock.patch('rhodecode.lib.middleware.simplevcs.HTTPLockedRC')
381 def test_handles_locking_exception(self, http_locked_rc):
382 result = self.call_controller_with_response_body(
383 self.raise_result_iter(vcs_kind='repo_locked'))
384 assert not http_locked_rc.called
385 # Consume the result
386 list(result)
387 assert http_locked_rc.called
388
389 @mock.patch('rhodecode.lib.middleware.simplevcs.HTTPRequirementError')
390 def test_handles_requirement_exception(self, http_requirement):
391 result = self.call_controller_with_response_body(
392 self.raise_result_iter(vcs_kind='requirement'))
393 assert not http_requirement.called
394 # Consume the result
395 list(result)
396 assert http_requirement.called
397
398 @mock.patch('rhodecode.lib.middleware.simplevcs.HTTPLockedRC')
399 def test_handles_locking_exception_in_app_call(self, http_locked_rc):
400 app_factory_patcher = mock.patch.object(
401 StubVCSController, '_create_wsgi_app')
402 with app_factory_patcher as app_factory:
403 app_factory().side_effect = self.vcs_exception()
404 result = self.call_controller_with_response_body(['a'])
405 list(result)
406 assert http_locked_rc.called
407
408 def test_raises_unknown_exceptions(self):
380 def test_raises_unknown_exceptions(self):
409 result = self.call_controller_with_response_body(
381 result = self.call_controller_with_response_body(
410 self.raise_result_iter(vcs_kind='unknown'))
382 self.raise_result_iter(vcs_kind='unknown'))
@@ -412,7 +384,7 b' class TestGenerateVcsResponse(object):'
412 list(result)
384 list(result)
413
385
414 def test_prepare_callback_daemon_is_called(self):
386 def test_prepare_callback_daemon_is_called(self):
415 def side_effect(extras):
387 def side_effect(extras, environ, action, txn_id=None):
416 return DummyHooksCallbackDaemon(), extras
388 return DummyHooksCallbackDaemon(), extras
417
389
418 prepare_patcher = mock.patch.object(
390 prepare_patcher = mock.patch.object(
@@ -489,10 +461,11 b' class TestPrepareHooksDaemon(object):'
489 return_value=(daemon, expected_extras))
461 return_value=(daemon, expected_extras))
490 with prepare_patcher as prepare_mock:
462 with prepare_patcher as prepare_mock:
491 callback_daemon, extras = controller._prepare_callback_daemon(
463 callback_daemon, extras = controller._prepare_callback_daemon(
492 expected_extras.copy())
464 expected_extras.copy(), {}, 'push')
493 prepare_mock.assert_called_once_with(
465 prepare_mock.assert_called_once_with(
494 expected_extras,
466 expected_extras,
495 protocol=app_settings['vcs.hooks.protocol'],
467 protocol=app_settings['vcs.hooks.protocol'],
468 txn_id=None,
496 use_direct_calls=app_settings['vcs.hooks.direct_calls'])
469 use_direct_calls=app_settings['vcs.hooks.direct_calls'])
497
470
498 assert callback_daemon == daemon
471 assert callback_daemon == daemon
@@ -179,10 +179,12 b' class TestHttpHooksCallbackDaemon(object'
179 daemon = hooks_daemon.HttpHooksCallbackDaemon()
179 daemon = hooks_daemon.HttpHooksCallbackDaemon()
180 assert daemon._daemon == tcp_server
180 assert daemon._daemon == tcp_server
181
181
182 _, port = tcp_server.server_address
183 expected_uri = '{}:{}'.format(daemon.IP_ADDRESS, port)
184 msg = 'Preparing HTTP callback daemon at `{}` and ' \
185 'registering hook object'.format(expected_uri)
182 assert_message_in_log(
186 assert_message_in_log(
183 caplog.records,
187 caplog.records, msg, levelno=logging.DEBUG, module='hooks_daemon')
184 'Preparing HTTP callback daemon and registering hook object',
185 levelno=logging.DEBUG, module='hooks_daemon')
186
188
187 def test_prepare_inits_hooks_uri_and_logs_it(
189 def test_prepare_inits_hooks_uri_and_logs_it(
188 self, tcp_server, caplog):
190 self, tcp_server, caplog):
@@ -193,8 +195,10 b' class TestHttpHooksCallbackDaemon(object'
193 expected_uri = '{}:{}'.format(daemon.IP_ADDRESS, port)
195 expected_uri = '{}:{}'.format(daemon.IP_ADDRESS, port)
194 assert daemon.hooks_uri == expected_uri
196 assert daemon.hooks_uri == expected_uri
195
197
198 msg = 'Preparing HTTP callback daemon at `{}` and ' \
199 'registering hook object'.format(expected_uri)
196 assert_message_in_log(
200 assert_message_in_log(
197 caplog.records, 'Hooks uri is: {}'.format(expected_uri),
201 caplog.records, msg,
198 levelno=logging.DEBUG, module='hooks_daemon')
202 levelno=logging.DEBUG, module='hooks_daemon')
199
203
200 def test_run_creates_a_thread(self, tcp_server):
204 def test_run_creates_a_thread(self, tcp_server):
@@ -263,7 +267,8 b' class TestPrepareHooksDaemon(object):'
263 expected_extras.copy(), protocol=protocol, use_direct_calls=True)
267 expected_extras.copy(), protocol=protocol, use_direct_calls=True)
264 assert isinstance(callback, hooks_daemon.DummyHooksCallbackDaemon)
268 assert isinstance(callback, hooks_daemon.DummyHooksCallbackDaemon)
265 expected_extras['hooks_module'] = 'rhodecode.lib.hooks_daemon'
269 expected_extras['hooks_module'] = 'rhodecode.lib.hooks_daemon'
266 assert extras == expected_extras
270 expected_extras['time'] = extras['time']
271 assert 'extra1' in extras
267
272
268 @pytest.mark.parametrize('protocol, expected_class', (
273 @pytest.mark.parametrize('protocol, expected_class', (
269 ('http', hooks_daemon.HttpHooksCallbackDaemon),
274 ('http', hooks_daemon.HttpHooksCallbackDaemon),
@@ -272,12 +277,15 b' class TestPrepareHooksDaemon(object):'
272 self, protocol, expected_class):
277 self, protocol, expected_class):
273 expected_extras = {
278 expected_extras = {
274 'extra1': 'value1',
279 'extra1': 'value1',
280 'txn_id': 'txnid2',
275 'hooks_protocol': protocol.lower()
281 'hooks_protocol': protocol.lower()
276 }
282 }
277 callback, extras = hooks_daemon.prepare_callback_daemon(
283 callback, extras = hooks_daemon.prepare_callback_daemon(
278 expected_extras.copy(), protocol=protocol, use_direct_calls=False)
284 expected_extras.copy(), protocol=protocol, use_direct_calls=False,
285 txn_id='txnid2')
279 assert isinstance(callback, expected_class)
286 assert isinstance(callback, expected_class)
280 hooks_uri = extras.pop('hooks_uri')
287 extras.pop('hooks_uri')
288 expected_extras['time'] = extras['time']
281 assert extras == expected_extras
289 assert extras == expected_extras
282
290
283 @pytest.mark.parametrize('protocol', (
291 @pytest.mark.parametrize('protocol', (
@@ -245,22 +245,16 b' def test_repo2db_mapper_enables_largefil'
245 repo = backend.create_repo()
245 repo = backend.create_repo()
246 repo_list = {repo.repo_name: 'test'}
246 repo_list = {repo.repo_name: 'test'}
247 with mock.patch('rhodecode.model.db.Repository.scm_instance') as scm_mock:
247 with mock.patch('rhodecode.model.db.Repository.scm_instance') as scm_mock:
248 with mock.patch.multiple('rhodecode.model.scm.ScmModel',
248 utils.repo2db_mapper(repo_list, remove_obsolete=False)
249 install_git_hook=mock.DEFAULT,
249 _, kwargs = scm_mock.call_args
250 install_svn_hooks=mock.DEFAULT):
250 assert kwargs['config'].get('extensions', 'largefiles') == ''
251 utils.repo2db_mapper(repo_list, remove_obsolete=False)
252 _, kwargs = scm_mock.call_args
253 assert kwargs['config'].get('extensions', 'largefiles') == ''
254
251
255
252
256 @pytest.mark.backends("git", "svn")
253 @pytest.mark.backends("git", "svn")
257 def test_repo2db_mapper_installs_hooks_for_repos_in_db(backend):
254 def test_repo2db_mapper_installs_hooks_for_repos_in_db(backend):
258 repo = backend.create_repo()
255 repo = backend.create_repo()
259 repo_list = {repo.repo_name: 'test'}
256 repo_list = {repo.repo_name: 'test'}
260 with mock.patch.object(ScmModel, 'install_hooks') as install_hooks_mock:
257 utils.repo2db_mapper(repo_list, remove_obsolete=False)
261 utils.repo2db_mapper(repo_list, remove_obsolete=False)
262 install_hooks_mock.assert_called_once_with(
263 repo.scm_instance(), repo_type=backend.alias)
264
258
265
259
266 @pytest.mark.backends("git", "svn")
260 @pytest.mark.backends("git", "svn")
@@ -269,11 +263,7 b' def test_repo2db_mapper_installs_hooks_f'
269 RepoModel().delete(repo, fs_remove=False)
263 RepoModel().delete(repo, fs_remove=False)
270 meta.Session().commit()
264 meta.Session().commit()
271 repo_list = {repo.repo_name: repo.scm_instance()}
265 repo_list = {repo.repo_name: repo.scm_instance()}
272 with mock.patch.object(ScmModel, 'install_hooks') as install_hooks_mock:
266 utils.repo2db_mapper(repo_list, remove_obsolete=False)
273 utils.repo2db_mapper(repo_list, remove_obsolete=False)
274 assert install_hooks_mock.call_count == 1
275 install_hooks_args, _ = install_hooks_mock.call_args
276 assert install_hooks_args[0].name == repo.repo_name
277
267
278
268
279 class TestPasswordChanged(object):
269 class TestPasswordChanged(object):
@@ -44,6 +44,7 b' GENERAL_FORM_DATA = {'
44 'rhodecode_hg_close_branch_before_merging': True,
44 'rhodecode_hg_close_branch_before_merging': True,
45 'rhodecode_git_use_rebase_for_merging': True,
45 'rhodecode_git_use_rebase_for_merging': True,
46 'rhodecode_git_close_branch_before_merging': True,
46 'rhodecode_git_close_branch_before_merging': True,
47 'rhodecode_diff_cache': True,
47 }
48 }
48
49
49
50
@@ -18,10 +18,11 b''
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import os
22 import mock
23 import pytest
21 import tempfile
24 import tempfile
22
25
23 import mock
24 import pytest
25
26
26 from rhodecode.lib.exceptions import AttachedForksError
27 from rhodecode.lib.exceptions import AttachedForksError
27 from rhodecode.lib.utils import make_db_config
28 from rhodecode.lib.utils import make_db_config
@@ -119,22 +120,22 b' class TestRepoModel(object):'
119
120
120 @pytest.mark.backends("git", "svn")
121 @pytest.mark.backends("git", "svn")
121 def test_create_filesystem_repo_installs_hooks(self, tmpdir, backend):
122 def test_create_filesystem_repo_installs_hooks(self, tmpdir, backend):
122 hook_methods = {
123 'git': 'install_git_hook',
124 'svn': 'install_svn_hooks'
125 }
126 repo = backend.create_repo()
123 repo = backend.create_repo()
127 repo_name = repo.repo_name
124 repo_name = repo.repo_name
128 model = RepoModel()
125 model = RepoModel()
129 repo_location = tempfile.mkdtemp()
126 repo_location = tempfile.mkdtemp()
130 model.repos_path = repo_location
127 model.repos_path = repo_location
131 method = hook_methods[backend.alias]
128 repo = model._create_filesystem_repo(
132 with mock.patch.object(ScmModel, method) as hooks_mock:
129 repo_name, backend.alias, repo_group='', clone_uri=None)
133 model._create_filesystem_repo(
130
134 repo_name, backend.alias, repo_group='', clone_uri=None)
131 hooks = {
135 assert hooks_mock.call_count == 1
132 'svn': ('pre-commit', 'post-commit'),
136 hook_args, hook_kwargs = hooks_mock.call_args
133 'git': ('pre-receive', 'post-receive'),
137 assert hook_args[0].name == repo_name
134 }
135 for hook in hooks[backend.alias]:
136 with open(os.path.join(repo.path, 'hooks', hook)) as f:
137 data = f.read()
138 assert 'RC_HOOK_VER' in data
138
139
139 @pytest.mark.parametrize("use_global_config, repo_name_passed", [
140 @pytest.mark.parametrize("use_global_config, repo_name_passed", [
140 (True, False),
141 (True, False),
@@ -194,143 +194,3 b' def test_get_non_unicode_reference(backe'
194 u'tag:Ad\xc4\xb1n\xc4\xb1']
194 u'tag:Ad\xc4\xb1n\xc4\xb1']
195
195
196 assert choices == valid_choices
196 assert choices == valid_choices
197
198
199 class TestInstallSvnHooks(object):
200 HOOK_FILES = ('pre-commit', 'post-commit')
201
202 def test_new_hooks_are_created(self, backend_svn):
203 model = scm.ScmModel()
204 repo = backend_svn.create_repo()
205 vcs_repo = repo.scm_instance()
206 model.install_svn_hooks(vcs_repo)
207
208 hooks_path = os.path.join(vcs_repo.path, 'hooks')
209 assert os.path.isdir(hooks_path)
210 for file_name in self.HOOK_FILES:
211 file_path = os.path.join(hooks_path, file_name)
212 self._check_hook_file_mode(file_path)
213 self._check_hook_file_content(file_path)
214
215 def test_rc_hooks_are_replaced(self, backend_svn):
216 model = scm.ScmModel()
217 repo = backend_svn.create_repo()
218 vcs_repo = repo.scm_instance()
219 hooks_path = os.path.join(vcs_repo.path, 'hooks')
220 file_paths = [os.path.join(hooks_path, f) for f in self.HOOK_FILES]
221
222 for file_path in file_paths:
223 self._create_fake_hook(
224 file_path, content="RC_HOOK_VER = 'abcde'\n")
225
226 model.install_svn_hooks(vcs_repo)
227
228 for file_path in file_paths:
229 self._check_hook_file_content(file_path)
230
231 def test_non_rc_hooks_are_not_replaced_without_force_create(
232 self, backend_svn):
233 model = scm.ScmModel()
234 repo = backend_svn.create_repo()
235 vcs_repo = repo.scm_instance()
236 hooks_path = os.path.join(vcs_repo.path, 'hooks')
237 file_paths = [os.path.join(hooks_path, f) for f in self.HOOK_FILES]
238 non_rc_content = "exit 0\n"
239
240 for file_path in file_paths:
241 self._create_fake_hook(file_path, content=non_rc_content)
242
243 model.install_svn_hooks(vcs_repo)
244
245 for file_path in file_paths:
246 with open(file_path, 'rt') as hook_file:
247 content = hook_file.read()
248 assert content == non_rc_content
249
250 def test_non_rc_hooks_are_replaced_with_force_create(self, backend_svn):
251 model = scm.ScmModel()
252 repo = backend_svn.create_repo()
253 vcs_repo = repo.scm_instance()
254 hooks_path = os.path.join(vcs_repo.path, 'hooks')
255 file_paths = [os.path.join(hooks_path, f) for f in self.HOOK_FILES]
256 non_rc_content = "exit 0\n"
257
258 for file_path in file_paths:
259 self._create_fake_hook(file_path, content=non_rc_content)
260
261 model.install_svn_hooks(vcs_repo, force_create=True)
262
263 for file_path in file_paths:
264 self._check_hook_file_content(file_path)
265
266 def _check_hook_file_mode(self, file_path):
267 assert os.path.exists(file_path)
268 stat_info = os.stat(file_path)
269
270 file_mode = stat.S_IMODE(stat_info.st_mode)
271 expected_mode = int('755', 8)
272 assert expected_mode == file_mode
273
274 def _check_hook_file_content(self, file_path):
275 with open(file_path, 'rt') as hook_file:
276 content = hook_file.read()
277
278 expected_env = '#!{}'.format(sys.executable)
279 expected_rc_version = "\nRC_HOOK_VER = '{}'\n".format(
280 rhodecode.__version__)
281 assert content.strip().startswith(expected_env)
282 assert expected_rc_version in content
283
284 def _create_fake_hook(self, file_path, content):
285 with open(file_path, 'w') as hook_file:
286 hook_file.write(content)
287
288
289 class TestCheckRhodecodeHook(object):
290
291 @patch('os.path.exists', Mock(return_value=False))
292 def test_returns_true_when_no_hook_found(self):
293 result = scm._check_rhodecode_hook('/tmp/fake_hook_file.py')
294 assert result
295
296 @pytest.mark.parametrize("file_content, expected_result", [
297 ("RC_HOOK_VER = '3.3.3'\n", True),
298 ("RC_HOOK = '3.3.3'\n", False),
299 ], ids=no_newline_id_generator)
300 @patch('os.path.exists', Mock(return_value=True))
301 def test_signatures(self, file_content, expected_result):
302 hook_content_patcher = patch.object(
303 scm, '_read_hook', return_value=file_content)
304 with hook_content_patcher:
305 result = scm._check_rhodecode_hook('/tmp/fake_hook_file.py')
306
307 assert result is expected_result
308
309
310 class TestInstallHooks(object):
311 def test_hooks_are_installed_for_git_repo(self, backend_git):
312 repo = backend_git.create_repo()
313 model = scm.ScmModel()
314 scm_repo = repo.scm_instance()
315 with patch.object(model, 'install_git_hook') as hooks_mock:
316 model.install_hooks(scm_repo, repo_type='git')
317 hooks_mock.assert_called_once_with(scm_repo)
318
319 def test_hooks_are_installed_for_svn_repo(self, backend_svn):
320 repo = backend_svn.create_repo()
321 scm_repo = repo.scm_instance()
322 model = scm.ScmModel()
323 with patch.object(scm.ScmModel, 'install_svn_hooks') as hooks_mock:
324 model.install_hooks(scm_repo, repo_type='svn')
325 hooks_mock.assert_called_once_with(scm_repo)
326
327 @pytest.mark.parametrize('hook_method', [
328 'install_svn_hooks',
329 'install_git_hook'])
330 def test_mercurial_doesnt_trigger_hooks(self, backend_hg, hook_method):
331 repo = backend_hg.create_repo()
332 scm_repo = repo.scm_instance()
333 model = scm.ScmModel()
334 with patch.object(scm.ScmModel, hook_method) as hooks_mock:
335 model.install_hooks(scm_repo, repo_type='hg')
336 assert hooks_mock.call_count == 0
@@ -24,8 +24,7 b' import pytest'
24
24
25 from rhodecode.tests import (
25 from rhodecode.tests import (
26 HG_REPO, TEST_USER_REGULAR2_EMAIL, TEST_USER_REGULAR2_LOGIN,
26 HG_REPO, TEST_USER_REGULAR2_EMAIL, TEST_USER_REGULAR2_LOGIN,
27 TEST_USER_REGULAR2_PASS, TEST_USER_ADMIN_LOGIN, TESTS_TMP_PATH,
27 TEST_USER_REGULAR2_PASS, TEST_USER_ADMIN_LOGIN, TESTS_TMP_PATH)
28 ldap_lib_installed)
29
28
30 from rhodecode.model import validators as v
29 from rhodecode.model import validators as v
31 from rhodecode.model.user_group import UserGroupModel
30 from rhodecode.model.user_group import UserGroupModel
@@ -1178,6 +1178,10 b' class UserUtility(object):'
1178 self.user_ids.append(user.user_id)
1178 self.user_ids.append(user.user_id)
1179 return user
1179 return user
1180
1180
1181 def create_additional_user_email(self, user, email):
1182 uem = self.fixture.create_additional_user_email(user=user, email=email)
1183 return uem
1184
1181 def create_user_with_group(self):
1185 def create_user_with_group(self):
1182 user = self.create_user()
1186 user = self.create_user()
1183 user_group = self.create_user_group(members=[user])
1187 user_group = self.create_user_group(members=[user])
@@ -1706,7 +1710,10 b' def StubIntegrationType():'
1706 key = 'test'
1710 key = 'test'
1707 display_name = 'Test integration type'
1711 display_name = 'Test integration type'
1708 description = 'A test integration type for testing'
1712 description = 'A test integration type for testing'
1709 icon = 'test_icon_html_image'
1713
1714 @classmethod
1715 def icon(cls):
1716 return 'test_icon_html_image'
1710
1717
1711 def __init__(self, settings):
1718 def __init__(self, settings):
1712 super(_StubIntegrationType, self).__init__(settings)
1719 super(_StubIntegrationType, self).__init__(settings)
@@ -154,9 +154,6 b' force_https = false'
154 ## use Strict-Transport-Security headers
154 ## use Strict-Transport-Security headers
155 use_htsts = false
155 use_htsts = false
156
156
157 ## number of commits stats will parse on each iteration
158 commit_parse_limit = 25
159
160 ## git rev filter option, --all is the default filter, if you need to
157 ## git rev filter option, --all is the default filter, if you need to
161 ## hide all refs in changelog switch this to --branches --tags
158 ## hide all refs in changelog switch this to --branches --tags
162 git_rev_filter = --all
159 git_rev_filter = --all
@@ -21,41 +21,31 b''
21 """
21 """
22 Tests for main module's methods.
22 Tests for main module's methods.
23 """
23 """
24
24 import os
25 import tempfile
26 import shutil
25 import mock
27 import mock
26 import os
27 import shutil
28 import tempfile
29
30 import pytest
28 import pytest
31
29
32 from rhodecode.lib.vcs import VCSError, get_backend, get_vcs_instance
30 from rhodecode.lib.vcs import VCSError, get_backend, get_vcs_instance
33 from rhodecode.lib.vcs.backends.hg import MercurialRepository
34 from rhodecode.tests import TEST_HG_REPO, TEST_GIT_REPO
35
31
36
32
37 pytestmark = pytest.mark.usefixtures("baseapp")
33 pytestmark = pytest.mark.usefixtures("baseapp")
38
34
39
35
40 def test_get_backend():
36 def test_get_backend(backend):
41 hg = get_backend('hg')
37 repo_class = get_backend(backend.alias)
42 assert hg == MercurialRepository
38 assert repo_class == backend.repo.scm_instance().__class__
43
39
44
40
45 def test_alias_detect_hg():
41 def test_alias_detect(backend):
46 alias = 'hg'
42 alias = backend.alias
47 path = TEST_HG_REPO
43 path = backend.repo.scm_instance().path
48 backend = get_backend(alias)
49 repo = backend(path)
50 assert 'hg' == repo.alias
51
44
45 new_backend = get_backend(alias)
46 repo = new_backend(path)
52
47
53 def test_alias_detect_git():
48 assert alias == repo.alias
54 alias = 'git'
55 path = TEST_GIT_REPO
56 backend = get_backend(alias)
57 repo = backend(path)
58 assert 'git' == repo.alias
59
49
60
50
61 def test_wrong_alias():
51 def test_wrong_alias():
@@ -73,56 +63,13 b' def test_get_vcs_instance_by_path(vcs_re'
73 assert repo.name == vcs_repo.name
63 assert repo.name == vcs_repo.name
74
64
75
65
76 @mock.patch('rhodecode.lib.vcs.backends.get_scm')
66 def test_get_vcs_instance_by_path_empty_dir(request, tmpdir):
77 @mock.patch('rhodecode.lib.vcs.backends.get_backend')
78 def test_get_vcs_instance_by_path_args_passed(
79 get_backend_mock, get_scm_mock):
80 """
81 Test that the arguments passed to ``get_vcs_instance_by_path`` are
82 forewarded to the vcs backend class.
83 """
84 backend = mock.MagicMock()
85 get_backend_mock.return_value = backend
86 args = ['these-are-test-args', 0, True, None]
87 get_vcs_instance(TEST_HG_REPO, *args)
88
89 backend.assert_called_with(*args, repo_path=TEST_HG_REPO)
90
91
92 @mock.patch('rhodecode.lib.vcs.backends.get_scm')
93 @mock.patch('rhodecode.lib.vcs.backends.get_backend')
94 def test_get_vcs_instance_by_path_kwargs_passed(
95 get_backend_mock, get_scm_mock):
96 """
97 Test that the keyword arguments passed to ``get_vcs_instance_by_path`` are
98 forewarded to the vcs backend class.
99 """
100 backend = mock.MagicMock()
101 get_backend_mock.return_value = backend
102 kwargs = {
103 'foo': 'these-are-test-args',
104 'bar': 0,
105 'baz': True,
106 'foobar': None
107 }
108 get_vcs_instance(TEST_HG_REPO, **kwargs)
109
110 backend.assert_called_with(repo_path=TEST_HG_REPO, **kwargs)
111
112
113 def test_get_vcs_instance_by_path_err(request):
114 """
67 """
115 Test that ``get_vcs_instance_by_path`` returns None if a path is passed
68 Test that ``get_vcs_instance_by_path`` returns None if a path is passed
116 to an empty directory.
69 to an empty directory.
117 """
70 """
118 empty_dir = tempfile.mkdtemp(prefix='pytest-empty-dir-')
71 empty_dir = str(tmpdir)
119
120 def fin():
121 shutil.rmtree(empty_dir)
122 request.addfinalizer(fin)
123
124 repo = get_vcs_instance(empty_dir)
72 repo = get_vcs_instance(empty_dir)
125
126 assert repo is None
73 assert repo is None
127
74
128
75
@@ -142,3 +89,42 b' def test_get_vcs_instance_by_path_multip'
142 repo = get_vcs_instance(empty_dir)
89 repo = get_vcs_instance(empty_dir)
143
90
144 assert repo is None
91 assert repo is None
92
93
94 @mock.patch('rhodecode.lib.vcs.backends.get_scm')
95 @mock.patch('rhodecode.lib.vcs.backends.get_backend')
96 def test_get_vcs_instance_by_path_args_passed(
97 get_backend_mock, get_scm_mock, tmpdir, vcs_repo):
98 """
99 Test that the arguments passed to ``get_vcs_instance_by_path`` are
100 forwarded to the vcs backend class.
101 """
102 backend = mock.MagicMock()
103 get_backend_mock.return_value = backend
104 args = ['these-are-test-args', 0, True, None]
105 repo = vcs_repo.path
106 get_vcs_instance(repo, *args)
107
108 backend.assert_called_with(*args, repo_path=repo)
109
110
111 @mock.patch('rhodecode.lib.vcs.backends.get_scm')
112 @mock.patch('rhodecode.lib.vcs.backends.get_backend')
113 def test_get_vcs_instance_by_path_kwargs_passed(
114 get_backend_mock, get_scm_mock, vcs_repo):
115 """
116 Test that the keyword arguments passed to ``get_vcs_instance_by_path`` are
117 forwarded to the vcs backend class.
118 """
119 backend = mock.MagicMock()
120 get_backend_mock.return_value = backend
121 kwargs = {
122 'foo': 'these-are-test-args',
123 'bar': 0,
124 'baz': True,
125 'foobar': None
126 }
127 repo = vcs_repo.path
128 get_vcs_instance(repo, **kwargs)
129
130 backend.assert_called_with(repo_path=repo, **kwargs)
@@ -106,9 +106,10 b' def rc_web_server_config_factory(testini'
106 Configuration file used for the fixture `rc_web_server`.
106 Configuration file used for the fixture `rc_web_server`.
107 """
107 """
108
108
109 def factory(vcsserver_port):
109 def factory(rcweb_port, vcsserver_port):
110 custom_params = [
110 custom_params = [
111 {'handler_console': {'level': 'DEBUG'}},
111 {'handler_console': {'level': 'DEBUG'}},
112 {'server:main': {'port': rcweb_port}},
112 {'app:main': {'vcs.server': 'localhost:%s' % vcsserver_port}}
113 {'app:main': {'vcs.server': 'localhost:%s' % vcsserver_port}}
113 ]
114 ]
114 custom_params.extend(rc_web_server_config_modification)
115 custom_params.extend(rc_web_server_config_modification)
@@ -123,6 +124,8 b' def rc_web_server('
123 """
124 """
124 Run the web server as a subprocess. with it's own instance of vcsserver
125 Run the web server as a subprocess. with it's own instance of vcsserver
125 """
126 """
127 rcweb_port = available_port_factory()
128 print('Using rcweb ops test port {}'.format(rcweb_port))
126
129
127 vcsserver_port = available_port_factory()
130 vcsserver_port = available_port_factory()
128 print('Using vcsserver ops test port {}'.format(vcsserver_port))
131 print('Using vcsserver ops test port {}'.format(vcsserver_port))
@@ -138,6 +141,7 b' def rc_web_server('
138
141
139 rc_log = os.path.join(tempfile.gettempdir(), 'rc_op_web.log')
142 rc_log = os.path.join(tempfile.gettempdir(), 'rc_op_web.log')
140 rc_web_server_config = rc_web_server_config_factory(
143 rc_web_server_config = rc_web_server_config_factory(
144 rcweb_port=rcweb_port,
141 vcsserver_port=vcsserver_port)
145 vcsserver_port=vcsserver_port)
142 server = RcWebServer(rc_web_server_config, log_file=rc_log)
146 server = RcWebServer(rc_web_server_config, log_file=rc_log)
143 server.start()
147 server.start()
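
The fixture change above gives the test web server its own port (previously only vcsserver received one) via ``available_port_factory``. For illustration only, a generic way to let the OS hand out a free TCP port; the project's factory may work differently:

.. code-block:: python

    # Generic free-port helper, shown only to illustrate the idea; it is not
    # the available_port_factory used by the fixture above.
    import socket


    def get_free_port():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind(('127.0.0.1', 0))  # port 0 -> the OS picks an unused port
            return s.getsockname()[1]


    print(get_free_port())
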
5 files removed in this merge (no content shown).