The requested changes are too big and content was truncated.
@@ -0,0 +1,10 b''
.. _api:

API Reference
=============

.. toctree::
   :maxdepth: 3

   models
\ No newline at end of file
@@ -0,0 +1,19 b''
.. _models:

The :mod:`models` Module
========================

.. automodule:: rhodecode.model
   :members:

.. automodule:: rhodecode.model.permission
   :members:

.. automodule:: rhodecode.model.repo
   :members:

.. automodule:: rhodecode.model.scm
   :members:

.. automodule:: rhodecode.model.user
   :members:
@@ -0,0 +1,9 b''
.. _contributing:

Contributing to RhodeCode
=========================

If you would like to contribute to RhodeCode, please contact me; any help is
greatly appreciated.

Thank you.
@@ -0,0 +1,21 b''
.. _enable_git:

Enabling GIT support (beta)
===========================


Git support in RhodeCode 1.1 was disabled due to some instability issues, but
if you would like to test it, feel free to re-enable it. To enable GIT, just
uncomment the git line in the rhodecode/__init__.py file:

.. code-block:: python

   BACKENDS = {
       'hg': 'Mercurial repository',
       #'git': 'Git repository',
   }

.. note::
   Please note that it is not fully stable and it might crash (that is why it
   was disabled), so be careful about enabling git support. Don't use it in
   production!
\ No newline at end of file
NO CONTENT: new file 100644, binary diff hidden
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,13 b''
.. _screenshots:

.. figure:: images/screenshot1_main_page.png

   Main page of RhodeCode

.. figure:: images/screenshot2_summary_page.png

   Summary page

.. figure:: images/screenshot3_changelog_page.png

   Changelog with DAG graph
\ No newline at end of file
@@ -0,0 +1,32 b''
.. _statistics:


Statistics
==========

The RhodeCode statistics system is heavy on resources, so in order to keep a
balance between usability and performance, statistics are cached inside the
database and are gathered incrementally. This is how RhodeCode does it:

With Celery disabled
++++++++++++++++++++

- on the first visit to the summary page, a set of 250 commits is parsed and
  the statistics cache is updated
- this happens on every visit to the statistics page until all commits are
  fetched. Statistics are kept cached until more commits are added to the
  repository; in that case RhodeCode will fetch only the new ones and will
  update its cache.


With Celery enabled
+++++++++++++++++++

- on the first visit to the summary page, RhodeCode will create a task that
  executes on the celery workers and gathers stats until all commits are
  parsed; each task parses 250 commits, then runs the next task for the next
  250 commits, until all are parsed.

.. note::
   At any time you can disable statistics for each repository in the
   repository edit form in the admin panel; just uncheck the statistics
   checkbox.
\ No newline at end of file
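The batched, incremental update described above can be sketched roughly as follows. This is an illustrative model only; the function and cache-key names are hypothetical, not RhodeCode's actual internals:

```python
# Illustrative sketch of incremental statistics caching in batches of 250
# commits; names are hypothetical, not RhodeCode's real API.

BATCH_SIZE = 250

def update_statistics(cache, commits):
    """Parse at most BATCH_SIZE commits past the cached position and
    merge per-author counts into the cache. Returns True when the cache
    has caught up with the full commit list."""
    start = cache.get('parsed_revisions', 0)
    batch = commits[start:start + BATCH_SIZE]
    for commit in batch:
        author_stats = cache.setdefault('commits_by_author', {})
        author_stats[commit['author']] = author_stats.get(commit['author'], 0) + 1
    cache['parsed_revisions'] = start + len(batch)
    # done when the cached position has caught up with the repository
    return cache['parsed_revisions'] >= len(commits)
```

Each call corresponds to one page visit (or one celery task); repeated calls eventually cover the whole history, and adding new commits only requires parsing the tail past `parsed_revisions`.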
@@ -0,0 +1,14 b''
{% extends "basic/layout.html" %}

{% block sidebarlogo %}
<h3>Support my development effort.</h3>
<div style="text-align:center">
    <form action="https://www.paypal.com/cgi-bin/webscr" method="post">
        <input type="hidden" name="cmd" value="_s-xclick">
        <input type="hidden" name="hosted_button_id" value="8U2LLRPLBKWDU">
        <input style="border:0px !important" type="image" src="https://www.paypal.com/en_US/i/btn/btn_donate_SM.gif"
            border="0" name="submit" alt="PayPal - The safer, easier way to pay online!">
        <img alt="" border="0" src="https://www.paypal.com/en_US/i/scr/pixel.gif" width="1" height="1">
    </form>
</div>
{% endblock %}
@@ -0,0 +1,106 b''
# -*- coding: utf-8 -*-
"""
    package.rhodecode.controllers.admin.ldap_settings
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    ldap controller for RhodeCode

    :created_on: Nov 26, 2010
    :author: marcink
    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
    :license: GPLv3, see COPYING for more details.
"""
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; version 2
# of the License or (at your option) any later version of the license.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
# MA  02110-1301, USA.
import logging
import formencode
import traceback

from formencode import htmlfill

from pylons import request, response, session, tmpl_context as c, url
from pylons.controllers.util import abort, redirect
from pylons.i18n.translation import _

from rhodecode.lib.base import BaseController, render
from rhodecode.lib import helpers as h
from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
from rhodecode.lib.auth_ldap import LdapImportError
from rhodecode.model.settings import SettingsModel
from rhodecode.model.forms import LdapSettingsForm
from sqlalchemy.exc import DatabaseError

log = logging.getLogger(__name__)


class LdapSettingsController(BaseController):

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def __before__(self):
        c.admin_user = session.get('admin_user')
        c.admin_username = session.get('admin_username')
        super(LdapSettingsController, self).__before__()

    def index(self):
        defaults = SettingsModel().get_ldap_settings()

        return htmlfill.render(
            render('admin/ldap/ldap.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=True)

    def ldap_settings(self):
        """POST ldap create and store ldap settings"""

        settings_model = SettingsModel()
        _form = LdapSettingsForm()()

        try:
            form_result = _form.to_python(dict(request.POST))
            try:
                # persist only the ldap_* keys of the validated form
                for k, v in form_result.items():
                    if k.startswith('ldap_'):
                        setting = settings_model.get(k)
                        setting.app_settings_value = v
                        self.sa.add(setting)

                self.sa.commit()
                h.flash(_('Ldap settings updated successfully'),
                        category='success')
            except DatabaseError:
                raise
        except LdapImportError:
            h.flash(_('Unable to activate ldap. The "python-ldap" library '
                      'is missing.'), category='warning')

        except formencode.Invalid, errors:
            return htmlfill.render(
                render('admin/ldap/ldap.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8")
        except Exception:
            log.error(traceback.format_exc())
            h.flash(_('Error occurred during update of ldap settings'),
                    category='error')

        return redirect(url('ldap_home'))
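The core of `ldap_settings()` is a prefix-filtered copy of validated form fields into the settings store. Shown in isolation (a plain dict stands in for `SettingsModel` and the SQLAlchemy session; names here are illustrative):

```python
# Stand-alone sketch of the prefix-filtered update performed by
# ldap_settings(); a plain dict replaces the ORM-backed settings store.

def apply_ldap_settings(form_result, settings):
    """Copy only the 'ldap_'-prefixed keys of a validated form into the
    settings store, ignoring unrelated form fields (tokens, buttons...)."""
    for k, v in form_result.items():
        if k.startswith('ldap_'):
            settings[k] = v
    return settings
```

This keeps unrelated POST fields (such as CSRF tokens) out of the persisted application settings.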
@@ -0,0 +1,93 b''
#!/usr/bin/env python
# encoding: utf-8
# journal controller for pylons
# Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; version 2
# of the License or (at your option) any later version of the license.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
# MA  02110-1301, USA.
"""
Created on November 21, 2010
journal controller for pylons
@author: marcink
"""
import logging

from pylons import request, response, session, tmpl_context as c, url
from pylons.controllers.util import abort, redirect
from paste.httpexceptions import HTTPInternalServerError, HTTPNotFound
from sqlalchemy import or_

from rhodecode.lib.auth import LoginRequired, NotAnonymous
from rhodecode.lib.base import BaseController, render
from rhodecode.lib.helpers import get_token
from rhodecode.model.db import UserLog, UserFollowing
from rhodecode.model.scm import ScmModel

log = logging.getLogger(__name__)


class JournalController(BaseController):

    @LoginRequired()
    @NotAnonymous()
    def __before__(self):
        super(JournalController, self).__before__()

    def index(self):
        # Return a rendered template
        c.following = self.sa.query(UserFollowing)\
            .filter(UserFollowing.user_id == c.rhodecode_user.user_id).all()

        repo_ids = [x.follows_repository.repo_id for x in c.following
                    if x.follows_repository is not None]
        user_ids = [x.follows_user.user_id for x in c.following
                    if x.follows_user is not None]

        c.journal = self.sa.query(UserLog)\
            .filter(or_(
                UserLog.repository_id.in_(repo_ids),
                UserLog.user_id.in_(user_ids),
            ))\
            .order_by(UserLog.action_date.desc())\
            .limit(20)\
            .all()
        return render('/journal.html')

    def toggle_following(self):
        if request.POST.get('auth_token') == get_token():
            scm_model = ScmModel()

            user_id = request.POST.get('follows_user_id')
            if user_id:
                try:
                    scm_model.toggle_following_user(user_id,
                                                    c.rhodecode_user.user_id)
                    return 'ok'
                except Exception:
                    raise HTTPInternalServerError()

            repo_id = request.POST.get('follows_repo_id')
            if repo_id:
                try:
                    scm_model.toggle_following_repo(repo_id,
                                                    c.rhodecode_user.user_id)
                    return 'ok'
                except Exception:
                    raise HTTPInternalServerError()

        raise HTTPInternalServerError()
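The journal's `index()` splits a user's "following" rows into repository ids and user ids, skipping rows whose target is missing, before building one OR-filtered query. That splitting step in isolation (simple stand-in objects replace the ORM rows; in the real code the attributes hold related objects, not bare ids):

```python
# Isolated sketch of how index() partitions followed entries; plain
# objects holding ids stand in for the UserFollowing ORM rows.

class Following(object):
    def __init__(self, repo_id=None, user_id=None):
        self.follows_repository = repo_id
        self.follows_user = user_id

def split_following(following):
    """Return (repo_ids, user_ids), dropping rows whose target is None."""
    repo_ids = [f.follows_repository for f in following
                if f.follows_repository is not None]
    user_ids = [f.follows_user for f in following
                if f.follows_user is not None]
    return repo_ids, user_ids
```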
@@ -0,0 +1,104 b''
#!/usr/bin/env python
# encoding: utf-8
# ldap authentication lib
# Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; version 2
# of the License or (at your option) any later version of the license.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
# MA  02110-1301, USA.
"""
Created on Nov 17, 2010

@author: marcink
"""
import logging

from rhodecode.lib.exceptions import *

log = logging.getLogger(__name__)

try:
    import ldap
except ImportError:
    # python-ldap is optional; LDAP authentication is unavailable without it
    pass


class AuthLdap(object):

    def __init__(self, server, base_dn, port=389, bind_dn='', bind_pass='',
                 use_ldaps=False, ldap_version=3):
        self.ldap_version = ldap_version
        if use_ldaps:
            # 636 is the standard LDAPS port
            port = port or 636
        self.LDAP_USE_LDAPS = use_ldaps
        self.LDAP_SERVER_ADDRESS = server
        self.LDAP_SERVER_PORT = port

        # used for read-only bind to the LDAP server
        self.LDAP_BIND_DN = bind_dn
        self.LDAP_BIND_PASS = bind_pass

        ldap_server_type = 'ldap'
        if self.LDAP_USE_LDAPS:
            ldap_server_type += 's'
        self.LDAP_SERVER = "%s://%s:%s" % (ldap_server_type,
                                           self.LDAP_SERVER_ADDRESS,
                                           self.LDAP_SERVER_PORT)

        self.BASE_DN = base_dn

    def authenticate_ldap(self, username, password):
        """Authenticate a user via LDAP and return his/her LDAP properties.

        Raises AuthenticationError if the credentials are rejected, or
        EnvironmentError if the LDAP server can't be reached.

        :param username: username
        :param password: password
        """

        from rhodecode.lib.helpers import chop_at

        uid = chop_at(username, "@%s" % self.LDAP_SERVER_ADDRESS)

        if "," in username:
            raise LdapUsernameError("invalid character in username: ,")
        try:
            ldap.set_option(ldap.OPT_X_TLS_CACERTDIR, '/etc/openldap/cacerts')
            ldap.set_option(ldap.OPT_NETWORK_TIMEOUT, 10)
            server = ldap.initialize(self.LDAP_SERVER)
            if self.ldap_version == 2:
                server.protocol = ldap.VERSION2
            else:
                server.protocol = ldap.VERSION3

            if self.LDAP_BIND_DN and self.LDAP_BIND_PASS:
                server.simple_bind_s(self.LDAP_BIND_DN, self.LDAP_BIND_PASS)

            dn = self.BASE_DN % {'user': uid}
            log.debug("Authenticating %r at %s", dn, self.LDAP_SERVER)
            server.simple_bind_s(dn, password)

            properties = server.search_s(dn, ldap.SCOPE_SUBTREE)
            if not properties:
                raise ldap.NO_SUCH_OBJECT()
        except ldap.NO_SUCH_OBJECT:
            log.debug("LDAP says no such user '%s' (%s)", uid, username)
            raise LdapUsernameError()
        except ldap.INVALID_CREDENTIALS:
            log.debug("LDAP rejected password for user '%s' (%s)",
                      uid, username)
            raise LdapPasswordError()
        except ldap.SERVER_DOWN:
            raise LdapConnectionError("LDAP can't access authentication server")

        return properties[0]
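Two pieces of `AuthLdap` are pure string handling and can be shown without a live LDAP server: building the `ldap://` or `ldaps://` URL and expanding the base-DN template into a bind DN. A minimal sketch (function names are illustrative, not RhodeCode's API):

```python
# Minimal sketch of AuthLdap's URL and DN construction; pure string
# handling, so no python-ldap server is needed to demonstrate it.

def build_ldap_server_url(server, port=389, use_ldaps=False):
    scheme = 'ldaps' if use_ldaps else 'ldap'
    if use_ldaps:
        port = port or 636  # standard LDAPS port
    return "%s://%s:%s" % (scheme, server, port)

def build_bind_dn(base_dn_template, uid):
    # the base DN is expected to contain a %(user)s placeholder,
    # e.g. "uid=%(user)s,ou=people,dc=example,dc=com"
    return base_dn_template % {'user': uid}
```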
@@ -0,0 +1,16 b''
"""
Automatically sets the environment variable `CELERY_LOADER` to
`celerypylons.loader:PylonsLoader`.  This ensures the loader is
specified when accessing the rest of this package, and allows celery
to be installed in a webapp just by importing celerypylons::

    import celerypylons

"""
import os
import warnings

CELERYPYLONS_LOADER = 'rhodecode.lib.celerypylons.loader.PylonsLoader'
if os.environ.get('CELERY_LOADER', CELERYPYLONS_LOADER) != CELERYPYLONS_LOADER:
    warnings.warn("'CELERY_LOADER' environment variable will be overridden by celery-pylons.")
os.environ['CELERY_LOADER'] = CELERYPYLONS_LOADER
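The import-time logic above is a small "force an environment variable, warning on conflict" pattern: warn only when a different value is already set, then install ours unconditionally. The same pattern extracted into a reusable function (the name and variable are illustrative):

```python
# Stand-alone demonstration of the CELERY_LOADER override pattern:
# warn only when an incompatible value pre-exists, then force ours.
import os
import warnings

def force_env(name, wanted):
    """Set os.environ[name] to `wanted`; return True if a different
    pre-existing value was overridden (after emitting a warning)."""
    overridden = os.environ.get(name, wanted) != wanted
    if overridden:
        warnings.warn("%r environment variable will be overridden" % name)
    os.environ[name] = wanted
    return overridden
```

Using `os.environ.get(name, wanted)` makes the unset case compare equal, so no warning fires on a clean environment.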
@@ -0,0 +1,90 b''
from rhodecode.lib.utils import BasePasterCommand, Command


__all__ = ['CeleryDaemonCommand', 'CeleryBeatCommand',
           'CAMQPAdminCommand', 'CeleryEventCommand']


class CeleryDaemonCommand(BasePasterCommand):
    """Start the celery worker

    Starts the celery worker that uses a paste.deploy configuration
    file.
    """
    usage = 'CONFIG_FILE [celeryd options...]'
    summary = __doc__.splitlines()[0]
    description = "".join(__doc__.splitlines()[2:])

    parser = Command.standard_parser(quiet=True)

    def update_parser(self):
        from celery.bin import celeryd
        for x in celeryd.WorkerCommand().get_options():
            self.parser.add_option(x)

    def command(self):
        from celery.bin import celeryd
        return celeryd.WorkerCommand().run(**vars(self.options))


class CeleryBeatCommand(BasePasterCommand):
    """Start the celery beat server

    Starts the celery beat server using a paste.deploy configuration
    file.
    """
    usage = 'CONFIG_FILE [celerybeat options...]'
    summary = __doc__.splitlines()[0]
    description = "".join(__doc__.splitlines()[2:])

    parser = Command.standard_parser(quiet=True)

    def update_parser(self):
        from celery.bin import celerybeat
        for x in celerybeat.BeatCommand().get_options():
            self.parser.add_option(x)

    def command(self):
        from celery.bin import celerybeat
        # .run() must be called here; instantiating BeatCommand alone
        # would never actually start beat
        return celerybeat.BeatCommand().run(**vars(self.options))


class CAMQPAdminCommand(BasePasterCommand):
    """CAMQP Admin

    CAMQP celery admin tool.
    """
    usage = 'CONFIG_FILE [camqadm options...]'
    summary = __doc__.splitlines()[0]
    description = "".join(__doc__.splitlines()[2:])

    parser = Command.standard_parser(quiet=True)

    def update_parser(self):
        from celery.bin import camqadm
        for x in camqadm.OPTION_LIST:
            self.parser.add_option(x)

    def command(self):
        from celery.bin import camqadm
        return camqadm.camqadm(*self.args, **vars(self.options))


class CeleryEventCommand(BasePasterCommand):
    """Celery event command.

    Capture celery events.
    """
    usage = 'CONFIG_FILE [celeryev options...]'
    summary = __doc__.splitlines()[0]
    description = "".join(__doc__.splitlines()[2:])

    parser = Command.standard_parser(quiet=True)

    def update_parser(self):
        from celery.bin import celeryev
        for x in celeryev.OPTION_LIST:
            self.parser.add_option(x)

    def command(self):
        from celery.bin import celeryev
        return celeryev.run_celeryev(**vars(self.options))
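Each command class above derives its `summary` and `description` from its own docstring: line 0 becomes the one-line summary, and everything after the blank line becomes the long description. That convention works because, inside a class body, the docstring is already bound to the local name `__doc__` before the rest of the body runs. The trick in isolation, on a toy class:

```python
# Demonstrates the docstring-to-help convention used by the paster
# commands: summary = first line, description = lines after the blank.

class DemoCommand(object):
    """Start the demo worker

    Longer help text describing
    what the command does.
    """
    summary = __doc__.splitlines()[0]
    description = "".join(__doc__.splitlines()[2:])
```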
@@ -0,0 +1,55 b''
from celery.loaders.base import BaseLoader
from pylons import config

to_pylons = lambda x: x.replace('_', '.').lower()
to_celery = lambda x: x.replace('.', '_').upper()

LIST_PARAMS = """CELERY_IMPORTS ADMINS ROUTES""".split()


class PylonsSettingsProxy(object):
    """Pylons Settings Proxy

    Proxies settings from pylons.config

    """
    def __getattr__(self, key):
        pylons_key = to_pylons(key)
        try:
            value = config[pylons_key]
            if key in LIST_PARAMS:
                return value.split()
            return self.type_converter(value)
        except KeyError:
            raise AttributeError(pylons_key)

    def __setattr__(self, key, value):
        pylons_key = to_pylons(key)
        config[pylons_key] = value

    def type_converter(self, value):
        # cast to int
        if value.isdigit():
            return int(value)

        # cast to bool
        if value.lower() in ['true', 'false']:
            return value.lower() == 'true'

        return value


class PylonsLoader(BaseLoader):
    """Pylons celery loader

    Maps the celery config onto pylons.config

    """
    def read_configuration(self):
        self.configured = True
        return PylonsSettingsProxy()

    def on_worker_init(self):
        """Import task modules."""
        self.import_default_modules()
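The proxy relies on two pure helpers: a key mapping between celery-style names (`CELERY_RESULT_BACKEND`) and dotted pylons keys (`celery.result.backend`), and a coercion of raw config strings into int/bool where possible. Both can be exercised stand-alone:

```python
# The key mapping and string-to-type coercion used by the proxy above,
# extracted so they can be run without pylons or celery installed.

to_pylons = lambda x: x.replace('_', '.').lower()
to_celery = lambda x: x.replace('.', '_').upper()

def type_converter(value):
    # cast to int
    if value.isdigit():
        return int(value)
    # cast to bool
    if value.lower() in ('true', 'false'):
        return value.lower() == 'true'
    return value
```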
@@ -0,0 +1,69 b''
# -*- coding: utf-8 -*-
"""
    rhodecode.lib.dbmigrate.__init__
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    Database migration modules

    :created_on: Dec 11, 2010
    :author: marcink
    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
    :license: GPLv3, see COPYING for more details.
"""
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; version 2
# of the License or (at your option) any later version of the license.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
# MA  02110-1301, USA.

import logging
from sqlalchemy import engine_from_config

from rhodecode.lib.utils import BasePasterCommand, Command, add_cache
from rhodecode.lib.db_manage import DbManage

log = logging.getLogger(__name__)


class UpgradeDb(BasePasterCommand):
    """Command used by paster to upgrade our database to a newer version"""

    max_args = 1
    min_args = 1

    usage = "CONFIG_FILE"
    summary = "Upgrades current db to newer version given configuration file"
    group_name = "RhodeCode"

    parser = Command.standard_parser(verbose=True)

    def command(self):
        from pylons import config

        add_cache(config)

        db_uri = config['sqlalchemy.db1.url']

        dbmanage = DbManage(log_sql=True, dbconf=db_uri,
                            root=config['here'], tests=False)

        dbmanage.upgrade()

    def update_parser(self):
        self.parser.add_option('--sql',
                               action='store_true',
                               dest='just_sql',
                               help="Prints upgrade sql for further investigation",
                               default=False)
@@ -0,0 +1,20 b''
[db_settings]
# Used to identify which repository this database is versioned under.
# You can use the name of your project.
repository_id=rhodecode_db_migrations

# The name of the database table used to track the schema version.
# This name shouldn't already be used by your project.
# If this is changed once a database is under version control, you'll need to
# change the table name in each database too.
version_table=db_migrate_version

# When committing a change script, Migrate will attempt to generate the
# sql for all supported databases; normally, if one of them fails - probably
# because you don't have that database installed - it is ignored and the
# commit continues, perhaps ending successfully.
# Databases in this list MUST compile successfully during a commit, or the
# entire commit will fail. List the databases your application will actually
# be using to ensure your updates to that database work properly.
# This must be a list; example: ['postgres','sqlite']
required_dbs=['sqlite']
@@ -0,0 +1,9 b''
"""
SQLAlchemy migrate provides two APIs: :mod:`migrate.versioning` for
database schema version and repository management, and
:mod:`migrate.changeset`, which allows defining database schema changes
using Python.
"""

from rhodecode.lib.dbmigrate.migrate.versioning import *
from rhodecode.lib.dbmigrate.migrate.changeset import *
@@ -0,0 +1,28 b''
"""
This module extends SQLAlchemy and provides additional DDL [#]_
support.

.. [#] SQL Data Definition Language
"""
import re
import warnings

import sqlalchemy
from sqlalchemy import __version__ as _sa_version

warnings.simplefilter('always', DeprecationWarning)

_sa_version = tuple(int(re.match(r"\d+", x).group(0)) for x in _sa_version.split("."))
SQLA_06 = _sa_version >= (0, 6)

del re
del _sa_version

from rhodecode.lib.dbmigrate.migrate.changeset.schema import *
from rhodecode.lib.dbmigrate.migrate.changeset.constraint import *

sqlalchemy.schema.Table.__bases__ += (ChangesetTable, )
sqlalchemy.schema.Column.__bases__ += (ChangesetColumn, )
sqlalchemy.schema.Index.__bases__ += (ChangesetIndex, )

sqlalchemy.schema.DefaultClause.__bases__ += (ChangesetDefaultClause, )
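The module above injects extra behavior into SQLAlchemy's schema classes by appending mixins to their `__bases__`, so existing and future instances gain the changeset methods without subclassing. The mechanism on toy classes (note that CPython only permits this when the patched class does not inherit directly and solely from `object`):

```python
# Demonstrates the __bases__ monkey-patching technique used to graft
# ChangesetTable et al. onto SQLAlchemy's classes, on toy stand-ins.

class SchemaItem(object):
    pass

class Table(SchemaItem):
    pass

class ChangesetTable(object):
    def alter(self):
        return 'altered'

# append the mixin: every Table, existing or future, now has .alter()
Table.__bases__ += (ChangesetTable,)
```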
@@ -0,0 +1,358 b'' | |||||
|
1 | """ | |||
|
2 | Extensions to SQLAlchemy for altering existing tables. | |||
|
3 | ||||
|
4 | At the moment, this isn't so much based off of ANSI as much as | |||
|
5 | things that just happen to work with multiple databases. | |||
|
6 | """ | |||
|
7 | import StringIO | |||
|
8 | ||||
|
9 | import sqlalchemy as sa | |||
|
10 | from sqlalchemy.schema import SchemaVisitor | |||
|
11 | from sqlalchemy.engine.default import DefaultDialect | |||
|
12 | from sqlalchemy.sql import ClauseElement | |||
|
13 | from sqlalchemy.schema import (ForeignKeyConstraint, | |||
|
14 | PrimaryKeyConstraint, | |||
|
15 | CheckConstraint, | |||
|
16 | UniqueConstraint, | |||
|
17 | Index) | |||
|
18 | ||||
|
19 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
20 | from rhodecode.lib.dbmigrate.migrate.changeset import constraint, SQLA_06 | |||
|
21 | ||||
|
22 | if not SQLA_06: | |||
|
23 | from sqlalchemy.sql.compiler import SchemaGenerator, SchemaDropper | |||
|
24 | else: | |||
|
25 | from sqlalchemy.schema import AddConstraint, DropConstraint | |||
|
26 | from sqlalchemy.sql.compiler import DDLCompiler | |||
|
27 | SchemaGenerator = SchemaDropper = DDLCompiler | |||
|
28 | ||||
|
29 | ||||
|
30 | class AlterTableVisitor(SchemaVisitor): | |||
|
31 | """Common operations for ``ALTER TABLE`` statements.""" | |||
|
32 | ||||
|
33 | if SQLA_06: | |||
|
34 | # engine.Compiler looks for .statement | |||
|
35 | # when it spawns off a new compiler | |||
|
36 | statement = ClauseElement() | |||
|
37 | ||||
|
38 | def append(self, s): | |||
|
39 | """Append content to the SchemaIterator's query buffer.""" | |||
|
40 | ||||
|
41 | self.buffer.write(s) | |||
|
42 | ||||
|
43 | def execute(self): | |||
|
44 | """Execute the contents of the SchemaIterator's buffer.""" | |||
|
45 | try: | |||
|
46 | return self.connection.execute(self.buffer.getvalue()) | |||
|
47 | finally: | |||
|
48 | self.buffer.truncate(0) | |||
|
49 | ||||
|
50 | def __init__(self, dialect, connection, **kw): | |||
|
51 | self.connection = connection | |||
|
52 | self.buffer = StringIO.StringIO() | |||
|
53 | self.preparer = dialect.identifier_preparer | |||
|
54 | self.dialect = dialect | |||
|
55 | ||||
|
56 | def traverse_single(self, elem): | |||
|
57 | ret = super(AlterTableVisitor, self).traverse_single(elem) | |||
|
58 | if ret: | |||
|
59 | # adapt to 0.6 which uses a string-returning | |||
|
60 | # object | |||
|
61 | self.append(" %s" % ret) | |||
|
62 | ||||
|
63 | def _to_table(self, param): | |||
|
64 | """Returns the table object for the given param object.""" | |||
|
65 | if isinstance(param, (sa.Column, sa.Index, sa.schema.Constraint)): | |||
|
66 | ret = param.table | |||
|
67 | else: | |||
|
68 | ret = param | |||
|
69 | return ret | |||
|
70 | ||||
|
71 | def start_alter_table(self, param): | |||
|
72 | """Returns the start of an ``ALTER TABLE`` SQL-Statement. | |||
|
73 | ||||
|
74 | Use the param object to determine the table name and use it | |||
|
75 | for building the SQL statement. | |||
|
76 | ||||
|
77 | :param param: object to determine the table from | |||
|
78 | :type param: :class:`sqlalchemy.Column`, :class:`sqlalchemy.Index`, | |||
|
79 | :class:`sqlalchemy.schema.Constraint`, :class:`sqlalchemy.Table`, | |||
|
80 | or string (table name) | |||
|
81 | """ | |||
|
82 | table = self._to_table(param) | |||
|
83 | self.append('\nALTER TABLE %s ' % self.preparer.format_table(table)) | |||
|
84 | return table | |||
|
85 | ||||
|
86 | ||||
|
87 | class ANSIColumnGenerator(AlterTableVisitor, SchemaGenerator): | |||
|
88 | """Extends ansisql generator for column creation (alter table add col)""" | |||
|
89 | ||||
|
90 | def visit_column(self, column): | |||
|
91 | """Create a column (table already exists). | |||
|
92 | ||||
|
93 | :param column: column object | |||
|
94 | :type column: :class:`sqlalchemy.Column` instance | |||
|
95 | """ | |||
|
96 | if column.default is not None: | |||
|
97 | self.traverse_single(column.default) | |||
|
98 | ||||
|
99 | table = self.start_alter_table(column) | |||
|
100 | self.append("ADD ") | |||
|
101 | self.append(self.get_column_specification(column)) | |||
|
102 | ||||
|
103 | for cons in column.constraints: | |||
|
104 | self.traverse_single(cons) | |||
|
105 | self.execute() | |||
|
106 | ||||
|
107 | # ALTER TABLE STATEMENTS | |||
|
108 | ||||
|
109 | # add indexes and unique constraints | |||
|
110 | if column.index_name: | |||
|
111 | Index(column.index_name, column).create() | |||
|
112 | elif column.unique_name: | |||
|
113 | constraint.UniqueConstraint(column, | |||
|
114 | name=column.unique_name).create() | |||
|
115 | ||||
|
116 | # SA binds FK constraints to the table; add them manually | |||
|
117 | for fk in column.foreign_keys: | |||
|
118 | self.add_foreignkey(fk.constraint) | |||
|
119 | ||||
|
120 | # add primary key constraint if needed | |||
|
121 | if column.primary_key_name: | |||
|
122 | cons = constraint.PrimaryKeyConstraint(column, | |||
|
123 | name=column.primary_key_name) | |||
|
124 | cons.create() | |||
|
125 | ||||
|
126 | if SQLA_06: | |||
|
127 | def add_foreignkey(self, fk): | |||
|
128 | self.connection.execute(AddConstraint(fk)) | |||
|
129 | ||||
|
130 | class ANSIColumnDropper(AlterTableVisitor, SchemaDropper): | |||
|
131 | """Extends ANSI SQL dropper for column dropping (``ALTER TABLE | |||
|
132 | DROP COLUMN``). | |||
|
133 | """ | |||
|
134 | ||||
|
135 | def visit_column(self, column): | |||
|
136 | """Drop a column from its table. | |||
|
137 | ||||
|
138 | :param column: the column object | |||
|
139 | :type column: :class:`sqlalchemy.Column` | |||
|
140 | """ | |||
|
141 | table = self.start_alter_table(column) | |||
|
142 | self.append('DROP COLUMN %s' % self.preparer.format_column(column)) | |||
|
143 | self.execute() | |||
|
144 | ||||
|
145 | ||||
|
146 | class ANSISchemaChanger(AlterTableVisitor, SchemaGenerator): | |||
|
147 | """Manages changes to existing schema elements. | |||
|
148 | ||||
|
149 | Note that columns are schema elements; ``ALTER TABLE ADD COLUMN`` | |||
|
150 | is in SchemaGenerator. | |||
|
151 | ||||
|
152 | All items may be renamed. Columns can also have many of their properties - | |||
|
153 | type, for example - changed. | |||
|
154 | ||||
|
155 | Each function is passed a tuple containing (object, name), where | |||

156 | object is the type of object you'd expect for that function | |||

157 | (i.e. a table for visit_table) and name is the object's new | |||

158 | name. None means the name is unchanged. | |||
|
159 | """ | |||
|
160 | ||||
|
161 | def visit_table(self, table): | |||
|
162 | """Rename a table. Other ops aren't supported.""" | |||
|
163 | self.start_alter_table(table) | |||
|
164 | self.append("RENAME TO %s" % self.preparer.quote(table.new_name, | |||
|
165 | table.quote)) | |||
|
166 | self.execute() | |||
|
167 | ||||
|
168 | def visit_index(self, index): | |||
|
169 | """Rename an index""" | |||
|
170 | if hasattr(self, '_validate_identifier'): | |||
|
171 | # SA <= 0.6.3 | |||
|
172 | self.append("ALTER INDEX %s RENAME TO %s" % ( | |||
|
173 | self.preparer.quote( | |||
|
174 | self._validate_identifier( | |||
|
175 | index.name, True), index.quote), | |||
|
176 | self.preparer.quote( | |||
|
177 | self._validate_identifier( | |||
|
178 | index.new_name, True), index.quote))) | |||
|
179 | else: | |||
|
180 | # SA >= 0.6.5 | |||
|
181 | self.append("ALTER INDEX %s RENAME TO %s" % ( | |||
|
182 | self.preparer.quote( | |||
|
183 | self._index_identifier( | |||
|
184 | index.name), index.quote), | |||
|
185 | self.preparer.quote( | |||
|
186 | self._index_identifier( | |||
|
187 | index.new_name), index.quote))) | |||
|
188 | self.execute() | |||
|
189 | ||||
|
190 | def visit_column(self, delta): | |||
|
191 | """Rename/change a column.""" | |||
|
192 | # ALTER COLUMN is implemented as several ALTER statements | |||
|
193 | keys = delta.keys() | |||
|
194 | if 'type' in keys: | |||
|
195 | self._run_subvisit(delta, self._visit_column_type) | |||
|
196 | if 'nullable' in keys: | |||
|
197 | self._run_subvisit(delta, self._visit_column_nullable) | |||
|
198 | if 'server_default' in keys: | |||
|
199 | # Skip 'default': only handle server-side defaults, others | |||
|
200 | # are managed by the app, not the db. | |||
|
201 | self._run_subvisit(delta, self._visit_column_default) | |||
|
202 | if 'name' in keys: | |||
|
203 | self._run_subvisit(delta, self._visit_column_name, start_alter=False) | |||
|
204 | ||||
|
205 | def _run_subvisit(self, delta, func, start_alter=True): | |||
|
206 | """Runs visit method based on what needs to be changed on column""" | |||
|
207 | table = self._to_table(delta.table) | |||
|
208 | col_name = delta.current_name | |||
|
209 | if start_alter: | |||
|
210 | self.start_alter_column(table, col_name) | |||
|
211 | ret = func(table, delta.result_column, delta) | |||
|
212 | self.execute() | |||
|
213 | ||||
|
214 | def start_alter_column(self, table, col_name): | |||
|
215 | """Starts ALTER COLUMN""" | |||
|
216 | self.start_alter_table(table) | |||
|
217 | self.append("ALTER COLUMN %s " % self.preparer.quote(col_name, table.quote)) | |||
|
218 | ||||
|
219 | def _visit_column_nullable(self, table, column, delta): | |||
|
220 | nullable = delta['nullable'] | |||
|
221 | if nullable: | |||
|
222 | self.append("DROP NOT NULL") | |||
|
223 | else: | |||
|
224 | self.append("SET NOT NULL") | |||
|
225 | ||||
|
226 | def _visit_column_default(self, table, column, delta): | |||
|
227 | default_text = self.get_column_default_string(column) | |||
|
228 | if default_text is not None: | |||
|
229 | self.append("SET DEFAULT %s" % default_text) | |||
|
230 | else: | |||
|
231 | self.append("DROP DEFAULT") | |||
|
232 | ||||
|
233 | def _visit_column_type(self, table, column, delta): | |||
|
234 | type_ = delta['type'] | |||
|
235 | if SQLA_06: | |||
|
236 | type_text = str(type_.compile(dialect=self.dialect)) | |||
|
237 | else: | |||
|
238 | type_text = type_.dialect_impl(self.dialect).get_col_spec() | |||
|
239 | self.append("TYPE %s" % type_text) | |||
|
240 | ||||
|
241 | def _visit_column_name(self, table, column, delta): | |||
|
242 | self.start_alter_table(table) | |||
|
243 | col_name = self.preparer.quote(delta.current_name, table.quote) | |||
|
244 | new_name = self.preparer.format_column(delta.result_column) | |||
|
245 | self.append('RENAME COLUMN %s TO %s' % (col_name, new_name)) | |||
|
246 | ||||
|
247 | ||||
|
248 | class ANSIConstraintCommon(AlterTableVisitor): | |||
|
249 | """ | |||
|
250 | Migrate's constraints require a separate creation function from | |||
|
251 | SA's: Migrate's constraints are created independently of a table; | |||
|
252 | SA's are created at the same time as the table. | |||
|
253 | """ | |||
|
254 | ||||
|
255 | def get_constraint_name(self, cons): | |||
|
256 | """Gets a name for the given constraint. | |||
|
257 | ||||
|
258 | If the name is already set it will be used otherwise the | |||
|
259 | constraint's :meth:`autoname <migrate.changeset.constraint.ConstraintChangeset.autoname>` | |||
|
260 | method is used. | |||
|
261 | ||||
|
262 | :param cons: constraint object | |||
|
263 | """ | |||
|
264 | if cons.name is not None: | |||
|
265 | ret = cons.name | |||
|
266 | else: | |||
|
267 | ret = cons.name = cons.autoname() | |||
|
268 | return self.preparer.quote(ret, cons.quote) | |||
|
269 | ||||
|
270 | def visit_migrate_primary_key_constraint(self, *p, **k): | |||
|
271 | self._visit_constraint(*p, **k) | |||
|
272 | ||||
|
273 | def visit_migrate_foreign_key_constraint(self, *p, **k): | |||
|
274 | self._visit_constraint(*p, **k) | |||
|
275 | ||||
|
276 | def visit_migrate_check_constraint(self, *p, **k): | |||
|
277 | self._visit_constraint(*p, **k) | |||
|
278 | ||||
|
279 | def visit_migrate_unique_constraint(self, *p, **k): | |||
|
280 | self._visit_constraint(*p, **k) | |||
|
281 | ||||
|
282 | if SQLA_06: | |||
|
283 | class ANSIConstraintGenerator(ANSIConstraintCommon, SchemaGenerator): | |||
|
284 | def _visit_constraint(self, constraint): | |||
|
285 | constraint.name = self.get_constraint_name(constraint) | |||
|
286 | self.append(self.process(AddConstraint(constraint))) | |||
|
287 | self.execute() | |||
|
288 | ||||
|
289 | class ANSIConstraintDropper(ANSIConstraintCommon, SchemaDropper): | |||
|
290 | def _visit_constraint(self, constraint): | |||
|
291 | constraint.name = self.get_constraint_name(constraint) | |||
|
292 | self.append(self.process(DropConstraint(constraint, cascade=constraint.cascade))) | |||
|
293 | self.execute() | |||
|
294 | ||||
|
295 | else: | |||
|
296 | class ANSIConstraintGenerator(ANSIConstraintCommon, SchemaGenerator): | |||
|
297 | ||||
|
298 | def get_constraint_specification(self, cons, **kwargs): | |||
|
299 | """Constraint SQL generators. | |||
|
300 | ||||
|
301 | We cannot use SA visitors because they append comma. | |||
|
302 | """ | |||
|
303 | ||||
|
304 | if isinstance(cons, PrimaryKeyConstraint): | |||
|
305 | if cons.name is not None: | |||
|
306 | self.append("CONSTRAINT %s " % self.preparer.format_constraint(cons)) | |||
|
307 | self.append("PRIMARY KEY ") | |||
|
308 | self.append("(%s)" % ', '.join(self.preparer.quote(c.name, c.quote) | |||
|
309 | for c in cons)) | |||
|
310 | self.define_constraint_deferrability(cons) | |||
|
311 | elif isinstance(cons, ForeignKeyConstraint): | |||
|
312 | self.define_foreign_key(cons) | |||
|
313 | elif isinstance(cons, CheckConstraint): | |||
|
314 | if cons.name is not None: | |||
|
315 | self.append("CONSTRAINT %s " % | |||
|
316 | self.preparer.format_constraint(cons)) | |||
|
317 | self.append("CHECK (%s)" % cons.sqltext) | |||
|
318 | self.define_constraint_deferrability(cons) | |||
|
319 | elif isinstance(cons, UniqueConstraint): | |||
|
320 | if cons.name is not None: | |||
|
321 | self.append("CONSTRAINT %s " % | |||
|
322 | self.preparer.format_constraint(cons)) | |||
|
323 | self.append("UNIQUE (%s)" % \ | |||
|
324 | (', '.join(self.preparer.quote(c.name, c.quote) for c in cons))) | |||
|
325 | self.define_constraint_deferrability(cons) | |||
|
326 | else: | |||
|
327 | raise exceptions.InvalidConstraintError(cons) | |||
|
328 | ||||
|
329 | def _visit_constraint(self, constraint): | |||
|
330 | ||||
|
331 | table = self.start_alter_table(constraint) | |||
|
332 | constraint.name = self.get_constraint_name(constraint) | |||
|
333 | self.append("ADD ") | |||
|
334 | self.get_constraint_specification(constraint) | |||
|
335 | self.execute() | |||
|
336 | ||||
|
337 | ||||
|
338 | class ANSIConstraintDropper(ANSIConstraintCommon, SchemaDropper): | |||
|
339 | ||||
|
340 | def _visit_constraint(self, constraint): | |||
|
341 | self.start_alter_table(constraint) | |||
|
342 | self.append("DROP CONSTRAINT ") | |||
|
343 | constraint.name = self.get_constraint_name(constraint) | |||
|
344 | self.append(self.preparer.format_constraint(constraint)) | |||
|
345 | if constraint.cascade: | |||
|
346 | self.cascade_constraint(constraint) | |||
|
347 | self.execute() | |||
|
348 | ||||
|
349 | def cascade_constraint(self, constraint): | |||
|
350 | self.append(" CASCADE") | |||
|
351 | ||||
|
352 | ||||
|
353 | class ANSIDialect(DefaultDialect): | |||
|
354 | columngenerator = ANSIColumnGenerator | |||
|
355 | columndropper = ANSIColumnDropper | |||
|
356 | schemachanger = ANSISchemaChanger | |||
|
357 | constraintgenerator = ANSIConstraintGenerator | |||
|
358 | constraintdropper = ANSIConstraintDropper |
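The `AlterTableVisitor` defined above never builds a statement in one go: `start_alter_table()` and the per-change visit methods each `append()` a fragment into a `StringIO` buffer, and `execute()` sends the whole buffer to the connection and clears it. A minimal Python 3 sketch of that accumulate-then-flush pattern (the class and method names here are illustrative, not part of the library):

```python
from io import StringIO

class BufferedDDL:
    """Toy stand-in for AlterTableVisitor's query buffer: fragments are
    accumulated with append() and emitted as one statement on flush()."""

    def __init__(self):
        self.buffer = StringIO()
        self.executed = []          # stands in for connection.execute()

    def append(self, fragment):
        self.buffer.write(fragment)

    def flush(self):
        # mirrors execute(): read the whole buffer, record it, then reset
        sql = self.buffer.getvalue().strip()
        self.executed.append(sql)
        self.buffer = StringIO()    # simpler than truncate(0) + seek(0)
        return sql

ddl = BufferedDDL()
ddl.append('\nALTER TABLE users ')  # what start_alter_table() contributes
ddl.append('ADD ')
ddl.append('age INTEGER')
print(ddl.flush())  # ALTER TABLE users ADD age INTEGER
```

The buffer reset in the `finally` clause of the real `execute()` is what lets one visitor instance emit several ALTER statements in sequence.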
@@ -0,0 +1,202 b'' | |||||
|
1 | """ | |||
|
2 | This module defines standalone schema constraint classes. | |||
|
3 | """ | |||
|
4 | from sqlalchemy import schema | |||
|
5 | ||||
|
6 | from rhodecode.lib.dbmigrate.migrate.exceptions import * | |||
|
7 | from rhodecode.lib.dbmigrate.migrate.changeset import SQLA_06 | |||
|
8 | ||||
|
9 | class ConstraintChangeset(object): | |||
|
10 | """Base class for Constraint classes.""" | |||
|
11 | ||||
|
12 | def _normalize_columns(self, cols, table_name=False): | |||
|
13 | """Given: column objects or names; return col names and | |||
|
14 | (maybe) a table""" | |||
|
15 | colnames = [] | |||
|
16 | table = None | |||
|
17 | for col in cols: | |||
|
18 | if isinstance(col, schema.Column): | |||
|
19 | if col.table is not None and table is None: | |||
|
20 | table = col.table | |||
|
21 | if table_name: | |||
|
22 | col = '.'.join((col.table.name, col.name)) | |||
|
23 | else: | |||
|
24 | col = col.name | |||
|
25 | colnames.append(col) | |||
|
26 | return colnames, table | |||
|
27 | ||||
|
28 | def __do_imports(self, visitor_name, *a, **kw): | |||
|
29 | engine = kw.pop('engine', self.table.bind) | |||
|
30 | from rhodecode.lib.dbmigrate.migrate.changeset.databases.visitor import (get_engine_visitor, | |||
|
31 | run_single_visitor) | |||
|
32 | visitorcallable = get_engine_visitor(engine, visitor_name) | |||
|
33 | run_single_visitor(engine, visitorcallable, self, *a, **kw) | |||
|
34 | ||||
|
35 | def create(self, *a, **kw): | |||
|
36 | """Create the constraint in the database. | |||
|
37 | ||||
|
38 | :param engine: the database engine to use. If this is \ | |||
|
39 | :keyword:`None` the instance's engine will be used | |||
|
40 | :type engine: :class:`sqlalchemy.engine.base.Engine` | |||
|
41 | :param connection: reuse connection instead of creating a new one. | |||
|
42 | :type connection: :class:`sqlalchemy.engine.base.Connection` instance | |||
|
43 | """ | |||
|
44 | # TODO: set the parent here instead of in __init__ | |||
|
45 | self.__do_imports('constraintgenerator', *a, **kw) | |||
|
46 | ||||
|
47 | def drop(self, *a, **kw): | |||
|
48 | """Drop the constraint from the database. | |||
|
49 | ||||
|
50 | :param engine: the database engine to use. If this is | |||
|
51 | :keyword:`None` the instance's engine will be used | |||
|
52 | :param cascade: Issue CASCADE drop if database supports it | |||
|
53 | :type engine: :class:`sqlalchemy.engine.base.Engine` | |||
|
54 | :type cascade: bool | |||
|
55 | :param connection: reuse connection instead of creating a new one. | |||
|
56 | :type connection: :class:`sqlalchemy.engine.base.Connection` instance | |||
|
57 | :returns: Instance with cleared columns | |||
|
58 | """ | |||
|
59 | self.cascade = kw.pop('cascade', False) | |||
|
60 | self.__do_imports('constraintdropper', *a, **kw) | |||
|
61 | # the spirit of Constraint objects is that they | |||
|
62 | # are immutable (just like in a DB. they're only ADDed | |||
|
63 | # or DROPped). | |||
|
64 | #self.columns.clear() | |||
|
65 | return self | |||
|
66 | ||||
|
67 | ||||
|
68 | class PrimaryKeyConstraint(ConstraintChangeset, schema.PrimaryKeyConstraint): | |||
|
69 | """Construct PrimaryKeyConstraint | |||
|
70 | ||||
|
71 | Migrate's additional parameters: | |||
|
72 | ||||
|
73 | :param cols: Columns in constraint. | |||
|
74 | :param table: If columns are passed as strings, this kw is required | |||
|
75 | :type table: Table instance | |||
|
76 | :type cols: strings or Column instances | |||
|
77 | """ | |||
|
78 | ||||
|
79 | __migrate_visit_name__ = 'migrate_primary_key_constraint' | |||
|
80 | ||||
|
81 | def __init__(self, *cols, **kwargs): | |||
|
82 | colnames, table = self._normalize_columns(cols) | |||
|
83 | table = kwargs.pop('table', table) | |||
|
84 | super(PrimaryKeyConstraint, self).__init__(*colnames, **kwargs) | |||
|
85 | if table is not None: | |||
|
86 | self._set_parent(table) | |||
|
87 | ||||
|
88 | ||||
|
89 | def autoname(self): | |||
|
90 | """Mimic the database's automatic constraint names""" | |||
|
91 | return "%s_pkey" % self.table.name | |||
|
92 | ||||
|
93 | ||||
|
94 | class ForeignKeyConstraint(ConstraintChangeset, schema.ForeignKeyConstraint): | |||
|
95 | """Construct ForeignKeyConstraint | |||
|
96 | ||||
|
97 | Migrate's additional parameters: | |||
|
98 | ||||
|
99 | :param columns: Columns in constraint | |||
|
100 | :param refcolumns: Columns that this FK refers to in another table. | |||
|
101 | :param table: If columns are passed as strings, this kw is required | |||
|
102 | :type table: Table instance | |||
|
103 | :type columns: list of strings or Column instances | |||
|
104 | :type refcolumns: list of strings or Column instances | |||
|
105 | """ | |||
|
106 | ||||
|
107 | __migrate_visit_name__ = 'migrate_foreign_key_constraint' | |||
|
108 | ||||
|
109 | def __init__(self, columns, refcolumns, *args, **kwargs): | |||
|
110 | colnames, table = self._normalize_columns(columns) | |||
|
111 | table = kwargs.pop('table', table) | |||
|
112 | refcolnames, reftable = self._normalize_columns(refcolumns, | |||
|
113 | table_name=True) | |||
|
114 | super(ForeignKeyConstraint, self).__init__(colnames, refcolnames, *args, | |||
|
115 | **kwargs) | |||
|
116 | if table is not None: | |||
|
117 | self._set_parent(table) | |||
|
118 | ||||
|
119 | @property | |||
|
120 | def referenced(self): | |||
|
121 | return [e.column for e in self.elements] | |||
|
122 | ||||
|
123 | @property | |||
|
124 | def reftable(self): | |||
|
125 | return self.referenced[0].table | |||
|
126 | ||||
|
127 | def autoname(self): | |||
|
128 | """Mimic the database's automatic constraint names""" | |||
|
129 | if hasattr(self.columns, 'keys'): | |||
|
130 | # SA <= 0.5 | |||
|
131 | firstcol = self.columns[self.columns.keys()[0]] | |||
|
132 | ret = "%(table)s_%(firstcolumn)s_fkey" % dict( | |||
|
133 | table=firstcol.table.name, | |||
|
134 | firstcolumn=firstcol.name,) | |||
|
135 | else: | |||
|
136 | # SA >= 0.6 | |||
|
137 | ret = "%(table)s_%(firstcolumn)s_fkey" % dict( | |||
|
138 | table=self.table.name, | |||
|
139 | firstcolumn=self.columns[0],) | |||
|
140 | return ret | |||
|
141 | ||||
|
142 | ||||
|
143 | class CheckConstraint(ConstraintChangeset, schema.CheckConstraint): | |||
|
144 | """Construct CheckConstraint | |||
|
145 | ||||
|
146 | Migrate's additional parameters: | |||
|
147 | ||||
|
148 | :param sqltext: Plain SQL text to check condition | |||
|
149 | :param columns: If no name is given, you must supply this kw\ | |||

150 | to autoname the constraint | |||
|
151 | :param table: If columns are passed as strings, this kw is required | |||
|
152 | :type table: Table instance | |||
|
153 | :type columns: list of Columns instances | |||
|
154 | :type sqltext: string | |||
|
155 | """ | |||
|
156 | ||||
|
157 | __migrate_visit_name__ = 'migrate_check_constraint' | |||
|
158 | ||||
|
159 | def __init__(self, sqltext, *args, **kwargs): | |||
|
160 | cols = kwargs.pop('columns', []) | |||
|
161 | if not cols and not kwargs.get('name', False): | |||
|
162 | raise InvalidConstraintError('You must either set the "name" ' | |||

163 | 'parameter or "columns" to autogenerate it.') | |||
|
164 | colnames, table = self._normalize_columns(cols) | |||
|
165 | table = kwargs.pop('table', table) | |||
|
166 | schema.CheckConstraint.__init__(self, sqltext, *args, **kwargs) | |||
|
167 | if table is not None: | |||
|
168 | if not SQLA_06: | |||
|
169 | self.table = table | |||
|
170 | self._set_parent(table) | |||
|
171 | self.colnames = colnames | |||
|
172 | ||||
|
173 | def autoname(self): | |||
|
174 | return "%(table)s_%(cols)s_check" % \ | |||
|
175 | dict(table=self.table.name, cols="_".join(self.colnames)) | |||
|
176 | ||||
|
177 | ||||
|
178 | class UniqueConstraint(ConstraintChangeset, schema.UniqueConstraint): | |||
|
179 | """Construct UniqueConstraint | |||
|
180 | ||||
|
181 | Migrate's additional parameters: | |||
|
182 | ||||
|
183 | :param cols: Columns in constraint. | |||
|
184 | :param table: If columns are passed as strings, this kw is required | |||
|
185 | :type table: Table instance | |||
|
186 | :type cols: strings or Column instances | |||
|
187 | ||||
|
188 | .. versionadded:: 0.6.0 | |||
|
189 | """ | |||
|
190 | ||||
|
191 | __migrate_visit_name__ = 'migrate_unique_constraint' | |||
|
192 | ||||
|
193 | def __init__(self, *cols, **kwargs): | |||
|
194 | self.colnames, table = self._normalize_columns(cols) | |||
|
195 | table = kwargs.pop('table', table) | |||
|
196 | super(UniqueConstraint, self).__init__(*self.colnames, **kwargs) | |||
|
197 | if table is not None: | |||
|
198 | self._set_parent(table) | |||
|
199 | ||||
|
200 | def autoname(self): | |||
|
201 | """Mimic the database's automatic constraint names""" | |||
|
202 | return "%s_%s_key" % (self.table.name, self.colnames[0]) |
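The `autoname()` methods above mimic common database defaults: `<table>_pkey` for primary keys, `<table>_<firstcolumn>_fkey` for foreign keys, `<table>_<cols>_check` for checks, and `<table>_<firstcolumn>_key` for unique constraints. A hypothetical helper condensing those conventions (note that `CheckConstraint.autoname` actually joins every column name, not just the first):

```python
def autoname(kind, table, first_column=None):
    """Hypothetical condensation of the autoname() conventions above."""
    if kind == 'primary':
        return '%s_pkey' % table
    suffix = {'foreign': 'fkey', 'unique': 'key', 'check': 'check'}[kind]
    return '%s_%s_%s' % (table, first_column, suffix)

print(autoname('primary', 'users'))              # users_pkey
print(autoname('foreign', 'orders', 'user_id'))  # orders_user_id_fkey
print(autoname('unique', 'users', 'email'))      # users_email_key
```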
@@ -0,0 +1,10 b'' | |||||
|
1 | """ | |||
|
2 | This module contains database dialect specific changeset | |||
|
3 | implementations. | |||
|
4 | """ | |||
|
5 | __all__ = [ | |||
|
6 | 'postgres', | |||
|
7 | 'sqlite', | |||
|
8 | 'mysql', | |||
|
9 | 'oracle', | |||
|
10 | ] |
@@ -0,0 +1,80 b'' | |||||
|
1 | """ | |||
|
2 | Firebird database specific implementations of changeset classes. | |||
|
3 | """ | |||
|
4 | from sqlalchemy.databases import firebird as sa_base | |||
|
5 | ||||
|
6 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
7 | from rhodecode.lib.dbmigrate.migrate.changeset import ansisql, SQLA_06 | |||
|
8 | ||||
|
9 | ||||
|
10 | if SQLA_06: | |||
|
11 | FBSchemaGenerator = sa_base.FBDDLCompiler | |||
|
12 | else: | |||
|
13 | FBSchemaGenerator = sa_base.FBSchemaGenerator | |||
|
14 | ||||
|
15 | class FBColumnGenerator(FBSchemaGenerator, ansisql.ANSIColumnGenerator): | |||
|
16 | """Firebird column generator implementation.""" | |||
|
17 | ||||
|
18 | ||||
|
19 | class FBColumnDropper(ansisql.ANSIColumnDropper): | |||
|
20 | """Firebird column dropper implementation.""" | |||
|
21 | ||||
|
22 | def visit_column(self, column): | |||
|
23 | """Firebird supports 'DROP col' instead of 'DROP COLUMN col' syntax | |||
|
24 | ||||
|
25 | Drop primary key and unique constraints if the dropped column is part of them.""" | |||
|
26 | if column.primary_key: | |||
|
27 | if column.table.primary_key.columns.contains_column(column): | |||
|
28 | column.table.primary_key.drop() | |||
|
29 | # TODO: recreate primary key if it references more than this column | |||
|
30 | if column.unique or getattr(column, 'unique_name', None): | |||
|
31 | for cons in column.table.constraints: | |||
|
32 | if cons.contains_column(column): | |||
|
33 | cons.drop() | |||
|
34 | # TODO: recreate unique constraint if it references more than this column | |||
|
35 | ||||
|
36 | table = self.start_alter_table(column) | |||
|
37 | self.append('DROP %s' % self.preparer.format_column(column)) | |||
|
38 | self.execute() | |||
|
39 | ||||
|
40 | ||||
|
41 | class FBSchemaChanger(ansisql.ANSISchemaChanger): | |||
|
42 | """Firebird schema changer implementation.""" | |||
|
43 | ||||
|
44 | def visit_table(self, table): | |||
|
45 | """Rename table not supported""" | |||
|
46 | raise exceptions.NotSupportedError( | |||
|
47 | "Firebird does not support renaming tables.") | |||
|
48 | ||||
|
49 | def _visit_column_name(self, table, column, delta): | |||
|
50 | self.start_alter_table(table) | |||
|
51 | col_name = self.preparer.quote(delta.current_name, table.quote) | |||
|
52 | new_name = self.preparer.format_column(delta.result_column) | |||
|
53 | self.append('ALTER COLUMN %s TO %s' % (col_name, new_name)) | |||
|
54 | ||||
|
55 | def _visit_column_nullable(self, table, column, delta): | |||
|
56 | """Changing NULL is not supported""" | |||
|
57 | # TODO: http://www.firebirdfaq.org/faq103/ | |||
|
58 | raise exceptions.NotSupportedError( | |||
|
59 | "Firebird does not support altering NULL behavior.") | |||
|
60 | ||||
|
61 | ||||
|
62 | class FBConstraintGenerator(ansisql.ANSIConstraintGenerator): | |||
|
63 | """Firebird constraint generator implementation.""" | |||
|
64 | ||||
|
65 | ||||
|
66 | class FBConstraintDropper(ansisql.ANSIConstraintDropper): | |||
|
67 | """Firebird constraint dropper implementation.""" | |||
|
68 | ||||
|
69 | def cascade_constraint(self, constraint): | |||
|
70 | """Cascading constraints is not supported""" | |||
|
71 | raise exceptions.NotSupportedError( | |||
|
72 | "Firebird does not support cascading constraints") | |||
|
73 | ||||
|
74 | ||||
|
75 | class FBDialect(ansisql.ANSIDialect): | |||
|
76 | columngenerator = FBColumnGenerator | |||
|
77 | columndropper = FBColumnDropper | |||
|
78 | schemachanger = FBSchemaChanger | |||
|
79 | constraintgenerator = FBConstraintGenerator | |||
|
80 | constraintdropper = FBConstraintDropper |
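`FBColumnDropper.visit_column` above emits `DROP col` where the ANSI dropper emits `DROP COLUMN col`. A toy illustration of the two spellings (the helper function is invented for illustration, not part of the library):

```python
def drop_column_sql(table, column, dialect='ansi'):
    """Hypothetical helper: ANSI dialects emit 'DROP COLUMN', Firebird
    just 'DROP', matching ANSIColumnDropper vs FBColumnDropper above."""
    keyword = 'DROP' if dialect == 'firebird' else 'DROP COLUMN'
    return 'ALTER TABLE %s %s %s' % (table, keyword, column)

print(drop_column_sql('users', 'age'))              # ALTER TABLE users DROP COLUMN age
print(drop_column_sql('users', 'age', 'firebird'))  # ALTER TABLE users DROP age
```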
@@ -0,0 +1,94 b'' | |||||
|
1 | """ | |||
|
2 | MySQL database specific implementations of changeset classes. | |||
|
3 | """ | |||
|
4 | ||||
|
5 | from sqlalchemy.databases import mysql as sa_base | |||
|
6 | from sqlalchemy import types as sqltypes | |||
|
7 | ||||
|
8 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
9 | from rhodecode.lib.dbmigrate.migrate.changeset import ansisql, SQLA_06 | |||
|
10 | ||||
|
11 | ||||
|
12 | if not SQLA_06: | |||
|
13 | MySQLSchemaGenerator = sa_base.MySQLSchemaGenerator | |||
|
14 | else: | |||
|
15 | MySQLSchemaGenerator = sa_base.MySQLDDLCompiler | |||
|
16 | ||||
|
17 | class MySQLColumnGenerator(MySQLSchemaGenerator, ansisql.ANSIColumnGenerator): | |||
|
18 | pass | |||
|
19 | ||||
|
20 | ||||
|
21 | class MySQLColumnDropper(ansisql.ANSIColumnDropper): | |||
|
22 | pass | |||
|
23 | ||||
|
24 | ||||
|
25 | class MySQLSchemaChanger(MySQLSchemaGenerator, ansisql.ANSISchemaChanger): | |||
|
26 | ||||
|
27 | def visit_column(self, delta): | |||
|
28 | table = delta.table | |||
|
29 | colspec = self.get_column_specification(delta.result_column) | |||
|
30 | if delta.result_column.autoincrement: | |||
|
31 | primary_keys = [c for c in table.primary_key.columns | |||
|
32 | if (c.autoincrement and | |||
|
33 | isinstance(c.type, sqltypes.Integer) and | |||
|
34 | not c.foreign_keys)] | |||
|
35 | ||||
|
36 | if primary_keys: | |||
|
37 | first = primary_keys.pop(0) | |||
|
38 | if first.name == delta.current_name: | |||
|
39 | colspec += " AUTO_INCREMENT" | |||
|
40 | old_col_name = self.preparer.quote(delta.current_name, table.quote) | |||
|
41 | ||||
|
42 | self.start_alter_table(table) | |||
|
43 | ||||
|
44 | self.append("CHANGE COLUMN %s " % old_col_name) | |||
|
45 | self.append(colspec) | |||
|
46 | self.execute() | |||
|
47 | ||||
|
48 | def visit_index(self, param): | |||
|
49 | # If MySQL can do this, I can't find how | |||
|
50 | raise exceptions.NotSupportedError("MySQL cannot rename indexes") | |||
|
51 | ||||
|
52 | ||||
|
53 | class MySQLConstraintGenerator(ansisql.ANSIConstraintGenerator): | |||
|
54 | pass | |||
|
55 | ||||
|
56 | if SQLA_06: | |||
|
57 | class MySQLConstraintDropper(MySQLSchemaGenerator, ansisql.ANSIConstraintDropper): | |||
|
58 | def visit_migrate_check_constraint(self, *p, **k): | |||
|
59 | raise exceptions.NotSupportedError("MySQL does not support CHECK" | |||
|
60 | " constraints, use triggers instead.") | |||
|
61 | ||||
|
62 | else: | |||
|
63 | class MySQLConstraintDropper(ansisql.ANSIConstraintDropper): | |||
|
64 | ||||
|
65 | def visit_migrate_primary_key_constraint(self, constraint): | |||
|
66 | self.start_alter_table(constraint) | |||
|
67 | self.append("DROP PRIMARY KEY") | |||
|
68 | self.execute() | |||
|
69 | ||||
|
70 | def visit_migrate_foreign_key_constraint(self, constraint): | |||
|
71 | self.start_alter_table(constraint) | |||
|
72 | self.append("DROP FOREIGN KEY ") | |||
|
73 | constraint.name = self.get_constraint_name(constraint) | |||
|
74 | self.append(self.preparer.format_constraint(constraint)) | |||
|
75 | self.execute() | |||
|
76 | ||||
|
77 | def visit_migrate_check_constraint(self, *p, **k): | |||
|
78 | raise exceptions.NotSupportedError("MySQL does not support CHECK" | |||
|
79 | " constraints, use triggers instead.") | |||
|
80 | ||||
|
81 | def visit_migrate_unique_constraint(self, constraint, *p, **k): | |||
|
82 | self.start_alter_table(constraint) | |||
|
83 | self.append('DROP INDEX ') | |||
|
84 | constraint.name = self.get_constraint_name(constraint) | |||
|
85 | self.append(self.preparer.format_constraint(constraint)) | |||
|
86 | self.execute() | |||
|
87 | ||||
|
88 | ||||
|
89 | class MySQLDialect(ansisql.ANSIDialect): | |||
|
90 | columngenerator = MySQLColumnGenerator | |||
|
91 | columndropper = MySQLColumnDropper | |||
|
92 | schemachanger = MySQLSchemaChanger | |||
|
93 | constraintgenerator = MySQLConstraintGenerator | |||
|
94 | constraintdropper = MySQLConstraintDropper |
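`MySQLSchemaChanger.visit_column` above folds every alteration (rename, type, nullability) into a single `CHANGE COLUMN old_name <full new column spec>` clause, re-appending `AUTO_INCREMENT` when the changed column is the auto-incrementing integer primary key. A rough sketch of the statement it builds (the helper name is illustrative):

```python
def mysql_change_column(table, old_name, colspec, autoincrement=False):
    """Hypothetical helper mirroring MySQLSchemaChanger.visit_column: the
    complete new column specification is restated after CHANGE COLUMN."""
    if autoincrement:
        colspec += ' AUTO_INCREMENT'
    return 'ALTER TABLE %s CHANGE COLUMN %s %s' % (table, old_name, colspec)

print(mysql_change_column('users', 'fullname', 'name VARCHAR(255) NOT NULL'))
# ALTER TABLE users CHANGE COLUMN fullname name VARCHAR(255) NOT NULL
```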
@@ -0,0 +1,111 b'' | |||||
|
1 | """ | |||
|
2 | Oracle database specific implementations of changeset classes. | |||
|
3 | """ | |||
|
4 | import sqlalchemy as sa | |||
|
5 | from sqlalchemy.databases import oracle as sa_base | |||
|
6 | ||||
|
7 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
8 | from rhodecode.lib.dbmigrate.migrate.changeset import ansisql, SQLA_06 | |||
|
9 | ||||
|
10 | ||||
|
11 | if not SQLA_06: | |||
|
12 | OracleSchemaGenerator = sa_base.OracleSchemaGenerator | |||
|
13 | else: | |||
|
14 | OracleSchemaGenerator = sa_base.OracleDDLCompiler | |||
|
15 | ||||
|
16 | ||||
|
17 | class OracleColumnGenerator(OracleSchemaGenerator, ansisql.ANSIColumnGenerator): | |||
|
18 | pass | |||
|
19 | ||||
|
20 | ||||
|
21 | class OracleColumnDropper(ansisql.ANSIColumnDropper): | |||
|
22 | pass | |||
|
23 | ||||
|
24 | ||||
|
25 | class OracleSchemaChanger(OracleSchemaGenerator, ansisql.ANSISchemaChanger): | |||
|
26 | ||||
|
27 | def get_column_specification(self, column, **kwargs): | |||
|
28 | # Ignore the NOT NULL generated | |||
|
29 | override_nullable = kwargs.pop('override_nullable', None) | |||
|
30 | if override_nullable: | |||
|
31 | orig = column.nullable | |||
|
32 | column.nullable = True | |||
|
33 | ret = super(OracleSchemaChanger, self).get_column_specification( | |||
|
34 | column, **kwargs) | |||
|
35 | if override_nullable: | |||
|
36 | column.nullable = orig | |||
|
37 | return ret | |||
|
38 | ||||
|
39 | def visit_column(self, delta): | |||
|
40 | keys = delta.keys() | |||
|
41 | ||||
|
42 | if 'name' in keys: | |||
|
43 | self._run_subvisit(delta, | |||
|
44 | self._visit_column_name, | |||
|
45 | start_alter=False) | |||
|
46 | ||||
|
47 | if len(set(('type', 'nullable', 'server_default')).intersection(keys)): | |||
|
48 | self._run_subvisit(delta, | |||
|
49 | self._visit_column_change, | |||
|
50 | start_alter=False) | |||
|
51 | ||||
|
52 | def _visit_column_change(self, table, column, delta): | |||
|
53 | # Oracle cannot drop a default once created, but it can set it | |||
|
54 | # to null. We'll do that if default=None | |||
|
55 | # http://forums.oracle.com/forums/message.jspa?messageID=1273234#1273234 | |||
|
56 | dropdefault_hack = (column.server_default is None \ | |||
|
57 | and 'server_default' in delta.keys()) | |||
|
58 | # Oracle apparently doesn't like it when we say "not null" if | |||
|
59 | # the column's already not null. Fudge it, so we don't need a | |||
|
60 | # new function | |||
|
61 | notnull_hack = ((not column.nullable) \ | |||
|
62 | and ('nullable' not in delta.keys())) | |||
|
63 | # We need to specify NULL if we're removing a NOT NULL | |||
|
64 | # constraint | |||
|
65 | null_hack = (column.nullable and ('nullable' in delta.keys())) | |||
|
66 | ||||
|
67 | if dropdefault_hack: | |||
|
68 | column.server_default = sa.PassiveDefault(sa.sql.null()) | |||
|
69 | if notnull_hack: | |||
|
70 | column.nullable = True | |||
|
71 | colspec = self.get_column_specification(column, | |||
|
72 | override_nullable=null_hack) | |||
|
73 | if null_hack: | |||
|
74 | colspec += ' NULL' | |||
|
75 | if notnull_hack: | |||
|
76 | column.nullable = False | |||
|
77 | if dropdefault_hack: | |||
|
78 | column.server_default = None | |||
|
79 | ||||
|
80 | self.start_alter_table(table) | |||
|
81 | self.append("MODIFY (") | |||
|
82 | self.append(colspec) | |||
|
83 | self.append(")") | |||
|
84 | ||||
|
85 | ||||
|
86 | class OracleConstraintCommon(object): | |||
|
87 | ||||
|
88 | def get_constraint_name(self, cons): | |||
|
89 | # Oracle constraints can't guess their name like other DBs | |||
|
90 | if not cons.name: | |||
|
91 | raise exceptions.NotSupportedError( | |||
|
92 | "Oracle constraint names must be explicitly stated") | |||
|
93 | return cons.name | |||
|
94 | ||||
|
95 | ||||
|
96 | class OracleConstraintGenerator(OracleConstraintCommon, | |||
|
97 | ansisql.ANSIConstraintGenerator): | |||
|
98 | pass | |||
|
99 | ||||
|
100 | ||||
|
101 | class OracleConstraintDropper(OracleConstraintCommon, | |||
|
102 | ansisql.ANSIConstraintDropper): | |||
|
103 | pass | |||
|
104 | ||||
|
105 | ||||
|
106 | class OracleDialect(ansisql.ANSIDialect): | |||
|
107 | columngenerator = OracleColumnGenerator | |||
|
108 | columndropper = OracleColumnDropper | |||
|
109 | schemachanger = OracleSchemaChanger | |||
|
110 | constraintgenerator = OracleConstraintGenerator | |||
|
111 | constraintdropper = OracleConstraintDropper |
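The NULL/NOT NULL juggling in `OracleSchemaChanger._visit_column_change` above exists because Oracle's `ALTER TABLE ... MODIFY` rejects a redundant NOT NULL and needs an explicit trailing NULL when a NOT NULL constraint is being removed. A minimal sketch of the statement being assembled (the helper name and inputs are illustrative, not part of the migrate API):

```python
def oracle_modify_clause(table_name, colspec, add_null_keyword):
    # Oracle changes columns with MODIFY (...) rather than ALTER COLUMN;
    # an explicit trailing NULL is required when removing a NOT NULL
    # constraint (the "null_hack" in the code above).
    if add_null_keyword:
        colspec += ' NULL'
    return 'ALTER TABLE %s MODIFY (%s)' % (table_name, colspec)

print(oracle_modify_clause('users', 'email VARCHAR2(100)', True))
# ALTER TABLE users MODIFY (email VARCHAR2(100) NULL)
```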
@@ -0,0 +1,46 b'' | |||||
|
1 | """ | |||
|
2 | `PostgreSQL`_ database specific implementations of changeset classes. | |||
|
3 | ||||
|
4 | .. _`PostgreSQL`: http://www.postgresql.org/ | |||
|
5 | """ | |||
|
6 | from rhodecode.lib.dbmigrate.migrate.changeset import ansisql, SQLA_06 | |||
|
7 | ||||
|
8 | if not SQLA_06: | |||
|
9 | from sqlalchemy.databases import postgres as sa_base | |||
|
10 | PGSchemaGenerator = sa_base.PGSchemaGenerator | |||
|
11 | else: | |||
|
12 | from sqlalchemy.databases import postgresql as sa_base | |||
|
13 | PGSchemaGenerator = sa_base.PGDDLCompiler | |||
|
14 | ||||
|
15 | ||||
|
16 | class PGColumnGenerator(PGSchemaGenerator, ansisql.ANSIColumnGenerator): | |||
|
17 | """PostgreSQL column generator implementation.""" | |||
|
18 | pass | |||
|
19 | ||||
|
20 | ||||
|
21 | class PGColumnDropper(ansisql.ANSIColumnDropper): | |||
|
22 | """PostgreSQL column dropper implementation.""" | |||
|
23 | pass | |||
|
24 | ||||
|
25 | ||||
|
26 | class PGSchemaChanger(ansisql.ANSISchemaChanger): | |||
|
27 | """PostgreSQL schema changer implementation.""" | |||
|
28 | pass | |||
|
29 | ||||
|
30 | ||||
|
31 | class PGConstraintGenerator(ansisql.ANSIConstraintGenerator): | |||
|
32 | """PostgreSQL constraint generator implementation.""" | |||
|
33 | pass | |||
|
34 | ||||
|
35 | ||||
|
36 | class PGConstraintDropper(ansisql.ANSIConstraintDropper): | |||
|
37 | """PostgreSQL constaint dropper implementation.""" | |||
|
38 | pass | |||
|
39 | ||||
|
40 | ||||
|
41 | class PGDialect(ansisql.ANSIDialect): | |||
|
42 | columngenerator = PGColumnGenerator | |||
|
43 | columndropper = PGColumnDropper | |||
|
44 | schemachanger = PGSchemaChanger | |||
|
45 | constraintgenerator = PGConstraintGenerator | |||
|
46 | constraintdropper = PGConstraintDropper |
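PostgreSQL needs no special-casing, so the classes above simply mix the SQLAlchemy PG compiler with the generic ANSI implementation and rely on Python's MRO to prefer the database-specific methods. A small self-contained sketch of that mixin pattern (the class names and `quote` method are illustrative stand-ins, not the real migrate classes):

```python
class ANSIColumnGenerator(object):
    def quote(self, name):
        return '"%s"' % name       # generic ANSI double quoting

class MySQLCompiler(object):
    def quote(self, name):
        return '`%s`' % name       # MySQL-style backtick quoting

class MySQLColumnGenerator(MySQLCompiler, ANSIColumnGenerator):
    pass  # no body needed: the MRO picks MySQLCompiler.quote first

print(MySQLColumnGenerator().quote('user'))   # `user`
```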
@@ -0,0 +1,148 b'' | |||||
|
1 | """ | |||
|
2 | `SQLite`_ database specific implementations of changeset classes. | |||
|
3 | ||||
|
4 | .. _`SQLite`: http://www.sqlite.org/ | |||
|
5 | """ | |||
|
6 | from UserDict import DictMixin | |||
|
7 | from copy import copy | |||
|
8 | ||||
|
9 | from sqlalchemy.databases import sqlite as sa_base | |||
|
10 | ||||
|
11 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
12 | from rhodecode.lib.dbmigrate.migrate.changeset import ansisql, SQLA_06 | |||
|
13 | ||||
|
14 | ||||
|
15 | if not SQLA_06: | |||
|
16 | SQLiteSchemaGenerator = sa_base.SQLiteSchemaGenerator | |||
|
17 | else: | |||
|
18 | SQLiteSchemaGenerator = sa_base.SQLiteDDLCompiler | |||
|
19 | ||||
|
20 | class SQLiteCommon(object): | |||
|
21 | ||||
|
22 | def _not_supported(self, op): | |||
|
23 | raise exceptions.NotSupportedError("SQLite does not support " | |||
|
24 | "%s; see http://www.sqlite.org/lang_altertable.html" % op) | |||
|
25 | ||||
|
26 | ||||
|
27 | class SQLiteHelper(SQLiteCommon): | |||
|
28 | ||||
|
29 | def recreate_table(self, table, column=None, delta=None): | |||
|
30 | table_name = self.preparer.format_table(table) | |||
|
31 | ||||
|
32 | # we remove all indexes so as not to have | |||
|
33 | # problems during copy and re-create | |||
|
34 | for index in table.indexes: | |||
|
35 | index.drop() | |||
|
36 | ||||
|
37 | self.append('ALTER TABLE %s RENAME TO migration_tmp' % table_name) | |||
|
38 | self.execute() | |||
|
39 | ||||
|
40 | insertion_string = self._modify_table(table, column, delta) | |||
|
41 | ||||
|
42 | table.create() | |||
|
43 | self.append(insertion_string % {'table_name': table_name}) | |||
|
44 | self.execute() | |||
|
45 | self.append('DROP TABLE migration_tmp') | |||
|
46 | self.execute() | |||
|
47 | ||||
|
48 | def visit_column(self, delta): | |||
|
49 | if isinstance(delta, DictMixin): | |||
|
50 | column = delta.result_column | |||
|
51 | table = self._to_table(delta.table) | |||
|
52 | else: | |||
|
53 | column = delta | |||
|
54 | table = self._to_table(column.table) | |||
|
55 | self.recreate_table(table, column, delta) | |||
|
56 | ||||
|
57 | class SQLiteColumnGenerator(SQLiteSchemaGenerator, | |||
|
58 | ansisql.ANSIColumnGenerator, | |||
|
59 | # at the end so we get the normal | |||
|
60 | # visit_column by default | |||
|
61 | SQLiteHelper, | |||
|
62 | SQLiteCommon | |||
|
63 | ): | |||
|
64 | """SQLite ColumnGenerator""" | |||
|
65 | ||||
|
66 | def _modify_table(self, table, column, delta): | |||
|
67 | columns = ' ,'.join(map( | |||
|
68 | self.preparer.format_column, | |||
|
69 | [c for c in table.columns if c.name!=column.name])) | |||
|
70 | return ('INSERT INTO %%(table_name)s (%(cols)s) ' | |||
|
71 | 'SELECT %(cols)s from migration_tmp')%{'cols':columns} | |||
|
72 | ||||
|
73 | def visit_column(self, column): | |||
|
74 | if column.foreign_keys: | |||
|
75 | SQLiteHelper.visit_column(self, column) | |||
|
76 | else: | |||
|
77 | super(SQLiteColumnGenerator, self).visit_column(column) | |||
|
78 | ||||
|
79 | class SQLiteColumnDropper(SQLiteHelper, ansisql.ANSIColumnDropper): | |||
|
80 | """SQLite ColumnDropper""" | |||
|
81 | ||||
|
82 | def _modify_table(self, table, column, delta): | |||
|
83 | columns = ' ,'.join(map(self.preparer.format_column, table.columns)) | |||
|
84 | return 'INSERT INTO %(table_name)s SELECT ' + columns + \ | |||
|
85 | ' from migration_tmp' | |||
|
86 | ||||
|
87 | ||||
|
88 | class SQLiteSchemaChanger(SQLiteHelper, ansisql.ANSISchemaChanger): | |||
|
89 | """SQLite SchemaChanger""" | |||
|
90 | ||||
|
91 | def _modify_table(self, table, column, delta): | |||
|
92 | return 'INSERT INTO %(table_name)s SELECT * from migration_tmp' | |||
|
93 | ||||
|
94 | def visit_index(self, index): | |||
|
95 | """Does not support ALTER INDEX""" | |||
|
96 | self._not_supported('ALTER INDEX') | |||
|
97 | ||||
|
98 | ||||
|
99 | class SQLiteConstraintGenerator(ansisql.ANSIConstraintGenerator, SQLiteHelper, SQLiteCommon): | |||
|
100 | ||||
|
101 | def visit_migrate_primary_key_constraint(self, constraint): | |||
|
102 | tmpl = "CREATE UNIQUE INDEX %s ON %s ( %s )" | |||
|
103 | cols = ', '.join(map(self.preparer.format_column, constraint.columns)) | |||
|
104 | tname = self.preparer.format_table(constraint.table) | |||
|
105 | name = self.get_constraint_name(constraint) | |||
|
106 | msg = tmpl % (name, tname, cols) | |||
|
107 | self.append(msg) | |||
|
108 | self.execute() | |||
|
109 | ||||
|
110 | def _modify_table(self, table, column, delta): | |||
|
111 | return 'INSERT INTO %(table_name)s SELECT * from migration_tmp' | |||
|
112 | ||||
|
113 | def visit_migrate_foreign_key_constraint(self, *p, **k): | |||
|
114 | self.recreate_table(p[0].table) | |||
|
115 | ||||
|
116 | def visit_migrate_unique_constraint(self, *p, **k): | |||
|
117 | self.recreate_table(p[0].table) | |||
|
118 | ||||
|
119 | ||||
|
120 | class SQLiteConstraintDropper(ansisql.ANSIColumnDropper, | |||
|
121 | SQLiteCommon, | |||
|
122 | ansisql.ANSIConstraintCommon): | |||
|
123 | ||||
|
124 | def visit_migrate_primary_key_constraint(self, constraint): | |||
|
125 | tmpl = "DROP INDEX %s " | |||
|
126 | name = self.get_constraint_name(constraint) | |||
|
127 | msg = tmpl % (name) | |||
|
128 | self.append(msg) | |||
|
129 | self.execute() | |||
|
130 | ||||
|
131 | def visit_migrate_foreign_key_constraint(self, *p, **k): | |||
|
132 | self._not_supported('ALTER TABLE DROP CONSTRAINT') | |||
|
133 | ||||
|
134 | def visit_migrate_check_constraint(self, *p, **k): | |||
|
135 | self._not_supported('ALTER TABLE DROP CONSTRAINT') | |||
|
136 | ||||
|
137 | def visit_migrate_unique_constraint(self, *p, **k): | |||
|
138 | self._not_supported('ALTER TABLE DROP CONSTRAINT') | |||
|
139 | ||||
|
140 | ||||
|
141 | # TODO: technically primary key is a NOT NULL + UNIQUE constraint, should add NOT NULL to index | |||
|
142 | ||||
|
143 | class SQLiteDialect(ansisql.ANSIDialect): | |||
|
144 | columngenerator = SQLiteColumnGenerator | |||
|
145 | columndropper = SQLiteColumnDropper | |||
|
146 | schemachanger = SQLiteSchemaChanger | |||
|
147 | constraintgenerator = SQLiteConstraintGenerator | |||
|
148 | constraintdropper = SQLiteConstraintDropper |
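`SQLiteHelper.recreate_table` above implements the standard workaround for SQLite's limited `ALTER TABLE`: rename the table to `migration_tmp`, create the new schema, copy the surviving data with `INSERT ... SELECT`, then drop the temporary table. The same four steps can be demonstrated with the standard-library `sqlite3` module (the schema here is illustrative, not from the source):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE t (id INTEGER, name TEXT, obsolete TEXT)')
cur.execute("INSERT INTO t VALUES (1, 'a', 'x')")

# Step 1: move the old table out of the way
cur.execute('ALTER TABLE t RENAME TO migration_tmp')
# Step 2: create the table with the target schema (column dropped)
cur.execute('CREATE TABLE t (id INTEGER, name TEXT)')
# Step 3: copy the remaining columns across
cur.execute('INSERT INTO t (id, name) SELECT id, name FROM migration_tmp')
# Step 4: remove the temporary copy
cur.execute('DROP TABLE migration_tmp')

print(cur.execute('SELECT * FROM t').fetchall())   # [(1, 'a')]
```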
@@ -0,0 +1,78 b'' | |||||
|
1 | """ | |||
|
2 | Module for visitor class mapping. | |||
|
3 | """ | |||
|
4 | import sqlalchemy as sa | |||
|
5 | ||||
|
6 | from rhodecode.lib.dbmigrate.migrate.changeset import ansisql | |||
|
7 | from rhodecode.lib.dbmigrate.migrate.changeset.databases import (sqlite, | |||
|
8 | postgres, | |||
|
9 | mysql, | |||
|
10 | oracle, | |||
|
11 | firebird) | |||
|
12 | ||||
|
13 | ||||
|
14 | # Map SA dialects to the corresponding Migrate extensions | |||
|
15 | DIALECTS = { | |||
|
16 | "default": ansisql.ANSIDialect, | |||
|
17 | "sqlite": sqlite.SQLiteDialect, | |||
|
18 | "postgres": postgres.PGDialect, | |||
|
19 | "postgresql": postgres.PGDialect, | |||
|
20 | "mysql": mysql.MySQLDialect, | |||
|
21 | "oracle": oracle.OracleDialect, | |||
|
22 | "firebird": firebird.FBDialect, | |||
|
23 | } | |||
|
24 | ||||
|
25 | ||||
|
26 | def get_engine_visitor(engine, name): | |||
|
27 | """ | |||
|
28 | Get the visitor implementation for the given database engine. | |||
|
29 | ||||
|
30 | :param engine: SQLAlchemy Engine | |||
|
31 | :param name: Name of the visitor | |||
|
32 | :type name: string | |||
|
33 | :type engine: Engine | |||
|
34 | :returns: visitor | |||
|
35 | """ | |||
|
36 | # TODO: link to supported visitors | |||
|
37 | return get_dialect_visitor(engine.dialect, name) | |||
|
38 | ||||
|
39 | ||||
|
40 | def get_dialect_visitor(sa_dialect, name): | |||
|
41 | """ | |||
|
42 | Get the visitor implementation for the given dialect. | |||
|
43 | ||||
|
44 | Finds the visitor implementation based on the dialect class and | |||
|
45 | returns an instance initialized with the given name. | |||
|
46 | ||||
|
47 | Binds dialect specific preparer to visitor. | |||
|
48 | """ | |||
|
49 | ||||
|
50 | # map sa dialect to migrate dialect and return visitor | |||
|
51 | sa_dialect_name = getattr(sa_dialect, 'name', 'default') | |||
|
52 | migrate_dialect_cls = DIALECTS[sa_dialect_name] | |||
|
53 | visitor = getattr(migrate_dialect_cls, name) | |||
|
54 | ||||
|
55 | # bind preparer | |||
|
56 | visitor.preparer = sa_dialect.preparer(sa_dialect) | |||
|
57 | ||||
|
58 | return visitor | |||
|
59 | ||||
|
60 | def run_single_visitor(engine, visitorcallable, element, | |||
|
61 | connection=None, **kwargs): | |||
|
62 | """Taken from :meth:`sqlalchemy.engine.base.Engine._run_single_visitor` | |||
|
63 | with support for migrate visitors. | |||
|
64 | """ | |||
|
65 | if connection is None: | |||
|
66 | conn = engine.contextual_connect(close_with_result=False) | |||
|
67 | else: | |||
|
68 | conn = connection | |||
|
69 | visitor = visitorcallable(engine.dialect, conn) | |||
|
70 | try: | |||
|
71 | if hasattr(element, '__migrate_visit_name__'): | |||
|
72 | fn = getattr(visitor, 'visit_' + element.__migrate_visit_name__) | |||
|
73 | else: | |||
|
74 | fn = getattr(visitor, 'visit_' + element.__visit_name__) | |||
|
75 | fn(element, **kwargs) | |||
|
76 | finally: | |||
|
77 | if connection is None: | |||
|
78 | conn.close() |
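The `DIALECTS` dict above dispatches on the SQLAlchemy dialect name, with `ansisql.ANSIDialect` registered under `'default'` for dialects that report no name. A minimal sketch of that lookup pattern (stand-in classes; note the real code indexes the dict directly, so an unmapped backend would raise `KeyError` rather than fall back as shown here):

```python
class ANSIDialect(object):
    """Portable fallback implementation (stand-in)."""

class SQLiteDialect(ANSIDialect):
    """SQLite-specific overrides (stand-in)."""

DIALECTS = {'default': ANSIDialect, 'sqlite': SQLiteDialect}

def get_dialect(name):
    # fall back to the ANSI implementation for unknown dialect names
    return DIALECTS.get(name, DIALECTS['default'])

print(get_dialect('sqlite').__name__)   # SQLiteDialect
print(get_dialect('mssql').__name__)    # ANSIDialect
```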
This diff has been collapsed as it changes many lines, (669 lines changed) | |||
@@ -0,0 +1,669 b'' | |||||
|
1 | """ | |||
|
2 | Schema module providing common schema operations. | |||
|
3 | """ | |||
|
4 | import warnings | |||
|
5 | ||||
|
6 | from UserDict import DictMixin | |||
|
7 | ||||
|
8 | import sqlalchemy | |||
|
9 | ||||
|
10 | from sqlalchemy.schema import ForeignKeyConstraint | |||
|
11 | from sqlalchemy.schema import UniqueConstraint | |||
|
12 | ||||
|
13 | from rhodecode.lib.dbmigrate.migrate.exceptions import * | |||
|
14 | from rhodecode.lib.dbmigrate.migrate.changeset import SQLA_06 | |||
|
15 | from rhodecode.lib.dbmigrate.migrate.changeset.databases.visitor import (get_engine_visitor, | |||
|
16 | run_single_visitor) | |||
|
17 | ||||
|
18 | ||||
|
19 | __all__ = [ | |||
|
20 | 'create_column', | |||
|
21 | 'drop_column', | |||
|
22 | 'alter_column', | |||
|
23 | 'rename_table', | |||
|
24 | 'rename_index', | |||
|
25 | 'ChangesetTable', | |||
|
26 | 'ChangesetColumn', | |||
|
27 | 'ChangesetIndex', | |||
|
28 | 'ChangesetDefaultClause', | |||
|
29 | 'ColumnDelta', | |||
|
30 | ] | |||
|
31 | ||||
|
32 | DEFAULT_ALTER_METADATA = True | |||
|
33 | ||||
|
34 | ||||
|
35 | def create_column(column, table=None, *p, **kw): | |||
|
36 | """Create a column, given the table. | |||
|
37 | ||||
|
38 | API to :meth:`ChangesetColumn.create`. | |||
|
39 | """ | |||
|
40 | if table is not None: | |||
|
41 | return table.create_column(column, *p, **kw) | |||
|
42 | return column.create(*p, **kw) | |||
|
43 | ||||
|
44 | ||||
|
45 | def drop_column(column, table=None, *p, **kw): | |||
|
46 | """Drop a column, given the table. | |||
|
47 | ||||
|
48 | API to :meth:`ChangesetColumn.drop`. | |||
|
49 | """ | |||
|
50 | if table is not None: | |||
|
51 | return table.drop_column(column, *p, **kw) | |||
|
52 | return column.drop(*p, **kw) | |||
|
53 | ||||
|
54 | ||||
|
55 | def rename_table(table, name, engine=None, **kw): | |||
|
56 | """Rename a table. | |||
|
57 | ||||
|
58 | If Table instance is given, engine is not used. | |||
|
59 | ||||
|
60 | API to :meth:`ChangesetTable.rename`. | |||
|
61 | ||||
|
62 | :param table: Table to be renamed. | |||
|
63 | :param name: New name for Table. | |||
|
64 | :param engine: Engine instance. | |||
|
65 | :type table: string or Table instance | |||
|
66 | :type name: string | |||
|
67 | :type engine: obj | |||
|
68 | """ | |||
|
69 | table = _to_table(table, engine) | |||
|
70 | table.rename(name, **kw) | |||
|
71 | ||||
|
72 | ||||
|
73 | def rename_index(index, name, table=None, engine=None, **kw): | |||
|
74 | """Rename an index. | |||
|
75 | ||||
|
76 | If Index instance is given, | |||
|
77 | table and engine are not used. | |||
|
78 | ||||
|
79 | API to :meth:`ChangesetIndex.rename`. | |||
|
80 | ||||
|
81 | :param index: Index to be renamed. | |||
|
82 | :param name: New name for index. | |||
|
83 | :param table: Table to which the Index belongs. | |||
|
84 | :param engine: Engine instance. | |||
|
85 | :type index: string or Index instance | |||
|
86 | :type name: string | |||
|
87 | :type table: string or Table instance | |||
|
88 | :type engine: obj | |||
|
89 | """ | |||
|
90 | index = _to_index(index, table, engine) | |||
|
91 | index.rename(name, **kw) | |||
|
92 | ||||
|
93 | ||||
|
94 | def alter_column(*p, **k): | |||
|
95 | """Alter a column. | |||
|
96 | ||||
|
97 | This is a helper function that creates a :class:`ColumnDelta` and | |||
|
98 | runs it. | |||
|
99 | ||||
|
100 | :argument column: | |||
|
101 | The name of the column to be altered or a | |||
|
102 | :class:`ChangesetColumn` column representing it. | |||
|
103 | ||||
|
104 | :param table: | |||
|
105 | A :class:`~sqlalchemy.schema.Table` or table name | |||
|
106 | for the table where the column will be changed. | |||
|
107 | ||||
|
108 | :param engine: | |||
|
109 | The :class:`~sqlalchemy.engine.base.Engine` to use for table | |||
|
110 | reflection and schema alterations. | |||
|
111 | ||||
|
112 | :param alter_metadata: | |||
|
113 | If `True`, which is the default, the | |||
|
114 | :class:`~sqlalchemy.schema.Column` will also be modified. | |||
|
115 | If `False`, the :class:`~sqlalchemy.schema.Column` will be left | |||
|
116 | as it was. | |||
|
117 | ||||
|
118 | :returns: A :class:`ColumnDelta` instance representing the change. | |||
|
119 | ||||
|
120 | ||||
|
121 | """ | |||
|
122 | ||||
|
123 | k.setdefault('alter_metadata', DEFAULT_ALTER_METADATA) | |||
|
124 | ||||
|
125 | if 'table' not in k and isinstance(p[0], sqlalchemy.Column): | |||
|
126 | k['table'] = p[0].table | |||
|
127 | if 'engine' not in k: | |||
|
128 | k['engine'] = k['table'].bind | |||
|
129 | ||||
|
130 | # deprecation | |||
|
131 | if len(p) >= 2 and isinstance(p[1], sqlalchemy.Column): | |||
|
132 | warnings.warn( | |||
|
133 | "Passing a Column object to alter_column is deprecated." | |||
|
134 | " Just pass in keyword parameters instead.", | |||
|
135 | MigrateDeprecationWarning | |||
|
136 | ) | |||
|
137 | engine = k['engine'] | |||
|
138 | delta = ColumnDelta(*p, **k) | |||
|
139 | ||||
|
140 | visitorcallable = get_engine_visitor(engine, 'schemachanger') | |||
|
141 | engine._run_visitor(visitorcallable, delta) | |||
|
142 | ||||
|
143 | return delta | |||
|
144 | ||||
|
145 | ||||
|
146 | def _to_table(table, engine=None): | |||
|
147 | """Return if instance of Table, else construct new with metadata""" | |||
|
148 | if isinstance(table, sqlalchemy.Table): | |||
|
149 | return table | |||
|
150 | ||||
|
151 | # Given: table name, maybe an engine | |||
|
152 | meta = sqlalchemy.MetaData() | |||
|
153 | if engine is not None: | |||
|
154 | meta.bind = engine | |||
|
155 | return sqlalchemy.Table(table, meta) | |||
|
156 | ||||
|
157 | ||||
|
158 | def _to_index(index, table=None, engine=None): | |||
|
159 | """Return if instance of Index, else construct new with metadata""" | |||
|
160 | if isinstance(index, sqlalchemy.Index): | |||
|
161 | return index | |||
|
162 | ||||
|
163 | # Given: index name; table name required | |||
|
164 | table = _to_table(table, engine) | |||
|
165 | ret = sqlalchemy.Index(index) | |||
|
166 | ret.table = table | |||
|
167 | return ret | |||
|
168 | ||||
|
169 | ||||
|
170 | class ColumnDelta(DictMixin, sqlalchemy.schema.SchemaItem): | |||
|
171 | """Extracts the differences between two columns/column-parameters | |||
|
172 | ||||
|
173 | May receive parameters arranged in several different ways: | |||
|
174 | ||||
|
175 | * **current_column, new_column, \*p, \*\*kw** | |||
|
176 | Additional parameters can be specified to override column | |||
|
177 | differences. | |||
|
178 | ||||
|
179 | * **current_column, \*p, \*\*kw** | |||
|
180 | Additional parameters alter current_column. Table name is extracted | |||
|
181 | from current_column object. | |||
|
182 | Name is changed to current_column.name from current_name, | |||
|
183 | if current_name is specified. | |||
|
184 | ||||
|
185 | * **current_col_name, \*p, \*\*kw** | |||
|
186 | The table kw must be specified. | |||
|
187 | ||||
|
188 | :param table: Table to which the current Column should be bound.\ | |||
|
189 | If table name is given, reflection will be used. | |||
|
190 | :type table: string or Table instance | |||
|
191 | :param alter_metadata: If True, it will apply changes to metadata. | |||
|
192 | :type alter_metadata: bool | |||
|
193 | :param metadata: If `alter_metadata` is true, \ | |||
|
194 | metadata is used to reflect table names into Table objects. | |||
|
195 | :type metadata: :class:`MetaData` instance | |||
|
196 | :param engine: When reflecting tables, either engine or metadata must \ | |||
|
197 | be specified to acquire engine object. | |||
|
198 | :type engine: :class:`Engine` instance | |||
|
199 | :returns: :class:`ColumnDelta` instance providing access to the altered \ | |||
|
200 | attributes of `result_column` through a :func:`dict`-like interface. | |||
|
201 | ||||
|
202 | * :class:`ColumnDelta`.result_column is altered column with new attributes | |||
|
203 | ||||
|
204 | * :class:`ColumnDelta`.current_name is current name of column in db | |||
|
205 | ||||
|
206 | ||||
|
207 | """ | |||
|
208 | ||||
|
209 | # Column attributes that can be altered | |||
|
210 | diff_keys = ('name', 'type', 'primary_key', 'nullable', | |||
|
211 | 'server_onupdate', 'server_default', 'autoincrement') | |||
|
212 | diffs = dict() | |||
|
213 | __visit_name__ = 'column' | |||
|
214 | ||||
|
215 | def __init__(self, *p, **kw): | |||
|
216 | self.alter_metadata = kw.pop("alter_metadata", False) | |||
|
217 | self.meta = kw.pop("metadata", None) | |||
|
218 | self.engine = kw.pop("engine", None) | |||
|
219 | ||||
|
220 | # Things are initialized differently depending on how many column | |||
|
221 | # parameters are given. Figure out how many and call the appropriate | |||
|
222 | # method. | |||
|
223 | if len(p) >= 1 and isinstance(p[0], sqlalchemy.Column): | |||
|
224 | # At least one column specified | |||
|
225 | if len(p) >= 2 and isinstance(p[1], sqlalchemy.Column): | |||
|
226 | # Two columns specified | |||
|
227 | diffs = self.compare_2_columns(*p, **kw) | |||
|
228 | else: | |||
|
229 | # Exactly one column specified | |||
|
230 | diffs = self.compare_1_column(*p, **kw) | |||
|
231 | else: | |||
|
232 | # Zero columns specified | |||
|
233 | if not len(p) or not isinstance(p[0], basestring): | |||
|
234 | raise ValueError("First argument must be column name") | |||
|
235 | diffs = self.compare_parameters(*p, **kw) | |||
|
236 | ||||
|
237 | self.apply_diffs(diffs) | |||
|
238 | ||||
|
239 | def __repr__(self): | |||
|
240 | return '<ColumnDelta altermetadata=%r, %s>' % (self.alter_metadata, | |||
|
241 | super(ColumnDelta, self).__repr__()) | |||
|
242 | ||||
|
243 | def __getitem__(self, key): | |||
|
244 | if key not in self.keys(): | |||
|
245 | raise KeyError("No such diff key, available: %s" % self.diffs) | |||
|
246 | return getattr(self.result_column, key) | |||
|
247 | ||||
|
248 | def __setitem__(self, key, value): | |||
|
249 | if key not in self.keys(): | |||
|
250 | raise KeyError("No such diff key, available: %s" % self.diffs) | |||
|
251 | setattr(self.result_column, key, value) | |||
|
252 | ||||
|
253 | def __delitem__(self, key): | |||
|
254 | raise NotImplementedError | |||
|
255 | ||||
|
256 | def keys(self): | |||
|
257 | return self.diffs.keys() | |||
|
258 | ||||
|
259 | def compare_parameters(self, current_name, *p, **k): | |||
|
260 | """Compares Column objects with reflection""" | |||
|
261 | self.table = k.pop('table') | |||
|
262 | self.result_column = self._table.c.get(current_name) | |||
|
263 | if len(p): | |||
|
264 | k = self._extract_parameters(p, k, self.result_column) | |||
|
265 | return k | |||
|
266 | ||||
|
267 | def compare_1_column(self, col, *p, **k): | |||
|
268 | """Compares one Column object""" | |||
|
269 | self.table = k.pop('table', None) | |||
|
270 | if self.table is None: | |||
|
271 | self.table = col.table | |||
|
272 | self.result_column = col | |||
|
273 | if len(p): | |||
|
274 | k = self._extract_parameters(p, k, self.result_column) | |||
|
275 | return k | |||
|
276 | ||||
|
277 | def compare_2_columns(self, old_col, new_col, *p, **k): | |||
|
278 | """Compares two Column objects""" | |||
|
279 | self.process_column(new_col) | |||
|
280 | self.table = k.pop('table', None) | |||
|
281 | # we cannot use bool() on table in SA06 | |||
|
282 | if self.table is None: | |||
|
283 | self.table = old_col.table | |||
|
284 | if self.table is None: | |||
|
285 | self.table = new_col.table | |||
|
286 | self.result_column = old_col | |||
|
287 | ||||
|
288 | # set differences | |||
|
289 | # leave out some stuff for later comp | |||
|
290 | for key in (set(self.diff_keys) - set(('type',))): | |||
|
291 | val = getattr(new_col, key, None) | |||
|
292 | if getattr(self.result_column, key, None) != val: | |||
|
293 | k.setdefault(key, val) | |||
|
294 | ||||
|
295 | # inspect types | |||
|
296 | if not self.are_column_types_eq(self.result_column.type, new_col.type): | |||
|
297 | k.setdefault('type', new_col.type) | |||
|
298 | ||||
|
299 | if len(p): | |||
|
300 | k = self._extract_parameters(p, k, self.result_column) | |||
|
301 | return k | |||
|
302 | ||||
|
303 | def apply_diffs(self, diffs): | |||
|
304 | """Populate dict and column object with new values""" | |||
|
305 | self.diffs = diffs | |||
|
306 | for key in self.diff_keys: | |||
|
307 | if key in diffs: | |||
|
308 | setattr(self.result_column, key, diffs[key]) | |||
|
309 | ||||
|
310 | self.process_column(self.result_column) | |||
|
311 | ||||
|
312 | # create an instance of class type if not yet | |||
|
313 | if 'type' in diffs and callable(self.result_column.type): | |||
|
314 | self.result_column.type = self.result_column.type() | |||
|
315 | ||||
|
316 | # add column to the table | |||
|
317 | if self.table is not None and self.alter_metadata: | |||
|
318 | self.result_column.add_to_table(self.table) | |||
|
319 | ||||
|
320 | def are_column_types_eq(self, old_type, new_type): | |||
|
321 | """Compares two types to be equal""" | |||
|
322 | ret = old_type.__class__ == new_type.__class__ | |||
|
323 | ||||
|
324 | # String length is a special case | |||
|
325 | if ret and isinstance(new_type, sqlalchemy.types.String): | |||
|
326 | ret = (getattr(old_type, 'length', None) == \ | |||
|
327 | getattr(new_type, 'length', None)) | |||
|
328 | return ret | |||
|
329 | ||||
|
330 | def _extract_parameters(self, p, k, column): | |||
|
331 | """Extracts data from p and modifies diffs""" | |||
|
332 | p = list(p) | |||
|
333 | while len(p): | |||
|
334 | if isinstance(p[0], basestring): | |||
|
335 | k.setdefault('name', p.pop(0)) | |||
|
336 | elif isinstance(p[0], sqlalchemy.types.AbstractType): | |||
|
337 | k.setdefault('type', p.pop(0)) | |||
|
338 | elif callable(p[0]): | |||
|
339 | p[0] = p[0]() | |||
|
340 | else: | |||
|
341 | break | |||
|
342 | ||||
|
343 | if len(p): | |||
|
344 | new_col = column.copy_fixed() | |||
|
345 | new_col._init_items(*p) | |||
|
346 | k = self.compare_2_columns(column, new_col, **k) | |||
|
347 | return k | |||
|
348 | ||||
|
349 | def process_column(self, column): | |||
|
350 | """Processes default values for column""" | |||
|
351 | # XXX: this is a snippet from SA processing of positional parameters | |||
|
352 | if not SQLA_06 and column.args: | |||
|
353 | toinit = list(column.args) | |||
|
354 | else: | |||
|
355 | toinit = list() | |||
|
356 | ||||
|
357 | if column.server_default is not None: | |||
|
358 | if isinstance(column.server_default, sqlalchemy.FetchedValue): | |||
|
359 | toinit.append(column.server_default) | |||
|
360 | else: | |||
|
361 | toinit.append(sqlalchemy.DefaultClause(column.server_default)) | |||
|
362 | if column.server_onupdate is not None: | |||
|
363 | if isinstance(column.server_onupdate, sqlalchemy.FetchedValue): | |||
|
364 | toinit.append(column.server_onupdate) | |||
|
365 | else: | |||
|
366 | toinit.append(sqlalchemy.DefaultClause(column.server_onupdate, | |||
|
367 | for_update=True)) | |||
|
368 | if toinit: | |||
|
369 | column._init_items(*toinit) | |||
|
370 | ||||
|
371 | if not SQLA_06: | |||
|
372 | column.args = [] | |||
|
373 | ||||
|
374 | def _get_table(self): | |||
|
375 | return getattr(self, '_table', None) | |||
|
376 | ||||
|
377 | def _set_table(self, table): | |||
|
378 | if isinstance(table, basestring): | |||
|
379 | if self.alter_metadata: | |||
|
380 | if not self.meta: | |||
|
381 | raise ValueError("metadata must be specified for table" | |||
|
382 | " reflection when using alter_metadata") | |||
|
383 | meta = self.meta | |||
|
384 | if self.engine: | |||
|
385 | meta.bind = self.engine | |||
|
386 | else: | |||
|
387 | if not self.engine and not self.meta: | |||
|
388 | raise ValueError("engine or metadata must be specified" | |||
|
389 | " to reflect tables") | |||
|
390 | if not self.engine: | |||
|
391 | self.engine = self.meta.bind | |||
|
392 | meta = sqlalchemy.MetaData(bind=self.engine) | |||
|
393 | self._table = sqlalchemy.Table(table, meta, autoload=True) | |||
|
394 | elif isinstance(table, sqlalchemy.Table): | |||
|
395 | self._table = table | |||
|
396 | if not self.alter_metadata: | |||
|
397 | self._table.meta = sqlalchemy.MetaData(bind=self._table.bind) | |||
|
398 | ||||
|
399 | def _get_result_column(self): | |||
|
400 | return getattr(self, '_result_column', None) | |||
|
401 | ||||
|
402 | def _set_result_column(self, column): | |||
|
403 | """Set Column to Table based on alter_metadata evaluation.""" | |||
|
404 | self.process_column(column) | |||
|
405 | if not hasattr(self, 'current_name'): | |||
|
406 | self.current_name = column.name | |||
|
407 | if self.alter_metadata: | |||
|
408 | self._result_column = column | |||
|
409 | else: | |||
|
410 | self._result_column = column.copy_fixed() | |||
|
411 | ||||
|
412 | table = property(_get_table, _set_table) | |||
|
413 | result_column = property(_get_result_column, _set_result_column) | |||
|
414 | ||||
|
415 | ||||
|
416 | class ChangesetTable(object): | |||
|
417 | """Changeset extensions to SQLAlchemy tables.""" | |||
|
418 | ||||
|
419 | def create_column(self, column, *p, **kw): | |||
|
420 | """Creates a column. | |||
|
421 | ||||
|
422 | The column parameter may be a column definition or the name of | |||
|
423 | a column in this table. | |||
|
424 | ||||
|
425 | API to :meth:`ChangesetColumn.create` | |||
|
426 | ||||
|
427 | :param column: Column to be created | |||
|
428 | :type column: Column instance or string | |||
|
429 | """ | |||
|
430 | if not isinstance(column, sqlalchemy.Column): | |||
|
431 | # It's a column name | |||
|
432 | column = getattr(self.c, str(column)) | |||
|
433 | column.create(table=self, *p, **kw) | |||
|
434 | ||||
|
435 | def drop_column(self, column, *p, **kw): | |||
|
436 | """Drop a column, given its name or definition. | |||
|
437 | ||||
|
438 | API to :meth:`ChangesetColumn.drop` | |||
|
439 | ||||
|
440 | :param column: Column to be dropped | |||
|
441 | :type column: Column instance or string | |||
|
442 | """ | |||
|
443 | if not isinstance(column, sqlalchemy.Column): | |||
|
444 | # It's a column name | |||
|
445 | try: | |||
|
446 | column = getattr(self.c, str(column)) | |||
|
447 | except AttributeError: | |||
|
448 | # That column isn't part of the table. We don't need | |||
|
449 | # its entire definition to drop the column, just its | |||
|
450 | # name, so create a dummy column with the same name. | |||
|
451 | column = sqlalchemy.Column(str(column), sqlalchemy.Integer()) | |||
|
452 | column.drop(table=self, *p, **kw) | |||
|
453 | ||||
|
454 | def rename(self, name, connection=None, **kwargs): | |||
|
455 | """Rename this table. | |||
|
456 | ||||
|
457 | :param name: New name of the table. | |||
|
458 | :type name: string | |||
|
459 | :param alter_metadata: If True, table will be removed from metadata | |||
|
460 | :type alter_metadata: bool | |||
|
461 | :param connection: reuse an existing connection instead of creating a new one. | |||
|
462 | :type connection: :class:`sqlalchemy.engine.base.Connection` instance | |||
|
463 | """ | |||
|
464 | self.alter_metadata = kwargs.pop('alter_metadata', DEFAULT_ALTER_METADATA) | |||
|
465 | engine = self.bind | |||
|
466 | self.new_name = name | |||
|
467 | visitorcallable = get_engine_visitor(engine, 'schemachanger') | |||
|
468 | run_single_visitor(engine, visitorcallable, self, connection, **kwargs) | |||
|
469 | ||||
|
470 | # Fix metadata registration | |||
|
471 | if self.alter_metadata: | |||
|
472 | self.name = name | |||
|
473 | self.deregister() | |||
|
474 | self._set_parent(self.metadata) | |||
|
475 | ||||
|
476 | def _meta_key(self): | |||
|
477 | return sqlalchemy.schema._get_table_key(self.name, self.schema) | |||
|
478 | ||||
|
479 | def deregister(self): | |||
|
480 | """Remove this table from its metadata""" | |||
|
481 | key = self._meta_key() | |||
|
482 | meta = self.metadata | |||
|
483 | if key in meta.tables: | |||
|
484 | del meta.tables[key] | |||
|
485 | ||||
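The `deregister()` method above removes the table's entry from the metadata registry, keyed by its schema-qualified name. A minimal standalone sketch of that idea, using a plain dictionary in place of SQLAlchemy's `metadata.tables` (the registry and helper names here are hypothetical, not part of migrate):

```python
# Sketch of deregister(): drop a table's entry from a metadata-like
# registry keyed by "schema.name" (names hypothetical).
def table_key(name, schema=None):
    return "%s.%s" % (schema, name) if schema else name

registry = {
    table_key("users"): "users-table",
    table_key("logs", "audit"): "audit.logs-table",
}

def deregister(name, schema=None):
    # mirrors: if key in meta.tables: del meta.tables[key]
    registry.pop(table_key(name, schema), None)

deregister("users")
deregister("missing")  # absent keys are ignored, as in the original
assert table_key("users") not in registry
assert table_key("logs", "audit") in registry
```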
|
486 | ||||
|
487 | class ChangesetColumn(object): | |||
|
488 | """Changeset extensions to SQLAlchemy columns.""" | |||
|
489 | ||||
|
490 | def alter(self, *p, **k): | |||
|
491 | """Makes a call to :func:`alter_column` for the column this | |||
|
492 | method is called on. | |||
|
493 | """ | |||
|
494 | if 'table' not in k: | |||
|
495 | k['table'] = self.table | |||
|
496 | if 'engine' not in k: | |||
|
497 | k['engine'] = k['table'].bind | |||
|
498 | return alter_column(self, *p, **k) | |||
|
499 | ||||
|
500 | def create(self, table=None, index_name=None, unique_name=None, | |||
|
501 | primary_key_name=None, populate_default=True, connection=None, **kwargs): | |||
|
502 | """Create this column in the database. | |||
|
503 | ||||
|
504 | Assumes the given table exists. ``ALTER TABLE ADD COLUMN``, | |||
|
505 | for most databases. | |||
|
506 | ||||
|
507 | :param table: Table instance to create on. | |||
|
508 | :param index_name: Creates :class:`ChangesetIndex` on this column. | |||
|
509 | :param unique_name: Creates :class:\ | |||
|
510 | `~migrate.changeset.constraint.UniqueConstraint` on this column. | |||
|
511 | :param primary_key_name: Creates :class:\ | |||
|
512 | `~migrate.changeset.constraint.PrimaryKeyConstraint` on this column. | |||
|
513 | :param alter_metadata: If True, column will be added to table object. | |||
|
514 | :param populate_default: If True, created column will be \ | |||
|
515 | populated with defaults | |||
|
516 | :param connection: reuse connection instead of creating new one. | |||
|
517 | :type table: Table instance | |||
|
518 | :type index_name: string | |||
|
519 | :type unique_name: string | |||
|
520 | :type primary_key_name: string | |||
|
521 | :type alter_metadata: bool | |||
|
522 | :type populate_default: bool | |||
|
523 | :type connection: :class:`sqlalchemy.engine.base.Connection` instance | |||
|
524 | ||||
|
525 | :returns: self | |||
|
526 | """ | |||
|
527 | self.populate_default = populate_default | |||
|
528 | self.alter_metadata = kwargs.pop('alter_metadata', DEFAULT_ALTER_METADATA) | |||
|
529 | self.index_name = index_name | |||
|
530 | self.unique_name = unique_name | |||
|
531 | self.primary_key_name = primary_key_name | |||
|
532 | for cons in ('index_name', 'unique_name', 'primary_key_name'): | |||
|
533 | self._check_sanity_constraints(cons) | |||
|
534 | ||||
|
535 | if self.alter_metadata: | |||
|
536 | self.add_to_table(table) | |||
|
537 | engine = self.table.bind | |||
|
538 | visitorcallable = get_engine_visitor(engine, 'columngenerator') | |||
|
539 | engine._run_visitor(visitorcallable, self, connection, **kwargs) | |||
|
540 | ||||
|
541 | # TODO: reuse existing connection | |||
|
542 | if self.populate_default and self.default is not None: | |||
|
543 | stmt = table.update().values({self: engine._execute_default(self.default)}) | |||
|
544 | engine.execute(stmt) | |||
|
545 | ||||
|
546 | return self | |||
|
547 | ||||
|
548 | def drop(self, table=None, connection=None, **kwargs): | |||
|
549 | """Drop this column from the database, leaving its table intact. | |||
|
550 | ||||
|
551 | ``ALTER TABLE DROP COLUMN``, for most databases. | |||
|
552 | ||||
|
553 | :param alter_metadata: If True, column will be removed from table object. | |||
|
554 | :type alter_metadata: bool | |||
|
555 | :param connection: reuse connection instead of creating new one. | |||
|
556 | :type connection: :class:`sqlalchemy.engine.base.Connection` instance | |||
|
557 | """ | |||
|
558 | self.alter_metadata = kwargs.pop('alter_metadata', DEFAULT_ALTER_METADATA) | |||
|
559 | if table is not None: | |||
|
560 | self.table = table | |||
|
561 | engine = self.table.bind | |||
|
562 | if self.alter_metadata: | |||
|
563 | self.remove_from_table(self.table, unset_table=False) | |||
|
564 | visitorcallable = get_engine_visitor(engine, 'columndropper') | |||
|
565 | engine._run_visitor(visitorcallable, self, connection, **kwargs) | |||
|
566 | if self.alter_metadata: | |||
|
567 | self.table = None | |||
|
568 | return self | |||
|
569 | ||||
|
570 | def add_to_table(self, table): | |||
|
571 | if table is not None and self.table is None: | |||
|
572 | self._set_parent(table) | |||
|
573 | ||||
|
574 | def _col_name_in_constraint(self, cons, name): | |||
|
575 | return False | |||
|
576 | ||||
|
577 | def remove_from_table(self, table, unset_table=True): | |||
|
578 | # TODO: remove primary keys, constraints, etc | |||
|
579 | if unset_table: | |||
|
580 | self.table = None | |||
|
581 | ||||
|
582 | to_drop = set() | |||
|
583 | for index in table.indexes: | |||
|
584 | columns = [] | |||
|
585 | for col in index.columns: | |||
|
586 | if col.name != self.name: | |||
|
587 | columns.append(col) | |||
|
588 | if columns: | |||
|
589 | index.columns = columns | |||
|
590 | else: | |||
|
591 | to_drop.add(index) | |||
|
592 | table.indexes = table.indexes - to_drop | |||
|
593 | ||||
|
594 | to_drop = set() | |||
|
595 | for cons in table.constraints: | |||
|
596 | # TODO: deal with other types of constraint | |||
|
597 | if isinstance(cons, (ForeignKeyConstraint, | |||
|
598 | UniqueConstraint)): | |||
|
599 | for col_name in cons.columns: | |||
|
600 | if not isinstance(col_name, basestring): | |||
|
601 | col_name = col_name.name | |||
|
602 | if self.name == col_name: | |||
|
603 | to_drop.add(cons) | |||
|
604 | table.constraints = table.constraints - to_drop | |||
|
605 | ||||
|
606 | if table.c.contains_column(self): | |||
|
607 | table.c.remove(self) | |||
|
608 | ||||
|
609 | # TODO: this is fixed in 0.6 | |||
|
610 | def copy_fixed(self, **kw): | |||
|
611 | """Create a copy of this ``Column``, with all attributes.""" | |||
|
612 | return sqlalchemy.Column(self.name, self.type, self.default, | |||
|
613 | key=self.key, | |||
|
614 | primary_key=self.primary_key, | |||
|
615 | nullable=self.nullable, | |||
|
616 | quote=self.quote, | |||
|
617 | index=self.index, | |||
|
618 | unique=self.unique, | |||
|
619 | onupdate=self.onupdate, | |||
|
620 | autoincrement=self.autoincrement, | |||
|
621 | server_default=self.server_default, | |||
|
622 | server_onupdate=self.server_onupdate, | |||
|
623 | *[c.copy(**kw) for c in self.constraints]) | |||
|
624 | ||||
|
625 | def _check_sanity_constraints(self, name): | |||
|
626 | """Check if constraints names are correct""" | |||
|
627 | obj = getattr(self, name) | |||
|
628 | if (getattr(self, name[:-5]) and not obj): | |||
|
629 | raise InvalidConstraintError("Column.create() accepts index_name," | |||
|
630 | " primary_key_name and unique_name to generate constraints") | |||
|
631 | if not isinstance(obj, basestring) and obj is not None: | |||
|
632 | raise InvalidConstraintError( | |||
|
633 | "%s argument for column must be constraint name" % name) | |||
|
634 | ||||
|
635 | ||||
|
636 | class ChangesetIndex(object): | |||
|
637 | """Changeset extensions to SQLAlchemy Indexes.""" | |||
|
638 | ||||
|
639 | __visit_name__ = 'index' | |||
|
640 | ||||
|
641 | def rename(self, name, connection=None, **kwargs): | |||
|
642 | """Change the name of an index. | |||
|
643 | ||||
|
644 | :param name: New name of the Index. | |||
|
645 | :type name: string | |||
|
646 | :param alter_metadata: If True, Index object will be altered. | |||
|
647 | :type alter_metadata: bool | |||
|
648 | :param connection: reuse connection instead of creating new one. | |||
|
649 | :type connection: :class:`sqlalchemy.engine.base.Connection` instance | |||
|
650 | """ | |||
|
651 | self.alter_metadata = kwargs.pop('alter_metadata', DEFAULT_ALTER_METADATA) | |||
|
652 | engine = self.table.bind | |||
|
653 | self.new_name = name | |||
|
654 | visitorcallable = get_engine_visitor(engine, 'schemachanger') | |||
|
655 | engine._run_visitor(visitorcallable, self, connection, **kwargs) | |||
|
656 | if self.alter_metadata: | |||
|
657 | self.name = name | |||
|
658 | ||||
|
659 | ||||
|
660 | class ChangesetDefaultClause(object): | |||
|
661 | """Implements comparison between :class:`DefaultClause` instances""" | |||
|
662 | ||||
|
663 | def __eq__(self, other): | |||
|
664 | if isinstance(other, self.__class__): | |||
|
665 | if self.arg == other.arg: | |||
|
666 | return True | |||
|
667 | ||||
|
668 | def __ne__(self, other): | |||
|
669 | return not self.__eq__(other) |
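The `__eq__`/`__ne__` pair above compares default clauses by their `arg` value; note that as written, `__eq__` falls through and returns `None` (falsy) on mismatch, which `not self.__eq__(other)` still handles correctly. A simplified standalone sketch of the same comparison pattern, returning proper booleans (class name hypothetical):

```python
class ComparableDefault(object):
    """Compare default clauses by their ``arg`` value (sketch)."""

    def __init__(self, arg):
        self.arg = arg

    def __eq__(self, other):
        # same idea as ChangesetDefaultClause.__eq__, but returns a bool
        return isinstance(other, self.__class__) and self.arg == other.arg

    def __ne__(self, other):
        return not self.__eq__(other)

assert ComparableDefault("now()") == ComparableDefault("now()")
assert ComparableDefault("now()") != ComparableDefault("0")
assert ComparableDefault("now()") != "now()"  # other types never compare equal
```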
@@ -0,0 +1,87 b'' | |||||
|
1 | """ | |||
|
2 | Provide exception classes for :mod:`migrate` | |||
|
3 | """ | |||
|
4 | ||||
|
5 | ||||
|
6 | class Error(Exception): | |||
|
7 | """Error base class.""" | |||
|
8 | ||||
|
9 | ||||
|
10 | class ApiError(Error): | |||
|
11 | """Base class for API errors.""" | |||
|
12 | ||||
|
13 | ||||
|
14 | class KnownError(ApiError): | |||
|
15 | """A known error condition.""" | |||
|
16 | ||||
|
17 | ||||
|
18 | class UsageError(ApiError): | |||
|
19 | """A known error condition where help should be displayed.""" | |||
|
20 | ||||
|
21 | ||||
|
22 | class ControlledSchemaError(Error): | |||
|
23 | """Base class for controlled schema errors.""" | |||
|
24 | ||||
|
25 | ||||
|
26 | class InvalidVersionError(ControlledSchemaError): | |||
|
27 | """Invalid version number.""" | |||
|
28 | ||||
|
29 | ||||
|
30 | class DatabaseNotControlledError(ControlledSchemaError): | |||
|
31 | """Database should be under version control, but it's not.""" | |||
|
32 | ||||
|
33 | ||||
|
34 | class DatabaseAlreadyControlledError(ControlledSchemaError): | |||
|
35 | """Database shouldn't be under version control, but it is""" | |||
|
36 | ||||
|
37 | ||||
|
38 | class WrongRepositoryError(ControlledSchemaError): | |||
|
39 | """This database is under version control by another repository.""" | |||
|
40 | ||||
|
41 | ||||
|
42 | class NoSuchTableError(ControlledSchemaError): | |||
|
43 | """The table does not exist.""" | |||
|
44 | ||||
|
45 | ||||
|
46 | class PathError(Error): | |||
|
47 | """Base class for path errors.""" | |||
|
48 | ||||
|
49 | ||||
|
50 | class PathNotFoundError(PathError): | |||
|
51 | """A path with a file was required; found no file.""" | |||
|
52 | ||||
|
53 | ||||
|
54 | class PathFoundError(PathError): | |||
|
55 | """A path with no file was required; found a file.""" | |||
|
56 | ||||
|
57 | ||||
|
58 | class RepositoryError(Error): | |||
|
59 | """Base class for repository errors.""" | |||
|
60 | ||||
|
61 | ||||
|
62 | class InvalidRepositoryError(RepositoryError): | |||
|
63 | """Invalid repository error.""" | |||
|
64 | ||||
|
65 | ||||
|
66 | class ScriptError(Error): | |||
|
67 | """Base class for script errors.""" | |||
|
68 | ||||
|
69 | ||||
|
70 | class InvalidScriptError(ScriptError): | |||
|
71 | """Invalid script error.""" | |||
|
72 | ||||
|
73 | ||||
|
74 | class InvalidVersionError(Error): | |||
|
75 | """Invalid version error.""" | |||
|
76 | ||||
|
77 | # migrate.changeset | |||
|
78 | ||||
|
79 | class NotSupportedError(Error): | |||
|
80 | """Not supported error""" | |||
|
81 | ||||
|
82 | ||||
|
83 | class InvalidConstraintError(Error): | |||
|
84 | """Invalid constraint error""" | |||
|
85 | ||||
|
86 | class MigrateDeprecationWarning(DeprecationWarning): | |||
|
87 | """Warning for deprecated features in Migrate""" |
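Because these exceptions form a hierarchy rooted at `Error`, callers can catch at whatever granularity suits them. A runnable sketch that re-declares a few of the classes locally (rather than importing migrate) to show base-class handlers catching subclasses:

```python
# Re-declared locally for a standalone sketch (mirrors migrate's hierarchy).
class Error(Exception):
    """Error base class."""

class ApiError(Error):
    """Base class for API errors."""

class KnownError(ApiError):
    """A known error condition."""

class UsageError(ApiError):
    """A known error condition where help should be displayed."""

def classify(exc):
    # more specific handlers must come before their base classes
    try:
        raise exc
    except UsageError:
        return "show help"
    except ApiError:
        return "report error"
    except Error:
        return "generic failure"

results = [classify(UsageError()), classify(KnownError()), classify(Error())]
assert results == ["show help", "report error", "generic failure"]
```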
@@ -0,0 +1,5 b'' | |||||
|
1 | """ | |||
|
2 | This package provides functionality to create and manage | |||
|
3 | repositories of database schema changesets and to apply these | |||
|
4 | changesets to databases. | |||
|
5 | """ |
@@ -0,0 +1,383 b'' | |||||
|
1 | """ | |||
|
2 | This module provides an external API to the versioning system. | |||
|
3 | ||||
|
4 | .. versionchanged:: 0.6.0 | |||
|
5 | :func:`migrate.versioning.api.test` and schema diff functions | |||
|
6 | changed order of positional arguments so all accept `url` and `repository` | |||
|
7 | as first arguments. | |||
|
8 | ||||
|
9 | .. versionchanged:: 0.5.4 | |||
|
10 | ``--preview_sql`` displays source file when using SQL scripts. | |||
|
11 | If Python script is used, it runs the action with mocked engine and | |||
|
12 | returns captured SQL statements. | |||
|
13 | ||||
|
14 | .. versionchanged:: 0.5.4 | |||
|
15 | Deprecated ``--echo`` parameter in favour of new | |||
|
16 | :func:`migrate.versioning.util.construct_engine` behavior. | |||
|
17 | """ | |||
|
18 | ||||
|
19 | # Dear migrate developers, | |||
|
20 | # | |||
|
21 | # please do not comment this module using sphinx syntax because its | |||
|
22 | # docstrings are presented as user help and most users cannot | |||
|
23 | # interpret sphinx annotated ReStructuredText. | |||
|
24 | # | |||
|
25 | # Thanks, | |||
|
26 | # Jan Dittberner | |||
|
27 | ||||
|
28 | import sys | |||
|
29 | import inspect | |||
|
30 | import logging | |||
|
31 | ||||
|
32 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
33 | from rhodecode.lib.dbmigrate.migrate.versioning import repository, schema, version, \ | |||
|
34 | script as script_ # command name conflict | |||
|
35 | from rhodecode.lib.dbmigrate.migrate.versioning.util import catch_known_errors, with_engine | |||
|
36 | ||||
|
37 | ||||
|
38 | log = logging.getLogger(__name__) | |||
|
39 | command_desc = { | |||
|
40 | 'help': 'displays help on a given command', | |||
|
41 | 'create': 'create an empty repository at the specified path', | |||
|
42 | 'script': 'create an empty change Python script', | |||
|
43 | 'script_sql': 'create empty change SQL scripts for given database', | |||
|
44 | 'version': 'display the latest version available in a repository', | |||
|
45 | 'db_version': 'show the current version of the repository under version control', | |||
|
46 | 'source': 'display the Python code for a particular version in this repository', | |||
|
47 | 'version_control': 'mark a database as under this repository\'s version control', | |||
|
48 | 'upgrade': 'upgrade a database to a later version', | |||
|
49 | 'downgrade': 'downgrade a database to an earlier version', | |||
|
50 | 'drop_version_control': 'removes version control from a database', | |||
|
51 | 'manage': 'creates a Python script that runs Migrate with a set of default values', | |||
|
52 | 'test': 'performs the upgrade and downgrade command on the given database', | |||
|
53 | 'compare_model_to_db': 'compare MetaData against the current database state', | |||
|
54 | 'create_model': 'dump the current database as a Python model to stdout', | |||
|
55 | 'make_update_script_for_model': 'create a script changing the old MetaData to the new (current) MetaData', | |||
|
56 | 'update_db_from_model': 'modify the database to match the structure of the current MetaData', | |||
|
57 | } | |||
|
58 | __all__ = command_desc.keys() | |||
|
59 | ||||
|
60 | Repository = repository.Repository | |||
|
61 | ControlledSchema = schema.ControlledSchema | |||
|
62 | VerNum = version.VerNum | |||
|
63 | PythonScript = script_.PythonScript | |||
|
64 | SqlScript = script_.SqlScript | |||
|
65 | ||||
|
66 | ||||
|
67 | # deprecated | |||
|
68 | def help(cmd=None, **opts): | |||
|
69 | """%prog help COMMAND | |||
|
70 | ||||
|
71 | Displays help on a given command. | |||
|
72 | """ | |||
|
73 | if cmd is None: | |||
|
74 | raise exceptions.UsageError(None) | |||
|
75 | try: | |||
|
76 | func = globals()[cmd] | |||
|
77 | except KeyError: | |||
|
78 | raise exceptions.UsageError( | |||
|
79 | "'%s' isn't a valid command. Try 'help COMMAND'" % cmd) | |||
|
80 | ret = func.__doc__ | |||
|
81 | if sys.argv[0]: | |||
|
82 | ret = ret.replace('%prog', sys.argv[0]) | |||
|
83 | return ret | |||
|
84 | ||||
|
85 | @catch_known_errors | |||
|
86 | def create(repository, name, **opts): | |||
|
87 | """%prog create REPOSITORY_PATH NAME [--table=TABLE] | |||
|
88 | ||||
|
89 | Create an empty repository at the specified path. | |||
|
90 | ||||
|
91 | You can specify the version_table to be used; by default, it is | |||
|
92 | 'migrate_version'. This table is created in all version-controlled | |||
|
93 | databases. | |||
|
94 | """ | |||
|
95 | repo_path = Repository.create(repository, name, **opts) | |||
|
96 | ||||
|
97 | ||||
|
98 | @catch_known_errors | |||
|
99 | def script(description, repository, **opts): | |||
|
100 | """%prog script DESCRIPTION REPOSITORY_PATH | |||
|
101 | ||||
|
102 | Create an empty change script using the next unused version number | |||
|
103 | appended with the given description. | |||
|
104 | ||||
|
105 | For instance, manage.py script "Add initial tables" creates: | |||
|
106 | repository/versions/001_Add_initial_tables.py | |||
|
107 | """ | |||
|
108 | repo = Repository(repository) | |||
|
109 | repo.create_script(description, **opts) | |||
|
110 | ||||
|
111 | ||||
|
112 | @catch_known_errors | |||
|
113 | def script_sql(database, repository, **opts): | |||
|
114 | """%prog script_sql DATABASE REPOSITORY_PATH | |||
|
115 | ||||
|
116 | Create empty change SQL scripts for given DATABASE, where DATABASE | |||
|
117 | is either specific ('postgres', 'mysql', 'oracle', 'sqlite', etc.) | |||
|
118 | or generic ('default'). | |||
|
119 | ||||
|
120 | For instance, manage.py script_sql postgres creates: | |||
|
121 | repository/versions/001_postgres_upgrade.sql and | |||
|
122 | repository/versions/001_postgres_postgres.sql | |||
|
123 | """ | |||
|
124 | repo = Repository(repository) | |||
|
125 | repo.create_script_sql(database, **opts) | |||
|
126 | ||||
|
127 | ||||
|
128 | def version(repository, **opts): | |||
|
129 | """%prog version REPOSITORY_PATH | |||
|
130 | ||||
|
131 | Display the latest version available in a repository. | |||
|
132 | """ | |||
|
133 | repo = Repository(repository) | |||
|
134 | return repo.latest | |||
|
135 | ||||
|
136 | ||||
|
137 | @with_engine | |||
|
138 | def db_version(url, repository, **opts): | |||
|
139 | """%prog db_version URL REPOSITORY_PATH | |||
|
140 | ||||
|
141 | Show the current version of the repository with the given | |||
|
142 | connection string, under version control of the specified | |||
|
143 | repository. | |||
|
144 | ||||
|
145 | The url should be any valid SQLAlchemy connection string. | |||
|
146 | """ | |||
|
147 | engine = opts.pop('engine') | |||
|
148 | schema = ControlledSchema(engine, repository) | |||
|
149 | return schema.version | |||
|
150 | ||||
|
151 | ||||
|
152 | def source(version, dest=None, repository=None, **opts): | |||
|
153 | """%prog source VERSION [DESTINATION] --repository=REPOSITORY_PATH | |||
|
154 | ||||
|
155 | Display the Python code for a particular version in this | |||
|
156 | repository. Save it to the file at DESTINATION or, if omitted, | |||
|
157 | send to stdout. | |||
|
158 | """ | |||
|
159 | if repository is None: | |||
|
160 | raise exceptions.UsageError("A repository must be specified") | |||
|
161 | repo = Repository(repository) | |||
|
162 | ret = repo.version(version).script().source() | |||
|
163 | if dest is not None: | |||
|
164 | dest = open(dest, 'w') | |||
|
165 | dest.write(ret) | |||
|
166 | dest.close() | |||
|
167 | ret = None | |||
|
168 | return ret | |||
|
169 | ||||
|
170 | ||||
|
171 | def upgrade(url, repository, version=None, **opts): | |||
|
172 | """%prog upgrade URL REPOSITORY_PATH [VERSION] [--preview_py|--preview_sql] | |||
|
173 | ||||
|
174 | Upgrade a database to a later version. | |||
|
175 | ||||
|
176 | This runs the upgrade() function defined in your change scripts. | |||
|
177 | ||||
|
178 | By default, the database is updated to the latest available | |||
|
179 | version. You may specify a version instead, if you wish. | |||
|
180 | ||||
|
181 | You may preview the Python or SQL code to be executed, rather than | |||
|
182 | actually executing it, using the appropriate 'preview' option. | |||
|
183 | """ | |||
|
184 | err = "Cannot upgrade a database of version %s to version %s. "\ | |||
|
185 | "Try 'downgrade' instead." | |||
|
186 | return _migrate(url, repository, version, upgrade=True, err=err, **opts) | |||
|
187 | ||||
|
188 | ||||
|
189 | def downgrade(url, repository, version, **opts): | |||
|
190 | """%prog downgrade URL REPOSITORY_PATH VERSION [--preview_py|--preview_sql] | |||
|
191 | ||||
|
192 | Downgrade a database to an earlier version. | |||
|
193 | ||||
|
194 | This is the reverse of upgrade; this runs the downgrade() function | |||
|
195 | defined in your change scripts. | |||
|
196 | ||||
|
197 | You may preview the Python or SQL code to be executed, rather than | |||
|
198 | actually executing it, using the appropriate 'preview' option. | |||
|
199 | """ | |||
|
200 | err = "Cannot downgrade a database of version %s to version %s. "\ | |||
|
201 | "Try 'upgrade' instead." | |||
|
202 | return _migrate(url, repository, version, upgrade=False, err=err, **opts) | |||
|
203 | ||||
|
204 | @with_engine | |||
|
205 | def test(url, repository, **opts): | |||
|
206 | """%prog test URL REPOSITORY_PATH [VERSION] | |||
|
207 | ||||
|
208 | Performs the upgrade and downgrade commands on the given | |||
|
209 | database. This is not a real test and may leave the database in a | |||
|
210 | bad state. You should therefore run the test on a copy of | |||
|
211 | your database. | |||
|
212 | """ | |||
|
213 | engine = opts.pop('engine') | |||
|
214 | repos = Repository(repository) | |||
|
215 | script = repos.version(None).script() | |||
|
216 | ||||
|
217 | # Upgrade | |||
|
218 | log.info("Upgrading...") | |||
|
219 | script.run(engine, 1) | |||
|
220 | log.info("done") | |||
|
221 | ||||
|
222 | log.info("Downgrading...") | |||
|
223 | script.run(engine, -1) | |||
|
224 | log.info("done") | |||
|
225 | log.info("Success") | |||
|
226 | ||||
|
227 | ||||
|
228 | @with_engine | |||
|
229 | def version_control(url, repository, version=None, **opts): | |||
|
230 | """%prog version_control URL REPOSITORY_PATH [VERSION] | |||
|
231 | ||||
|
232 | Mark a database as under this repository's version control. | |||
|
233 | ||||
|
234 | Once a database is under version control, schema changes should | |||
|
235 | only be done via change scripts in this repository. | |||
|
236 | ||||
|
237 | This creates the table version_table in the database. | |||
|
238 | ||||
|
239 | The url should be any valid SQLAlchemy connection string. | |||
|
240 | ||||
|
241 | By default, the database begins at version 0 and is assumed to be | |||
|
242 | empty. If the database is not empty, you may specify a version at | |||
|
243 | which to begin instead. No attempt is made to verify this | |||
|
244 | version's correctness - the database schema is expected to be | |||
|
245 | identical to what it would be if the database were created from | |||
|
246 | scratch. | |||
|
247 | """ | |||
|
248 | engine = opts.pop('engine') | |||
|
249 | ControlledSchema.create(engine, repository, version) | |||
|
250 | ||||
|
251 | ||||
|
252 | @with_engine | |||
|
253 | def drop_version_control(url, repository, **opts): | |||
|
254 | """%prog drop_version_control URL REPOSITORY_PATH | |||
|
255 | ||||
|
256 | Removes version control from a database. | |||
|
257 | """ | |||
|
258 | engine = opts.pop('engine') | |||
|
259 | schema = ControlledSchema(engine, repository) | |||
|
260 | schema.drop() | |||
|
261 | ||||
|
262 | ||||
|
263 | def manage(file, **opts): | |||
|
264 | """%prog manage FILENAME [VARIABLES...] | |||
|
265 | ||||
|
266 | Creates a script that runs Migrate with a set of default values. | |||
|
267 | ||||
|
268 | For example:: | |||
|
269 | ||||
|
270 | %prog manage manage.py --repository=/path/to/repository \ | |||
|
271 | --url=sqlite:///project.db | |||
|
272 | ||||
|
273 | would create the script manage.py. The following two commands | |||
|
274 | would then have exactly the same results:: | |||
|
275 | ||||
|
276 | python manage.py version | |||
|
277 | %prog version --repository=/path/to/repository | |||
|
278 | """ | |||
|
279 | Repository.create_manage_file(file, **opts) | |||
|
280 | ||||
|
281 | ||||
|
282 | @with_engine | |||
|
283 | def compare_model_to_db(url, repository, model, **opts): | |||
|
284 | """%prog compare_model_to_db URL REPOSITORY_PATH MODEL | |||
|
285 | ||||
|
286 | Compare the current model (assumed to be a module level variable | |||
|
287 | of type sqlalchemy.MetaData) against the current database. | |||
|
288 | ||||
|
289 | NOTE: This is EXPERIMENTAL. | |||
|
290 | """ # TODO: get rid of EXPERIMENTAL label | |||
|
291 | engine = opts.pop('engine') | |||
|
292 | return ControlledSchema.compare_model_to_db(engine, model, repository) | |||
|
293 | ||||
|
294 | ||||
|
295 | @with_engine | |||
|
296 | def create_model(url, repository, **opts): | |||
|
297 | """%prog create_model URL REPOSITORY_PATH [DECLERATIVE=True] | |||
|
298 | ||||
|
299 | Dump the current database as a Python model to stdout. | |||
|
300 | ||||
|
301 | NOTE: This is EXPERIMENTAL. | |||
|
302 | """ # TODO: get rid of EXPERIMENTAL label | |||
|
303 | engine = opts.pop('engine') | |||
|
304 | declarative = opts.get('declarative', False) | |||
|
305 | return ControlledSchema.create_model(engine, repository, declarative) | |||
|
306 | ||||
|
307 | ||||
|
308 | @catch_known_errors | |||
|
309 | @with_engine | |||
|
310 | def make_update_script_for_model(url, repository, oldmodel, model, **opts): | |||
|
311 | """%prog make_update_script_for_model URL OLDMODEL MODEL REPOSITORY_PATH | |||
|
312 | ||||
|
313 | Create a script changing the old Python model to the new (current) | |||
|
314 | Python model, sending to stdout. | |||
|
315 | ||||
|
316 | NOTE: This is EXPERIMENTAL. | |||
|
317 | """ # TODO: get rid of EXPERIMENTAL label | |||
|
318 | engine = opts.pop('engine') | |||
|
319 | return PythonScript.make_update_script_for_model( | |||
|
320 | engine, oldmodel, model, repository, **opts) | |||
|
321 | ||||
|
322 | ||||
|
323 | @with_engine | |||
|
324 | def update_db_from_model(url, repository, model, **opts): | |||
|
325 | """%prog update_db_from_model URL REPOSITORY_PATH MODEL | |||
|
326 | ||||
|
327 | Modify the database to match the structure of the current Python | |||
|
328 | model. This also sets the db_version number to the latest in the | |||
|
329 | repository. | |||
|
330 | ||||
|
331 | NOTE: This is EXPERIMENTAL. | |||
|
332 | """ # TODO: get rid of EXPERIMENTAL label | |||
|
333 | engine = opts.pop('engine') | |||
|
334 | schema = ControlledSchema(engine, repository) | |||
|
335 | schema.update_db_from_model(model) | |||
|
336 | ||||
|
337 | @with_engine | |||
|
338 | def _migrate(url, repository, version, upgrade, err, **opts): | |||
|
339 | engine = opts.pop('engine') | |||
|
340 | url = str(engine.url) | |||
|
341 | schema = ControlledSchema(engine, repository) | |||
|
342 | version = _migrate_version(schema, version, upgrade, err) | |||
|
343 | ||||
|
344 | changeset = schema.changeset(version) | |||
|
345 | for ver, change in changeset: | |||
|
346 | nextver = ver + changeset.step | |||
|
347 | log.info('%s -> %s... ', ver, nextver) | |||
|
348 | ||||
|
349 | if opts.get('preview_sql'): | |||
|
350 | if isinstance(change, PythonScript): | |||
|
351 | log.info(change.preview_sql(url, changeset.step, **opts)) | |||
|
352 | elif isinstance(change, SqlScript): | |||
|
353 | log.info(change.source()) | |||
|
354 | ||||
|
355 | elif opts.get('preview_py'): | |||
|
356 | if not isinstance(change, PythonScript): | |||
|
357 | raise exceptions.UsageError("Python source can be only displayed" | |||
|
358 | " for python migration files") | |||
|
359 | source_ver = max(ver, nextver) | |||
|
360 | module = schema.repository.version(source_ver).script().module | |||
|
361 | funcname = upgrade and "upgrade" or "downgrade" | |||
|
362 | func = getattr(module, funcname) | |||
|
363 | log.info(inspect.getsource(func)) | |||
|
364 | else: | |||
|
365 | schema.runchange(ver, change, changeset.step) | |||
|
366 | log.info('done') | |||
|
367 | ||||
|
368 | ||||
|
369 | def _migrate_version(schema, version, upgrade, err): | |||
|
370 | if version is None: | |||
|
371 | return version | |||
|
372 | # Version is specified: ensure we're upgrading in the right direction | |||
|
373 | # (current version < target version for upgrading; reverse for down) | |||
|
374 | version = VerNum(version) | |||
|
375 | cur = schema.version | |||
|
376 | if upgrade is not None: | |||
|
377 | if upgrade: | |||
|
378 | direction = cur <= version | |||
|
379 | else: | |||
|
380 | direction = cur >= version | |||
|
381 | if not direction: | |||
|
382 | raise exceptions.KnownError(err % (cur, version)) | |||
|
383 | return version |
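`_migrate_version()` above only validates direction: when a target version is given, it must be `>=` the current version for an upgrade and `<=` for a downgrade, otherwise a `KnownError` is raised with a hint to use the opposite command. A condensed standalone sketch of that check (function name hypothetical, `ValueError` standing in for `KnownError`):

```python
def check_migrate_version(cur, target, upgrade):
    """Validate the migration direction, as in _migrate_version (sketch)."""
    if target is None:
        return None  # no explicit version: nothing to validate
    if upgrade and not cur <= target:
        raise ValueError("Cannot upgrade a database of version %s to "
                         "version %s. Try 'downgrade' instead." % (cur, target))
    if not upgrade and not cur >= target:
        raise ValueError("Cannot downgrade a database of version %s to "
                         "version %s. Try 'upgrade' instead." % (cur, target))
    return target

assert check_migrate_version(0, 3, upgrade=True) == 3
assert check_migrate_version(3, 1, upgrade=False) == 1
try:
    check_migrate_version(3, 1, upgrade=True)
except ValueError as exc:
    assert "downgrade" in str(exc)
```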
@@ -0,0 +1,27 b'' | |||||
|
1 | """ | |||
|
2 | Configuration parser module. | |||
|
3 | """ | |||
|
4 | ||||
|
5 | from ConfigParser import ConfigParser | |||
|
6 | ||||
|
7 | from rhodecode.lib.dbmigrate.migrate.versioning.config import * | |||
|
8 | from rhodecode.lib.dbmigrate.migrate.versioning import pathed | |||
|
9 | ||||
|
10 | ||||
|
11 | class Parser(ConfigParser): | |||
|
12 | """A project configuration file.""" | |||
|
13 | ||||
|
14 | def to_dict(self, sections=None): | |||
|
15 | """It's easier to access config values like dictionaries""" | |||
|
16 | return self._sections | |||
|
17 | ||||
|
18 | ||||
|
19 | class Config(pathed.Pathed, Parser): | |||
|
20 | """Configuration class.""" | |||
|
21 | ||||
|
22 | def __init__(self, path, *p, **k): | |||
|
23 | """Confirm the config file exists; read it.""" | |||
|
24 | self.require_found(path) | |||
|
25 | pathed.Pathed.__init__(self, path) | |||
|
26 | Parser.__init__(self, *p, **k) | |||
|
27 | self.read(path) |
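`Parser.to_dict()` above simply exposes the parser's internal `_sections` mapping. The same dictionary view can be built from the public API; a sketch using the Python 3 `configparser` module (the section name and keys shown are illustrative, not migrate's actual config schema):

```python
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read_string("""\
[db_settings]
repository_id = example_migrations
version_table = migrate_version
""")

# equivalent of to_dict(): map each section to a plain dict of its options
settings = {name: dict(cfg.items(name)) for name in cfg.sections()}
assert settings["db_settings"]["version_table"] == "migrate_version"
```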
@@ -0,0 +1,14 b'' | |||||
|
1 | #!/usr/bin/python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | from sqlalchemy.util import OrderedDict | |||
|
5 | ||||
|
6 | ||||
|
7 | __all__ = ['databases', 'operations'] | |||
|
8 | ||||
|
9 | databases = ('sqlite', 'postgres', 'mysql', 'oracle', 'mssql', 'firebird') | |||
|
10 | ||||
|
11 | # Map operation names to function names | |||
|
12 | operations = OrderedDict() | |||
|
13 | operations['upgrade'] = 'upgrade' | |||
|
14 | operations['downgrade'] = 'downgrade' |
@@ -0,0 +1,253 b'' | |||||
|
1 | """ | |||
|
2 | Code to generate a Python model from a database or differences | |||
|
3 | between a model and database. | |||
|
4 | ||||
|
5 | Some of this is borrowed heavily from the AutoCode project at: | |||
|
6 | http://code.google.com/p/sqlautocode/ | |||
|
7 | """ | |||
|
8 | ||||
|
9 | import sys | |||
|
10 | import logging | |||
|
11 | ||||
|
12 | import sqlalchemy | |||
|
13 | ||||
|
14 | from rhodecode.lib.dbmigrate import migrate | |||
|
15 | from rhodecode.lib.dbmigrate.migrate import changeset | |||
|
16 | ||||
|
17 | log = logging.getLogger(__name__) | |||
|
18 | HEADER = """ | |||
|
19 | ## File autogenerated by genmodel.py | |||
|
20 | ||||
|
21 | from sqlalchemy import * | |||
|
22 | meta = MetaData() | |||
|
23 | """ | |||
|
24 | ||||
|
25 | DECLARATIVE_HEADER = """ | |||
|
26 | ## File autogenerated by genmodel.py | |||
|
27 | ||||
|
28 | from sqlalchemy import * | |||
|
29 | from sqlalchemy.ext import declarative | |||
|
30 | ||||
|
31 | Base = declarative.declarative_base() | |||
|
32 | """ | |||
|
33 | ||||
|
34 | ||||
|
35 | class ModelGenerator(object): | |||
|
36 | ||||
|
37 | def __init__(self, diff, engine, declarative=False): | |||
|
38 | self.diff = diff | |||
|
39 | self.engine = engine | |||
|
40 | self.declarative = declarative | |||
|
41 | ||||
|
42 | def column_repr(self, col): | |||
|
43 | kwarg = [] | |||
|
44 | if col.key != col.name: | |||
|
45 | kwarg.append('key') | |||
|
46 | if col.primary_key: | |||
|
47 | col.primary_key = True # otherwise it dumps it as 1 | |||
|
48 | kwarg.append('primary_key') | |||
|
49 | if not col.nullable: | |||
|
50 | kwarg.append('nullable') | |||
|
51 | if col.onupdate: | |||
|
52 | kwarg.append('onupdate') | |||
|
53 | if col.default: | |||
|
54 | if col.primary_key: | |||
|
55 | # I found that PostgreSQL automatically creates a | |||
|
56 | # default value for the sequence, but let's not show | |||
|
57 | # that. | |||
|
58 | pass | |||
|
59 | else: | |||
|
60 | kwarg.append('default') | |||
|
61 | ks = ', '.join('%s=%r' % (k, getattr(col, k)) for k in kwarg) | |||
|
62 | ||||
|
63 | # crs: not sure if this is a good idea, but it gets rid of the extra | |||
|
64 | # u'' | |||
|
65 | name = col.name.encode('utf8') | |||
|
66 | ||||
|
67 | type_ = col.type | |||
|
68 | for cls in col.type.__class__.__mro__: | |||
|
69 | if cls.__module__ == 'sqlalchemy.types' and \ | |||
|
70 | not cls.__name__.isupper(): | |||
|
71 | if cls is not type_.__class__: | |||
|
72 | type_ = cls() | |||
|
73 | break | |||
|
74 | ||||
|
75 | data = { | |||
|
76 | 'name': name, | |||
|
77 | 'type': type_, | |||
|
78 | 'constraints': ', '.join([repr(cn) for cn in col.constraints]), | |||
|
79 | 'args': ks or ''} | |||
|
80 | ||||
|
81 | if data['constraints']: | |||
|
82 | if data['args']: | |||
|
83 | data['args'] = ',' + data['args'] | |||
|
84 | ||||
|
85 | if data['constraints'] or data['args']: | |||
|
86 | data['maybeComma'] = ',' | |||
|
87 | else: | |||
|
88 | data['maybeComma'] = '' | |||
|
89 | ||||
|
90 | commonStuff = """ %(maybeComma)s %(constraints)s %(args)s)""" % data | |||
|
91 | commonStuff = commonStuff.strip() | |||
|
92 | data['commonStuff'] = commonStuff | |||
|
93 | if self.declarative: | |||
|
94 | return """%(name)s = Column(%(type)r%(commonStuff)s""" % data | |||
|
95 | else: | |||
|
96 | return """Column(%(name)r, %(type)r%(commonStuff)s""" % data | |||
|
97 | ||||
|
98 | def getTableDefn(self, table): | |||
|
99 | out = [] | |||
|
100 | tableName = table.name | |||
|
101 | if self.declarative: | |||
|
102 | out.append("class %(table)s(Base):" % {'table': tableName}) | |||
|
103 | out.append(" __tablename__ = '%(table)s'" % {'table': tableName}) | |||
|
104 | for col in table.columns: | |||
|
105 | out.append(" %s" % self.column_repr(col)) | |||
|
106 | else: | |||
|
107 | out.append("%(table)s = Table('%(table)s', meta," % \ | |||
|
108 | {'table': tableName}) | |||
|
109 | for col in table.columns: | |||
|
110 | out.append(" %s," % self.column_repr(col)) | |||
|
111 | out.append(")") | |||
|
112 | return out | |||
|
113 | ||||
|
114 | def _get_tables(self, missingA=False, missingB=False, modified=False): | |||
|
115 | to_process = [] | |||
|
116 | for bool_, names, metadata in ( | |||
|
117 | (missingA, self.diff.tables_missing_from_A, self.diff.metadataB), | |||
|
118 | (missingB, self.diff.tables_missing_from_B, self.diff.metadataA), | |||
|
119 | (modified, self.diff.tables_different, self.diff.metadataA), | |||
|
120 | ): | |||
|
121 | if bool_: | |||
|
122 | for name in names: | |||
|
123 | yield metadata.tables.get(name) | |||
|
124 | ||||
|
125 | def toPython(self): | |||
|
126 | """Assume database is current and model is empty.""" | |||
|
127 | out = [] | |||
|
128 | if self.declarative: | |||
|
129 | out.append(DECLARATIVE_HEADER) | |||
|
130 | else: | |||
|
131 | out.append(HEADER) | |||
|
132 | out.append("") | |||
|
133 | for table in self._get_tables(missingA=True): | |||
|
134 | out.extend(self.getTableDefn(table)) | |||
|
135 | out.append("") | |||
|
136 | return '\n'.join(out) | |||
|
137 | ||||
|
138 | def toUpgradeDowngradePython(self, indent=' '): | |||
|
139 | """Assume model is most current and database is out-of-date.""" | |||
|
140 | decls = ['from rhodecode.lib.dbmigrate.migrate.changeset import schema', | |||
|
141 | 'meta = MetaData()'] | |||
|
142 | for table in self._get_tables( | |||
|
143 | missingA=True, missingB=True, modified=True | |||
|
144 | ): | |||
|
145 | decls.extend(self.getTableDefn(table)) | |||
|
146 | ||||
|
147 | upgradeCommands, downgradeCommands = [], [] | |||
|
148 | for tableName in self.diff.tables_missing_from_A: | |||
|
149 | upgradeCommands.append("%(table)s.drop()" % {'table': tableName}) | |||
|
150 | downgradeCommands.append("%(table)s.create()" % \ | |||
|
151 | {'table': tableName}) | |||
|
152 | for tableName in self.diff.tables_missing_from_B: | |||
|
153 | upgradeCommands.append("%(table)s.create()" % {'table': tableName}) | |||
|
154 | downgradeCommands.append("%(table)s.drop()" % {'table': tableName}) | |||
|
155 | ||||
|
156 | for tableName in self.diff.tables_different: | |||
|
157 | dbTable = self.diff.metadataB.tables[tableName] | |||
|
158 | missingInDatabase, missingInModel, diffDecl = \ | |||
|
159 | self.diff.colDiffs[tableName] | |||
|
160 | for col in missingInDatabase: | |||
|
161 | upgradeCommands.append('%s.columns[%r].create()' % ( | |||
|
162 | tableName, col.name)) | |||
|
163 | downgradeCommands.append('%s.columns[%r].drop()' % ( | |||
|
164 | tableName, col.name)) | |||
|
165 | for col in missingInModel: | |||
|
166 | upgradeCommands.append('%s.columns[%r].drop()' % ( | |||
|
167 | tableName, col.name)) | |||
|
168 | downgradeCommands.append('%s.columns[%r].create()' % ( | |||
|
169 | tableName, col.name)) | |||
|
170 | for modelCol, databaseCol, modelDecl, databaseDecl in diffDecl: | |||
|
171 | upgradeCommands.append( | |||
|
172 | 'assert False, "Can\'t alter columns: %s:%s=>%s"' % ( | |||
|
173 | tableName, modelCol.name, databaseCol.name)) | |||
|
174 | downgradeCommands.append( | |||
|
175 | 'assert False, "Can\'t alter columns: %s:%s=>%s"' % ( | |||
|
176 | tableName, modelCol.name, databaseCol.name)) | |||
|
177 | pre_command = ' meta.bind = migrate_engine' | |||
|
178 | ||||
|
179 | return ( | |||
|
180 | '\n'.join(decls), | |||
|
181 | '\n'.join([pre_command] + ['%s%s' % (indent, line) for line in upgradeCommands]), | |||
|
182 | '\n'.join([pre_command] + ['%s%s' % (indent, line) for line in downgradeCommands])) | |||
|
183 | ||||
|
184 | def _db_can_handle_this_change(self, td): | |||
|
185 | if (td.columns_missing_from_B | |||
|
186 | and not td.columns_missing_from_A | |||
|
187 | and not td.columns_different): | |||
|
188 | # Even sqlite can handle this. | |||
|
189 | return True | |||
|
190 | else: | |||
|
191 | return not self.engine.url.drivername.startswith('sqlite') | |||
|
192 | ||||
|
193 | def applyModel(self): | |||
|
194 | """Apply model to current database.""" | |||
|
195 | ||||
|
196 | meta = sqlalchemy.MetaData(self.engine) | |||
|
197 | ||||
|
198 | for table in self._get_tables(missingA=True): | |||
|
199 | table = table.tometadata(meta) | |||
|
200 | table.drop() | |||
|
201 | for table in self._get_tables(missingB=True): | |||
|
202 | table = table.tometadata(meta) | |||
|
203 | table.create() | |||
|
204 | for modelTable in self._get_tables(modified=True): | |||
|
205 | tableName = modelTable.name | |||
|
206 | modelTable = modelTable.tometadata(meta) | |||
|
207 | dbTable = self.diff.metadataB.tables[tableName] | |||
|
208 | ||||
|
209 | td = self.diff.tables_different[tableName] | |||
|
210 | ||||
|
211 | if self._db_can_handle_this_change(td): | |||
|
212 | ||||
|
213 | for col in td.columns_missing_from_B: | |||
|
214 | modelTable.columns[col].create() | |||
|
215 | for col in td.columns_missing_from_A: | |||
|
216 | dbTable.columns[col].drop() | |||
|
217 | # XXX handle column changes here. | |||
|
218 | else: | |||
|
219 | # Sqlite doesn't support drop column, so you have to | |||
|
220 | # do more: create temp table, copy data to it, drop | |||
|
221 | # old table, create new table, copy data back. | |||
|
222 | # | |||
|
223 | # I wonder if this is guaranteed to be unique? | |||
|
224 | tempName = '_temp_%s' % modelTable.name | |||
|
225 | ||||
|
226 | def getCopyStatement(): | |||
|
227 | preparer = self.engine.dialect.preparer | |||
|
228 | commonCols = [] | |||
|
229 | for modelCol in modelTable.columns: | |||
|
230 | if modelCol.name in dbTable.columns: | |||
|
231 | commonCols.append(modelCol.name) | |||
|
232 | commonColsStr = ', '.join(commonCols) | |||
|
233 | return 'INSERT INTO %s (%s) SELECT %s FROM %s' % \ | |||
|
234 | (tableName, commonColsStr, commonColsStr, tempName) | |||
|
235 | ||||
|
236 | # Move the data in one transaction, so that we don't | |||
|
237 | # leave the database in a nasty state. | |||
|
238 | connection = self.engine.connect() | |||
|
239 | trans = connection.begin() | |||
|
240 | try: | |||
|
241 | connection.execute( | |||
|
242 | 'CREATE TEMPORARY TABLE %s as SELECT * from %s' % \ | |||
|
243 | (tempName, modelTable.name)) | |||
|
244 | # make sure the drop takes place inside our | |||
|
245 | # transaction with the bind parameter | |||
|
246 | modelTable.drop(bind=connection) | |||
|
247 | modelTable.create(bind=connection) | |||
|
248 | connection.execute(getCopyStatement()) | |||
|
249 | connection.execute('DROP TABLE %s' % tempName) | |||
|
250 | trans.commit() | |||
|
251 | except: | |||
|
252 | trans.rollback() | |||
|
253 | raise |
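The SQLite branch of `applyModel()` above works around SQLite's historical lack of `ALTER TABLE ... DROP COLUMN` by copying rows through a temporary table inside a single transaction. A standalone sketch of that trick with the stdlib `sqlite3` module (the table and column names here are illustrative, not from RhodeCode's schema):

```python
import sqlite3

# Drop a column the SQLite way: copy to a temp table, recreate the table
# without the column, copy the surviving columns back, all in one transaction.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, obsolete TEXT)')
conn.execute("INSERT INTO users (name, obsolete) VALUES ('alice', 'x')")

with conn:  # the context manager commits on success, rolls back on error
    conn.execute('CREATE TEMPORARY TABLE _temp_users AS SELECT * FROM users')
    conn.execute('DROP TABLE users')
    conn.execute('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)')
    conn.execute('INSERT INTO users (id, name) SELECT id, name FROM _temp_users')
    conn.execute('DROP TABLE _temp_users')

assert conn.execute('SELECT id, name FROM users').fetchall() == [(1, 'alice')]
```

Newer SQLite releases do support `DROP COLUMN` directly, but the copy-through-a-temp-table pattern remains the portable fallback the code above relies on.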
@@ -0,0 +1,100 b'' | |||||
|
1 | """ | |||
|
2 | Script to migrate a repository from sqlalchemy-migrate <= 0.4.4 to the new | |||
|
3 | repository schema. This shouldn't use any other migrate modules, so | |||
|
4 | that it can work in any version. | |||
|
5 | """ | |||
|
6 | ||||
|
7 | import os | |||
|
8 | import sys | |||
|
9 | import logging | |||
|
10 | ||||
|
11 | log = logging.getLogger(__name__) | |||
|
12 | ||||
|
13 | ||||
|
14 | def usage(): | |||
|
15 | """Gives usage information.""" | |||
|
16 | print """Usage: %(prog)s repository-to-migrate | |||
|
17 | ||||
|
18 | Upgrade your repository to the new flat format. | |||
|
19 | ||||
|
20 | NOTE: You should probably make a backup before running this. | |||
|
21 | """ % {'prog': sys.argv[0]} | |||
|
22 | ||||
|
23 | sys.exit(1) | |||
|
24 | ||||
|
25 | ||||
|
26 | def delete_file(filepath): | |||
|
27 | """Deletes a file and prints a message.""" | |||
|
28 | log.info('Deleting file: %s' % filepath) | |||
|
29 | os.remove(filepath) | |||
|
30 | ||||
|
31 | ||||
|
32 | def move_file(src, tgt): | |||
|
33 | """Moves a file and prints a message.""" | |||
|
34 | log.info('Moving file %s to %s' % (src, tgt)) | |||
|
35 | if os.path.exists(tgt): | |||
|
36 | raise Exception( | |||
|
37 | 'Cannot move file %s because target %s already exists' % \ | |||
|
38 | (src, tgt)) | |||
|
39 | os.rename(src, tgt) | |||
|
40 | ||||
|
41 | ||||
|
42 | def delete_directory(dirpath): | |||
|
43 | """Delete a directory and print a message.""" | |||
|
44 | log.info('Deleting directory: %s' % dirpath) | |||
|
45 | os.rmdir(dirpath) | |||
|
46 | ||||
|
47 | ||||
|
48 | def migrate_repository(repos): | |||
|
49 | """Does the actual migration to the new repository format.""" | |||
|
50 | log.info('Migrating repository at: %s to new format' % repos) | |||
|
51 | versions = '%s/versions' % repos | |||
|
52 | dirs = os.listdir(versions) | |||
|
53 | # Only use int's in list. | |||
|
54 | numdirs = [int(dirname) for dirname in dirs if dirname.isdigit()] | |||
|
55 | numdirs.sort() # Sort list. | |||
|
56 | for dirname in numdirs: | |||
|
57 | origdir = '%s/%s' % (versions, dirname) | |||
|
58 | log.info('Working on directory: %s' % origdir) | |||
|
59 | files = os.listdir(origdir) | |||
|
60 | files.sort() | |||
|
61 | for filename in files: | |||
|
62 | # Delete compiled Python files. | |||
|
63 | if filename.endswith('.pyc') or filename.endswith('.pyo'): | |||
|
64 | delete_file('%s/%s' % (origdir, filename)) | |||
|
65 | ||||
|
66 | # Delete empty __init__.py files. | |||
|
67 | origfile = '%s/__init__.py' % origdir | |||
|
68 | if os.path.exists(origfile) and len(open(origfile).read()) == 0: | |||
|
69 | delete_file(origfile) | |||
|
70 | ||||
|
71 | # Move sql upgrade scripts. | |||
|
72 | if filename.endswith('.sql'): | |||
|
73 | version, dbms, operation = filename.split('.', 3)[0:3] | |||
|
74 | origfile = '%s/%s' % (origdir, filename) | |||
|
75 | # For instance: 2.postgres.upgrade.sql -> | |||
|
76 | # 002_postgres_upgrade.sql | |||
|
77 | tgtfile = '%s/%03d_%s_%s.sql' % ( | |||
|
78 | versions, int(version), dbms, operation) | |||
|
79 | move_file(origfile, tgtfile) | |||
|
80 | ||||
|
81 | # Move Python upgrade script. | |||
|
82 | pyfile = '%s.py' % dirname | |||
|
83 | pyfilepath = '%s/%s' % (origdir, pyfile) | |||
|
84 | if os.path.exists(pyfilepath): | |||
|
85 | tgtfile = '%s/%03d.py' % (versions, int(dirname)) | |||
|
86 | move_file(pyfilepath, tgtfile) | |||
|
87 | ||||
|
88 | # Try to remove directory. Will fail if it's not empty. | |||
|
89 | delete_directory(origdir) | |||
|
90 | ||||
|
91 | ||||
|
92 | def main(): | |||
|
93 | """Main function to be called when using this script.""" | |||
|
94 | if len(sys.argv) != 2: | |||
|
95 | usage() | |||
|
96 | migrate_repository(sys.argv[1]) | |||
|
97 | ||||
|
98 | ||||
|
99 | if __name__ == '__main__': | |||
|
100 | main() |
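The renaming rule in `migrate_repository()` above turns `2.postgres.upgrade.sql` into `002_postgres_upgrade.sql`. That rule can be isolated as a small helper; a sketch (the function name is mine, not part of the script):

```python
def flat_name(filename):
    # '2.postgres.upgrade.sql' -> '002_postgres_upgrade.sql':
    # zero-pad the version to three digits and join parts with underscores.
    version, dbms, operation = filename.split('.', 3)[0:3]
    return '%03d_%s_%s.sql' % (int(version), dbms, operation)

assert flat_name('2.postgres.upgrade.sql') == '002_postgres_upgrade.sql'
```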
@@ -0,0 +1,75 b'' | |||||
|
1 | """ | |||
|
2 | A path/directory class. | |||
|
3 | """ | |||
|
4 | ||||
|
5 | import os | |||
|
6 | import shutil | |||
|
7 | import logging | |||
|
8 | ||||
|
9 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
10 | from rhodecode.lib.dbmigrate.migrate.versioning.config import * | |||
|
11 | from rhodecode.lib.dbmigrate.migrate.versioning.util import KeyedInstance | |||
|
12 | ||||
|
13 | ||||
|
14 | log = logging.getLogger(__name__) | |||
|
15 | ||||
|
16 | class Pathed(KeyedInstance): | |||
|
17 | """ | |||
|
18 | A class associated with a path/directory tree. | |||
|
19 | ||||
|
20 | Only one instance of this class may exist for a particular file; | |||
|
21 | __new__ will return an existing instance if possible | |||
|
22 | """ | |||
|
23 | parent = None | |||
|
24 | ||||
|
25 | @classmethod | |||
|
26 | def _key(cls, path): | |||
|
27 | return str(path) | |||
|
28 | ||||
|
29 | def __init__(self, path): | |||
|
30 | self.path = path | |||
|
31 | if self.__class__.parent is not None: | |||
|
32 | self._init_parent(path) | |||
|
33 | ||||
|
34 | def _init_parent(self, path): | |||
|
35 | """Try to initialize this object's parent, if it has one""" | |||
|
36 | parent_path = self.__class__._parent_path(path) | |||
|
37 | self.parent = self.__class__.parent(parent_path) | |||
|
38 | log.debug("Getting parent %r:%r" % (self.__class__.parent, parent_path)) | |||
|
39 | self.parent._init_child(path, self) | |||
|
40 | ||||
|
41 | def _init_child(self, path, child): | |||
|
42 | """Run when a child of this object is initialized. | |||
|
43 | ||||
|
44 | Parameters: the path to the child, and the child object | |||
|
45 | itself (matching the ``_init_parent`` call above) | |||
|
46 | """ | |||
|
47 | ||||
|
48 | @classmethod | |||
|
49 | def _parent_path(cls, path): | |||
|
50 | """ | |||
|
51 | Fetch the path of this object's parent from this object's path. | |||
|
52 | """ | |||
|
53 | # os.path.dirname(), but strip directories like files (like | |||
|
54 | # unix basename) | |||
|
55 | # | |||
|
56 | # Treat directories like files... | |||
|
57 | if path[-1] == '/': | |||
|
58 | path = path[:-1] | |||
|
59 | ret = os.path.dirname(path) | |||
|
60 | return ret | |||
|
61 | ||||
|
62 | @classmethod | |||
|
63 | def require_notfound(cls, path): | |||
|
64 | """Ensures a given path does not already exist""" | |||
|
65 | if os.path.exists(path): | |||
|
66 | raise exceptions.PathFoundError(path) | |||
|
67 | ||||
|
68 | @classmethod | |||
|
69 | def require_found(cls, path): | |||
|
70 | """Ensures a given path already exists""" | |||
|
71 | if not os.path.exists(path): | |||
|
72 | raise exceptions.PathNotFoundError(path) | |||
|
73 | ||||
|
74 | def __str__(self): | |||
|
75 | return self.path |
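`Pathed._parent_path()` above deliberately strips a trailing slash so that directories behave like files under `os.path.dirname()` (i.e. like Unix `basename`). A minimal standalone sketch, assuming POSIX-style paths:

```python
import os

def parent_path(path):
    # Mirror Pathed._parent_path: treat directories like files by
    # dropping a trailing slash before taking the dirname.
    if path[-1] == '/':
        path = path[:-1]
    return os.path.dirname(path)

# With or without the trailing slash, the parent comes out the same.
assert parent_path('/repo/versions/') == '/repo'
assert parent_path('/repo/versions') == '/repo'
```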
@@ -0,0 +1,231 b'' | |||||
|
1 | """ | |||
|
2 | SQLAlchemy migrate repository management. | |||
|
3 | """ | |||
|
4 | import os | |||
|
5 | import shutil | |||
|
6 | import string | |||
|
7 | import logging | |||
|
8 | ||||
|
9 | from pkg_resources import resource_filename | |||
|
10 | from tempita import Template as TempitaTemplate | |||
|
11 | ||||
|
12 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
13 | from rhodecode.lib.dbmigrate.migrate.versioning import version, pathed, cfgparse | |||
|
14 | from rhodecode.lib.dbmigrate.migrate.versioning.template import Template | |||
|
15 | from rhodecode.lib.dbmigrate.migrate.versioning.config import * | |||
|
16 | ||||
|
17 | ||||
|
18 | log = logging.getLogger(__name__) | |||
|
19 | ||||
|
20 | class Changeset(dict): | |||
|
21 | """A collection of changes to be applied to a database. | |||
|
22 | ||||
|
23 | Changesets are bound to a repository and manage a set of | |||
|
24 | scripts from that repository. | |||
|
25 | ||||
|
26 | Behaves like a dict, for the most part. Keys are ordered based on step value. | |||
|
27 | """ | |||
|
28 | ||||
|
29 | def __init__(self, start, *changes, **k): | |||
|
30 | """ | |||
|
31 | Give a start version; step must be explicitly stated. | |||
|
32 | """ | |||
|
33 | self.step = k.pop('step', 1) | |||
|
34 | self.start = version.VerNum(start) | |||
|
35 | self.end = self.start | |||
|
36 | for change in changes: | |||
|
37 | self.add(change) | |||
|
38 | ||||
|
39 | def __iter__(self): | |||
|
40 | return iter(self.items()) | |||
|
41 | ||||
|
42 | def keys(self): | |||
|
43 | """ | |||
|
44 | In a series of upgrades x -> y, keys are version x. Sorted. | |||
|
45 | """ | |||
|
46 | ret = super(Changeset, self).keys() | |||
|
47 | # Reverse order if downgrading | |||
|
48 | ret.sort(reverse=(self.step < 1)) | |||
|
49 | return ret | |||
|
50 | ||||
|
51 | def values(self): | |||
|
52 | return [self[k] for k in self.keys()] | |||
|
53 | ||||
|
54 | def items(self): | |||
|
55 | return zip(self.keys(), self.values()) | |||
|
56 | ||||
|
57 | def add(self, change): | |||
|
58 | """Add new change to changeset""" | |||
|
59 | key = self.end | |||
|
60 | self.end += self.step | |||
|
61 | self[key] = change | |||
|
62 | ||||
|
63 | def run(self, *p, **k): | |||
|
64 | """Run the changeset scripts""" | |||
|
65 | for version, script in self: | |||
|
66 | script.run(*p, **k) | |||
|
67 | ||||
|
68 | ||||
|
69 | class Repository(pathed.Pathed): | |||
|
70 | """A project's change script repository""" | |||
|
71 | ||||
|
72 | _config = 'migrate.cfg' | |||
|
73 | _versions = 'versions' | |||
|
74 | ||||
|
75 | def __init__(self, path): | |||
|
76 | log.debug('Loading repository %s...' % path) | |||
|
77 | self.verify(path) | |||
|
78 | super(Repository, self).__init__(path) | |||
|
79 | self.config = cfgparse.Config(os.path.join(self.path, self._config)) | |||
|
80 | self.versions = version.Collection(os.path.join(self.path, | |||
|
81 | self._versions)) | |||
|
82 | log.debug('Repository %s loaded successfully' % path) | |||
|
83 | log.debug('Config: %r' % self.config.to_dict()) | |||
|
84 | ||||
|
85 | @classmethod | |||
|
86 | def verify(cls, path): | |||
|
87 | """ | |||
|
88 | Ensure the target path is a valid repository. | |||
|
89 | ||||
|
90 | :raises: :exc:`InvalidRepositoryError <migrate.exceptions.InvalidRepositoryError>` | |||
|
91 | """ | |||
|
92 | # Ensure the existence of required files | |||
|
93 | try: | |||
|
94 | cls.require_found(path) | |||
|
95 | cls.require_found(os.path.join(path, cls._config)) | |||
|
96 | cls.require_found(os.path.join(path, cls._versions)) | |||
|
97 | except exceptions.PathNotFoundError, e: | |||
|
98 | raise exceptions.InvalidRepositoryError(path) | |||
|
99 | ||||
|
100 | @classmethod | |||
|
101 | def prepare_config(cls, tmpl_dir, name, options=None): | |||
|
102 | """ | |||
|
103 | Prepare a project configuration file for a new project. | |||
|
104 | ||||
|
105 | :param tmpl_dir: Path to Repository template | |||
|
106 | :param config_file: Name of the config file in Repository template | |||
|
107 | :param name: Repository name | |||
|
108 | :type tmpl_dir: string | |||
|
109 | :type config_file: string | |||
|
110 | :type name: string | |||
|
111 | :returns: Populated config file | |||
|
112 | """ | |||
|
113 | if options is None: | |||
|
114 | options = {} | |||
|
115 | options.setdefault('version_table', 'migrate_version') | |||
|
116 | options.setdefault('repository_id', name) | |||
|
117 | options.setdefault('required_dbs', []) | |||
|
118 | ||||
|
119 | tmpl = open(os.path.join(tmpl_dir, cls._config)).read() | |||
|
120 | ret = TempitaTemplate(tmpl).substitute(options) | |||
|
121 | ||||
|
122 | # cleanup | |||
|
123 | del options['__template_name__'] | |||
|
124 | ||||
|
125 | return ret | |||
|
126 | ||||
|
127 | @classmethod | |||
|
128 | def create(cls, path, name, **opts): | |||
|
129 | """Create a repository at a specified path""" | |||
|
130 | cls.require_notfound(path) | |||
|
131 | theme = opts.pop('templates_theme', None) | |||
|
132 | t_path = opts.pop('templates_path', None) | |||
|
133 | ||||
|
134 | # Create repository | |||
|
135 | tmpl_dir = Template(t_path).get_repository(theme=theme) | |||
|
136 | shutil.copytree(tmpl_dir, path) | |||
|
137 | ||||
|
138 | # Edit config defaults | |||
|
139 | config_text = cls.prepare_config(tmpl_dir, name, options=opts) | |||
|
140 | fd = open(os.path.join(path, cls._config), 'w') | |||
|
141 | fd.write(config_text) | |||
|
142 | fd.close() | |||
|
143 | ||||
|
144 | opts['repository_name'] = name | |||
|
145 | ||||
|
146 | # Create a management script | |||
|
147 | manager = os.path.join(path, 'manage.py') | |||
|
148 | Repository.create_manage_file(manager, templates_theme=theme, | |||
|
149 | templates_path=t_path, **opts) | |||
|
150 | ||||
|
151 | return cls(path) | |||
|
152 | ||||
|
153 | def create_script(self, description, **k): | |||
|
154 | """API to :meth:`migrate.versioning.version.Collection.create_new_python_version`""" | |||
|
155 | self.versions.create_new_python_version(description, **k) | |||
|
156 | ||||
|
157 | def create_script_sql(self, database, **k): | |||
|
158 | """API to :meth:`migrate.versioning.version.Collection.create_new_sql_version`""" | |||
|
159 | self.versions.create_new_sql_version(database, **k) | |||
|
160 | ||||
|
161 | @property | |||
|
162 | def latest(self): | |||
|
163 | """API to :attr:`migrate.versioning.version.Collection.latest`""" | |||
|
164 | return self.versions.latest | |||
|
165 | ||||
|
166 | @property | |||
|
167 | def version_table(self): | |||
|
168 | """Returns version_table name specified in config""" | |||
|
169 | return self.config.get('db_settings', 'version_table') | |||
|
170 | ||||
|
171 | @property | |||
|
172 | def id(self): | |||
|
173 | """Returns repository id specified in config""" | |||
|
174 | return self.config.get('db_settings', 'repository_id') | |||
|
175 | ||||
|
176 | def version(self, *p, **k): | |||
|
177 | """API to :attr:`migrate.versioning.version.Collection.version`""" | |||
|
178 | return self.versions.version(*p, **k) | |||
|
179 | ||||
|
180 | @classmethod | |||
|
181 | def clear(cls): | |||
|
182 | # TODO: deletes repo | |||
|
183 | super(Repository, cls).clear() | |||
|
184 | version.Collection.clear() | |||
|
185 | ||||
|
186 | def changeset(self, database, start, end=None): | |||
|
187 | """Create a changeset to migrate this database from ver. start to end/latest. | |||
|
188 | ||||
|
189 | :param database: name of database to generate changeset | |||
|
190 | :param start: version to start at | |||
|
191 | :param end: version to end at (latest if None given) | |||
|
192 | :type database: string | |||
|
193 | :type start: int | |||
|
194 | :type end: int | |||
|
195 | :returns: :class:`Changeset instance <migration.versioning.repository.Changeset>` | |||
|
196 | """ | |||
|
197 | start = version.VerNum(start) | |||
|
198 | ||||
|
199 | if end is None: | |||
|
200 | end = self.latest | |||
|
201 | else: | |||
|
202 | end = version.VerNum(end) | |||
|
203 | ||||
|
204 | if start <= end: | |||
|
205 | step = 1 | |||
|
206 | range_mod = 1 | |||
|
207 | op = 'upgrade' | |||
|
208 | else: | |||
|
209 | step = -1 | |||
|
210 | range_mod = 0 | |||
|
211 | op = 'downgrade' | |||
|
212 | ||||
|
213 | versions = range(start + range_mod, end + range_mod, step) | |||
|
214 | changes = [self.version(v).script(database, op) for v in versions] | |||
|
215 | ret = Changeset(start, step=step, *changes) | |||
|
216 | return ret | |||
|
217 | ||||
|
218 | @classmethod | |||
|
219 | def create_manage_file(cls, file_, **opts): | |||
|
220 | """Create a project management script (manage.py) | |||
|
221 | ||||
|
222 | :param file_: Destination file to be written | |||
|
223 | :param opts: Options that are passed to :func:`migrate.versioning.shell.main` | |||
|
224 | """ | |||
|
225 | mng_file = Template(opts.pop('templates_path', None))\ | |||
|
226 | .get_manage(theme=opts.pop('templates_theme', None)) | |||
|
227 | ||||
|
228 | tmpl = open(mng_file).read() | |||
|
229 | fd = open(file_, 'w') | |||
|
230 | fd.write(TempitaTemplate(tmpl).substitute(opts)) | |||
|
231 | fd.close() |
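`Repository.changeset()` above derives the script versions to run from the start/end pair: an upgrade walks `start+1 .. end` ascending, while a downgrade walks `start .. end+1` descending. The range arithmetic in isolation (the function name is mine):

```python
def changeset_versions(start, end):
    # Sketch of the range logic in Repository.changeset(): which version
    # scripts run, and in what order, for an upgrade or a downgrade.
    if start <= end:
        step, range_mod = 1, 1   # upgrade: run scripts start+1 .. end
    else:
        step, range_mod = -1, 0  # downgrade: run scripts start .. end+1
    return list(range(start + range_mod, end + range_mod, step))

assert changeset_versions(0, 3) == [1, 2, 3]  # upgrade applies 1, 2, 3
assert changeset_versions(3, 1) == [3, 2]     # downgrade reverses 3, 2
```

The asymmetry comes from script naming: the script numbered `n` upgrades the database *to* version `n`, but downgrades *from* it.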
@@ -0,0 +1,213 b'' | |||||
|
1 | """ | |||
|
2 | Database schema version management. | |||
|
3 | """ | |||
|
4 | import sys | |||
|
5 | import logging | |||
|
6 | ||||
|
7 | from sqlalchemy import (Table, Column, MetaData, String, Text, Integer, | |||
|
8 | create_engine) | |||
|
9 | from sqlalchemy.sql import and_ | |||
|
10 | from sqlalchemy import exceptions as sa_exceptions | |||
|
11 | from sqlalchemy.sql import bindparam | |||
|
12 | ||||
|
13 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
14 | from rhodecode.lib.dbmigrate.migrate.versioning import genmodel, schemadiff | |||
|
15 | from rhodecode.lib.dbmigrate.migrate.versioning.repository import Repository | |||
|
16 | from rhodecode.lib.dbmigrate.migrate.versioning.util import load_model | |||
|
17 | from rhodecode.lib.dbmigrate.migrate.versioning.version import VerNum | |||
|
18 | ||||
|
19 | ||||
|
20 | log = logging.getLogger(__name__) | |||
|
21 | ||||
|
22 | class ControlledSchema(object): | |||
|
23 | """A database under version control""" | |||
|
24 | ||||
|
25 | def __init__(self, engine, repository): | |||
|
26 | if isinstance(repository, basestring): | |||
|
27 | repository = Repository(repository) | |||
|
28 | self.engine = engine | |||
|
29 | self.repository = repository | |||
|
30 | self.meta = MetaData(engine) | |||
|
31 | self.load() | |||
|
32 | ||||
|
33 | def __eq__(self, other): | |||
|
34 | """Compare two schemas by repositories and versions""" | |||
|
35 | return (self.repository is other.repository \ | |||
|
36 | and self.version == other.version) | |||
|
37 | ||||
|
38 | def load(self): | |||
|
39 | """Load controlled schema version info from DB""" | |||
|
40 | tname = self.repository.version_table | |||
|
41 | try: | |||
|
42 | if not hasattr(self, 'table') or self.table is None: | |||
|
43 | self.table = Table(tname, self.meta, autoload=True) | |||
|
44 | ||||
|
45 | result = self.engine.execute(self.table.select( | |||
|
46 | self.table.c.repository_id == str(self.repository.id))) | |||
|
47 | ||||
|
48 | data = list(result)[0] | |||
|
49 | except: | |||
|
50 | cls, exc, tb = sys.exc_info() | |||
|
51 | raise exceptions.DatabaseNotControlledError, exc.__str__(), tb | |||
|
52 | ||||
|
53 | self.version = data['version'] | |||
|
54 | return data | |||
|
55 | ||||
|
56 | def drop(self): | |||
|
57 | """ | |||
|
58 | Remove version control from a database. | |||
|
59 | """ | |||
|
60 | try: | |||
|
61 | self.table.drop() | |||
|
62 | except (sa_exceptions.SQLError): | |||
|
63 | raise exceptions.DatabaseNotControlledError(str(self.table)) | |||
|
64 | ||||
|
65 | def changeset(self, version=None): | |||
|
66 | """API to Changeset creation. | |||
|
67 | ||||
|
68 | Uses self.version for start version and engine.name | |||
|
69 | to get database name. | |||
|
70 | """ | |||
|
71 | database = self.engine.name | |||
|
72 | start_ver = self.version | |||
|
73 | changeset = self.repository.changeset(database, start_ver, version) | |||
|
74 | return changeset | |||
|
75 | ||||
|
76 | def runchange(self, ver, change, step): | |||
|
77 | startver = ver | |||
|
78 | endver = ver + step | |||
|
79 | # Current database version must be correct! Don't run if corrupt! | |||
|
80 | if self.version != startver: | |||
|
81 | raise exceptions.InvalidVersionError("%s is not %s" % \ | |||
|
82 | (self.version, startver)) | |||
|
83 | # Run the change | |||
|
84 | change.run(self.engine, step) | |||
|
85 | ||||
|
86 | # Update/refresh database version | |||
|
87 | self.update_repository_table(startver, endver) | |||
|
88 | self.load() | |||
|
89 | ||||
|
90 | def update_repository_table(self, startver, endver): | |||
|
91 | """Update version_table with new information""" | |||
|
92 | update = self.table.update(and_(self.table.c.version == int(startver), | |||
|
93 | self.table.c.repository_id == str(self.repository.id))) | |||
|
94 | self.engine.execute(update, version=int(endver)) | |||
|
95 | ||||
|
96 | def upgrade(self, version=None): | |||
|
97 | """ | |||
|
98 | Upgrade (or downgrade) to a specified version, or latest version. | |||
|
99 | """ | |||
|
100 | changeset = self.changeset(version) | |||
|
101 | for ver, change in changeset: | |||
|
102 | self.runchange(ver, change, changeset.step) | |||
|
103 | ||||
|
104 | def update_db_from_model(self, model): | |||
|
105 | """ | |||
|
106 | Modify the database to match the structure of the current Python model. | |||
|
107 | """ | |||
|
108 | model = load_model(model) | |||
|
109 | ||||
|
110 | diff = schemadiff.getDiffOfModelAgainstDatabase( | |||
|
111 | model, self.engine, excludeTables=[self.repository.version_table] | |||
|
112 | ) | |||
|
113 | genmodel.ModelGenerator(diff,self.engine).applyModel() | |||
|
114 | ||||
|
115 | self.update_repository_table(self.version, int(self.repository.latest)) | |||
|
116 | ||||
|
117 | self.load() | |||
|
118 | ||||
|
119 | @classmethod | |||
|
120 | def create(cls, engine, repository, version=None): | |||
|
121 | """ | |||
|
122 | Declare a database to be under a repository's version control. | |||
|
123 | ||||
|
124 | :raises: :exc:`DatabaseAlreadyControlledError` | |||
|
125 | :returns: :class:`ControlledSchema` | |||
|
126 | """ | |||
|
127 | # Confirm that the version # is valid: positive, integer, | |||
|
128 | # exists in repos | |||
|
129 | if isinstance(repository, basestring): | |||
|
130 | repository = Repository(repository) | |||
|
131 | version = cls._validate_version(repository, version) | |||
|
132 | table = cls._create_table_version(engine, repository, version) | |||
|
133 | # TODO: history table | |||
|
134 | # Load repository information and return | |||
|
135 | return cls(engine, repository) | |||
|
136 | ||||
|
137 | @classmethod | |||
|
138 | def _validate_version(cls, repository, version): | |||
|
139 | """ | |||
|
140 | Ensures this is a valid version number for this repository. | |||
|
141 | ||||
|
142 | :raises: :exc:`InvalidVersionError` if invalid | |||
|
143 | :return: valid version number | |||
|
144 | """ | |||
|
145 | if version is None: | |||
|
146 | version = 0 | |||
|
147 | try: | |||
|
148 | version = VerNum(version) # raises valueerror | |||
|
149 | if version < 0 or version > repository.latest: | |||
|
150 | raise ValueError() | |||
|
151 | except ValueError: | |||
|
152 | raise exceptions.InvalidVersionError(version) | |||
|
153 | return version | |||
|
154 | ||||
|
155 | @classmethod | |||
|
156 | def _create_table_version(cls, engine, repository, version): | |||
|
157 | """ | |||
|
158 | Creates the versioning table in a database. | |||
|
159 | ||||
|
160 | :raises: :exc:`DatabaseAlreadyControlledError` | |||
|
161 | """ | |||
|
162 | # Create tables | |||
|
163 | tname = repository.version_table | |||
|
164 | meta = MetaData(engine) | |||
|
165 | ||||
|
166 | table = Table( | |||
|
167 | tname, meta, | |||
|
168 | Column('repository_id', String(250), primary_key=True), | |||
|
169 | Column('repository_path', Text), | |||
|
170 | Column('version', Integer), ) | |||
|
171 | ||||
|
172 | # there can be multiple repositories/schemas in the same db | |||
|
173 | if not table.exists(): | |||
|
174 | table.create() | |||
|
175 | ||||
|
176 | # test for existing repository_id | |||
|
177 | s = table.select(table.c.repository_id == bindparam("repository_id")) | |||
|
178 | result = engine.execute(s, repository_id=repository.id) | |||
|
179 | if result.fetchone(): | |||
|
180 | raise exceptions.DatabaseAlreadyControlledError | |||
|
181 | ||||
|
182 | # Insert data | |||
|
183 | engine.execute(table.insert().values( | |||
|
184 | repository_id=repository.id, | |||
|
185 | repository_path=repository.path, | |||
|
186 | version=int(version))) | |||
|
187 | return table | |||
|
188 | ||||
|
189 | @classmethod | |||
|
190 | def compare_model_to_db(cls, engine, model, repository): | |||
|
191 | """ | |||
|
192 | Compare the current model against the current database. | |||
|
193 | """ | |||
|
194 | if isinstance(repository, basestring): | |||
|
195 | repository = Repository(repository) | |||
|
196 | model = load_model(model) | |||
|
197 | ||||
|
198 | diff = schemadiff.getDiffOfModelAgainstDatabase( | |||
|
199 | model, engine, excludeTables=[repository.version_table]) | |||
|
200 | return diff | |||
|
201 | ||||
|
202 | @classmethod | |||
|
203 | def create_model(cls, engine, repository, declarative=False): | |||
|
204 | """ | |||
|
205 | Dump the current database as a Python model. | |||
|
206 | """ | |||
|
207 | if isinstance(repository, basestring): | |||
|
208 | repository = Repository(repository) | |||
|
209 | ||||
|
210 | diff = schemadiff.getDiffOfModelAgainstDatabase( | |||
|
211 | MetaData(), engine, excludeTables=[repository.version_table] | |||
|
212 | ) | |||
|
213 | return genmodel.ModelGenerator(diff, engine, declarative).toPython() |
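The version-table bookkeeping in `_create_table_version` above can be sketched with the stdlib `sqlite3` driver instead of SQLAlchemy. This is a simplified stand-in: the table name `migrate_version` and its three columns mirror the code, while the function and exception here are illustrative helpers, not the actual migrate API.

```python
import sqlite3

class DatabaseAlreadyControlledError(Exception):
    """Raised when the repository id is already registered (mirrors migrate.exceptions)."""

def create_version_table(conn, repository_id, repository_path, version=0):
    # Create the versioning table if needed; several repositories/schemas
    # may share the same database, so the table may already exist.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS migrate_version ("
        " repository_id TEXT PRIMARY KEY,"
        " repository_path TEXT,"
        " version INTEGER)")
    # Test for an existing repository_id before inserting, as the code does.
    row = conn.execute(
        "SELECT 1 FROM migrate_version WHERE repository_id = ?",
        (repository_id,)).fetchone()
    if row:
        raise DatabaseAlreadyControlledError(repository_id)
    conn.execute(
        "INSERT INTO migrate_version VALUES (?, ?, ?)",
        (repository_id, repository_path, int(version)))

conn = sqlite3.connect(":memory:")
create_version_table(conn, "rhodecode_db_migrations", "/tmp/repo", 0)
```

Calling the helper twice with the same `repository_id` raises, which is the "already controlled" check the real code performs before inserting its row.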
@@ -0,0 +1,285 b'' | |||||
|
1 | """ | |||
|
2 | Schema differencing support. | |||
|
3 | """ | |||
|
4 | ||||
|
5 | import logging | |||
|
6 | import sqlalchemy | |||
|
7 | ||||
|
8 | from rhodecode.lib.dbmigrate.migrate.changeset import SQLA_06 | |||
|
9 | from sqlalchemy.types import Float | |||
|
10 | ||||
|
11 | log = logging.getLogger(__name__) | |||
|
12 | ||||
|
13 | def getDiffOfModelAgainstDatabase(metadata, engine, excludeTables=None): | |||
|
14 | """ | |||
|
15 | Return differences of model against database. | |||
|
16 | ||||
|
17 | :return: object which will evaluate to :keyword:`True` if there \ | |||
|
18 | are differences else :keyword:`False`. | |||
|
19 | """ | |||
|
20 | return SchemaDiff(metadata, | |||
|
21 | sqlalchemy.MetaData(engine, reflect=True), | |||
|
22 | labelA='model', | |||
|
23 | labelB='database', | |||
|
24 | excludeTables=excludeTables) | |||
|
25 | ||||
|
26 | ||||
|
27 | def getDiffOfModelAgainstModel(metadataA, metadataB, excludeTables=None): | |||
|
28 | """ | |||
|
29 | Return differences of model against another model. | |||
|
30 | ||||
|
31 | :return: object which will evaluate to :keyword:`True` if there \ | |||
|
32 | are differences else :keyword:`False`. | |||
|
33 | """ | |||
|
34 | return SchemaDiff(metadataA, metadataB, excludeTables) | |||
|
35 | ||||
|
36 | ||||
|
37 | class ColDiff(object): | |||
|
38 | """ | |||
|
39 | Container for differences in one :class:`~sqlalchemy.schema.Column` | |||
|
40 | between two :class:`~sqlalchemy.schema.Table` instances, ``A`` | |||
|
41 | and ``B``. | |||
|
42 | ||||
|
43 | .. attribute:: col_A | |||
|
44 | ||||
|
45 | The :class:`~sqlalchemy.schema.Column` object for A. | |||
|
46 | ||||
|
47 | .. attribute:: col_B | |||
|
48 | ||||
|
49 | The :class:`~sqlalchemy.schema.Column` object for B. | |||
|
50 | ||||
|
51 | .. attribute:: type_A | |||
|
52 | ||||
|
53 | The most generic type of the :class:`~sqlalchemy.schema.Column` | |||
|
54 | object in A. | |||
|
55 | ||||
|
56 | .. attribute:: type_B | |||
|
57 | ||||
|
58 | The most generic type of the :class:`~sqlalchemy.schema.Column` | |||
|
59 | object in B. | |||
|
60 | ||||
|
61 | """ | |||
|
62 | ||||
|
63 | diff = False | |||
|
64 | ||||
|
65 | def __init__(self,col_A,col_B): | |||
|
66 | self.col_A = col_A | |||
|
67 | self.col_B = col_B | |||
|
68 | ||||
|
69 | self.type_A = col_A.type | |||
|
70 | self.type_B = col_B.type | |||
|
71 | ||||
|
72 | self.affinity_A = self.type_A._type_affinity | |||
|
73 | self.affinity_B = self.type_B._type_affinity | |||
|
74 | ||||
|
75 | if self.affinity_A is not self.affinity_B: | |||
|
76 | self.diff = True | |||
|
77 | return | |||
|
78 | ||||
|
79 | if isinstance(self.type_A,Float) or isinstance(self.type_B,Float): | |||
|
80 | if not (isinstance(self.type_A,Float) and isinstance(self.type_B,Float)): | |||
|
81 | self.diff=True | |||
|
82 | return | |||
|
83 | ||||
|
84 | for attr in ('precision','scale','length'): | |||
|
85 | A = getattr(self.type_A,attr,None) | |||
|
86 | B = getattr(self.type_B,attr,None) | |||
|
87 | if not (A is None or B is None) and A!=B: | |||
|
88 | self.diff=True | |||
|
89 | return | |||
|
90 | ||||
|
91 | def __nonzero__(self): | |||
|
92 | return self.diff | |||
|
93 | ||||
|
94 | class TableDiff(object): | |||
|
95 | """ | |||
|
96 | Container for differences in one :class:`~sqlalchemy.schema.Table` | |||
|
97 | between two :class:`~sqlalchemy.schema.MetaData` instances, ``A`` | |||
|
98 | and ``B``. | |||
|
99 | ||||
|
100 | .. attribute:: columns_missing_from_A | |||
|
101 | ||||
|
102 | A sequence of column names that were found in B but weren't in | |||
|
103 | A. | |||
|
104 | ||||
|
105 | .. attribute:: columns_missing_from_B | |||
|
106 | ||||
|
107 | A sequence of column names that were found in A but weren't in | |||
|
108 | B. | |||
|
109 | ||||
|
110 | .. attribute:: columns_different | |||
|
111 | ||||
|
112 | A dictionary containing information about columns that were | |||
|
113 | found to be different. | |||
|
114 | It maps column names to :class:`ColDiff` objects describing the | |||
|
115 | differences found. | |||
|
116 | """ | |||
|
117 | __slots__ = ( | |||
|
118 | 'columns_missing_from_A', | |||
|
119 | 'columns_missing_from_B', | |||
|
120 | 'columns_different', | |||
|
121 | ) | |||
|
122 | ||||
|
123 | def __nonzero__(self): | |||
|
124 | return bool( | |||
|
125 | self.columns_missing_from_A or | |||
|
126 | self.columns_missing_from_B or | |||
|
127 | self.columns_different | |||
|
128 | ) | |||
|
129 | ||||
|
130 | class SchemaDiff(object): | |||
|
131 | """ | |||
|
132 | Compute the difference between two :class:`~sqlalchemy.schema.MetaData` | |||
|
133 | objects. | |||
|
134 | ||||
|
135 | The string representation of a :class:`SchemaDiff` will summarise | |||
|
136 | the changes found between the two | |||
|
137 | :class:`~sqlalchemy.schema.MetaData` objects. | |||
|
138 | ||||
|
139 | The length of a :class:`SchemaDiff` will give the number of | |||
|
140 | changes found, enabling it to be used much like a boolean in | |||
|
141 | expressions. | |||
|
142 | ||||
|
143 | :param metadataA: | |||
|
144 | First :class:`~sqlalchemy.schema.MetaData` to compare. | |||
|
145 | ||||
|
146 | :param metadataB: | |||
|
147 | Second :class:`~sqlalchemy.schema.MetaData` to compare. | |||
|
148 | ||||
|
149 | :param labelA: | |||
|
150 | The label to use in messages about the first | |||
|
151 | :class:`~sqlalchemy.schema.MetaData`. | |||
|
152 | ||||
|
153 | :param labelB: | |||
|
154 | The label to use in messages about the second | |||
|
155 | :class:`~sqlalchemy.schema.MetaData`. | |||
|
156 | ||||
|
157 | :param excludeTables: | |||
|
158 | A sequence of table names to exclude. | |||
|
159 | ||||
|
160 | .. attribute:: tables_missing_from_A | |||
|
161 | ||||
|
162 | A sequence of table names that were found in B but weren't in | |||
|
163 | A. | |||
|
164 | ||||
|
165 | .. attribute:: tables_missing_from_B | |||
|
166 | ||||
|
167 | A sequence of table names that were found in A but weren't in | |||
|
168 | B. | |||
|
169 | ||||
|
170 | .. attribute:: tables_different | |||
|
171 | ||||
|
172 | A dictionary containing information about tables that were found | |||
|
173 | to be different. | |||
|
174 | It maps table names to :class:`TableDiff` objects describing the | |||
|
175 | differences found. | |||
|
176 | """ | |||
|
177 | ||||
|
178 | def __init__(self, | |||
|
179 | metadataA, metadataB, | |||
|
180 | labelA='metadataA', | |||
|
181 | labelB='metadataB', | |||
|
182 | excludeTables=None): | |||
|
183 | ||||
|
184 | self.metadataA, self.metadataB = metadataA, metadataB | |||
|
185 | self.labelA, self.labelB = labelA, labelB | |||
|
186 | self.label_width = max(len(labelA),len(labelB)) | |||
|
187 | excludeTables = set(excludeTables or []) | |||
|
188 | ||||
|
189 | A_table_names = set(metadataA.tables.keys()) | |||
|
190 | B_table_names = set(metadataB.tables.keys()) | |||
|
191 | ||||
|
192 | self.tables_missing_from_A = sorted( | |||
|
193 | B_table_names - A_table_names - excludeTables | |||
|
194 | ) | |||
|
195 | self.tables_missing_from_B = sorted( | |||
|
196 | A_table_names - B_table_names - excludeTables | |||
|
197 | ) | |||
|
198 | ||||
|
199 | self.tables_different = {} | |||
|
200 | for table_name in A_table_names.intersection(B_table_names): | |||
|
201 | ||||
|
202 | td = TableDiff() | |||
|
203 | ||||
|
204 | A_table = metadataA.tables[table_name] | |||
|
205 | B_table = metadataB.tables[table_name] | |||
|
206 | ||||
|
207 | A_column_names = set(A_table.columns.keys()) | |||
|
208 | B_column_names = set(B_table.columns.keys()) | |||
|
209 | ||||
|
210 | td.columns_missing_from_A = sorted( | |||
|
211 | B_column_names - A_column_names | |||
|
212 | ) | |||
|
213 | ||||
|
214 | td.columns_missing_from_B = sorted( | |||
|
215 | A_column_names - B_column_names | |||
|
216 | ) | |||
|
217 | ||||
|
218 | td.columns_different = {} | |||
|
219 | ||||
|
220 | for col_name in A_column_names.intersection(B_column_names): | |||
|
221 | ||||
|
222 | cd = ColDiff( | |||
|
223 | A_table.columns.get(col_name), | |||
|
224 | B_table.columns.get(col_name) | |||
|
225 | ) | |||
|
226 | ||||
|
227 | if cd: | |||
|
228 | td.columns_different[col_name]=cd | |||
|
229 | ||||
|
230 | # XXX - index and constraint differences should | |||
|
231 | # be checked for here | |||
|
232 | ||||
|
233 | if td: | |||
|
234 | self.tables_different[table_name]=td | |||
|
235 | ||||
|
236 | def __str__(self): | |||
|
237 | ''' Summarize differences. ''' | |||
|
238 | out = [] | |||
|
239 | column_template =' %%%is: %%r' % self.label_width | |||
|
240 | ||||
|
241 | for names,label in ( | |||
|
242 | (self.tables_missing_from_A,self.labelA), | |||
|
243 | (self.tables_missing_from_B,self.labelB), | |||
|
244 | ): | |||
|
245 | if names: | |||
|
246 | out.append( | |||
|
247 | ' tables missing from %s: %s' % ( | |||
|
248 | label,', '.join(sorted(names)) | |||
|
249 | ) | |||
|
250 | ) | |||
|
251 | ||||
|
252 | for name,td in sorted(self.tables_different.items()): | |||
|
253 | out.append( | |||
|
254 | ' table with differences: %s' % name | |||
|
255 | ) | |||
|
256 | for names,label in ( | |||
|
257 | (td.columns_missing_from_A,self.labelA), | |||
|
258 | (td.columns_missing_from_B,self.labelB), | |||
|
259 | ): | |||
|
260 | if names: | |||
|
261 | out.append( | |||
|
262 | ' %s missing these columns: %s' % ( | |||
|
263 | label,', '.join(sorted(names)) | |||
|
264 | ) | |||
|
265 | ) | |||
|
266 | for name,cd in td.columns_different.items(): | |||
|
267 | out.append(' column with differences: %s' % name) | |||
|
268 | out.append(column_template % (self.labelA,cd.col_A)) | |||
|
269 | out.append(column_template % (self.labelB,cd.col_B)) | |||
|
270 | ||||
|
271 | if out: | |||
|
272 | out.insert(0, 'Schema diffs:') | |||
|
273 | return '\n'.join(out) | |||
|
274 | else: | |||
|
275 | return 'No schema diffs' | |||
|
276 | ||||
|
277 | def __len__(self): | |||
|
278 | """ | |||
|
279 | Used in bool evaluation, return of 0 means no diffs. | |||
|
280 | """ | |||
|
281 | return ( | |||
|
282 | len(self.tables_missing_from_A) + | |||
|
283 | len(self.tables_missing_from_B) + | |||
|
284 | len(self.tables_different) | |||
|
285 | ) |
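The core of `SchemaDiff` above is plain set arithmetic over table and column names. A minimal dict-based sketch of that comparison (no SQLAlchemy involved; the function name and the `{table: [columns]}` input shape are illustrative, the `*_missing_from_*` keys mirror the attributes in the code):

```python
def diff_schemas(schema_a, schema_b, exclude=()):
    """Compare two {table_name: [column_name, ...]} mappings the way SchemaDiff does."""
    a_tables = set(schema_a) - set(exclude)
    b_tables = set(schema_b) - set(exclude)
    result = {
        'tables_missing_from_A': sorted(b_tables - a_tables),
        'tables_missing_from_B': sorted(a_tables - b_tables),
        'tables_different': {},
    }
    # Tables present on both sides are compared column-by-column.
    for name in a_tables & b_tables:
        cols_a, cols_b = set(schema_a[name]), set(schema_b[name])
        if cols_a != cols_b:
            result['tables_different'][name] = {
                'columns_missing_from_A': sorted(cols_b - cols_a),
                'columns_missing_from_B': sorted(cols_a - cols_b),
            }
    return result
```

The real class additionally compares column types via `ColDiff`; this sketch stops at name-level differences, which is enough to see why `excludeTables` must carry the version table.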
@@ -0,0 +1,6 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | from rhodecode.lib.dbmigrate.migrate.versioning.script.base import BaseScript | |||
|
5 | from rhodecode.lib.dbmigrate.migrate.versioning.script.py import PythonScript | |||
|
6 | from rhodecode.lib.dbmigrate.migrate.versioning.script.sql import SqlScript |
@@ -0,0 +1,57 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | import logging | |||
|
4 | ||||
|
5 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
6 | from rhodecode.lib.dbmigrate.migrate.versioning.config import operations | |||
|
7 | from rhodecode.lib.dbmigrate.migrate.versioning import pathed | |||
|
8 | ||||
|
9 | ||||
|
10 | log = logging.getLogger(__name__) | |||
|
11 | ||||
|
12 | class BaseScript(pathed.Pathed): | |||
|
13 | """Base class for other types of scripts. | |||
|
14 | All scripts have the following properties: | |||
|
15 | ||||
|
16 | source (script.source()) | |||
|
17 | The source code of the script | |||
|
18 | version (script.version()) | |||
|
19 | The version number of the script | |||
|
20 | operations (script.operations()) | |||
|
21 | The operations defined by the script: upgrade(), downgrade() or both. | |||
|
22 | Returns a tuple of operations. | |||
|
23 | Can also check for an operation, e.g. script.operation(Script.ops.up) | |||
|
24 | """ # TODO: sphinxfy this and implement it correctly | |||
|
25 | ||||
|
26 | def __init__(self, path): | |||
|
27 | log.debug('Loading script %s...' % path) | |||
|
28 | self.verify(path) | |||
|
29 | super(BaseScript, self).__init__(path) | |||
|
30 | log.debug('Script %s loaded successfully' % path) | |||
|
31 | ||||
|
32 | @classmethod | |||
|
33 | def verify(cls, path): | |||
|
34 | """Ensure this is a valid script | |||
|
35 | This version simply ensures the script file's existence | |||
|
36 | ||||
|
37 | :raises: :exc:`InvalidScriptError <migrate.exceptions.InvalidScriptError>` | |||
|
38 | """ | |||
|
39 | try: | |||
|
40 | cls.require_found(path) | |||
|
41 | except: | |||
|
42 | raise exceptions.InvalidScriptError(path) | |||
|
43 | ||||
|
44 | def source(self): | |||
|
45 | """:returns: source code of the script. | |||
|
46 | :rtype: string | |||
|
47 | """ | |||
|
48 | fd = open(self.path) | |||
|
49 | ret = fd.read() | |||
|
50 | fd.close() | |||
|
51 | return ret | |||
|
52 | ||||
|
53 | def run(self, engine): | |||
|
54 | """Core of each BaseScript subclass. | |||
|
55 | This method executes the script. | |||
|
56 | """ | |||
|
57 | raise NotImplementedError() |
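`BaseScript.verify` just wraps an existence check into a domain-specific exception, and `source()` returns the file's contents. A stand-alone sketch of both (the exception name mirrors `migrate.exceptions.InvalidScriptError`; the function names are illustrative):

```python
import os

class InvalidScriptError(Exception):
    """Raised when a migration script path does not exist."""

def verify_script(path):
    # Mirror BaseScript.verify: existence is the only check at this level;
    # subclasses add stricter checks (e.g. an importable upgrade() function).
    if not os.path.exists(path):
        raise InvalidScriptError(path)
    return path

def read_source(path):
    # BaseScript.source(): return the script's source code as a string.
    with open(path) as fd:
        return fd.read()
```

Using a context manager here replaces the explicit `open`/`close` pair in the original, with the same effect.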
@@ -0,0 +1,159 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | import shutil | |||
|
5 | import warnings | |||
|
6 | import logging | |||
|
7 | from StringIO import StringIO | |||
|
8 | ||||
|
9 | from rhodecode.lib.dbmigrate import migrate | |||
|
10 | from rhodecode.lib.dbmigrate.migrate.versioning import genmodel, schemadiff | |||
|
11 | from rhodecode.lib.dbmigrate.migrate.versioning.config import operations | |||
|
12 | from rhodecode.lib.dbmigrate.migrate.versioning.template import Template | |||
|
13 | from rhodecode.lib.dbmigrate.migrate.versioning.script import base | |||
|
14 | from rhodecode.lib.dbmigrate.migrate.versioning.util import import_path, load_model, with_engine | |||
|
15 | from rhodecode.lib.dbmigrate.migrate.exceptions import MigrateDeprecationWarning, InvalidScriptError, ScriptError | |||
|
16 | ||||
|
17 | log = logging.getLogger(__name__) | |||
|
18 | __all__ = ['PythonScript'] | |||
|
19 | ||||
|
20 | ||||
|
21 | class PythonScript(base.BaseScript): | |||
|
22 | """Base for Python scripts""" | |||
|
23 | ||||
|
24 | @classmethod | |||
|
25 | def create(cls, path, **opts): | |||
|
26 | """Create an empty migration script at specified path | |||
|
27 | ||||
|
28 | :returns: :class:`PythonScript instance <migrate.versioning.script.py.PythonScript>`""" | |||
|
29 | cls.require_notfound(path) | |||
|
30 | ||||
|
31 | src = Template(opts.pop('templates_path', None)).get_script(theme=opts.pop('templates_theme', None)) | |||
|
32 | shutil.copy(src, path) | |||
|
33 | ||||
|
34 | return cls(path) | |||
|
35 | ||||
|
36 | @classmethod | |||
|
37 | def make_update_script_for_model(cls, engine, oldmodel, | |||
|
38 | model, repository, **opts): | |||
|
39 | """Create a migration script based on difference between two SA models. | |||
|
40 | ||||
|
41 | :param repository: path to migrate repository | |||
|
42 | :param oldmodel: dotted.module.name:SAClass or SAClass object | |||
|
43 | :param model: dotted.module.name:SAClass or SAClass object | |||
|
44 | :param engine: SQLAlchemy engine | |||
|
45 | :type repository: string or :class:`Repository instance <migrate.versioning.repository.Repository>` | |||
|
46 | :type oldmodel: string or Class | |||
|
47 | :type model: string or Class | |||
|
48 | :type engine: Engine instance | |||
|
49 | :returns: Upgrade / Downgrade script | |||
|
50 | :rtype: string | |||
|
51 | """ | |||
|
52 | ||||
|
53 | if isinstance(repository, basestring): | |||
|
54 | # oh dear, an import cycle! | |||
|
55 | from rhodecode.lib.dbmigrate.migrate.versioning.repository import Repository | |||
|
56 | repository = Repository(repository) | |||
|
57 | ||||
|
58 | oldmodel = load_model(oldmodel) | |||
|
59 | model = load_model(model) | |||
|
60 | ||||
|
61 | # Compute differences. | |||
|
62 | diff = schemadiff.getDiffOfModelAgainstModel( | |||
|
63 | oldmodel, | |||
|
64 | model, | |||
|
65 | excludeTables=[repository.version_table]) | |||
|
66 | # TODO: diff can be False (there is no difference?) | |||
|
67 | decls, upgradeCommands, downgradeCommands = \ | |||
|
68 | genmodel.ModelGenerator(diff, engine).toUpgradeDowngradePython() | |||
|
69 | ||||
|
70 | # Store differences into file. | |||
|
71 | src = Template(opts.pop('templates_path', None)).get_script(opts.pop('templates_theme', None)) | |||
|
72 | f = open(src) | |||
|
73 | contents = f.read() | |||
|
74 | f.close() | |||
|
75 | ||||
|
76 | # generate source | |||
|
77 | search = 'def upgrade(migrate_engine):' | |||
|
78 | contents = contents.replace(search, '\n\n'.join((decls, search)), 1) | |||
|
79 | if upgradeCommands: | |||
|
80 | contents = contents.replace(' pass', upgradeCommands, 1) | |||
|
81 | if downgradeCommands: | |||
|
82 | contents = contents.replace(' pass', downgradeCommands, 1) | |||
|
83 | return contents | |||
|
84 | ||||
|
85 | @classmethod | |||
|
86 | def verify_module(cls, path): | |||
|
87 | """Ensure path is a valid script | |||
|
88 | ||||
|
89 | :param path: Script location | |||
|
90 | :type path: string | |||
|
91 | :raises: :exc:`InvalidScriptError <migrate.exceptions.InvalidScriptError>` | |||
|
92 | :returns: Python module | |||
|
93 | """ | |||
|
94 | # Try to import and get the upgrade() func | |||
|
95 | module = import_path(path) | |||
|
96 | try: | |||
|
97 | assert callable(module.upgrade) | |||
|
98 | except Exception, e: | |||
|
99 | raise InvalidScriptError(path + ': %s' % str(e)) | |||
|
100 | return module | |||
|
101 | ||||
|
102 | def preview_sql(self, url, step, **args): | |||
|
103 | """Mocks SQLAlchemy Engine to store all executed calls in a string | |||
|
104 | and runs :meth:`PythonScript.run <migrate.versioning.script.py.PythonScript.run>` | |||
|
105 | ||||
|
106 | :returns: SQL file | |||
|
107 | """ | |||
|
108 | buf = StringIO() | |||
|
109 | args['engine_arg_strategy'] = 'mock' | |||
|
110 | args['engine_arg_executor'] = lambda s, p = '': buf.write(str(s) + p) | |||
|
111 | ||||
|
112 | @with_engine | |||
|
113 | def go(url, step, **kw): | |||
|
114 | engine = kw.pop('engine') | |||
|
115 | self.run(engine, step) | |||
|
116 | return buf.getvalue() | |||
|
117 | ||||
|
118 | return go(url, step, **args) | |||
|
119 | ||||
|
120 | def run(self, engine, step): | |||
|
121 | """Core method of Script file. | |||
|
122 | Exectues :func:`update` or :func:`downgrade` functions | |||
|
123 | ||||
|
124 | :param engine: SQLAlchemy Engine | |||
|
125 | :param step: Operation to run | |||
|
126 | :type engine: string | |||
|
127 | :type step: int | |||
|
128 | """ | |||
|
129 | if step > 0: | |||
|
130 | op = 'upgrade' | |||
|
131 | elif step < 0: | |||
|
132 | op = 'downgrade' | |||
|
133 | else: | |||
|
134 | raise ScriptError("%d is not a valid step" % step) | |||
|
135 | ||||
|
136 | funcname = base.operations[op] | |||
|
137 | script_func = self._func(funcname) | |||
|
138 | ||||
|
139 | try: | |||
|
140 | script_func(engine) | |||
|
141 | except TypeError: | |||
|
142 | warnings.warn("upgrade/downgrade functions must accept engine" | |||
|
143 | " parameter (since version > 0.5.4)", MigrateDeprecationWarning) | |||
|
144 | raise | |||
|
145 | ||||
|
146 | @property | |||
|
147 | def module(self): | |||
|
148 | """Calls :meth:`migrate.versioning.script.py.verify_module` | |||
|
149 | and returns it. | |||
|
150 | """ | |||
|
151 | if not hasattr(self, '_module'): | |||
|
152 | self._module = self.verify_module(self.path) | |||
|
153 | return self._module | |||
|
154 | ||||
|
155 | def _func(self, funcname): | |||
|
156 | if not hasattr(self.module, funcname): | |||
|
157 | msg = "Function '%s' is not defined in this script" | |||
|
158 | raise ScriptError(msg % funcname) | |||
|
159 | return getattr(self.module, funcname) |
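`PythonScript.run` reduces to a sign check on `step` plus a `getattr` lookup of the operation function, as `_func` shows. A minimal sketch of that dispatch; `FakeScript` stands in for the imported migration module and is purely illustrative:

```python
class ScriptError(Exception):
    pass

def run_script(module, engine, step):
    # step > 0 upgrades, step < 0 downgrades, 0 is invalid -- as in PythonScript.run.
    if step > 0:
        op = 'upgrade'
    elif step < 0:
        op = 'downgrade'
    else:
        raise ScriptError("%d is not a valid step" % step)
    func = getattr(module, op, None)
    if func is None:
        # Mirrors _func: a script missing the needed operation is an error.
        raise ScriptError("Function '%s' is not defined in this script" % op)
    return func(engine)

class FakeScript:
    @staticmethod
    def upgrade(engine):
        return 'upgraded %s' % engine

    @staticmethod
    def downgrade(engine):
        return 'downgraded %s' % engine
```

The real method also warns when the function does not accept an `engine` parameter (a pre-0.5.4 script style); that compatibility shim is omitted here.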
@@ -0,0 +1,48 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | import logging | |||
|
4 | import shutil | |||
|
5 | ||||
|
6 | from rhodecode.lib.dbmigrate.migrate.versioning.script import base | |||
|
7 | from rhodecode.lib.dbmigrate.migrate.versioning.template import Template | |||
|
8 | ||||
|
9 | ||||
|
10 | log = logging.getLogger(__name__) | |||
|
11 | ||||
|
12 | class SqlScript(base.BaseScript): | |||
|
13 | """A file containing plain SQL statements.""" | |||
|
14 | ||||
|
15 | @classmethod | |||
|
16 | def create(cls, path, **opts): | |||
|
17 | """Create an empty migration script at specified path | |||
|
18 | ||||
|
19 | :returns: :class:`SqlScript instance <migrate.versioning.script.sql.SqlScript>`""" | |||
|
20 | cls.require_notfound(path) | |||
|
21 | src = Template(opts.pop('templates_path', None)).get_sql_script(theme=opts.pop('templates_theme', None)) | |||
|
22 | shutil.copy(src, path) | |||
|
23 | return cls(path) | |||
|
24 | ||||
|
25 | # TODO: why is step parameter even here? | |||
|
26 | def run(self, engine, step=None, executemany=True): | |||
|
27 | """Runs SQL script through raw dbapi execute call""" | |||
|
28 | text = self.source() | |||
|
29 | # Don't rely on SA's autocommit here | |||
|
30 | # (SA uses .startswith to check if a commit is needed. What if script | |||
|
31 | # starts with a comment?) | |||
|
32 | conn = engine.connect() | |||
|
33 | try: | |||
|
34 | trans = conn.begin() | |||
|
35 | try: | |||
|
36 | # HACK: SQLite doesn't allow multiple statements through | |||
|
37 | # its execute() method, but it provides executescript() instead | |||
|
38 | dbapi = conn.engine.raw_connection() | |||
|
39 | if executemany and getattr(dbapi, 'executescript', None): | |||
|
40 | dbapi.executescript(text) | |||
|
41 | else: | |||
|
42 | conn.execute(text) | |||
|
43 | trans.commit() | |||
|
44 | except: | |||
|
45 | trans.rollback() | |||
|
46 | raise | |||
|
47 | finally: | |||
|
48 | conn.close() |
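The SQLite workaround in `SqlScript.run` can be seen directly with the stdlib driver: `sqlite3`'s `execute()` rejects multi-statement text, while `executescript()` accepts it. A minimal sketch (no SQLAlchemy involved; the helper name and schema are illustrative):

```python
import sqlite3

def run_sql(conn, text):
    # Prefer executescript when the dbapi offers it (sqlite3 does);
    # otherwise fall back to a plain execute, as the code above does
    # through SQLAlchemy's raw_connection().
    if hasattr(conn, 'executescript'):
        conn.executescript(text)
    else:
        conn.execute(text)

conn = sqlite3.connect(':memory:')
run_sql(conn, """
CREATE TABLE repo (id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO repo (name) VALUES ('hg');
INSERT INTO repo (name) VALUES ('git');
""")
```

Note that `executescript()` commits any pending transaction before running, which is why the original wraps the whole thing in its own explicit transaction on the SQLAlchemy connection.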
@@ -0,0 +1,215 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | """The migrate command-line tool.""" | |||
|
5 | ||||
|
6 | import sys | |||
|
7 | import inspect | |||
|
8 | import logging | |||
|
9 | from optparse import OptionParser, BadOptionError | |||
|
10 | ||||
|
11 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
12 | from rhodecode.lib.dbmigrate.migrate.versioning import api | |||
|
13 | from rhodecode.lib.dbmigrate.migrate.versioning.config import * | |||
|
14 | from rhodecode.lib.dbmigrate.migrate.versioning.util import asbool | |||
|
15 | ||||
|
16 | ||||
|
17 | alias = dict( | |||
|
18 | s=api.script, | |||
|
19 | vc=api.version_control, | |||
|
20 | dbv=api.db_version, | |||
|
21 | v=api.version, | |||
|
22 | ) | |||
|
23 | ||||
|
24 | def alias_setup(): | |||
|
25 | global alias | |||
|
26 | for key, val in alias.iteritems(): | |||
|
27 | setattr(api, key, val) | |||
|
28 | alias_setup() | |||
|
29 | ||||
|
30 | ||||
|
31 | class PassiveOptionParser(OptionParser): | |||
|
32 | ||||
|
33 | def _process_args(self, largs, rargs, values): | |||
|
34 | """little hack to support all --some_option=value parameters""" | |||
|
35 | ||||
|
36 | while rargs: | |||
|
37 | arg = rargs[0] | |||
|
38 | if arg == "--": | |||
|
39 | del rargs[0] | |||
|
40 | return | |||
|
41 | elif arg[0:2] == "--": | |||
|
42 | # if parser does not know about the option | |||
|
43 | # pass it along (make it anonymous) | |||
|
44 | try: | |||
|
45 | opt = arg.split('=', 1)[0] | |||
|
46 | self._match_long_opt(opt) | |||
|
47 | except BadOptionError: | |||
|
48 | largs.append(arg) | |||
|
49 | del rargs[0] | |||
|
50 | else: | |||
|
51 | self._process_long_opt(rargs, values) | |||
|
52 | elif arg[:1] == "-" and len(arg) > 1: | |||
|
53 | self._process_short_opts(rargs, values) | |||
|
54 | elif self.allow_interspersed_args: | |||
|
55 | largs.append(arg) | |||
|
56 | del rargs[0] | |||
|
57 | ||||
|
58 | def main(argv=None, **kwargs): | |||
|
59 | """Shell interface to :mod:`migrate.versioning.api`. | |||
|
60 | ||||
|
61 | kwargs are default options that can be overriden with passing | |||
|
62 | --some_option as command line option | |||
|
63 | ||||
|
64 | :param disable_logging: if True, skip migrate's logging configuration | |||
|
65 | :type disable_logging: bool | |||
|
66 | """ | |||
|
67 | if argv is None: | |||
|
68 | argv = list(sys.argv[1:]) | |||
|
69 | else: | |||
|
70 | argv = list(argv) | |||
|
71 | commands = list(api.__all__) | |||
|
72 | commands.sort() | |||
|
73 | ||||
|
74 | usage = """%%prog COMMAND ... | |||
|
75 | ||||
|
76 | Available commands: | |||
|
77 | %s | |||
|
78 | ||||
|
79 | Enter "%%prog help COMMAND" for information on a particular command. | |||
|
80 | """ % '\n\t'.join(["%s - %s" % (command.ljust(28), | |||
|
81 | api.command_desc.get(command)) for command in commands]) | |||
|
82 | ||||
|
83 | parser = PassiveOptionParser(usage=usage) | |||
|
84 | parser.add_option("-d", "--debug", | |||
|
85 | action="store_true", | |||
|
86 | dest="debug", | |||
|
87 | default=False, | |||
|
88 | help="Shortcut to turn on DEBUG mode for logging") | |||
|
89 | parser.add_option("-q", "--disable_logging", | |||
|
90 | action="store_true", | |||
|
91 | dest="disable_logging", | |||
|
92 | default=False, | |||
|
93 | help="Use this option to disable logging configuration") | |||
|
94 | help_commands = ['help', '-h', '--help'] | |||
|
95 | HELP = False | |||
|
96 | ||||
|
97 | try: | |||
|
98 | command = argv.pop(0) | |||
|
99 | if command in help_commands: | |||
|
100 | HELP = True | |||
|
101 | command = argv.pop(0) | |||
|
102 | except IndexError: | |||
|
103 | parser.print_help() | |||
|
104 | return | |||
|
105 | ||||
|
106 | command_func = getattr(api, command, None) | |||
|
107 | if command_func is None or command.startswith('_'): | |||
|
108 | parser.error("Invalid command %s" % command) | |||
|
109 | ||||
|
110 | parser.set_usage(inspect.getdoc(command_func)) | |||
|
111 | f_args, f_varargs, f_kwargs, f_defaults = inspect.getargspec(command_func) | |||
|
112 | for arg in f_args: | |||
|
113 | parser.add_option( | |||
|
114 | "--%s" % arg, | |||
|
115 | dest=arg, | |||
|
116 | action='store', | |||
|
117 | type="string") | |||
|
118 | ||||
|
119 | # display help of the current command | |||
|
120 | if HELP: | |||
|
121 | parser.print_help() | |||
|
122 | return | |||
|
123 | ||||
|
124 | options, args = parser.parse_args(argv) | |||
|
125 | ||||
|
126 | # override kwargs with anonymous parameters | |||
|
127 | override_kwargs = dict() | |||
|
128 | for arg in list(args): | |||
|
129 | if arg.startswith('--'): | |||
|
130 | args.remove(arg) | |||
|
131 | if '=' in arg: | |||
|
132 | opt, value = arg[2:].split('=', 1) | |||
|
133 | else: | |||
|
134 | opt = arg[2:] | |||
|
135 | value = True | |||
|
136 | override_kwargs[opt] = value | |||
|
137 | ||||
|
138 | # override kwargs with options if user is overwriting | |||
|
139 | for key, value in options.__dict__.iteritems(): | |||
|
140 | if value is not None: | |||
|
141 | override_kwargs[key] = value | |||
|
142 | ||||
|
143 | # arguments that function accepts without passed kwargs | |||
|
144 | f_required = list(f_args) | |||
|
145 | candidates = dict(kwargs) | |||
|
146 | candidates.update(override_kwargs) | |||
|
147 | for key, value in candidates.iteritems(): | |||
|
148 | if key in f_args: | |||
|
149 | f_required.remove(key) | |||
|
150 | ||||
|
151 | # map function arguments to parsed arguments | |||
|
152 | for arg in args: | |||
|
153 | try: | |||
|
154 | kw = f_required.pop(0) | |||
|
155 | except IndexError: | |||
|
156 | parser.error("Too many arguments for command %s: %s" % (command, | |||
|
157 | arg)) | |||
|
158 | kwargs[kw] = arg | |||
|
159 | ||||
|
160 | # apply overrides | |||
|
161 | kwargs.update(override_kwargs) | |||
|
162 | ||||
|
163 | # configure options | |||
|
164 | for key, value in options.__dict__.iteritems(): | |||
|
165 | kwargs.setdefault(key, value) | |||
|
166 | ||||
|
167 | # configure logging | |||
|
168 | if not asbool(kwargs.pop('disable_logging', False)): | |||
|
169 | # filter to log =< INFO into stdout and rest to stderr | |||
|
170 | class SingleLevelFilter(logging.Filter): | |||
|
171 | def __init__(self, min=None, max=None): | |||
|
172 | self.min = min or 0 | |||
|
173 | self.max = max or 100 | |||
|
174 | ||||
|
175 | def filter(self, record): | |||
|
176 | return self.min <= record.levelno <= self.max | |||
|
177 | ||||
|
178 | logger = logging.getLogger() | |||
|
179 | h1 = logging.StreamHandler(sys.stdout) | |||
|
180 | f1 = SingleLevelFilter(max=logging.INFO) | |||
|
181 | h1.addFilter(f1) | |||
|
182 | h2 = logging.StreamHandler(sys.stderr) | |||
|
183 | f2 = SingleLevelFilter(min=logging.WARN) | |||
|
184 | h2.addFilter(f2) | |||
|
185 | logger.addHandler(h1) | |||
|
186 | logger.addHandler(h2) | |||
|
187 | ||||
|
188 | if options.debug: | |||
|
189 | logger.setLevel(logging.DEBUG) | |||
|
190 | else: | |||
|
191 | logger.setLevel(logging.INFO) | |||
|
192 | ||||
|
193 | log = logging.getLogger(__name__) | |||
|
194 | ||||
|
195 | # check if all args are given | |||
|
196 | try: | |||
|
197 | num_defaults = len(f_defaults) | |||
|
198 | except TypeError: | |||
|
199 | num_defaults = 0 | |||
|
200 | f_args_default = f_args[len(f_args) - num_defaults:] | |||
|
201 | required = list(set(f_required) - set(f_args_default)) | |||
|
202 | if required: | |||
|
203 | parser.error("Not enough arguments for command %s: %s not specified" \ | |||
|
204 | % (command, ', '.join(required))) | |||
|
205 | ||||
|
206 | # handle command | |||
|
207 | try: | |||
|
208 | ret = command_func(**kwargs) | |||
|
209 | if ret is not None: | |||
|
210 | log.info(ret) | |||
|
211 | except (exceptions.UsageError, exceptions.KnownError), e: | |||
|
212 | parser.error(e.args[0]) | |||
|
213 | ||||
|
214 | if __name__ == "__main__": | |||
|
215 | main() |
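The anonymous-option trick in `PassiveOptionParser` works because unrecognized `--key[=value]` tokens fall through to the positional args, where `main()` later folds them into an override dict. That post-processing loop can be sketched on its own (the helper name is illustrative, not part of the actual CLI):

```python
def collect_overrides(args):
    """Split leftover args into positionals and --key[=value] overrides, as main() does."""
    positional, overrides = [], {}
    for arg in args:
        if arg.startswith('--'):
            if '=' in arg:
                # --key=value: keep everything after the first '=' as the value.
                key, value = arg[2:].split('=', 1)
            else:
                # Bare --key acts as a boolean flag.
                key, value = arg[2:], True
            overrides[key] = value
        else:
            positional.append(arg)
    return positional, overrides
```

These overrides then win over both the `**kwargs` defaults and the declared optparse options, which is what lets any `api` function parameter be set from the command line without registering it first.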
@@ -0,0 +1,94 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | import os | |||
|
5 | import shutil | |||
|
6 | import sys | |||
|
7 | ||||
|
8 | from pkg_resources import resource_filename | |||
|
9 | ||||
|
10 | from rhodecode.lib.dbmigrate.migrate.versioning.config import * | |||
|
11 | from rhodecode.lib.dbmigrate.migrate.versioning import pathed | |||
|
12 | ||||
|
13 | ||||
|
14 | class Collection(pathed.Pathed): | |||
|
15 | """A collection of templates of a specific type""" | |||
|
16 | _mask = None | |||
|
17 | ||||
|
18 | def get_path(self, file): | |||
|
19 | return os.path.join(self.path, str(file)) | |||
|
20 | ||||
|
21 | ||||
|
22 | class RepositoryCollection(Collection): | |||
|
23 | _mask = '%s' | |||
|
24 | ||||
|
25 | class ScriptCollection(Collection): | |||
|
26 | _mask = '%s.py_tmpl' | |||
|
27 | ||||
|
28 | class ManageCollection(Collection): | |||
|
29 | _mask = '%s.py_tmpl' | |||
|
30 | ||||
|
31 | class SQLScriptCollection(Collection): | |||
|
32 | _mask = '%s.py_tmpl' | |||
|
33 | ||||
|
34 | class Template(pathed.Pathed): | |||
|
35 | """Finds the paths/packages of various Migrate templates. | |||
|
36 | ||||
|
37 | :param path: Templates are loaded from rhodecode.lib.dbmigrate.migrate package | |||
|
38 | if `path` is not provided. | |||
|
39 | """ | |||
|
40 | pkg = 'rhodecode.lib.dbmigrate.migrate.versioning.templates' | |||
|
41 | _manage = 'manage.py_tmpl' | |||
|
42 | ||||
|
43 | def __new__(cls, path=None): | |||
|
44 | if path is None: | |||
|
45 | path = cls._find_path(cls.pkg) | |||
|
46 | return super(Template, cls).__new__(cls, path) | |||
|
47 | ||||
|
48 | def __init__(self, path=None): | |||
|
49 | if path is None: | |||
|
50 | path = Template._find_path(self.pkg) | |||
|
51 | super(Template, self).__init__(path) | |||
|
52 | self.repository = RepositoryCollection(os.path.join(path, 'repository')) | |||
|
53 | self.script = ScriptCollection(os.path.join(path, 'script')) | |||
|
54 | self.manage = ManageCollection(os.path.join(path, 'manage')) | |||
|
55 | self.sql_script = SQLScriptCollection(os.path.join(path, 'sql_script')) | |||
|
56 | ||||
|
57 | @classmethod | |||
|
58 | def _find_path(cls, pkg): | |||
|
59 | """Returns absolute path to dotted python package.""" | |||
|
60 | tmp_pkg = pkg.rsplit('.', 1) | |||
|
61 | ||||
|
62 | if len(tmp_pkg) != 1: | |||
|
63 | return resource_filename(tmp_pkg[0], tmp_pkg[1]) | |||
|
64 | else: | |||
|
65 | return resource_filename(tmp_pkg[0], '') | |||
|
66 | ||||
|
67 | def _get_item(self, collection, theme=None): | |||
|
68 | """Locates and returns collection. | |||
|
69 | ||||
|
70 | :param collection: name of collection to locate | |||
|
 71 | :param theme: name of template theme in the collection (defaults to "default") | |||
|
72 | :returns: (package, source) | |||
|
73 | :rtype: str, str | |||
|
74 | """ | |||
|
75 | item = getattr(self, collection) | |||
|
76 | theme_mask = getattr(item, '_mask') | |||
|
77 | theme = theme_mask % (theme or 'default') | |||
|
78 | return item.get_path(theme) | |||
|
79 | ||||
|
80 | def get_repository(self, *a, **kw): | |||
|
81 | """Calls self._get_item('repository', *a, **kw)""" | |||
|
82 | return self._get_item('repository', *a, **kw) | |||
|
83 | ||||
|
84 | def get_script(self, *a, **kw): | |||
|
85 | """Calls self._get_item('script', *a, **kw)""" | |||
|
86 | return self._get_item('script', *a, **kw) | |||
|
87 | ||||
|
88 | def get_sql_script(self, *a, **kw): | |||
|
89 | """Calls self._get_item('sql_script', *a, **kw)""" | |||
|
90 | return self._get_item('sql_script', *a, **kw) | |||
|
91 | ||||
|
92 | def get_manage(self, *a, **kw): | |||
|
93 | """Calls self._get_item('manage', *a, **kw)""" | |||
|
94 | return self._get_item('manage', *a, **kw) |
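The `Template` class above resolves a template file by pushing a theme name through each collection's `_mask` format string and joining it onto the collection's path. A reduced sketch of that lookup pattern (the `locate` helper is illustrative, not part of the diffed code):

```python
import os

class Collection(object):
    """A directory of templates; _mask maps a theme name to a filename."""
    _mask = '%s'

    def __init__(self, path):
        self.path = path

    def get_path(self, name):
        return os.path.join(self.path, str(name))

class ScriptCollection(Collection):
    _mask = '%s.py_tmpl'

def locate(collection, theme=None):
    # Equivalent of Template._get_item: apply the mask to the theme name.
    return collection.get_path(collection._mask % (theme or 'default'))
```

Subclasses only need to override `_mask`; the path arithmetic stays in one place.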
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,5 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | from migrate.versioning.shell import main | |||
|
3 | ||||
|
4 | if __name__ == '__main__': | |||
|
5 | main(%(defaults)s) |
@@ -0,0 +1,10 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | from migrate.versioning.shell import main | |||
|
3 | ||||
|
4 | {{py: | |||
|
5 | _vars = locals().copy() | |||
|
6 | del _vars['__template_name__'] | |||
|
7 | _vars.pop('repository_name', None) | |||
|
8 | defaults = ", ".join(["%s='%s'" % var for var in _vars.iteritems()]) | |||
|
9 | }} | |||
|
10 | main({{ defaults }}) |
@@ -0,0 +1,29 b'' | |||||
|
1 | #!/usr/bin/python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | import sys | |||
|
4 | ||||
|
5 | from sqlalchemy import engine_from_config | |||
|
6 | from paste.deploy.loadwsgi import ConfigLoader | |||
|
7 | ||||
|
8 | from migrate.versioning.shell import main | |||
|
9 | from {{ locals().pop('repository_name') }}.model import migrations | |||
|
10 | ||||
|
11 | ||||
|
12 | if '-c' in sys.argv: | |||
|
13 | pos = sys.argv.index('-c') | |||
|
14 | conf_path = sys.argv[pos + 1] | |||
|
15 | del sys.argv[pos:pos + 2] | |||
|
16 | else: | |||
|
17 | conf_path = 'development.ini' | |||
|
18 | ||||
|
19 | {{py: | |||
|
20 | _vars = locals().copy() | |||
|
21 | del _vars['__template_name__'] | |||
|
22 | defaults = ", ".join(["%s='%s'" % var for var in _vars.iteritems()]) | |||
|
23 | }} | |||
|
24 | ||||
|
25 | conf_dict = ConfigLoader(conf_path).parser._sections['app:main'] | |||
|
26 | ||||
|
27 | # migrate supports passing url as an existing Engine instance (since 0.6.0) | |||
|
28 | # usage: migrate -c path/to/config.ini COMMANDS | |||
|
29 | main(url=engine_from_config(conf_dict), repository=migrations.__path__[0],{{ defaults }}) |
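The generated manage script above plucks `-c <path>` out of `sys.argv` before handing the rest to migrate, falling back to `development.ini`. That argv surgery can be factored into a small helper (the function name is illustrative):

```python
def pop_config_path(argv, default='development.ini'):
    """Remove '-c <path>' from argv (in place) and return the path,
    falling back to a default when the flag is absent."""
    if '-c' in argv:
        pos = argv.index('-c')
        path = argv[pos + 1]
        del argv[pos:pos + 2]
        return path
    return default
```

Mutating the list in place matters here: the remaining arguments are passed on to migrate's own option parser, which must not see `-c`.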
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,4 b'' | |||||
|
1 | This is a database migration repository. | |||
|
2 | ||||
|
3 | More information at | |||
|
4 | http://code.google.com/p/sqlalchemy-migrate/ |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,20 b'' | |||||
|
1 | [db_settings] | |||
|
2 | # Used to identify which repository this database is versioned under. | |||
|
3 | # You can use the name of your project. | |||
|
4 | repository_id={{ locals().pop('repository_id') }} | |||
|
5 | ||||
|
6 | # The name of the database table used to track the schema version. | |||
|
7 | # This name shouldn't already be used by your project. | |||
|
8 | # If this is changed once a database is under version control, you'll need to | |||
|
9 | # change the table name in each database too. | |||
|
10 | version_table={{ locals().pop('version_table') }} | |||
|
11 | ||||
|
12 | # When committing a change script, Migrate will attempt to generate the | |||
|
13 | # sql for all supported databases; normally, if one of them fails - probably | |||
|
14 | # because you don't have that database installed - it is ignored and the | |||
|
15 | # commit continues, perhaps ending successfully. | |||
|
16 | # Databases in this list MUST compile successfully during a commit, or the | |||
|
17 | # entire commit will fail. List the databases your application will actually | |||
|
18 | # be using to ensure your updates to that database work properly. | |||
|
19 | # This must be a list; example: ['postgres','sqlite'] | |||
|
20 | required_dbs={{ locals().pop('required_dbs') }} |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,4 b'' | |||||
|
1 | This is a database migration repository. | |||
|
2 | ||||
|
3 | More information at | |||
|
4 | http://code.google.com/p/sqlalchemy-migrate/ |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,20 b'' | |||||
|
1 | [db_settings] | |||
|
2 | # Used to identify which repository this database is versioned under. | |||
|
3 | # You can use the name of your project. | |||
|
4 | repository_id={{ locals().pop('repository_id') }} | |||
|
5 | ||||
|
6 | # The name of the database table used to track the schema version. | |||
|
7 | # This name shouldn't already be used by your project. | |||
|
8 | # If this is changed once a database is under version control, you'll need to | |||
|
9 | # change the table name in each database too. | |||
|
10 | version_table={{ locals().pop('version_table') }} | |||
|
11 | ||||
|
12 | # When committing a change script, Migrate will attempt to generate the | |||
|
13 | # sql for all supported databases; normally, if one of them fails - probably | |||
|
14 | # because you don't have that database installed - it is ignored and the | |||
|
15 | # commit continues, perhaps ending successfully. | |||
|
16 | # Databases in this list MUST compile successfully during a commit, or the | |||
|
17 | # entire commit will fail. List the databases your application will actually | |||
|
18 | # be using to ensure your updates to that database work properly. | |||
|
19 | # This must be a list; example: ['postgres','sqlite'] | |||
|
20 | required_dbs={{ locals().pop('required_dbs') }} |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,11 b'' | |||||
|
1 | from sqlalchemy import * | |||
|
2 | from migrate import * | |||
|
3 | ||||
|
4 | def upgrade(migrate_engine): | |||
|
5 | # Upgrade operations go here. Don't create your own engine; bind migrate_engine | |||
|
6 | # to your metadata | |||
|
7 | pass | |||
|
8 | ||||
|
9 | def downgrade(migrate_engine): | |||
|
10 | # Operations to reverse the above upgrade go here. | |||
|
11 | pass |
@@ -0,0 +1,11 b'' | |||||
|
1 | from sqlalchemy import * | |||
|
2 | from migrate import * | |||
|
3 | ||||
|
4 | def upgrade(migrate_engine): | |||
|
5 | # Upgrade operations go here. Don't create your own engine; bind migrate_engine | |||
|
6 | # to your metadata | |||
|
7 | pass | |||
|
8 | ||||
|
9 | def downgrade(migrate_engine): | |||
|
10 | # Operations to reverse the above upgrade go here. | |||
|
11 | pass |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,179 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | """.. currentmodule:: migrate.versioning.util""" | |||
|
4 | ||||
|
5 | import warnings | |||
|
6 | import logging | |||
|
7 | from decorator import decorator | |||
|
8 | from pkg_resources import EntryPoint | |||
|
9 | ||||
|
10 | from sqlalchemy import create_engine | |||
|
11 | from sqlalchemy.engine import Engine | |||
|
12 | from sqlalchemy.pool import StaticPool | |||
|
13 | ||||
|
14 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
15 | from rhodecode.lib.dbmigrate.migrate.versioning.util.keyedinstance import KeyedInstance | |||
|
16 | from rhodecode.lib.dbmigrate.migrate.versioning.util.importpath import import_path | |||
|
17 | ||||
|
18 | ||||
|
19 | log = logging.getLogger(__name__) | |||
|
20 | ||||
|
21 | def load_model(dotted_name): | |||
|
22 | """Import module and use module-level variable". | |||
|
23 | ||||
|
24 | :param dotted_name: path to model in form of string: ``some.python.module:Class`` | |||
|
25 | ||||
|
26 | .. versionchanged:: 0.5.4 | |||
|
27 | ||||
|
28 | """ | |||
|
29 | if isinstance(dotted_name, basestring): | |||
|
30 | if ':' not in dotted_name: | |||
|
31 | # backwards compatibility | |||
|
32 | warnings.warn('model should be in form of module.model:User ' | |||
|
33 | 'and not module.model.User', exceptions.MigrateDeprecationWarning) | |||
|
34 | dotted_name = ':'.join(dotted_name.rsplit('.', 1)) | |||
|
35 | return EntryPoint.parse('x=%s' % dotted_name).load(False) | |||
|
36 | else: | |||
|
37 | # Assume it's already loaded. | |||
|
38 | return dotted_name | |||
|
39 | ||||
|
40 | def asbool(obj): | |||
|
41 | """Do everything to use object as bool""" | |||
|
42 | if isinstance(obj, basestring): | |||
|
43 | obj = obj.strip().lower() | |||
|
44 | if obj in ['true', 'yes', 'on', 'y', 't', '1']: | |||
|
45 | return True | |||
|
46 | elif obj in ['false', 'no', 'off', 'n', 'f', '0']: | |||
|
47 | return False | |||
|
48 | else: | |||
|
49 | raise ValueError("String is not true/false: %r" % obj) | |||
|
50 | if obj in (True, False): | |||
|
51 | return bool(obj) | |||
|
52 | else: | |||
|
53 | raise ValueError("String is not true/false: %r" % obj) | |||
|
54 | ||||
|
55 | def guess_obj_type(obj): | |||
|
56 | """Do everything to guess object type from string | |||
|
57 | ||||
|
 58 | Tries to convert to `int`, then `bool`, and returns the input unchanged if neither succeeds. | |||
|
59 | ||||
|
60 | .. versionadded: 0.5.4 | |||
|
61 | """ | |||
|
62 | ||||
|
63 | result = None | |||
|
64 | ||||
|
65 | try: | |||
|
66 | result = int(obj) | |||
|
67 | except: | |||
|
68 | pass | |||
|
69 | ||||
|
70 | if result is None: | |||
|
71 | try: | |||
|
72 | result = asbool(obj) | |||
|
73 | except: | |||
|
74 | pass | |||
|
75 | ||||
|
76 | if result is not None: | |||
|
77 | return result | |||
|
78 | else: | |||
|
79 | return obj | |||
|
80 | ||||
|
81 | @decorator | |||
|
82 | def catch_known_errors(f, *a, **kw): | |||
|
83 | """Decorator that catches known api errors | |||
|
84 | ||||
|
85 | .. versionadded: 0.5.4 | |||
|
86 | """ | |||
|
87 | ||||
|
88 | try: | |||
|
89 | return f(*a, **kw) | |||
|
90 | except exceptions.PathFoundError, e: | |||
|
91 | raise exceptions.KnownError("The path %s already exists" % e.args[0]) | |||
|
92 | ||||
|
93 | def construct_engine(engine, **opts): | |||
|
94 | """.. versionadded:: 0.5.4 | |||
|
95 | ||||
|
96 | Constructs and returns SQLAlchemy engine. | |||
|
97 | ||||
|
98 | Currently, there are 2 ways to pass create_engine options to :mod:`migrate.versioning.api` functions: | |||
|
99 | ||||
|
 100 | :param engine: connection string or an existing engine | |||
|
101 | :param engine_dict: python dictionary of options to pass to `create_engine` | |||
|
102 | :param engine_arg_*: keyword parameters to pass to `create_engine` (evaluated with :func:`migrate.versioning.util.guess_obj_type`) | |||
|
103 | :type engine_dict: dict | |||
|
104 | :type engine: string or Engine instance | |||
|
105 | :type engine_arg_*: string | |||
|
106 | :returns: SQLAlchemy Engine | |||
|
107 | ||||
|
108 | .. note:: | |||
|
109 | ||||
|
110 | keyword parameters override ``engine_dict`` values. | |||
|
111 | ||||
|
112 | """ | |||
|
113 | if isinstance(engine, Engine): | |||
|
114 | return engine | |||
|
115 | elif not isinstance(engine, basestring): | |||
|
116 | raise ValueError("you need to pass either an existing engine or a database uri") | |||
|
117 | ||||
|
118 | # get options for create_engine | |||
|
119 | if opts.get('engine_dict') and isinstance(opts['engine_dict'], dict): | |||
|
120 | kwargs = opts['engine_dict'] | |||
|
121 | else: | |||
|
122 | kwargs = dict() | |||
|
123 | ||||
|
124 | # DEPRECATED: handle echo the old way | |||
|
125 | echo = asbool(opts.get('echo', False)) | |||
|
126 | if echo: | |||
|
127 | warnings.warn('echo=True parameter is deprecated, pass ' | |||
|
128 | 'engine_arg_echo=True or engine_dict={"echo": True}', | |||
|
129 | exceptions.MigrateDeprecationWarning) | |||
|
130 | kwargs['echo'] = echo | |||
|
131 | ||||
|
132 | # parse keyword arguments | |||
|
133 | for key, value in opts.iteritems(): | |||
|
134 | if key.startswith('engine_arg_'): | |||
|
135 | kwargs[key[11:]] = guess_obj_type(value) | |||
|
136 | ||||
|
137 | log.debug('Constructing engine') | |||
|
138 | # TODO: return create_engine(engine, poolclass=StaticPool, **kwargs) | |||
|
139 | # seems like 0.5.x branch does not work with engine.dispose and staticpool | |||
|
140 | return create_engine(engine, **kwargs) | |||
|
141 | ||||
|
142 | @decorator | |||
|
143 | def with_engine(f, *a, **kw): | |||
|
144 | """Decorator for :mod:`migrate.versioning.api` functions | |||
|
145 | to safely close resources after function usage. | |||
|
146 | ||||
|
147 | Passes engine parameters to :func:`construct_engine` and | |||
|
148 | resulting parameter is available as kw['engine']. | |||
|
149 | ||||
|
150 | Engine is disposed after wrapped function is executed. | |||
|
151 | ||||
|
152 | .. versionadded: 0.6.0 | |||
|
153 | """ | |||
|
154 | url = a[0] | |||
|
155 | engine = construct_engine(url, **kw) | |||
|
156 | ||||
|
157 | try: | |||
|
158 | kw['engine'] = engine | |||
|
159 | return f(*a, **kw) | |||
|
160 | finally: | |||
|
161 | if isinstance(engine, Engine): | |||
|
162 | log.debug('Disposing SQLAlchemy engine %s', engine) | |||
|
163 | engine.dispose() | |||
|
164 | ||||
|
165 | ||||
|
166 | class Memoize: | |||
|
167 | """Memoize(fn) - an instance which acts like fn but memoizes its arguments | |||
|
168 | Will only work on functions with non-mutable arguments | |||
|
169 | ||||
|
170 | ActiveState Code 52201 | |||
|
171 | """ | |||
|
172 | def __init__(self, fn): | |||
|
173 | self.fn = fn | |||
|
174 | self.memo = {} | |||
|
175 | ||||
|
176 | def __call__(self, *args): | |||
|
177 | if not self.memo.has_key(args): | |||
|
178 | self.memo[args] = self.fn(*args) | |||
|
179 | return self.memo[args] |
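The `Memoize` helper at the end of the file caches results keyed on positional arguments via a plain dict (today one would reach for `functools.lru_cache`, but the shape is the same). A Python 3 sketch with a usage example (`square` and `calls` are illustrative):

```python
class Memoize(object):
    """Cache fn's results keyed on its (hashable) positional arguments."""
    def __init__(self, fn):
        self.fn = fn
        self.memo = {}

    def __call__(self, *args):
        if args not in self.memo:   # dict.has_key() is gone in Python 3
            self.memo[args] = self.fn(*args)
        return self.memo[args]

calls = []

@Memoize
def square(x):
    calls.append(x)   # track how often the wrapped function really runs
    return x * x
```

As the original docstring notes, this only works when every argument is hashable; mutable arguments raise `TypeError` at the dict lookup.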
@@ -0,0 +1,16 b'' | |||||
|
1 | import os | |||
|
2 | import sys | |||
|
3 | ||||
|
4 | def import_path(fullpath): | |||
|
5 | """ Import a file with full path specification. Allows one to | |||
|
6 | import from anywhere, something __import__ does not do. | |||
|
7 | """ | |||
|
8 | # http://zephyrfalcon.org/weblog/arch_d7_2002_08_31.html | |||
|
9 | path, filename = os.path.split(fullpath) | |||
|
10 | filename, ext = os.path.splitext(filename) | |||
|
11 | sys.path.append(path) | |||
|
12 | module = __import__(filename) | |||
|
13 | reload(module) # Might be out of date during tests | |||
|
14 | del sys.path[-1] | |||
|
15 | return module | |||
|
16 |
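`import_path` above works by temporarily appending to `sys.path` and calling `reload()`, which is Python 2 idiom. A hedged modern equivalent (Python 3) loads straight from the file path via `importlib.util`, without touching `sys.path`:

```python
import importlib.util
import os

def import_path(fullpath):
    """Load a module from an absolute file path without touching sys.path."""
    name = os.path.splitext(os.path.basename(fullpath))[0]
    spec = importlib.util.spec_from_file_location(name, fullpath)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```

Because nothing is appended to `sys.path`, two files with the same basename in different directories no longer risk shadowing each other.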
@@ -0,0 +1,36 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | class KeyedInstance(object): | |||
|
5 | """A class whose instances have a unique identifier of some sort | |||
|
6 | No two instances with the same unique ID should exist - if we try to create | |||
|
7 | a second instance, the first should be returned. | |||
|
8 | """ | |||
|
9 | ||||
|
10 | _instances = dict() | |||
|
11 | ||||
|
12 | def __new__(cls, *p, **k): | |||
|
13 | instances = cls._instances | |||
|
14 | clskey = str(cls) | |||
|
15 | if clskey not in instances: | |||
|
16 | instances[clskey] = dict() | |||
|
17 | instances = instances[clskey] | |||
|
18 | ||||
|
19 | key = cls._key(*p, **k) | |||
|
20 | if key not in instances: | |||
|
21 | instances[key] = super(KeyedInstance, cls).__new__(cls) | |||
|
22 | return instances[key] | |||
|
23 | ||||
|
24 | @classmethod | |||
|
25 | def _key(cls, *p, **k): | |||
|
26 | """Given a unique identifier, return a dictionary key | |||
|
27 | This should be overridden by child classes, to specify which parameters | |||
|
28 | should determine an object's uniqueness | |||
|
29 | """ | |||
|
30 | raise NotImplementedError() | |||
|
31 | ||||
|
32 | @classmethod | |||
|
33 | def clear(cls): | |||
|
34 | # Allow cls.clear() as well as uniqueInstance.clear(cls) | |||
|
35 | if str(cls) in cls._instances: | |||
|
36 | del cls._instances[str(cls)] |
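`KeyedInstance` above implements an instance-per-key cache in `__new__`: constructing an object with a key that already exists hands back the existing object. A compact sketch with a hypothetical subclass (`Repo` is illustrative; note that `__init__` still runs on every construction, even for a cached instance):

```python
class KeyedInstance(object):
    """Return an existing instance when one with the same key already exists."""
    _instances = {}

    def __new__(cls, *p, **k):
        per_class = cls._instances.setdefault(str(cls), {})
        key = cls._key(*p, **k)
        if key not in per_class:
            per_class[key] = super(KeyedInstance, cls).__new__(cls)
        return per_class[key]

    @classmethod
    def _key(cls, *p, **k):
        raise NotImplementedError()

class Repo(KeyedInstance):
    @classmethod
    def _key(cls, path):
        return path

    def __init__(self, path):
        self.path = path   # re-runs on cache hits too
```

Subclasses define uniqueness by overriding `_key`; here two `Repo` objects built from the same path are literally the same object.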
@@ -0,0 +1,215 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # -*- coding: utf-8 -*- | |||
|
3 | ||||
|
4 | import os | |||
|
5 | import re | |||
|
6 | import shutil | |||
|
7 | import logging | |||
|
8 | ||||
|
9 | from rhodecode.lib.dbmigrate.migrate import exceptions | |||
|
10 | from rhodecode.lib.dbmigrate.migrate.versioning import pathed, script | |||
|
11 | ||||
|
12 | ||||
|
13 | log = logging.getLogger(__name__) | |||
|
14 | ||||
|
15 | class VerNum(object): | |||
|
16 | """A version number that behaves like a string and int at the same time""" | |||
|
17 | ||||
|
18 | _instances = dict() | |||
|
19 | ||||
|
20 | def __new__(cls, value): | |||
|
21 | val = str(value) | |||
|
22 | if val not in cls._instances: | |||
|
23 | cls._instances[val] = super(VerNum, cls).__new__(cls) | |||
|
24 | ret = cls._instances[val] | |||
|
25 | return ret | |||
|
26 | ||||
|
27 | def __init__(self,value): | |||
|
28 | self.value = str(int(value)) | |||
|
29 | if self < 0: | |||
|
30 | raise ValueError("Version number cannot be negative") | |||
|
31 | ||||
|
32 | def __add__(self, value): | |||
|
33 | ret = int(self) + int(value) | |||
|
34 | return VerNum(ret) | |||
|
35 | ||||
|
36 | def __sub__(self, value): | |||
|
37 | return self + (int(value) * -1) | |||
|
38 | ||||
|
39 | def __cmp__(self, value): | |||
|
40 | return int(self) - int(value) | |||
|
41 | ||||
|
42 | def __repr__(self): | |||
|
43 | return "<VerNum(%s)>" % self.value | |||
|
44 | ||||
|
45 | def __str__(self): | |||
|
46 | return str(self.value) | |||
|
47 | ||||
|
48 | def __int__(self): | |||
|
49 | return int(self.value) | |||
|
50 | ||||
|
51 | ||||
|
52 | class Collection(pathed.Pathed): | |||
|
53 | """A collection of versioning scripts in a repository""" | |||
|
54 | ||||
|
55 | FILENAME_WITH_VERSION = re.compile(r'^(\d{3,}).*') | |||
|
56 | ||||
|
57 | def __init__(self, path): | |||
|
58 | """Collect current version scripts in repository | |||
|
59 | and store them in self.versions | |||
|
60 | """ | |||
|
61 | super(Collection, self).__init__(path) | |||
|
62 | ||||
|
63 | # Create temporary list of files, allowing skipped version numbers. | |||
|
64 | files = os.listdir(path) | |||
|
65 | if '1' in files: | |||
|
66 | # deprecation | |||
|
67 | raise Exception('It looks like you have a repository in the old ' | |||
|
68 | 'format (with directories for each version). ' | |||
|
69 | 'Please convert repository before proceeding.') | |||
|
70 | ||||
|
71 | tempVersions = dict() | |||
|
72 | for filename in files: | |||
|
73 | match = self.FILENAME_WITH_VERSION.match(filename) | |||
|
74 | if match: | |||
|
75 | num = int(match.group(1)) | |||
|
76 | tempVersions.setdefault(num, []).append(filename) | |||
|
77 | else: | |||
|
78 | pass # Must be a helper file or something, let's ignore it. | |||
|
79 | ||||
|
80 | # Create the versions member where the keys | |||
|
81 | # are VerNum's and the values are Version's. | |||
|
82 | self.versions = dict() | |||
|
83 | for num, files in tempVersions.items(): | |||
|
84 | self.versions[VerNum(num)] = Version(num, path, files) | |||
|
85 | ||||
|
86 | @property | |||
|
87 | def latest(self): | |||
|
88 | """:returns: Latest version in Collection""" | |||
|
89 | return max([VerNum(0)] + self.versions.keys()) | |||
|
90 | ||||
|
91 | def create_new_python_version(self, description, **k): | |||
|
92 | """Create Python files for new version""" | |||
|
93 | ver = self.latest + 1 | |||
|
94 | extra = str_to_filename(description) | |||
|
95 | ||||
|
96 | if extra: | |||
|
97 | if extra == '_': | |||
|
98 | extra = '' | |||
|
99 | elif not extra.startswith('_'): | |||
|
100 | extra = '_%s' % extra | |||
|
101 | ||||
|
102 | filename = '%03d%s.py' % (ver, extra) | |||
|
103 | filepath = self._version_path(filename) | |||
|
104 | ||||
|
105 | script.PythonScript.create(filepath, **k) | |||
|
106 | self.versions[ver] = Version(ver, self.path, [filename]) | |||
|
107 | ||||
|
108 | def create_new_sql_version(self, database, **k): | |||
|
109 | """Create SQL files for new version""" | |||
|
110 | ver = self.latest + 1 | |||
|
111 | self.versions[ver] = Version(ver, self.path, []) | |||
|
112 | ||||
|
113 | # Create new files. | |||
|
114 | for op in ('upgrade', 'downgrade'): | |||
|
115 | filename = '%03d_%s_%s.sql' % (ver, database, op) | |||
|
116 | filepath = self._version_path(filename) | |||
|
117 | script.SqlScript.create(filepath, **k) | |||
|
118 | self.versions[ver].add_script(filepath) | |||
|
119 | ||||
|
120 | def version(self, vernum=None): | |||
|
121 | """Returns latest Version if vernum is not given. | |||
|
122 | Otherwise, returns wanted version""" | |||
|
123 | if vernum is None: | |||
|
124 | vernum = self.latest | |||
|
125 | return self.versions[VerNum(vernum)] | |||
|
126 | ||||
|
127 | @classmethod | |||
|
128 | def clear(cls): | |||
|
129 | super(Collection, cls).clear() | |||
|
130 | ||||
|
131 | def _version_path(self, ver): | |||
|
132 | """Returns path of file in versions repository""" | |||
|
133 | return os.path.join(self.path, str(ver)) | |||
|
134 | ||||
|
135 | ||||
|
136 | class Version(object): | |||
|
137 | """A single version in a collection | |||
|
138 | :param vernum: Version Number | |||
|
139 | :param path: Path to script files | |||
|
140 | :param filelist: List of scripts | |||
|
141 | :type vernum: int, VerNum | |||
|
142 | :type path: string | |||
|
143 | :type filelist: list | |||
|
144 | """ | |||
|
145 | ||||
|
146 | def __init__(self, vernum, path, filelist): | |||
|
147 | self.version = VerNum(vernum) | |||
|
148 | ||||
|
149 | # Collect scripts in this folder | |||
|
150 | self.sql = dict() | |||
|
151 | self.python = None | |||
|
152 | ||||
|
153 | for script in filelist: | |||
|
154 | self.add_script(os.path.join(path, script)) | |||
|
155 | ||||
|
156 | def script(self, database=None, operation=None): | |||
|
157 | """Returns SQL or Python Script""" | |||
|
158 | for db in (database, 'default'): | |||
|
159 | # Try to return a .sql script first | |||
|
160 | try: | |||
|
161 | return self.sql[db][operation] | |||
|
162 | except KeyError: | |||
|
163 | continue # No .sql script exists | |||
|
164 | ||||
|
165 | # TODO: maybe add force Python parameter? | |||
|
166 | ret = self.python | |||
|
167 | ||||
|
168 | assert ret is not None, \ | |||
|
169 | "There is no script for %d version" % self.version | |||
|
170 | return ret | |||
|
171 | ||||
|
172 | def add_script(self, path): | |||
|
173 | """Add script to Collection/Version""" | |||
|
174 | if path.endswith(Extensions.py): | |||
|
175 | self._add_script_py(path) | |||
|
176 | elif path.endswith(Extensions.sql): | |||
|
177 | self._add_script_sql(path) | |||
|
178 | ||||
|
179 | SQL_FILENAME = re.compile(r'^(\d+)_([^_]+)_([^_]+).sql') | |||
|
180 | ||||
|
181 | def _add_script_sql(self, path): | |||
|
182 | basename = os.path.basename(path) | |||
|
183 | match = self.SQL_FILENAME.match(basename) | |||
|
184 | ||||
|
185 | if match: | |||
|
186 | version, dbms, op = match.group(1), match.group(2), match.group(3) | |||
|
187 | else: | |||
|
188 | raise exceptions.ScriptError( | |||
|
189 | "Invalid SQL script name %s " % basename + \ | |||
|
190 | "(needs to be ###_database_operation.sql)") | |||
|
191 | ||||
|
192 | # File the script into a dictionary | |||
|
193 | self.sql.setdefault(dbms, {})[op] = script.SqlScript(path) | |||
|
194 | ||||
|
195 | def _add_script_py(self, path): | |||
|
196 | if self.python is not None: | |||
|
197 | raise exceptions.ScriptError('You can only have one Python script ' | |||
|
198 | 'per version, but you have: %s and %s' % (self.python, path)) | |||
|
199 | self.python = script.PythonScript(path) | |||
|
200 | ||||
|
201 | ||||
|
202 | class Extensions: | |||
|
203 | """A namespace for file extensions""" | |||
|
204 | py = 'py' | |||
|
205 | sql = 'sql' | |||
|
206 | ||||
|
207 | def str_to_filename(s): | |||
|
208 | """Replaces spaces, (double and single) quotes | |||
|
209 | and double underscores to underscores | |||
|
210 | """ | |||
|
211 | ||||
|
212 | s = s.replace(' ', '_').replace('"', '_').replace("'", '_').replace(".", "_") | |||
|
213 | while '__' in s: | |||
|
214 | s = s.replace('__', '_') | |||
|
215 | return s |
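`create_new_python_version` above builds `'%03d%s.py'` filenames from a slugged description, and `Collection` later recognizes them with the `FILENAME_WITH_VERSION` regex. The slugging and naming logic, extracted into a standalone sketch (`version_filename` is an illustrative helper, not a function in the diffed file):

```python
import re

FILENAME_WITH_VERSION = re.compile(r'^(\d{3,}).*')

def str_to_filename(s):
    """Collapse spaces, quotes, dots and double underscores into underscores."""
    s = s.replace(' ', '_').replace('"', '_').replace("'", '_').replace('.', '_')
    while '__' in s:
        s = s.replace('__', '_')
    return s

def version_filename(ver, description):
    """Build the zero-padded script name for a new migration version."""
    extra = str_to_filename(description)
    if extra:
        if extra == '_':
            extra = ''
        elif not extra.startswith('_'):
            extra = '_%s' % extra
    return '%03d%s.py' % (ver, extra)
```

The three-digit zero padding keeps lexicographic and numeric ordering in agreement for the first 999 versions, which is why `FILENAME_WITH_VERSION` insists on at least three leading digits.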
@@ -0,0 +1,237 b'' | |||||
|
1 | #============================================================================== | |||
|
2 | # DB INITIAL MODEL | |||
|
3 | #============================================================================== | |||
|
4 | import logging | |||
|
5 | import datetime | |||
|
6 | ||||
|
7 | from sqlalchemy import * | |||
|
8 | from sqlalchemy.exc import DatabaseError | |||
|
9 | from sqlalchemy.orm import relation, backref, class_mapper | |||
|
10 | from sqlalchemy.orm.session import Session | |||
|
11 | from rhodecode.model.meta import Base | |||
|
12 | ||||
|
13 | from rhodecode.lib.dbmigrate.migrate import * | |||
|
14 | ||||
|
15 | log = logging.getLogger(__name__) | |||
|
16 | ||||
|
17 | class BaseModel(object): | |||
|
18 | ||||
|
19 | @classmethod | |||
|
20 | def _get_keys(cls): | |||
|
21 | """return column names for this model """ | |||
|
22 | return class_mapper(cls).c.keys() | |||
|
23 | ||||
|
24 | def get_dict(self): | |||
|
25 | """return dict with keys and values corresponding | |||
|
26 | to this model data """ | |||
|
27 | ||||
|
28 | d = {} | |||
|
29 | for k in self._get_keys(): | |||
|
30 | d[k] = getattr(self, k) | |||
|
31 | return d | |||
|
32 | ||||
|
33 | def get_appstruct(self): | |||
|
34 | """return list with keys and values tupples corresponding | |||
|
35 | to this model data """ | |||
|
36 | ||||
|
37 | l = [] | |||
|
38 | for k in self._get_keys(): | |||
|
39 | l.append((k, getattr(self, k),)) | |||
|
40 | return l | |||
|
41 | ||||
|
42 | def populate_obj(self, populate_dict): | |||
|
43 | """populate model with data from given populate_dict""" | |||
|
44 | ||||
|
45 | for k in self._get_keys(): | |||
|
46 | if k in populate_dict: | |||
|
47 | setattr(self, k, populate_dict[k]) | |||
|
48 | ||||
|
49 | class RhodeCodeSettings(Base, BaseModel): | |||
|
50 | __tablename__ = 'rhodecode_settings' | |||
|
51 | __table_args__ = (UniqueConstraint('app_settings_name'), {'useexisting':True}) | |||
|
52 | app_settings_id = Column("app_settings_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
53 | app_settings_name = Column("app_settings_name", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
54 | app_settings_value = Column("app_settings_value", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
55 | ||||
|
56 | def __init__(self, k, v): | |||
|
57 | self.app_settings_name = k | |||
|
58 | self.app_settings_value = v | |||
|
59 | ||||
|
60 | def __repr__(self): | |||
|
61 | return "<RhodeCodeSetting('%s:%s')>" % (self.app_settings_name, | |||
|
62 | self.app_settings_value) | |||
|
63 | ||||
|
64 | class RhodeCodeUi(Base, BaseModel): | |||
|
65 | __tablename__ = 'rhodecode_ui' | |||
|
66 | __table_args__ = {'useexisting':True} | |||
|
67 | ui_id = Column("ui_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
68 | ui_section = Column("ui_section", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
69 | ui_key = Column("ui_key", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
70 | ui_value = Column("ui_value", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
71 | ui_active = Column("ui_active", Boolean(), nullable=True, unique=None, default=True) | |||
|
72 | ||||
|
73 | ||||
|
74 | class User(Base, BaseModel): | |||
|
75 | __tablename__ = 'users' | |||
|
76 | __table_args__ = (UniqueConstraint('username'), UniqueConstraint('email'), {'useexisting':True}) | |||
|
77 | user_id = Column("user_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
78 | username = Column("username", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
79 | password = Column("password", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
80 | active = Column("active", Boolean(), nullable=True, unique=None, default=None) | |||
|
81 | admin = Column("admin", Boolean(), nullable=True, unique=None, default=False) | |||
|
82 | name = Column("name", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
83 | lastname = Column("lastname", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
84 | email = Column("email", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
85 | last_login = Column("last_login", DateTime(timezone=False), nullable=True, unique=None, default=None) | |||
|
86 | is_ldap = Column("is_ldap", Boolean(), nullable=False, unique=None, default=False) | |||
|
87 | ||||
|
88 | user_log = relation('UserLog', cascade='all') | |||
|
89 | user_perms = relation('UserToPerm', primaryjoin="User.user_id==UserToPerm.user_id", cascade='all') | |||
|
90 | ||||
|
91 | repositories = relation('Repository') | |||
|
92 | user_followers = relation('UserFollowing', primaryjoin='UserFollowing.follows_user_id==User.user_id', cascade='all') | |||
|
93 | ||||
|
94 | @property | |||
|
95 | def full_contact(self): | |||
|
96 | return '%s %s <%s>' % (self.name, self.lastname, self.email) | |||
|
97 | ||||
|
98 | def __repr__(self): | |||
|
99 | return "<User('id:%s:%s')>" % (self.user_id, self.username) | |||
|
100 | ||||
|
101 | def update_lastlogin(self): | |||
|
102 | """Update user lastlogin""" | |||
|
103 | ||||
|
104 | try: | |||
|
105 | session = Session.object_session(self) | |||
|
106 | self.last_login = datetime.datetime.now() | |||
|
107 | session.add(self) | |||
|
108 | session.commit() | |||
|
109 | log.debug('updated user %s lastlogin', self.username) | |||
|
110 | except (DatabaseError,): | |||
|
111 | session.rollback() | |||
|
112 | ||||
|
113 | ||||
|
114 | class UserLog(Base, BaseModel): | |||
|
115 | __tablename__ = 'user_logs' | |||
|
116 | __table_args__ = {'useexisting':True} | |||
|
117 | user_log_id = Column("user_log_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
118 | user_id = Column("user_id", Integer(), ForeignKey(u'users.user_id'), nullable=False, unique=None, default=None) | |||
|
119 | repository_id = Column("repository_id", Integer(length=None, convert_unicode=False, assert_unicode=None), ForeignKey(u'repositories.repo_id'), nullable=False, unique=None, default=None) | |||
|
120 | repository_name = Column("repository_name", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
121 | user_ip = Column("user_ip", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
122 | action = Column("action", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
123 | action_date = Column("action_date", DateTime(timezone=False), nullable=True, unique=None, default=None) | |||
|
124 | ||||
|
125 | user = relation('User') | |||
|
126 | repository = relation('Repository') | |||
|
127 | ||||
|
128 | class Repository(Base, BaseModel): | |||
|
129 | __tablename__ = 'repositories' | |||
|
130 | __table_args__ = (UniqueConstraint('repo_name'), {'useexisting':True},) | |||
|
131 | repo_id = Column("repo_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
132 | repo_name = Column("repo_name", String(length=None, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None) | |||
|
133 | repo_type = Column("repo_type", String(length=None, convert_unicode=False, assert_unicode=None), nullable=False, unique=False, default=None) | |||
|
134 | user_id = Column("user_id", Integer(), ForeignKey(u'users.user_id'), nullable=False, unique=False, default=None) | |||
|
135 | private = Column("private", Boolean(), nullable=True, unique=None, default=None) | |||
|
136 | enable_statistics = Column("statistics", Boolean(), nullable=True, unique=None, default=True) | |||
|
137 | description = Column("description", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
138 | fork_id = Column("fork_id", Integer(), ForeignKey(u'repositories.repo_id'), nullable=True, unique=False, default=None) | |||
|
139 | ||||
|
140 | user = relation('User') | |||
|
141 | fork = relation('Repository', remote_side=repo_id) | |||
|
142 | repo_to_perm = relation('RepoToPerm', cascade='all') | |||
|
143 | stats = relation('Statistics', cascade='all', uselist=False) | |||
|
144 | ||||
|
145 | repo_followers = relation('UserFollowing', primaryjoin='UserFollowing.follows_repo_id==Repository.repo_id', cascade='all') | |||
|
146 | ||||
|
147 | ||||
|
148 | def __repr__(self): | |||
|
149 | return "<Repository('%s:%s')>" % (self.repo_id, self.repo_name) | |||
|
150 | ||||
|
151 | class Permission(Base, BaseModel): | |||
|
152 | __tablename__ = 'permissions' | |||
|
153 | __table_args__ = {'useexisting':True} | |||
|
154 | permission_id = Column("permission_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
155 | permission_name = Column("permission_name", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
156 | permission_longname = Column("permission_longname", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
157 | ||||
|
158 | def __repr__(self): | |||
|
159 | return "<Permission('%s:%s')>" % (self.permission_id, self.permission_name) | |||
|
160 | ||||
|
161 | class RepoToPerm(Base, BaseModel): | |||
|
162 | __tablename__ = 'repo_to_perm' | |||
|
163 | __table_args__ = (UniqueConstraint('user_id', 'repository_id'), {'useexisting':True}) | |||
|
164 | repo_to_perm_id = Column("repo_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
165 | user_id = Column("user_id", Integer(), ForeignKey(u'users.user_id'), nullable=False, unique=None, default=None) | |||
|
166 | permission_id = Column("permission_id", Integer(), ForeignKey(u'permissions.permission_id'), nullable=False, unique=None, default=None) | |||
|
167 | repository_id = Column("repository_id", Integer(), ForeignKey(u'repositories.repo_id'), nullable=False, unique=None, default=None) | |||
|
168 | ||||
|
169 | user = relation('User') | |||
|
170 | permission = relation('Permission') | |||
|
171 | repository = relation('Repository') | |||
|
172 | ||||
|
173 | class UserToPerm(Base, BaseModel): | |||
|
174 | __tablename__ = 'user_to_perm' | |||
|
175 | __table_args__ = (UniqueConstraint('user_id', 'permission_id'), {'useexisting':True}) | |||
|
176 | user_to_perm_id = Column("user_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
177 | user_id = Column("user_id", Integer(), ForeignKey(u'users.user_id'), nullable=False, unique=None, default=None) | |||
|
178 | permission_id = Column("permission_id", Integer(), ForeignKey(u'permissions.permission_id'), nullable=False, unique=None, default=None) | |||
|
179 | ||||
|
180 | user = relation('User') | |||
|
181 | permission = relation('Permission') | |||
|
182 | ||||
|
183 | class Statistics(Base, BaseModel): | |||
|
184 | __tablename__ = 'statistics' | |||
|
185 | __table_args__ = (UniqueConstraint('repository_id'), {'useexisting':True}) | |||
|
186 | stat_id = Column("stat_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
187 | repository_id = Column("repository_id", Integer(), ForeignKey(u'repositories.repo_id'), nullable=False, unique=True, default=None) | |||
|
188 | stat_on_revision = Column("stat_on_revision", Integer(), nullable=False) | |||
|
189 | commit_activity = Column("commit_activity", LargeBinary(), nullable=False)#JSON data | |||
|
190 | commit_activity_combined = Column("commit_activity_combined", LargeBinary(), nullable=False)#JSON data | |||
|
191 | languages = Column("languages", LargeBinary(), nullable=False)#JSON data | |||
|
192 | ||||
|
193 | repository = relation('Repository', single_parent=True) | |||
|
194 | ||||
|
195 | class UserFollowing(Base, BaseModel): | |||
|
196 | __tablename__ = 'user_followings' | |||
|
197 | __table_args__ = (UniqueConstraint('user_id', 'follows_repository_id'), | |||
|
198 | UniqueConstraint('user_id', 'follows_user_id') | |||
|
199 | , {'useexisting':True}) | |||
|
200 | ||||
|
201 | user_following_id = Column("user_following_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
202 | user_id = Column("user_id", Integer(), ForeignKey(u'users.user_id'), nullable=False, unique=None, default=None) | |||
|
203 | follows_repo_id = Column("follows_repository_id", Integer(), ForeignKey(u'repositories.repo_id'), nullable=True, unique=None, default=None) | |||
|
204 | follows_user_id = Column("follows_user_id", Integer(), ForeignKey(u'users.user_id'), nullable=True, unique=None, default=None) | |||
|
205 | ||||
|
206 | user = relation('User', primaryjoin='User.user_id==UserFollowing.user_id') | |||
|
207 | ||||
|
208 | follows_user = relation('User', primaryjoin='User.user_id==UserFollowing.follows_user_id') | |||
|
209 | follows_repository = relation('Repository') | |||
|
210 | ||||
|
211 | ||||
|
212 | class CacheInvalidation(Base, BaseModel): | |||
|
213 | __tablename__ = 'cache_invalidation' | |||
|
214 | __table_args__ = (UniqueConstraint('cache_key'), {'useexisting':True}) | |||
|
215 | cache_id = Column("cache_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
216 | cache_key = Column("cache_key", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
217 | cache_args = Column("cache_args", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
218 | cache_active = Column("cache_active", Boolean(), nullable=True, unique=None, default=False) | |||
|
219 | ||||
|
220 | ||||
|
221 | def __init__(self, cache_key, cache_args=''): | |||
|
222 | self.cache_key = cache_key | |||
|
223 | self.cache_args = cache_args | |||
|
224 | self.cache_active = False | |||
|
225 | ||||
|
226 | def __repr__(self): | |||
|
227 | return "<CacheInvalidation('%s:%s')>" % (self.cache_id, self.cache_key) | |||
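The `CacheInvalidation` rows above act as dirty flags: `cache_active` is set to `False` when a repository changes, and flipped back to `True` once the cache is rebuilt on the next read. A minimal in-memory sketch of that protocol (the `CacheFlag`, `invalidate`, and `read_with_cache` names are hypothetical stand-ins, not RhodeCode's actual API):

```python
class CacheFlag:
    """Hypothetical in-memory stand-in for a CacheInvalidation row."""
    def __init__(self, cache_key, cache_args=''):
        self.cache_key = cache_key
        self.cache_args = cache_args
        self.cache_active = False  # False means "needs rebuild"

flags = {}   # cache_key -> CacheFlag
cache = {}   # cache_key -> cached value
calls = []   # records each expensive recomputation

def invalidate(key):
    # e.g. called from a push hook: mark the key stale
    flags.setdefault(key, CacheFlag(key)).cache_active = False

def read_with_cache(key, compute):
    flag = flags.setdefault(key, CacheFlag(key))
    if not flag.cache_active or key not in cache:
        cache[key] = compute()    # rebuild the stale entry
        flag.cache_active = True  # mark it fresh again
    return cache[key]

first = read_with_cache('repo1', lambda: calls.append(1) or 'tip:abc')
second = read_with_cache('repo1', lambda: calls.append(1) or 'tip:abc')
invalidate('repo1')  # a push marks the key stale
third = read_with_cache('repo1', lambda: calls.append(1) or 'tip:def')
```

The second read hits the cache without recomputing; only the explicit invalidation forces a rebuild, which is what keeps repeated page views cheap between pushes.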
|
228 | ||||
|
229 | ||||
|
230 | def upgrade(migrate_engine): | |||
|
231 | # Upgrade operations go here. Don't create your own engine; bind migrate_engine | |||
|
232 | # to your metadata | |||
|
233 | Base.metadata.create_all(bind=migrate_engine, checkfirst=False) | |||
|
234 | ||||
|
235 | def downgrade(migrate_engine): | |||
|
236 | # Operations to reverse the above upgrade go here. | |||
|
237 | Base.metadata.drop_all(bind=migrate_engine, checkfirst=False) |
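The models above are written against the SQLAlchemy 0.6-era API (`relation`, `convert_unicode=...`, `assert_unicode=...`). The same declarative pattern, including a formatted-property helper like `User.full_contact` and a `__repr__`, can be sketched self-contained with a hypothetical `Person` model and the modern import path (assumes SQLAlchemy 1.4 or newer):

```python
from sqlalchemy import Column, Integer, String, DateTime, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Person(Base):
    # Hypothetical stand-in for the User model above
    __tablename__ = 'persons'
    person_id = Column(Integer, primary_key=True)
    name = Column(String)
    lastname = Column(String)
    email = Column(String)
    last_login = Column(DateTime)

    @property
    def full_contact(self):
        # Same "Name Lastname <email>" convention as User.full_contact
        return '%s %s <%s>' % (self.name, self.lastname, self.email)

    def __repr__(self):
        return "<Person('id:%s:%s')>" % (self.person_id, self.name)

engine = create_engine('sqlite://')  # throwaway in-memory database
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(Person(name='John', lastname='Doe', email='john@example.com'))
session.commit()
p = session.query(Person).first()
```

`Base.metadata.create_all` / `drop_all` are exactly what the `upgrade` and `downgrade` hooks above call, just bound to the migration engine instead of a fresh one.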
@@ -0,0 +1,127 b'' | |||||
|
1 | import logging | |||
|
2 | import datetime | |||
|
3 | ||||
|
4 | from sqlalchemy import * | |||
|
5 | from sqlalchemy.exc import DatabaseError | |||
|
6 | from sqlalchemy.orm import relation, backref, class_mapper | |||
|
7 | from sqlalchemy.orm.session import Session | |||
|
8 | from rhodecode.model.meta import Base | |||
|
9 | from rhodecode.model.db import BaseModel | |||
|
10 | ||||
|
11 | from rhodecode.lib.dbmigrate.migrate import * | |||
|
12 | ||||
|
13 | log = logging.getLogger(__name__) | |||
|
14 | ||||
|
15 | def upgrade(migrate_engine): | |||
|
16 | """ Upgrade operations go here. | |||
|
17 | Don't create your own engine; bind migrate_engine to your metadata | |||
|
18 | """ | |||
|
19 | ||||
|
20 | #========================================================================== | |||
|
21 | # Upgrade of `users` table | |||
|
22 | #========================================================================== | |||
|
23 | tblname = 'users' | |||
|
24 | tbl = Table(tblname, MetaData(bind=migrate_engine), autoload=True, | |||
|
25 | autoload_with=migrate_engine) | |||
|
26 | ||||
|
27 | #ADD is_ldap column | |||
|
28 | is_ldap = Column("is_ldap", Boolean(), nullable=True, | |||
|
29 | unique=None, default=False) | |||
|
30 | is_ldap.create(tbl, populate_default=True) | |||
|
31 | is_ldap.alter(nullable=False) | |||
|
32 | ||||
|
33 | #========================================================================== | |||
|
34 | # Upgrade of `user_logs` table | |||
|
35 | #========================================================================== | |||
|
36 | ||||
|
37 | tblname = 'user_logs' | |||
|
38 | tbl = Table(tblname, MetaData(bind=migrate_engine), autoload=True, | |||
|
39 | autoload_with=migrate_engine) | |||
|
40 | ||||
|
41 | #ADD revision column | |||
|
42 | revision = Column('revision', TEXT(length=None, convert_unicode=False, | |||
|
43 | assert_unicode=None), | |||
|
44 | nullable=True, unique=None, default=None) | |||
|
45 | revision.create(tbl) | |||
|
46 | ||||
|
47 | ||||
|
48 | ||||
|
49 | #========================================================================== | |||
|
50 | # Upgrade of `repositories` table | |||
|
51 | #========================================================================== | |||
|
52 | tblname = 'repositories' | |||
|
53 | tbl = Table(tblname, MetaData(bind=migrate_engine), autoload=True, | |||
|
54 | autoload_with=migrate_engine) | |||
|
55 | ||||
|
56 | #ADD repo_type column# | |||
|
57 | repo_type = Column("repo_type", String(length=None, convert_unicode=False, | |||
|
58 | assert_unicode=None), | |||
|
59 | nullable=True, unique=False, default='hg') | |||
|
60 | ||||
|
61 | repo_type.create(tbl, populate_default=True) | |||
|
62 | #repo_type.alter(nullable=False) | |||
|
63 | ||||
|
64 | #ADD statistics column# | |||
|
65 | enable_statistics = Column("statistics", Boolean(), nullable=True, | |||
|
66 | unique=None, default=True) | |||
|
67 | enable_statistics.create(tbl) | |||
|
68 | ||||
|
69 | #========================================================================== | |||
|
70 | # Add table `user_followings` | |||
|
71 | #========================================================================== | |||
|
72 | tblname = 'user_followings' | |||
|
73 | ||||
|
74 | class UserFollowing(Base, BaseModel): | |||
|
75 | __tablename__ = 'user_followings' | |||
|
76 | __table_args__ = (UniqueConstraint('user_id', 'follows_repository_id'), | |||
|
77 | UniqueConstraint('user_id', 'follows_user_id') | |||
|
78 | , {'useexisting':True}) | |||
|
79 | ||||
|
80 | user_following_id = Column("user_following_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
81 | user_id = Column("user_id", Integer(), ForeignKey(u'users.user_id'), nullable=False, unique=None, default=None) | |||
|
82 | follows_repo_id = Column("follows_repository_id", Integer(), ForeignKey(u'repositories.repo_id'), nullable=True, unique=None, default=None) | |||
|
83 | follows_user_id = Column("follows_user_id", Integer(), ForeignKey(u'users.user_id'), nullable=True, unique=None, default=None) | |||
|
84 | ||||
|
85 | user = relation('User', primaryjoin='User.user_id==UserFollowing.user_id') | |||
|
86 | ||||
|
87 | follows_user = relation('User', primaryjoin='User.user_id==UserFollowing.follows_user_id') | |||
|
88 | follows_repository = relation('Repository') | |||
|
89 | ||||
|
90 | Base.metadata.tables[tblname].create(migrate_engine) | |||
|
91 | ||||
|
92 | #========================================================================== | |||
|
93 | # Add table `cache_invalidation` | |||
|
94 | #========================================================================== | |||
|
95 | tblname = 'cache_invalidation' | |||
|
96 | ||||
|
97 | class CacheInvalidation(Base, BaseModel): | |||
|
98 | __tablename__ = 'cache_invalidation' | |||
|
99 | __table_args__ = (UniqueConstraint('cache_key'), {'useexisting':True}) | |||
|
100 | cache_id = Column("cache_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |||
|
101 | cache_key = Column("cache_key", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
102 | cache_args = Column("cache_args", String(length=None, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None) | |||
|
103 | cache_active = Column("cache_active", Boolean(), nullable=True, unique=None, default=False) | |||
|
104 | ||||
|
105 | ||||
|
106 | def __init__(self, cache_key, cache_args=''): | |||
|
107 | self.cache_key = cache_key | |||
|
108 | self.cache_args = cache_args | |||
|
109 | self.cache_active = False | |||
|
110 | ||||
|
111 | def __repr__(self): | |||
|
112 | return "<CacheInvalidation('%s:%s')>" % (self.cache_id, self.cache_key) | |||
|
113 | ||||
|
114 | Base.metadata.tables[tblname].create(migrate_engine) | |||
|
115 | ||||
|
116 | return | |||
|
117 | ||||
|
118 | ||||
|
119 | ||||
|
120 | ||||
|
121 | ||||
|
122 | ||||
|
123 | def downgrade(migrate_engine): | |||
|
124 | meta = MetaData() | |||
|
125 | meta.bind = migrate_engine | |||
|
126 | ||||
|
127 |
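The `column.create(tbl, populate_default=True)` calls in this migration boil down to an `ALTER TABLE ... ADD COLUMN` followed by a backfill of existing rows with the default. A stdlib-only sketch of that step (table and column names mirror the `is_ldap` upgrade above; the literal SQL is illustrative, not what sqlalchemy-migrate emits verbatim):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO users (username) VALUES ('demo')")

# Equivalent of: is_ldap.create(tbl, populate_default=True)
# 1) add the column, 2) backfill pre-existing rows with the default
conn.execute("ALTER TABLE users ADD COLUMN is_ldap BOOLEAN")
conn.execute("UPDATE users SET is_ldap = 0 WHERE is_ldap IS NULL")
conn.commit()

row = conn.execute("SELECT username, is_ldap FROM users").fetchone()
```

The follow-up `is_ldap.alter(nullable=False)` tightens the constraint only after every row has a value, which is why the backfill has to happen first.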
@@ -0,0 +1,26 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | """ | |||
|
3 | rhodecode.lib.dbmigrate.versions.__init__ | |||
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |||
|
5 | ||||
|
6 | Package containing new versions of database models | |||
|
7 | ||||
|
8 | :created_on: Dec 11, 2010 | |||
|
9 | :author: marcink | |||
|
10 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
11 | :license: GPLv3, see COPYING for more details. | |||
|
12 | """ | |||
|
13 | # This program is free software; you can redistribute it and/or | |||
|
14 | # modify it under the terms of the GNU General Public License | |||
|
15 | # as published by the Free Software Foundation; version 2 | |||
|
16 | # of the License or (at your option) any later version of the license. | |||
|
17 | # | |||
|
18 | # This program is distributed in the hope that it will be useful, | |||
|
19 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
20 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
21 | # GNU General Public License for more details. | |||
|
22 | # | |||
|
23 | # You should have received a copy of the GNU General Public License | |||
|
24 | # along with this program; if not, write to the Free Software | |||
|
25 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |||
|
26 | # MA 02110-1301, USA. |
@@ -0,0 +1,32 b'' | |||||
|
1 | #!/usr/bin/env python | |||
|
2 | # encoding: utf-8 | |||
|
3 | # Custom Exceptions modules | |||
|
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
5 | # | |||
|
6 | # This program is free software; you can redistribute it and/or | |||
|
7 | # modify it under the terms of the GNU General Public License | |||
|
8 | # as published by the Free Software Foundation; version 2 | |||
|
9 | # of the License or (at your option) any later version of the license. | |||
|
10 | # | |||
|
11 | # This program is distributed in the hope that it will be useful, | |||
|
12 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
13 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
14 | # GNU General Public License for more details. | |||
|
15 | # | |||
|
16 | # You should have received a copy of the GNU General Public License | |||
|
17 | # along with this program; if not, write to the Free Software | |||
|
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |||
|
19 | # MA 02110-1301, USA. | |||
|
20 | """ | |||
|
21 | Created on Nov 17, 2010 | |||
|
22 | Custom Exceptions modules | |||
|
23 | @author: marcink | |||
|
24 | """ | |||
|
25 | ||||
|
26 | class LdapUsernameError(Exception):pass | |||
|
27 | class LdapPasswordError(Exception):pass | |||
|
28 | class LdapConnectionError(Exception):pass | |||
|
29 | class LdapImportError(Exception):pass | |||
|
30 | ||||
|
31 | class DefaultUserException(Exception):pass | |||
|
32 | class UserOwnsReposException(Exception):pass |
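These are plain marker exceptions carrying no state; callers distinguish LDAP failure modes purely by type. A hypothetical sketch of how a login path might raise and catch them (`authenticate_ldap` is illustrative, not RhodeCode's actual routine):

```python
class LdapUsernameError(Exception): pass
class LdapPasswordError(Exception): pass
class LdapConnectionError(Exception): pass

def authenticate_ldap(username, password):
    # Hypothetical pre-flight checks a login routine would run
    # before contacting the LDAP server.
    if not username:
        raise LdapUsernameError('empty username not allowed')
    if not password:
        raise LdapPasswordError('empty password not allowed')
    return True

try:
    authenticate_ldap('demo', '')
except (LdapUsernameError, LdapPasswordError) as e:
    result = 'denied: %s' % e
```

Catching the two credential errors together while letting `LdapConnectionError` propagate is the usual payoff of this kind of flat exception hierarchy: bad input gets a friendly message, infrastructure failures surface loudly.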
@@ -1,5 +1,11 b'' | |||||
|
1 | syntax: glob | |||
|
2 | *.pyc | |||
|
3 | *.swp | |||
1 |
|
4 | |||
2 | syntax: regexp |
|
5 | syntax: regexp | |
|
6 | ^build | |||
|
7 | ^docs/build/ | |||
|
8 | ^docs/_build/ | |||
3 | ^data$ |
|
9 | ^data$ | |
4 | ^\.settings$ |
|
10 | ^\.settings$ | |
5 | ^\.project$ |
|
11 | ^\.project$ | |
@@ -7,4 +13,4 b' syntax: regexp' | |||||
7 | ^rhodecode\.db$ |
|
13 | ^rhodecode\.db$ | |
8 | ^test\.db$ |
|
14 | ^test\.db$ | |
9 | ^repositories\.config$ |
|
15 | ^repositories\.config$ | |
10 | ^RhodeCode\.egg-info$ No newline at end of file |
|
16 | ^RhodeCode\.egg-info$ |
@@ -1,4 +1,5 b'' | |||||
1 | include rhodecode/config/deployment.ini_tmpl |
|
1 | include rhodecode/config/deployment.ini_tmpl | |
|
2 | include rhodecode/lib/dbmigrate/migrate.cfg | |||
2 |
|
3 | |||
3 | include README.rst |
|
4 | include README.rst | |
4 | recursive-include rhodecode/i18n/ * |
|
5 | recursive-include rhodecode/i18n/ * | |
@@ -7,7 +8,7 b' recursive-include rhodecode/i18n/ *' | |||||
7 | recursive-include rhodecode/public/css * |
|
8 | recursive-include rhodecode/public/css * | |
8 | recursive-include rhodecode/public/images * |
|
9 | recursive-include rhodecode/public/images * | |
9 | #js |
|
10 | #js | |
10 | include rhodecode/public/js/yui2.js |
|
11 | include rhodecode/public/js/yui2a.js | |
11 | include rhodecode/public/js/excanvas.min.js |
|
12 | include rhodecode/public/js/excanvas.min.js | |
12 | include rhodecode/public/js/yui.flot.js |
|
13 | include rhodecode/public/js/yui.flot.js | |
13 | include rhodecode/public/js/graph.js |
|
14 | include rhodecode/public/js/graph.js |
@@ -1,23 +1,25 b'' | |||||
1 |
|
1 | |||
2 | RhodeCode (RhodiumCode) |
|
2 | ================================================= | |
3 | ======================= |
|
3 | Welcome to RhodeCode (RhodiumCode) documentation! | |
|
4 | ================================================= | |||
4 |
|
5 | |||
5 |
``RhodeCode`` (formerly hg-app) is Pylons based repository |
|
6 | ``RhodeCode`` (formerly hg-app) is Pylons framework based Mercurial repository | |
6 | serving for mercurial_. It's similar to github or bitbucket, but it's suppose to run |
|
7 | browser/management with a built-in push/pull server and full text search. |
7 | as standalone app, it's open source and focuses more on restricted access to repositories |
|
8 | It works on http/https, has built-in permission/authentication (+LDAP) features. |
8 | There's no default free access to RhodeCode You have to create an account in order |
|
9 | It's similar to github or bitbucket, but it's supposed to run as a standalone |
9 | to use the application. It's powered by vcs_ library that we created to handle |
|
10 | hosted application, it's open source and focuses more on restricted access to | |
10 | many various version control systems. |
|
11 | repositories. It's powered by the vcs_ library that Lukasz Balcerzak and I created |
|
12 | to handle many various version control systems. | |||
11 |
|
13 | |||
12 | RhodeCode uses `Semantic Versioning <http://semver.org/>`_ |
|
14 | RhodeCode uses `Semantic Versioning <http://semver.org/>`_ | |
13 |
|
15 | |||
14 |
|
||||
15 | RhodeCode demo |
|
16 | RhodeCode demo | |
16 | -------------- |
|
17 | -------------- | |
17 |
|
18 | |||
18 | http://hg.python-works.com |
|
19 | http://hg.python-works.com | |
19 |
|
20 | |||
20 | The default access is |
|
21 | The default access is anonymous, but you can log in to the administrative account |
|
22 | using these credentials |
21 |
|
23 | |||
22 | - username: demo |
|
24 | - username: demo | |
23 | - password: demo |
|
25 | - password: demo | |
@@ -25,14 +27,14 b' The default access is' | |||||
25 | Source code |
|
27 | Source code | |
26 | ----------- |
|
28 | ----------- | |
27 |
|
29 | |||
28 | Source code is along with issue tracker is available at |
|
30 | The most up-to-date sources can be obtained from my own RhodeCode instance |
|
31 | https://rhodecode.org | |||
|
32 | ||||
|
33 | A rarely updated source mirror and the issue tracker are available at bitbucket |
29 | http://bitbucket.org/marcinkuzminski/rhodecode |
|
34 | http://bitbucket.org/marcinkuzminski/rhodecode | |
30 |
|
35 | |||
31 | Also a source codes can be obtained from demo rhodecode instance |
|
36 | Installation | |
32 | http://hg.python-works.com/rhodecode/summary |
|
37 | ------------ | |
33 |
|
||||
34 | Instalation |
|
|||
35 | ----------- |
|
|||
36 |
|
38 | |||
37 | Please visit http://packages.python.org/RhodeCode/installation.html |
|
39 | Please visit http://packages.python.org/RhodeCode/installation.html | |
38 |
|
40 | |||
@@ -40,41 +42,49 b' Instalation' | |||||
40 | Features |
|
42 | Features | |
41 | -------- |
|
43 | -------- | |
42 |
|
44 | |||
43 |
- Has it's own middleware to handle mercurial_ protocol request. |
|
45 | - Has its own middleware to handle mercurial_ protocol requests. |
44 |
can be logged and authenticated. Runs on threads unlikely to |
|
46 | Each request can be logged and authenticated. Runs on threads, unlike |
45 | make multiple pulls/pushes simultaneous. Supports http/https |
|
47 | hgweb. You can make multiple pulls/pushes simultaneously. Supports http/https |
46 | - Full permissions and authentication per project private/read/write/admin. |
|
48 | and ldap | |
47 | One account for web interface and mercurial_ push/pull/clone. |
|
49 | - Full permissions (private/read/write/admin) and authentication per project. | |
|
50 | One account for web interface and mercurial_ push/pull/clone operations. | |||
48 | - Mako templates let's you customize look and feel of application. |
|
51 | - Mako templates let you customize the look and feel of the application. |
49 | - Beautiful diffs, annotations and source codes all colored by pygments. |
|
52 | - Beautiful diffs, annotations and source codes all colored by pygments. | |
50 | - Mercurial_ branch graph and yui-flot powered graphs with zooming and statistics |
|
53 | - Mercurial_ branch graph and yui-flot powered graphs with zooming and statistics | |
51 |
- Admin interface with user/permission management. |
|
54 | - Admin interface with user/permission management. Admin activity journal, logs | |
52 |
pulls, pushes, forks,registrations |
|
55 | pulls, pushes, forks, registrations and other actions made by all users. | |
53 | - Server side forks, it's possible to fork a project and hack it free without |
|
56 | - Server side forks, it's possible to fork a project and hack it free without | |
54 |
breaking the main |
|
57 | breaking the main repository. | |
55 |
- Full text search on source codes, |
|
58 | - Full text search powered by Whoosh on source code and file names. |
56 | and build in indexing daemons |
|
59 | Built-in indexing daemons, with optional incremental index build |
57 | (no external search servers required all in one application) |
|
60 | (no external search servers required all in one application) | |
58 | - Rss / atom feeds, gravatar support, download sources as zip/tarballs |
|
61 | - Set up project descriptions and info inside the built-in db for easy, non |
|
62 | file-system operations | |||
|
63 | - Intelligent cache with invalidation after push or project change, providing high |
|
64 | performance and always up to date data. | |||
|
65 | - Rss / atom feeds, gravatar support, download sources as zip/tar/gz | |||
59 | - Async tasks for speed and performance using celery_ (works without them too) |
|
66 | - Async tasks for speed and performance using celery_ (works without them too) | |
60 | - Backup scripts can do backup of whole app and send it over scp to desired |
|
67 | - Backup scripts can do backup of whole app and send it over scp to desired | |
61 | location |
|
68 | location | |
62 | - Setup project descriptions and info inside built in db for easy, non |
|
69 | - Based on pylons / sqlalchemy / sqlite / whoosh / vcs | |
63 | file-system operations |
|
|||
64 | - Added cache with invalidation on push/repo management for high performance and |
|
|||
65 | always up to date data. |
|
|||
66 | - Based on pylons 1.0 / sqlalchemy 0.6 / sqlite |
|
|||
67 |
|
70 | |||
68 |
|
71 | |||
69 | Incoming |
|
72 | .. include:: ./docs/screenshots.rst | |
70 | -------- |
|
73 | ||
|
74 | ||||
|
75 | Incoming / Plans | |||
|
76 | ---------------- | |||
71 |
|
77 | |||
|
78 | - project grouping | |||
|
79 | - User groups/teams | |||
72 | - code review (probably based on hg-review) |
|
80 | - code review (probably based on hg-review) | |
73 | - full git_ support, with push/pull server |
|
81 | - full git_ support, with push/pull server (currently in beta tests) | |
|
82 | - redmine integration | |||
|
83 | - public accessible activity feeds | |||
74 | - commit based build in wiki system |
|
84 | - commit based build in wiki system | |
75 | - clone points and cloning from remote repositories into rhodecode |
|
85 | - clone points and cloning from remote repositories into rhodecode | |
76 | (git_ and mercurial_) |
|
86 | (git_ and mercurial_) | |
77 | - some cache optimizations |
|
87 | - more statistics and graph (global annotation + some more statistics) | |
78 | - other cools stuff that i can figure out (or You can help me figure out) |
|
88 | - other cool stuff that I can figure out (or you can help me figure out) |
79 |
|
89 | |||
80 | License |
|
90 | License | |
@@ -83,8 +93,18 b' License' | |||||
83 | ``rhodecode`` is released under GPL_ license. |
|
93 | ``rhodecode`` is released under GPL_ license. | |
84 |
|
94 | |||
85 |
|
95 | |||
86 | Documentation |
|
96 | Mailing group Q&A | |
87 | ------------- |
|
97 | ----------------- | |
|
98 | ||||
|
99 | join the `Google group <http://groups.google.com/group/rhodecode>`_ | |||
|
100 | ||||
|
101 | open an issue at `issue tracker <http://bitbucket.org/marcinkuzminski/rhodecode/issues>`_ | |||
|
102 | ||||
|
103 | join #rhodecode on FreeNode (irc.freenode.net) | |||
|
104 | or use http://webchat.freenode.net/?channels=rhodecode for web access to irc. | |||
|
105 | ||||
|
106 | Online documentation | |||
|
107 | -------------------- | |||
88 |
|
108 | |||
89 | Online documentation for current version is available at |
|
109 | Online documentation for current version is available at | |
90 | http://packages.python.org/RhodeCode/. |
|
110 | http://packages.python.org/RhodeCode/. | |
@@ -92,13 +112,3 b' Documentation' | |||||
92 |
|
112 | |||
93 | make html |
|
113 | make html | |
94 |
|
114 | |||
95 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
|||
96 | .. _python: http://www.python.org/ |
|
|||
97 | .. _django: http://www.djangoproject.com/ |
|
|||
98 | .. _mercurial: http://mercurial.selenic.com/ |
|
|||
99 | .. _subversion: http://subversion.tigris.org/ |
|
|||
100 | .. _git: http://git-scm.com/ |
|
|||
101 | .. _celery: http://celeryproject.org/ |
|
|||
102 | .. _Sphinx: http://sphinx.pocoo.org/ |
|
|||
103 | .. _GPL: http://www.gnu.org/licenses/gpl.html |
|
|||
104 | .. _vcs: http://pypi.python.org/pypi/vcs No newline at end of file |
|
@@ -1,6 +1,6 b'' | |||||
1 | ################################################################################ |
|
1 | ################################################################################ | |
2 | ################################################################################ |
|
2 | ################################################################################ | |
3 |
# |
|
3 | # RhodeCode - Pylons environment configuration # | |
4 | # # |
|
4 | # # | |
5 | # The %(here)s variable will be replaced with the parent directory of this file# |
|
5 | # The %(here)s variable will be replaced with the parent directory of this file# | |
6 | ################################################################################ |
|
6 | ################################################################################ | |
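The comment above describes Paste-style interpolation: `%(here)s` resolves to the directory containing the ini file. A minimal sketch of how that substitution behaves, using only the standard library and made-up paths (this is a generic illustration, not RhodeCode's actual loading code):

```python
# Illustration of %(here)s interpolation as used by Paste-style ini files.
# The 'here' value is supplied as a ConfigParser default, so every option
# can reference the directory containing the config file. Paths are made up.
from configparser import ConfigParser

INI_TEXT = """
[app:main]
cache_dir = %(here)s/data
index_dir = %(here)s/data/index
"""

def load_with_here(text, here):
    # Paste passes the parent directory of the ini file as the 'here' default
    parser = ConfigParser(defaults={"here": here})
    parser.read_string(text)
    return parser

cfg = load_with_here(INI_TEXT, "/var/www/rhodecode")
cache_dir = cfg.get("app:main", "cache_dir")
print(cache_dir)  # /var/www/rhodecode/data
```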
@@ -9,8 +9,8 b'' | |||||
9 | debug = true |
|
9 | debug = true | |
10 | ################################################################################ |
|
10 | ################################################################################ | |
11 | ## Uncomment and replace with the address which should receive ## |
|
11 | ## Uncomment and replace with the address which should receive ## | |
12 |
## any error reports after application crash |
|
12 | ## any error reports after application crash ## | |
13 |
## Additionally those settings will be used by |
|
13 | ## Additionally those settings will be used by RhodeCode mailing system ## | |
14 | ################################################################################ |
|
14 | ################################################################################ | |
15 | #email_to = admin@localhost |
|
15 | #email_to = admin@localhost | |
16 | #error_email_from = paste_error@localhost |
|
16 | #error_email_from = paste_error@localhost | |
@@ -19,22 +19,23 b' debug = true' | |||||
19 |
|
19 | |||
20 | #smtp_server = mail.server.com |
|
20 | #smtp_server = mail.server.com | |
21 | #smtp_username = |
|
21 | #smtp_username = | |
22 | #smtp_password = |
|
22 | #smtp_password = | |
23 | #smtp_port = |
|
23 | #smtp_port = | |
24 | #smtp_use_tls = |
|
24 | #smtp_use_tls = false | |
|
25 | #smtp_use_ssl = true | |||
25 |
|
26 | |||
26 | [server:main] |
|
27 | [server:main] | |
27 | ##nr of threads to spawn |
|
28 | ##nr of threads to spawn | |
28 | threadpool_workers = 5 |
|
29 | threadpool_workers = 5 | |
29 |
|
30 | |||
30 | ##max request before |
|
31 | ##max request before thread respawn | |
31 | threadpool_max_requests = 6 |
|
32 | threadpool_max_requests = 6 | |
32 |
|
33 | |||
33 | ##option to use threads instead of processes |
|
34 | ##option to use threads instead of processes | |
34 | use_threadpool = false |
|
35 | use_threadpool = false | |
35 |
|
36 | |||
36 | use = egg:Paste#http |
|
37 | use = egg:Paste#http | |
37 |
host = |
|
38 | host = 0.0.0.0 | |
38 | port = 5000 |
|
39 | port = 5000 | |
39 |
|
40 | |||
40 | [app:main] |
|
41 | [app:main] | |
@@ -43,6 +44,35 b' full_stack = true' | |||||
43 | static_files = true |
|
44 | static_files = true | |
44 | lang=en |
|
45 | lang=en | |
45 | cache_dir = %(here)s/data |
|
46 | cache_dir = %(here)s/data | |
|
47 | index_dir = %(here)s/data/index | |||
|
48 | cut_off_limit = 256000 | |||
|
49 | ||||
|
50 | #################################### | |||
|
51 | ### CELERY CONFIG #### | |||
|
52 | #################################### | |||
|
53 | use_celery = false | |||
|
54 | broker.host = localhost | |||
|
55 | broker.vhost = rabbitmqhost | |||
|
56 | broker.port = 5672 | |||
|
57 | broker.user = rabbitmq | |||
|
58 | broker.password = qweqwe | |||
|
59 | ||||
|
60 | celery.imports = rhodecode.lib.celerylib.tasks | |||
|
61 | ||||
|
62 | celery.result.backend = amqp | |||
|
63 | celery.result.dburi = amqp:// | |||
|
64 | celery.result.serializer = json | |||
|
65 | ||||
|
66 | #celery.send.task.error.emails = true | |||
|
67 | #celery.amqp.task.result.expires = 18000 | |||
|
68 | ||||
|
69 | celeryd.concurrency = 2 | |||
|
70 | #celeryd.log.file = celeryd.log | |||
|
71 | celeryd.log.level = debug | |||
|
72 | celeryd.max.tasks.per.child = 3 | |||
|
73 | ||||
|
74 | #tasks will never be sent to the queue, but executed locally instead. | |||
|
75 | celery.always.eager = false | |||
46 |
|
76 | |||
47 | #################################### |
|
77 | #################################### | |
48 | ### BEAKER CACHE #### |
|
78 | ### BEAKER CACHE #### | |
@@ -60,9 +90,8 b' beaker.cache.short_term.expire=60' | |||||
60 | beaker.cache.long_term.type=memory |
|
90 | beaker.cache.long_term.type=memory | |
61 | beaker.cache.long_term.expire=36000 |
|
91 | beaker.cache.long_term.expire=36000 | |
62 |
|
92 | |||
63 |
|
||||
64 | beaker.cache.sql_cache_short.type=memory |
|
93 | beaker.cache.sql_cache_short.type=memory | |
65 |
beaker.cache.sql_cache_short.expire= |
|
94 | beaker.cache.sql_cache_short.expire=10 | |
66 |
|
95 | |||
67 | beaker.cache.sql_cache_med.type=memory |
|
96 | beaker.cache.sql_cache_med.type=memory | |
68 | beaker.cache.sql_cache_med.expire=360 |
|
97 | beaker.cache.sql_cache_med.expire=360 | |
@@ -74,7 +103,7 b' beaker.cache.sql_cache_long.expire=3600' | |||||
74 | ### BEAKER SESSION #### |
|
103 | ### BEAKER SESSION #### | |
75 | #################################### |
|
104 | #################################### | |
76 | ## Type of storage used for the session, current types are |
|
105 | ## Type of storage used for the session, current types are | |
77 |
## |
|
106 | ## dbm, file, memcached, database, and memory. | |
78 | ## The storage uses the Container API |
|
107 | ## The storage uses the Container API | |
79 | ##that is also used by the cache system. |
|
108 | ##that is also used by the cache system. | |
80 | beaker.session.type = file |
|
109 | beaker.session.type = file | |
@@ -116,7 +145,7 b' sqlalchemy.convert_unicode = true' | |||||
116 | ### LOGGING CONFIGURATION #### |
|
145 | ### LOGGING CONFIGURATION #### | |
117 | ################################ |
|
146 | ################################ | |
118 | [loggers] |
|
147 | [loggers] | |
119 | keys = root, routes, rhodecode, sqlalchemy |
|
148 | keys = root, routes, rhodecode, sqlalchemy, beaker, templates | |
120 |
|
149 | |||
121 | [handlers] |
|
150 | [handlers] | |
122 | keys = console |
|
151 | keys = console | |
@@ -136,6 +165,19 b' level = DEBUG' | |||||
136 | handlers = console |
|
165 | handlers = console | |
137 | qualname = routes.middleware |
|
166 | qualname = routes.middleware | |
138 | # "level = DEBUG" logs the route matched and routing variables. |
|
167 | # "level = DEBUG" logs the route matched and routing variables. | |
|
168 | propagate = 0 | |||
|
169 | ||||
|
170 | [logger_beaker] | |||
|
171 | level = ERROR | |||
|
172 | handlers = console | |||
|
173 | qualname = beaker.container | |||
|
174 | propagate = 0 | |||
|
175 | ||||
|
176 | [logger_templates] | |||
|
177 | level = INFO | |||
|
178 | handlers = console | |||
|
179 | qualname = pylons.templating | |||
|
180 | propagate = 0 | |||
139 |
|
181 | |||
140 | [logger_rhodecode] |
|
182 | [logger_rhodecode] | |
141 | level = DEBUG |
|
183 | level = DEBUG |
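The `[loggers]` / `[handlers]` sections above follow Python's `logging.config.fileConfig` ini format. A trimmed, self-contained sketch (only two loggers kept; the names mirror the config above, but this is not RhodeCode's file verbatim):

```python
# Minimal fileConfig-style logging setup mirroring the ini layout above.
# Only the root and beaker loggers are kept, to show how qualname and
# level map onto Python logger objects.
import io
import logging
import logging.config

LOGGING_INI = """\
[loggers]
keys = root, beaker

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = NOTSET
handlers = console

[logger_beaker]
level = ERROR
handlers = console
qualname = beaker.container
propagate = 0

[handler_console]
class = StreamHandler
args = (sys.stderr,)
formatter = generic

[formatter_generic]
format = %(asctime)s %(levelname)s [%(name)s] %(message)s
"""

logging.config.fileConfig(io.StringIO(LOGGING_INI))
beaker_logger = logging.getLogger("beaker.container")
```

With this in place, `beaker_logger` drops anything below ERROR and does not propagate to the root logger, matching the `propagate = 0` line.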
@@ -3,22 +3,88 b'' | |||||
3 | Changelog |
|
3 | Changelog | |
4 | ========= |
|
4 | ========= | |
5 |
|
5 | |||
6 |
1. |
|
6 | 1.1.0 (**2010-12-18**) | |
7 | ---------------------- |
|
7 | ---------------------- | |
8 |
|
8 | |||
|
9 | :status: in-progress | |||
|
10 | :branch: beta | |||
|
11 | ||||
|
12 | news | |||
|
13 | ++++ | |||
|
14 | ||||
|
15 | - rewrite of internals for vcs >=0.1.10 | |||
|
16 | - uses mercurial 1.7 with dotencode disabled for maintaining compatibility | |||
|
17 | with older clients | |||
|
18 | - anonymous access, authentication via ldap | |||
|
19 | - performance upgrade for cached repos list - each repository has its own | |||
|
20 | cache that's invalidated when needed. | |||
|
21 | - performance upgrades on repositories with large amount of commits (20K+) | |||
|
22 | - main page quick filter for filtering repositories | |||
|
23 | - user dashboards with ability to follow chosen repositories actions | |||
|
24 | - sends email to admin on new user registration | |||
|
25 | - added cache/statistics reset options into repository settings | |||
|
26 | - more detailed action logger (based on hooks) with pushed changesets lists | |||
|
27 | and options to disable those hooks from admin panel | |||
|
28 | - introduced new enhanced changelog for merges that shows more accurate results | |||
|
29 | - new improved and faster code stats (based on pygments lexers mapping tables, | |||
|
30 | showing up to 10 trending sources for each repository). Additionally stats | |||
|
31 | can be disabled in repository settings. | |||
|
32 | - gui optimizations, fixed application width to 1024px | |||
|
33 | - added cut off (for large files/changesets) limit into config files | |||
|
34 | - whoosh, celeryd, upgrade moved to paster command | |||
|
35 | - database backends other than sqlite can be used | |||
|
36 | ||||
|
37 | fixes | |||
|
38 | +++++ | |||
|
39 | ||||
|
40 | - fixes #61 forked repo was showing only after cache expired | |||
|
41 | - fixes #76 no confirmation on user deletes | |||
|
42 | - fixes #66 Name field misspelled | |||
|
43 | - fixes #72 block user removal when he owns repositories | |||
|
44 | - fixes #69 added password confirmation fields | |||
|
45 | - fixes #87 RhodeCode crashes occasionally on updating repository owner | |||
|
46 | - fixes #82 broken annotations on files with more than 1 blank line at the end | |||
|
47 | - a lot of fixes and tweaks for file browser | |||
|
48 | - fixed detached session issues | |||
|
49 | - fixed: when a user had no repos he would see all repos listed in my account | |||
|
50 | - fixed ui() instance bug when global hgrc settings was loaded for server | |||
|
51 | instance and all hgrc options were merged with our db ui() object | |||
|
52 | - numerous small bugfixes | |||
|
53 | ||||
|
54 | (special thanks to TkSoh for the detailed feedback) | |||
|
55 | ||||
|
56 | ||||
|
57 | 1.0.2 (**2010-11-12**) | |||
|
58 | ---------------------- | |||
|
59 | ||||
|
60 | news | |||
|
61 | ++++ | |||
|
62 | ||||
|
63 | - tested under python2.7 | |||
|
64 | - bumped sqlalchemy and celery versions | |||
|
65 | ||||
|
66 | fixes | |||
|
67 | +++++ | |||
|
68 | ||||
9 | - fixed #59 missing graph.js |
|
69 | - fixed #59 missing graph.js | |
10 | - fixed repo_size crash when repository had broken symlinks |
|
70 | - fixed repo_size crash when repository had broken symlinks | |
11 | - fixed python2.5 crashes. |
|
71 | - fixed python2.5 crashes. | |
12 | - tested under python2.7 |
|
72 | ||
13 | - bumped sqlalcehmy and celery versions |
|
|||
14 |
|
73 | |||
15 | 1.0.1 (**2010-11-10**) |
|
74 | 1.0.1 (**2010-11-10**) | |
16 | ---------------------- |
|
75 | ---------------------- | |
17 |
|
76 | |||
|
77 | news | |||
|
78 | ++++ | |||
|
79 | ||||
|
80 | - small css updated | |||
|
81 | ||||
|
82 | fixes | |||
|
83 | +++++ | |||
|
84 | ||||
18 | - fixed #53 python2.5 incompatible enumerate calls |
|
85 | - fixed #53 python2.5 incompatible enumerate calls | |
19 | - fixed #52 disable mercurial extension for web |
|
86 | - fixed #52 disable mercurial extension for web | |
20 | - fixed #51 deleting repositories doesn't delete its dependent objects |
|
87 | - fixed #51 deleting repositories doesn't delete its dependent objects | |
21 | - small css updated |
|
|||
22 |
|
88 | |||
23 |
|
89 | |||
24 | 1.0.0 (**2010-11-02**) |
|
90 | 1.0.0 (**2010-11-02**) | |
@@ -52,3 +118,4 b' 1.0.0rc2 (**2010-10-11**)' | |||||
52 | - Disabled dirsize in file browser, it's causing a nasty bug when dir renames |
|
118 | - Disabled dirsize in file browser, it's causing a nasty bug when dir renames | |
53 | occur. After vcs is fixed it'll be put back again. |
|
119 | occur. After vcs is fixed it'll be put back again. | |
54 | - templating/css rewrites, optimized css. |
|
120 | - templating/css rewrites, optimized css. | |
|
121 |
@@ -16,7 +16,7 b' import sys, os' | |||||
16 | # If extensions (or modules to document with autodoc) are in another directory, |
|
16 | # If extensions (or modules to document with autodoc) are in another directory, | |
17 | # add these directories to sys.path here. If the directory is relative to the |
|
17 | # add these directories to sys.path here. If the directory is relative to the | |
18 | # documentation root, use os.path.abspath to make it absolute, like shown here. |
|
18 | # documentation root, use os.path.abspath to make it absolute, like shown here. | |
19 |
|
|
19 | sys.path.insert(0, os.path.abspath('..')) | |
20 |
|
20 | |||
21 | # -- General configuration ----------------------------------------------------- |
|
21 | # -- General configuration ----------------------------------------------------- | |
22 |
|
22 |
1 | NO CONTENT: modified file, binary diff hidden |
|
1 | NO CONTENT: modified file, binary diff hidden |
|
@@ -1,107 +1,43 b'' | |||||
1 | .. _index: |
|
1 | .. _index: | |
2 |
|
2 | |||
3 | Welcome to RhodeCode (RhodiumCode) documentation! |
|
3 | .. include:: ./../README.rst | |
4 | ================================================= |
|
|||
5 |
|
||||
6 | ``RhodeCode`` (formerly hg-app) is Pylons based repository management and |
|
|||
7 | serving for mercurial_. It's similar to github or bitbucket, but it's suppose to run |
|
|||
8 | as standalone app, it's open source and focuses more on restricted access to repositories |
|
|||
9 | There's no default free access to RhodeCode You have to create an account in order |
|
|||
10 | to use the application. It's powered by vcs_ library that we created to handle |
|
|||
11 | many various version control systems. |
|
|||
12 |
|
||||
13 | RhodeCode uses `Semantic Versioning <http://semver.org/>`_ |
|
|||
14 |
|
||||
15 |
|
||||
16 | RhodeCode demo |
|
|||
17 | -------------- |
|
|||
18 |
|
||||
19 | http://hg.python-works.com |
|
|||
20 |
|
||||
21 | The default access is |
|
|||
22 |
|
||||
23 | - username: demo |
|
|||
24 | - password: demo |
|
|||
25 |
|
||||
26 | Source code |
|
|||
27 | ----------- |
|
|||
28 |
|
||||
29 | Source code is along with issue tracker is available at |
|
|||
30 | http://bitbucket.org/marcinkuzminski/rhodecode |
|
|||
31 |
|
||||
32 | Also a source codes can be obtained from demo rhodecode instance |
|
|||
33 | http://hg.python-works.com/rhodecode/summary |
|
|||
34 |
|
||||
35 | Features |
|
|||
36 | -------- |
|
|||
37 |
|
||||
38 | - Has it's own middleware to handle mercurial_ protocol request. Each request |
|
|||
39 | can be logged and authenticated. Runs on threads unlikely to hgweb You can |
|
|||
40 | make multiple pulls/pushes simultaneous. Supports http/https |
|
|||
41 | - Full permissions and authentication per project private/read/write/admin. |
|
|||
42 | One account for web interface and mercurial_ push/pull/clone. |
|
|||
43 | - Mako templates let's you customize look and feel of application. |
|
|||
44 | - Beautiful diffs, annotations and source codes all colored by pygments. |
|
|||
45 | - Mercurial_ branch graph and yui-flot powered graphs with zooming and statistics |
|
|||
46 | - Admin interface with user/permission management. User activity journal logs |
|
|||
47 | pulls, pushes, forks,registrations. Possible to disable built in hooks |
|
|||
48 | - Server side forks, it's possible to fork a project and hack it free without |
|
|||
49 | breaking the main. |
|
|||
50 | - Full text search on source codes, search on file names. All powered by whoosh |
|
|||
51 | and build in indexing daemons |
|
|||
52 | (no external search servers required all in one application) |
|
|||
53 | - Rss / atom feeds, gravatar support, download sources as zip/tarballs |
|
|||
54 | - Async tasks for speed and performance using celery_ (works without them too) |
|
|||
55 | - Backup scripts can do backup of whole app and send it over scp to desired |
|
|||
56 | location |
|
|||
57 | - Setup project descriptions and info inside built in db for easy, non |
|
|||
58 | file-system operations |
|
|||
59 | - Added cache with invalidation on push/repo management for high performance and |
|
|||
60 | always up to date data. |
|
|||
61 | - Based on pylons 1.0 / sqlalchemy 0.6 / sqlite |
|
|||
62 |
|
||||
63 |
|
||||
64 | .. figure:: images/screenshot1_main_page.png |
|
|||
65 | :align: left |
|
|||
66 |
|
||||
67 | Main page of RhodeCode |
|
|||
68 |
|
||||
69 | .. figure:: images/screenshot2_summary_page.png |
|
|||
70 | :align: left |
|
|||
71 |
|
||||
72 | Summary page |
|
|||
73 |
|
||||
74 |
|
||||
75 | Incoming |
|
|||
76 | -------- |
|
|||
77 |
|
||||
78 | - code review (probably based on hg-review) |
|
|||
79 | - full git_ support, with push/pull server |
|
|||
80 | - commit based build in wiki system |
|
|||
81 | - clone points and cloning from remote repositories into rhodecode |
|
|||
82 | (git_ and mercurial_) |
|
|||
83 | - more statistics and graph (global annotation + some more statistics) |
|
|||
84 | - user customized activity dashboards |
|
|||
85 | - some cache optimizations |
|
|||
86 | - other cools stuff that i can figure out (or You can help me figure out) |
|
|||
87 |
|
||||
88 | License |
|
|||
89 | ------- |
|
|||
90 |
|
||||
91 | ``rhodecode`` is released under GPL_ license. |
|
|||
92 |
|
||||
93 |
|
4 | |||
94 | Documentation |
|
5 | Documentation | |
95 | ------------- |
|
6 | ------------- | |
96 |
|
7 | |||
|
8 | **Installation:** | |||
|
9 | ||||
97 | .. toctree:: |
|
10 | .. toctree:: | |
98 | :maxdepth: 1 |
|
11 | :maxdepth: 1 | |
99 |
|
12 | |||
100 | installation |
|
13 | installation | |
|
14 | setup | |||
101 | upgrade |
|
15 | upgrade | |
102 | setup |
|
16 | ||
|
17 | **Usage** | |||
|
18 | ||||
|
19 | .. toctree:: | |||
|
20 | :maxdepth: 1 | |||
|
21 | ||||
|
22 | enable_git | |||
|
23 | statistics | |||
|
24 | ||||
|
25 | **Develop** | |||
|
26 | ||||
|
27 | .. toctree:: | |||
|
28 | :maxdepth: 1 | |||
|
29 | ||||
|
30 | contributing | |||
103 | changelog |
|
31 | changelog | |
104 |
|
32 | |||
|
33 | **API** | |||
|
34 | ||||
|
35 | .. toctree:: | |||
|
36 | :maxdepth: 2 | |||
|
37 | ||||
|
38 | api/index | |||
|
39 | ||||
|
40 | ||||
105 | Other topics |
|
41 | Other topics | |
106 | ------------ |
|
42 | ------------ | |
107 |
|
43 |
@@ -5,31 +5,20 b' Installation' | |||||
5 |
|
5 | |||
6 | ``RhodeCode`` is written entirely in Python, but in order to use its full |
|
6 | ``RhodeCode`` is written entirely in Python, but in order to use its full | |
7 | potential there are some third-party requirements. When RhodeCode is used |
|
7 | potential there are some third-party requirements. When RhodeCode is used | |
8 |
together with celery |
|
8 | together with celery You have to install some kind of message broker, | |
9 | recommended one is rabbitmq_ to make the async tasks work. |
|
9 | recommended one is rabbitmq_ to make the async tasks work. | |
10 |
|
10 | |||
11 | Of course RhodeCode also works in sync mode; then You don't have to install |
|
11 | Of course RhodeCode also works in sync mode; then You don't have to install | |
12 | any third party apps. Celery_ will give You a large speed improvement when using |
|
12 | any third party apps. Celery_ will give You a large speed improvement when using | |
13 |
many big repositories. If You plan to use it for |
|
13 | many big repositories. If You plan to use it for 7 or 10 small repositories, it | |
14 | will work just fine without celery running. |
|
14 | will work just fine without celery running. | |
15 |
|
15 | |||
16 |
After You decide to Run it with celery make sure You run celeryd |
|
16 | If You decide to run it with celery, make sure You run celeryd using paster | |
17 | message broker together with the application. |
|
17 | and a message broker together with the application. | |
18 |
|
||||
19 | Requirements for Celery |
|
|||
20 | ----------------------- |
|
|||
21 |
|
||||
22 | **Message Broker** |
|
|||
23 |
|
||||
24 | - preferred is `RabbitMq <http://www.rabbitmq.com/>`_ |
|
|||
25 | - possible other is `Redis <http://code.google.com/p/redis/>`_ |
|
|||
26 |
|
||||
27 | For installation instructions You can visit: |
|
|||
28 | http://ask.github.com/celery/getting-started/index.html |
|
|||
29 | It's very nice tutorial how to start celery_ with rabbitmq_ |
|
|||
30 |
|
18 | |||
31 | Install from Cheese Shop |
|
19 | Install from Cheese Shop | |
32 | ------------------------ |
|
20 | ------------------------ | |
|
21 | Rhodecode requires Python 2.x, version 2.5 or greater | |||
33 |
|
22 | |||
34 | Easiest way to install ``rhodecode`` is to run:: |
|
23 | Easiest way to install ``rhodecode`` is to run:: | |
35 |
|
24 | |||
@@ -42,7 +31,7 b' Or::' | |||||
42 | If you prefer to install manually simply grab latest release from |
|
31 | If you prefer to install manually simply grab latest release from | |
43 | http://pypi.python.org/pypi/rhodecode, decompress the archive and run:: |
|
32 | http://pypi.python.org/pypi/rhodecode, decompress the archive and run:: | |
44 |
|
33 | |||
45 | python setup.py install |
|
34 | python setup.py install | |
46 |
|
35 | |||
47 |
|
36 | |||
48 | Step by step installation example |
|
37 | Step by step installation example | |
@@ -62,7 +51,7 b' Step by step installation example' | |||||
62 |
|
51 | |||
63 | :: |
|
52 | :: | |
64 |
|
53 | |||
65 | source /var/www/rhodecode-venv/bin/activate |
|
54 | source /var/www/rhodecode-venv/bin/activate | |
66 |
|
55 | |||
67 | - Make a folder for rhodecode somewhere on the filesystem for example |
|
56 | - Make a folder for rhodecode somewhere on the filesystem for example | |
68 |
|
57 | |||
@@ -80,8 +69,28 b' Step by step installation example' | |||||
80 | - this will install rhodecode together with pylons |
|
69 | - this will install rhodecode together with pylons | |
81 | and all other required python libraries |
|
70 | and all other required python libraries | |
82 |
|
71 | |||
|
72 | Requirements for Celery (optional) | |||
|
73 | ---------------------------------- | |||
|
74 | ||||
|
75 | .. note:: | |||
|
76 | Installing a message broker and using celery is optional, RhodeCode will | |||
|
77 | work without them perfectly fine. | |||
|
78 | ||||
|
79 | ||||
|
80 | **Message Broker** | |||
|
81 | ||||
|
82 | - preferred is `RabbitMq <http://www.rabbitmq.com/>`_ | |||
|
83 | - possible other is `Redis <http://code.google.com/p/redis/>`_ | |||
|
84 | ||||
|
85 | For installation instructions You can visit: | |||
|
86 | http://ask.github.com/celery/getting-started/index.html | |||
|
87 | It's a very nice tutorial on how to start celery_ with rabbitmq_ | |||
|
88 | ||||
83 |
|
89 | |||
84 | You can now proceed to :ref:`setup` |
|
90 | You can now proceed to :ref:`setup` | |
|
91 | ----------------------------------- | |||
|
92 | ||||
|
93 | ||||
85 |
|
94 | |||
86 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
95 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv | |
87 | .. _python: http://www.python.org/ |
|
96 | .. _python: http://www.python.org/ |
@@ -7,13 +7,20 b' Setup' | |||||
7 | Setting up the application |
|
7 | Setting up the application | |
8 | -------------------------- |
|
8 | -------------------------- | |
9 |
|
9 | |||
|
10 | First You'll ned to create RhodeCode config file. Run the following command | |||
|
11 | to do this | |||
|
12 | ||||
10 | :: |
|
13 | :: | |
11 |
|
14 | |||
12 | paster make-config RhodeCode production.ini |
|
15 | paster make-config RhodeCode production.ini | |
13 |
|
16 | |||
14 | - This will create `production.ini` config inside the directory |
|
17 | - This will create `production.ini` config inside the directory | |
15 |
this config contain various settings for |
|
18 | this config contains various settings for RhodeCode, e.g. proxy port, | |
16 | static files, cache and logging. |
|
19 | email settings, usage of static files, cache, celery settings and logging. | |
|
20 | ||||
|
21 | ||||
|
22 | ||||
|
23 | Next we need to create the database. | |||
17 |
|
24 | |||
18 | :: |
|
25 | :: | |
19 |
|
26 | |||
@@ -24,55 +31,136 b' Setting up the application' | |||||
24 | existing ones. RhodeCode will simply add all new found repositories to |
|
31 | existing ones. RhodeCode will simply add all new found repositories to | |
25 | its database. Also make sure You specify the correct path to repositories. |
|
32 | its database. Also make sure You specify the correct path to repositories. | |
26 | - Remember that the given path for mercurial_ repositories must be write |
|
33 | - Remember that the given path for mercurial_ repositories must be write | |
27 |
accessible for the application. It's very important since RhodeCode web |
|
34 | accessible for the application. It's very important since RhodeCode web | |
28 |
will work even without such an access but, when trying to do a |
|
35 | interface will work even without such access, but when trying to do a | |
29 | eventually fail with permission denied errors. |
|
36 | push it'll eventually fail with permission denied errors. | |
30 | - Run |
|
37 | ||
|
38 | You are ready to use rhodecode, to run it simply execute | |||
31 |
|
39 | |||
32 | :: |
|
40 | :: | |
33 |
|
41 | |||
34 | paster serve production.ini |
|
42 | paster serve production.ini | |
35 |
|
43 | |||
36 |
- This command runs the |
|
44 | - This command runs the RhodeCode server; the app should be available at | |
37 | 127.0.0.1:5000. This ip and port is configurable via the production.ini |
|
45 | 127.0.0.1:5000. This ip and port is configurable via the production.ini | |
38 |
file |
|
46 | file created in previous step | |
39 | - Use admin account you created to login. |
|
47 | - Use admin account you created to login. | |
40 | - Default permissions on each repository are read, and the owner is admin. So |
|
48 | - Default permissions on each repository are read, and the owner is admin. So | |
41 | remember to update these if needed. |
|
49 | remember to update these if needed. In the admin panel You can toggle ldap, | |
|
50 | anonymous, and permissions settings, as well as edit more advanced options on | |||
|
51 | users and repositories | |||
42 |
|
52 | |||
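Since the setup steps above stress that the repositories path must be write accessible for the application, here is a tiny, hypothetical sanity check one could run as the application user (the paths are examples, not RhodeCode defaults):

```python
# Hypothetical check that a repositories root is writable by the current
# process; push operations would fail with permission errors otherwise.
import os
import tempfile

def can_push_to(repos_path):
    """Return True when the path exists, is a directory, and is writable."""
    return os.path.isdir(repos_path) and os.access(repos_path, os.W_OK)

# Demonstrate against a throwaway directory instead of a real repo root.
tmp_root = tempfile.mkdtemp()
writable = can_push_to(tmp_root)
missing = can_push_to(os.path.join(tmp_root, "does-not-exist"))
print(writable, missing)  # True False
```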
43 | Note |
|
53 | ||
44 | ---- |
|
54 | Setting up Whoosh full text search | |
|
55 | ---------------------------------- | |||
45 |
|
56 | |||
46 | RhodeCode when running without the celery it's running all it's task in sync |
|
57 | Index for whoosh can be built starting from version 1.1 using a paster command |
47 | mode, for first few times when visiting summary page You can notice few |
|
58 | passing repo locations to index, as well as Your config file that stores | |
48 | slow downs, this is due the statistics building it's cache. After all changesets |
|
59 | whoosh index files locations. It is possible to pass `-f` to the options |
49 | are parsed it'll take the stats from cache and run much faster. Each summary |
|
60 | to enable full index rebuild. Without that, indexing will always run in |
50 | page display parse at most 250 changesets in order to not stress the cpu, so |
|
61 | incremental mode. | |
51 | the full stats are going to be loaded after total_number_of_changesets/250 |
|
62 | ||
52 | summary page visits. |
|
63 | :: | |
53 |
|
|
64 | ||
|
65 | paster make-index production.ini --repo-location=<location for repos> | |||
54 |
|
66 | |||
55 |
|
67 | For a full index rebuild You can use | ||
56 | Setting up Whoosh |
|
68 | ||
57 | ----------------- |
|
69 | :: | |
|
70 | ||||
|
71 | paster make-index production.ini -f --repo-location=<location for repos> | |||
58 |
|
72 | |||
59 | - For full text search You can either put crontab entry for |
|
73 | - For full text search You can either put crontab entry for | |
60 |
|
74 | |||
|
75 | This command can be run even from crontab in order to do periodical | |||
|
76 | index builds and keep Your index always up to date. An example entry might | |||
|
77 | look like this | |||
|
78 | ||||
61 | :: |
|
79 | :: | |
62 |
|
80 | |||
63 | python /var/www/rhodecode/<rhodecode_installation_path>/lib/indexers/daemon.py incremental <put_here_path_to_repos> |
|
81 | /path/to/python/bin/paster make-index /path/to/rhodecode/production.ini --repo-location=<location for repos> | |
64 |
|
82 | |||
65 |
When using incremental mode whoosh will check last modification date |
|
83 | When using incremental (default) mode whoosh will check the last modification date | |
66 |
and add it to reindex if newer file is available. Also indexing |
|
84 | of each file and re-index it if a newer file is available. Also the indexing | |
67 |
for removed files and removes them from index. |
|
85 | daemon checks for removed files and removes them from index. | |
68 | index from scratch, in admin panel You can check `build from scratch` flag |
|
86 | ||
69 | and in standalone daemon You can pass `full` instead on incremental to build |
|
87 | Sometimes You might want to rebuild the index from scratch. You can do that using | |
70 | remove previous index and build new one. |
|
88 | the `-f` flag passed to paster command or, in admin panel You can check | |
|
89 | `build from scratch` flag. | |||
|
90 | ||||
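The incremental behaviour described above (re-index files whose modification time is newer than what the index recorded, drop files that disappeared) can be sketched like this; it is purely illustrative, the real work is done by Whoosh inside the paster command:

```python
# Sketch of the incremental re-index decision: files newer than the stored
# index mtime (or never indexed) get re-indexed; vanished files are removed.
def plan_reindex(files_on_disk, indexed_mtimes):
    """files_on_disk: {path: mtime}; indexed_mtimes: {path: mtime at index time}."""
    to_add = [p for p, m in files_on_disk.items()
              if p not in indexed_mtimes or m > indexed_mtimes[p]]
    to_remove = [p for p in indexed_mtimes if p not in files_on_disk]
    return sorted(to_add), sorted(to_remove)

add, remove = plan_reindex(
    {"a.py": 200, "b.py": 100, "c.py": 50},   # current state on disk
    {"a.py": 150, "b.py": 100, "d.py": 10},   # state when index was built
)
print(add)     # ['a.py', 'c.py']
print(remove)  # ['d.py']
```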
|
91 | ||||
|
92 | Setting up LDAP support | |||
|
93 | ----------------------- | |||
|
94 | ||||
|
95 | RhodeCode starting from version 1.1 supports ldap authentication. In order | |||
|
96 | to use ldap, You have to install python-ldap package. This package is available | |||
|
97 | via pypi, so You can install it by running | |||
|
98 | ||||
|
99 | :: | |||
|
100 | ||||
|
101 | easy_install python-ldap | |||
|
102 | ||||
|
103 | :: | |||
|
104 | ||||
|
105 | pip install python-ldap | |||
|
106 | ||||
|
107 | .. note:: | |||
|
108 | python-ldap requires certain libs on Your system, so before installing | |||
|
109 | it check that You have at least the `openldap` and `sasl` libraries. | |||
|
110 | ||||
|
111 | ldap settings are located in the admin->ldap section. | |||
|
112 | ||||
|
113 | Here's a typical ldap setup:: | |||
|
114 | ||||
|
115 | Enable ldap = checked #controls if ldap access is enabled | |||
|
116 | Host = host.domain.org #actual ldap server to connect | |||
|
117 | Port = 389 or 689 for ldaps #ldap server ports | |||
|
118 | Enable LDAPS = unchecked #enable disable ldaps | |||
|
119 | Account = <account> #access for ldap server(if required) | |||
|
120 | Password = <password> #password for ldap server(if required) | |||
|
121 | Base DN = uid=%(user)s,CN=users,DC=host,DC=domain,DC=org | |||
|
122 | ||||
|
123 | ||||
|
124 | `Account` and `Password` are optional, and used for two-phase ldap | |||
|
125 | authentication so those are credentials to access Your ldap, if it doesn't | |||
|
126 | support anonymous search/user lookups. | |||
|
127 | ||||
|
128 | Base DN must have the %(user)s template inside, it's a placeholder where Your | |||
|
129 | uid used to login goes; it allows admins to specify a non-standard schema for | |||
|
130 | the uid variable | |||
|
131 | ||||
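The `%(user)s` placeholder in Base DN is plain Python string interpolation; a standalone illustration with a made-up account name:

```python
# The Base DN template from the settings above, filled in the same way
# Python's %-interpolation works. The account name is hypothetical.
base_dn = "uid=%(user)s,CN=users,DC=host,DC=domain,DC=org"

def build_dn(template, username):
    # Substitute the login name into the template before the ldap bind
    return template % {"user": username}

print(build_dn(base_dn, "jdoe"))
# uid=jdoe,CN=users,DC=host,DC=domain,DC=org
```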
|
132 | If all data are entered correctly, and `python-ldap` is properly installed | |||
|
133 | Users should be able to access RhodeCode with ldap accounts. When | |||
|
134 | logging in for the first time a special ldap account is created inside RhodeCode, | |||
|
135 | so You can control permissions even for ldap users. If such a user already | |||
|
136 | exists in the RhodeCode database, an ldap user with the same username will not | |||
|
137 | be able to access RhodeCode. | |||
|
138 | ||||
|
139 | If You have problems with ldap access and believe You entered correct | |||
|
140 | information, check the RhodeCode logs; any error messages sent from | |||
|
141 | ldap will be saved there. | |||
|
142 | ||||
|
143 | ||||
|
144 | ||||
|
145 | Setting Up Celery | |||
|
146 | ----------------- | |||
|
147 | ||||
|
148 | Since version 1.1 celery is configured by the rhodecode ini configuration files | |||
|
149 | simply set use_celery=true in the ini file then add / change the configuration | |||
|
150 | variables inside the ini file. | |||
|
151 | ||||
|
152 | Remember that the ini file uses '.' in key names where celery uses '_', | |||
|
153 | so for example setting `BROKER_HOST` in celery means setting `broker.host` in | |||
|
154 | the config file. | |||
|
155 | ||||
|
156 | In order to start using celery run:: | |||
|
157 | paster celeryd <configfile.ini> | |||
|
158 | ||||
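The '.'-for-'_' naming rule above can be captured in a one-line helper; this is a generic sketch of the mapping, not RhodeCode's actual config loader:

```python
# Sketch of the ini-key naming rule described above: an ini key such as
# broker.host corresponds to celery's BROKER_HOST setting name.
def ini_key_to_celery(key):
    """Translate a dotted ini key into celery's upper-underscore name."""
    return key.replace(".", "_").upper()

print(ini_key_to_celery("broker.host"))         # BROKER_HOST
print(ini_key_to_celery("celery.always.eager"))  # CELERY_ALWAYS_EAGER
```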
71 |
|
159 | |||
72 | Nginx virtual host example |
|
160 | Nginx virtual host example | |
73 | -------------------------- |
|
161 | -------------------------- | |
74 |
|
162 | |||
75 | Sample config for nginx:: |
|
163 | Sample config for nginx using proxy:: | |
76 |
|
164 | |||
77 | server { |
|
165 | server { | |
78 | listen 80; |
|
166 | listen 80; | |
@@ -122,6 +210,16 b' in production.ini file::' | |||||
122 |
|
210 | |||
123 | so that the statics are not served by the application, which improves speed. |
|
211 | so that the statics are not served by the application, which improves speed. | |
124 |
|
212 | |||
|
213 | Apache reverse proxy | |||
|
214 | -------------------- | |||
|
215 | A tutorial can be found here: | |||
|
216 | http://wiki.pylonshq.com/display/pylonscookbook/Apache+as+a+reverse+proxy+for+Pylons | |||
|
217 | ||||
|
218 | ||||
|
219 | Apache's example FCGI config | |||
|
220 | ---------------------------- | |||
|
221 | ||||
|
222 | TODO ! | |||
125 |
|
223 | |||
126 | Other configuration files |
|
224 | Other configuration files | |
127 | ------------------------- |
|
225 | ------------------------- | |
@@ -132,6 +230,29 b' http://hg.python-works.com/rhodecode/fil' | |||||
132 | and also a celeryconfig file can be used from here: |
|
230 | and also a celeryconfig file can be used from here: | |
133 | http://hg.python-works.com/rhodecode/files/tip/celeryconfig.py |
|
231 | http://hg.python-works.com/rhodecode/files/tip/celeryconfig.py | |
134 |
|
232 | |||
|
233 | Troubleshooting | |||
|
234 | --------------- | |||
|
235 | ||||
|
236 | - missing static files ? | |||
|
237 | ||||
|
238 | - make sure either to set `static_files = true` in the .ini file or | |||
|
239 | double-check the root path of Your http setup. It should point to, | |||
|
240 | for example: | |||
|
241 | /home/my-virtual-python/lib/python2.6/site-packages/rhodecode/public | |||
|
242 | ||||
|
243 | - can't install celery/rabbitmq | |||
|
244 | ||||
|
245 | - don't worry, RhodeCode works without them too; no extra setup is required | |||
|
246 | ||||
|
247 | - long lasting push timeouts ? | |||
|
248 | ||||
|
249 | - make sure You set longer timeouts in Your proxy/fcgi settings; timeouts | |||
|
250 | are caused by the http server and not RhodeCode | |||
|
251 | ||||
|
252 | - large push timeouts ? | |||
|
253 | ||||
|
254 | - make sure You set a proper max_body_size for the http server | |||
|
255 | ||||
135 |
|
256 | |||
136 |
|
257 | |||
137 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
258 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
@@ -22,7 +22,26 b' Then make sure You run from the installa' | |||||
22 | paster make-config RhodeCode production.ini |
|
22 | paster make-config RhodeCode production.ini | |
23 |
|
23 | |||
24 | This will display any changes made from the new version of RhodeCode to your |
|
24 | This will display any changes made from the new version of RhodeCode to your | |
25 | current config, and tries to do an automerge. |
|
25 | current config, and tries to do an automerge. It's always better to make a backup | |
|
26 | of the config file and recheck the content after the merge. | |||
|
27 | ||||
|
28 | It's also good to rebuild the whoosh index, since upgrading the whoosh | |||
|
29 | version may introduce incompatible index changes. | |||
|
30 | ||||
|
31 | ||||
|
32 | The last step is to upgrade the database. To do this simply run | |||
|
33 | ||||
|
34 | :: | |||
|
35 | ||||
|
36 | paster upgrade-db production.ini | |||
|
37 | ||||
|
38 | This will upgrade the schema, as well as update some defaults in the database. | |||
|
39 | Always recheck the settings of the application to see if any new options | |||
|
40 | need to be set. | |||
|
41 | ||||
|
42 | .. note:: | |||
|
43 | Always perform a database backup before doing upgrade. | |||
|
44 | ||||
26 |
|
45 | |||
27 |
|
46 | |||
28 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
47 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
@@ -22,6 +22,7 b' debug = true' | |||||
22 | #smtp_password = |
|
22 | #smtp_password = | |
23 | #smtp_port = |
|
23 | #smtp_port = | |
24 | #smtp_use_tls = false |
|
24 | #smtp_use_tls = false | |
|
25 | #smtp_use_ssl = true | |||
25 |
|
26 | |||
26 | [server:main] |
|
27 | [server:main] | |
27 | ##nr of threads to spawn |
|
28 | ##nr of threads to spawn | |
@@ -43,6 +44,35 b' full_stack = true' | |||||
43 | static_files = false |
|
44 | static_files = false | |
44 | lang=en |
|
45 | lang=en | |
45 | cache_dir = %(here)s/data |
|
46 | cache_dir = %(here)s/data | |
|
47 | index_dir = %(here)s/data/index | |||
|
48 | cut_off_limit = 256000 | |||
|
49 | ||||
|
50 | #################################### | |||
|
51 | ### CELERY CONFIG #### | |||
|
52 | #################################### | |||
|
53 | use_celery = false | |||
|
54 | broker.host = localhost | |||
|
55 | broker.vhost = rabbitmqhost | |||
|
56 | broker.port = 5672 | |||
|
57 | broker.user = rabbitmq | |||
|
58 | broker.password = qweqwe | |||
|
59 | ||||
|
60 | celery.imports = rhodecode.lib.celerylib.tasks | |||
|
61 | ||||
|
62 | celery.result.backend = amqp | |||
|
63 | celery.result.dburi = amqp:// | |||
|
64 | celery.result.serializer = json | |||
|
65 | ||||
|
66 | #celery.send.task.error.emails = true | |||
|
67 | #celery.amqp.task.result.expires = 18000 | |||
|
68 | ||||
|
69 | celeryd.concurrency = 2 | |||
|
70 | #celeryd.log.file = celeryd.log | |||
|
71 | celeryd.log.level = debug | |||
|
72 | celeryd.max.tasks.per.child = 3 | |||
|
73 | ||||
|
74 | #tasks will never be sent to the queue, but executed locally instead. | |||
|
75 | celery.always.eager = false | |||
46 |
|
76 | |||
47 | #################################### |
|
77 | #################################### | |
48 | ### BEAKER CACHE #### |
|
78 | ### BEAKER CACHE #### | |
@@ -136,6 +166,7 b' level = INFO' | |||||
136 | handlers = console |
|
166 | handlers = console | |
137 | qualname = routes.middleware |
|
167 | qualname = routes.middleware | |
138 | # "level = DEBUG" logs the route matched and routing variables. |
|
168 | # "level = DEBUG" logs the route matched and routing variables. | |
|
169 | propagate = 0 | |||
139 |
|
170 | |||
140 | [logger_rhodecode] |
|
171 | [logger_rhodecode] | |
141 | level = DEBUG |
|
172 | level = DEBUG |
@@ -1,8 +1,16 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # RhodeCode, a web based repository management based on pylons |
|
3 | rhodecode.__init__ | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~~~~~ | |
5 | # |
|
5 | ||
|
6 | RhodeCode, a web based repository management based on pylons | |||
|
7 | versioning implementation: http://semver.org/ | |||
|
8 | ||||
|
9 | :created_on: Apr 9, 2010 | |||
|
10 | :author: marcink | |||
|
11 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
12 | :license: GPLv3, see COPYING for more details. | |||
|
13 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
14 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
15 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
16 | # as published by the Free Software Foundation; version 2 | |
@@ -17,19 +25,28 b'' | |||||
17 | # along with this program; if not, write to the Free Software |
|
25 | # along with this program; if not, write to the Free Software | |
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
26 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
27 | # MA 02110-1301, USA. | |
20 | """ |
|
28 | ||
21 | Created on April 9, 2010 |
|
29 | ||
22 | RhodeCode, a web based repository management based on pylons |
|
30 | VERSION = (1, 1, 0) | |
23 | versioning implementation: http://semver.org/ |
|
31 | __version__ = '.'.join((str(each) for each in VERSION[:4])) | |
24 | @author: marcink |
|
32 | __dbversion__ = 2 #defines current db version for migrations | |
25 | """ |
|
|||
26 |
|
33 | |||
27 | VERSION = (1, 0, 2,) |
|
34 | try: | |
|
35 | from rhodecode.lib.utils import get_current_revision | |||
|
36 | _rev = get_current_revision() | |||
|
37 | except ImportError: | |||
|
38 | #this is needed when doing some setup.py operations | |||
|
39 | _rev = False | |||
28 |
|
40 | |||
29 | __version__ = '.'.join((str(each) for each in VERSION[:4])) |
|
41 | if len(VERSION) > 3 and _rev: | |
|
42 | __version__ += ' [rev:%s]' % _rev[0] | |||
30 |
|
43 | |||
31 | def get_version(): |
|
44 | def get_version(): | |
32 | """ |
|
45 | """Returns shorter version (digit parts only) as string.""" | |
33 | Returns shorter version (digit parts only) as string. |
|
46 | ||
34 | """ |
|
|||
35 | return '.'.join((str(each) for each in VERSION[:3])) |
|
47 | return '.'.join((str(each) for each in VERSION[:3])) | |
|
48 | ||||
|
49 | BACKENDS = { | |||
|
50 | 'hg': 'Mercurial repository', | |||
|
51 | #'git': 'Git repository', | |||
|
52 | } |
@@ -1,6 +1,6 b'' | |||||
1 | ################################################################################ |
|
1 | ################################################################################ | |
2 | ################################################################################ |
|
2 | ################################################################################ | |
3 |
# |
|
3 | # RhodeCode - Pylons environment configuration # | |
4 | # # |
|
4 | # # | |
5 | # The %(here)s variable will be replaced with the parent directory of this file# |
|
5 | # The %(here)s variable will be replaced with the parent directory of this file# | |
6 | ################################################################################ |
|
6 | ################################################################################ | |
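The `%(here)s` substitution mentioned in this header behaves like ConfigParser default-value interpolation; a small sketch under that assumption (the `/srv/rhodecode` path is made up, and PasteDeploy is what actually supplies `here` at load time):

```python
import configparser

# Simulate %(here)s expansion: 'here' is provided as a default value
# and interpolated into any option that references it.
parser = configparser.ConfigParser(defaults={'here': '/srv/rhodecode'})
parser.read_string("[app:main]\ncache_dir = %(here)s/data\n")
print(parser.get('app:main', 'cache_dir'))
```

In the real ini files, `here` is set to the parent directory of the config file, so options like `cache_dir = %(here)s/data` resolve relative to wherever the file lives.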
@@ -10,7 +10,7 b' debug = true' | |||||
10 | ################################################################################ |
|
10 | ################################################################################ | |
11 | ## Uncomment and replace with the address which should receive ## |
|
11 | ## Uncomment and replace with the address which should receive ## | |
12 | ## any error reports after application crash ## |
|
12 | ## any error reports after application crash ## | |
13 |
## Additionally those settings will be used by |
|
13 | ## Additionally those settings will be used by RhodeCode mailing system ## | |
14 | ################################################################################ |
|
14 | ################################################################################ | |
15 | #email_to = admin@localhost |
|
15 | #email_to = admin@localhost | |
16 | #error_email_from = paste_error@localhost |
|
16 | #error_email_from = paste_error@localhost | |
@@ -22,13 +22,14 b' debug = true' | |||||
22 | #smtp_password = |
|
22 | #smtp_password = | |
23 | #smtp_port = |
|
23 | #smtp_port = | |
24 | #smtp_use_tls = false |
|
24 | #smtp_use_tls = false | |
|
25 | #smtp_use_ssl = true | |||
25 |
|
26 | |||
26 | [server:main] |
|
27 | [server:main] | |
27 | ##nr of threads to spawn |
|
28 | ##nr of threads to spawn | |
28 | threadpool_workers = 5 |
|
29 | threadpool_workers = 5 | |
29 |
|
30 | |||
30 | ##max request before thread respawn |
|
31 | ##max request before thread respawn | |
31 |
threadpool_max_requests = |
|
32 | threadpool_max_requests = 10 | |
32 |
|
33 | |||
33 | ##option to use threads of process |
|
34 | ##option to use threads of process | |
34 | use_threadpool = true |
|
35 | use_threadpool = true | |
@@ -43,7 +44,36 b' full_stack = true' | |||||
43 | static_files = true |
|
44 | static_files = true | |
44 | lang=en |
|
45 | lang=en | |
45 | cache_dir = %(here)s/data |
|
46 | cache_dir = %(here)s/data | |
|
47 | index_dir = %(here)s/data/index | |||
46 | app_instance_uuid = ${app_instance_uuid} |
|
48 | app_instance_uuid = ${app_instance_uuid} | |
|
49 | cut_off_limit = 256000 | |||
|
50 | ||||
|
51 | #################################### | |||
|
52 | ### CELERY CONFIG #### | |||
|
53 | #################################### | |||
|
54 | use_celery = false | |||
|
55 | broker.host = localhost | |||
|
56 | broker.vhost = rabbitmqhost | |||
|
57 | broker.port = 5672 | |||
|
58 | broker.user = rabbitmq | |||
|
59 | broker.password = qweqwe | |||
|
60 | ||||
|
61 | celery.imports = rhodecode.lib.celerylib.tasks | |||
|
62 | ||||
|
63 | celery.result.backend = amqp | |||
|
64 | celery.result.dburi = amqp:// | |||
|
65 | celery.result.serializer = json | |||
|
66 | ||||
|
67 | #celery.send.task.error.emails = true | |||
|
68 | #celery.amqp.task.result.expires = 18000 | |||
|
69 | ||||
|
70 | celeryd.concurrency = 2 | |||
|
71 | #celeryd.log.file = celeryd.log | |||
|
72 | celeryd.log.level = debug | |||
|
73 | celeryd.max.tasks.per.child = 3 | |||
|
74 | ||||
|
75 | #tasks will never be sent to the queue, but executed locally instead. | |||
|
76 | celery.always.eager = false | |||
47 |
|
77 | |||
48 | #################################### |
|
78 | #################################### | |
49 | ### BEAKER CACHE #### |
|
79 | ### BEAKER CACHE #### | |
@@ -62,7 +92,7 b' beaker.cache.long_term.type=memory' | |||||
62 | beaker.cache.long_term.expire=36000 |
|
92 | beaker.cache.long_term.expire=36000 | |
63 |
|
93 | |||
64 | beaker.cache.sql_cache_short.type=memory |
|
94 | beaker.cache.sql_cache_short.type=memory | |
65 |
beaker.cache.sql_cache_short.expire= |
|
95 | beaker.cache.sql_cache_short.expire=10 | |
66 |
|
96 | |||
67 | beaker.cache.sql_cache_med.type=memory |
|
97 | beaker.cache.sql_cache_med.type=memory | |
68 | beaker.cache.sql_cache_med.expire=360 |
|
98 | beaker.cache.sql_cache_med.expire=360 | |
@@ -136,6 +166,7 b' level = INFO' | |||||
136 | handlers = console |
|
166 | handlers = console | |
137 | qualname = routes.middleware |
|
167 | qualname = routes.middleware | |
138 | # "level = DEBUG" logs the route matched and routing variables. |
|
168 | # "level = DEBUG" logs the route matched and routing variables. | |
|
169 | propagate = 0 | |||
139 |
|
170 | |||
140 | [logger_rhodecode] |
|
171 | [logger_rhodecode] | |
141 | level = DEBUG |
|
172 | level = DEBUG |
@@ -6,7 +6,7 b' from rhodecode.config.routing import mak' | |||||
6 | from rhodecode.lib.auth import set_available_permissions, set_base_path |
|
6 | from rhodecode.lib.auth import set_available_permissions, set_base_path | |
7 | from rhodecode.lib.utils import repo2db_mapper, make_ui, set_rhodecode_config |
|
7 | from rhodecode.lib.utils import repo2db_mapper, make_ui, set_rhodecode_config | |
8 | from rhodecode.model import init_model |
|
8 | from rhodecode.model import init_model | |
9 |
from rhodecode.model. |
|
9 | from rhodecode.model.scm import ScmModel | |
10 | from sqlalchemy import engine_from_config |
|
10 | from sqlalchemy import engine_from_config | |
11 | import logging |
|
11 | import logging | |
12 | import os |
|
12 | import os | |
@@ -20,7 +20,7 b' def load_environment(global_conf, app_co' | |||||
20 | object |
|
20 | object | |
21 | """ |
|
21 | """ | |
22 | config = PylonsConfig() |
|
22 | config = PylonsConfig() | |
23 |
|
23 | |||
24 | # Pylons paths |
|
24 | # Pylons paths | |
25 | root = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) |
|
25 | root = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) | |
26 | paths = dict(root=root, |
|
26 | paths = dict(root=root, | |
@@ -34,11 +34,11 b' def load_environment(global_conf, app_co' | |||||
34 | config['routes.map'] = make_map(config) |
|
34 | config['routes.map'] = make_map(config) | |
35 | config['pylons.app_globals'] = app_globals.Globals(config) |
|
35 | config['pylons.app_globals'] = app_globals.Globals(config) | |
36 | config['pylons.h'] = rhodecode.lib.helpers |
|
36 | config['pylons.h'] = rhodecode.lib.helpers | |
37 |
|
37 | |||
38 | # Setup cache object as early as possible |
|
38 | # Setup cache object as early as possible | |
39 | import pylons |
|
39 | import pylons | |
40 | pylons.cache._push_object(config['pylons.app_globals'].cache) |
|
40 | pylons.cache._push_object(config['pylons.app_globals'].cache) | |
41 |
|
41 | |||
42 | # Create the Mako TemplateLookup, with the default auto-escaping |
|
42 | # Create the Mako TemplateLookup, with the default auto-escaping | |
43 | config['pylons.app_globals'].mako_lookup = TemplateLookup( |
|
43 | config['pylons.app_globals'].mako_lookup = TemplateLookup( | |
44 | directories=paths['templates'], |
|
44 | directories=paths['templates'], | |
@@ -52,9 +52,10 b' def load_environment(global_conf, app_co' | |||||
52 | test = os.path.split(config['__file__'])[-1] == 'test.ini' |
|
52 | test = os.path.split(config['__file__'])[-1] == 'test.ini' | |
53 | if test: |
|
53 | if test: | |
54 | from rhodecode.lib.utils import create_test_env, create_test_index |
|
54 | from rhodecode.lib.utils import create_test_env, create_test_index | |
55 | create_test_env('/tmp', config) |
|
55 | from rhodecode.tests import TESTS_TMP_PATH | |
56 | create_test_index('/tmp/*', True) |
|
56 | create_test_env(TESTS_TMP_PATH, config) | |
57 |
|
57 | create_test_index(TESTS_TMP_PATH, True) | ||
|
58 | ||||
58 | #MULTIPLE DB configs |
|
59 | #MULTIPLE DB configs | |
59 | # Setup the SQLAlchemy database engine |
|
60 | # Setup the SQLAlchemy database engine | |
60 | if config['debug'] and not test: |
|
61 | if config['debug'] and not test: | |
@@ -68,12 +69,13 b' def load_environment(global_conf, app_co' | |||||
68 | init_model(sa_engine_db1) |
|
69 | init_model(sa_engine_db1) | |
69 | #init baseui |
|
70 | #init baseui | |
70 | config['pylons.app_globals'].baseui = make_ui('db') |
|
71 | config['pylons.app_globals'].baseui = make_ui('db') | |
71 |
|
72 | |||
72 | repo2db_mapper(_get_repos_cached_initial(config['pylons.app_globals'], initial)) |
|
73 | g = config['pylons.app_globals'] | |
|
74 | repo2db_mapper(ScmModel().repo_scan(g.paths[0][1], g.baseui)) | |||
73 | set_available_permissions(config) |
|
75 | set_available_permissions(config) | |
74 | set_base_path(config) |
|
76 | set_base_path(config) | |
75 | set_rhodecode_config(config) |
|
77 | set_rhodecode_config(config) | |
76 | # CONFIGURATION OPTIONS HERE (note: all config options will override |
|
78 | # CONFIGURATION OPTIONS HERE (note: all config options will override | |
77 | # any Pylons config options) |
|
79 | # any Pylons config options) | |
78 |
|
80 | |||
79 | return config |
|
81 | return config |
@@ -8,8 +8,10 b' from pylons.middleware import ErrorHandl' | |||||
8 | from pylons.wsgiapp import PylonsApp |
|
8 | from pylons.wsgiapp import PylonsApp | |
9 | from routes.middleware import RoutesMiddleware |
|
9 | from routes.middleware import RoutesMiddleware | |
10 | from rhodecode.lib.middleware.simplehg import SimpleHg |
|
10 | from rhodecode.lib.middleware.simplehg import SimpleHg | |
|
11 | from rhodecode.lib.middleware.simplegit import SimpleGit | |||
11 | from rhodecode.lib.middleware.https_fixup import HttpsFixup |
|
12 | from rhodecode.lib.middleware.https_fixup import HttpsFixup | |
12 | from rhodecode.config.environment import load_environment |
|
13 | from rhodecode.config.environment import load_environment | |
|
14 | from paste.gzipper import make_gzip_middleware | |||
13 |
|
15 | |||
14 | def make_app(global_conf, full_stack=True, static_files=True, **app_conf): |
|
16 | def make_app(global_conf, full_stack=True, static_files=True, **app_conf): | |
15 | """Create a Pylons WSGI application and return it |
|
17 | """Create a Pylons WSGI application and return it | |
@@ -35,15 +37,16 b' def make_app(global_conf, full_stack=Tru' | |||||
35 |
|
37 | |||
36 | # The Pylons WSGI app |
|
38 | # The Pylons WSGI app | |
37 | app = PylonsApp(config=config) |
|
39 | app = PylonsApp(config=config) | |
38 |
|
40 | |||
39 | # Routing/Session/Cache Middleware |
|
41 | # Routing/Session/Cache Middleware | |
40 | app = RoutesMiddleware(app, config['routes.map']) |
|
42 | app = RoutesMiddleware(app, config['routes.map']) | |
41 | app = SessionMiddleware(app, config) |
|
43 | app = SessionMiddleware(app, config) | |
42 |
|
44 | |||
43 | # CUSTOM MIDDLEWARE HERE (filtered by error handling middlewares) |
|
45 | # CUSTOM MIDDLEWARE HERE (filtered by error handling middlewares) | |
44 |
|
46 | |||
45 | app = SimpleHg(app, config) |
|
47 | app = SimpleHg(app, config) | |
46 |
|
48 | app = SimpleGit(app, config) | ||
|
49 | ||||
47 | if asbool(full_stack): |
|
50 | if asbool(full_stack): | |
48 | # Handle Python exceptions |
|
51 | # Handle Python exceptions | |
49 | app = ErrorHandler(app, global_conf, **config['pylons.errorware']) |
|
52 | app = ErrorHandler(app, global_conf, **config['pylons.errorware']) | |
@@ -54,10 +57,10 b' def make_app(global_conf, full_stack=Tru' | |||||
54 | app = StatusCodeRedirect(app) |
|
57 | app = StatusCodeRedirect(app) | |
55 | else: |
|
58 | else: | |
56 | app = StatusCodeRedirect(app, [400, 401, 403, 404, 500]) |
|
59 | app = StatusCodeRedirect(app, [400, 401, 403, 404, 500]) | |
57 |
|
60 | |||
58 | #enable https redirects based on HTTP_X_URL_SCHEME set by proxy |
|
61 | #enable https redirects based on HTTP_X_URL_SCHEME set by proxy | |
59 | app = HttpsFixup(app) |
|
62 | app = HttpsFixup(app) | |
60 |
|
63 | |||
61 | # Establish the Registry for this application |
|
64 | # Establish the Registry for this application | |
62 | app = RegistryManager(app) |
|
65 | app = RegistryManager(app) | |
63 |
|
66 | |||
@@ -65,7 +68,8 b' def make_app(global_conf, full_stack=Tru' | |||||
65 | # Serve static files |
|
68 | # Serve static files | |
66 | static_app = StaticURLParser(config['pylons.paths']['static_files']) |
|
69 | static_app = StaticURLParser(config['pylons.paths']['static_files']) | |
67 | app = Cascade([static_app, app]) |
|
70 | app = Cascade([static_app, app]) | |
68 |
|
71 | app = make_gzip_middleware(app, global_conf, compress_level=1) | ||
|
72 | ||||
69 | app.config = config |
|
73 | app.config = config | |
70 |
|
74 | |||
71 | return app |
|
75 | return app |
@@ -35,7 +35,7 b' def make_map(config):' | |||||
35 | #========================================================================== |
|
35 | #========================================================================== | |
36 |
|
36 | |||
37 | #MAIN PAGE |
|
37 | #MAIN PAGE | |
38 |
map.connect(' |
|
38 | map.connect('home', '/', controller='home', action='index') | |
39 | map.connect('bugtracker', "http://bitbucket.org/marcinkuzminski/rhodecode/issues", _static=True) |
|
39 | map.connect('bugtracker', "http://bitbucket.org/marcinkuzminski/rhodecode/issues", _static=True) | |
40 | map.connect('gpl_license', "http://www.gnu.org/licenses/gpl.html", _static=True) |
|
40 | map.connect('gpl_license', "http://www.gnu.org/licenses/gpl.html", _static=True) | |
41 | #ADMIN REPOSITORY REST ROUTES |
|
41 | #ADMIN REPOSITORY REST ROUTES | |
@@ -73,13 +73,27 b' def make_map(config):' | |||||
73 | m.connect('delete_repo_user', "/repos_delete_user/{repo_name:.*}", |
|
73 | m.connect('delete_repo_user', "/repos_delete_user/{repo_name:.*}", | |
74 | action="delete_perm_user", conditions=dict(method=["DELETE"], |
|
74 | action="delete_perm_user", conditions=dict(method=["DELETE"], | |
75 | function=check_repo)) |
|
75 | function=check_repo)) | |
76 |
|
76 | #settings actions | ||
|
77 | m.connect('repo_stats', "/repos_stats/{repo_name:.*}", | |||
|
78 | action="repo_stats", conditions=dict(method=["DELETE"], | |||
|
79 | function=check_repo)) | |||
|
80 | m.connect('repo_cache', "/repos_cache/{repo_name:.*}", | |||
|
81 | action="repo_cache", conditions=dict(method=["DELETE"], | |||
|
82 | function=check_repo)) | |||
77 | #ADMIN USER REST ROUTES |
|
83 | #ADMIN USER REST ROUTES | |
78 | map.resource('user', 'users', controller='admin/users', path_prefix='/_admin') |
|
84 | map.resource('user', 'users', controller='admin/users', path_prefix='/_admin') | |
79 |
|
85 | |||
80 | #ADMIN PERMISSIONS REST ROUTES |
|
86 | #ADMIN PERMISSIONS REST ROUTES | |
81 | map.resource('permission', 'permissions', controller='admin/permissions', path_prefix='/_admin') |
|
87 | map.resource('permission', 'permissions', controller='admin/permissions', path_prefix='/_admin') | |
82 |
|
88 | |||
|
89 | ||||
|
90 | ##ADMIN LDAP SETTINGS | |||
|
91 | map.connect('ldap_settings', '/_admin/ldap', controller='admin/ldap_settings', | |||
|
92 | action='ldap_settings', conditions=dict(method=["POST"])) | |||
|
93 | map.connect('ldap_home', '/_admin/ldap', controller='admin/ldap_settings',) | |||
|
94 | ||||
|
95 | ||||
|
96 | ||||
83 | #ADMIN SETTINGS REST ROUTES |
|
97 | #ADMIN SETTINGS REST ROUTES | |
84 | with map.submapper(path_prefix='/_admin', controller='admin/settings') as m: |
|
98 | with map.submapper(path_prefix='/_admin', controller='admin/settings') as m: | |
85 | m.connect("admin_settings", "/settings", |
|
99 | m.connect("admin_settings", "/settings", | |
@@ -116,6 +130,14 b' def make_map(config):' | |||||
116 | m.connect('admin_home', '', action='index')#main page |
|
130 | m.connect('admin_home', '', action='index')#main page | |
117 | m.connect('admin_add_repo', '/add_repo/{new_repo:[a-z0-9\. _-]*}', |
|
131 | m.connect('admin_add_repo', '/add_repo/{new_repo:[a-z0-9\. _-]*}', | |
118 | action='add_repo') |
|
132 | action='add_repo') | |
|
133 | ||||
|
134 | ||||
|
135 | #USER JOURNAL | |||
|
136 | map.connect('journal', '/_admin/journal', controller='journal',) | |||
|
137 | map.connect('toggle_following', '/_admin/toggle_following', controller='journal', | |||
|
138 | action='toggle_following', conditions=dict(method=["POST"])) | |||
|
139 | ||||
|
140 | ||||
119 | #SEARCH |
|
141 | #SEARCH | |
120 | map.connect('search', '/_admin/search', controller='search',) |
|
142 | map.connect('search', '/_admin/search', controller='search',) | |
121 | map.connect('search_repo', '/_admin/search/{search_repo:.*}', controller='search') |
|
143 | map.connect('search_repo', '/_admin/search/{search_repo:.*}', controller='search') |
@@ -1,8 +1,15 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # admin controller for pylons |
|
3 | rhodecode.controllers.admin.admin | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 |
|
5 | |||
|
6 | Controller for Admin panel of RhodeCode | |||
|
7 | ||||
|
8 | :created_on: Apr 7, 2010 | |||
|
9 | :author: marcink | |||
|
10 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
11 | :license: GPLv3, see COPYING for more details. | |||
|
12 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
13 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
14 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
15 | # as published by the Free Software Foundation; version 2 | |
@@ -17,15 +24,10 b'' | |||||
17 | # along with this program; if not, write to the Free Software |
|
24 | # along with this program; if not, write to the Free Software | |
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
25 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
26 | # MA 02110-1301, USA. | |
20 | """ |
|
27 | ||
21 | Created on April 7, 2010 |
|
|||
22 | admin controller for pylons |
|
|||
23 | @author: marcink |
|
|||
24 | """ |
|
|||
25 | import logging |
|
28 | import logging | |
26 |
from pylons import request, |
|
29 | from pylons import request, tmpl_context as c | |
27 | from rhodecode.lib.base import BaseController, render |
|
30 | from rhodecode.lib.base import BaseController, render | |
28 | from rhodecode.model import meta |
|
|||
29 | from rhodecode.model.db import UserLog |
|
31 | from rhodecode.model.db import UserLog | |
30 | from webhelpers.paginate import Page |
|
32 | from webhelpers.paginate import Page | |
31 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator |
|
33 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator | |
@@ -33,19 +35,19 b' from rhodecode.lib.auth import LoginRequ' | |||||
33 | log = logging.getLogger(__name__) |
|
35 | log = logging.getLogger(__name__) | |
34 |
|
36 | |||
35 | class AdminController(BaseController): |
|
37 | class AdminController(BaseController): | |
36 |
|
38 | |||
37 | @LoginRequired() |
|
39 | @LoginRequired() | |
38 | def __before__(self): |
|
40 | def __before__(self): | |
39 | super(AdminController, self).__before__() |
|
41 | super(AdminController, self).__before__() | |
40 |
|
42 | |||
41 |
@HasPermissionAllDecorator('hg.admin') |
|
43 | @HasPermissionAllDecorator('hg.admin') | |
42 | def index(self): |
|
44 | def index(self): | |
43 |
|
45 | |||
44 | users_log = self.sa.query(UserLog).order_by(UserLog.action_date.desc()) |
|
46 | users_log = self.sa.query(UserLog).order_by(UserLog.action_date.desc()) | |
45 | p = int(request.params.get('page', 1)) |
|
47 | p = int(request.params.get('page', 1)) | |
46 | c.users_log = Page(users_log, page=p, items_per_page=10) |
|
48 | c.users_log = Page(users_log, page=p, items_per_page=10) | |
47 | c.log_data = render('admin/admin_log.html') |
|
49 | c.log_data = render('admin/admin_log.html') | |
48 | if request.params.get('partial'): |
|
50 | if request.params.get('partial'): | |
49 | return c.log_data |
|
51 | return c.log_data | |
50 |
return render('admin/admin.html') |
|
52 | return render('admin/admin.html') | |
51 |
|
53 |
@@ -1,8 +1,15 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # permissions controller for pylons |
|
3 | rhodecode.controllers.admin.permissions | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 | # |
|
5 | ||
|
6 | permissions controller for RhodeCode | |||
|
7 | ||||
|
8 | :created_on: Apr 27, 2010 | |||
|
9 | :author: marcink | |||
|
10 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
11 | :license: GPLv3, see COPYING for more details. | |||
|
12 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
13 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
14 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
15 | # as published by the Free Software Foundation; version 2 | |
@@ -17,11 +24,6 b'' | |||||
17 | # along with this program; if not, write to the Free Software |
|
24 | # along with this program; if not, write to the Free Software | |
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
25 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
26 | # MA 02110-1301, USA. | |
20 | """ |
|
|||
21 | Created on April 27, 2010 |
|
|||
22 | permissions controller for pylons |
|
|||
23 | @author: marcink |
|
|||
24 | """ |
|
|||
25 |
|
27 | |||
26 | from formencode import htmlfill |
|
28 | from formencode import htmlfill | |
27 | from pylons import request, session, tmpl_context as c, url |
|
29 | from pylons import request, session, tmpl_context as c, url | |
@@ -29,11 +31,12 b' from pylons.controllers.util import abor' | |||||
29 | from pylons.i18n.translation import _ |
|
31 | from pylons.i18n.translation import _ | |
30 | from rhodecode.lib import helpers as h |
|
32 | from rhodecode.lib import helpers as h | |
31 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator |
|
33 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator | |
|
34 | from rhodecode.lib.auth_ldap import LdapImportError | |||
32 | from rhodecode.lib.base import BaseController, render |
|
35 | from rhodecode.lib.base import BaseController, render | |
33 | from rhodecode.model.db import User, UserLog |
|
36 | from rhodecode.model.forms import LdapSettingsForm, DefaultPermissionsForm | |
34 | from rhodecode.model.
|
37 | from rhodecode.model.permission import PermissionModel | |
35 | from rhodecode.model.
|
38 | from rhodecode.model.settings import SettingsModel | |
36 | from rhodecode.model.user
|
39 | from rhodecode.model.user import UserModel | |
37 | import formencode |
|
40 | import formencode | |
38 | import logging |
|
41 | import logging | |
39 | import traceback |
|
42 | import traceback | |
@@ -45,29 +48,30 b' class PermissionsController(BaseControll' | |||||
45 | # To properly map this controller, ensure your config/routing.py |
|
48 | # To properly map this controller, ensure your config/routing.py | |
46 | # file has a resource setup: |
|
49 | # file has a resource setup: | |
47 | # map.resource('permission', 'permissions') |
|
50 | # map.resource('permission', 'permissions') | |
48 |
|
51 | |||
49 | @LoginRequired() |
|
52 | @LoginRequired() | |
50 | @HasPermissionAllDecorator('hg.admin') |
|
53 | @HasPermissionAllDecorator('hg.admin') | |
51 | def __before__(self): |
|
54 | def __before__(self): | |
52 | c.admin_user = session.get('admin_user') |
|
55 | c.admin_user = session.get('admin_user') | |
53 | c.admin_username = session.get('admin_username') |
|
56 | c.admin_username = session.get('admin_username') | |
54 | super(PermissionsController, self).__before__() |
|
57 | super(PermissionsController, self).__before__() | |
55 |
|
58 | |||
56 | self.perms_choices = [('repository.none', _('None'),), |
|
59 | self.perms_choices = [('repository.none', _('None'),), | |
57 | ('repository.read', _('Read'),), |
|
60 | ('repository.read', _('Read'),), | |
58 | ('repository.write', _('Write'),), |
|
61 | ('repository.write', _('Write'),), | |
59 | ('repository.admin', _('Admin'),)] |
|
62 | ('repository.admin', _('Admin'),)] | |
60 | self.register_choices = [ |
|
63 | self.register_choices = [ | |
61 | ('hg.register.none',
|
64 | ('hg.register.none', | |
|
65 | _('disabled')), | |||
62 | ('hg.register.manual_activate', |
|
66 | ('hg.register.manual_activate', | |
63 |
|
|
67 | _('allowed with manual account activation')), | |
64 | ('hg.register.auto_activate', |
|
68 | ('hg.register.auto_activate', | |
65 |
|
|
69 | _('allowed with automatic account activation')), ] | |
66 |
|
70 | |||
67 | self.create_choices = [('hg.create.none', _('Disabled')), |
|
71 | self.create_choices = [('hg.create.none', _('Disabled')), | |
68 | ('hg.create.repository', _('Enabled'))]
|
72 | ('hg.create.repository', _('Enabled'))] | |
69 |
|
73 | |||
70 |
|
74 | |||
71 | def index(self, format='html'): |
|
75 | def index(self, format='html'): | |
72 | """GET /permissions: All items in the collection""" |
|
76 | """GET /permissions: All items in the collection""" | |
73 | # url('permissions') |
|
77 | # url('permissions') | |
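The `__before__` hook above stacks `@LoginRequired()` and `@HasPermissionAllDecorator('hg.admin')` so every action in the controller is gated before it runs. A minimal sketch of how such a permission decorator can work; the names (`requires_permission`, the `user` dict, the `'redirect:login'` marker) are illustrative stand-ins, not RhodeCode's actual API:

```python
def requires_permission(*perms):
    """Hypothetical analogue of HasPermissionAllDecorator: the wrapped
    action only runs when the user holds every listed permission."""
    def decorator(func):
        def wrapper(user, *args, **kwargs):
            if all(p in user.get('permissions', ()) for p in perms):
                return func(user, *args, **kwargs)
            # In the real controller this would redirect or abort(403).
            return 'redirect:login'
        return wrapper
    return decorator

@requires_permission('hg.admin')
def admin_index(user):
    return 'admin page'
```

Stacking several such decorators, as the controller does, simply composes the checks from the bottom decorator up.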
@@ -88,38 +92,39 b' class PermissionsController(BaseControll' | |||||
88 | # h.form(url('permission', id=ID), |
|
92 | # h.form(url('permission', id=ID), | |
89 | # method='put') |
|
93 | # method='put') | |
90 | # url('permission', id=ID) |
|
94 | # url('permission', id=ID) | |
91 |
|
95 | |||
92 | permission_model = PermissionModel() |
|
96 | permission_model = PermissionModel() | |
93 |
|
97 | |||
94 | _form = DefaultPermissionsForm([x[0] for x in self.perms_choices], |
|
98 | _form = DefaultPermissionsForm([x[0] for x in self.perms_choices], | |
95 | [x[0] for x in self.register_choices], |
|
99 | [x[0] for x in self.register_choices], | |
96 | [x[0] for x in self.create_choices])() |
|
100 | [x[0] for x in self.create_choices])() | |
97 |
|
101 | |||
98 | try: |
|
102 | try: | |
99 | form_result = _form.to_python(dict(request.POST)) |
|
103 | form_result = _form.to_python(dict(request.POST)) | |
100 | form_result.update({'perm_user_name':id}) |
|
104 | form_result.update({'perm_user_name':id}) | |
101 | permission_model.update(form_result) |
|
105 | permission_model.update(form_result) | |
102 | h.flash(_('Default permissions updated succesfully'), |
|
106 | h.flash(_('Default permissions updated successfully'), | |
103 | category='success') |
|
107 | category='success') | |
104 |
|
108 | |||
105 | except formencode.Invalid, errors: |
|
109 | except formencode.Invalid, errors: | |
106 | c.perms_choices = self.perms_choices |
|
110 | c.perms_choices = self.perms_choices | |
107 | c.register_choices = self.register_choices |
|
111 | c.register_choices = self.register_choices | |
108 | c.create_choices = self.create_choices |
|
112 | c.create_choices = self.create_choices | |
109 |
|
113 | defaults = errors.value | ||
|
114 | ||||
110 | return htmlfill.render( |
|
115 | return htmlfill.render( | |
111 | render('admin/permissions/permissions.html'), |
|
116 | render('admin/permissions/permissions.html'), | |
112 | defaults=
|
117 | defaults=defaults, | |
113 | errors=errors.error_dict or {}, |
|
118 | errors=errors.error_dict or {}, | |
114 | prefix_error=False, |
|
119 | prefix_error=False, | |
115 | encoding="UTF-8")
|
120 | encoding="UTF-8") | |
116 | except Exception: |
|
121 | except Exception: | |
117 | log.error(traceback.format_exc()) |
|
122 | log.error(traceback.format_exc()) | |
118 | h.flash(_('error occured during update of permissions'), |
|
123 | h.flash(_('error occured during update of permissions'), | |
119 | category='error') |
|
124 | category='error') | |
120 |
|
125 | |||
121 | return redirect(url('edit_permission', id=id)) |
|
126 | return redirect(url('edit_permission', id=id)) | |
122 |
|
127 | |||
123 |
|
128 | |||
124 |
|
129 | |||
125 | def delete(self, id): |
|
130 | def delete(self, id): | |
@@ -141,23 +146,26 b' class PermissionsController(BaseControll' | |||||
141 | c.perms_choices = self.perms_choices |
|
146 | c.perms_choices = self.perms_choices | |
142 | c.register_choices = self.register_choices |
|
147 | c.register_choices = self.register_choices | |
143 | c.create_choices = self.create_choices |
|
148 | c.create_choices = self.create_choices | |
144 |
|
149 | |||
145 | if id == 'default': |
|
150 | if id == 'default': | |
146 | defaults = {'_method':'put'} |
|
151 | default_user = UserModel().get_by_username('default') | |
147 | for p in UserModel().get_default().user_perms: |
|
152 | defaults = {'_method':'put', | |
|
153 | 'anonymous':default_user.active} | |||
|
154 | ||||
|
155 | for p in default_user.user_perms: | |||
148 | if p.permission.permission_name.startswith('repository.'): |
|
156 | if p.permission.permission_name.startswith('repository.'): | |
149 | defaults['default_perm'] = p.permission.permission_name
|
157 | defaults['default_perm'] = p.permission.permission_name | |
150 |
|
158 | |||
151 | if p.permission.permission_name.startswith('hg.register.'): |
|
159 | if p.permission.permission_name.startswith('hg.register.'): | |
152 | defaults['default_register'] = p.permission.permission_name |
|
160 | defaults['default_register'] = p.permission.permission_name | |
153 |
|
161 | |||
154 | if p.permission.permission_name.startswith('hg.create.'): |
|
162 | if p.permission.permission_name.startswith('hg.create.'): | |
155 | defaults['default_create'] = p.permission.permission_name |
|
163 | defaults['default_create'] = p.permission.permission_name | |
156 |
|
164 | |||
157 | return htmlfill.render( |
|
165 | return htmlfill.render( | |
158 | render('admin/permissions/permissions.html'), |
|
166 | render('admin/permissions/permissions.html'), | |
159 | defaults=defaults, |
|
167 | defaults=defaults, | |
160 | encoding="UTF-8", |
|
168 | encoding="UTF-8", | |
161 | force_defaults=True,)
|
169 | force_defaults=True,) | |
162 | else: |
|
170 | else: | |
163 | return redirect(url('admin_home')) |
|
171 | return redirect(url('admin_home')) |
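The `update` action above shows the standard Pylons/formencode flow: run the POST body through a schema's `to_python`, flash on success, and on `formencode.Invalid` re-render the form pre-filled with `errors.value` so user input is not lost. A stdlib-only sketch of that validate-or-re-render flow; the `Invalid` class and `validate_perm_form` here are simplified stand-ins for formencode's schema machinery:

```python
class Invalid(Exception):
    """Stand-in for formencode.Invalid: carries the submitted value
    and a dict of per-field error messages."""
    def __init__(self, value, error_dict):
        Exception.__init__(self, 'validation failed')
        self.value = value
        self.error_dict = error_dict

def validate_perm_form(post, allowed_perms):
    """Accept the form only if the chosen permission is a known choice."""
    perm = post.get('default_perm')
    if perm not in allowed_perms:
        raise Invalid(post, {'default_perm': 'invalid permission choice'})
    return {'default_perm': perm}

def update(post, allowed_perms):
    """Mirror of the controller flow: flash on success, otherwise
    return the data needed to re-render the form with errors."""
    try:
        form_result = validate_perm_form(post, allowed_perms)
        return ('flash', 'Default permissions updated successfully', form_result)
    except Invalid as errors:
        # Re-render path: errors.value pre-fills the form fields.
        return ('render', errors.value, errors.error_dict)
```

In the real controller the `'render'` branch corresponds to `htmlfill.render(...)` with `defaults=errors.value` and `errors=errors.error_dict`.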
@@ -1,8 +1,15 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # repos controller for pylons |
|
3 | rhodecode.controllers.admin.repos | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 | # |
|
5 | ||
|
6 | Admin controller for RhodeCode | |||
|
7 | ||||
|
8 | :created_on: Apr 7, 2010 | |||
|
9 | :author: marcink | |||
|
10 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
11 | :license: GPLv3, see COPYING for more details. | |||
|
12 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
13 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
14 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
15 | # as published by the Free Software Foundation; version 2 | |
@@ -17,17 +24,18 b'' | |||||
17 | # along with this program; if not, write to the Free Software |
|
24 | # along with this program; if not, write to the Free Software | |
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
25 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
26 | # MA 02110-1301, USA. | |
20 | """ |
|
27 | ||
21 | Created on April 7, 2010 |
|
28 | import logging | |
22 | admin controller for pylons |
|
29 | import traceback | |
23 | @author: marcink |
|
30 | import formencode | |
24 | """ |
|
31 | from operator import itemgetter | |
25 | from formencode import htmlfill |
|
32 | from formencode import htmlfill | |
26 | from operator import itemgetter |
|
33 | ||
27 | from paste.httpexceptions import HTTPInternalServerError |
|
34 | from paste.httpexceptions import HTTPInternalServerError | |
28 | from pylons import request, response, session, tmpl_context as c, url |
|
35 | from pylons import request, response, session, tmpl_context as c, url | |
29 | from pylons.controllers.util import abort, redirect |
|
36 | from pylons.controllers.util import abort, redirect | |
30 | from pylons.i18n.translation import _ |
|
37 | from pylons.i18n.translation import _ | |
|
38 | ||||
31 | from rhodecode.lib import helpers as h |
|
39 | from rhodecode.lib import helpers as h | |
32 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator, \ |
|
40 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator, \ | |
33 | HasPermissionAnyDecorator |
|
41 | HasPermissionAnyDecorator | |
@@ -35,11 +43,9 b' from rhodecode.lib.base import BaseContr' | |||||
35 | from rhodecode.lib.utils import invalidate_cache, action_logger |
|
43 | from rhodecode.lib.utils import invalidate_cache, action_logger | |
36 | from rhodecode.model.db import User |
|
44 | from rhodecode.model.db import User | |
37 | from rhodecode.model.forms import RepoForm |
|
45 | from rhodecode.model.forms import RepoForm | |
38 | from rhodecode.model.
|
46 | from rhodecode.model.scm import ScmModel | |
39 | from rhodecode.model.repo
|
47 | from rhodecode.model.repo import RepoModel | |
40 | import formencode |
|
48 | ||
41 | import logging |
|
|||
42 | import traceback |
|
|||
43 |
|
49 | |||
44 | log = logging.getLogger(__name__) |
|
50 | log = logging.getLogger(__name__) | |
45 |
|
51 | |||
@@ -48,22 +54,22 b' class ReposController(BaseController):' | |||||
48 | # To properly map this controller, ensure your config/routing.py |
|
54 | # To properly map this controller, ensure your config/routing.py | |
49 | # file has a resource setup: |
|
55 | # file has a resource setup: | |
50 | # map.resource('repo', 'repos') |
|
56 | # map.resource('repo', 'repos') | |
51 |
|
57 | |||
52 | @LoginRequired() |
|
58 | @LoginRequired() | |
53 | @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository') |
|
59 | @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository') | |
54 | def __before__(self): |
|
60 | def __before__(self): | |
55 | c.admin_user = session.get('admin_user') |
|
61 | c.admin_user = session.get('admin_user') | |
56 | c.admin_username = session.get('admin_username') |
|
62 | c.admin_username = session.get('admin_username') | |
57 | super(ReposController, self).__before__() |
|
63 | super(ReposController, self).__before__() | |
58 |
|
64 | |||
59 | @HasPermissionAllDecorator('hg.admin')
|
65 | @HasPermissionAllDecorator('hg.admin') | |
60 | def index(self, format='html'): |
|
66 | def index(self, format='html'): | |
61 | """GET /repos: All items in the collection""" |
|
67 | """GET /repos: All items in the collection""" | |
62 | # url('repos') |
|
68 | # url('repos') | |
63 | cached_repo_list =
|
69 | cached_repo_list = ScmModel().get_repos() | |
64 | c.repos_list = sorted(cached_repo_list, key=itemgetter('name_sort')) |
|
70 | c.repos_list = sorted(cached_repo_list, key=itemgetter('name_sort')) | |
65 | return render('admin/repos/repos.html') |
|
71 | return render('admin/repos/repos.html') | |
66 |
|
72 | |||
67 | @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository') |
|
73 | @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository') | |
68 | def create(self): |
|
74 | def create(self): | |
69 | """POST /repos: Create a new item""" |
|
75 | """POST /repos: Create a new item""" | |
@@ -74,7 +80,6 b' class ReposController(BaseController):' | |||||
74 | try: |
|
80 | try: | |
75 | form_result = _form.to_python(dict(request.POST)) |
|
81 | form_result = _form.to_python(dict(request.POST)) | |
76 | repo_model.create(form_result, c.rhodecode_user) |
|
82 | repo_model.create(form_result, c.rhodecode_user) | |
77 | invalidate_cache('cached_repo_list') |
|
|||
78 | h.flash(_('created repository %s') % form_result['repo_name'], |
|
83 | h.flash(_('created repository %s') % form_result['repo_name'], | |
79 | category='success') |
|
84 | category='success') | |
80 |
|
85 | |||
@@ -83,22 +88,22 b' class ReposController(BaseController):' | |||||
83 | form_result['repo_name'], '', self.sa) |
|
88 | form_result['repo_name'], '', self.sa) | |
84 | else: |
|
89 | else: | |
85 | action_logger(self.rhodecode_user, 'admin_created_repo', |
|
90 | action_logger(self.rhodecode_user, 'admin_created_repo', | |
86 | form_result['repo_name'], '', self.sa)
|
91 | form_result['repo_name'], '', self.sa) | |
87 |
|
92 | |||
88 | except formencode.Invalid, errors: |
|
93 | except formencode.Invalid, errors: | |
89 | c.new_repo = errors.value['repo_name'] |
|
94 | c.new_repo = errors.value['repo_name'] | |
90 |
|
95 | |||
91 | if request.POST.get('user_created'): |
|
96 | if request.POST.get('user_created'): | |
92 | r = render('admin/repos/repo_add_create_repository.html') |
|
97 | r = render('admin/repos/repo_add_create_repository.html') | |
93 | else:
|
98 | else: | |
94 | r = render('admin/repos/repo_add.html') |
|
99 | r = render('admin/repos/repo_add.html') | |
95 |
|
100 | |||
96 | return htmlfill.render( |
|
101 | return htmlfill.render( | |
97 | r, |
|
102 | r, | |
98 | defaults=errors.value, |
|
103 | defaults=errors.value, | |
99 | errors=errors.error_dict or {}, |
|
104 | errors=errors.error_dict or {}, | |
100 | prefix_error=False, |
|
105 | prefix_error=False, | |
101 | encoding="UTF-8")
|
106 | encoding="UTF-8") | |
102 |
|
107 | |||
103 | except Exception: |
|
108 | except Exception: | |
104 | log.error(traceback.format_exc()) |
|
109 | log.error(traceback.format_exc()) | |
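The error branches in this controller all follow one rule: log the full traceback for the operator via `log.error(traceback.format_exc())`, then show the user only a short flashed message. That split can be sketched as a small helper (my extraction for illustration, not a function present in the diff):

```python
import logging
import traceback

log = logging.getLogger(__name__)

def safe_call(action, on_error):
    """Run an action; on any failure log the full traceback for the
    admin and hand the user a generic fallback instead of a stack trace."""
    try:
        return action()
    except Exception:
        # The traceback goes to the server log only.
        log.error(traceback.format_exc())
        return on_error()
```

`safe_call(lambda: repo_model.update(...), lambda: flash_and_redirect())` would reproduce the controller's try/except shape.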
@@ -106,9 +111,9 b' class ReposController(BaseController):' | |||||
106 | % form_result.get('repo_name') |
|
111 | % form_result.get('repo_name') | |
107 | h.flash(msg, category='error') |
|
112 | h.flash(msg, category='error') | |
108 | if request.POST.get('user_created'): |
|
113 | if request.POST.get('user_created'): | |
109 | return redirect(url('
|
114 | return redirect(url('home')) | |
110 | return redirect(url('repos')) |
|
115 | return redirect(url('repos')) | |
111 |
|
116 | |||
112 | @HasPermissionAllDecorator('hg.admin') |
|
117 | @HasPermissionAllDecorator('hg.admin') | |
113 | def new(self, format='html'): |
|
118 | def new(self, format='html'): | |
114 | """GET /repos/new: Form to create a new item""" |
|
119 | """GET /repos/new: Form to create a new item""" | |
@@ -116,7 +121,7 b' class ReposController(BaseController):' | |||||
116 | c.new_repo = h.repo_name_slug(new_repo) |
|
121 | c.new_repo = h.repo_name_slug(new_repo) | |
117 |
|
122 | |||
118 | return render('admin/repos/repo_add.html') |
|
123 | return render('admin/repos/repo_add.html') | |
119 |
|
124 | |||
120 | @HasPermissionAllDecorator('hg.admin') |
|
125 | @HasPermissionAllDecorator('hg.admin') | |
121 | def update(self, repo_name): |
|
126 | def update(self, repo_name): | |
122 | """PUT /repos/repo_name: Update an existing item""" |
|
127 | """PUT /repos/repo_name: Update an existing item""" | |
@@ -129,16 +134,33 b' class ReposController(BaseController):' | |||||
129 | repo_model = RepoModel() |
|
134 | repo_model = RepoModel() | |
130 | changed_name = repo_name |
|
135 | changed_name = repo_name | |
131 | _form = RepoForm(edit=True, old_data={'repo_name':repo_name})() |
|
136 | _form = RepoForm(edit=True, old_data={'repo_name':repo_name})() | |
132 |
|
137 | |||
133 | try: |
|
138 | try: | |
134 | form_result = _form.to_python(dict(request.POST)) |
|
139 | form_result = _form.to_python(dict(request.POST)) | |
135 | repo_model.update(repo_name, form_result) |
|
140 | repo_model.update(repo_name, form_result) | |
136 | invalidate_cache('
|
141 | invalidate_cache('get_repo_cached_%s' % repo_name) | |
137 | h.flash(_('Repository %s updated succesfully' % repo_name), |
|
142 | h.flash(_('Repository %s updated successfully' % repo_name), | |
138 | category='success') |
|
143 | category='success') | |
139 | changed_name = form_result['repo_name'] |
|
144 | changed_name = form_result['repo_name'] | |
|
145 | action_logger(self.rhodecode_user, 'admin_updated_repo', | |||
|
146 | changed_name, '', self.sa) | |||
|
147 | ||||
140 | except formencode.Invalid, errors: |
|
148 | except formencode.Invalid, errors: | |
141 | c.repo_info = repo_model.get(repo_name) |
|
149 | c.repo_info = repo_model.get_by_repo_name(repo_name) | |
|
150 | if c.repo_info.stats: | |||
|
151 | last_rev = c.repo_info.stats.stat_on_revision | |||
|
152 | else: | |||
|
153 | last_rev = 0 | |||
|
154 | c.stats_revision = last_rev | |||
|
155 | r = ScmModel().get(repo_name) | |||
|
156 | c.repo_last_rev = r.revisions[-1] if r.revisions else 0 | |||
|
157 | ||||
|
158 | if last_rev == 0: | |||
|
159 | c.stats_percentage = 0 | |||
|
160 | else: | |||
|
161 | c.stats_percentage = '%.2f' % ((float((last_rev)) / | |||
|
162 | c.repo_last_rev) * 100) | |||
|
163 | ||||
142 | c.users_array = repo_model.get_users_js() |
|
164 | c.users_array = repo_model.get_users_js() | |
143 | errors.value.update({'user':c.repo_info.user.username}) |
|
165 | errors.value.update({'user':c.repo_info.user.username}) | |
144 | return htmlfill.render( |
|
166 | return htmlfill.render( | |
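Both the `update` error path above and the `edit` action later compute the same figure: the revision the statistics gatherer last processed (`stats.stat_on_revision`), expressed as a percentage of the repository's tip revision, with a guard for the zero case. Extracted as a standalone helper for clarity (my refactoring, not a function in the diff):

```python
def stats_percentage(stat_on_revision, repo_last_rev):
    """Percentage of revisions already processed by the stats gatherer.

    Mirrors the controller logic: returns 0 when nothing was processed
    yet, otherwise a string formatted to two decimal places.
    """
    if stat_on_revision == 0:
        return 0
    return '%.2f' % ((float(stat_on_revision) / repo_last_rev) * 100)
```

Note the controller sets `repo_last_rev` to 0 for an empty repository, so the `stat_on_revision == 0` guard is what keeps the division safe in practice.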
@@ -147,14 +169,14 b' class ReposController(BaseController):' | |||||
147 | errors=errors.error_dict or {}, |
|
169 | errors=errors.error_dict or {}, | |
148 | prefix_error=False, |
|
170 | prefix_error=False, | |
149 | encoding="UTF-8") |
|
171 | encoding="UTF-8") | |
150 |
|
172 | |||
151 | except Exception: |
|
173 | except Exception: | |
152 | log.error(traceback.format_exc()) |
|
174 | log.error(traceback.format_exc()) | |
153 | h.flash(_('error occured during update of repository %s') \ |
|
175 | h.flash(_('error occurred during update of repository %s') \ | |
154 | % repo_name, category='error') |
|
176 | % repo_name, category='error') | |
155 |
|
177 | |||
156 | return redirect(url('edit_repo', repo_name=changed_name)) |
|
178 | return redirect(url('edit_repo', repo_name=changed_name)) | |
157 |
|
179 | |||
158 | @HasPermissionAllDecorator('hg.admin') |
|
180 | @HasPermissionAllDecorator('hg.admin') | |
159 | def delete(self, repo_name): |
|
181 | def delete(self, repo_name): | |
160 | """DELETE /repos/repo_name: Delete an existing item""" |
|
182 | """DELETE /repos/repo_name: Delete an existing item""" | |
@@ -164,82 +186,128 b' class ReposController(BaseController):' | |||||
164 | # h.form(url('repo', repo_name=ID), |
|
186 | # h.form(url('repo', repo_name=ID), | |
165 | # method='delete') |
|
187 | # method='delete') | |
166 | # url('repo', repo_name=ID) |
|
188 | # url('repo', repo_name=ID) | |
167 |
|
189 | |||
168 | repo_model = RepoModel() |
|
190 | repo_model = RepoModel() | |
169 | repo = repo_model.get(repo_name) |
|
191 | repo = repo_model.get_by_repo_name(repo_name) | |
170 | if not repo: |
|
192 | if not repo: | |
171 | h.flash(_('%s repository is not mapped to db perhaps'
|
193 | h.flash(_('%s repository is not mapped to db perhaps' | |
172 | ' it was moved or renamed from the filesystem' |
|
194 | ' it was moved or renamed from the filesystem' | |
173 | ' please run the application again' |
|
195 | ' please run the application again' | |
174 | ' in order to rescan repositories') % repo_name, |
|
196 | ' in order to rescan repositories') % repo_name, | |
175 | category='error') |
|
197 | category='error') | |
176 |
|
198 | |||
177 | return redirect(url('repos')) |
|
199 | return redirect(url('repos')) | |
178 | try: |
|
200 | try: | |
179 | action_logger(self.rhodecode_user, 'admin_deleted_repo', |
|
201 | action_logger(self.rhodecode_user, 'admin_deleted_repo', | |
180 | repo_name, '', self.sa) |
|
202 | repo_name, '', self.sa) | |
181 | repo_model.delete(repo)
|
203 | repo_model.delete(repo) | |
182 | invalidate_cache('
|
204 | invalidate_cache('get_repo_cached_%s' % repo_name) | |
183 | h.flash(_('deleted repository %s') % repo_name, category='success') |
|
205 | h.flash(_('deleted repository %s') % repo_name, category='success') | |
184 |
|
206 | |||
185 | except Exception, e: |
|
207 | except Exception, e: | |
186 | log.error(traceback.format_exc()) |
|
208 | log.error(traceback.format_exc()) | |
187 | h.flash(_('An error occured during deletion of %s') % repo_name, |
|
209 | h.flash(_('An error occured during deletion of %s') % repo_name, | |
188 | category='error') |
|
210 | category='error') | |
189 |
|
211 | |||
190 | return redirect(url('repos')) |
|
212 | return redirect(url('repos')) | |
191 |
|
213 | |||
192 | @HasPermissionAllDecorator('hg.admin')
|
214 | @HasPermissionAllDecorator('hg.admin') | |
193 | def delete_perm_user(self, repo_name): |
|
215 | def delete_perm_user(self, repo_name): | |
194 | """ |
|
216 | """ | |
195 | DELETE an existing repository permission user |
|
217 | DELETE an existing repository permission user | |
196 | :param repo_name: |
|
218 | :param repo_name: | |
197 | """ |
|
219 | """ | |
198 |
|
220 | |||
199 | try: |
|
221 | try: | |
200 | repo_model = RepoModel() |
|
222 | repo_model = RepoModel() | |
201 | repo_model.delete_perm_user(request.POST, repo_name)
|
223 | repo_model.delete_perm_user(request.POST, repo_name) | |
202 | except Exception, e: |
|
224 | except Exception, e: | |
203 | h.flash(_('An error occured during deletion of repository user'), |
|
225 | h.flash(_('An error occured during deletion of repository user'), | |
204 | category='error') |
|
226 | category='error') | |
205 | raise HTTPInternalServerError() |
|
227 | raise HTTPInternalServerError() | |
206 |
|
228 | |||
207 | @HasPermissionAllDecorator('hg.admin')
|
229 | @HasPermissionAllDecorator('hg.admin') | |
|
230 | def repo_stats(self, repo_name): | |||
|
231 | """ | |||
|
232 | DELETE an existing repository statistics | |||
|
233 | :param repo_name: | |||
|
234 | """ | |||
|
235 | ||||
|
236 | try: | |||
|
237 | repo_model = RepoModel() | |||
|
238 | repo_model.delete_stats(repo_name) | |||
|
239 | except Exception, e: | |||
|
240 | h.flash(_('An error occured during deletion of repository stats'), | |||
|
241 | category='error') | |||
|
242 | return redirect(url('edit_repo', repo_name=repo_name)) | |||
|
243 | ||||
|
244 | @HasPermissionAllDecorator('hg.admin') | |||
|
245 | def repo_cache(self, repo_name): | |||
|
246 | """ | |||
|
247 | INVALIDATE exisitings repository cache | |||
|
248 | :param repo_name: | |||
|
249 | """ | |||
|
250 | ||||
|
251 | try: | |||
|
252 | ScmModel().mark_for_invalidation(repo_name) | |||
|
253 | except Exception, e: | |||
|
254 | h.flash(_('An error occurred during cache invalidation'), | |||
|
255 | category='error') | |||
|
256 | return redirect(url('edit_repo', repo_name=repo_name)) | |||
|
257 | ||||
|
258 | @HasPermissionAllDecorator('hg.admin') | |||
208 | def show(self, repo_name, format='html'): |
|
259 | def show(self, repo_name, format='html'): | |
209 | """GET /repos/repo_name: Show a specific item""" |
|
260 | """GET /repos/repo_name: Show a specific item""" | |
210 | # url('repo', repo_name=ID) |
|
261 | # url('repo', repo_name=ID) | |
211 |
|
262 | |||
212 | @HasPermissionAllDecorator('hg.admin')
|
263 | @HasPermissionAllDecorator('hg.admin') | |
213 | def edit(self, repo_name, format='html'): |
|
264 | def edit(self, repo_name, format='html'): | |
214 | """GET /repos/repo_name/edit: Form to edit an existing item""" |
|
265 | """GET /repos/repo_name/edit: Form to edit an existing item""" | |
215 | # url('edit_repo', repo_name=ID) |
|
266 | # url('edit_repo', repo_name=ID) | |
216 | repo_model = RepoModel() |
|
267 | repo_model = RepoModel() | |
217 |
|
|
268 | r = ScmModel().get(repo_name) | |
218 | if not repo: |
|
269 | c.repo_info = repo_model.get_by_repo_name(repo_name) | |
219 | h.flash(_('%s repository is not mapped to db perhaps' |
|
270 | ||
|
271 | if c.repo_info is None: | |||
|
272 | h.flash(_('%s repository is not mapped to db perhaps' | |||
220 | ' it was created or renamed from the filesystem' |
|
273 | ' it was created or renamed from the filesystem' | |
221 | ' please run the application again' |
|
274 | ' please run the application again' | |
222 | ' in order to rescan repositories') % repo_name, |
|
275 | ' in order to rescan repositories') % repo_name, | |
223 | category='error') |
|
276 | category='error') | |
224 |
|
277 | |||
225 | return redirect(url('repos'))
|
278 | return redirect(url('repos')) | |
226 | defaults = c.repo_info.__dict__ |
|
279 | ||
|
280 | if c.repo_info.stats: | |||
|
281 | last_rev = c.repo_info.stats.stat_on_revision | |||
|
282 | else: | |||
|
283 | last_rev = 0 | |||
|
284 | c.stats_revision = last_rev | |||
|
285 | ||||
|
286 | c.repo_last_rev = r.revisions[-1] if r.revisions else 0 | |||
|
287 | ||||
|
288 | if last_rev == 0: | |||
|
289 | c.stats_percentage = 0 | |||
|
290 | else: | |||
|
291 | c.stats_percentage = '%.2f' % ((float((last_rev)) / | |||
|
292 | c.repo_last_rev) * 100) | |||
|
293 | ||||
|
294 | defaults = c.repo_info.get_dict() | |||
227 | if c.repo_info.user: |
|
295 | if c.repo_info.user: | |
228 | defaults.update({'user':c.repo_info.user.username}) |
|
296 | defaults.update({'user':c.repo_info.user.username}) | |
229 | else: |
|
297 | else: | |
230 | replacement_user = self.sa.query(User)\ |
|
298 | replacement_user = self.sa.query(User)\ | |
231 | .filter(User.admin == True).first().username |
|
299 | .filter(User.admin == True).first().username | |
232 | defaults.update({'user':replacement_user}) |
|
300 | defaults.update({'user':replacement_user}) | |
233 |
|
301 | |||
234 | c.users_array = repo_model.get_users_js() |
|
302 | c.users_array = repo_model.get_users_js() | |
235 |
|
303 | |||
236 | for p in c.repo_info.repo_to_perm: |
|
304 | for p in c.repo_info.repo_to_perm: | |
237 | defaults.update({'perm_%s' % p.user.username:
|
305 | defaults.update({'perm_%s' % p.user.username: | |
238 | p.permission.permission_name}) |
|
306 | p.permission.permission_name}) | |
239 |
|
307 | |||
240 | return htmlfill.render( |
|
308 | return htmlfill.render( | |
241 | render('admin/repos/repo_edit.html'), |
|
309 | render('admin/repos/repo_edit.html'), | |
242 | defaults=defaults, |
|
310 | defaults=defaults, | |
243 | encoding="UTF-8", |
|
311 | encoding="UTF-8", | |
244 | force_defaults=False |
|
312 | force_defaults=False | |
245 | )
|
313 | ) |
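A recurring pattern in the repos controller above is invalidating a per-repository cache entry after any write: `invalidate_cache('get_repo_cached_%s' % repo_name)` after update/delete, and `ScmModel().mark_for_invalidation(repo_name)` in the new `repo_cache` action. A minimal sketch of that keyed-invalidation idea with a plain dict cache; the real implementation sits on Beaker cache regions, so the class and method names here are illustrative only:

```python
class KeyedCache:
    """Toy cache: values live under string keys, so a write path can
    drop exactly the entries affected by a change."""
    def __init__(self):
        self._store = {}

    def get_or_compute(self, key, compute):
        """Return the cached value, computing and storing it on a miss."""
        if key not in self._store:
            self._store[key] = compute()
        return self._store[key]

    def invalidate(self, key):
        # Analogue of invalidate_cache('get_repo_cached_%s' % repo_name):
        # the next read recomputes from the source of truth.
        self._store.pop(key, None)
```

Keying the cache by repository name is what lets the controller refresh one repository's cached data without flushing every other repository's entries.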
@@ -1,8 +1,14 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # settings controller for pylons |
|
3 | package.rhodecode.controllers.admin.settings | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~ | |
5 | # |
|
5 | settings controller for rhodecode admin | |
|
6 | ||||
|
7 | :created_on: Jul 14, 2010 | |||
|
8 | :author: marcink | |||
|
9 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
10 | :license: GPLv3, see COPYING for more details. | |||
|
11 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
12 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
13 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
14 | # as published by the Free Software Foundation; version 2 | |
@@ -17,11 +23,7 b'' | |||||
17 | # along with this program; if not, write to the Free Software |
|
23 | # along with this program; if not, write to the Free Software | |
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
24 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
25 | # MA 02110-1301, USA. | |
20 | """ |
|
26 | ||
21 | Created on July 14, 2010 |
|
|||
22 | settings controller for pylons |
|
|||
23 | @author: marcink |
|
|||
24 | """ |
|
|||
25 | from formencode import htmlfill |
|
27 | from formencode import htmlfill | |
26 | from pylons import request, session, tmpl_context as c, url, app_globals as g, \ |
|
28 | from pylons import request, session, tmpl_context as c, url, app_globals as g, \ | |
27 | config |
|
29 | config | |
@@ -29,20 +31,22 b' from pylons.controllers.util import abor' | |||||
29 | from pylons.i18n.translation import _ |
|
31 | from pylons.i18n.translation import _ | |
30 | from rhodecode.lib import helpers as h |
|
32 | from rhodecode.lib import helpers as h | |
31 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator, \ |
|
33 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator, \ | |
32 | HasPermissionAnyDecorator |
|
34 | HasPermissionAnyDecorator, NotAnonymous | |
33 | from rhodecode.lib.base import BaseController, render |
|
35 | from rhodecode.lib.base import BaseController, render | |
|
36 | from rhodecode.lib.celerylib import tasks, run_task | |||
34 | from rhodecode.lib.utils import repo2db_mapper, invalidate_cache, \ |
|
37 | from rhodecode.lib.utils import repo2db_mapper, invalidate_cache, \ | |
35 | set_rhodecode_config, get_hg_settings, get_hg_ui_settings, make_ui |
|
38 | set_rhodecode_config | |
36 | from rhodecode.model.db import
|
39 | from rhodecode.model.db import RhodeCodeUi, Repository | |
37 | from rhodecode.model.forms import UserForm, ApplicationSettingsForm, \ |
|
40 | from rhodecode.model.forms import UserForm, ApplicationSettingsForm, \ | |
38 | ApplicationUiSettingsForm |
|
41 | ApplicationUiSettingsForm | |
39 | from rhodecode.model.
|
42 | from rhodecode.model.scm import ScmModel | |
40 | from rhodecode.model.
|
43 | from rhodecode.model.settings import SettingsModel | |
41 | from rhodecode.l
|
44 | from rhodecode.model.user import UserModel | |
|
45 | from sqlalchemy import func | |||
42 | import formencode |
|
46 | import formencode | |
43 | import logging |
|
47 | import logging | |
44 | import traceback |
|
48 | import traceback | |
45 |
|
49 | |||
46 | log = logging.getLogger(__name__) |
|
50 | log = logging.getLogger(__name__) | |
47 |
|
51 | |||
48 |
|
52 | |||
@@ -59,32 +63,32 b' class SettingsController(BaseController)'
         c.admin_user = session.get('admin_user')
         c.admin_username = session.get('admin_username')
         super(SettingsController, self).__before__()


     @HasPermissionAllDecorator('hg.admin')
     def index(self, format='html'):
         """GET /admin/settings: All items in the collection"""
         # url('admin_settings')

-        defaults =
-        defaults.update(get_hg_ui_settings())
+        defaults = SettingsModel().get_app_settings()
+        defaults.update(self.get_hg_ui_settings())
         return htmlfill.render(
             render('admin/settings/settings.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False
         )

     @HasPermissionAllDecorator('hg.admin')
     def create(self):
         """POST /admin/settings: Create a new item"""
         # url('admin_settings')

     @HasPermissionAllDecorator('hg.admin')
     def new(self, format='html'):
         """GET /admin/settings/new: Form to create a new item"""
         # url('admin_new_setting')

     @HasPermissionAllDecorator('hg.admin')
     def update(self, setting_id):
         """PUT /admin/settings/setting_id: Update an existing item"""
@@ -98,47 +102,48 b' class SettingsController(BaseController)'
             rm_obsolete = request.POST.get('destroy', False)
             log.debug('Rescanning directories with destroy=%s', rm_obsolete)

-            initial =
+            initial = ScmModel().repo_scan(g.paths[0][1], g.baseui)
+            for repo_name in initial.keys():
+                invalidate_cache('get_repo_cached_%s' % repo_name)
+
             repo2db_mapper(initial, rm_obsolete)
-            invalidate_cache('cached_repo_list')
             h.flash(_('Repositories successfully rescanned'), category='success')

         if setting_id == 'whoosh':
-            repo_location = get_hg_ui_settings()['paths_root_path']
+            repo_location = self.get_hg_ui_settings()['paths_root_path']
             full_index = request.POST.get('full_index', False)
             task = run_task(tasks.whoosh_index, repo_location, full_index)

             h.flash(_('Whoosh reindex task scheduled'), category='success')
         if setting_id == 'global':

             application_form = ApplicationSettingsForm()()
             try:
                 form_result = application_form.to_python(dict(request.POST))
+                settings_model = SettingsModel()
                 try:
-                    hgsettings1 = sel
-                    hgsettings1.app_settings_value = form_result['rhodecode_title']
-
-                    hgsettings2 = self.sa.query(RhodeCodeSettings)\
-                        .filter(RhodeCodeSettings.app_settings_name == 'realm').one()
-                    hgsettings2.app_settings_value = form_result['rhodecode_realm']
+                    hgsettings1 = settings_model.get('title')
+                    hgsettings1.app_settings_value = form_result['rhodecode_title']
+
+                    hgsettings2 = settings_model.get('realm')
+                    hgsettings2.app_settings_value = form_result['rhodecode_realm']

                     self.sa.add(hgsettings1)
                     self.sa.add(hgsettings2)
                     self.sa.commit()
                     set_rhodecode_config(config)
                     h.flash(_('Updated application settings'),
                             category='success')

                 except:
                     log.error(traceback.format_exc())
                     h.flash(_('error occurred during updating application settings'),
                             category='error')

                     self.sa.rollback()


             except formencode.Invalid, errors:
                 return htmlfill.render(
@@ -146,52 +151,60 b' class SettingsController(BaseController)'
                     defaults=errors.value,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8")

         if setting_id == 'mercurial':
             application_form = ApplicationUiSettingsForm()()
             try:
                 form_result = application_form.to_python(dict(request.POST))

                 try:

                     hgsettings1 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == 'push_ssl').one()
                     hgsettings1.ui_value = form_result['web_push_ssl']

                     hgsettings2 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == '/').one()
                     hgsettings2.ui_value = form_result['paths_root_path']


                     #HOOKS
                     hgsettings3 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == 'changegroup.update').one()
                     hgsettings3.ui_active = bool(form_result['hooks_changegroup_update'])

                     hgsettings4 = self.sa.query(RhodeCodeUi)\
                         .filter(RhodeCodeUi.ui_key == 'changegroup.repo_size').one()
                     hgsettings4.ui_active = bool(form_result['hooks_changegroup_repo_size'])

+                    hgsettings5 = self.sa.query(RhodeCodeUi)\
+                        .filter(RhodeCodeUi.ui_key == 'pretxnchangegroup.push_logger').one()
+                    hgsettings5.ui_active = bool(form_result['hooks_pretxnchangegroup_push_logger'])
+
+                    hgsettings6 = self.sa.query(RhodeCodeUi)\
+                        .filter(RhodeCodeUi.ui_key == 'preoutgoing.pull_logger').one()
+                    hgsettings6.ui_active = bool(form_result['hooks_preoutgoing_pull_logger'])
+
+
                     self.sa.add(hgsettings1)
                     self.sa.add(hgsettings2)
                     self.sa.add(hgsettings3)
                     self.sa.add(hgsettings4)
+                    self.sa.add(hgsettings5)
+                    self.sa.add(hgsettings6)
                     self.sa.commit()

                     h.flash(_('Updated mercurial settings'),
                             category='success')

                 except:
                     log.error(traceback.format_exc())
                     h.flash(_('error occurred during updating application settings'),
                             category='error')

                     self.sa.rollback()


             except formencode.Invalid, errors:
                 return htmlfill.render(
@@ -199,12 +212,12 b' class SettingsController(BaseController)'
                     defaults=errors.value,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8")



         return redirect(url('admin_settings'))

     @HasPermissionAllDecorator('hg.admin')
     def delete(self, setting_id):
         """DELETE /admin/settings/setting_id: Delete an existing item"""
@@ -214,41 +227,44 b' class SettingsController(BaseController)'
         # h.form(url('admin_setting', setting_id=ID),
         #           method='delete')
         # url('admin_setting', setting_id=ID)

     @HasPermissionAllDecorator('hg.admin')
     def show(self, setting_id, format='html'):
         """GET /admin/settings/setting_id: Show a specific item"""
         # url('admin_setting', setting_id=ID)

     @HasPermissionAllDecorator('hg.admin')
     def edit(self, setting_id, format='html'):
         """GET /admin/settings/setting_id/edit: Form to edit an existing item"""
         # url('admin_edit_setting', setting_id=ID)

+    @NotAnonymous()
     def my_account(self):
         """
         GET /_admin/my_account Displays info about my account
         """
         # url('admin_settings_my_account')
-        c.user = self.sa.query(User).get(c.rhodecode_user.user_id)
-        c.user_repos = []
-        for repo in c.cached_repo_list.values():
-            c.user_repos.append(repo)
+
+        c.user = UserModel().get(c.rhodecode_user.user_id, cache=False)
+        all_repos = self.sa.query(Repository)\
+            .filter(Repository.user_id == c.user.user_id)\
+            .order_by(func.lower(Repository.repo_name))\
+            .all()
+
+        c.user_repos = ScmModel().get_repos(all_repos)
+
         if c.user.username == 'default':
             h.flash(_("You can't edit this user since it's"
                       " crucial for entire application"), category='warning')
             return redirect(url('users'))

-        defaults = c.user.
+        defaults = c.user.get_dict()
         return htmlfill.render(
             render('admin/users/user_edit_my_account.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False
         )

     def my_account_update(self):
         """PUT /_admin/my_account_update: Update an existing item"""
@@ -266,15 +282,18 b' class SettingsController(BaseController)'
         try:
             form_result = _form.to_python(dict(request.POST))
             user_model.update_my_account(uid, form_result)
-            h.flash(_('Your account was updated succesfully'),
+            h.flash(_('Your account was updated successfully'),
                     category='success')

         except formencode.Invalid, errors:
-            c.user =
-            c.user_repos = []
-            for repo in c.cached_repo_list.values():
-                c.user_repos.append(repo)
+            c.user = user_model.get(c.rhodecode_user.user_id, cache=False)
+            c.user = UserModel().get(c.rhodecode_user.user_id, cache=False)
+            all_repos = self.sa.query(Repository)\
+                .filter(Repository.user_id == c.user.user_id)\
+                .order_by(func.lower(Repository.repo_name))\
+                .all()
+            c.user_repos = ScmModel().get_repos(all_repos)
+
             return htmlfill.render(
                 render('admin/users/user_edit_my_account.html'),
                 defaults=errors.value,
@@ -283,11 +302,12 b' class SettingsController(BaseController)'
                 encoding="UTF-8")
         except Exception:
             log.error(traceback.format_exc())
-            h.flash(_('error occured during update of user %s') \
+            h.flash(_('error occurred during update of user %s') \
                     % form_result.get('username'), category='error')

         return redirect(url('my_account'))

+    @NotAnonymous()
     @HasPermissionAnyDecorator('hg.admin', 'hg.create.repository')
     def create_repository(self):
         """GET /_admin/create_repository: Form to create a new item"""
@@ -295,4 +315,25 b' class SettingsController(BaseController)'
         c.new_repo = h.repo_name_slug(new_repo)

         return render('admin/repos/repo_add_create_repository.html')

+    def get_hg_ui_settings(self):
+        ret = self.sa.query(RhodeCodeUi).all()
+
+        if not ret:
+            raise Exception('Could not get application ui settings !')
+        settings = {}
+        for each in ret:
+            k = each.ui_key
+            v = each.ui_value
+            if k == '/':
+                k = 'root_path'
+
+            if k.find('.') != -1:
+                k = k.replace('.', '_')
+
+            if each.ui_section == 'hooks':
+                v = each.ui_active
+
+            settings[each.ui_section + '_' + k] = v
+
+        return settings
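
The `get_hg_ui_settings` helper added at the end of this hunk flattens `RhodeCodeUi` rows into a dict keyed `<section>_<key>`, renaming the bare `/` path key and de-dotting hook keys. A standalone sketch of that flattening logic, where `FakeUi` is a hypothetical stand-in for the SQLAlchemy model (not part of RhodeCode):

```python
# FakeUi mimics the attributes of the RhodeCodeUi model used by the helper.
class FakeUi(object):
    def __init__(self, section, key, value, active=True):
        self.ui_section = section
        self.ui_key = key
        self.ui_value = value
        self.ui_active = active

def flatten_ui_settings(rows):
    settings = {}
    for each in rows:
        k = each.ui_key
        v = each.ui_value
        if k == '/':
            # the repositories root path is stored under the bare '/' key
            k = 'root_path'
        if '.' in k:
            # hook keys like 'changegroup.update' become 'changegroup_update'
            k = k.replace('.', '_')
        if each.ui_section == 'hooks':
            # for hooks only the enabled flag matters, not the command string
            v = each.ui_active
        settings[each.ui_section + '_' + k] = v
    return settings

rows = [FakeUi('paths', '/', '/srv/repos'),
        FakeUi('web', 'push_ssl', 'true'),
        FakeUi('hooks', 'changegroup.update', 'hg update >&2', active=False)]
print(flatten_ui_settings(rows))
```

This produces keys such as `paths_root_path` and `hooks_changegroup_update`, matching the form field names used by `ApplicationUiSettingsForm` above.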
@@ -1,8 +1,15 b''
-#!/usr/bin/env python
-# encoding: utf-8
-# users controller for pylons
-# Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
-#
+# -*- coding: utf-8 -*-
+"""
+    rhodecode.controllers.admin.users
+    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+    Users crud controller for pylons
+
+    :created_on: Apr 4, 2010
+    :author: marcink
+    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+    :license: GPLv3, see COPYING for more details.
+"""
 # This program is free software; you can redistribute it and/or
 # modify it under the terms of the GNU General Public License
 # as published by the Free Software Foundation; version 2
@@ -17,26 +24,24 b''
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
 # MA 02110-1301, USA.
-from rhodecode.lib.utils import action_logger
-"""
-Created on April 4, 2010
-users controller for pylons
-@author: marcink
-"""
+
+import logging
+import traceback
+import formencode

 from formencode import htmlfill
 from pylons import request, session, tmpl_context as c, url
 from pylons.controllers.util import abort, redirect
 from pylons.i18n.translation import _
+
+from rhodecode.lib.exceptions import *
 from rhodecode.lib import helpers as h
 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
 from rhodecode.lib.base import BaseController, render
-from rhodecode.model.db import User, UserLog
+
+from rhodecode.model.db import User
 from rhodecode.model.forms import UserForm
-from rhodecode.model.user
-import formencode
-import logging
-import traceback
+from rhodecode.model.user import UserModel

 log = logging.getLogger(__name__)
@@ -45,26 +50,26 b' class UsersController(BaseController):'
     # To properly map this controller, ensure your config/routing.py
     # file has a resource setup:
     #     map.resource('user', 'users')

     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
     def __before__(self):
         c.admin_user = session.get('admin_user')
         c.admin_username = session.get('admin_username')
         super(UsersController, self).__before__()


     def index(self, format='html'):
         """GET /users: All items in the collection"""
         # url('users')

         c.users_list = self.sa.query(User).all()
         return render('admin/users/users.html')

     def create(self):
         """POST /users: Create a new item"""
         # url('users')

         user_model = UserModel()
         login_form = UserForm()()
         try:
@@ -79,13 +84,13 b' class UsersController(BaseController):'
                 defaults=errors.value,
                 errors=errors.error_dict or {},
                 prefix_error=False,
                 encoding="UTF-8")
         except Exception:
             log.error(traceback.format_exc())
             h.flash(_('error occured during creation of user %s') \
                     % request.POST.get('username'), category='error')
         return redirect(url('users'))

     def new(self, format='html'):
         """GET /users/new: Form to create a new item"""
         # url('new_user')
@@ -100,8 +105,8 b' class UsersController(BaseController):'
         #           method='put')
         # url('user', id=ID)
         user_model = UserModel()
-        c.user = user_model.get
+        c.user = user_model.get(id)

         _form = UserForm(edit=True, old_data={'user_id':id,
                                               'email':c.user.email})()
         form_result = {}
@@ -109,21 +114,21 b' class UsersController(BaseController):'
             form_result = _form.to_python(dict(request.POST))
             user_model.update(id, form_result)
             h.flash(_('User updated succesfully'), category='success')

         except formencode.Invalid, errors:
             return htmlfill.render(
                 render('admin/users/user_edit.html'),
                 defaults=errors.value,
                 errors=errors.error_dict or {},
                 prefix_error=False,
                 encoding="UTF-8")
         except Exception:
             log.error(traceback.format_exc())
-            h.flash(_('error occured during update of user %s') \
+            h.flash(_('error occurred during update of user %s') \
                     % form_result.get('username'), category='error')

         return redirect(url('users'))

     def delete(self, id):
         """DELETE /users/id: Delete an existing item"""
         # Forms posted to this method should contain a hidden field:
@@ -136,18 +141,18 b' class UsersController(BaseController):'
         try:
             user_model.delete(id)
             h.flash(_('sucessfully deleted user'), category='success')
-        except DefaultUserException, e:
+        except (UserOwnsReposException, DefaultUserException), e:
             h.flash(str(e), category='warning')
         except Exception:
             h.flash(_('An error occured during deletion of user'),
                     category='error')
         return redirect(url('users'))

     def show(self, id, format='html'):
         """GET /users/id: Show a specific item"""
         # url('user', id=ID)


     def edit(self, id, format='html'):
         """GET /users/id/edit: Form to edit an existing item"""
         # url('edit_user', id=ID)
@@ -155,14 +160,13 b' class UsersController(BaseController):'
         if not c.user:
             return redirect(url('users'))
         if c.user.username == 'default':
-            h.flash(_("You can't edit this user since it's"
-                      " crucial for entire application"), category='warning')
+            h.flash(_("You can't edit this user"), category='warning')
             return redirect(url('users'))

-        defaults = c.user.
+        defaults = c.user.get_dict()
         return htmlfill.render(
             render('admin/users/user_edit.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False
         )
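
The changed `delete` action in this file now catches `UserOwnsReposException` alongside `DefaultUserException` in one tuple and reports both as warnings instead of generic errors. A minimal sketch of that severity-splitting pattern; the exception classes and helper functions below are illustrative stand-ins, not RhodeCode's actual model code, and the sketch uses the modern `except ... as e` spelling where the diff itself uses the old Python 2 `except ..., e` form:

```python
# Hypothetical stand-ins for RhodeCode's domain exceptions.
class DefaultUserException(Exception):
    pass

class UserOwnsReposException(Exception):
    pass

def delete_user(username, owned_repos):
    # Business rules: the built-in 'default' user is protected, and a user
    # who still owns repositories cannot be removed.
    if username == 'default':
        raise DefaultUserException("can't remove the default user")
    if owned_repos:
        raise UserOwnsReposException('user still owns %d repositories'
                                     % len(owned_repos))
    return 'deleted %s' % username

def try_delete(username, owned_repos=()):
    try:
        return 'success', delete_user(username, owned_repos)
    except (UserOwnsReposException, DefaultUserException) as e:
        # expected business errors: surface them as warnings, not crashes
        return 'warning', str(e)
    except Exception:
        return 'error', 'An error occurred during deletion of user'

print(try_delete('default'))
print(try_delete('bob'))
```

Grouping the domain exceptions in one tuple keeps the flash-message severity in a single place, so adding a new protected case only requires extending the tuple.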
@@ -26,7 +26,7 b' from pylons import tmpl_context as c'
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.utils import OrderedDict
-from rhodecode.model.
+from rhodecode.model.scm import ScmModel
 import logging
 log = logging.getLogger(__name__)

@@ -38,7 +38,7 b' class BranchesController(BaseController)'
         super(BranchesController, self).__before__()

     def index(self):
-        hg_model =
+        hg_model = ScmModel()
         c.repo_info = hg_model.get_repo(c.repo_name)
         c.repo_branches = OrderedDict()
         for name, hash_ in c.repo_info.branches.items():
@@ -32,19 +32,19 b' from mercurial.graphmod import colored, '
 from pylons import request, session, tmpl_context as c
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseController, render
-from rhodecode.model.
+from rhodecode.model.scm import ScmModel
 from webhelpers.paginate import Page
 import logging
 log = logging.getLogger(__name__)

 class ChangelogController(BaseController):

     @LoginRequired()
     @HasRepoPermissionAnyDecorator('repository.read', 'repository.write',
                                    'repository.admin')
     def __before__(self):
         super(ChangelogController, self).__before__()

     def index(self):
         limit = 100
         default = 20
@@ -53,43 +53,47 b' class ChangelogController(BaseController'
                 int_size = int(request.params.get('size'))
             except ValueError:
                 int_size = default
             int_size = int_size if int_size <= limit else limit
             c.size = int_size
             session['changelog_size'] = c.size
             session.save()
         else:
             c.size = int(session.get('changelog_size', default))

-        changesets =
+        changesets = ScmModel().get_repo(c.repo_name)

         p = int(request.params.get('page', 1))
         c.total_cs = len(changesets)
         c.pagination = Page(changesets, page=p, item_count=c.total_cs,
                             items_per_page=c.size)

         self._graph(changesets, c.size, p)

         return render('changelog/changelog.html')


     def _graph(self, repo, size, p):
         revcount = size
-        if not repo.revisions:return json.dumps([]), 0
+        if not repo.revisions or repo.alias == 'git':
+            c.jsdata = json.dumps([])
+            return
+
         max_rev = repo.revisions[-1]
+
         offset = 1 if p == 1 else ((p - 1) * revcount + 1)
+
         rev_start = repo.revisions[(-1 * offset)]

         revcount = min(max_rev, revcount)
         rev_end = max(0, rev_start - revcount)
         dag = graph_rev(repo.repo, rev_start, rev_end)

         c.dag = tree = list(colored(dag))
         data = []
         for (id, type, ctx, vtx, edges) in tree:
             if type != CHANGESET:
                 continue
             data.append(('', vtx, edges))
93 |
|
||||
94 | c.jsdata = json.dumps(data) |
|
|||
95 |
|
97 | |||
|
98 | c.jsdata = json.dumps(data) | |||
|
99 |
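The paging arithmetic in `_graph` above (which revision window a given page renders) can be checked in isolation. A minimal sketch, assuming `revisions` is the ascending list of integer revision numbers that vcs exposes; `graph_window` is a hypothetical helper name, not part of RhodeCode:

```python
def graph_window(revisions, page, revcount):
    """Mirror of the offset arithmetic in ChangelogController._graph:
    return the (rev_start, rev_end) window rendered for `page` when
    showing `revcount` changesets per page."""
    # page 1 starts at the tip; later pages skip (page - 1) * revcount revs
    offset = 1 if page == 1 else ((page - 1) * revcount + 1)
    rev_start = revisions[-1 * offset]
    # never ask for more revisions than exist
    revcount = min(revisions[-1], revcount)
    rev_end = max(0, rev_start - revcount)
    return rev_start, rev_end

# 100 revisions shown 20 per page: page 1 covers the tip downwards
revs = list(range(100))
print(graph_window(revs, 1, 20))  # (99, 79)
print(graph_window(revs, 2, 20))  # (79, 59)
```

Note the window end is clamped at 0, so the last page of a short history simply renders fewer rows.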
@@ -1,7 +1,15 @@
-#!/usr/bin/env python
-# encoding: utf-8
-# changeset controller for pylons
-# Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+# -*- coding: utf-8 -*-
+"""
+    rhodecode.controllers.changeset
+    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+    changeset controller for pylons
+
+    :created_on: Apr 25, 2010
+    :author: marcink
+    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+    :license: GPLv3, see COPYING for more details.
+"""
 # This program is free software; you can redistribute it and/or
 # modify it under the terms of the GNU General Public License
 # as published by the Free Software Foundation; version 2
@@ -16,97 +24,96 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
 # MA 02110-1301, USA.
-from rhodecode.lib.utils import EmptyChangeset
-"""
-Created on April 25, 2010
-changeset controller for pylons
-@author: marcink
-"""
+import logging
+import traceback
+
 from pylons import tmpl_context as c, url, request, response
 from pylons.i18n.translation import _
 from pylons.controllers.util import redirect
+
+import rhodecode.lib.helpers as h
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseController, render
-from rhodecode.
+from rhodecode.lib.utils import EmptyChangeset
+from rhodecode.model.scm import ScmModel
+
 from vcs.exceptions import RepositoryError, ChangesetError
 from vcs.nodes import FileNode
 from vcs.utils import diffs as differ
-import logging
-import traceback

 log = logging.getLogger(__name__)

 class ChangesetController(BaseController):

     @LoginRequired()
     @HasRepoPermissionAnyDecorator('repository.read', 'repository.write',
                                    'repository.admin')
     def __before__(self):
         super(ChangesetController, self).__before__()

     def index(self, revision):
-        hg_model =
-        cut_off_limit = 1024 * 250
+        hg_model = ScmModel()

         def wrap_to_table(str):

             return '''<table class="code-difftable">
                       <tr class="line">
                       <td class="lineno new"></td>
                       <td class="code"><pre>%s</pre></td>
                       </tr>
                       </table>''' % str

         try:
             c.changeset = hg_model.get_repo(c.repo_name).get_changeset(revision)
-        except RepositoryError:
+        except RepositoryError, e:
             log.error(traceback.format_exc())
-            return redirect(url('hg_home'))
+            h.flash(str(e), category='warning')
+            return redirect(url('home'))
         else:
             try:
                 c.changeset_old = c.changeset.parents[0]
             except IndexError:
                 c.changeset_old = None
             c.changes = []

             #===================================================================
             # ADDED FILES
             #===================================================================
             c.sum_added = 0
             for node in c.changeset.added:

                 filenode_old = FileNode(node.path, '', EmptyChangeset())
                 if filenode_old.is_binary or node.is_binary:
                     diff = wrap_to_table(_('binary file'))
                 else:
                     c.sum_added += node.size
-                    if c.sum_added < cut_off_limit:
+                    if c.sum_added < self.cut_off_limit:
                         f_udiff = differ.get_udiff(filenode_old, node)
                         diff = differ.DiffProcessor(f_udiff).as_html()

                     else:
                         diff = wrap_to_table(_('Changeset is to big and was cut'
                                                ' off, see raw changeset instead'))

                 cs1 = None
-                cs2 = node.last_changeset.
+                cs2 = node.last_changeset.raw_id
                 c.changes.append(('added', node, diff, cs1, cs2))

             #===================================================================
             # CHANGED FILES
             #===================================================================
             c.sum_removed = 0
             for node in c.changeset.changed:
                 try:
                     filenode_old = c.changeset_old.get_node(node.path)
                 except ChangesetError:
                     filenode_old = FileNode(node.path, '', EmptyChangeset())

                 if filenode_old.is_binary or node.is_binary:
                     diff = wrap_to_table(_('binary file'))
                 else:

-                    if c.sum_removed < cut_off_limit:
+                    if c.sum_removed < self.cut_off_limit:
                         f_udiff = differ.get_udiff(filenode_old, node)
                         diff = differ.DiffProcessor(f_udiff).as_html()
                         if diff:
@@ -114,68 +121,72 @@ class ChangesetController
                     else:
                         diff = wrap_to_table(_('Changeset is to big and was cut'
                                                ' off, see raw changeset instead'))


-                cs1 = filenode_old.last_changeset.
-                cs2 = node.last_changeset.
+                cs1 = filenode_old.last_changeset.raw_id
+                cs2 = node.last_changeset.raw_id
                 c.changes.append(('changed', node, diff, cs1, cs2))

             #===================================================================
             # REMOVED FILES
             #===================================================================
             for node in c.changeset.removed:
                 c.changes.append(('removed', node, None, None, None))

         return render('changeset/changeset.html')

     def raw_changeset(self, revision):

-        hg_model =
+        hg_model = ScmModel()
         method = request.GET.get('diff', 'show')
         try:
+            r = hg_model.get_repo(c.repo_name)
+            c.scm_type = r.alias
+            c.changeset = r.get_changeset(revision)
         except RepositoryError:
             log.error(traceback.format_exc())
-            return redirect(url('hg_home'))
+            return redirect(url('home'))
         else:
             try:
                 c.changeset_old = c.changeset.parents[0]
             except IndexError:
                 c.changeset_old = None
             c.changes = []

             for node in c.changeset.added:
                 filenode_old = FileNode(node.path, '')
                 if filenode_old.is_binary or node.is_binary:
-                    diff = _('binary file')
+                    diff = _('binary file') + '\n'
                 else:
                     f_udiff = differ.get_udiff(filenode_old, node)
                     diff = differ.DiffProcessor(f_udiff).raw_diff()

                 cs1 = None
-                cs2 = node.last_changeset.
+                cs2 = node.last_changeset.raw_id
                 c.changes.append(('added', node, diff, cs1, cs2))

             for node in c.changeset.changed:
                 filenode_old = c.changeset_old.get_node(node.path)
                 if filenode_old.is_binary or node.is_binary:
                     diff = _('binary file')
                 else:
                     f_udiff = differ.get_udiff(filenode_old, node)
                     diff = differ.DiffProcessor(f_udiff).raw_diff()

-                cs1 = filenode_old.last_changeset.
-                cs2 = node.last_changeset.
+                cs1 = filenode_old.last_changeset.raw_id
+                cs2 = node.last_changeset.raw_id
                 c.changes.append(('changed', node, diff, cs1, cs2))

         response.content_type = 'text/plain'
+
         if method == 'download':
             response.content_disposition = 'attachment; filename=%s.patch' % revision
+
         parent = True if len(c.changeset.parents) > 0 else False
         c.parent_tmpl = 'Parent %s' % c.changeset.parents[0].raw_id if parent else ''

         c.diffs = ''
         for x in c.changes:
             c.diffs += x[2]

         return render('changeset/raw_changeset.html')
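The size guard introduced in this changeset controller (a running total of file sizes is compared against `cut_off_limit`; files past the limit get a placeholder instead of a rendered diff) can be sketched in isolation. This is a simplified sketch, assuming `nodes` is a list of (name, size) pairs rather than the vcs FileNode objects the real code iterates, and `render_diffs` is a hypothetical name:

```python
def render_diffs(nodes, cut_off_limit=250 * 1024):
    """Accumulate file sizes; once the total crosses cut_off_limit,
    emit a placeholder instead of computing a diff (the real code
    calls differ.get_udiff / DiffProcessor at that point)."""
    out, total = [], 0
    for name, size in nodes:
        total += size
        if total < cut_off_limit:
            out.append((name, 'diff'))
        else:
            out.append((name, 'too big, see raw changeset'))
    return out
```

The check is against the running total, not the individual file size, so one very large file can suppress diffs for every file that follows it in the changeset.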
@@ -1,33 +1,58 @@
-import logging
+# -*- coding: utf-8 -*-
+"""
+    package.rhodecode.controllers.error
+    ~~~~~~~~~~~~~~
+
+    RhodeCode error controller
+
+    :created_on: Dec 8, 2010
+    :author: marcink
+    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+    :license: GPLv3, see COPYING for more details.
+"""
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; version 2
+# of the License or (at your opinion) any later version of the license.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software
+# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
+# MA 02110-1301, USA.
+import os
 import cgi
-import os
+import logging
 import paste.fileapp
-from pylons import tmpl_context as c, app_globals as g, request, config
-from pylons.controllers.util import forward
+
+from pylons import tmpl_context as c, request
 from pylons.i18n.translation import _
+from pylons.middleware import media_path
+
 from rhodecode.lib.base import BaseController, render
-from pylons.middleware import media_path
-from rhodecode.lib.utils import check_repo
-import rhodecode.lib.helpers as h
-from rhodecode import __version__
+
 log = logging.getLogger(__name__)

 class ErrorController(BaseController):
-    """
-    Generates error documents as and when they are required.
+    """Generates error documents as and when they are required.

     The ErrorDocuments middleware forwards to ErrorController when error
     related status codes are returned from the application.

-    This behavio
+    This behavior can be altered by changing the parameters to the
     ErrorDocuments middleware in your config/middleware.py file.
     """
+
     def __before__(self):
         pass#disable all base actions since we don't need them here

     def document(self):
         resp = request.environ.get('pylons.original_response')

         log.debug('### %s ###', resp.status)

         e = request.environ
@@ -36,7 +61,7 @@ class ErrorController(BaseController):
                 'host':e.get('HTTP_HOST'),
                 }


         c.error_message = cgi.escape(request.GET.get('code', str(resp.status)))
         c.error_explanation = self.get_error_explanation(resp.status_int)

@@ -74,7 +99,7 @@ class ErrorController(BaseController):
         if code == 400:
             return _('The request could not be understood by the server due to malformed syntax.')
         if code == 401:
-            return _('Unathorized access to resource')
+            return _('Unauthorized access to resource')
         if code == 403:
             return _("You don't have permission to view this page")
         if code == 404:
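`get_error_explanation` above maps HTTP status codes to user-facing messages with an if-chain. A condensed dict-based sketch of the same lookup, using only the messages visible in this diff (the 404 text is cut off here, so it is omitted; the real method wraps every string in `_()` for translation, and `error_explanation` is a hypothetical standalone name):

```python
def error_explanation(code):
    """Return the explanation string for an HTTP status code,
    mirroring ErrorController.get_error_explanation's if-chain."""
    messages = {
        400: 'The request could not be understood by the server '
             'due to malformed syntax.',
        401: 'Unauthorized access to resource',
        403: "You don't have permission to view this page",
    }
    # unknown codes fall through to a generic message
    return messages.get(code, 'Unknown error')
```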
@@ -24,7 +24,7 @@ feed controller for pylons
 """
 from pylons import tmpl_context as c, url, response
 from rhodecode.lib.base import BaseController, render
-from rhodecode.model.
+from rhodecode.model.scm import ScmModel
 from webhelpers.feedgenerator import Atom1Feed, Rss201rev2Feed
 import logging
 log = logging.getLogger(__name__)
@@ -49,12 +49,12 @@ class FeedController(BaseController):
                         language=self.language,
                         ttl=self.ttl)

-        changesets =
+        changesets = ScmModel().get_repo(repo_name)

         for cs in changesets[:self.feed_nr]:
             feed.add_item(title=cs.message,
                           link=url('changeset_home', repo_name=repo_name,
-                                   revision=cs.
+                                   revision=cs.raw_id, qualified=True),
                           description=str(cs.date))

         response.content_type = feed.mime_type
@@ -69,11 +69,11 @@ class FeedController(BaseController):
                         language=self.language,
                         ttl=self.ttl)

-        changesets =
+        changesets = ScmModel().get_repo(repo_name)
         for cs in changesets[:self.feed_nr]:
             feed.add_item(title=cs.message,
                           link=url('changeset_home', repo_name=repo_name,
-                                   revision=cs.
+                                   revision=cs.raw_id, qualified=True),
                           description=str(cs.date))

         response.content_type = feed.mime_type
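The shape of the feed loop above (take the newest `feed_nr` changesets, turn each into a feed entry) can be sketched without webhelpers. This is an illustrative sketch only: the dict fields stand in for `feed.add_item` keyword arguments, the URL format is a made-up stand-in for pylons' `url('changeset_home', ...)` helper, and `feed_items` is a hypothetical name; the only assumption about `changesets` is that each element has `.message`, `.raw_id`, and `.date`, as the vcs changesets in the real code do:

```python
def feed_items(changesets, repo_name, feed_nr=20):
    """Build plain-dict feed entries for the newest feed_nr changesets."""
    items = []
    for cs in list(changesets)[:feed_nr]:
        items.append({
            'title': cs.message,
            # illustrative URL shape, not RhodeCode's routing
            'link': '/%s/changeset/%s' % (repo_name, cs.raw_id),
            'description': str(cs.date),
        })
    return items
```

The diff's switch from a truncated revision attribute to `cs.raw_id` matters for feeds: entry links built from the full hash stay stable and unambiguous.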
@@ -1,8 +1,15 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # files controller for pylons |
|
3 | rhodecode.controllers.files | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
5 |
|
|
5 | ||
|
6 | Files controller for RhodeCode | |||
|
7 | ||||
|
8 | :created_on: Apr 21, 2010 | |||
|
9 | :author: marcink | |||
|
10 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
11 | :license: GPLv3, see COPYING for more details. | |||
|
12 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
13 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
14 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
15 | # as published by the Free Software Foundation; version 2 | |
@@ -17,107 +24,115 b'' | |||||
17 | # along with this program; if not, write to the Free Software |
|
24 | # along with this program; if not, write to the Free Software | |
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
25 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
26 | # MA 02110-1301, USA. | |
20 | """ |
|
27 | import tempfile | |
21 | Created on April 21, 2010 |
|
28 | import logging | |
22 | files controller for pylons |
|
29 | import rhodecode.lib.helpers as h | |
23 | @author: marcink |
|
30 | ||
24 | """ |
|
|||
25 | from mercurial import archival |
|
31 | from mercurial import archival | |
|
32 | ||||
26 | from pylons import request, response, session, tmpl_context as c, url |
|
33 | from pylons import request, response, session, tmpl_context as c, url | |
27 | from pylons.i18n.translation import _ |
|
34 | from pylons.i18n.translation import _ | |
28 | from pylons.controllers.util import redirect |
|
35 | from pylons.controllers.util import redirect | |
|
36 | ||||
29 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator |
|
37 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator | |
30 | from rhodecode.lib.base import BaseController, render |
|
38 | from rhodecode.lib.base import BaseController, render | |
31 | from rhodecode.lib.utils import EmptyChangeset |
|
39 | from rhodecode.lib.utils import EmptyChangeset | |
32 |
from rhodecode.model. |
|
40 | from rhodecode.model.scm import ScmModel | |
|
41 | ||||
33 | from vcs.exceptions import RepositoryError, ChangesetError |
|
42 | from vcs.exceptions import RepositoryError, ChangesetError | |
34 | from vcs.nodes import FileNode |
|
43 | from vcs.nodes import FileNode | |
35 | from vcs.utils import diffs as differ |
|
44 | from vcs.utils import diffs as differ | |
36 | import logging |
|
45 | ||
37 | import rhodecode.lib.helpers as h |
|
|||
38 | import tempfile |
|
|||
39 |
|
||||
40 | log = logging.getLogger(__name__) |
|
46 | log = logging.getLogger(__name__) | |
41 |
|
47 | |||
42 | class FilesController(BaseController): |
|
48 | class FilesController(BaseController): | |
43 |
|
49 | |||
44 | @LoginRequired() |
|
50 | @LoginRequired() | |
45 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', |
|
51 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
46 |
'repository.admin') |
|
52 | 'repository.admin') | |
47 | def __before__(self): |
|
53 | def __before__(self): | |
48 | super(FilesController, self).__before__() |
|
54 | super(FilesController, self).__before__() | |
49 | c.file_size_limit = 250 * 1024 #limit of file size to display |
|
55 | c.cut_off_limit = self.cut_off_limit | |
50 |
|
56 | |||
51 | def index(self, repo_name, revision, f_path): |
|
57 | def index(self, repo_name, revision, f_path): | |
52 |
hg_model = |
|
58 | hg_model = ScmModel() | |
53 |
c. |
|
59 | c.repo = hg_model.get_repo(c.repo_name) | |
54 | revision = request.POST.get('at_rev', None) or revision |
|
60 | revision = request.POST.get('at_rev', None) or revision | |
55 |
|
61 | |||
56 | def get_next_rev(cur): |
|
62 | def get_next_rev(cur): | |
57 | max_rev = len(c.repo.revisions) - 1 |
|
63 | max_rev = len(c.repo.revisions) - 1 | |
58 | r = cur + 1 |
|
64 | r = cur + 1 | |
59 | if r > max_rev: |
|
65 | if r > max_rev: | |
60 | r = max_rev |
|
66 | r = max_rev | |
61 | return r |
|
67 | return r | |
62 |
|
68 | |||
63 | def get_prev_rev(cur): |
|
69 | def get_prev_rev(cur): | |
64 | r = cur - 1 |
|
70 | r = cur - 1 | |
65 | return r |
|
71 | return r | |
66 |
|
72 | |||
67 | c.f_path = f_path |
|
73 | c.f_path = f_path | |
68 |
|
74 | |||
69 |
|
75 | |||
70 | try: |
|
76 | try: | |
71 |
c |
|
77 | c.changeset = c.repo.get_changeset(revision) | |
72 |
|
|
78 | cur_rev = c.changeset.revision | |
73 |
|
|
79 | prev_rev = c.repo.get_changeset(get_prev_rev(cur_rev)).raw_id | |
74 |
|
80 | next_rev = c.repo.get_changeset(get_next_rev(cur_rev)).raw_id | ||
|
81 | ||||
75 | c.url_prev = url('files_home', repo_name=c.repo_name, |
|
82 | c.url_prev = url('files_home', repo_name=c.repo_name, | |
76 |
revision=prev_rev, f_path=f_path) |
|
83 | revision=prev_rev, f_path=f_path) | |
77 | c.url_next = url('files_home', repo_name=c.repo_name, |
|
84 | c.url_next = url('files_home', repo_name=c.repo_name, | |
78 |
|
|
85 | revision=next_rev, f_path=f_path) | |
79 |
|
86 | |||
80 | c.changeset = repo.get_changeset(revision) |
|
87 | try: | |
81 |
|
88 | c.files_list = c.changeset.get_node(f_path) | ||
82 | c.cur_rev = c.changeset.short_id |
|
89 | c.file_history = self._get_history(c.repo, c.files_list, f_path) | |
83 | c.rev_nr = c.changeset.revision |
|
90 | except RepositoryError, e: | |
84 | c.files_list = c.changeset.get_node(f_path) |
|
91 | h.flash(str(e), category='warning') | |
85 | c.file_history = self._get_history(repo, c.files_list, f_path) |
|
92 | redirect(h.url('files_home', repo_name=repo_name, revision=revision)) | |
86 |
|
93 | |||
87 |
except |
|
94 | except RepositoryError, e: | |
88 | c.files_list = None |
|
95 | h.flash(str(e), category='warning') | |
89 |
|
96 | redirect(h.url('files_home', repo_name=repo_name, revision='tip')) | ||
|
97 | ||||
|
98 | ||||
|
99 | ||||
90 | return render('files/files.html') |
|
100 | return render('files/files.html') | |
91 |
|
101 | |||
92 | def rawfile(self, repo_name, revision, f_path): |
|
102 | def rawfile(self, repo_name, revision, f_path): | |
93 |
hg_model = |
|
103 | hg_model = ScmModel() | |
94 | c.repo = hg_model.get_repo(c.repo_name) |
|
104 | c.repo = hg_model.get_repo(c.repo_name) | |
95 | file_node = c.repo.get_changeset(revision).get_node(f_path) |
|
105 | file_node = c.repo.get_changeset(revision).get_node(f_path) | |
96 | response.content_type = file_node.mimetype |
|
106 | response.content_type = file_node.mimetype | |
97 | response.content_disposition = 'attachment; filename=%s' \ |
|
107 | response.content_disposition = 'attachment; filename=%s' \ | |
98 |
% f_path.split('/')[-1] |
|
108 | % f_path.split('/')[-1] | |
99 | return file_node.content |
|
109 | return file_node.content | |
100 |
|
110 | |||
101 | def raw(self, repo_name, revision, f_path): |
|
111 | def raw(self, repo_name, revision, f_path): | |
102 |
hg_model = |
|
112 | hg_model = ScmModel() | |
103 | c.repo = hg_model.get_repo(c.repo_name) |
|
113 | c.repo = hg_model.get_repo(c.repo_name) | |
104 | file_node = c.repo.get_changeset(revision).get_node(f_path) |
|
114 | file_node = c.repo.get_changeset(revision).get_node(f_path) | |
105 | response.content_type = 'text/plain' |
|
115 | response.content_type = 'text/plain' | |
106 |
|
116 | |||
107 | return file_node.content |
|
117 | return file_node.content | |
108 |
|
118 | |||
109 | def annotate(self, repo_name, revision, f_path): |
|
119 | def annotate(self, repo_name, revision, f_path): | |
110 |
hg_model = |
|
120 | hg_model = ScmModel() | |
111 | c.repo = hg_model.get_repo(c.repo_name) |
|
121 | c.repo = hg_model.get_repo(c.repo_name) | |
112 | cs = c.repo.get_changeset(revision) |
|
122 | ||
113 | c.file = cs.get_node(f_path) |
|
123 | try: | |
114 | c.file_msg = cs.get_file_message(f_path) |
|
124 | c.cs = c.repo.get_changeset(revision) | |
115 | c.cur_rev = cs.short_id |
|
125 | c.file = c.cs.get_node(f_path) | |
116 | c.rev_nr = cs.revision |
|
126 | except RepositoryError, e: | |
|
127 | h.flash(str(e), category='warning') | |||
|
128 | redirect(h.url('files_home', repo_name=repo_name, revision=revision)) | |||
|
129 | ||||
|
130 | c.file_history = self._get_history(c.repo, c.file, f_path) | |||
|
131 | ||||
117 | c.f_path = f_path |
|
132 | c.f_path = f_path | |
118 |
|
133 | |||
119 | return render('files/files_annotate.html') |
|
134 | return render('files/files_annotate.html') | |
120 |
|
135 | |||
121 | def archivefile(self, repo_name, revision, fileformat): |
|
136 | def archivefile(self, repo_name, revision, fileformat): | |
122 | archive_specs = { |
|
137 | archive_specs = { | |
123 | '.tar.bz2': ('application/x-tar', 'tbz2'), |
|
138 | '.tar.bz2': ('application/x-tar', 'tbz2'), | |
@@ -126,7 +141,7 b' class FilesController(BaseController):' | |||||
126 | } |
|
141 | } | |
127 | if not archive_specs.has_key(fileformat): |
|
142 | if not archive_specs.has_key(fileformat): | |
128 | return 'Unknown archive type %s' % fileformat |
|
143 | return 'Unknown archive type %s' % fileformat | |
129 |
|
144 | |||
130 | def read_in_chunks(file_object, chunk_size=1024 * 40): |
|
145 | def read_in_chunks(file_object, chunk_size=1024 * 40): | |
131 | """Lazy function (generator) to read a file piece by piece. |
|
146 | """Lazy function (generator) to read a file piece by piece. | |
132 | Default chunk size: 40k.""" |
|
147 | Default chunk size: 40k.""" | |
@@ -134,10 +149,10 b' class FilesController(BaseController):' | |||||
134 | data = file_object.read(chunk_size) |
|
149 | data = file_object.read(chunk_size) | |
135 | if not data: |
|
150 | if not data: | |
136 | break |
|
151 | break | |
137 | yield data
|
152 | yield data | |
138 |
|
153 | |||
139 | archive = tempfile.TemporaryFile() |
|
154 | archive = tempfile.TemporaryFile() | |
140 | repo =
|
155 | repo = ScmModel().get_repo(repo_name).repo | |
141 | fname = '%s-%s%s' % (repo_name, revision, fileformat) |
|
156 | fname = '%s-%s%s' % (repo_name, revision, fileformat) | |
142 | archival.archive(repo, archive, revision, archive_specs[fileformat][1], |
|
157 | archival.archive(repo, archive, revision, archive_specs[fileformat][1], | |
143 | prefix='%s-%s' % (repo_name, revision)) |
|
158 | prefix='%s-%s' % (repo_name, revision)) | |
@@ -145,9 +160,9 b' class FilesController(BaseController):' | |||||
145 | response.content_disposition = 'attachment; filename=%s' % fname |
|
160 | response.content_disposition = 'attachment; filename=%s' % fname | |
146 | archive.seek(0) |
|
161 | archive.seek(0) | |
147 | return read_in_chunks(archive) |
|
162 | return read_in_chunks(archive) | |
148 |
|
163 | |||
149 | def diff(self, repo_name, f_path): |
|
164 | def diff(self, repo_name, f_path): | |
150 | hg_model =
|
165 | hg_model = ScmModel() | |
151 | diff1 = request.GET.get('diff1') |
|
166 | diff1 = request.GET.get('diff1') | |
152 | diff2 = request.GET.get('diff2') |
|
167 | diff2 = request.GET.get('diff2') | |
153 | c.action = request.GET.get('diff') |
|
168 | c.action = request.GET.get('diff') | |
@@ -162,7 +177,7 b' class FilesController(BaseController):' | |||||
162 | else: |
|
177 | else: | |
163 | c.changeset_1 = EmptyChangeset() |
|
178 | c.changeset_1 = EmptyChangeset() | |
164 | node1 = FileNode('.', '', changeset=c.changeset_1) |
|
179 | node1 = FileNode('.', '', changeset=c.changeset_1) | |
165 |
|
180 | |||
166 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
181 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
167 | c.changeset_2 = c.repo.get_changeset(diff2) |
|
182 | c.changeset_2 = c.repo.get_changeset(diff2) | |
168 | node2 = c.changeset_2.get_node(f_path) |
|
183 | node2 = c.changeset_2.get_node(f_path) | |
@@ -173,43 +188,66 b' class FilesController(BaseController):' | |||||
173 | return redirect(url('files_home', |
|
188 | return redirect(url('files_home', | |
174 | repo_name=c.repo_name, f_path=f_path)) |
|
189 | repo_name=c.repo_name, f_path=f_path)) | |
175 |
|
190 | |||
176 | c.diff1 = 'r%s:%s' % (c.changeset_1.revision, c.changeset_1.short_id) |
|
|||
177 | c.diff2 = 'r%s:%s' % (c.changeset_2.revision, c.changeset_2.short_id) |
|
|||
178 |
|
||||
179 | f_udiff = differ.get_udiff(node1, node2) |
|
191 | f_udiff = differ.get_udiff(node1, node2) | |
180 | diff = differ.DiffProcessor(f_udiff) |
|
192 | diff = differ.DiffProcessor(f_udiff) | |
181 |
|
193 | |||
182 | if c.action == 'download': |
|
194 | if c.action == 'download': | |
183 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) |
|
195 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) | |
184 | response.content_type = 'text/plain' |
|
196 | response.content_type = 'text/plain' | |
185 | response.content_disposition = 'attachment; filename=%s' \ |
|
197 | response.content_disposition = 'attachment; filename=%s' \ | |
186 | % diff_name
|
198 | % diff_name | |
187 | return diff.raw_diff() |
|
199 | return diff.raw_diff() | |
188 |
|
200 | |||
189 | elif c.action == 'raw': |
|
201 | elif c.action == 'raw': | |
190 | c.cur_diff = '<pre class="raw">%s</pre>' % h.escape(diff.raw_diff()) |
|
202 | response.content_type = 'text/plain' | |
|
203 | return diff.raw_diff() | |||
|
204 | ||||
191 | elif c.action == 'diff': |
|
205 | elif c.action == 'diff': | |
192 | if node1.size >
|
206 | if node1.size > self.cut_off_limit or node2.size > self.cut_off_limit: | |
193 | c.cur_diff = _('Diff is to big to display') |
|
207 | c.cur_diff = _('Diff is to big to display') | |
194 | else: |
|
208 | else: | |
195 | c.cur_diff = diff.as_html() |
|
209 | c.cur_diff = diff.as_html() | |
196 | else: |
|
210 | else: | |
197 | #default option |
|
211 | #default option | |
198 | if node1.size >
|
212 | if node1.size > self.cut_off_limit or node2.size > self.cut_off_limit: | |
199 | c.cur_diff = _('Diff is to big to display') |
|
213 | c.cur_diff = _('Diff is to big to display') | |
200 | else: |
|
214 | else: | |
201 | c.cur_diff = diff.as_html() |
|
215 | c.cur_diff = diff.as_html() | |
202 |
|
216 | |||
203 | if not c.cur_diff: c.no_changes = True
|
217 | if not c.cur_diff: c.no_changes = True | |
204 | return render('files/file_diff.html') |
|
218 | return render('files/file_diff.html') | |
205 |
|
219 | |||
206 | def _get_history(self, repo, node, f_path): |
|
220 | def _get_history(self, repo, node, f_path): | |
207 | from vcs.nodes import NodeKind |
|
221 | from vcs.nodes import NodeKind | |
208 | if not node.kind is NodeKind.FILE: |
|
222 | if not node.kind is NodeKind.FILE: | |
209 | return [] |
|
223 | return [] | |
210 | changesets = node.history |
|
224 | changesets = node.history | |
211 | hist_l = [] |
|
225 | hist_l = [] | |
|
226 | ||||
|
227 | changesets_group = ([], _("Changesets")) | |||
|
228 | branches_group = ([], _("Branches")) | |||
|
229 | tags_group = ([], _("Tags")) | |||
|
230 | ||||
212 | for chs in changesets: |
|
231 | for chs in changesets: | |
213 | n_desc = 'r%s:%s' % (chs.revision, chs.short_id) |
|
232 | n_desc = 'r%s:%s' % (chs.revision, chs.short_id) | |
214 |
|
233 | changesets_group[0].append((chs.raw_id, n_desc,)) | |
|
234 | ||||
|
235 | hist_l.append(changesets_group) | |||
|
236 | ||||
|
237 | for name, chs in c.repository_branches.items(): | |||
|
238 | #chs = chs.split(':')[-1] | |||
|
239 | branches_group[0].append((chs, name),) | |||
|
240 | hist_l.append(branches_group) | |||
|
241 | ||||
|
242 | for name, chs in c.repository_tags.items(): | |||
|
243 | #chs = chs.split(':')[-1] | |||
|
244 | tags_group[0].append((chs, name),) | |||
|
245 | hist_l.append(tags_group) | |||
|
246 | ||||
215 | return hist_l |
|
247 | return hist_l | |
|
248 | ||||
|
249 | # [ | |||
|
250 | # ([("u1", "User1"), ("u2", "User2")], "Users"), | |||
|
251 | # ([("g1", "Group1"), ("g2", "Group2")], "Groups") | |||
|
252 | # ] | |||
|
253 |
@@ -26,33 +26,33 b' from operator import itemgetter' | |||||
26 | from pylons import tmpl_context as c, request |
|
26 | from pylons import tmpl_context as c, request | |
27 | from rhodecode.lib.auth import LoginRequired |
|
27 | from rhodecode.lib.auth import LoginRequired | |
28 | from rhodecode.lib.base import BaseController, render |
|
28 | from rhodecode.lib.base import BaseController, render | |
29 | from rhodecode.model.
|
29 | from rhodecode.model.scm import ScmModel | |
30 | import logging |
|
30 | import logging | |
31 | log = logging.getLogger(__name__) |
|
31 | log = logging.getLogger(__name__) | |
32 |
|
32 | |||
33 | class H
|
33 | class HomeController(BaseController): | |
34 |
|
34 | |||
35 | @LoginRequired() |
|
35 | @LoginRequired() | |
36 | def __before__(self): |
|
36 | def __before__(self): | |
37 | super(H
|
37 | super(HomeController, self).__before__() | |
38 |
|
38 | |||
39 | def index(self): |
|
39 | def index(self): | |
40 | sortables = ['name', 'description', 'last_change', 'tip', 'contact'] |
|
40 | sortables = ['name', 'description', 'last_change', 'tip', 'contact'] | |
41 | current_sort = request.GET.get('sort', 'name') |
|
41 | current_sort = request.GET.get('sort', 'name') | |
42 | current_sort_slug = current_sort.replace('-', '') |
|
42 | current_sort_slug = current_sort.replace('-', '') | |
43 |
|
43 | |||
44 | if current_sort_slug not in sortables: |
|
44 | if current_sort_slug not in sortables: | |
45 | c.sort_by = 'name' |
|
45 | c.sort_by = 'name' | |
46 | current_sort_slug = c.sort_by |
|
46 | current_sort_slug = c.sort_by | |
47 | else: |
|
47 | else: | |
48 | c.sort_by = current_sort |
|
48 | c.sort_by = current_sort | |
49 | c.sort_slug = current_sort_slug |
|
49 | c.sort_slug = current_sort_slug | |
50 | cached_repo_list =
|
50 | cached_repo_list = ScmModel().get_repos() | |
51 |
|
51 | |||
52 | sort_key = current_sort_slug + '_sort' |
|
52 | sort_key = current_sort_slug + '_sort' | |
53 | if c.sort_by.startswith('-'): |
|
53 | if c.sort_by.startswith('-'): | |
54 | c.repos_list = sorted(cached_repo_list, key=itemgetter(sort_key), reverse=True) |
|
54 | c.repos_list = sorted(cached_repo_list, key=itemgetter(sort_key), reverse=True) | |
55 | else: |
|
55 | else: | |
56 | c.repos_list = sorted(cached_repo_list, key=itemgetter(sort_key), reverse=False) |
|
56 | c.repos_list = sorted(cached_repo_list, key=itemgetter(sort_key), reverse=False) | |
57 |
|
57 | |||
58 | return render('/index.html') |
|
58 | return render('/index.html') |
@@ -28,10 +28,10 b' from pylons import request, response, se' | |||||
28 | from pylons.controllers.util import abort, redirect |
|
28 | from pylons.controllers.util import abort, redirect | |
29 | from rhodecode.lib.auth import AuthUser, HasPermissionAnyDecorator |
|
29 | from rhodecode.lib.auth import AuthUser, HasPermissionAnyDecorator | |
30 | from rhodecode.lib.base import BaseController, render |
|
30 | from rhodecode.lib.base import BaseController, render | |
31 | import rhodecode.lib.helpers as h
|
31 | import rhodecode.lib.helpers as h | |
32 | from pylons.i18n.translation import _ |
|
32 | from pylons.i18n.translation import _ | |
33 | from rhodecode.model.forms import LoginForm, RegisterForm, PasswordResetForm |
|
33 | from rhodecode.model.forms import LoginForm, RegisterForm, PasswordResetForm | |
34 | from rhodecode.model.user
|
34 | from rhodecode.model.user import UserModel | |
35 | import formencode |
|
35 | import formencode | |
36 | import logging |
|
36 | import logging | |
37 |
|
37 | |||
@@ -45,17 +45,19 b' class LoginController(BaseController):' | |||||
45 | def index(self): |
|
45 | def index(self): | |
46 | #redirect if already logged in |
|
46 | #redirect if already logged in | |
47 | c.came_from = request.GET.get('came_from', None) |
|
47 | c.came_from = request.GET.get('came_from', None) | |
48 |
|
48 | |||
49 | if c.rhodecode_user.is_authenticated
|
49 | if c.rhodecode_user.is_authenticated \ | |
50 | return redirect(url('hg_home')) |
|
50 | and c.rhodecode_user.username != 'default': | |
51 |
|
51 | |||
|
52 | return redirect(url('home')) | |||
|
53 | ||||
52 | if request.POST: |
|
54 | if request.POST: | |
53 | #import Login Form validator class |
|
55 | #import Login Form validator class | |
54 | login_form = LoginForm() |
|
56 | login_form = LoginForm() | |
55 | try: |
|
57 | try: | |
56 | c.form_result = login_form.to_python(dict(request.POST)) |
|
58 | c.form_result = login_form.to_python(dict(request.POST)) | |
57 | username = c.form_result['username'] |
|
59 | username = c.form_result['username'] | |
58 | user = UserModel().get_user
|
60 | user = UserModel().get_by_username(username, case_insensitive=True) | |
59 | auth_user = AuthUser() |
|
61 | auth_user = AuthUser() | |
60 | auth_user.username = user.username |
|
62 | auth_user.username = user.username | |
61 | auth_user.is_authenticated = True |
|
63 | auth_user.is_authenticated = True | |
@@ -66,14 +68,14 b' class LoginController(BaseController):' | |||||
66 | session['rhodecode_user'] = auth_user |
|
68 | session['rhodecode_user'] = auth_user | |
67 | session.save() |
|
69 | session.save() | |
68 | log.info('user %s is now authenticated', username) |
|
70 | log.info('user %s is now authenticated', username) | |
69 |
|
71 | |||
70 | user.update_lastlogin() |
|
72 | user.update_lastlogin() | |
71 |
|
73 | |||
72 | if c.came_from: |
|
74 | if c.came_from: | |
73 | return redirect(c.came_from) |
|
75 | return redirect(c.came_from) | |
74 | else: |
|
76 | else: | |
75 | return redirect(url('
|
77 | return redirect(url('home')) | |
76 |
|
78 | |||
77 | except formencode.Invalid, errors: |
|
79 | except formencode.Invalid, errors: | |
78 | return htmlfill.render( |
|
80 | return htmlfill.render( | |
79 | render('/login.html'), |
|
81 | render('/login.html'), | |
@@ -81,30 +83,30 b' class LoginController(BaseController):' | |||||
81 | errors=errors.error_dict or {}, |
|
83 | errors=errors.error_dict or {}, | |
82 | prefix_error=False, |
|
84 | prefix_error=False, | |
83 | encoding="UTF-8") |
|
85 | encoding="UTF-8") | |
84 |
|
86 | |||
85 | return render('/login.html') |
|
87 | return render('/login.html') | |
86 |
|
88 | |||
87 | @HasPermissionAnyDecorator('hg.admin', 'hg.register.auto_activate', |
|
89 | @HasPermissionAnyDecorator('hg.admin', 'hg.register.auto_activate', | |
88 | 'hg.register.manual_activate') |
|
90 | 'hg.register.manual_activate') | |
89 | def register(self): |
|
91 | def register(self): | |
90 | user_model = UserModel() |
|
92 | user_model = UserModel() | |
91 | c.auto_active = False |
|
93 | c.auto_active = False | |
92 | for perm in user_model.get_
|
94 | for perm in user_model.get_by_username('default', cache=False).user_perms: | |
93 | if perm.permission.permission_name == 'hg.register.auto_activate': |
|
95 | if perm.permission.permission_name == 'hg.register.auto_activate': | |
94 | c.auto_active = True |
|
96 | c.auto_active = True | |
95 | break |
|
97 | break | |
96 |
|
98 | |||
97 | if request.POST: |
|
99 | if request.POST: | |
98 |
|
100 | |||
99 | register_form = RegisterForm()() |
|
101 | register_form = RegisterForm()() | |
100 | try: |
|
102 | try: | |
101 | form_result = register_form.to_python(dict(request.POST)) |
|
103 | form_result = register_form.to_python(dict(request.POST)) | |
102 | form_result['active'] = c.auto_active |
|
104 | form_result['active'] = c.auto_active | |
103 | user_model.create_registration(form_result) |
|
105 | user_model.create_registration(form_result) | |
104 | h.flash(_('You have successfully registered into rhodecode'), |
|
106 | h.flash(_('You have successfully registered into rhodecode'), | |
105 | category='success')
|
107 | category='success') | |
106 | return redirect(url('login_home')) |
|
108 | return redirect(url('login_home')) | |
107 |
|
109 | |||
108 | except formencode.Invalid, errors: |
|
110 | except formencode.Invalid, errors: | |
109 | return htmlfill.render( |
|
111 | return htmlfill.render( | |
110 | render('/register.html'), |
|
112 | render('/register.html'), | |
@@ -112,21 +114,21 b' class LoginController(BaseController):' | |||||
112 | errors=errors.error_dict or {}, |
|
114 | errors=errors.error_dict or {}, | |
113 | prefix_error=False, |
|
115 | prefix_error=False, | |
114 | encoding="UTF-8") |
|
116 | encoding="UTF-8") | |
115 |
|
117 | |||
116 | return render('/register.html') |
|
118 | return render('/register.html') | |
117 |
|
119 | |||
118 | def password_reset(self): |
|
120 | def password_reset(self): | |
119 | user_model = UserModel() |
|
121 | user_model = UserModel() | |
120 | if request.POST: |
|
122 | if request.POST: | |
121 |
|
123 | |||
122 | password_reset_form = PasswordResetForm()() |
|
124 | password_reset_form = PasswordResetForm()() | |
123 | try: |
|
125 | try: | |
124 | form_result = password_reset_form.to_python(dict(request.POST)) |
|
126 | form_result = password_reset_form.to_python(dict(request.POST)) | |
125 | user_model.reset_password(form_result) |
|
127 | user_model.reset_password(form_result) | |
126 | h.flash(_('Your new password was sent'), |
|
128 | h.flash(_('Your new password was sent'), | |
127 | category='success')
|
129 | category='success') | |
128 | return redirect(url('login_home')) |
|
130 | return redirect(url('login_home')) | |
129 |
|
131 | |||
130 | except formencode.Invalid, errors: |
|
132 | except formencode.Invalid, errors: | |
131 | return htmlfill.render( |
|
133 | return htmlfill.render( | |
132 | render('/password_reset.html'), |
|
134 | render('/password_reset.html'), | |
@@ -134,11 +136,11 b' class LoginController(BaseController):' | |||||
134 | errors=errors.error_dict or {}, |
|
136 | errors=errors.error_dict or {}, | |
135 | prefix_error=False, |
|
137 | prefix_error=False, | |
136 | encoding="UTF-8") |
|
138 | encoding="UTF-8") | |
137 |
|
139 | |||
138 | return render('/password_reset.html') |
|
140 | return render('/password_reset.html') | |
139 |
|
141 | |||
140 | def logout(self): |
|
142 | def logout(self): | |
141 | session['rhodecode_user'] = AuthUser() |
|
143 | session['rhodecode_user'] = AuthUser() | |
142 | session.save() |
|
144 | session.save() | |
143 | log.info('Logging out and setting user as Empty') |
|
145 | log.info('Logging out and setting user as Empty') | |
144 | redirect(url('
|
146 | redirect(url('home')) |
@@ -22,11 +22,11 b' Created on Aug 7, 2010' | |||||
22 | search controller for pylons |
|
22 | search controller for pylons | |
23 | @author: marcink |
|
23 | @author: marcink | |
24 | """ |
|
24 | """ | |
25 | from pylons import request, response, session, tmpl_context as c, url |
|
25 | from pylons import request, response, config, session, tmpl_context as c, url | |
26 | from pylons.controllers.util import abort, redirect |
|
26 | from pylons.controllers.util import abort, redirect | |
27 | from rhodecode.lib.auth import LoginRequired |
|
27 | from rhodecode.lib.auth import LoginRequired | |
28 | from rhodecode.lib.base import BaseController, render |
|
28 | from rhodecode.lib.base import BaseController, render | |
29 | from rhodecode.lib.indexers import
|
29 | from rhodecode.lib.indexers import SCHEMA, IDX_NAME, ResultWrapper | |
30 | from webhelpers.paginate import Page |
|
30 | from webhelpers.paginate import Page | |
31 | from webhelpers.util import update_params |
|
31 | from webhelpers.util import update_params | |
32 | from pylons.i18n.translation import _ |
|
32 | from pylons.i18n.translation import _ | |
@@ -42,7 +42,7 b' class SearchController(BaseController):' | |||||
42 |
|
42 | |||
43 | @LoginRequired() |
|
43 | @LoginRequired() | |
44 | def __before__(self): |
|
44 | def __before__(self): | |
45 | super(SearchController, self).__before__()
|
45 | super(SearchController, self).__before__() | |
46 |
|
46 | |||
47 | def index(self, search_repo=None): |
|
47 | def index(self, search_repo=None): | |
48 | c.repo_name = search_repo |
|
48 | c.repo_name = search_repo | |
@@ -56,15 +56,16 b' class SearchController(BaseController):' | |||||
56 | 'repository':'repository'}\ |
|
56 | 'repository':'repository'}\ | |
57 | .get(c.cur_type, 'content') |
|
57 | .get(c.cur_type, 'content') | |
58 |
|
58 | |||
59 |
|
59 | |||
60 | if c.cur_query: |
|
60 | if c.cur_query: | |
61 | cur_query = c.cur_query.lower() |
|
61 | cur_query = c.cur_query.lower() | |
62 |
|
62 | |||
63 | if c.cur_query: |
|
63 | if c.cur_query: | |
64 | p = int(request.params.get('page', 1)) |
|
64 | p = int(request.params.get('page', 1)) | |
65 | highlight_items = set() |
|
65 | highlight_items = set() | |
66 | try: |
|
66 | try: | |
67 | idx = open_dir(
|
67 | idx = open_dir(config['app_conf']['index_dir'] | |
|
68 | , indexname=IDX_NAME) | |||
68 | searcher = idx.searcher() |
|
69 | searcher = idx.searcher() | |
69 |
|
70 | |||
70 | qp = QueryParser(search_type, schema=SCHEMA) |
|
71 | qp = QueryParser(search_type, schema=SCHEMA) | |
@@ -72,7 +73,7 b' class SearchController(BaseController):' | |||||
72 | cur_query = u'repository:%s %s' % (c.repo_name, cur_query) |
|
73 | cur_query = u'repository:%s %s' % (c.repo_name, cur_query) | |
73 | try: |
|
74 | try: | |
74 | query = qp.parse(unicode(cur_query)) |
|
75 | query = qp.parse(unicode(cur_query)) | |
75 |
|
76 | |||
76 | if isinstance(query, Phrase): |
|
77 | if isinstance(query, Phrase): | |
77 | highlight_items.update(query.words) |
|
78 | highlight_items.update(query.words) | |
78 | else: |
|
79 | else: | |
@@ -81,14 +82,14 b' class SearchController(BaseController):' | |||||
81 | highlight_items.add(i[1]) |
|
82 | highlight_items.add(i[1]) | |
82 |
|
83 | |||
83 | matcher = query.matcher(searcher) |
|
84 | matcher = query.matcher(searcher) | |
84 |
|
85 | |||
85 | log.debug(query) |
|
86 | log.debug(query) | |
86 | log.debug(highlight_items) |
|
87 | log.debug(highlight_items) | |
87 | results = searcher.search(query) |
|
88 | results = searcher.search(query) | |
88 | res_ln = len(results) |
|
89 | res_ln = len(results) | |
89 | c.runtime = '%s results (%.3f seconds)' \ |
|
90 | c.runtime = '%s results (%.3f seconds)' \ | |
90 | % (res_ln, results.runtime) |
|
91 | % (res_ln, results.runtime) | |
91 |
|
92 | |||
92 | def url_generator(**kw): |
|
93 | def url_generator(**kw): | |
93 | return update_params("?q=%s&type=%s" \ |
|
94 | return update_params("?q=%s&type=%s" \ | |
94 | % (c.cur_query, c.cur_search), **kw) |
|
95 | % (c.cur_query, c.cur_search), **kw) | |
@@ -98,8 +99,8 b' class SearchController(BaseController):' | |||||
98 | highlight_items), |
|
99 | highlight_items), | |
99 | page=p, item_count=res_ln, |
|
100 | page=p, item_count=res_ln, | |
100 | items_per_page=10, url=url_generator) |
|
101 | items_per_page=10, url=url_generator) | |
101 |
|
102 | |||
102 |
|
103 | |||
103 | except QueryParserError: |
|
104 | except QueryParserError: | |
104 | c.runtime = _('Invalid search query. Try quoting it.') |
|
105 | c.runtime = _('Invalid search query. Try quoting it.') | |
105 | searcher.close() |
|
106 | searcher.close() | |
@@ -107,6 +108,6 b' class SearchController(BaseController):' | |||||
107 | log.error(traceback.format_exc()) |
|
108 | log.error(traceback.format_exc()) | |
108 | log.error('Empty Index data') |
|
109 | log.error('Empty Index data') | |
109 | c.runtime = _('There is no index to search in. Please run whoosh indexer') |
|
110 | c.runtime = _('There is no index to search in. Please run whoosh indexer') | |
110 |
|
111 | |||
111 | # Return a rendered template |
|
112 | # Return a rendered template | |
112 | return render('/search/search.html') |
|
113 | return render('/search/search.html') |
@@ -30,7 +30,7 b' from rhodecode.lib.auth import LoginRequ' | |||||
30 | from rhodecode.lib.base import BaseController, render |
|
30 | from rhodecode.lib.base import BaseController, render | |
31 | from rhodecode.lib.utils import invalidate_cache, action_logger |
|
31 | from rhodecode.lib.utils import invalidate_cache, action_logger | |
32 | from rhodecode.model.forms import RepoSettingsForm, RepoForkForm |
|
32 | from rhodecode.model.forms import RepoSettingsForm, RepoForkForm | |
33 | from rhodecode.model.repo
|
33 | from rhodecode.model.repo import RepoModel | |
34 | import formencode |
|
34 | import formencode | |
35 | import logging |
|
35 | import logging | |
36 | import rhodecode.lib.helpers as h |
|
36 | import rhodecode.lib.helpers as h | |
@@ -41,35 +41,35 b' log = logging.getLogger(__name__)' | |||||
41 | class SettingsController(BaseController): |
|
41 | class SettingsController(BaseController): | |
42 |
|
42 | |||
43 | @LoginRequired() |
|
43 | @LoginRequired() | |
44 | @HasRepoPermissionAllDecorator('repository.admin')
|
44 | @HasRepoPermissionAllDecorator('repository.admin') | |
45 | def __before__(self): |
|
45 | def __before__(self): | |
46 | super(SettingsController, self).__before__() |
|
46 | super(SettingsController, self).__before__() | |
47 |
|
47 | |||
48 | def index(self, repo_name): |
|
48 | def index(self, repo_name): | |
49 | repo_model = RepoModel() |
|
49 | repo_model = RepoModel() | |
50 | c.repo_info = repo = repo_model.get(repo_name) |
|
50 | c.repo_info = repo = repo_model.get_by_repo_name(repo_name) | |
51 | if not repo: |
|
51 | if not repo: | |
52 | h.flash(_('%s repository is not mapped to db perhaps'
|
52 | h.flash(_('%s repository is not mapped to db perhaps' | |
53 | ' it was created or renamed from the filesystem' |
|
53 | ' it was created or renamed from the filesystem' | |
54 | ' please run the application again' |
|
54 | ' please run the application again' | |
55 | ' in order to rescan repositories') % repo_name, |
|
55 | ' in order to rescan repositories') % repo_name, | |
56 | category='error') |
|
56 | category='error') | |
57 |
|
57 | |||
58 | return redirect(url('
|
58 | return redirect(url('home')) | |
59 | defaults = c.repo_info.
|
59 | defaults = c.repo_info.get_dict() | |
60 | defaults.update({'user':c.repo_info.user.username}) |
|
60 | defaults.update({'user':c.repo_info.user.username}) | |
61 | c.users_array = repo_model.get_users_js() |
|
61 | c.users_array = repo_model.get_users_js() | |
62 |
|
62 | |||
63 | for p in c.repo_info.repo_to_perm: |
|
63 | for p in c.repo_info.repo_to_perm: | |
64 | defaults.update({'perm_%s' % p.user.username:
|
64 | defaults.update({'perm_%s' % p.user.username: | |
65 | p.permission.permission_name}) |
|
65 | p.permission.permission_name}) | |
66 |
|
66 | |||
67 | return htmlfill.render( |
|
67 | return htmlfill.render( | |
68 | render('settings/repo_settings.html'), |
|
68 | render('settings/repo_settings.html'), | |
69 | defaults=defaults, |
|
69 | defaults=defaults, | |
70 | encoding="UTF-8", |
|
70 | encoding="UTF-8", | |
71 | force_defaults=False |
|
71 | force_defaults=False | |
72 | )
|
72 | ) | |
73 |
|
73 | |||
74 | def update(self, repo_name): |
|
74 | def update(self, repo_name): | |
75 | repo_model = RepoModel() |
|
75 | repo_model = RepoModel() | |
@@ -78,12 +78,14 b' class SettingsController(BaseController)' | |||||
78 | try: |
|
78 | try: | |
79 | form_result = _form.to_python(dict(request.POST)) |
|
79 | form_result = _form.to_python(dict(request.POST)) | |
80 | repo_model.update(repo_name, form_result) |
|
80 | repo_model.update(repo_name, form_result) | |
81 | invalidate_cache('
|
81 | invalidate_cache('get_repo_cached_%s' % repo_name) | |
82 | h.flash(_('Repository %s updated successfully' % repo_name), |
|
82 | h.flash(_('Repository %s updated successfully' % repo_name), | |
83 | category='success') |
|
83 | category='success') | |
84 | changed_name = form_result['repo_name']
|
84 | changed_name = form_result['repo_name'] | |
|
85 | action_logger(self.rhodecode_user, 'user_updated_repo', | |||
|
86 | changed_name, '', self.sa) | |||
85 | except formencode.Invalid, errors: |
|
87 | except formencode.Invalid, errors: | |
86 | c.repo_info = repo_model.get(repo_name) |
|
88 | c.repo_info = repo_model.get_by_repo_name(repo_name) | |
87 | c.users_array = repo_model.get_users_js() |
|
89 | c.users_array = repo_model.get_users_js() | |
88 | errors.value.update({'user':c.repo_info.user.username}) |
|
90 | errors.value.update({'user':c.repo_info.user.username}) | |
89 | return htmlfill.render( |
|
91 | return htmlfill.render( | |
@@ -91,17 +93,17 b' class SettingsController(BaseController)' | |||||
91 | defaults=errors.value, |
|
93 | defaults=errors.value, | |
92 | errors=errors.error_dict or {}, |
|
94 | errors=errors.error_dict or {}, | |
93 | prefix_error=False, |
|
95 | prefix_error=False, | |
94 | encoding="UTF-8")
|
96 | encoding="UTF-8") | |
95 | except Exception: |
|
97 | except Exception: | |
96 | log.error(traceback.format_exc()) |
|
98 | log.error(traceback.format_exc()) | |
97 | h.flash(_('error occured during update of repository %s') \ |
|
99 | h.flash(_('error occurred during update of repository %s') \ | |
98 | % repo_name, category='error') |
|
100 | % repo_name, category='error') | |
99 |
|
101 | |||
100 | return redirect(url('repo_settings_home', repo_name=changed_name)) |
|
102 | return redirect(url('repo_settings_home', repo_name=changed_name)) | |
101 |
|
103 | |||
102 |
|
104 | |||
103 |
|
105 | |||
104 | def delete(self, repo_name):
|
106 | def delete(self, repo_name): | |
105 | """DELETE /repos/repo_name: Delete an existing item""" |
|
107 | """DELETE /repos/repo_name: Delete an existing item""" | |
106 | # Forms posted to this method should contain a hidden field: |
|
108 | # Forms posted to this method should contain a hidden field: | |
107 | # <input type="hidden" name="_method" value="DELETE" /> |
|
109 | # <input type="hidden" name="_method" value="DELETE" /> | |
@@ -109,67 +111,68 b' class SettingsController(BaseController)' | |||||
109 | # h.form(url('repo_settings_delete', repo_name=ID), |
|
111 | # h.form(url('repo_settings_delete', repo_name=ID), | |
110 | # method='delete') |
|
112 | # method='delete') | |
111 | # url('repo_settings_delete', repo_name=ID) |
|
113 | # url('repo_settings_delete', repo_name=ID) | |
112 |
|
114 | |||
113 | repo_model = RepoModel() |
|
115 | repo_model = RepoModel() | |
114 | repo = repo_model.get(repo_name) |
|
116 | repo = repo_model.get_by_repo_name(repo_name) | |
115 | if not repo: |
|
117 | if not repo: | |
116 | h.flash(_('%s repository is not mapped to db perhaps'
|
118 | h.flash(_('%s repository is not mapped to db perhaps' | |
117 | ' it was moved or renamed from the filesystem' |
|
119 | ' it was moved or renamed from the filesystem' | |
118 | ' please run the application again' |
|
120 | ' please run the application again' | |
119 | ' in order to rescan repositories') % repo_name, |
|
121 | ' in order to rescan repositories') % repo_name, | |
120 | category='error') |
|
122 | category='error') | |
121 |
|
123 | |||
122 | return redirect(url('
|
124 | return redirect(url('home')) | |
123 | try: |
|
125 | try: | |
124 | action_logger(self.rhodecode_user, 'user_deleted_repo', |
|
126 | action_logger(self.rhodecode_user, 'user_deleted_repo', | |
125 | repo_name, '', self.sa)
|
127 | repo_name, '', self.sa) | |
126 | repo_model.delete(repo)
|
128 | repo_model.delete(repo) | |
127 | invalidate_cache('
|
129 | invalidate_cache('get_repo_cached_%s' % repo_name) | |
128 | h.flash(_('deleted repository %s') % repo_name, category='success') |
|
130 | h.flash(_('deleted repository %s') % repo_name, category='success') | |
129 | except Exception: |
|
131 | except Exception: | |
130 | h.flash(_('An error occurred during deletion of %s') % repo_name, |
|
132 | h.flash(_('An error occurred during deletion of %s') % repo_name, | |
131 | category='error') |
|
133 | category='error') | |
132 |
|
134 | |||
133 | return redirect(url('
|
135 | return redirect(url('home')) | |
134 |
|
136 | |||
135 | def fork(self, repo_name): |
|
137 | def fork(self, repo_name): | |
136 | repo_model = RepoModel() |
|
138 | repo_model = RepoModel() | |
137 | c.repo_info = repo = repo_model.get(repo_name) |
|
139 | c.repo_info = repo = repo_model.get_by_repo_name(repo_name) | |
138 | if not repo: |
|
140 | if not repo: | |
139 | h.flash(_('%s repository is not mapped to db perhaps'
|
141 | h.flash(_('%s repository is not mapped to db perhaps' | |
140 | ' it was created or renamed from the filesystem' |
|
142 | ' it was created or renamed from the filesystem' | |
141 | ' please run the application again' |
|
143 | ' please run the application again' | |
142 | ' in order to rescan repositories') % repo_name, |
|
144 | ' in order to rescan repositories') % repo_name, | |
143 | category='error') |
|
145 | category='error') | |
144 |
|
146 | |||
145 | return redirect(url('
|
147 | return redirect(url('home')) | |
146 |
|
148 | |||
147 | return render('settings/repo_fork.html') |
|
149 | return render('settings/repo_fork.html') | |
148 |
|
150 | |||
149 |
|
151 | |||
150 |
|
152 | |||
151 | def fork_create(self, repo_name): |
|
153 | def fork_create(self, repo_name): | |
152 | repo_model = RepoModel() |
|
154 | repo_model = RepoModel() | |
153 | c.repo_info = repo_model.get(repo_name) |
|
155 | c.repo_info = repo_model.get_by_repo_name(repo_name) | |
154 | _form = RepoForkForm()() |
|
156 | _form = RepoForkForm(old_data={'repo_type':c.repo_info.repo_type})() | |
155 | form_result = {} |
|
157 | form_result = {} | |
156 | try: |
|
158 | try: | |
157 | form_result = _form.to_python(dict(request.POST)) |
|
159 | form_result = _form.to_python(dict(request.POST)) | |
158 | form_result.update({'repo_name':repo_name}) |
|
160 | form_result.update({'repo_name':repo_name}) | |
159 | repo_model.create_fork(form_result, c.rhodecode_user) |
|
161 | repo_model.create_fork(form_result, c.rhodecode_user) | |
160 |
h.flash(_('fork %s repository as %s |
|
162 | h.flash(_('forked %s repository as %s') \ | |
161 | % (repo_name, form_result['fork_name']), |
|
163 | % (repo_name, form_result['fork_name']), | |
162 | category='success') |
|
164 | category='success') | |
163 |
action_logger(self.rhodecode_user, |
|
165 | action_logger(self.rhodecode_user, | |
164 | repo_name, '', self.sa) |
|
166 | 'user_forked_repo:%s' % form_result['fork_name'], | |
|
167 | repo_name, '', self.sa) | |||
165 | except formencode.Invalid, errors: |
|
168 | except formencode.Invalid, errors: | |
166 | c.new_repo = errors.value['fork_name'] |
|
169 | c.new_repo = errors.value['fork_name'] | |
167 | r = render('settings/repo_fork.html') |
|
170 | r = render('settings/repo_fork.html') | |
168 |
|
171 | |||
169 | return htmlfill.render( |
|
172 | return htmlfill.render( | |
170 | r, |
|
173 | r, | |
171 | defaults=errors.value, |
|
174 | defaults=errors.value, | |
172 | errors=errors.error_dict or {}, |
|
175 | errors=errors.error_dict or {}, | |
173 | prefix_error=False, |
|
176 | prefix_error=False, | |
174 |
encoding="UTF-8") |
|
177 | encoding="UTF-8") | |
175 |
return redirect(url(' |
|
178 | return redirect(url('home')) |
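The `fork_create` hunk follows the standard formencode pattern in Pylons controllers: run the POST dict through the form's `to_python` and, on `formencode.Invalid`, re-render the template via `htmlfill` with the submitted values and per-field errors. Below is a self-contained sketch of that control flow; the `Invalid` class and `fork_form_to_python` validator are hand-rolled stand-ins for illustration, not RhodeCode's actual `RepoForkForm` API.

```python
# Sketch of the validate-or-redisplay flow used by fork_create above.
class Invalid(Exception):
    def __init__(self, value, error_dict):
        self.value = value            # submitted values, echoed back into the form
        self.error_dict = error_dict  # field name -> error message

def fork_form_to_python(post):
    """Validate the POST dict, returning cleaned values or raising Invalid."""
    errors = {}
    if not post.get('fork_name', '').strip():
        errors['fork_name'] = 'Please enter a name'
    if errors:
        raise Invalid(post, errors)
    return {'fork_name': post['fork_name'].strip()}

def fork_create(repo_name, post):
    try:
        form_result = fork_form_to_python(post)
        form_result.update({'repo_name': repo_name})
        # success: create the fork, log the action, redirect home
        return ('redirect', form_result)
    except Invalid as errors:
        # failure: re-render the form template with per-field errors filled in
        return ('render', errors.error_dict)
```

The same shape (try `to_python`, catch `Invalid`, re-render with `errors.value` as defaults) recurs throughout RhodeCode's controllers.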
@@ -25,7 +25,7 @@ shortlog controller for pylons
 from pylons import tmpl_context as c, request
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseController, render
-from rhodecode.model.hg_model import HgModel
+from rhodecode.model.scm import ScmModel
 from webhelpers.paginate import Page
 import logging
 log = logging.getLogger(__name__)
@@ -40,7 +40,7 @@ class ShortlogController(BaseController)
 
     def index(self):
         p = int(request.params.get('page', 1))
-        repo = HgModel().get_repo(c.repo_name)
+        repo = ScmModel().get_repo(c.repo_name)
         c.repo_changesets = Page(repo, page=p, items_per_page=20)
         c.shortlog_data = render('shortlog/shortlog_data.html')
         if request.params.get('partial'):
@@ -1,8 +1,15 @@
-#!/usr/bin/env python
-# encoding: utf-8
-# summary controller for pylons
-# Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
-#
+# -*- coding: utf-8 -*-
+"""
+    package.rhodecode.controllers.summary
+    ~~~~~~~~~~~~~~
+
+    Summary controller for Rhodecode
+
+    :created_on: Apr 18, 2010
+    :author: marcink
+    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+    :license: GPLv3, see COPYING for more details.
+"""
 # This program is free software; you can redistribute it and/or
 # modify it under the terms of the GNU General Public License
 # as published by the Free Software Foundation; version 2
@@ -17,24 +24,29 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
 # MA 02110-1301, USA.
-"""
-Created on April 18, 2010
-summary controller for pylons
-@author: marcink
-"""
+
+import calendar
+import logging
+from time import mktime
+from datetime import datetime, timedelta
+
+from vcs.exceptions import ChangesetError
+
 from pylons import tmpl_context as c, request, url
+from pylons.i18n.translation import _
+
+from rhodecode.model.scm import ScmModel
+from rhodecode.model.db import Statistics
+
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseController, render
-from rhodecode.lib.utils import OrderedDict
-from rhodecode.model.hg_model import HgModel
-from rhodecode.model.db import Statistics
-from webhelpers.paginate import Page
+from rhodecode.lib.utils import OrderedDict, EmptyChangeset
+
 from rhodecode.lib.celerylib import run_task
 from rhodecode.lib.celerylib.tasks import get_commits_stats
-from datetime import datetime, timedelta
-from time import mktime
-import calendar
-import logging
+
+from webhelpers.paginate import Page
+
 try:
     import json
 except ImportError:
@@ -43,66 +55,90 @@ except ImportError:
 log = logging.getLogger(__name__)
 
 class SummaryController(BaseController):
 
     @LoginRequired()
     @HasRepoPermissionAnyDecorator('repository.read', 'repository.write',
                                    'repository.admin')
     def __before__(self):
         super(SummaryController, self).__before__()
 
     def index(self):
-        hg_model = HgModel()
-        c.repo_info = hg_model.get_repo(c.repo_name)
-        c.repo_changesets = Page(list(c.repo_info[:10]), page=1, items_per_page=20)
+        scm_model = ScmModel()
+        c.repo_info = scm_model.get_repo(c.repo_name)
+        c.following = scm_model.is_following_repo(c.repo_name,
+                                                  c.rhodecode_user.user_id)
+        def url_generator(**kw):
+            return url('shortlog_home', repo_name=c.repo_name, **kw)
+
+        c.repo_changesets = Page(c.repo_info, page=1, items_per_page=10,
+                                 url=url_generator)
+
         e = request.environ
 
-        uri = u'%(protocol)s://%(user)s@%(host)s%(prefix)s/%(repo_name)s' % {
+        if self.rhodecode_user.username == 'default':
+            password = ':default'
+        else:
+            password = ''
+
+        uri = u'%(protocol)s://%(user)s%(password)s@%(host)s%(prefix)s/%(repo_name)s' % {
             'protocol': e.get('wsgi.url_scheme'),
             'user':str(c.rhodecode_user.username),
+            'password':password,
             'host':e.get('HTTP_HOST'),
             'prefix':e.get('SCRIPT_NAME'),
             'repo_name':c.repo_name, }
         c.clone_repo_url = uri
         c.repo_tags = OrderedDict()
         for name, hash in c.repo_info.tags.items()[:10]:
-            c.repo_tags[name] = c.repo_info.get_changeset(hash)
-
+            try:
+                c.repo_tags[name] = c.repo_info.get_changeset(hash)
+            except ChangesetError:
+                c.repo_tags[name] = EmptyChangeset(hash)
+
         c.repo_branches = OrderedDict()
         for name, hash in c.repo_info.branches.items()[:10]:
-            c.repo_branches[name] = c.repo_info.get_changeset(hash)
-
-        td = datetime.today() + timedelta(days=1)
+            try:
+                c.repo_branches[name] = c.repo_info.get_changeset(hash)
+            except ChangesetError:
+                c.repo_branches[name] = EmptyChangeset(hash)
+
+        td = datetime.today() + timedelta(days=1)
         y, m, d = td.year, td.month, td.day
 
         ts_min_y = mktime((y - 1, (td - timedelta(days=calendar.mdays[m])).month,
                             d, 0, 0, 0, 0, 0, 0,))
         ts_min_m = mktime((y, (td - timedelta(days=calendar.mdays[m])).month,
                             d, 0, 0, 0, 0, 0, 0,))
 
         ts_max_y = mktime((y, m, d, 0, 0, 0, 0, 0, 0,))
-
-        run_task(get_commits_stats, c.repo_info.name, ts_min_y, ts_max_y)
+        if c.repo_info.dbrepo.enable_statistics:
+            c.no_data_msg = _('No data loaded yet')
+            run_task(get_commits_stats, c.repo_info.name, ts_min_y, ts_max_y)
+        else:
+            c.no_data_msg = _('Statistics update are disabled for this repository')
         c.ts_min = ts_min_m
         c.ts_max = ts_max_y
 
         stats = self.sa.query(Statistics)\
             .filter(Statistics.repository == c.repo_info.dbrepo)\
             .scalar()
 
 
         if stats and stats.languages:
+            c.no_data = False is c.repo_info.dbrepo.enable_statistics
             lang_stats = json.loads(stats.languages)
             c.commit_data = stats.commit_activity
             c.overview_data = stats.commit_activity_combined
             c.trending_languages = json.dumps(OrderedDict(
                                        sorted(lang_stats.items(), reverse=True,
                                               key=lambda k: k[1])[:10]
                                         )
                                     )
         else:
             c.commit_data = json.dumps({})
-            c.overview_data = json.dumps([[ts_min_y, 0], [ts_max_y, 0] ])
+            c.overview_data = json.dumps([[ts_min_y, 0], [ts_max_y, 10] ])
             c.trending_languages = json.dumps({})
-
+            c.no_data = True
+
         return render('summary/summary.html')
 
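The summary hunk above builds the clone URL by interpolating the WSGI environ into a URI template, and now substitutes `:default` as the password when the request comes from the anonymous `default` user, so a non-interactive `hg clone` works. A stand-alone sketch of that interpolation (the `clone_url` function name and the sample environ are illustrative, not RhodeCode's API):

```python
# Rebuild the clone URI the way the new summary controller code does:
# anonymous 'default' users get ':default' embedded as the password.
def clone_url(environ, username, repo_name):
    password = ':default' if username == 'default' else ''
    return u'%(protocol)s://%(user)s%(password)s@%(host)s%(prefix)s/%(repo_name)s' % {
        'protocol': environ.get('wsgi.url_scheme'),
        'user': str(username),
        'password': password,
        'host': environ.get('HTTP_HOST'),
        'prefix': environ.get('SCRIPT_NAME'),  # mount point of the WSGI app
        'repo_name': repo_name,
    }

env = {'wsgi.url_scheme': 'http', 'HTTP_HOST': 'hg.example.com', 'SCRIPT_NAME': ''}
print(clone_url(env, 'default', 'myrepo'))
# -> http://default:default@hg.example.com/myrepo
```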
@@ -26,7 +26,7 @@ from pylons import tmpl_context as c
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.utils import OrderedDict
-from rhodecode.model.hg_model import HgModel
+from rhodecode.model.scm import ScmModel
 import logging
 log = logging.getLogger(__name__)
 
@@ -38,7 +38,7 @@ class TagsController(BaseController):
         super(TagsController, self).__before__()
 
     def index(self):
-        hg_model = HgModel()
+        hg_model = ScmModel()
         c.repo_info = hg_model.get_repo(c.repo_name)
         c.repo_tags = OrderedDict()
         for name, hash_ in c.repo_info.tags.items():
@@ -19,13 +19,13 @@ class Globals(object):
         self.cache = CacheManager(**parse_cache_config_options(config))
         self.available_permissions = None # propagated after init_model
         self.baseui = None # propagated after init_model
 
     @LazyProperty
     def paths(self):
         if self.baseui:
             return self.baseui.configitems('paths')
 
     @LazyProperty
     def base_path(self):
         if self.baseui:
             return self.paths[0][1]
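`Globals` above relies on a `LazyProperty` descriptor: the decorated method runs once, and its result replaces the descriptor in the instance dict, so later lookups are plain attribute reads. A minimal stand-in implementation of that pattern (the real class ships with the `vcs` library; this is a sketch, not its exact code):

```python
# Non-data descriptor that caches the computed value on the instance.
class LazyProperty(object):
    def __init__(self, func):
        self._func = func
        self.__name__ = func.__name__
        self.__doc__ = func.__doc__

    def __get__(self, obj, klass=None):
        if obj is None:
            return self  # accessed on the class, not an instance
        # compute once and shadow the descriptor in the instance __dict__
        result = obj.__dict__[self.__name__] = self._func(obj)
        return result
```

Because `LazyProperty` defines no `__set__`, it is a non-data descriptor, so the cached value stored in the instance `__dict__` takes precedence on every subsequent lookup.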
@@ -22,22 +22,23 b' Created on April 4, 2010' | |||||
22 |
|
22 | |||
23 | @author: marcink |
|
23 | @author: marcink | |
24 | """ |
|
24 | """ | |
25 | from beaker.cache import cache_region |
|
|||
26 | from pylons import config, session, url, request |
|
25 | from pylons import config, session, url, request | |
27 | from pylons.controllers.util import abort, redirect |
|
26 | from pylons.controllers.util import abort, redirect | |
|
27 | from rhodecode.lib.exceptions import * | |||
28 | from rhodecode.lib.utils import get_repo_slug |
|
28 | from rhodecode.lib.utils import get_repo_slug | |
|
29 | from rhodecode.lib.auth_ldap import AuthLdap | |||
29 | from rhodecode.model import meta |
|
30 | from rhodecode.model import meta | |
|
31 | from rhodecode.model.user import UserModel | |||
30 | from rhodecode.model.caching_query import FromCache |
|
32 | from rhodecode.model.caching_query import FromCache | |
31 | from rhodecode.model.db import User, RepoToPerm, Repository, Permission, \ |
|
33 | from rhodecode.model.db import User, RepoToPerm, Repository, Permission, \ | |
32 | UserToPerm |
|
34 | UserToPerm | |
33 | from sqlalchemy.exc import OperationalError |
|
|||
34 | from sqlalchemy.orm.exc import NoResultFound, MultipleResultsFound |
|
|||
35 | import bcrypt |
|
35 | import bcrypt | |
36 | from decorator import decorator |
|
36 | from decorator import decorator | |
37 | import logging |
|
37 | import logging | |
38 | import random |
|
38 | import random | |
|
39 | import traceback | |||
39 |
|
40 | |||
40 |
log = logging.getLogger(__name__) |
|
41 | log = logging.getLogger(__name__) | |
41 |
|
42 | |||
42 | class PasswordGenerator(object): |
|
43 | class PasswordGenerator(object): | |
43 | """This is a simple class for generating password from |
|
44 | """This is a simple class for generating password from | |
@@ -56,7 +57,7 b' class PasswordGenerator(object):' | |||||
56 | ALPHABETS_BIG_SMALL = ALPHABETS_BIG + ALPHABETS_SMALL |
|
57 | ALPHABETS_BIG_SMALL = ALPHABETS_BIG + ALPHABETS_SMALL | |
57 | ALPHABETS_ALPHANUM_BIG = ALPHABETS_BIG + ALPHABETS_NUM#[6] |
|
58 | ALPHABETS_ALPHANUM_BIG = ALPHABETS_BIG + ALPHABETS_NUM#[6] | |
58 | ALPHABETS_ALPHANUM_SMALL = ALPHABETS_SMALL + ALPHABETS_NUM#[7] |
|
59 | ALPHABETS_ALPHANUM_SMALL = ALPHABETS_SMALL + ALPHABETS_NUM#[7] | |
59 |
|
60 | |||
60 | def __init__(self, passwd=''): |
|
61 | def __init__(self, passwd=''): | |
61 | self.passwd = passwd |
|
62 | self.passwd = passwd | |
62 |
|
63 | |||
@@ -64,40 +65,93 b' class PasswordGenerator(object):' | |||||
64 | self.passwd = ''.join([random.choice(type) for _ in xrange(len)]) |
|
65 | self.passwd = ''.join([random.choice(type) for _ in xrange(len)]) | |
65 | return self.passwd |
|
66 | return self.passwd | |
66 |
|
67 | |||
67 |
|
68 | |||
68 | def get_crypt_password(password): |
|
69 | def get_crypt_password(password): | |
69 | """Cryptographic function used for password hashing based on sha1 |
|
70 | """Cryptographic function used for password hashing based on sha1 | |
70 | :param password: password to hash |
|
71 | :param password: password to hash | |
71 |
""" |
|
72 | """ | |
72 | return bcrypt.hashpw(password, bcrypt.gensalt(10)) |
|
73 | return bcrypt.hashpw(password, bcrypt.gensalt(10)) | |
73 |
|
74 | |||
74 | def check_password(password, hashed): |
|
75 | def check_password(password, hashed): | |
75 | return bcrypt.hashpw(password, hashed) == hashed |
|
76 | return bcrypt.hashpw(password, hashed) == hashed | |
76 |
|
77 | |||
77 | @cache_region('super_short_term', 'cached_user') |
|
78 | def authfunc(environ, username, password): | |
78 | def get_user_cached(username): |
|
79 | """ | |
79 | sa = meta.Session |
|
80 | Dummy authentication function used in Mercurial/Git/ and access control, | |
80 |
|
|
81 | ||
81 | user = sa.query(User).filter(User.username == username).one() |
|
82 | :param environ: needed only for using in Basic auth | |
82 | finally: |
|
83 | """ | |
83 | meta.Session.remove() |
|
84 | return authenticate(username, password) | |
84 | return user |
|
85 | ||
85 |
|
86 | |||
86 |
def auth |
|
87 | def authenticate(username, password): | |
87 | try: |
|
88 | """ | |
88 | user = get_user_cached(username) |
|
89 | Authentication function used for access control, | |
89 | except (NoResultFound, MultipleResultsFound, OperationalError), e: |
|
90 | firstly checks for db authentication then if ldap is enabled for ldap | |
90 | log.error(e) |
|
91 | authentication, also creates ldap user if not in database | |
91 | user = None |
|
92 | ||
92 |
|
93 | :param username: username | ||
93 | if user: |
|
94 | :param password: password | |
|
95 | """ | |||
|
96 | user_model = UserModel() | |||
|
97 | user = user_model.get_by_username(username, cache=False) | |||
|
98 | ||||
|
99 | log.debug('Authenticating user using RhodeCode account') | |||
|
100 | if user is not None and user.is_ldap is False: | |||
94 | if user.active: |
|
101 | if user.active: | |
95 | if user.username == username and check_password(password, user.password): |
|
102 | ||
|
103 | if user.username == 'default' and user.active: | |||
|
104 | log.info('user %s authenticated correctly as anonymous user', | |||
|
105 | username) | |||
|
106 | return True | |||
|
107 | ||||
|
108 | elif user.username == username and check_password(password, user.password): | |||
96 | log.info('user %s authenticated correctly', username) |
|
109 | log.info('user %s authenticated correctly', username) | |
97 | return True |
|
110 | return True | |
98 | else: |
|
111 | else: | |
99 |
log. |
|
112 | log.warning('user %s is disabled', username) | |
100 |
|
113 | |||
|
114 | else: | |||
|
115 | log.debug('Regular authentication failed') | |||
|
116 | user_obj = user_model.get_by_username(username, cache=False, | |||
|
117 | case_insensitive=True) | |||
|
118 | ||||
|
119 | if user_obj is not None and user_obj.is_ldap is False: | |||
|
120 | log.debug('this user already exists as non ldap') | |||
|
121 | return False | |||
|
122 | ||||
|
123 | from rhodecode.model.settings import SettingsModel | |||
|
124 | ldap_settings = SettingsModel().get_ldap_settings() | |||
|
125 | ||||
|
126 | #====================================================================== | |||
|
127 | # FALLBACK TO LDAP AUTH IN ENABLE | |||
|
128 | #====================================================================== | |||
|
129 | if ldap_settings.get('ldap_active', False): | |||
|
130 | log.debug("Authenticating user using ldap") | |||
|
131 | kwargs = { | |||
|
132 | 'server':ldap_settings.get('ldap_host', ''), | |||
|
133 | 'base_dn':ldap_settings.get('ldap_base_dn', ''), | |||
|
134 | 'port':ldap_settings.get('ldap_port'), | |||
|
135 | 'bind_dn':ldap_settings.get('ldap_dn_user'), | |||
|
136 | 'bind_pass':ldap_settings.get('ldap_dn_pass'), | |||
|
137 | 'use_ldaps':ldap_settings.get('ldap_ldaps'), | |||
|
138 | 'ldap_version':3, | |||
|
139 | } | |||
|
140 | log.debug('Checking for ldap authentication') | |||
|
141 | try: | |||
|
142 | aldap = AuthLdap(**kwargs) | |||
|
143 | res = aldap.authenticate_ldap(username, password) | |||
|
144 | log.debug('Got ldap response %s', res) | |||
|
145 | ||||
|
146 | if user_model.create_ldap(username, password): | |||
|
147 | log.info('created new ldap user') | |||
|
148 | ||||
|
149 | return True | |||
|
150 | except (LdapUsernameError, LdapPasswordError,): | |||
|
151 | pass | |||
|
152 | except (Exception,): | |||
|
153 | log.error(traceback.format_exc()) | |||
|
154 | pass | |||
101 | return False |
|
155 | return False | |
102 |
|
156 | |||
103 | class AuthUser(object): |
|
157 | class AuthUser(object): | |
@@ -114,6 +168,8 b' class AuthUser(object):' | |||||
114 | self.is_admin = False |
|
168 | self.is_admin = False | |
115 | self.permissions = {} |
|
169 | self.permissions = {} | |
116 |
|
170 | |||
|
171 | def __repr__(self): | |||
|
172 | return "<AuthUser('id:%s:%s')>" % (self.user_id, self.username) | |||
117 |
|
173 | |||
118 | def set_available_permissions(config): |
|
174 | def set_available_permissions(config): | |
119 | """ |
|
175 | """ | |
@@ -125,82 +181,62 b' def set_available_permissions(config):' | |||||
125 | """ |
|
181 | """ | |
126 | log.info('getting information about all available permissions') |
|
182 | log.info('getting information about all available permissions') | |
127 | try: |
|
183 | try: | |
128 | sa = meta.Session |
|
184 | sa = meta.Session() | |
129 | all_perms = sa.query(Permission).all() |
|
185 | all_perms = sa.query(Permission).all() | |
|
186 | except: | |||
|
187 | pass | |||
130 | finally: |
|
188 | finally: | |
131 | meta.Session.remove() |
|
189 | meta.Session.remove() | |
132 |
|
190 | |||
133 | config['available_permissions'] = [x.permission_name for x in all_perms] |
|
191 | config['available_permissions'] = [x.permission_name for x in all_perms] | |
134 |
|
192 | |||
135 | def set_base_path(config): |
|
193 | def set_base_path(config): | |
136 | config['base_path'] = config['pylons.app_globals'].base_path |
|
194 | config['base_path'] = config['pylons.app_globals'].base_path | |
137 |
|
195 | |||
138 | def fill_data(user): |
|
196 | ||
139 | """ |
|
|||
140 | Fills user data with those from database and log out user if not present |
|
|||
141 | in database |
|
|||
142 | :param user: |
|
|||
143 | """ |
|
|||
144 | sa = meta.Session |
|
|||
145 | dbuser = sa.query(User).options(FromCache('sql_cache_short', |
|
|||
146 | 'getuser_%s' % user.user_id))\ |
|
|||
147 | .get(user.user_id) |
|
|||
148 | if dbuser: |
|
|||
149 | user.username = dbuser.username |
|
|||
150 | user.is_admin = dbuser.admin |
|
|||
151 | user.name = dbuser.name |
|
|||
152 | user.lastname = dbuser.lastname |
|
|||
153 | user.email = dbuser.email |
|
|||
154 | else: |
|
|||
155 | user.is_authenticated = False |
|
|||
156 | meta.Session.remove() |
|
|||
157 | return user |
|
|||
158 |
|
||||
159 | def fill_perms(user): |
|
197 | def fill_perms(user): | |
160 | """ |
|
198 | """ | |
161 | Fills user permission attribute with permissions taken from database |
|
199 | Fills user permission attribute with permissions taken from database | |
162 | :param user: |
|
200 | :param user: | |
163 | """ |
|
201 | """ | |
164 |
|
202 | |||
165 | sa = meta.Session |
|
203 | sa = meta.Session() | |
166 | user.permissions['repositories'] = {} |
|
204 | user.permissions['repositories'] = {} | |
167 | user.permissions['global'] = set() |
|
205 | user.permissions['global'] = set() | |
168 |
|
206 | |||
169 | #=========================================================================== |
|
207 | #=========================================================================== | |
170 | # fetch default permissions |
|
208 | # fetch default permissions | |
171 | #=========================================================================== |
|
209 | #=========================================================================== | |
172 | default_user = sa.query(User)\ |
|
210 | default_user = UserModel().get_by_username('default', cache=True) | |
173 | .options(FromCache('sql_cache_short','getuser_%s' % 'default'))\ |
|
211 | ||
174 | .filter(User.username == 'default').scalar() |
|
|||
175 |
|
||||
176 | default_perms = sa.query(RepoToPerm, Repository, Permission)\ |
|
212 | default_perms = sa.query(RepoToPerm, Repository, Permission)\ | |
177 | .join((Repository, RepoToPerm.repository_id == Repository.repo_id))\ |
|
213 | .join((Repository, RepoToPerm.repository_id == Repository.repo_id))\ | |
178 | .join((Permission, RepoToPerm.permission_id == Permission.permission_id))\ |
|
214 | .join((Permission, RepoToPerm.permission_id == Permission.permission_id))\ | |
179 | .filter(RepoToPerm.user == default_user).all() |
|
215 | .filter(RepoToPerm.user == default_user).all() | |
180 |
|
216 | |||
181 | if user.is_admin: |
|
217 | if user.is_admin: | |
182 | #======================================================================= |
|
218 | #======================================================================= | |
183 | # #admin have all default rights set to admin |
|
219 | # #admin have all default rights set to admin | |
184 | #======================================================================= |
|
220 | #======================================================================= | |
185 | user.permissions['global'].add('hg.admin') |
|
221 | user.permissions['global'].add('hg.admin') | |
186 |
|
222 | |||
187 | for perm in default_perms: |
|
223 | for perm in default_perms: | |
188 | p = 'repository.admin' |
|
224 | p = 'repository.admin' | |
189 | user.permissions['repositories'][perm.RepoToPerm.repository.repo_name] = p |
|
225 | user.permissions['repositories'][perm.RepoToPerm.repository.repo_name] = p | |
190 |
|
226 | |||
191 | else: |
|
227 | else: | |
192 | #======================================================================= |
|
228 | #======================================================================= | |
193 | # set default permissions |
|
229 | # set default permissions | |
194 | #======================================================================= |
|
230 | #======================================================================= | |
195 |
|
231 | |||
196 | #default global |
|
232 | #default global | |
197 | default_global_perms = sa.query(UserToPerm)\ |
|
233 | default_global_perms = sa.query(UserToPerm)\ | |
198 |
.filter(UserToPerm.user == sa.query(User) |
|
234 | .filter(UserToPerm.user == sa.query(User)\ | |
199 | 'default').one()) |
|
235 | .filter(User.username == 'default').one()) | |
200 |
|
236 | |||
201 | for perm in default_global_perms: |
|
237 | for perm in default_global_perms: | |
202 | user.permissions['global'].add(perm.permission.permission_name) |
|
238 | user.permissions['global'].add(perm.permission.permission_name) | |
203 |
|
239 | |||
204 | #default repositories |
|
240 | #default repositories | |
205 | for perm in default_perms: |
|
241 | for perm in default_perms: | |
206 | if perm.Repository.private and not perm.Repository.user_id == user.user_id: |
|
242 | if perm.Repository.private and not perm.Repository.user_id == user.user_id: | |
@@ -211,9 +247,9 b' def fill_perms(user):' | |||||
211 | p = 'repository.admin' |
|
247 | p = 'repository.admin' | |
212 | else: |
|
248 | else: | |
213 | p = perm.Permission.permission_name |
|
249 | p = perm.Permission.permission_name | |
214 |
|
250 | |||
215 | user.permissions['repositories'][perm.RepoToPerm.repository.repo_name] = p |
|
251 | user.permissions['repositories'][perm.RepoToPerm.repository.repo_name] = p | |
216 |
|
252 | |||
217 | #======================================================================= |
|
253 | #======================================================================= | |
218 | # #overwrite default with user permissions if any |
|
254 | # #overwrite default with user permissions if any | |
219 | #======================================================================= |
|
255 | #======================================================================= | |
@@ -221,38 +257,52 b' def fill_perms(user):' | |||||
221 | .join((Repository, RepoToPerm.repository_id == Repository.repo_id))\ |
|
257 | .join((Repository, RepoToPerm.repository_id == Repository.repo_id))\ | |
222 | .join((Permission, RepoToPerm.permission_id == Permission.permission_id))\ |
|
258 | .join((Permission, RepoToPerm.permission_id == Permission.permission_id))\ | |
223 | .filter(RepoToPerm.user_id == user.user_id).all() |
|
259 | .filter(RepoToPerm.user_id == user.user_id).all() | |
224 |
|
260 | |||
225 | for perm in user_perms: |
|
261 | for perm in user_perms: | |
226 | if perm.Repository.user_id == user.user_id:#set admin if owner |
|
262 | if perm.Repository.user_id == user.user_id:#set admin if owner | |
227 | p = 'repository.admin' |
|
263 | p = 'repository.admin' | |
228 | else: |
|
264 | else: | |
229 | p = perm.Permission.permission_name |
|
265 | p = perm.Permission.permission_name | |
230 | user.permissions['repositories'][perm.RepoToPerm.repository.repo_name] = p |
|
266 | user.permissions['repositories'][perm.RepoToPerm.repository.repo_name] = p | |
231 |
meta.Session.remove() |
|
267 | meta.Session.remove() | |
232 | return user |
|
268 | return user | |
233 |
|
269 | |||
234 | def get_user(session): |
|
270 | def get_user(session): | |
235 | """ |
|
271 | """ | |
236 | Gets user from session, and wraps permissions into user |
|
272 | Gets user from session, and wraps permissions into user | |
237 | :param session: |
|
273 | :param session: | |
238 | """ |
|
274 | """ | |
239 | user = session.get('rhodecode_user', AuthUser()) |
|
275 | user = session.get('rhodecode_user', AuthUser()) | |
|
276 | #if the user is not logged in we check for anonymous access | |||
|
277 | #if user is logged and it's a default user check if we still have anonymous | |||
|
278 | #access enabled | |||
|
279 | if user.user_id is None or user.username == 'default': | |||
|
280 | anonymous_user = UserModel().get_by_username('default', cache=True) | |||
|
281 | if anonymous_user.active is True: | |||
|
282 | #then we set this user is logged in | |||
|
283 | user.is_authenticated = True | |||
|
284 | user.user_id = anonymous_user.user_id | |||
|
285 | else: | |||
|
286 | user.is_authenticated = False | |||
|
287 | ||||
240 | if user.is_authenticated: |
|
288 | if user.is_authenticated: | |
241 | user = fill_data(user) |
|
289 | user = UserModel().fill_data(user) | |
|
290 | ||||
242 | user = fill_perms(user) |
|
291 | user = fill_perms(user) | |
243 | session['rhodecode_user'] = user |
|
292 | session['rhodecode_user'] = user | |
244 | session.save() |
|
293 | session.save() | |
245 | return user |
|
294 | return user | |
246 |
|
295 | |||
247 | #=============================================================================== |
|
296 | #=============================================================================== | |
248 | # CHECK DECORATORS |
|
297 | # CHECK DECORATORS | |
249 | #=============================================================================== |
|
298 | #=============================================================================== | |
250 | class LoginRequired(object): |
|
299 | class LoginRequired(object): | |
251 | """Must be logged in to execute this function else |
|
300 | """Must be logged in to execute this function else | |
252 | redirect to login page""" |
|
301 | redirect to login page""" | ||
|
302 | ||||
253 | def __call__(self, func): |
|
303 | def __call__(self, func): | |
254 | return decorator(self.__wrapper, func) |
|
304 | return decorator(self.__wrapper, func) | |
255 |
|
305 | |||
256 | def __wrapper(self, func, *fargs, **fkwargs): |
|
306 | def __wrapper(self, func, *fargs, **fkwargs): | |
257 | user = session.get('rhodecode_user', AuthUser()) |
|
307 | user = session.get('rhodecode_user', AuthUser()) | |
258 | log.debug('Checking login required for user:%s', user.username) |
|
308 | log.debug('Checking login required for user:%s', user.username) | |
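The hunk above teaches `get_user` about anonymous access: when the session carries no user id, or the user is the special `default` account, the code looks up that account and signs the request in only while the account is active. A runnable sketch of just that decision, using a simplified stand-in for the user model (not RhodeCode's real classes):

```python
class FakeUser:
    """Minimal stand-in for the session user object in the diff."""
    def __init__(self, user_id=None, username=None, active=False):
        self.user_id = user_id
        self.username = username
        self.active = active
        self.is_authenticated = False

def resolve_anonymous(user, get_by_username):
    # Mirror of the diff's logic: grant access as the 'default'
    # account only when that account exists and is active.
    if user.user_id is None or user.username == 'default':
        anonymous = get_by_username('default')
        if anonymous is not None and anonymous.active is True:
            user.is_authenticated = True
            user.user_id = anonymous.user_id
        else:
            user.is_authenticated = False
    return user
```

The lookup callable stands in for `UserModel().get_by_username('default', cache=True)`; everything else follows the added lines directly.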
@@ -261,21 +311,46 b' class LoginRequired(object):' | |||||
261 | return func(*fargs, **fkwargs) |
|
311 | return func(*fargs, **fkwargs) | |
262 | else: |
|
312 | else: | |
263 | log.warn('user %s not authenticated', user.username) |
|
313 | log.warn('user %s not authenticated', user.username) | |
264 |
|
314 | |||
265 | p = '' |
|
315 | p = '' | |
266 | if request.environ.get('SCRIPT_NAME') != '/': |
|
316 | if request.environ.get('SCRIPT_NAME') != '/': | |
267 | p += request.environ.get('SCRIPT_NAME') |
|
317 | p += request.environ.get('SCRIPT_NAME') | |
268 |
|
318 | |||
269 | p += request.environ.get('PATH_INFO') |
|
319 | p += request.environ.get('PATH_INFO') | |
270 | if request.environ.get('QUERY_STRING'): |
|
320 | if request.environ.get('QUERY_STRING'): | |
271 | p += '?' + request.environ.get('QUERY_STRING') |
|
321 | p += '?' + request.environ.get('QUERY_STRING') | |
272 |
|
322 | |||
273 | log.debug('redirecting to login page with %s', p) |
|
323 | log.debug('redirecting to login page with %s', p) | |
274 | return redirect(url('login_home', came_from=p)) |
|
324 | return redirect(url('login_home', came_from=p)) | |
275 |
|
325 | |||
|
326 | class NotAnonymous(object): | |||
|
327 | """Must be logged in to execute this function else | |||
|
328 | redirect to login page""" | |||
|
329 | ||||
|
330 | def __call__(self, func): | |||
|
331 | return decorator(self.__wrapper, func) | |||
|
332 | ||||
|
333 | def __wrapper(self, func, *fargs, **fkwargs): | |||
|
334 | user = session.get('rhodecode_user', AuthUser()) | |||
|
335 | log.debug('Checking if user is not anonymous') | |||
|
336 | ||||
|
337 | anonymous = user.username == 'default' | |||
|
338 | ||||
|
339 | if anonymous: | |||
|
340 | p = '' | |||
|
341 | if request.environ.get('SCRIPT_NAME') != '/': | |||
|
342 | p += request.environ.get('SCRIPT_NAME') | |||
|
343 | ||||
|
344 | p += request.environ.get('PATH_INFO') | |||
|
345 | if request.environ.get('QUERY_STRING'): | |||
|
346 | p += '?' + request.environ.get('QUERY_STRING') | |||
|
347 | return redirect(url('login_home', came_from=p)) | |||
|
348 | else: | |||
|
349 | return func(*fargs, **fkwargs) | |||
|
350 | ||||
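The new `NotAnonymous` class follows the same shape as `LoginRequired`: a class whose `__call__` wraps the controller method (via the third-party `decorator` package) and bounces the `default` user to the login page. A dependency-free sketch of that class-based decorator pattern, using `functools.wraps` instead of the `decorator` package and an exception in place of the redirect (both substitutions are mine, not the diff's):

```python
import functools

class RequireNonAnonymous:
    """Class-based check in the spirit of the diff's NotAnonymous;
    the redirect-to-login is modeled here as raising PermissionError."""
    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(user, *args, **kwargs):
            if user.get('username') == 'default':  # anonymous marker
                raise PermissionError('login required')
            return func(user, *args, **kwargs)
        return wrapper

@RequireNonAnonymous()
def settings_page(user):
    return 'settings for %s' % user['username']
```

Instantiating the class per decorated function is what lets the real version carry per-decorator state such as `required_perms`.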
276 | class PermsDecorator(object): |
|
351 | class PermsDecorator(object): | |
277 | """Base class for decorators""" |
|
352 | """Base class for decorators""" | |
278 |
|
353 | |||
279 | def __init__(self, *required_perms): |
|
354 | def __init__(self, *required_perms): | |
280 | available_perms = config['available_permissions'] |
|
355 | available_perms = config['available_permissions'] | |
281 | for perm in required_perms: |
|
356 | for perm in required_perms: | |
@@ -283,32 +358,33 b' class PermsDecorator(object):' | |||||
283 | raise Exception("'%s' permission is not defined" % perm) |
|
358 | raise Exception("'%s' permission is not defined" % perm) | |
284 | self.required_perms = set(required_perms) |
|
359 | self.required_perms = set(required_perms) | |
285 | self.user_perms = None |
|
360 | self.user_perms = None | |
286 |
|
361 | |||
287 | def __call__(self, func): |
|
362 | def __call__(self, func): | |
288 | return decorator(self.__wrapper, func) |
|
363 | return decorator(self.__wrapper, func) | |
289 |
|
364 | |||
290 |
|
365 | |||
291 | def __wrapper(self, func, *fargs, **fkwargs): |
|
366 | def __wrapper(self, func, *fargs, **fkwargs): | |
292 | # _wrapper.__name__ = func.__name__ |
|
367 | # _wrapper.__name__ = func.__name__ | |
293 | # _wrapper.__dict__.update(func.__dict__) |
|
368 | # _wrapper.__dict__.update(func.__dict__) | |
294 | # _wrapper.__doc__ = func.__doc__ |
|
369 | # _wrapper.__doc__ = func.__doc__ | |
|
370 | self.user = session.get('rhodecode_user', AuthUser()) | |||
|
371 | self.user_perms = self.user.permissions | |||
|
372 | log.debug('checking %s permissions %s for %s %s', | |||
|
373 | self.__class__.__name__, self.required_perms, func.__name__, | |||
|
374 | self.user) | |||
295 |
|
375 | |||
296 | self.user_perms = session.get('rhodecode_user', AuthUser()).permissions |
|
|||
297 | log.debug('checking %s permissions %s for %s', |
|
|||
298 | self.__class__.__name__, self.required_perms, func.__name__) |
|
|||
299 |
|
||||
300 | if self.check_permissions(): |
|
376 | if self.check_permissions(): | |
301 | log.debug('Permission granted for %s', func.__name__) |
|
377 | log.debug('Permission granted for %s %s', func.__name__, self.user) | |
302 |
|
378 | |||
303 | return func(*fargs, **fkwargs) |
|
379 | return func(*fargs, **fkwargs) | |
304 |
|
380 | |||
305 | else: |
|
381 | else: | |
306 | log.warning('Permission denied for %s', func.__name__) |
|
382 | log.warning('Permission denied for %s %s', func.__name__, self.user) | |
307 | #redirect with forbidden ret code |
|
383 | #redirect with forbidden ret code | |
308 | return abort(403) |
|
384 | return abort(403) | |
309 |
|
385 | |||
310 |
|
386 | |||
311 |
|
387 | |||
312 | def check_permissions(self): |
|
388 | def check_permissions(self): | |
313 | """Dummy function for overriding""" |
|
389 | """Dummy function for overriding""" | |
314 | raise Exception('You have to write this function in child class') |
|
390 | raise Exception('You have to write this function in child class') | |
@@ -317,18 +393,18 b' class HasPermissionAllDecorator(PermsDec' | |||||
317 | """Checks for access permission for all given predicates. All of them |
|
393 | """Checks for access permission for all given predicates. All of them | |
318 | have to be meet in order to fulfill the request |
|
394 | have to be meet in order to fulfill the request | |
319 | """ |
|
395 | """ | |
320 |
|
396 | |||
321 | def check_permissions(self): |
|
397 | def check_permissions(self): | |
322 | if self.required_perms.issubset(self.user_perms.get('global')): |
|
398 | if self.required_perms.issubset(self.user_perms.get('global')): | |
323 | return True |
|
399 | return True | |
324 | return False |
|
400 | return False | |
325 |
|
401 | |||
326 |
|
402 | |||
327 | class HasPermissionAnyDecorator(PermsDecorator): |
|
403 | class HasPermissionAnyDecorator(PermsDecorator): | |
328 | """Checks for access permission for any of given predicates. In order to |
|
404 | """Checks for access permission for any of given predicates. In order to | |
329 | fulfill the request any of predicates must be meet |
|
405 | fulfill the request any of predicates must be meet | |
330 | """ |
|
406 | """ | |
331 |
|
407 | |||
332 | def check_permissions(self): |
|
408 | def check_permissions(self): | |
333 | if self.required_perms.intersection(self.user_perms.get('global')): |
|
409 | if self.required_perms.intersection(self.user_perms.get('global')): | |
334 | return True |
|
410 | return True | |
@@ -338,7 +414,7 b' class HasRepoPermissionAllDecorator(Perm' | |||||
338 | """Checks for access permission for all given predicates for specific |
|
414 | """Checks for access permission for all given predicates for specific | |
339 | repository. All of them have to be meet in order to fulfill the request |
|
415 | repository. All of them have to be meet in order to fulfill the request | |
340 | """ |
|
416 | """ | |
341 |
|
417 | |||
342 | def check_permissions(self): |
|
418 | def check_permissions(self): | |
343 | repo_name = get_repo_slug(request) |
|
419 | repo_name = get_repo_slug(request) | |
344 | try: |
|
420 | try: | |
@@ -348,16 +424,16 b' class HasRepoPermissionAllDecorator(Perm' | |||||
348 | if self.required_perms.issubset(user_perms): |
|
424 | if self.required_perms.issubset(user_perms): | |
349 | return True |
|
425 | return True | |
350 | return False |
|
426 | return False | |
351 |
|
427 | |||
352 |
|
428 | |||
353 | class HasRepoPermissionAnyDecorator(PermsDecorator): |
|
429 | class HasRepoPermissionAnyDecorator(PermsDecorator): | |
354 | """Checks for access permission for any of given predicates for specific |
|
430 | """Checks for access permission for any of given predicates for specific | |
355 | repository. In order to fulfill the request any of predicates must be meet |
|
431 | repository. In order to fulfill the request any of predicates must be meet | |
356 | """ |
|
432 | """ | |
357 |
|
433 | |||
358 | def check_permissions(self): |
|
434 | def check_permissions(self): | |
359 | repo_name = get_repo_slug(request) |
|
435 | repo_name = get_repo_slug(request) | |
360 |
|
436 | |||
361 | try: |
|
437 | try: | |
362 | user_perms = set([self.user_perms['repositories'][repo_name]]) |
|
438 | user_perms = set([self.user_perms['repositories'][repo_name]]) | |
363 | except KeyError: |
|
439 | except KeyError: | |
@@ -371,10 +447,10 b' class HasRepoPermissionAnyDecorator(Perm' | |||||
371 |
|
447 | |||
372 | class PermsFunction(object): |
|
448 | class PermsFunction(object): | |
373 | """Base function for other check functions""" |
|
449 | """Base function for other check functions""" | |
374 |
|
450 | |||
375 | def __init__(self, *perms): |
|
451 | def __init__(self, *perms): | |
376 | available_perms = config['available_permissions'] |
|
452 | available_perms = config['available_permissions'] | |
377 |
|
453 | |||
378 | for perm in perms: |
|
454 | for perm in perms: | |
379 | if perm not in available_perms: |
|
455 | if perm not in available_perms: | |
380 | raise Exception("'%s' permission in not defined" % perm) |
|
456 | raise Exception("'%s' permission in not defined" % perm) | |
@@ -382,29 +458,30 b' class PermsFunction(object):' | |||||
382 | self.user_perms = None |
|
458 | self.user_perms = None | |
383 | self.granted_for = '' |
|
459 | self.granted_for = '' | |
384 | self.repo_name = None |
|
460 | self.repo_name = None | |
385 |
|
461 | |||
386 | def __call__(self, check_Location=''): |
|
462 | def __call__(self, check_Location=''): | |
387 | user = session.get('rhodecode_user', False) |
|
463 | user = session.get('rhodecode_user', False) | |
388 | if not user: |
|
464 | if not user: | |
389 | return False |
|
465 | return False | |
390 | self.user_perms = user.permissions |
|
466 | self.user_perms = user.permissions | |
391 | self.granted_for = user.username |
|
467 | self.granted_for = user.username | |
392 | log.debug('checking %s %s', self.__class__.__name__, |
|
468 | log.debug('checking %s %s %s', self.__class__.__name__, | |
393 |
|
469 | self.required_perms, user) | ||
|
470 | ||||
394 | if self.check_permissions(): |
|
471 | if self.check_permissions(): | |
395 | log.debug('Permission granted for %s @%s', self.granted_for, |
|
472 | log.debug('Permission granted for %s @ %s %s', self.granted_for, | |
396 | check_Location) |
|
473 | check_Location, user) | |
397 | return True |
|
474 | return True | |
398 |
|
475 | |||
399 | else: |
|
476 | else: | |
400 | log.warning('Permission denied for %s @%s', self.granted_for, |
|
477 | log.warning('Permission denied for %s @ %s %s', self.granted_for, | |
401 | check_Location) |
|
478 | check_Location, user) | |
402 | return False |
|
479 | return False | |
403 |
|
480 | |||
404 | def check_permissions(self): |
|
481 | def check_permissions(self): | |
405 | """Dummy function for overriding""" |
|
482 | """Dummy function for overriding""" | |
406 | raise Exception('You have to write this function in child class') |
|
483 | raise Exception('You have to write this function in child class') | |
407 |
|
484 | |||
408 | class HasPermissionAll(PermsFunction): |
|
485 | class HasPermissionAll(PermsFunction): | |
409 | def check_permissions(self): |
|
486 | def check_permissions(self): | |
410 | if self.required_perms.issubset(self.user_perms.get('global')): |
|
487 | if self.required_perms.issubset(self.user_perms.get('global')): | |
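All four decorator/function variants in this file reduce to two set operations: the `All` flavours require the required permissions to be a subset of the granted set, the `Any` flavours require a non-empty intersection. The two checks in isolation:

```python
def has_all(required, granted):
    # HasPermissionAll-style check: every required permission present
    return set(required).issubset(set(granted))

def has_any(required, granted):
    # HasPermissionAny-style check: at least one required permission present
    return bool(set(required) & set(granted))
```

This is why the repository variants only differ in how they build `granted` (the single permission recorded for the repo slug) before applying the same operation.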
@@ -418,11 +495,11 b' class HasPermissionAny(PermsFunction):' | |||||
418 | return False |
|
495 | return False | |
419 |
|
496 | |||
420 | class HasRepoPermissionAll(PermsFunction): |
|
497 | class HasRepoPermissionAll(PermsFunction): | |
421 |
|
498 | |||
422 | def __call__(self, repo_name=None, check_Location=''): |
|
499 | def __call__(self, repo_name=None, check_Location=''): | |
423 | self.repo_name = repo_name |
|
500 | self.repo_name = repo_name | |
424 | return super(HasRepoPermissionAll, self).__call__(check_Location) |
|
501 | return super(HasRepoPermissionAll, self).__call__(check_Location) | |
425 |
|
502 | |||
426 | def check_permissions(self): |
|
503 | def check_permissions(self): | |
427 | if not self.repo_name: |
|
504 | if not self.repo_name: | |
428 | self.repo_name = get_repo_slug(request) |
|
505 | self.repo_name = get_repo_slug(request) | |
@@ -432,17 +509,17 b' class HasRepoPermissionAll(PermsFunction' | |||||
432 | [self.repo_name]]) |
|
509 | [self.repo_name]]) | |
433 | except KeyError: |
|
510 | except KeyError: | |
434 | return False |
|
511 | return False | |
435 | self.granted_for = self.repo_name |
|
512 | self.granted_for = self.repo_name | |
436 | if self.required_perms.issubset(self.user_perms): |
|
513 | if self.required_perms.issubset(self.user_perms): | |
437 | return True |
|
514 | return True | |
438 | return False |
|
515 | return False | |
439 |
|
516 | |||
440 | class HasRepoPermissionAny(PermsFunction): |
|
517 | class HasRepoPermissionAny(PermsFunction): | |
441 |
|
518 | |||
442 | def __call__(self, repo_name=None, check_Location=''): |
|
519 | def __call__(self, repo_name=None, check_Location=''): | |
443 | self.repo_name = repo_name |
|
520 | self.repo_name = repo_name | |
444 | return super(HasRepoPermissionAny, self).__call__(check_Location) |
|
521 | return super(HasRepoPermissionAny, self).__call__(check_Location) | |
445 |
|
522 | |||
446 | def check_permissions(self): |
|
523 | def check_permissions(self): | |
447 | if not self.repo_name: |
|
524 | if not self.repo_name: | |
448 | self.repo_name = get_repo_slug(request) |
|
525 | self.repo_name = get_repo_slug(request) | |
@@ -464,13 +541,13 b' class HasRepoPermissionAny(PermsFunction' | |||||
464 | class HasPermissionAnyMiddleware(object): |
|
541 | class HasPermissionAnyMiddleware(object): | |
465 | def __init__(self, *perms): |
|
542 | def __init__(self, *perms): | |
466 | self.required_perms = set(perms) |
|
543 | self.required_perms = set(perms) | |
467 |
|
544 | |||
468 | def __call__(self, user, repo_name): |
|
545 | def __call__(self, user, repo_name): | |
469 | usr = AuthUser() |
|
546 | usr = AuthUser() | |
470 | usr.user_id = user.user_id |
|
547 | usr.user_id = user.user_id | |
471 | usr.username = user.username |
|
548 | usr.username = user.username | |
472 | usr.is_admin = user.admin |
|
549 | usr.is_admin = user.admin | |
473 |
|
550 | |||
474 | try: |
|
551 | try: | |
475 | self.user_perms = set([fill_perms(usr)\ |
|
552 | self.user_perms = set([fill_perms(usr)\ | |
476 | .permissions['repositories'][repo_name]]) |
|
553 | .permissions['repositories'][repo_name]]) | |
@@ -478,9 +555,9 b' class HasPermissionAnyMiddleware(object)' | |||||
478 | self.user_perms = set() |
|
555 | self.user_perms = set() | |
479 | self.granted_for = '' |
|
556 | self.granted_for = '' | |
480 | self.username = user.username |
|
557 | self.username = user.username | |
481 | self.repo_name = repo_name |
|
558 | self.repo_name = repo_name | |
482 | return self.check_permissions() |
|
559 | return self.check_permissions() | |
483 |
|
560 | |||
484 | def check_permissions(self): |
|
561 | def check_permissions(self): | |
485 | log.debug('checking mercurial protocol ' |
|
562 | log.debug('checking mercurial protocol ' | |
486 | 'permissions for user:%s repository:%s', |
|
563 | 'permissions for user:%s repository:%s', |
@@ -9,30 +9,36 b' from rhodecode import __version__' | |||||
9 | from rhodecode.lib import auth |
|
9 | from rhodecode.lib import auth | |
10 | from rhodecode.lib.utils import get_repo_slug |
|
10 | from rhodecode.lib.utils import get_repo_slug | |
11 | from rhodecode.model import meta |
|
11 | from rhodecode.model import meta | |
12 | from rhodecode.model. |
|
12 | from rhodecode.model.scm import ScmModel | |
13 | _get_repos_switcher_cached |
|
13 | from rhodecode import BACKENDS | |
14 |
|
14 | |||
15 | class BaseController(WSGIController): |
|
15 | class BaseController(WSGIController): | |
16 |
|
16 | |||
17 | def __before__(self): |
|
17 | def __before__(self): | |
18 | c.rhodecode_version = __version__ |
|
18 | c.rhodecode_version = __version__ | |
19 | c.rhodecode_name = config['rhodecode_title'] |
|
19 | c.rhodecode_name = config['rhodecode_title'] | |
20 | c.repo_name = get_repo_slug(request) |
|
20 | c.repo_name = get_repo_slug(request) | |
21 | c.cached_repo_list = |
|
21 | c.cached_repo_list = ScmModel().get_repos() | |
22 | c.repo_switcher_list = _get_repos_switcher_cached(c.cached_repo_list) |
|
22 | c.backends = BACKENDS.keys() | |
23 |
|
23 | self.cut_off_limit = int(config['cut_off_limit']) | ||
|
24 | self.sa = meta.Session() | |||
|
25 | scm_model = ScmModel(self.sa) | |||
|
26 | #c.unread_journal = scm_model.get_unread_journal() | |||
|
27 | ||||
24 | if c.repo_name: |
|
28 | if c.repo_name: | |
25 | cached_repo = |
|
29 | cached_repo = scm_model.get(c.repo_name) | |
26 |
|
||||
27 | if cached_repo: |
|
30 | if cached_repo: | |
28 | c.repository_tags = cached_repo.tags |
|
31 | c.repository_tags = cached_repo.tags | |
29 | c.repository_branches = cached_repo.branches |
|
32 | c.repository_branches = cached_repo.branches | |
|
33 | c.repository_followers = scm_model.get_followers(cached_repo.dbrepo.repo_id) | |||
|
34 | c.repository_forks = scm_model.get_forks(cached_repo.dbrepo.repo_id) | |||
30 | else: |
|
35 | else: | |
31 | c.repository_tags = {} |
|
36 | c.repository_tags = {} | |
32 | c.repository_branches = {} |
|
37 | c.repository_branches = {} | |
33 |
|
38 | c.repository_followers = 0 | ||
34 | self.sa = meta.Session |
|
39 | c.repository_forks = 0 | |
35 |
|
40 | |||
|
41 | ||||
36 | def __call__(self, environ, start_response): |
|
42 | def __call__(self, environ, start_response): | |
37 | """Invoke the Controller""" |
|
43 | """Invoke the Controller""" | |
38 | # WSGIController.__call__ dispatches to the Controller method |
|
44 | # WSGIController.__call__ dispatches to the Controller method |
@@ -1,14 +1,54 b'' | |||||
1 | from rhodecode.lib.pidlock import DaemonLock, LockHeld |
|
1 | # -*- coding: utf-8 -*- | |
2 | from vcs.utils.lazy import LazyProperty |
|
2 | """ | |
3 | from decorator import decorator |
|
3 | package.rhodecode.lib.celerylib.__init__ | |
4 | import logging |
|
4 | ~~~~~~~~~~~~~~ | |
|
5 | ||||
|
6 | celery libs for RhodeCode | |||
|
7 | ||||
|
8 | :created_on: Nov 27, 2010 | |||
|
9 | :author: marcink | |||
|
10 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
11 | :license: GPLv3, see COPYING for more details. | |||
|
12 | """ | |||
|
13 | # This program is free software; you can redistribute it and/or | |||
|
14 | # modify it under the terms of the GNU General Public License | |||
|
15 | # as published by the Free Software Foundation; version 2 | |||
|
16 | # of the License or (at your opinion) any later version of the license. | |||
|
17 | # | |||
|
18 | # This program is distributed in the hope that it will be useful, | |||
|
19 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
20 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
21 | # GNU General Public License for more details. | |||
|
22 | # | |||
|
23 | # You should have received a copy of the GNU General Public License | |||
|
24 | # along with this program; if not, write to the Free Software | |||
|
25 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |||
|
26 | # MA 02110-1301, USA. | |||
|
27 | ||||
5 | import os |
|
28 | import os | |
6 | import sys |
|
29 | import sys | |
|
30 | import socket | |||
7 | import traceback |
|
31 | import traceback | |
|
32 | import logging | |||
|
33 | ||||
8 | from hashlib import md5 |
|
34 | from hashlib import md5 | |
9 | import socket |
|
35 | from decorator import decorator | |
|
36 | from vcs.utils.lazy import LazyProperty | |||
|
37 | ||||
|
38 | from rhodecode.lib.pidlock import DaemonLock, LockHeld | |||
|
39 | ||||
|
40 | from pylons import config | |||
|
41 | ||||
10 | log = logging.getLogger(__name__) |
|
42 | log = logging.getLogger(__name__) | |
11 |
|
43 | |||
|
44 | def str2bool(v): | |||
|
45 | return v.lower() in ["yes", "true", "t", "1"] if v else None | |||
|
46 | ||||
|
47 | try: | |||
|
48 | CELERY_ON = str2bool(config['app_conf'].get('use_celery')) | |||
|
49 | except KeyError: | |||
|
50 | CELERY_ON = False | |||
|
51 | ||||
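The `str2bool` helper added above recognises a small whitelist of true-ish strings and, deliberately, returns `None` for an unset or empty value, so a missing `use_celery` option is distinguishable from an explicit false. The helper reproduced standalone:

```python
def str2bool(v):
    # As in the diff: unset/empty values yield None rather than False,
    # so "setting absent" differs from "setting off".
    return v.lower() in ["yes", "true", "t", "1"] if v else None
```

The surrounding `try/except KeyError` then collapses both the missing-key and the falsy cases to `CELERY_ON = False`.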
12 | class ResultWrapper(object): |
|
52 | class ResultWrapper(object): | |
13 | def __init__(self, task): |
|
53 | def __init__(self, task): | |
14 | self.task = task |
|
54 | self.task = task | |
@@ -18,27 +58,22 b' class ResultWrapper(object):' | |||||
18 | return self.task |
|
58 | return self.task | |
19 |
|
59 | |||
20 | def run_task(task, *args, **kwargs): |
|
60 | def run_task(task, *args, **kwargs): | |
21 | try: |
|
61 | if CELERY_ON: | |
22 | t = task.delay(*args, **kwargs) |
|
|||
23 | log.info('running task %s', t.task_id) |
|
|||
24 | return t |
|
|||
25 | except socket.error, e: |
|
|||
26 |
|
||||
27 | try: |
|
62 | try: | |
28 | conn_failed = e.errno == 111 |
|
63 | t = task.delay(*args, **kwargs) | |
29 | except AttributeError: |
|
64 | log.info('running task %s:%s', t.task_id, task) | |
30 | conn_failed = False |
|
65 | return t | |
|
66 | except socket.error, e: | |||
|
67 | if e.errno == 111: | |||
|
68 | log.debug('Unable to connect to celeryd. Sync execution') | |||
|
69 | else: | |||
|
70 | log.error(traceback.format_exc()) | |||
|
71 | except KeyError, e: | |||
|
72 | log.debug('Unable to connect to celeryd. Sync execution') | |||
|
73 | except Exception, e: | |||
|
74 | log.error(traceback.format_exc()) | |||
31 |
|
75 | |||
32 | if conn_failed: |
|
76 | log.debug('executing task %s in sync mode', task) | |
33 | log.debug('Unable to connect to celeryd. Sync execution') |
|
|||
34 | else: |
|
|||
35 | log.debug('Unable to connect to celeryd. Sync execution') |
|
|||
36 |
|
||||
37 | except KeyError, e: |
|
|||
38 | log.debug('Unable to connect to celeryd. Sync execution') |
|
|||
39 | except Exception, e: |
|
|||
40 | log.error(traceback.format_exc()) |
|
|||
41 |
|
||||
42 | return ResultWrapper(task(*args, **kwargs)) |
|
77 | return ResultWrapper(task(*args, **kwargs)) | |
43 |
|
78 | |||
44 |
|
79 |
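The rewritten `run_task` above attempts asynchronous dispatch first and silently degrades to in-process execution when the broker is unreachable (errno 111) or the config is incomplete. The same fallback pattern, with the Celery `task.delay` call replaced by an injectable `dispatch` callable so the sketch runs without a broker (the `dispatch` parameter is illustrative, not RhodeCode's API):

```python
import socket

class ResultWrapper:
    # Same shape as the diff's wrapper: .result yields the value,
    # matching what an AsyncResult would expose.
    def __init__(self, task):
        self.task = task
    @property
    def result(self):
        return self.task

def run_task(task, dispatch=None, *args, **kwargs):
    """Try asynchronous dispatch; on a connection error (or when no
    dispatcher is configured) fall back to calling the task inline."""
    if dispatch is not None:
        try:
            return dispatch(task, *args, **kwargs)
        except socket.error:
            pass  # broker unreachable: fall through to sync execution
    return ResultWrapper(task(*args, **kwargs))
```

Wrapping the synchronous result in the same `.result`-shaped object is what lets callers stay oblivious to whether celery was actually used.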
@@ -1,14 +1,26 b'' | |||||
1 | from celery.decorators import task |
|
1 | from celery.decorators import task | |
2 |
|
2 | |||
|
3 | import os | |||
|
4 | import traceback | |||
|
5 | from time import mktime | |||
3 | from operator import itemgetter |
|
6 | from operator import itemgetter | |
|
7 | ||||
|
8 | from pylons import config | |||
4 | from pylons.i18n.translation import _ |
|
9 | from pylons.i18n.translation import _ | |
5 | from rhodecode.lib.celerylib import run_task, locked_task |
|
10 | ||
|
11 | from rhodecode.lib.celerylib import run_task, locked_task, str2bool | |||
6 | from rhodecode.lib.helpers import person |
|
12 | from rhodecode.lib.helpers import person | |
7 | from rhodecode.lib.smtp_mailer import SmtpMailer |
|
13 | from rhodecode.lib.smtp_mailer import SmtpMailer | |
8 | from rhodecode.lib.utils import OrderedDict |
|
14 | from rhodecode.lib.utils import OrderedDict, add_cache | |
9 | from time import mktime |
|
15 | from rhodecode.model import init_model | |
10 | from vcs.backends.hg import MercurialRepository |
|
16 | from rhodecode.model import meta | |
11 | import traceback |
|
17 | from rhodecode.model.db import RhodeCodeUi | |
|
18 | ||||
|
19 | from vcs.backends import get_repo | |||
|
20 | ||||
|
21 | from sqlalchemy import engine_from_config | |||
|
22 | ||||
|
23 | add_cache(config) | |||
12 |
|
24 | |||
13 | try: |
|
25 | try: | |
14 | import json |
|
26 | import json | |
@@ -16,96 +28,56 b' except ImportError:' | |||||
16 | #python 2.5 compatibility |
|
28 | #python 2.5 compatibility | |
17 | import simplejson as json |
|
29 | import simplejson as json | |
18 |
|
30 | |||
19 | try: |
|
|||
20 | from celeryconfig import PYLONS_CONFIG as config |
|
|||
21 | celery_on = True |
|
|||
22 | except ImportError: |
|
|||
23 | #if celeryconfig is not present let's just load our pylons |
|
|||
24 | #config instead |
|
|||
25 | from pylons import config |
|
|||
26 | celery_on = False |
|
|||
27 |
|
||||
28 |
|
||||
29 | __all__ = ['whoosh_index', 'get_commits_stats', |
|
31 | __all__ = ['whoosh_index', 'get_commits_stats', | |
30 | 'reset_user_password', 'send_email'] |
|
32 | 'reset_user_password', 'send_email'] | |
31 |
|
33 | |||
|
34 | CELERY_ON = str2bool(config['app_conf'].get('use_celery')) | |||
|
35 | ||||
32 | def get_session(): |
|
36 | def get_session(): | |
33 | if celery_on: |
|
37 | if CELERY_ON: | |
34 | from sqlalchemy import engine_from_config |
|
38 | engine = engine_from_config(config, 'sqlalchemy.db1.') | |
35 | from sqlalchemy.orm import sessionmaker, scoped_session |
|
39 | init_model(engine) | |
36 | engine = engine_from_config(dict(config.items('app:main')), 'sqlalchemy.db1.') |
|
40 | sa = meta.Session() | |
37 | sa = scoped_session(sessionmaker(bind=engine)) |
|
|||
38 | else: |
|
|||
39 | #If we don't use celery reuse our current application Session |
|
|||
40 | from rhodecode.model.meta import Session |
|
|||
41 | sa = Session |
|
|||
42 |
|
||||
43 | return sa |
|
41 | return sa | |
44 |
|
42 | |||
45 | def get_hg_settings(): |
|
43 | def get_repos_path(): | |
46 | from rhodecode.model.db import RhodeCodeSettings |
|
|||
47 | sa = get_session() |
|
|||
48 | ret = sa.query(RhodeCodeSettings).all() |
|
|||
49 |
|
||||
50 | if not ret: |
|
|||
51 | raise Exception('Could not get application settings !') |
|
|||
52 | settings = {} |
|
|||
53 | for each in ret: |
|
|||
54 | settings['rhodecode_' + each.app_settings_name] = each.app_settings_value |
|
|||
55 |
|
||||
56 | return settings |
|
|||
57 |
|
||||
58 | def get_hg_ui_settings(): |
|
|||
59 | from rhodecode.model.db import RhodeCodeUi |
|
|||
60 | sa = get_session() |
|
44 | sa = get_session() | |
61 |
|
|
45 | q = sa.query(RhodeCodeUi).filter(RhodeCodeUi.ui_key == '/').one() | |
62 |
|
46 | return q.ui_value | ||
63 | if not ret: |
|
|||
64 | raise Exception('Could not get application ui settings !') |
|
|||
65 | settings = {} |
|
|||
66 | for each in ret: |
|
|||
67 | k = each.ui_key |
|
|||
68 | v = each.ui_value |
|
|||
69 | if k == '/': |
|
|||
70 | k = 'root_path' |
|
|||
71 |
|
||||
72 | if k.find('.') != -1: |
|
|||
73 | k = k.replace('.', '_') |
|
|||
74 |
|
||||
75 | if each.ui_section == 'hooks': |
|
|||
76 | v = each.ui_active |
|
|||
77 |
|
||||
78 | settings[each.ui_section + '_' + k] = v |
|
|||
79 |
|
||||
80 | return settings |
|
|||
81 |
|
47 | |||
82 | @task |
|
48 | @task | |
83 | @locked_task |
|
49 | @locked_task | |
84 | def whoosh_index(repo_location, full_index): |
|
50 | def whoosh_index(repo_location, full_index): | |
85 | log = whoosh_index.get_logger() |
|
51 | log = whoosh_index.get_logger() | |
86 | from rhodecode.lib.indexers.daemon import WhooshIndexingDaemon |
|
52 | from rhodecode.lib.indexers.daemon import WhooshIndexingDaemon | |
87 | WhooshIndexingDaemon(repo_location=repo_location).run(full_index=full_index) |
|
53 | index_location = config['index_dir'] | |
|
54 | WhooshIndexingDaemon(index_location=index_location, | |||
|
55 | repo_location=repo_location, sa=get_session())\ | |||
|
56 | .run(full_index=full_index) | |||
88 |
|
57 | |||
89 | @task |
|
58 | @task | |
90 | @locked_task |
|
59 | @locked_task | |
91 | def get_commits_stats(repo_name, ts_min_y, ts_max_y): |
|
60 | def get_commits_stats(repo_name, ts_min_y, ts_max_y): | |
92 | from rhodecode.model.db import Statistics, Repository |
|
61 | from rhodecode.model.db import Statistics, Repository | |
93 | log = get_commits_stats.get_logger() |
|
62 | log = get_commits_stats.get_logger() | |
94 | author_key_cleaner = lambda k: person(k).replace('"', "") #for js data compatibilty |
|
63 | ||
95 |
|
64 | #for js data compatibilty | ||
|
65 | author_key_cleaner = lambda k: person(k).replace('"', "") | |||
|
66 | ||||
96 | commits_by_day_author_aggregate = {} |
|
67 | commits_by_day_author_aggregate = {} | |
97 | commits_by_day_aggregate = {} |
|
68 | commits_by_day_aggregate = {} | |
98 | repos_path = get_hg_ui_settings()['paths_root_path'].replace('*', '') |
|
69 | repos_path = get_repos_path() | |
99 |
|
|
70 | p = os.path.join(repos_path, repo_name) | |
|
71 | repo = get_repo(p) | |||
100 |
|
72 | |||
101 | skip_date_limit = True |
|
73 | skip_date_limit = True | |
102 | parse_limit = |
|
74 | parse_limit = 250 #limit for single task changeset parsing optimal for | |
103 | last_rev = 0 |
|
75 | last_rev = 0 | |
104 | last_cs = None |
|
76 | last_cs = None | |
105 | timegetter = itemgetter('time') |
|
77 | timegetter = itemgetter('time') | |
106 |
|
78 | |||
107 | sa = get_session() |
|
79 | sa = get_session() | |
108 |
|
80 | |||
109 | dbrepo = sa.query(Repository)\ |
|
81 | dbrepo = sa.query(Repository)\ | |
110 | .filter(Repository.repo_name == repo_name).scalar() |
|
82 | .filter(Repository.repo_name == repo_name).scalar() | |
111 | cur_stats = sa.query(Statistics)\ |
|
83 | cur_stats = sa.query(Statistics)\ | |
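In the hunks around here the stats task buckets commits per calendar day by zeroing everything below the day before calling `mktime`, replacing the old string-key round trip. A sketch of that bucketing; a tuple is passed where the diff passes a list, since `time.mktime` expects a struct_time or tuple:

```python
from datetime import datetime
from time import mktime

def day_key(dt):
    # Zero hours/minutes/seconds so every commit from one calendar
    # day maps to the same timestamp key, as the diff's lmktime call does.
    t = dt.timetuple()
    return mktime((t[0], t[1], t[2], 0, 0, 0, 0, 0, 0))
```

Using the timestamp directly avoids formatting to `'%Y-%m-%d'` and re-parsing it, which is exactly the code the diff deletes a few lines below.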
@@ -114,26 +86,27 b' def get_commits_stats(repo_name, ts_min_' | |||||
114 | last_rev = cur_stats.stat_on_revision |
|
86 | last_rev = cur_stats.stat_on_revision | |
115 | if not repo.revisions: |
|
87 | if not repo.revisions: | |
116 | return True |
|
88 | return True | |
117 |
|
89 | |||
118 | if last_rev == repo.revisions[-1] and len(repo.revisions) > 1: |
|
90 | if last_rev == repo.revisions[-1] and len(repo.revisions) > 1: | |
119 | #pass silently without any work if we're not on first revision or |
|
91 | #pass silently without any work if we're not on first revision or | |
120 | #state of parsing revision(from db marker) is the last revision |
|
92 | #current state of parsing revision(from db marker) is the last revision | |
121 | return True |
|
93 | return True | |
122 |
|
94 | |||
123 | if cur_stats: |
|
95 | if cur_stats: | |
124 | commits_by_day_aggregate = OrderedDict( |
|
96 | commits_by_day_aggregate = OrderedDict( | |
125 | json.loads( |
|
97 | json.loads( | |
126 | cur_stats.commit_activity_combined)) |
|
98 | cur_stats.commit_activity_combined)) | |
127 | commits_by_day_author_aggregate = json.loads(cur_stats.commit_activity) |
|
99 | commits_by_day_author_aggregate = json.loads(cur_stats.commit_activity) | |
128 |
|
100 | |||
129 | log.debug('starting parsing %s', parse_limit) |
|
101 | log.debug('starting parsing %s', parse_limit) | |
130 | for cnt, rev in enumerate(repo.revisions[last_rev:]): |
|
102 | lmktime = mktime | |
|
103 | ||||
|
104 | last_rev = last_rev + 1 if last_rev > 0 else last_rev | |||
|
105 | for rev in repo.revisions[last_rev:last_rev + parse_limit]: | |||
131 | last_cs = cs = repo.get_changeset(rev) |
|
106 | last_cs = cs = repo.get_changeset(rev) | |
132 | k =
|
107 | k = lmktime([cs.date.timetuple()[0], cs.date.timetuple()[1], | |
133 |
|
|
108 | cs.date.timetuple()[2], 0, 0, 0, 0, 0, 0]) | |
134 | timetupple = [int(x) for x in k.split('-')] |
|
109 | ||
135 | timetupple.extend([0 for _ in xrange(6)]) |
|
|||
136 | k = mktime(timetupple) |
|
|||
137 | if commits_by_day_author_aggregate.has_key(author_key_cleaner(cs.author)): |
|
110 | if commits_by_day_author_aggregate.has_key(author_key_cleaner(cs.author)): | |
138 | try: |
|
111 | try: | |
139 | l = [timegetter(x) for x in commits_by_day_author_aggregate\ |
|
112 | l = [timegetter(x) for x in commits_by_day_author_aggregate\ | |
@@ -141,20 +114,20 b' def get_commits_stats(repo_name, ts_min_' | |||||
141 | time_pos = l.index(k) |
|
114 | time_pos = l.index(k) | |
142 | except ValueError: |
|
115 | except ValueError: | |
143 | time_pos = False |
|
116 | time_pos = False | |
144 |
|
117 | |||
145 | if time_pos >= 0 and time_pos is not False: |
|
118 | if time_pos >= 0 and time_pos is not False: | |
146 |
|
119 | |||
147 | datadict = commits_by_day_author_aggregate\ |
|
120 | datadict = commits_by_day_author_aggregate\ | |
148 | [author_key_cleaner(cs.author)]['data'][time_pos] |
|
121 | [author_key_cleaner(cs.author)]['data'][time_pos] | |
149 |
|
122 | |||
150 | datadict["commits"] += 1 |
|
123 | datadict["commits"] += 1 | |
151 | datadict["added"] += len(cs.added) |
|
124 | datadict["added"] += len(cs.added) | |
152 | datadict["changed"] += len(cs.changed) |
|
125 | datadict["changed"] += len(cs.changed) | |
153 | datadict["removed"] += len(cs.removed) |
|
126 | datadict["removed"] += len(cs.removed) | |
154 |
|
127 | |||
155 | else: |
|
128 | else: | |
156 | if k >= ts_min_y and k <= ts_max_y or skip_date_limit: |
|
129 | if k >= ts_min_y and k <= ts_max_y or skip_date_limit: | |
157 |
|
130 | |||
158 | datadict = {"time":k, |
|
131 | datadict = {"time":k, | |
159 | "commits":1, |
|
132 | "commits":1, | |
160 | "added":len(cs.added), |
|
133 | "added":len(cs.added), | |
@@ -163,7 +136,7 b' def get_commits_stats(repo_name, ts_min_' | |||||
163 | } |
|
136 | } | |
164 | commits_by_day_author_aggregate\ |
|
137 | commits_by_day_author_aggregate\ | |
165 | [author_key_cleaner(cs.author)]['data'].append(datadict) |
|
138 | [author_key_cleaner(cs.author)]['data'].append(datadict) | |
166 |
|
139 | |||
167 | else: |
|
140 | else: | |
168 | if k >= ts_min_y and k <= ts_max_y or skip_date_limit: |
|
141 | if k >= ts_min_y and k <= ts_max_y or skip_date_limit: | |
169 | commits_by_day_author_aggregate[author_key_cleaner(cs.author)] = { |
|
142 | commits_by_day_author_aggregate[author_key_cleaner(cs.author)] = { | |
@@ -175,23 +148,15 b' def get_commits_stats(repo_name, ts_min_' | |||||
175 | "removed":len(cs.removed), |
|
148 | "removed":len(cs.removed), | |
176 | }], |
|
149 | }], | |
177 | "schema":["commits"], |
|
150 | "schema":["commits"], | |
178 | }
|
151 | } | |
179 |
|
152 | |||
180 | #gather all data by day |
|
153 | #gather all data by day | |
181 | if commits_by_day_aggregate.has_key(k): |
|
154 | if commits_by_day_aggregate.has_key(k): | |
182 | commits_by_day_aggregate[k] += 1 |
|
155 | commits_by_day_aggregate[k] += 1 | |
183 | else: |
|
156 | else: | |
184 | commits_by_day_aggregate[k] = 1 |
|
157 | commits_by_day_aggregate[k] = 1 | |
185 |
|
||||
186 | if cnt >= parse_limit: |
|
|||
187 | #don't fetch to much data since we can freeze application |
|
|||
188 | break |
|
|||
189 |
|
158 | |||
190 | overview_data = [] |
|
159 | overview_data = sorted(commits_by_day_aggregate.items(), key=itemgetter(0)) | |
191 | for k, v in commits_by_day_aggregate.items(): |
|
|||
192 | overview_data.append([k, v]) |
|
|||
193 | overview_data = sorted(overview_data, key=itemgetter(0)) |
|
|||
194 |
|
||||
195 | if not commits_by_day_author_aggregate: |
|
160 | if not commits_by_day_author_aggregate: | |
196 | commits_by_day_author_aggregate[author_key_cleaner(repo.contact)] = { |
|
161 | commits_by_day_author_aggregate[author_key_cleaner(repo.contact)] = { | |
197 | "label":author_key_cleaner(repo.contact), |
|
162 | "label":author_key_cleaner(repo.contact), | |
@@ -206,23 +171,24 b' def get_commits_stats(repo_name, ts_min_' | |||||
206 | log.debug('last revison %s', last_rev) |
|
171 | log.debug('last revison %s', last_rev) | |
207 | leftovers = len(repo.revisions[last_rev:]) |
|
172 | leftovers = len(repo.revisions[last_rev:]) | |
208 | log.debug('revisions to parse %s', leftovers) |
|
173 | log.debug('revisions to parse %s', leftovers) | |
209 |
|
174 | |||
210 | if last_rev == 0 or leftovers < parse_limit:
|
175 | if last_rev == 0 or leftovers < parse_limit: | |
|
176 | log.debug('getting code trending stats') | |||
211 | stats.languages = json.dumps(__get_codes_stats(repo_name)) |
|
177 | stats.languages = json.dumps(__get_codes_stats(repo_name)) | |
212 |
|
178 | |||
213 | stats.repository = dbrepo |
|
179 | stats.repository = dbrepo | |
214 | stats.stat_on_revision = last_cs.revision |
|
180 | stats.stat_on_revision = last_cs.revision | |
215 |
|
181 | |||
216 | try: |
|
182 | try: | |
217 | sa.add(stats) |
|
183 | sa.add(stats) | |
218 | sa.commit()
|
184 | sa.commit() | |
219 | except: |
|
185 | except: | |
220 | log.error(traceback.format_exc()) |
|
186 | log.error(traceback.format_exc()) | |
221 | sa.rollback() |
|
187 | sa.rollback() | |
222 | return False |
|
188 | return False | |
223 | if len(repo.revisions) > 1: |
|
189 | if len(repo.revisions) > 1: | |
224 | run_task(get_commits_stats, repo_name, ts_min_y, ts_max_y) |
|
190 | run_task(get_commits_stats, repo_name, ts_min_y, ts_max_y) | |
225 |
|
191 | |||
226 | return True |
|
192 | return True | |
227 |
|
193 | |||
228 | @task |
|
194 | @task | |
@@ -230,7 +196,7 b' def reset_user_password(user_email):' | |||||
230 | log = reset_user_password.get_logger() |
|
196 | log = reset_user_password.get_logger() | |
231 | from rhodecode.lib import auth |
|
197 | from rhodecode.lib import auth | |
232 | from rhodecode.model.db import User |
|
198 | from rhodecode.model.db import User | |
233 |
|
199 | |||
234 | try: |
|
200 | try: | |
235 | try: |
|
201 | try: | |
236 | sa = get_session() |
|
202 | sa = get_session() | |
@@ -244,38 +210,52 b' def reset_user_password(user_email):' | |||||
244 | log.info('change password for %s', user_email) |
|
210 | log.info('change password for %s', user_email) | |
245 | if new_passwd is None: |
|
211 | if new_passwd is None: | |
246 | raise Exception('unable to generate new password') |
|
212 | raise Exception('unable to generate new password') | |
247 |
|
213 | |||
248 | except: |
|
214 | except: | |
249 | log.error(traceback.format_exc()) |
|
215 | log.error(traceback.format_exc()) | |
250 | sa.rollback() |
|
216 | sa.rollback() | |
251 |
|
217 | |||
252 | run_task(send_email, user_email, |
|
218 | run_task(send_email, user_email, | |
253 | "Your new rhodecode password", |
|
219 | "Your new rhodecode password", | |
254 | 'Your new rhodecode password:%s' % (new_passwd)) |
|
220 | 'Your new rhodecode password:%s' % (new_passwd)) | |
255 | log.info('send new password mail to %s', user_email) |
|
221 | log.info('send new password mail to %s', user_email) | |
256 |
|
222 | |||
257 |
|
223 | |||
258 | except: |
|
224 | except: | |
259 | log.error('Failed to update user password') |
|
225 | log.error('Failed to update user password') | |
260 | log.error(traceback.format_exc()) |
|
226 | log.error(traceback.format_exc()) | |
|
227 | ||||
261 | return True |
|
228 | return True | |
262 |
|
229 | |||
263 | @task
|
230 | @task | |
264 | def send_email(recipients, subject, body): |
|
231 | def send_email(recipients, subject, body): | |
|
232 | """ | |||
|
233 | Sends an email with defined parameters from the .ini files. | |||
|
234 | ||||
|
235 | ||||
|
236 | :param recipients: list of recipients, it this is empty the defined email | |||
|
237 | address from field 'email_to' is used instead | |||
|
238 | :param subject: subject of the mail | |||
|
239 | :param body: body of the mail | |||
|
240 | """ | |||
265 | log = send_email.get_logger() |
|
241 | log = send_email.get_logger() | |
266 | email_config =
|
242 | email_config = config | |
|
243 | ||||
|
244 | if not recipients: | |||
|
245 | recipients = [email_config.get('email_to')] | |||
|
246 | ||||
267 | mail_from = email_config.get('app_email_from') |
|
247 | mail_from = email_config.get('app_email_from') | |
268 | user = email_config.get('smtp_username') |
|
248 | user = email_config.get('smtp_username') | |
269 | passwd = email_config.get('smtp_password') |
|
249 | passwd = email_config.get('smtp_password') | |
270 | mail_server = email_config.get('smtp_server') |
|
250 | mail_server = email_config.get('smtp_server') | |
271 | mail_port = email_config.get('smtp_port') |
|
251 | mail_port = email_config.get('smtp_port') | |
272 | tls = email_config.get('smtp_use_tls') |
|
252 | tls = str2bool(email_config.get('smtp_use_tls')) | |
273 | ssl = False |
|
253 | ssl = str2bool(email_config.get('smtp_use_ssl')) | |
274 |
|
254 | |||
275 | try: |
|
255 | try: | |
276 | m = SmtpMailer(mail_from, user, passwd, mail_server, |
|
256 | m = SmtpMailer(mail_from, user, passwd, mail_server, | |
277 | mail_port, ssl, tls) |
|
257 | mail_port, ssl, tls) | |
278 | m.send(recipients, subject, body)
|
258 | m.send(recipients, subject, body) | |
279 | except: |
|
259 | except: | |
280 | log.error('Mail sending failed') |
|
260 | log.error('Mail sending failed') | |
281 | log.error(traceback.format_exc()) |
|
261 | log.error(traceback.format_exc()) | |
@@ -284,45 +264,96 b' def send_email(recipients, subject, body' | |||||
284 |
|
264 | |||
285 | @task |
|
265 | @task | |
286 | def create_repo_fork(form_data, cur_user): |
|
266 | def create_repo_fork(form_data, cur_user): | |
287 | import os |
|
267 | from rhodecode.model.repo import RepoModel | |
288 | from rhodecode.model.repo_model import RepoModel |
|
268 | from vcs import get_backend | |
289 | sa = get_session() |
|
269 | log = create_repo_fork.get_logger() | |
290 | rm = RepoModel(
|
270 | repo_model = RepoModel(get_session()) | |
291 |
|
271 | repo_model.create(form_data, cur_user, just_db=True, fork=True) | ||
292 | rm.create(form_data, cur_user, just_db=True, fork=True) |
|
272 | repo_name = form_data['repo_name'] | |
293 |
|
273 | repos_path = get_repos_path() | ||
294 | repos_path = get_hg_ui_settings()['paths_root_path'].replace('*', '') |
|
274 | repo_path = os.path.join(repos_path, repo_name) | |
295 | repo_path = os.path.join(repos_path, form_data['repo_name']) |
|
|||
296 | repo_fork_path = os.path.join(repos_path, form_data['fork_name']) |
|
275 | repo_fork_path = os.path.join(repos_path, form_data['fork_name']) | |
297 |
|
276 | alias = form_data['repo_type'] | ||
298 | MercurialRepository(str(repo_fork_path), True, clone_url=str(repo_path)) |
|
|||
299 |
|
277 | |||
300 |
|
278 | log.info('creating repo fork %s as %s', repo_name, repo_path) | ||
|
279 | backend = get_backend(alias) | |||
|
280 | backend(str(repo_fork_path), create=True, src_url=str(repo_path)) | |||
|
281 | ||||
301 | def __get_codes_stats(repo_name): |
|
282 | def __get_codes_stats(repo_name): | |
302 | LANGUAGES_EXTENSIONS =
|
283 | LANGUAGES_EXTENSIONS_MAP = {'scm': 'Scheme', 'asmx': 'VbNetAspx', 'Rout': | |
303 | 'cfg', 'cfm', 'cpp', 'cs', 'diff', 'do', 'el', 'erl', |
|
284 | 'RConsole', 'rest': 'Rst', 'abap': 'ABAP', 'go': 'Go', 'phtml': 'HtmlPhp', | |
304 | 'h', 'java', 'js', 'jsp', 'jspx', 'lisp', |
|
285 | 'ns2': 'Newspeak', 'xml': 'EvoqueXml', 'sh-session': 'BashSession', 'ads': | |
305 | 'lua', 'm', 'mako', 'ml', 'pas', 'patch', 'php', 'php3', |
|
286 | 'Ada', 'clj': 'Clojure', 'll': 'Llvm', 'ebuild': 'Bash', 'adb': 'Ada', | |
306 | 'php4', 'phtml', 'pm', 'py', 'rb', 'rst', 's', 'sh', |
|
287 | 'ada': 'Ada', 'c++-objdump': 'CppObjdump', 'aspx': | |
307 | 'tpl', 'txt', 'vim', 'wss', 'xhtml', 'xml', 'xsl', 'xslt', |
|
288 | 'VbNetAspx', 'ksh': 'Bash', 'coffee': 'CoffeeScript', 'vert': 'GLShader', | |
308 | 'yaws'] |
|
289 | 'Makefile.*': 'Makefile', 'di': 'D', 'dpatch': 'DarcsPatch', 'rake': | |
309 | repos_path = get_hg_ui_settings()['paths_root_path'].replace('*', '') |
|
290 | 'Ruby', 'moo': 'MOOCode', 'erl-sh': 'ErlangShell', 'geo': 'GLShader', | |
310 | repo = MercurialRepository(repos_path + repo_name) |
|
291 | 'pov': 'Povray', 'bas': 'VbNet', 'bat': 'Batch', 'd': 'D', 'lisp': | |
|
292 | 'CommonLisp', 'h': 'C', 'rbx': 'Ruby', 'tcl': 'Tcl', 'c++': 'Cpp', 'md': | |||
|
293 | 'MiniD', '.vimrc': 'Vim', 'xsd': 'Xml', 'ml': 'Ocaml', 'el': 'CommonLisp', | |||
|
294 | 'befunge': 'Befunge', 'xsl': 'Xslt', 'pyx': 'Cython', 'cfm': | |||
|
295 | 'ColdfusionHtml', 'evoque': 'Evoque', 'cfg': 'Ini', 'htm': 'Html', | |||
|
296 | 'Makefile': 'Makefile', 'cfc': 'ColdfusionHtml', 'tex': 'Tex', 'cs': | |||
|
297 | 'CSharp', 'mxml': 'Mxml', 'patch': 'Diff', 'apache.conf': 'ApacheConf', | |||
|
298 | 'scala': 'Scala', 'applescript': 'AppleScript', 'GNUmakefile': 'Makefile', | |||
|
299 | 'c-objdump': 'CObjdump', 'lua': 'Lua', 'apache2.conf': 'ApacheConf', 'rb': | |||
|
300 | 'Ruby', 'gemspec': 'Ruby', 'rl': 'RagelObjectiveC', 'vala': 'Vala', 'tmpl': | |||
|
301 | 'Cheetah', 'bf': 'Brainfuck', 'plt': 'Gnuplot', 'G': 'AntlrRuby', 'xslt': | |||
|
302 | 'Xslt', 'flxh': 'Felix', 'asax': 'VbNetAspx', 'Rakefile': 'Ruby', 'S': 'S', | |||
|
303 | 'wsdl': 'Xml', 'js': 'Javascript', 'autodelegate': 'Myghty', 'properties': | |||
|
304 | 'Ini', 'bash': 'Bash', 'c': 'C', 'g': 'AntlrRuby', 'r3': 'Rebol', 's': | |||
|
305 | 'Gas', 'ashx': 'VbNetAspx', 'cxx': 'Cpp', 'boo': 'Boo', 'prolog': 'Prolog', | |||
|
306 | 'sqlite3-console': 'SqliteConsole', 'cl': 'CommonLisp', 'cc': 'Cpp', 'pot': | |||
|
307 | 'Gettext', 'vim': 'Vim', 'pxi': 'Cython', 'yaml': 'Yaml', 'SConstruct': | |||
|
308 | 'Python', 'diff': 'Diff', 'txt': 'Text', 'cw': 'Redcode', 'pxd': 'Cython', | |||
|
309 | 'plot': 'Gnuplot', 'java': 'Java', 'hrl': 'Erlang', 'py': 'Python', | |||
|
310 | 'makefile': 'Makefile', 'squid.conf': 'SquidConf', 'asm': 'Nasm', 'toc': | |||
|
311 | 'Tex', 'kid': 'Genshi', 'rhtml': 'Rhtml', 'po': 'Gettext', 'pl': 'Prolog', | |||
|
312 | 'pm': 'Perl', 'hx': 'Haxe', 'ascx': 'VbNetAspx', 'ooc': 'Ooc', 'asy': | |||
|
313 | 'Asymptote', 'hs': 'Haskell', 'SConscript': 'Python', 'pytb': | |||
|
314 | 'PythonTraceback', 'myt': 'Myghty', 'hh': 'Cpp', 'R': 'S', 'aux': 'Tex', | |||
|
315 | 'rst': 'Rst', 'cpp-objdump': 'CppObjdump', 'lgt': 'Logtalk', 'rss': 'Xml', | |||
|
316 | 'flx': 'Felix', 'b': 'Brainfuck', 'f': 'Fortran', 'rbw': 'Ruby', | |||
|
317 | '.htaccess': 'ApacheConf', 'cxx-objdump': 'CppObjdump', 'j': 'ObjectiveJ', | |||
|
318 | 'mll': 'Ocaml', 'yml': 'Yaml', 'mu': 'MuPAD', 'r': 'Rebol', 'ASM': 'Nasm', | |||
|
319 | 'erl': 'Erlang', 'mly': 'Ocaml', 'mo': 'Modelica', 'def': 'Modula2', 'ini': | |||
|
320 | 'Ini', 'control': 'DebianControl', 'vb': 'VbNet', 'vapi': 'Vala', 'pro': | |||
|
321 | 'Prolog', 'spt': 'Cheetah', 'mli': 'Ocaml', 'as': 'ActionScript3', 'cmd': | |||
|
322 | 'Batch', 'cpp': 'Cpp', 'io': 'Io', 'tac': 'Python', 'haml': 'Haml', 'rkt': | |||
|
323 | 'Racket', 'st':'Smalltalk', 'inc': 'Povray', 'pas': 'Delphi', 'cmake': | |||
|
324 | 'CMake', 'csh':'Tcsh', 'hpp': 'Cpp', 'feature': 'Gherkin', 'html': 'Html', | |||
|
325 | 'php':'Php', 'php3':'Php', 'php4':'Php', 'php5':'Php', 'xhtml': 'Html', | |||
|
326 | 'hxx': 'Cpp', 'eclass': 'Bash', 'css': 'Css', | |||
|
327 | 'frag': 'GLShader', 'd-objdump': 'DObjdump', 'weechatlog': 'IrcLogs', | |||
|
328 | 'tcsh': 'Tcsh', 'objdump': 'Objdump', 'pyw': 'Python', 'h++': 'Cpp', | |||
|
329 | 'py3tb': 'Python3Traceback', 'jsp': 'Jsp', 'sql': 'Sql', 'mak': 'Makefile', | |||
|
330 | 'php': 'Php', 'mao': 'Mako', 'man': 'Groff', 'dylan': 'Dylan', 'sass': | |||
|
331 | 'Sass', 'cfml': 'ColdfusionHtml', 'darcspatch': 'DarcsPatch', 'tpl': | |||
|
332 | 'Smarty', 'm': 'ObjectiveC', 'f90': 'Fortran', 'mod': 'Modula2', 'sh': | |||
|
333 | 'Bash', 'lhs': 'LiterateHaskell', 'sources.list': 'SourcesList', 'axd': | |||
|
334 | 'VbNetAspx', 'sc': 'Python'} | |||
|
335 | ||||
|
336 | repos_path = get_repos_path() | |||
|
337 | p = os.path.join(repos_path, repo_name) | |||
|
338 | repo = get_repo(p) | |||
311 | tip = repo.get_changeset() |
|
339 | tip = repo.get_changeset() | |
312 |
|
||||
313 | code_stats = {} |
|
340 | code_stats = {} | |
314 | for topnode, dirs, files in tip.walk('/'): |
|
341 | ||
315 | for f in files: |
|
342 | def aggregate(cs): | |
316 | k = f.mimetype |
|
343 | for f in cs[2]: | |
317 |
|
|
344 | ext = f.extension | |
318 | if code_stats.has_key(k): |
|
345 | key = LANGUAGES_EXTENSIONS_MAP.get(ext, ext) | |
319 | code_stats[k] += 1 |
|
346 | key = key or ext | |
|
347 | if ext in LANGUAGES_EXTENSIONS_MAP.keys() and not f.is_binary: | |||
|
348 | if code_stats.has_key(key): | |||
|
349 | code_stats[key] += 1 | |||
320 | else: |
|
350 | else: | |
321 | code_stats[k] = 1 |
|
351 | code_stats[key] = 1 | |
322 |
|
352 | |||
|
353 | map(aggregate, tip.walk('/')) | |||
|
354 | ||||
323 | return code_stats or {} |
|
355 | return code_stats or {} | |
324 |
|
356 | |||
325 |
|
357 | |||
326 |
|
||||
327 |
|
358 | |||
328 |
|
359 |
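The tasks.py hunk above replaces the removed `cnt >= parse_limit` break with a sliced window `repo.revisions[last_rev:last_rev + parse_limit]` and a re-queued task until the stored marker catches up with the tip. A minimal sketch of that batching pattern, with an illustrative stand-in for the real per-day aggregation (the names here are not RhodeCode's API):

```python
PARSE_LIMIT = 250  # same window size the changeset uses

def parse_batch(revisions, last_rev, stats):
    """Aggregate one window of revisions; return the new resume marker."""
    # skip the already-parsed marker revision on resumed runs,
    # mirroring `last_rev + 1 if last_rev > 0 else last_rev`
    start = last_rev + 1 if last_rev > 0 else last_rev
    for rev in revisions[start:start + PARSE_LIMIT]:
        stats[rev] = stats.get(rev, 0) + 1  # stand-in for commit aggregation
        last_rev = rev
    return last_rev

# driver loop standing in for the task re-queuing itself via run_task()
revisions = list(range(600))
stats, marker = {}, 0
while marker < revisions[-1]:
    marker = parse_batch(revisions, marker, stats)
```

Each invocation does a bounded amount of work and persists only the marker, which is why the changeset can drop the in-loop break without risking a long-running task.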
@@ -1,8 +1,16 b'' | |||||
1 | #!/usr/bin/env python |
|
1 | # -*- coding: utf-8 -*- | |
2 | # encoding: utf-8 |
|
2 | """ | |
3 | # database management for RhodeCode |
|
3 | rhodecode.lib.db_manage | |
4 | # Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> |
|
4 | ~~~~~~~~~~~~~~~~~~~~~~~ | |
5 | # |
|
5 | ||
|
6 | Database creation, and setup module for RhodeCode. Used for creation | |||
|
7 | of database as well as for migration operations | |||
|
8 | ||||
|
9 | :created_on: Apr 10, 2010 | |||
|
10 | :author: marcink | |||
|
11 | :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com> | |||
|
12 | :license: GPLv3, see COPYING for more details. | |||
|
13 | """ | |||
6 | # This program is free software; you can redistribute it and/or |
|
14 | # This program is free software; you can redistribute it and/or | |
7 | # modify it under the terms of the GNU General Public License |
|
15 | # modify it under the terms of the GNU General Public License | |
8 | # as published by the Free Software Foundation; version 2 |
|
16 | # as published by the Free Software Foundation; version 2 | |
@@ -18,51 +26,50 b'' | |||||
18 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, |
|
26 | # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, | |
19 | # MA 02110-1301, USA. |
|
27 | # MA 02110-1301, USA. | |
20 |
|
28 | |||
21 | """ |
|
|||
22 | Created on April 10, 2010 |
|
|||
23 | database management and creation for RhodeCode |
|
|||
24 | @author: marcink |
|
|||
25 | """ |
|
|||
26 |
|
||||
27 | from os.path import dirname as dn, join as jn |
|
|||
28 | import os |
|
29 | import os | |
29 | import sys |
|
30 | import sys | |
30 | import uuid |
|
31 | import uuid | |
|
32 | import logging | |||
|
33 | from os.path import dirname as dn, join as jn | |||
|
34 | ||||
|
35 | from rhodecode import __dbversion__ | |||
|
36 | from rhodecode.model import meta | |||
31 |
|
37 | |||
32 | from rhodecode.lib.auth import get_crypt_password |
|
38 | from rhodecode.lib.auth import get_crypt_password | |
33 | from rhodecode.lib.utils import ask_ok |
|
39 | from rhodecode.lib.utils import ask_ok | |
34 | from rhodecode.model import init_model |
|
40 | from rhodecode.model import init_model | |
35 | from rhodecode.model.db import User, Permission, RhodeCodeUi, RhodeCodeSettings, \ |
|
41 | from rhodecode.model.db import User, Permission, RhodeCodeUi, RhodeCodeSettings, \ | |
36 | UserToPerm |
|
42 | UserToPerm, DbMigrateVersion | |
37 | from rhodecode.model import meta |
|
43 | ||
38 | from sqlalchemy.engine import create_engine |
|
44 | from sqlalchemy.engine import create_engine | |
39 | import logging |
|
|||
40 |
|
45 | |||
41 | log = logging.getLogger(__name__) |
|
46 | log = logging.getLogger(__name__) | |
42 |
|
47 | |||
43 | class DbManage(object): |
|
48 | class DbManage(object): | |
44 | def __init__(self, log_sql, db
|
49 | def __init__(self, log_sql, dbconf, root, tests=False): | |
45 | self.dbname = db
|
50 | self.dbname = dbconf.split('/')[-1] | |
46 | self.tests = tests |
|
51 | self.tests = tests | |
47 | self.root = root |
|
52 | self.root = root | |
48 | dburi = 'sqlite:////%s' % jn(self.root, self.dbname) |
|
53 | self.dburi = dbconf | |
49 | engine = create_engine(dburi, echo=log_sql)
|
54 | engine = create_engine(self.dburi, echo=log_sql) | |
50 | init_model(engine) |
|
55 | init_model(engine) | |
51 | self.sa = meta.Session |
|
56 | self.sa = meta.Session() | |
52 | self.db_exists = False |
|
57 | self.db_exists = False | |
53 |
|
58 | |||
54 | def check_for_db(self, override): |
|
59 | def check_for_db(self, override): | |
55 | db_path = jn(self.root, self.dbname) |
|
60 | db_path = jn(self.root, self.dbname) | |
56 | log.info('checking for existing db in %s', db_path) |
|
61 | if self.dburi.startswith('sqlite'): | |
57 | if os.path.isfile(db_path): |
|
62 | log.info('checking for existing db in %s', db_path) | |
58 | self.db_exists = True |
|
63 | if os.path.isfile(db_path): | |
59 | if not override: |
|
64 | ||
60 | raise Exception('database already exists') |
|
65 | self.db_exists = True | |
|
66 | if not override: | |||
|
67 | raise Exception('database already exists') | |||
61 |
|
68 | |||
62 | def create_tables(self, override=False): |
|
69 | def create_tables(self, override=False): | |
|
70 | """Create a auth database | |||
63 | """
71 | """ | |
64 | Create a auth database |
|
72 | ||
65 | """ |
|
|||
66 | self.check_for_db(override) |
|
73 | self.check_for_db(override) | |
67 | if self.db_exists: |
|
74 | if self.db_exists: | |
68 | log.info("database exist and it's going to be destroyed") |
|
75 | log.info("database exist and it's going to be destroyed") | |
@@ -77,34 +84,163 b' class DbManage(object):' | |||||
77 | checkfirst = not override |
|
84 | checkfirst = not override | |
78 | meta.Base.metadata.create_all(checkfirst=checkfirst) |
|
85 | meta.Base.metadata.create_all(checkfirst=checkfirst) | |
79 | log.info('Created tables for %s', self.dbname) |
|
86 | log.info('Created tables for %s', self.dbname) | |
80 |
|
87 | |||
|
88 | ||||
|
89 | ||||
|
90 | def set_db_version(self): | |||
|
91 | try: | |||
|
92 | ver = DbMigrateVersion() | |||
|
93 | ver.version = __dbversion__ | |||
|
94 | ver.repository_id = 'rhodecode_db_migrations' | |||
|
95 | ver.repository_path = 'versions' | |||
|
96 | self.sa.add(ver) | |||
|
97 | self.sa.commit() | |||
|
98 | except: | |||
|
99 | self.sa.rollback() | |||
|
100 | raise | |||
|
101 | log.info('db version set to: %s', __dbversion__) | |||
|
102 | ||||
|
103 | ||||
|
104 | def upgrade(self): | |||
|
105 | """Upgrades given database schema to given revision following | |||
|
106 | all needed steps, | |||
|
107 | ||||
|
108 | :param revision: revision to upgrade to | |||
|
109 | """ | |||
|
110 | ||||
|
111 | from rhodecode.lib.dbmigrate.migrate.versioning import api | |||
|
112 | from rhodecode.lib.dbmigrate.migrate.exceptions import \ | |||
|
113 | DatabaseNotControlledError | |||
|
114 | ||||
|
115 | upgrade = ask_ok('You are about to perform database upgrade, make ' | |||
|
116 | 'sure You backed up your database before. ' | |||
|
117 | 'Continue ? [y/n]') | |||
|
118 | if not upgrade: | |||
|
119 | sys.exit('Nothing done') | |||
|
120 | ||||
|
121 | repository_path = jn(dn(dn(dn(os.path.realpath(__file__)))), | |||
|
122 | 'rhodecode/lib/dbmigrate') | |||
|
123 | db_uri = self.dburi | |||
|
124 | ||||
|
125 | try: | |||
|
126 | curr_version = api.db_version(db_uri, repository_path) | |||
|
127 | msg = ('Found current database under version' | |||
|
128 | ' control with version %s' % curr_version) | |||
|
129 | ||||
|
130 | except (RuntimeError, DatabaseNotControlledError), e: | |||
|
131 | curr_version = 1 | |||
|
132 | msg = ('Current database is not under version control. Setting' | |||
|
133 | ' as version %s' % curr_version) | |||
|
134 | api.version_control(db_uri, repository_path, curr_version) | |||
|
135 | ||||
|
136 | print (msg) | |||
|
137 | ||||
|
138 | if curr_version == __dbversion__: | |||
|
139 | sys.exit('This database is already at the newest version') | |||
|
140 | ||||
|
141 | #====================================================================== | |||
|
142 | # UPGRADE STEPS | |||
|
143 | #====================================================================== | |||
|
144 | class UpgradeSteps(object): | |||
|
145 | ||||
|
146 | def __init__(self, klass): | |||
|
147 | self.klass = klass | |||
|
148 | ||||
|
149 | def step_0(self): | |||
|
150 | #step 0 is the schema upgrade, and than follow proper upgrades | |||
|
151 | print ('attempting to do database upgrade to version %s' \ | |||
|
152 | % __dbversion__) | |||
|
153 | api.upgrade(db_uri, repository_path, __dbversion__) | |||
|
154 | print ('Schema upgrade completed') | |||
|
155 | ||||
|
156 | def step_1(self): | |||
|
157 | pass | |||
|
158 | ||||
|
159 | def step_2(self): | |||
|
160 | print ('Patching repo paths for newer version of RhodeCode') | |||
|
161 | self.klass.fix_repo_paths() | |||
|
162 | ||||
|
163 | print ('Patching default user of RhodeCode') | |||
|
164 | self.klass.fix_default_user() | |||
|
165 | ||||
|
166 | log.info('Changing ui settings') | |||
|
167 | self.klass.create_ui_settings() | |||
|
168 | ||||
|
169 | ||||
|
170 | upgrade_steps = [0] + range(curr_version + 1, __dbversion__ + 1) | |||
|
171 | ||||
|
172 | #CALL THE PROPER ORDER OF STEPS TO PERFORM FULL UPGRADE | |||
|
173 | for step in upgrade_steps: | |||
|
174 | print ('performing upgrade step %s' % step) | |||
|
175 | callable = getattr(UpgradeSteps(self), 'step_%s' % step)() | |||
|
176 | ||||
|
177 | ||||
|
178 | ||||
|
179 | def fix_repo_paths(self): | |||
|
180 | """Fixes a old rhodecode version path into new one without a '*' | |||
|
181 | """ | |||
|
182 | ||||
|
183 | paths = self.sa.query(RhodeCodeUi)\ | |||
|
184 | .filter(RhodeCodeUi.ui_key == '/')\ | |||
|
185 | .scalar() | |||
|
186 | ||||
|
187 | paths.ui_value = paths.ui_value.replace('*', '') | |||
|
188 | ||||
|
189 | try: | |||
|
190 | self.sa.add(paths) | |||
|
191 | self.sa.commit() | |||
|
192 | except: | |||
|
193 | self.sa.rollback() | |||
|
194 | raise | |||
|
195 | ||||
|
196 | def fix_default_user(self): | |||
|
197 | """Fixes a old default user with some 'nicer' default values, | |||
|
198 | used mostly for anonymous access | |||
|
199 | """ | |||
|
200 | def_user = self.sa.query(User)\ | |||
|
201 | .filter(User.username == 'default')\ | |||
|
202 | .one() | |||
|
203 | ||||
|
204 | def_user.name = 'Anonymous' | |||
|
205 | def_user.lastname = 'User' | |||
|
206 | def_user.email = 'anonymous@rhodecode.org' | |||
|
207 | ||||
|
208 | try: | |||
|
209 | self.sa.add(def_user) | |||
|
210 | self.sa.commit() | |||
|
211 | except: | |||
|
212 | self.sa.rollback() | |||
|
213 | raise | |||
|
214 | ||||
|
215 | ||||
|
216 | ||||
81 | def admin_prompt(self, second=False): |
|
217 | def admin_prompt(self, second=False): | |
82 | if not self.tests: |
|
218 | if not self.tests: | |
83 | import getpass |
|
219 | import getpass | |
84 |
|
220 | |||
85 |
|
221 | |||
86 | def get_password(): |
|
222 | def get_password(): | |
87 | password = getpass.getpass('Specify admin password (min 6 chars):') |
|
223 | password = getpass.getpass('Specify admin password (min 6 chars):') | |
88 | confirm = getpass.getpass('Confirm password:') |
|
224 | confirm = getpass.getpass('Confirm password:') | |
89 |
|
225 | |||
90 | if password != confirm: |
|
226 | if password != confirm: | |
91 | log.error('passwords mismatch') |
|
227 | log.error('passwords mismatch') | |
92 | return False |
|
228 | return False | |
93 | if len(password) < 6: |
|
229 | if len(password) < 6: | |
94 | log.error('password is to short use at least 6 characters') |
|
230 | log.error('password is to short use at least 6 characters') | |
95 | return False |
|
231 | return False | |
96 |
|
232 | |||
97 | return password |
|
233 | return password | |
98 |
|
234 | |||
99 | username = raw_input('Specify admin username:') |
|
235 | username = raw_input('Specify admin username:') | |
100 |
|
236 | |||
101 | password = get_password() |
|
237 | password = get_password() | |
102 | if not password: |
|
238 | if not password: | |
103 | #second try |
|
239 | #second try | |
104 | password = get_password() |
|
240 | password = get_password() | |
105 | if not password: |
|
241 | if not password: | |
106 | sys.exit() |
|
242 | sys.exit() | |
107 |
|
243 | |||
108 | email = raw_input('Specify admin email:') |
|
244 | email = raw_input('Specify admin email:') | |
109 | self.create_user(username, password, email, True) |
|
245 | self.create_user(username, password, email, True) | |
110 | else: |
|
246 | else: | |
@@ -112,71 +248,121 b' class DbManage(object):' | |||||
112 | self.create_user('test_admin', 'test12', 'test_admin@mail.com', True) |
|
248 | self.create_user('test_admin', 'test12', 'test_admin@mail.com', True) | |
113 | self.create_user('test_regular', 'test12', 'test_regular@mail.com', False) |
|
249 | self.create_user('test_regular', 'test12', 'test_regular@mail.com', False) | |
114 | self.create_user('test_regular2', 'test12', 'test_regular2@mail.com', False) |
|
250 | self.create_user('test_regular2', 'test12', 'test_regular2@mail.com', False) | |
115 |
|
251 | |||
|
252 | def create_ui_settings(self): | |||
|
253 | """Creates ui settings, fills out hooks | |||
|
254 | and disables dotencode | |||
116 |
|
|
255 | ||
117 |
|
256 | """ | ||
|
257 | #HOOKS | |||
|
258 | hooks1_key = 'changegroup.update' | |||
|
259 | hooks1_ = self.sa.query(RhodeCodeUi)\ | |||
|
260 | .filter(RhodeCodeUi.ui_key == hooks1_key).scalar() | |||
|
261 | ||||
|
262 | hooks1 = RhodeCodeUi() if hooks1_ is None else hooks1_ | |||
|
263 | hooks1.ui_section = 'hooks' | |||
|
264 | hooks1.ui_key = hooks1_key | |||
|
265 | hooks1.ui_value = 'hg update >&2' | |||
|
266 | hooks1.ui_active = False | |||
|
267 | ||||
|
268 | hooks2_key = 'changegroup.repo_size' | |||
|
269 | hooks2_ = self.sa.query(RhodeCodeUi)\ | |||
|
270 | .filter(RhodeCodeUi.ui_key == hooks2_key).scalar() | |||
|
271 | ||||
|
272 | hooks2 = RhodeCodeUi() if hooks2_ is None else hooks2_ | |||
|
273 | hooks2.ui_section = 'hooks' | |||
|
274 | hooks2.ui_key = hooks2_key | |||
|
275 | hooks2.ui_value = 'python:rhodecode.lib.hooks.repo_size' | |||
|
276 | ||||
|
277 | hooks3 = RhodeCodeUi() | |||
|
278 | hooks3.ui_section = 'hooks' | |||
|
279 | hooks3.ui_key = 'pretxnchangegroup.push_logger' | |||
|
280 | hooks3.ui_value = 'python:rhodecode.lib.hooks.log_push_action' | |||
|
281 | ||||
|
282 | hooks4 = RhodeCodeUi() | |||
|
283 | hooks4.ui_section = 'hooks' | |||
|
284 | hooks4.ui_key = 'preoutgoing.pull_logger' | |||
|
285 | hooks4.ui_value = 'python:rhodecode.lib.hooks.log_pull_action' | |||
|
286 | ||||
|
287 | #For mercurial 1.7 set backward comapatibility with format | |||
|
288 | dotencode_disable = RhodeCodeUi() | |||
|
289 | dotencode_disable.ui_section = 'format' | |||
|
290 | dotencode_disable.ui_key = 'dotencode' | |||
|
291 | dotencode_disable.ui_value = 'false' | |||
|
292 | ||||
|
293 | try: | |||
|
294 | self.sa.add(hooks1) | |||
|
295 | self.sa.add(hooks2) | |||
|
296 | self.sa.add(hooks3) | |||
|
297 | self.sa.add(hooks4) | |||
|
298 | self.sa.add(dotencode_disable) | |||
|
299 | self.sa.commit() | |||
|
300 | except: | |||
|
301 | self.sa.rollback() | |||
|
302 | raise | |||
|
303 | ||||
|
304 | ||||
|
305 | def create_ldap_options(self): | |||
|
306 | """Creates ldap settings""" | |||
|
307 | ||||
|
308 | try: | |||
|
309 | for k in ['ldap_active', 'ldap_host', 'ldap_port', 'ldap_ldaps', | |||
|
310 | 'ldap_dn_user', 'ldap_dn_pass', 'ldap_base_dn']: | |||
|
311 | ||||
|
312 | setting = RhodeCodeSettings(k, '') | |||
|
313 | self.sa.add(setting) | |||
|
314 | self.sa.commit() | |||
|
315 | except: | |||
|
316 | self.sa.rollback() | |||
|
317 | raise | |||
|
318 | ||||
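The `RhodeCodeUi` rows seeded above (the `hooks` entries and the `format`/`dotencode` override) correspond one-to-one to entries Mercurial would normally read from an hgrc file. A runnable sketch of that mapping — `to_hgrc` and the sample `entries` list are invented here for illustration and are not part of RhodeCode:

```python
entries = [
    ('hooks', 'changegroup.repo_size', 'python:rhodecode.lib.hooks.repo_size'),
    ('hooks', 'pretxnchangegroup.push_logger',
     'python:rhodecode.lib.hooks.log_push_action'),
    ('format', 'dotencode', 'false'),
]

def to_hgrc(entries):
    # Group (section, key, value) triples by section, then render
    # them in hgrc/ini syntax, sections in sorted order.
    sections = {}
    for section, key, value in entries:
        sections.setdefault(section, []).append((key, value))
    lines = []
    for section in sorted(sections):
        lines.append('[%s]' % section)
        for key, value in sections[section]:
            lines.append('%s = %s' % (key, value))
    return '\n'.join(lines)

print(to_hgrc(entries))
```

Storing these triples in the database instead of per-repository hgrc files lets the application inject them into every repository's configuration at runtime.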
118 | def config_prompt(self, test_repo_path=''): |
|
319 | def config_prompt(self, test_repo_path=''): | |
119 | log.info('Setting up repositories config') |
|
320 | log.info('Setting up repositories config') | |
120 |
|
321 | |||
121 | if not self.tests and not test_repo_path: |
|
322 | if not self.tests and not test_repo_path: | |
122 | path = raw_input('Specify valid full path to your repositories' |
|
323 | path = raw_input('Specify valid full path to your repositories' | |
123 | ' you can change this later in application settings:') |
|
324 | ' you can change this later in application settings:') | |
124 | else: |
|
325 | else: | |
125 | path = test_repo_path |
|
326 | path = test_repo_path | |
126 |
|
327 | |||
127 | if not os.path.isdir(path): |
|
328 | if not os.path.isdir(path): | |
128 | log.error('You entered wrong path: %s', path) |
|
329 | log.error('You entered wrong path: %s', path) | |
129 | sys.exit() |
|
330 | sys.exit() | |
130 |
|
331 | |||
131 | hooks1 = RhodeCodeUi() |
|
332 | self.create_ui_settings() | |
132 | hooks1.ui_section = 'hooks' |
|
333 | ||
133 | hooks1.ui_key = 'changegroup.update' |
|
334 | #HG UI OPTIONS | |
134 | hooks1.ui_value = 'hg update >&2' |
|
|||
135 | hooks1.ui_active = False |
|
|||
136 |
|
||||
137 | hooks2 = RhodeCodeUi() |
|
|||
138 | hooks2.ui_section = 'hooks' |
|
|||
139 | hooks2.ui_key = 'changegroup.repo_size' |
|
|||
140 | hooks2.ui_value = 'python:rhodecode.lib.hooks.repo_size' |
|
|||
141 |
|
||||
142 | web1 = RhodeCodeUi() |
|
335 | web1 = RhodeCodeUi() | |
143 | web1.ui_section = 'web' |
|
336 | web1.ui_section = 'web' | |
144 | web1.ui_key = 'push_ssl' |
|
337 | web1.ui_key = 'push_ssl' | |
145 | web1.ui_value = 'false' |
|
338 | web1.ui_value = 'false' | |
146 |
|
339 | |||
147 | web2 = RhodeCodeUi() |
|
340 | web2 = RhodeCodeUi() | |
148 | web2.ui_section = 'web' |
|
341 | web2.ui_section = 'web' | |
149 | web2.ui_key = 'allow_archive' |
|
342 | web2.ui_key = 'allow_archive' | |
150 | web2.ui_value = 'gz zip bz2' |
|
343 | web2.ui_value = 'gz zip bz2' | |
151 |
|
344 | |||
152 | web3 = RhodeCodeUi() |
|
345 | web3 = RhodeCodeUi() | |
153 | web3.ui_section = 'web' |
|
346 | web3.ui_section = 'web' | |
154 | web3.ui_key = 'allow_push' |
|
347 | web3.ui_key = 'allow_push' | |
155 | web3.ui_value = '*' |
|
348 | web3.ui_value = '*' | |
156 |
|
349 | |||
157 | web4 = RhodeCodeUi() |
|
350 | web4 = RhodeCodeUi() | |
158 | web4.ui_section = 'web' |
|
351 | web4.ui_section = 'web' | |
159 | web4.ui_key = 'baseurl' |
|
352 | web4.ui_key = 'baseurl' | |
160 |
web4.ui_value = '/' |
|
353 | web4.ui_value = '/' | |
161 |
|
354 | |||
162 | paths = RhodeCodeUi() |
|
355 | paths = RhodeCodeUi() | |
163 | paths.ui_section = 'paths' |
|
356 | paths.ui_section = 'paths' | |
164 | paths.ui_key = '/' |
|
357 | paths.ui_key = '/' | |
165 |
paths.ui_value = |
|
358 | paths.ui_value = path | |
166 |
|
359 | |||
167 |
|
360 | |||
168 | hgsettings1 = RhodeCodeSettings() |
|
361 | hgsettings1 = RhodeCodeSettings('realm', 'RhodeCode authentication') | |
169 |
|
362 | hgsettings2 = RhodeCodeSettings('title', 'RhodeCode') | ||
170 | hgsettings1.app_settings_name = 'realm' |
|
363 | ||
171 | hgsettings1.app_settings_value = 'RhodeCode authentication' |
|
364 | ||
172 |
|
||||
173 | hgsettings2 = RhodeCodeSettings() |
|
|||
174 | hgsettings2.app_settings_name = 'title' |
|
|||
175 | hgsettings2.app_settings_value = 'RhodeCode' |
|
|||
176 |
|
||||
177 | try: |
|
365 | try: | |
178 | self.sa.add(hooks1) |
|
|||
179 | self.sa.add(hooks2) |
|
|||
180 | self.sa.add(web1) |
|
366 | self.sa.add(web1) | |
181 | self.sa.add(web2) |
|
367 | self.sa.add(web2) | |
182 | self.sa.add(web3) |
|
368 | self.sa.add(web3) | |
@@ -184,12 +370,16 b' class DbManage(object):' | |||||
184 | self.sa.add(paths) |
|
370 | self.sa.add(paths) | |
185 | self.sa.add(hgsettings1) |
|
371 | self.sa.add(hgsettings1) | |
186 | self.sa.add(hgsettings2) |
|
372 | self.sa.add(hgsettings2) | |
|
373 | ||||
187 | self.sa.commit() |
|
374 | self.sa.commit() | |
188 | except: |
|
375 | except: | |
189 | self.sa.rollback() |
|
376 | self.sa.rollback() | |
190 |
raise |
|
377 | raise | |
|
378 | ||||
|
379 | self.create_ldap_options() | |||
|
380 | ||||
191 | log.info('created ui config') |
|
381 | log.info('created ui config') | |
192 |
|
382 | |||
193 | def create_user(self, username, password, email='', admin=False): |
|
383 | def create_user(self, username, password, email='', admin=False): | |
194 | log.info('creating administrator user %s', username) |
|
384 | log.info('creating administrator user %s', username) | |
195 | new_user = User() |
|
385 | new_user = User() | |
@@ -200,7 +390,7 b' class DbManage(object):' | |||||
200 | new_user.email = email |
|
390 | new_user.email = email | |
201 | new_user.admin = admin |
|
391 | new_user.admin = admin | |
202 | new_user.active = True |
|
392 | new_user.active = True | |
203 |
|
393 | |||
204 | try: |
|
394 | try: | |
205 | self.sa.add(new_user) |
|
395 | self.sa.add(new_user) | |
206 | self.sa.commit() |
|
396 | self.sa.commit() | |
@@ -214,9 +404,9 b' class DbManage(object):' | |||||
214 | def_user = User() |
|
404 | def_user = User() | |
215 | def_user.username = 'default' |
|
405 | def_user.username = 'default' | |
216 | def_user.password = get_crypt_password(str(uuid.uuid1())[:8]) |
|
406 | def_user.password = get_crypt_password(str(uuid.uuid1())[:8]) | |
217 |
def_user.name = ' |
|
407 | def_user.name = 'Anonymous' | |
218 |
def_user.lastname = ' |
|
408 | def_user.lastname = 'User' | |
219 |
def_user.email = ' |
|
409 | def_user.email = 'anonymous@rhodecode.org' | |
220 | def_user.admin = False |
|
410 | def_user.admin = False | |
221 | def_user.active = False |
|
411 | def_user.active = False | |
222 | try: |
|
412 | try: | |
@@ -225,7 +415,7 b' class DbManage(object):' | |||||
225 | except: |
|
415 | except: | |
226 | self.sa.rollback() |
|
416 | self.sa.rollback() | |
227 | raise |
|
417 | raise | |
228 |
|
418 | |||
229 | def create_permissions(self): |
|
419 | def create_permissions(self): | |
230 | #module.(access|create|change|delete)_[name] |
|
420 | #module.(access|create|change|delete)_[name] | |
231 | #module.(read|write|owner) |
|
421 | #module.(read|write|owner) | |
@@ -240,7 +430,7 b' class DbManage(object):' | |||||
240 | ('hg.register.manual_activate', 'Register new user with rhodecode without manual activation'), |
|
430 | ('hg.register.manual_activate', 'Register new user with rhodecode without manual activation'), | |
241 | ('hg.register.auto_activate', 'Register new user with rhodecode without auto activation'), |
|
431 | ('hg.register.auto_activate', 'Register new user with rhodecode without auto activation'), | |
242 | ] |
|
432 | ] | |
243 |
|
433 | |||
244 | for p in perms: |
|
434 | for p in perms: | |
245 | new_perm = Permission() |
|
435 | new_perm = Permission() | |
246 | new_perm.permission_name = p[0] |
|
436 | new_perm.permission_name = p[0] | |
@@ -254,28 +444,28 b' class DbManage(object):' | |||||
254 |
|
444 | |||
255 | def populate_default_permissions(self): |
|
445 | def populate_default_permissions(self): | |
256 | log.info('creating default user permissions') |
|
446 | log.info('creating default user permissions') | |
257 |
|
447 | |||
258 | default_user = self.sa.query(User)\ |
|
448 | default_user = self.sa.query(User)\ | |
259 | .filter(User.username == 'default').scalar() |
|
449 | .filter(User.username == 'default').scalar() | |
260 |
|
450 | |||
261 | reg_perm = UserToPerm() |
|
451 | reg_perm = UserToPerm() | |
262 | reg_perm.user = default_user |
|
452 | reg_perm.user = default_user | |
263 | reg_perm.permission = self.sa.query(Permission)\ |
|
453 | reg_perm.permission = self.sa.query(Permission)\ | |
264 | .filter(Permission.permission_name == 'hg.register.manual_activate')\ |
|
454 | .filter(Permission.permission_name == 'hg.register.manual_activate')\ | |
265 |
.scalar() |
|
455 | .scalar() | |
266 |
|
456 | |||
267 | create_repo_perm = UserToPerm() |
|
457 | create_repo_perm = UserToPerm() | |
268 | create_repo_perm.user = default_user |
|
458 | create_repo_perm.user = default_user | |
269 | create_repo_perm.permission = self.sa.query(Permission)\ |
|
459 | create_repo_perm.permission = self.sa.query(Permission)\ | |
270 | .filter(Permission.permission_name == 'hg.create.repository')\ |
|
460 | .filter(Permission.permission_name == 'hg.create.repository')\ | |
271 |
.scalar() |
|
461 | .scalar() | |
272 |
|
462 | |||
273 | default_repo_perm = UserToPerm() |
|
463 | default_repo_perm = UserToPerm() | |
274 | default_repo_perm.user = default_user |
|
464 | default_repo_perm.user = default_user | |
275 | default_repo_perm.permission = self.sa.query(Permission)\ |
|
465 | default_repo_perm.permission = self.sa.query(Permission)\ | |
276 | .filter(Permission.permission_name == 'repository.read')\ |
|
466 | .filter(Permission.permission_name == 'repository.read')\ | |
277 |
.scalar() |
|
467 | .scalar() | |
278 |
|
468 | |||
279 | try: |
|
469 | try: | |
280 | self.sa.add(reg_perm) |
|
470 | self.sa.add(reg_perm) | |
281 | self.sa.add(create_repo_perm) |
|
471 | self.sa.add(create_repo_perm) | |
@@ -283,5 +473,5 b' class DbManage(object):' | |||||
283 | self.sa.commit() |
|
473 | self.sa.commit() | |
284 | except: |
|
474 | except: | |
285 | self.sa.rollback() |
|
475 | self.sa.rollback() | |
286 |
raise |
|
476 | raise | |
287 |
|
477 |
@@ -3,6 +3,8 b'' | |||||
3 | Consists of functions to typically be used within templates, but also |
|
3 | Consists of functions to typically be used within templates, but also | |
4 | available to Controllers. This module is available to both as 'h'. |
|
4 | available to Controllers. This module is available to both as 'h'. | |
5 | """ |
|
5 | """ | |
|
6 | import random | |||
|
7 | import hashlib | |||
6 | from pygments.formatters import HtmlFormatter |
|
8 | from pygments.formatters import HtmlFormatter | |
7 | from pygments import highlight as code_highlight |
|
9 | from pygments import highlight as code_highlight | |
8 | from pylons import url, app_globals as g |
|
10 | from pylons import url, app_globals as g | |
@@ -23,6 +25,36 b' from webhelpers.pylonslib.secure_form im' | |||||
23 | from webhelpers.text import chop_at, collapse, convert_accented_entities, \ |
|
25 | from webhelpers.text import chop_at, collapse, convert_accented_entities, \ | |
24 | convert_misc_entities, lchop, plural, rchop, remove_formatting, \ |
|
26 | convert_misc_entities, lchop, plural, rchop, remove_formatting, \ | |
25 | replace_whitespace, urlify, truncate, wrap_paragraphs |
|
27 | replace_whitespace, urlify, truncate, wrap_paragraphs | |
|
28 | from webhelpers.date import time_ago_in_words | |||
|
29 | ||||
|
30 | from webhelpers.html.tags import _set_input_attrs, _set_id_attr, \ | |||
|
31 | convert_boolean_attrs, NotGiven | |||
|
32 | ||||
|
33 | def _reset(name, value=None, id=NotGiven, type="reset", **attrs): | |||
|
34 | _set_input_attrs(attrs, type, name, value) | |||
|
35 | _set_id_attr(attrs, id, name) | |||
|
36 | convert_boolean_attrs(attrs, ["disabled"]) | |||
|
37 | return HTML.input(**attrs) | |||
|
38 | ||||
|
39 | reset = _reset | |||
|
40 | ||||
|
41 | ||||
|
42 | def get_token(): | |||
|
43 | """Return the current authentication token, creating one if one doesn't | |||
|
44 | already exist. | |||
|
45 | """ | |||
|
46 | token_key = "_authentication_token" | |||
|
47 | from pylons import session | |||
|
48 | if not token_key in session: | |||
|
49 | try: | |||
|
50 | token = hashlib.sha1(str(random.getrandbits(128))).hexdigest() | |||
|
51 | except AttributeError: # Python < 2.4 | |||
|
52 | token = hashlib.sha1(str(random.randrange(2 ** 128))).hexdigest() | |||
|
53 | session[token_key] = token | |||
|
54 | if hasattr(session, 'save'): | |||
|
55 | session.save() | |||
|
56 | return session[token_key] | |||
|
57 | ||||
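`get_token` above derives a CSRF-style session token from 128 random bits hashed with SHA-1. The core of that scheme, adapted to take an explicit RNG and to Python 3 string handling for testability — `make_token` is an illustrative name, not RhodeCode API:

```python
import hashlib
import random

def make_token(rng=random):
    # Same scheme as get_token() above: SHA-1 hex digest of the decimal
    # string form of 128 random bits.
    bits = rng.getrandbits(128)
    return hashlib.sha1(str(bits).encode('ascii')).hexdigest()

token = make_token()
print(token)
```

Passing a seeded `random.Random` makes the token reproducible, which is useful in tests; the session-backed version above generates it once and caches it under `_authentication_token`.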
26 |
|
58 | |||
27 | #Custom helpers here :) |
|
59 | #Custom helpers here :) | |
28 | class _Link(object): |
|
60 | class _Link(object): | |
@@ -93,7 +125,7 b' class _ToolTip(object):' | |||||
93 | var tts = YAHOO.util.Dom.getElementsByClassName('tooltip'); |
|
125 | var tts = YAHOO.util.Dom.getElementsByClassName('tooltip'); | |
94 |
|
126 | |||
95 | for (var i = 0; i < tts.length; i++) { |
|
127 | for (var i = 0; i < tts.length; i++) { | |
96 | //if element doesn not have and id autgenerate one for tooltip |
|
128 | //if element doesn't have an id, autogenerate one for tooltip | 
97 |
|
129 | |||
98 | if (!tts[i].id){ |
|
130 | if (!tts[i].id){ | |
99 | tts[i].id='tt'+i*100; |
|
131 | tts[i].id='tt'+i*100; | |
@@ -111,7 +143,7 b' class _ToolTip(object):' | |||||
111 | showdelay:20, |
|
143 | showdelay:20, | |
112 | }); |
|
144 | }); | |
113 |
|
145 | |||
114 | //Mouse Over event disabled for new repositories since they dont |
|
146 | //Mouse Over event disabled for new repositories since they don't | |
115 | //have last commit message |
|
147 | //have last commit message | |
116 | myToolTips.contextMouseOverEvent.subscribe( |
|
148 | myToolTips.contextMouseOverEvent.subscribe( | |
117 | function(type, args) { |
|
149 | function(type, args) { | |
@@ -270,13 +302,13 b' def pygmentize_annotation(filenode, **kw' | |||||
270 | tooltip_html = tooltip_html % (changeset.author, |
|
302 | tooltip_html = tooltip_html % (changeset.author, | |
271 | changeset.date, |
|
303 | changeset.date, | |
272 | tooltip(changeset.message)) |
|
304 | tooltip(changeset.message)) | |
273 |
lnk_format = ' |
|
305 | lnk_format = '%5s:%s' % ('r%s' % changeset.revision, | |
274 |
changeset. |
|
306 | short_id(changeset.raw_id)) | |
275 | uri = link_to( |
|
307 | uri = link_to( | |
276 | lnk_format, |
|
308 | lnk_format, | |
277 | url('changeset_home', repo_name=changeset.repository.name, |
|
309 | url('changeset_home', repo_name=changeset.repository.name, | |
278 |
revision=changeset. |
|
310 | revision=changeset.raw_id), | |
279 |
style=get_color_string(changeset. |
|
311 | style=get_color_string(changeset.raw_id), | |
280 | class_='tooltip', |
|
312 | class_='tooltip', | |
281 | tooltip_title=tooltip_html |
|
313 | tooltip_title=tooltip_html | |
282 | ) |
|
314 | ) | |
@@ -317,37 +349,168 b' def get_changeset_safe(repo, rev):' | |||||
317 | flash = _Flash() |
|
349 | flash = _Flash() | |
318 |
|
350 | |||
319 |
|
351 | |||
320 |
#============================================================================== |
|
352 | #============================================================================== | |
321 | # MERCURIAL FILTERS available via h. |
|
353 | # MERCURIAL FILTERS available via h. | |
322 |
#============================================================================== |
|
354 | #============================================================================== | |
323 | from mercurial import util |
|
355 | from mercurial import util | |
324 |
from mercurial.templatefilters import |
|
356 | from mercurial.templatefilters import person as _person | |
|
357 | ||||
|
358 | ||||
|
359 | ||||
|
360 | def _age(curdate): | |||
|
361 | """turns a datetime into an age string.""" | |||
|
362 | ||||
|
363 | if not curdate: | |||
|
364 | return '' | |||
|
365 | ||||
|
366 | from datetime import timedelta, datetime | |||
|
367 | ||||
|
368 | agescales = [("year", 3600 * 24 * 365), | |||
|
369 | ("month", 3600 * 24 * 30), | |||
|
370 | ("day", 3600 * 24), | |||
|
371 | ("hour", 3600), | |||
|
372 | ("minute", 60), | |||
|
373 | ("second", 1), ] | |||
|
374 | ||||
|
375 | age = datetime.now() - curdate | |||
|
376 | age_seconds = (age.days * agescales[2][1]) + age.seconds | |||
|
377 | pos = 1 | |||
|
378 | for scale in agescales: | |||
|
379 | if scale[1] <= age_seconds: | |||
|
380 | if pos == 6:pos = 5 | |||
|
381 | return time_ago_in_words(curdate, agescales[pos][0]) + ' ' + _('ago') | |||
|
382 | pos += 1 | |||
|
383 | ||||
|
384 | return _('just now') | |||
325 |
|
385 | |||
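The `_age` helper above walks a coarse-to-fine scale table to pick the right granularity before delegating to webhelpers' `time_ago_in_words`. A simplified, self-contained sketch of just that bucketing step — `coarsest_scale` is a hypothetical stand-in, not the function's real return value:

```python
# Ordered largest-to-smallest, as in _age() above.
AGESCALES = [("year", 3600 * 24 * 365),
             ("month", 3600 * 24 * 30),
             ("day", 3600 * 24),
             ("hour", 3600),
             ("minute", 60),
             ("second", 1)]

def coarsest_scale(age_seconds):
    # Return the first (largest) scale whose unit fits into the age;
    # ages below one second fall through to 'just now'.
    for name, seconds in AGESCALES:
        if seconds <= age_seconds:
            return name
    return 'just now'

print(coarsest_scale(90))
print(coarsest_scale(7200))
```

The real `_age` additionally steps `pos` one past the matched scale (clamped at `minute`) so the rendered string uses the next finer unit as its rounding granularity.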
326 | age = lambda x:_age(x) |
|
386 | age = lambda x:_age(x) | |
327 | capitalize = lambda x: x.capitalize() |
|
387 | capitalize = lambda x: x.capitalize() | |
328 | date = lambda x: util.datestr(x) |
|
|||
329 | email = util.email |
|
388 | email = util.email | |
330 | email_or_none = lambda x: util.email(x) if util.email(x) != x else None |
|
389 | email_or_none = lambda x: util.email(x) if util.email(x) != x else None | |
331 | person = lambda x: _person(x) |
|
390 | person = lambda x: _person(x) | |
332 | hgdate = lambda x: "%d %d" % x |
|
391 | short_id = lambda x: x[:12] | |
333 | isodate = lambda x: util.datestr(x, '%Y-%m-%d %H:%M %1%2') |
|
392 | ||
334 | isodatesec = lambda x: util.datestr(x, '%Y-%m-%d %H:%M:%S %1%2') |
|
393 | ||
335 | localdate = lambda x: (x[0], util.makedate()[1]) |
|
394 | def bool2icon(value): | |
336 | rfc822date = lambda x: util.datestr(x, "%a, %d %b %Y %H:%M:%S %1%2") |
|
395 | """ | |
337 | rfc822date_notz = lambda x: util.datestr(x, "%a, %d %b %Y %H:%M:%S") |
|
396 | Returns True/False values represented as small HTML images of true/false | |
338 | rfc3339date = lambda x: util.datestr(x, "%Y-%m-%dT%H:%M:%S%1:%2") |
|
397 | icons | |
339 | time_ago = lambda x: util.datestr(_age(x), "%a, %d %b %Y %H:%M:%S %1%2") |
|
398 | :param value: bool value | |
|
399 | """ | |||
|
400 | ||||
|
401 | if value is True: | |||
|
402 | return HTML.tag('img', src="/images/icons/accept.png", alt=_('True')) | |||
|
403 | ||||
|
404 | if value is False: | |||
|
405 | return HTML.tag('img', src="/images/icons/cancel.png", alt=_('False')) | |||
|
406 | ||||
|
407 | return value | |||
340 |
|
408 | |||
341 |
|
409 | |||
342 | #=============================================================================== |
|
410 | def action_parser(user_log): | |
|
411 | """ | |||
|
412 | This helper will map the specified string action into translated | |||
|
413 | fancy names with icons and links | |||
|
414 | ||||
|
415 | :param user_log: | |||
|
416 | """ | |||
|
417 | action = user_log.action | |||
|
418 | action_params = ' ' | |||
|
419 | ||||
|
420 | x = action.split(':') | |||
|
421 | ||||
|
422 | if len(x) > 1: | |||
|
423 | action, action_params = x | |||
|
424 | ||||
|
425 | def get_cs_links(): | |||
|
426 | if action == 'push': | |||
|
427 | revs_limit = 5 | |||
|
428 | revs = action_params.split(',') | |||
|
429 | cs_links = " " + ', '.join ([link(rev, | |||
|
430 | url('changeset_home', | |||
|
431 | repo_name=user_log.repository.repo_name, | |||
|
432 | revision=rev)) for rev in revs[:revs_limit] ]) | |||
|
433 | if len(revs) > revs_limit: | |||
|
434 | uniq_id = revs[0] | |||
|
435 | html_tmpl = ('<span> %s ' | |||
|
436 | '<a class="show_more" id="_%s" href="#">%s</a> ' | |||
|
437 | '%s</span>') | |||
|
438 | cs_links += html_tmpl % (_('and'), uniq_id, _('%s more') \ | |||
|
439 | % (len(revs) - revs_limit), | |||
|
440 | _('revisions')) | |||
|
441 | ||||
|
442 | html_tmpl = '<span id="%s" style="display:none"> %s </span>' | |||
|
443 | cs_links += html_tmpl % (uniq_id, ', '.join([link(rev, | |||
|
444 | url('changeset_home', | |||
|
445 | repo_name=user_log.repository.repo_name, | |||
|
446 | revision=rev)) for rev in revs[:revs_limit] ])) | |||
|
447 | ||||
|
448 | return cs_links | |||
|
449 | return '' | |||
|
450 | ||||
|
451 | def get_fork_name(): | |||
|
452 | if action == 'user_forked_repo': | |||
|
453 | from rhodecode.model.scm import ScmModel | |||
|
454 | repo_name = action_params | |||
|
455 | repo = ScmModel().get(repo_name) | |||
|
456 | if repo is None: | |||
|
457 | return repo_name | |||
|
458 | return link_to(action_params, url('summary_home', | |||
|
459 | repo_name=repo.name,), | |||
|
460 | title=repo.dbrepo.description) | |||
|
461 | return '' | |||
|
462 | map = {'user_deleted_repo':_('User [deleted] repository'), | |||
|
463 | 'user_created_repo':_('User [created] repository'), | |||
|
464 | 'user_forked_repo':_('User [forked] repository as: %s') % get_fork_name(), | |||
|
465 | 'user_updated_repo':_('User [updated] repository'), | |||
|
466 | 'admin_deleted_repo':_('Admin [deleted] repository'), | |||
|
467 | 'admin_created_repo':_('Admin [created] repository'), | |||
|
468 | 'admin_forked_repo':_('Admin [forked] repository'), | |||
|
469 | 'admin_updated_repo':_('Admin [updated] repository'), | |||
|
470 | 'push':_('[Pushed] %s') % get_cs_links(), | |||
|
471 | 'pull':_('[Pulled]'), | |||
|
472 | 'started_following_repo':_('User [started following] repository'), | |||
|
473 | 'stopped_following_repo':_('User [stopped following] repository'), | |||
|
474 | } | |||
|
475 | ||||
|
476 | action_str = map.get(action, action) | |||
|
477 | return literal(action_str.replace('[', '<span class="journal_highlight">')\ | |||
|
478 | .replace(']', '</span>')) | |||
|
479 | ||||
|
480 | def action_parser_icon(user_log): | |||
|
481 | action = user_log.action | |||
|
482 | action_params = None | |||
|
483 | x = action.split(':') | |||
|
484 | ||||
|
485 | if len(x) > 1: | |||
|
486 | action, action_params = x | |||
|
487 | ||||
|
488 | tmpl = """<img src="/images/icons/%s" alt="%s"/>""" | |||
|
489 | map = {'user_deleted_repo':'database_delete.png', | |||
|
490 | 'user_created_repo':'database_add.png', | |||
|
491 | 'user_forked_repo':'arrow_divide.png', | |||
|
492 | 'user_updated_repo':'database_edit.png', | |||
|
493 | 'admin_deleted_repo':'database_delete.png', | |||
|
494 | 'admin_created_repo':'database_ddd.png', | |||
|
495 | 'admin_forked_repo':'arrow_divide.png', | |||
|
496 | 'admin_updated_repo':'database_edit.png', | |||
|
497 | 'push':'script_add.png', | |||
|
498 | 'pull':'down_16.png', | |||
|
499 | 'started_following_repo':'heart_add.png', | |||
|
500 | 'stopped_following_repo':'heart_delete.png', | |||
|
501 | } | |||
|
502 | return literal(tmpl % (map.get(action, action), action)) | |||
|
503 | ||||
|
504 | ||||
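Both `action_parser` and `action_parser_icon` above decode the same journal encoding: an action name, optionally followed by `:` and parameters (for `push`, a comma-separated revision list). A minimal sketch of that decoding — `split_action` is invented here, and uses `partition` rather than the original's `split(':')`, so parameters that themselves contain `:` would survive:

```python
def split_action(raw):
    # Everything before the first ':' names the action; the remainder,
    # if present, carries the parameters (e.g. a comma-separated rev list).
    action, sep, params = raw.partition(':')
    return action, (params if sep else None)

print(split_action('push:aaa111,bbb222'))
print(split_action('pull'))
```

With the name and parameters separated, the parsers above only have to look the action up in their respective maps (translated label or icon filename).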
|
505 | #============================================================================== | |||
343 | # PERMS |
|
506 | # PERMS | |
344 |
#============================================================================== |
|
507 | #============================================================================== | |
345 | from rhodecode.lib.auth import HasPermissionAny, HasPermissionAll, \ |
|
508 | from rhodecode.lib.auth import HasPermissionAny, HasPermissionAll, \ | |
346 | HasRepoPermissionAny, HasRepoPermissionAll |
|
509 | HasRepoPermissionAny, HasRepoPermissionAll | |
347 |
|
510 | |||
348 |
#============================================================================== |
|
511 | #============================================================================== | |
349 | # GRAVATAR URL |
|
512 | # GRAVATAR URL | |
350 |
#============================================================================== |
|
513 | #============================================================================== | |
351 | import hashlib |
|
514 | import hashlib | |
352 | import urllib |
|
515 | import urllib | |
353 | from pylons import request |
|
516 | from pylons import request |
@@ -22,12 +22,12 b' Created on Aug 6, 2010' | |||||
22 |
|
22 | |||
23 | @author: marcink |
|
23 | @author: marcink | |
24 | """ |
|
24 | """ | |
25 |
|
25 | from mercurial.cmdutil import revrange | ||
26 | import sys |
|
26 | from mercurial.node import nullrev | |
|
27 | from rhodecode.lib import helpers as h | |||
|
28 | from rhodecode.lib.utils import action_logger | |||
27 | import os |
|
29 | import os | |
28 | from rhodecode.lib import helpers as h |
|
30 | import sys | |
29 | from rhodecode.model import meta |
|
|||
30 | from rhodecode.model.db import UserLog, User |
|
|||
31 |
|
31 | |||
32 | def repo_size(ui, repo, hooktype=None, **kwargs): |
|
32 | def repo_size(ui, repo, hooktype=None, **kwargs): | |
33 |
|
33 | |||
@@ -53,32 +53,53 b' def repo_size(ui, repo, hooktype=None, *' | |||||
53 | size_total_f = h.format_byte_size(size_root + size_hg) |
|
53 | size_total_f = h.format_byte_size(size_root + size_hg) | |
54 | sys.stdout.write('Repository size .hg:%s repo:%s total:%s\n' \ |
|
54 | sys.stdout.write('Repository size .hg:%s repo:%s total:%s\n' \ | |
55 | % (size_hg_f, size_root_f, size_total_f)) |
|
55 | % (size_hg_f, size_root_f, size_total_f)) | |
56 |
|
56 | |||
57 |
|
|
57 | def log_pull_action(ui, repo, **kwargs): | |
|
58 | """ | |||
|
59 | Logs user last pull action | |||
|
60 | :param ui: | |||
|
61 | :param repo: | |||
|
62 | """ | |||
58 |
|
63 | |||
59 | def user_action_mapper(ui, repo, hooktype=None, **kwargs): |
|
64 | extra_params = dict(repo.ui.configitems('rhodecode_extras')) | |
|
65 | username = extra_params['username'] | |||
|
66 | repository = extra_params['repository'] | |||
|
67 | action = 'pull' | |||
|
68 | ||||
|
69 | action_logger(username, action, repository, extra_params['ip']) | |||
|
70 | ||||
|
71 | return 0 | |||
|
72 | ||||
|
73 | def log_push_action(ui, repo, **kwargs): | |||
60 | """ |
|
74 | """ | |
61 | Maps user last push action to new changeset id, from mercurial |
|
75 | Maps user last push action to new changeset id, from mercurial | |
62 | :param ui: |
|
76 | :param ui: | |
63 | :param repo: |
|
77 | :param repo: | |
64 | :param hooktype: |
|
|||
65 | """ |
|
78 | """ | |
66 |
|
79 | |||
67 | try: |
|
80 | extra_params = dict(repo.ui.configitems('rhodecode_extras')) | |
68 | sa = meta.Session |
|
81 | username = extra_params['username'] | |
69 | username = kwargs['url'].split(':')[-1] |
|
82 | repository = extra_params['repository'] | |
70 | user_log = sa.query(UserLog)\ |
|
83 | action = 'push:%s' | |
71 | .filter(UserLog.user == sa.query(User)\ |
|
84 | node = kwargs['node'] | |
72 | .filter(User.username == username).one())\ |
|
85 | ||
73 | .order_by(UserLog.user_log_id.desc()).first() |
|
86 | def get_revs(repo, rev_opt): | |
74 |
|
87 | if rev_opt: | ||
75 | if user_log and not user_log.revision: |
|
88 | revs = revrange(repo, rev_opt) | |
76 | user_log.revision = str(repo['tip']) |
|
89 | ||
77 | sa.add(user_log) |
|
90 | if len(revs) == 0: | |
78 | sa.commit() |
|
91 | return (nullrev, nullrev) | |
79 |
|
92 | return (max(revs), min(revs)) | ||
80 | except Exception, e: |
|
93 | else: | |
81 | sa.rollback() |
|
94 | return (len(repo) - 1, 0) | |
82 | raise |
|
95 | ||
83 | finally: |
|
96 | stop, start = get_revs(repo, [node + ':']) | |
84 | meta.Session.remove() |
|
97 | ||
|
98 | revs = (str(repo[r]) for r in xrange(start, stop + 1)) | |||
|
99 | ||||
|
100 | action = action % ','.join(revs) | |||
|
101 | ||||
|
102 | action_logger(username, action, repository, extra_params['ip']) | |||
|
103 | ||||
|
104 | return 0 | |||
|
105 |
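`log_push_action` above turns the inclusive `(start, stop)` revision range computed by `get_revs` into a `push:rev1,rev2,...` action string. A sketch of just that formatting step, with a plain list standing in for the repository's changesets — `build_push_action` and the sample ids are hypothetical:

```python
def build_push_action(short_ids, start, stop):
    # Inclusive revision range -> "push:<id>,<id>,...", mirroring how
    # log_push_action() joins str(repo[r]) over xrange(start, stop + 1).
    revs = (short_ids[r] for r in range(start, stop + 1))
    return 'push:%s' % ','.join(revs)

ids = ['aaa111', 'bbb222', 'ccc333', 'ddd444']
print(build_push_action(ids, 1, 3))
```

The resulting string is what `action_parser` later splits back apart to render per-revision changeset links in the journal.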
@@ -1,26 +1,28 b'' | |||||
|
1 | import os | |||
|
2 | import sys | |||
|
3 | import traceback | |||
1 | from os.path import dirname as dn, join as jn |
|
4 | from os.path import dirname as dn, join as jn | |
|
5 | ||||
|
6 | #to get the rhodecode import | |||
|
7 | sys.path.append(dn(dn(dn(os.path.realpath(__file__))))) | |||
|
8 | ||||
|
9 | from rhodecode.model import init_model | |||
|
10 | from rhodecode.model.scm import ScmModel | |||
2 | from rhodecode.config.environment import load_environment |
|
11 | from rhodecode.config.environment import load_environment | |
3 | from rhodecode.model.hg_model import HgModel |
|
12 | from rhodecode.lib.utils import BasePasterCommand, Command, add_cache | |
|
13 | ||||
4 | from shutil import rmtree |
|
14 | from shutil import rmtree | |
5 | from webhelpers.html.builder import escape |
|
15 | from webhelpers.html.builder import escape | |
6 | from vcs.utils.lazy import LazyProperty |
|
16 | from vcs.utils.lazy import LazyProperty | |
7 |
|
17 | |||
|
18 | from sqlalchemy import engine_from_config | |||
|
19 | ||||
8 | from whoosh.analysis import RegexTokenizer, LowercaseFilter, StopFilter |
|
20 | from whoosh.analysis import RegexTokenizer, LowercaseFilter, StopFilter | |
9 | from whoosh.fields import TEXT, ID, STORED, Schema, FieldType |
|
21 | from whoosh.fields import TEXT, ID, STORED, Schema, FieldType | |
10 | from whoosh.index import create_in, open_dir |
|
22 | from whoosh.index import create_in, open_dir | |
11 | from whoosh.formats import Characters |
|
23 | from whoosh.formats import Characters | |
12 |
from whoosh.highlight import highlight, SimpleFragmenter, HtmlFormatter |
|
24 | from whoosh.highlight import highlight, SimpleFragmenter, HtmlFormatter | |
13 |
|
||||
14 | import os |
|
|||
15 | import sys |
|
|||
16 | import traceback |
|
|||
17 |
|
25 | |||
18 | #to get the rhodecode import |
|
|||
19 | sys.path.append(dn(dn(dn(os.path.realpath(__file__))))) |
|
|||
20 |
|
||||
21 |
|
||||
22 | #LOCATION WE KEEP THE INDEX |
|
|||
23 | IDX_LOCATION = jn(dn(dn(dn(dn(os.path.abspath(__file__))))), 'data', 'index') |
|
|||
24 |
|
26 | |||
25 | #EXTENSIONS WE WANT TO INDEX CONTENT OFF |
|
27 | #EXTENSIONS WE WANT TO INDEX CONTENT OF | 
26 | INDEX_EXTENSIONS = ['action', 'adp', 'ashx', 'asmx', 'aspx', 'asx', 'axd', 'c', |
|
28 | INDEX_EXTENSIONS = ['action', 'adp', 'ashx', 'asmx', 'aspx', 'asx', 'axd', 'c', | |
@@ -45,9 +47,58 b' SCHEMA = Schema(owner=TEXT(),' | |||||
45 |
|
47 | |||
46 |
|
48 | |||
47 | IDX_NAME = 'HG_INDEX' |
|
49 | IDX_NAME = 'HG_INDEX' | |
48 |
FORMATTER = HtmlFormatter('span', between='\n<span class="break">...</span>\n') |
|
50 | FORMATTER = HtmlFormatter('span', between='\n<span class="break">...</span>\n') | |
49 | FRAGMENTER = SimpleFragmenter(200) |
|
51 | FRAGMENTER = SimpleFragmenter(200) | |
50 |
|
52 | |||
|
53 | ||||
|
54 | class MakeIndex(BasePasterCommand): | |||
|
55 | ||||
|
56 | max_args = 1 | |||
|
57 | min_args = 1 | |||
|
58 | ||||
|
59 | usage = "CONFIG_FILE" | |||
|
60 | summary = "Creates index for full text search given configuration file" | |||
|
61 | group_name = "RhodeCode" | |||
|
62 | takes_config_file = -1 | |||
|
63 | parser = Command.standard_parser(verbose=True) | |||
|
64 | ||||
|
65 | def command(self): | |||
|
66 | ||||
|
67 | from pylons import config | |||
|
68 | add_cache(config) | |||
|
69 | engine = engine_from_config(config, 'sqlalchemy.db1.') | |||
|
70 | init_model(engine) | |||
|
71 | ||||
|
72 | index_location = config['index_dir'] | |||
|
73 | repo_location = self.options.repo_location | |||
|
74 | ||||
|
75 | #====================================================================== | |||
|
76 | # WHOOSH DAEMON | |||
|
77 | #====================================================================== | |||
|
78 | from rhodecode.lib.pidlock import LockHeld, DaemonLock | |||
|
79 | from rhodecode.lib.indexers.daemon import WhooshIndexingDaemon | |||
|
80 | try: | |||
|
81 | l = DaemonLock() | |||
|
82 | WhooshIndexingDaemon(index_location=index_location, | |||
|
83 | repo_location=repo_location)\ | |||
|
84 | .run(full_index=self.options.full_index) | |||
|
85 | l.release() | |||
|
86 | except LockHeld: | |||
|
87 | sys.exit(1) | |||
|
88 | ||||
|
89 | def update_parser(self): | |||
|
90 | self.parser.add_option('--repo-location', | |||
|
91 | action='store', | |||
|
92 | dest='repo_location', | |||
|
93 | help="Specifies repositories location to index (REQUIRED)", | |||
|
94 | ) | |||
|
95 | self.parser.add_option('-f', | |||
|
96 | action='store_true', | |||
|
97 | dest='full_index', | |||
|
98 | help="Specifies that index should be made full, i.e." | |||
|
99 | " destroy old and build from scratch", | |||
|
100 | default=False) | |||
|
101 | ||||
51 | class ResultWrapper(object): |
|
102 | class ResultWrapper(object): | |
52 | def __init__(self, search_type, searcher, matcher, highlight_items): |
|
103 | def __init__(self, search_type, searcher, matcher, highlight_items): | |
53 | self.search_type = search_type |
|
104 | self.search_type = search_type | |
@@ -55,7 +106,7 @@ class ResultWrapper(object):
         self.matcher = matcher
         self.highlight_items = highlight_items
         self.fragment_size = 200 / 2

     @LazyProperty
     def doc_ids(self):
         docs_id = []
@@ -64,8 +115,8 @@ class ResultWrapper(object):
         chunks = [offsets for offsets in self.get_chunks()]
         docs_id.append([docnum, chunks])
         self.matcher.next()
         return docs_id

     def __str__(self):
         return '<%s at %s>' % (self.__class__.__name__, len(self.doc_ids))

@@ -91,32 +142,32 @@ class ResultWrapper(object):
         slice = []
         for docid in self.doc_ids[i:j]:
             slice.append(self.get_full_content(docid))
         return slice


     def get_full_content(self, docid):
         res = self.searcher.stored_fields(docid[0])
         f_path = res['path'][res['path'].find(res['repository']) \
                              + len(res['repository']):].lstrip('/')

         content_short = self.get_short_content(res, docid[1])
         res.update({'content_short':content_short,
                     'content_short_hl':self.highlight(content_short),
                     'f_path':f_path})

         return res

     def get_short_content(self, res, chunks):

         return ''.join([res['content'][chunk[0]:chunk[1]] for chunk in chunks])

     def get_chunks(self):
         """
         Smart function that implements chunking the content
         but not overlap chunks so it doesn't highlight the same
         close occurrences twice.
-
-
+        @param matcher:
+        @param size:
         """
         memory = [(0, 0)]
         for span in self.matcher.spans():
@@ -124,12 +175,12 @@ class ResultWrapper(object):
             end = span.endchar or 0
             start_offseted = max(0, start - self.fragment_size)
             end_offseted = end + self.fragment_size

             if start_offseted < memory[-1][1]:
                 start_offseted = memory[-1][1]
             memory.append((start_offseted, end_offseted,))
             yield (start_offseted, end_offseted,)

     def highlight(self, content, top=5):
         if self.search_type != 'content':
             return ''
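The `get_chunks` hunks above widen each match span by `fragment_size` on both sides and clamp a window's start to the previous window's end, so close-together matches never produce overlapping highlighted fragments. A runnable sketch of that clamping logic (the function name and sample spans are illustrative, not RhodeCode API):

```python
def non_overlapping_chunks(spans, fragment_size=100):
    """Yield (start, end) windows around match spans, clamping each
    window's start to the previous window's end so no two windows
    overlap (mirrors ResultWrapper.get_chunks)."""
    last_end = 0
    for start, end in spans:
        win_start = max(0, start - fragment_size)
        win_end = end + fragment_size
        # a window reaching into the previous one is clipped, not merged
        if win_start < last_end:
            win_start = last_end
        last_end = win_end
        yield (win_start, win_end)
```

Two matches 40 characters apart thus share a boundary instead of duplicating the text between them in both fragments.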
@@ -139,4 +190,4 @@ class ResultWrapper(object):
                       fragmenter=FRAGMENTER,
                       formatter=FORMATTER,
                       top=top)
         return hl
@@ -32,12 +32,12 @@ from os.path import join as jn
 project_path = dn(dn(dn(dn(os.path.realpath(__file__)))))
 sys.path.append(project_path)

-from rhodecode.lib.pidlock import LockHeld, DaemonLock
-from rhodecode.model.hg_model import HgModel
+
+from rhodecode.model.scm import ScmModel
 from rhodecode.lib.helpers import safe_unicode
 from whoosh.index import create_in, open_dir
 from shutil import rmtree
-from rhodecode.lib.indexers import INDEX_EXTENSIONS, IDX_LOCATION, SCHEMA, IDX_NAME
+from rhodecode.lib.indexers import INDEX_EXTENSIONS, SCHEMA, IDX_NAME

 from time import mktime
 from vcs.exceptions import ChangesetError, RepositoryError
@@ -61,55 +61,59 @@ ch.setFormatter(formatter)
 # add ch to logger
 log.addHandler(ch)

-def scan_paths(root_location):
-    return HgModel.repo_scan('/', root_location, None, True)
-
 class WhooshIndexingDaemon(object):
     """
     Deamon for atomic jobs
     """

-    def __init__(self, indexname='HG_INDEX', repo_location=None):
+    def __init__(self, indexname='HG_INDEX', index_location=None,
+                 repo_location=None, sa=None):
         self.indexname = indexname
+
+        self.index_location = index_location
+        if not index_location:
+            raise Exception('You have to provide index location')
+
         self.repo_location = repo_location
-        self.repo_paths = scan_paths(self.repo_location)
+        if not repo_location:
+            raise Exception('You have to provide repositories location')
+
+        self.repo_paths = ScmModel(sa).repo_scan(self.repo_location, None)
         self.initial = False
-        if not os.path.isdir(IDX_LOCATION):
-            os.mkdir(IDX_LOCATION)
+        if not os.path.isdir(self.index_location):
+            os.makedirs(self.index_location)
             log.info('Cannot run incremental index since it does not'
                      ' yet exist running full build')
             self.initial = True

     def get_paths(self, repo):
-        """
-        recursive walk in root dir and return a set of all path in that dir
+        """recursive walk in root dir and return a set of all path in that dir
         based on repository walk function
         """
         index_paths_ = set()
         try:
-            tip = repo.get_changeset()
-
-            for topnode, dirs, files in tip.walk('/'):
+            for topnode, dirs, files in repo.walk('/', 'tip'):
                 for f in files:
                     index_paths_.add(jn(repo.path, f.path))
                 for dir in dirs:
                     for f in files:
                         index_paths_.add(jn(repo.path, f.path))

         except RepositoryError:
             pass
         return index_paths_

     def get_node(self, repo, path):
         n_path = path[len(repo.path) + 1:]
         node = repo.get_changeset().get_node(n_path)
         return node

     def get_node_mtime(self, node):
         return mktime(node.last_changeset.date.timetuple())

     def add_doc(self, writer, path, repo):
-        """Adding doc to writer"""
+        """Adding doc to writer this function itself fetches data from
+        the instance of vcs backend"""
         node = self.get_node(repo, path)

         #we just index the content of chosen files
@@ -120,63 +124,63 @@ class WhooshIndexingDaemon(object):
         log.debug(' >> %s' % path)
         #just index file name without it's content
         u_content = u''

         writer.add_document(owner=unicode(repo.contact),
                             repository=safe_unicode(repo.name),
                             path=safe_unicode(path),
                             content=u_content,
                             modtime=self.get_node_mtime(node),
                             extension=node.extension)


     def build_index(self):
-        if os.path.exists(IDX_LOCATION):
+        if os.path.exists(self.index_location):
             log.debug('removing previous index')
-            rmtree(IDX_LOCATION)
+            rmtree(self.index_location)

-        if not os.path.exists(IDX_LOCATION):
-            os.mkdir(IDX_LOCATION)
+        if not os.path.exists(self.index_location):
+            os.mkdir(self.index_location)

-        idx = create_in(IDX_LOCATION, SCHEMA, indexname=IDX_NAME)
+        idx = create_in(self.index_location, SCHEMA, indexname=IDX_NAME)
         writer = idx.writer()

         for cnt, repo in enumerate(self.repo_paths.values()):
             log.debug('building index @ %s' % repo.path)

             for idx_path in self.get_paths(repo):
                 self.add_doc(writer, idx_path, repo)

         log.debug('>> COMMITING CHANGES <<')
         writer.commit(merge=True)
         log.debug('>>> FINISHED BUILDING INDEX <<<')


     def update_index(self):
         log.debug('STARTING INCREMENTAL INDEXING UPDATE')

-        idx = open_dir(IDX_LOCATION, indexname=self.indexname)
+        idx = open_dir(self.index_location, indexname=self.indexname)
         # The set of all paths in the index
         indexed_paths = set()
         # The set of all paths we need to re-index
         to_index = set()

         reader = idx.reader()
         writer = idx.writer()

         # Loop over the stored fields in the index
         for fields in reader.all_stored_fields():
             indexed_path = fields['path']
             indexed_paths.add(indexed_path)

             repo = self.repo_paths[fields['repository']]

             try:
                 node = self.get_node(repo, indexed_path)
             except ChangesetError:
                 # This file was deleted since it was indexed
                 log.debug('removing from index %s' % indexed_path)
                 writer.delete_by_term('path', indexed_path)

             else:
                 # Check if this file was changed since it was indexed
                 indexed_time = fields['modtime']
@@ -187,7 +191,7 @@ class WhooshIndexingDaemon(object):
                     log.debug('adding to reindex list %s' % indexed_path)
                     writer.delete_by_term('path', indexed_path)
                     to_index.add(indexed_path)

         # Loop over the files in the filesystem
         # Assume we have a function that gathers the filenames of the
         # documents to be indexed
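`update_index` makes three decisions per document: stored paths whose node lookup raises `ChangesetError` are deleted, stored paths whose current mtime is newer than the indexed `modtime` are re-indexed, and filesystem paths absent from the index are added. That decision table, sketched over plain dicts (function and variable names are illustrative):

```python
def plan_incremental_update(stored, current):
    """stored maps indexed path -> indexed mtime; current maps on-disk
    path -> current mtime. Returns (to_delete, to_reindex, to_add),
    mirroring the three branches of WhooshIndexingDaemon.update_index."""
    to_delete = set(stored) - set(current)            # node lookup would fail
    to_reindex = {p for p in stored
                  if p in current and current[p] > stored[p]}
    to_add = set(current) - set(stored)               # never indexed before
    return to_delete, to_reindex, to_add
```

In the daemon itself, deletion and re-indexing both go through `writer.delete_by_term('path', ...)` first, so a changed document is removed and then added back with fresh content.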
@@ -198,51 +202,14 @@ class WhooshIndexingDaemon(object):
                 # that wasn't indexed before. So index it!
                 self.add_doc(writer, path, repo)
                 log.debug('re indexing %s' % path)

         log.debug('>> COMMITING CHANGES <<')
         writer.commit(merge=True)
         log.debug('>>> FINISHED REBUILDING INDEX <<<')

     def run(self, full_index=False):
         """Run daemon"""
         if full_index or self.initial:
             self.build_index()
         else:
             self.update_index()
-
-if __name__ == "__main__":
-    arg = sys.argv[1:]
-    if len(arg) != 2:
-        sys.stderr.write('Please specify indexing type [full|incremental]'
-                         'and path to repositories as script args \n')
-        sys.exit()
-
-
-    if arg[0] == 'full':
-        full_index = True
-    elif arg[0] == 'incremental':
-        # False means looking just for changes
-        full_index = False
-    else:
-        sys.stdout.write('Please use [full|incremental]'
-                         ' as script first arg \n')
-        sys.exit()
-
-    if not os.path.isdir(arg[1]):
-        sys.stderr.write('%s is not a valid path \n' % arg[1])
-        sys.exit()
-    else:
-        if arg[1].endswith('/'):
-            repo_location = arg[1] + '*'
-        else:
-            repo_location = arg[1] + '/*'
-
-    try:
-        l = DaemonLock()
-        WhooshIndexingDaemon(repo_location=repo_location)\
-            .run(full_index=full_index)
-        l.release()
-        reload(logging)
-    except LockHeld:
-        sys.exit(1)
-
@@ -17,6 +17,14 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
 # MA 02110-1301, USA.
+"""
+Created on 2010-04-28
+
+@author: marcink
+SimpleGit middleware for handling git protocol request (push/clone etc.)
+It's implemented with basic auth function
+"""
+
 from dulwich import server as dulserver

 class SimpleGitUploadPackHandler(dulserver.UploadPackHandler):
@@ -54,26 +62,28 @@ from dulwich.repo import Repo
 from dulwich.web import HTTPGitApplication
 from paste.auth.basic import AuthBasicAuthenticator
 from paste.httpheaders import REMOTE_USER, AUTH_TYPE
-from rhodecode.lib.auth import authfunc, HasPermissionAnyMiddleware, \
-    get_user_cached
-from rhodecode.lib.utils import action_logger, is_git, invalidate_cache, \
-    check_repo_fast
+from rhodecode.lib.auth import authfunc, HasPermissionAnyMiddleware
+from rhodecode.lib.utils import invalidate_cache, check_repo_fast
+from rhodecode.model.user import UserModel
 from webob.exc import HTTPNotFound, HTTPForbidden, HTTPInternalServerError
 import logging
 import os
 import traceback
-"""
-Created on 2010-04-28
-
-@author: marcink
-SimpleGit middleware for handling git protocol request (push/clone etc.)
-It's implemented with basic auth function
-"""
-
-
-
+
 log = logging.getLogger(__name__)

+def is_git(environ):
+    """
+    Returns True if request's target is git server. ``HTTP_USER_AGENT`` would
+    then have git client version given.
+
+    :param environ:
+    """
+    http_user_agent = environ.get('HTTP_USER_AGENT')
+    if http_user_agent and http_user_agent.startswith('git'):
+        return True
+    return False
+
 class SimpleGit(object):

     def __init__(self, application, config):
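The new module-level `is_git` helper dispatches on the client's User-Agent header; extracted as-is, it can be exercised against minimal WSGI environ dicts:

```python
def is_git(environ):
    """True when the request comes from a git client, which identifies
    itself via an HTTP_USER_AGENT header starting with 'git'."""
    http_user_agent = environ.get('HTTP_USER_AGENT')
    if http_user_agent and http_user_agent.startswith('git'):
        return True
    return False
```

Any non-git request falls through to the wrapped WSGI application untouched, so the middleware is transparent to normal browser traffic.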
@@ -81,11 +91,19 @@ class SimpleGit(object):
         self.config = config
         #authenticate this git request using
         self.authenticate = AuthBasicAuthenticator('', authfunc)
+        self.ipaddr = '0.0.0.0'
+        self.repository = None
+        self.username = None
+        self.action = None

     def __call__(self, environ, start_response):
         if not is_git(environ):
             return self.application(environ, start_response)

+        proxy_key = 'HTTP_X_REAL_IP'
+        def_key = 'REMOTE_ADDR'
+        self.ipaddr = environ.get(proxy_key, environ.get(def_key, '0.0.0.0'))
+
         #===================================================================
         # AUTHENTICATE THIS GIT REQUEST
         #===================================================================
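The `__call__` change above records the client address once per request, preferring a reverse proxy's `X-Real-IP` header over `REMOTE_ADDR` and falling back to a placeholder. The lookup chain as a standalone helper (the function name is illustrative):

```python
def client_ip(environ, default='0.0.0.0'):
    """Resolve the client address from a WSGI environ: prefer the
    reverse proxy's X-Real-IP, then REMOTE_ADDR, then a default."""
    proxy_key = 'HTTP_X_REAL_IP'
    def_key = 'REMOTE_ADDR'
    return environ.get(proxy_key, environ.get(def_key, default))
```

Behind nginx, `REMOTE_ADDR` would otherwise always be the proxy's own address, which is why the header takes precedence.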
@@ -99,10 +117,14 @@ class SimpleGit(object):
         else:
             return result.wsgi_application(environ, start_response)

+        #=======================================================================
+        # GET REPOSITORY
+        #=======================================================================
         try:
-            self.repo_name = '/'.join(environ['PATH_INFO'].split('/')[1:])
-            if self.repo_name.endswith('/'):
-                self.repo_name = self.repo_name.rstrip('/')
+            repo_name = '/'.join(environ['PATH_INFO'].split('/')[1:])
+            if repo_name.endswith('/'):
+                repo_name = repo_name.rstrip('/')
+            self.repository = repo_name
         except:
             log.error(traceback.format_exc())
             return HTTPInternalServerError()(environ, start_response)
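The GET REPOSITORY block derives the repository name from `PATH_INFO` by dropping the leading empty segment and any trailing slash; as a standalone function (the name is illustrative):

```python
def repo_name_from_path_info(path_info):
    """Extract the repository name from a WSGI PATH_INFO value,
    as in the GET REPOSITORY block of the middleware."""
    repo_name = '/'.join(path_info.split('/')[1:])
    if repo_name.endswith('/'):
        repo_name = repo_name.rstrip('/')
    return repo_name
```

Joining everything after the first `/` keeps nested names like `group/myrepo` intact instead of taking only the first segment.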
@@ -110,20 +132,21 @@ class SimpleGit(object):
         #===================================================================
         # CHECK PERMISSIONS FOR THIS REQUEST
         #===================================================================
-        action = self.__get_action(environ)
-        if action:
+        self.action = self.__get_action(environ)
+        if self.action:
             username = self.__get_environ_user(environ)
             try:
                 user = self.__get_user(username)
+                self.username = user.username
             except:
                 log.error(traceback.format_exc())
                 return HTTPInternalServerError()(environ, start_response)

             #check permissions for this repository
-            if action == 'push':
+            if self.action == 'push':
                 if not HasPermissionAnyMiddleware('repository.write',
                                                   'repository.admin')\
-                        (user, self.repo_name):
+                        (user, repo_name):
                     return HTTPForbidden()(environ, start_response)

             else:
@@ -131,15 +154,13 @@ class SimpleGit(object):
                 if not HasPermissionAnyMiddleware('repository.read',
                                                   'repository.write',
                                                   'repository.admin')\
-                        (user, self.repo_name):
+                        (user, repo_name):
                     return HTTPForbidden()(environ, start_response)

-            #log action
-            if action in ('push', 'pull', 'clone'):
-                proxy_key = 'HTTP_X_REAL_IP'
-                def_key = 'REMOTE_ADDR'
-                ipaddr = environ.get(proxy_key, environ.get(def_key, '0.0.0.0'))
-                self.__log_user_action(user, action, self.repo_name, ipaddr)
+        self.extras = {'ip':self.ipaddr,
+                       'username':self.username,
+                       'action':self.action,
+                       'repository':self.repository}

         #===================================================================
         # GIT REQUEST HANDLING
@@ -151,12 +172,12 @@ class SimpleGit(object):
             return HTTPNotFound()(environ, start_response)
         try:
             app = self.__make_app()
-        except Exception:
+        except:
             log.error(traceback.format_exc())
             return HTTPInternalServerError()(environ, start_response)

         #invalidate cache on push
-        if action == 'push':
+        if self.action == 'push':
             self.__invalidate_cache(self.repo_name)
             messages = []
             messages.append('thank you for using rhodecode')
@@ -175,7 +196,7 @@ class SimpleGit(object):
         return environ.get('REMOTE_USER')

     def __get_user(self, username):
-        return get_user_cached(username)
+        return UserModel().get_by_username(username, cache=True)

     def __get_action(self, environ):
         """
@@ -193,12 +214,8 @@ class SimpleGit(object):
         else:
             return 'other'

-    def __log_user_action(self, user, action, repo, ipaddr):
-        action_logger(user, action, repo, ipaddr)
-
     def __invalidate_cache(self, repo_name):
         """we know that some change was made to repositories and we should
         invalidate the cache to see the changes right away but only for
         push requests"""
-        invalidate_cache('
-        invalidate_cache('full_changelog', repo_name)
+        invalidate_cache('get_repo_cached_%s' % repo_name)
@@ -24,40 +24,57 @@ Created on 2010-04-28
 SimpleHG middleware for handling mercurial protocol request (push/clone etc.)
 It's implemented with basic auth function
 """
-from itertools import chain
 from mercurial.error import RepoError
 from mercurial.hgweb import hgweb
 from mercurial.hgweb.request import wsgiapplication
 from paste.auth.basic import AuthBasicAuthenticator
 from paste.httpheaders import REMOTE_USER, AUTH_TYPE
-from rhodecode.lib.auth import authfunc, HasPermissionAnyMiddleware, \
-    get_user_cached
-from rhodecode.lib.utils import is_mercurial, make_ui, invalidate_cache, \
-    check_repo_fast, ui_sections
+from rhodecode.lib.auth import authfunc, HasPermissionAnyMiddleware
+from rhodecode.lib.utils import make_ui, invalidate_cache, \
+    check_repo_fast, ui_sections
+from rhodecode.model.user import UserModel
 from webob.exc import HTTPNotFound, HTTPForbidden, HTTPInternalServerError
-from rhodecode.lib.utils import action_logger
 import logging
 import os
 import traceback

 log = logging.getLogger(__name__)

+def is_mercurial(environ):
+    """
+    Returns True if request's target is mercurial server - header
+    ``HTTP_ACCEPT`` of such request would start with ``application/mercurial``.
+    """
+    http_accept = environ.get('HTTP_ACCEPT')
+    if http_accept and http_accept.startswith('application/mercurial'):
+        return True
+    return False
+
 class SimpleHg(object):

     def __init__(self, application, config):
         self.application = application
         self.config = config
-        #authenticate this mercurial request using
+        #authenticate this mercurial request using authfunc
         self.authenticate = AuthBasicAuthenticator('', authfunc)
+        self.ipaddr = '0.0.0.0'
+        self.repository = None
+        self.username = None
+        self.action = None

     def __call__(self, environ, start_response):
         if not is_mercurial(environ):
             return self.application(environ, start_response)

+        proxy_key = 'HTTP_X_REAL_IP'
+        def_key = 'REMOTE_ADDR'
+        self.ipaddr = environ.get(proxy_key, environ.get(def_key, '0.0.0.0'))
+
         #===================================================================
         # AUTHENTICATE THIS MERCURIAL REQUEST
         #===================================================================
         username = REMOTE_USER(environ)
+
         if not username:
             self.authenticate.realm = self.config['rhodecode_realm']
             result = self.authenticate(environ)
@@ -67,10 +84,14 b' class SimpleHg(object):' | |||||
67 | else: |
|
84 | else: | |
68 | return result.wsgi_application(environ, start_response) |
|
85 | return result.wsgi_application(environ, start_response) | |
69 |
|
86 | |||
|
87 | #======================================================================= | |||
|
88 | # GET REPOSITORY | |||
|
89 | #======================================================================= | |||
70 | try: |
|
90 | try: | |
71 | repo_name = '/'.join(environ['PATH_INFO'].split('/')[1:]) |
|
91 | repo_name = '/'.join(environ['PATH_INFO'].split('/')[1:]) | |
72 | if repo_name.endswith('/'): |
|
92 | if repo_name.endswith('/'): | |
73 | repo_name = repo_name.rstrip('/') |
|
93 | repo_name = repo_name.rstrip('/') | |
|
94 | self.repository = repo_name | |||
74 | except: |
|
95 | except: | |
75 | log.error(traceback.format_exc()) |
|
96 | log.error(traceback.format_exc()) | |
76 | return HTTPInternalServerError()(environ, start_response) |
|
97 | return HTTPInternalServerError()(environ, start_response) | |
@@ -78,17 +99,18 b' class SimpleHg(object):' | |||||
78 | #=================================================================== |
|
99 | #=================================================================== | |
79 | # CHECK PERMISSIONS FOR THIS REQUEST |
|
100 | # CHECK PERMISSIONS FOR THIS REQUEST | |
80 | #=================================================================== |
|
101 | #=================================================================== | |
81 | action = self.__get_action(environ) |
|
102 | self.action = self.__get_action(environ) | |
82 | if action: |
|
103 | if self.action: | |
83 | username = self.__get_environ_user(environ) |
|
104 | username = self.__get_environ_user(environ) | |
84 | try: |
|
105 | try: | |
85 | user = self.__get_user(username) |
|
106 | user = self.__get_user(username) | |
|
107 | self.username = user.username | |||
86 | except: |
|
108 | except: | |
87 | log.error(traceback.format_exc()) |
|
109 | log.error(traceback.format_exc()) | |
88 | return HTTPInternalServerError()(environ, start_response) |
|
110 | return HTTPInternalServerError()(environ, start_response) | |
89 |
|
111 | |||
90 | #check permissions for this repository |
|
112 | #check permissions for this repository | |
91 | if action == 'push': |
|
113 | if self.action == 'push': | |
92 | if not HasPermissionAnyMiddleware('repository.write', |
|
114 | if not HasPermissionAnyMiddleware('repository.write', | |
93 | 'repository.admin')\ |
|
115 | 'repository.admin')\ | |
94 | (user, repo_name): |
|
116 | (user, repo_name): | |
@@ -102,12 +124,10 b' class SimpleHg(object):' | |||||
102 | (user, repo_name): |
|
124 | (user, repo_name): | |
103 | return HTTPForbidden()(environ, start_response) |
|
125 | return HTTPForbidden()(environ, start_response) | |
104 |
|
126 | |||
105 | #log action |
|
127 | self.extras = {'ip':self.ipaddr, | |
106 | if action in ('push', 'pull', 'clone'): |
|
128 | 'username':self.username, | |
107 | proxy_key = 'HTTP_X_REAL_IP' |
|
129 | 'action':self.action, | |
108 | def_key = 'REMOTE_ADDR' |
|
130 | 'repository':self.repository} | |
109 | ipaddr = environ.get(proxy_key, environ.get(def_key, '0.0.0.0')) |
|
|||
110 | self.__log_user_action(user, action, repo_name, ipaddr) |
|
|||
111 |
|
131 | |||
112 | #=================================================================== |
|
132 | #=================================================================== | |
113 | # MERCURIAL REQUEST HANDLING |
|
133 | # MERCURIAL REQUEST HANDLING | |
@@ -130,40 +150,21 b' class SimpleHg(object):' | |||||
130 | return HTTPInternalServerError()(environ, start_response) |
|
150 | return HTTPInternalServerError()(environ, start_response) | |
131 |
|
151 | |||
132 | #invalidate cache on push |
|
152 | #invalidate cache on push | |
133 | if action == 'push': |
|
153 | if self.action == 'push': | |
134 | self.__invalidate_cache(repo_name) |
|
154 | self.__invalidate_cache(repo_name) | |
135 | messages = [] |
|
|||
136 | messages.append('thank you for using rhodecode') |
|
|||
137 |
|
||||
138 | return self.msg_wrapper(app, environ, start_response, messages) |
|
|||
139 | else: |
|
|||
140 | return app(environ, start_response) |
|
|||
141 |
|
||||
142 |
|
155 | |||
143 |
|
|
156 | return app(environ, start_response) | |
144 | """ |
|
157 | ||
145 | Wrapper for custom messages that come out of mercurial respond messages |
|
|||
146 | is a list of messages that the user will see at the end of response |
|
|||
147 | from merurial protocol actions that involves remote answers |
|
|||
148 | :param app: |
|
|||
149 | :param environ: |
|
|||
150 | :param start_response: |
|
|||
151 | """ |
|
|||
152 | def custom_messages(msg_list): |
|
|||
153 | for msg in msg_list: |
|
|||
154 | yield msg + '\n' |
|
|||
155 | org_response = app(environ, start_response) |
|
|||
156 | return chain(org_response, custom_messages(messages)) |
|
|||
157 |
|
158 | |||
158 | def __make_app(self): |
|
159 | def __make_app(self): | |
159 | hgserve = hgweb(str(self.repo_path), baseui=self.baseui) |
|
160 | hgserve = hgweb(str(self.repo_path), baseui=self.baseui) | |
160 | return self.__load_web_settings(hgserve) |
|
161 | return self.__load_web_settings(hgserve, self.extras) | |
161 |
|
162 | |||
162 | def __get_environ_user(self, environ): |
|
163 | def __get_environ_user(self, environ): | |
163 | return environ.get('REMOTE_USER') |
|
164 | return environ.get('REMOTE_USER') | |
164 |
|
165 | |||
165 | def __get_user(self, username): |
|
166 | def __get_user(self, username): | |
166 |
return |
|
167 | return UserModel().get_by_username(username, cache=True) | |
167 |
|
168 | |||
168 | def __get_action(self, environ): |
|
169 | def __get_action(self, environ): | |
169 | """ |
|
170 | """ | |
@@ -174,7 +175,7 b' class SimpleHg(object):' | |||||
174 | mapping = {'changegroup': 'pull', |
|
175 | mapping = {'changegroup': 'pull', | |
175 | 'changegroupsubset': 'pull', |
|
176 | 'changegroupsubset': 'pull', | |
176 | 'stream_out': 'pull', |
|
177 | 'stream_out': 'pull', | |
177 |
|
|
178 | 'listkeys': 'pull', | |
178 | 'unbundle': 'push', |
|
179 | 'unbundle': 'push', | |
179 | 'pushkey': 'push', } |
|
180 | 'pushkey': 'push', } | |
180 | for qry in environ['QUERY_STRING'].split('&'): |
|
181 | for qry in environ['QUERY_STRING'].split('&'): | |
@@ -185,25 +186,26 b' class SimpleHg(object):' | |||||
185 | else: |
|
186 | else: | |
186 | return cmd |
|
187 | return cmd | |
187 |
|
188 | |||
188 | def __log_user_action(self, user, action, repo, ipaddr): |
|
|||
189 | action_logger(user, action, repo, ipaddr) |
|
|||
190 |
|
||||
191 | def __invalidate_cache(self, repo_name): |
|
189 | def __invalidate_cache(self, repo_name): | |
192 | """we know that some change was made to repositories and we should |
|
190 | """we know that some change was made to repositories and we should | |
193 | invalidate the cache to see the changes right away but only for |
|
191 | invalidate the cache to see the changes right away but only for | |
194 | push requests""" |
|
192 | push requests""" | |
195 |
invalidate_cache(' |
|
193 | invalidate_cache('get_repo_cached_%s' % repo_name) | |
196 | invalidate_cache('full_changelog', repo_name) |
|
|||
197 |
|
194 | |||
198 |
|
195 | |||
199 | def __load_web_settings(self, hgserve): |
|
196 | def __load_web_settings(self, hgserve, extras={}): | |
200 | #set the global ui for hgserve instance passed |
|
197 | #set the global ui for hgserve instance passed | |
201 | hgserve.repo.ui = self.baseui |
|
198 | hgserve.repo.ui = self.baseui | |
202 |
|
199 | |||
203 | hgrc = os.path.join(self.repo_path, '.hg', 'hgrc') |
|
200 | hgrc = os.path.join(self.repo_path, '.hg', 'hgrc') | |
|
201 | ||||
|
202 | #inject some additional parameters that will be available in ui | |||
|
203 | #for hooks | |||
|
204 | for k, v in extras.items(): | |||
|
205 | hgserve.repo.ui.setconfig('rhodecode_extras', k, v) | |||
|
206 | ||||
204 | repoui = make_ui('file', hgrc, False) |
|
207 | repoui = make_ui('file', hgrc, False) | |
205 |
|
208 | |||
206 |
|
||||
207 | if repoui: |
|
209 | if repoui: | |
208 | #overwrite our ui instance with the section from hgrc file |
|
210 | #overwrite our ui instance with the section from hgrc file | |
209 | for section in ui_sections: |
|
211 | for section in ui_sections: |
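The permission check above hinges on `__get_action`, which maps a Mercurial wire-protocol command (taken from the `cmd` argument of the query string) to a coarse-grained `pull` or `push` action. A standalone sketch of that classification logic, with an illustrative function name (not RhodeCode's API), mirroring the mapping shown in the diff:

```python
def classify_hg_action(query_string):
    """Map a Mercurial wire-protocol 'cmd' query argument to a
    coarse action ('pull' or 'push') used for permission checks.
    Unknown commands fall through as themselves; no 'cmd' -> None."""
    mapping = {'changegroup': 'pull',
               'changegroupsubset': 'pull',
               'stream_out': 'pull',
               'listkeys': 'pull',
               'unbundle': 'push',
               'pushkey': 'push'}
    for qry in query_string.split('&'):
        if qry.startswith('cmd'):
            cmd = qry.split('=')[-1]
            # read commands gate on repository.read, write ones on
            # repository.write/admin in the middleware above
            return mapping.get(cmd, cmd)
    return None
```

For example, `classify_hg_action('cmd=unbundle&key=x')` yields `'push'`, which is what triggers the `repository.write` check and the cache invalidation on completion.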
@@ -22,7 +22,7 b' class SmtpMailer(object):'
 
     def __init__(self, mail_from, user, passwd, mail_server,
                  mail_port=None, ssl=False, tls=False):
 
         self.mail_from = mail_from
         self.mail_server = mail_server
         self.mail_port = mail_port
@@ -31,7 +31,7 b' class SmtpMailer(object):'
         self.ssl = ssl
         self.tls = tls
         self.debug = False
 
     def send(self, recipients=[], subject='', body='', attachment_files={}):
 
         if isinstance(recipients, basestring):
@@ -43,11 +43,11 b' class SmtpMailer(object):'
 
         if self.tls:
             smtp_serv.starttls()
 
         if self.debug:
             smtp_serv.set_debuglevel(1)
 
-        smtp_serv.ehlo("mailer")
+        smtp_serv.ehlo("rhodecode mailer")
 
         #if server requires authorization you must provide login and password
         smtp_serv.login(self.user, self.passwd)
@@ -82,13 +82,13 b' class SmtpMailer(object):'
             maintype, subtype = ctype.split('/', 1)
             if maintype == 'text':
                 # Note: we should handle calculating the charset
                 file_part = MIMEText(self.get_content(msg_file),
                                      _subtype=subtype)
             elif maintype == 'image':
                 file_part = MIMEImage(self.get_content(msg_file),
                                       _subtype=subtype)
             elif maintype == 'audio':
                 file_part = MIMEAudio(self.get_content(msg_file),
                                       _subtype=subtype)
             else:
                 file_part = MIMEBase(maintype, subtype)
@@ -96,13 +96,13 b' class SmtpMailer(object):'
             # Encode the payload using Base64
             encoders.encode_base64(msg)
             # Set the filename parameter
             file_part.add_header('Content-Disposition', 'attachment',
                                  filename=f_name)
             file_part.add_header('Content-Type', ctype, name=f_name)
             msg.attach(file_part)
         else:
             raise Exception('Attachment files should be'
                             'a dict in format {"filename":"filepath"}')
 
     def get_content(self, msg_file):
         '''
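The attachment handling in `send` follows the standard `email` package pattern: guess the content type from the file name, pick the matching MIME class, base64-encode binary payloads, and set the disposition headers. The same idea as a self-contained sketch in the modern Python 3 `email` API (the helper name is mine, not part of `SmtpMailer`):

```python
import mimetypes
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def attach_file_bytes(msg, f_name, payload):
    """Attach raw bytes to a multipart message, guessing the MIME type
    from the file name (application/octet-stream when unknown)."""
    ctype, encoding = mimetypes.guess_type(f_name)
    if ctype is None or encoding is not None:
        ctype = 'application/octet-stream'
    maintype, subtype = ctype.split('/', 1)
    if maintype == 'text':
        # text parts carry the decoded string directly
        part = MIMEText(payload.decode('utf-8'), _subtype=subtype)
    else:
        # binary parts are base64-encoded, as in SmtpMailer
        part = MIMEBase(maintype, subtype)
        part.set_payload(payload)
        encoders.encode_base64(part)
    part.add_header('Content-Disposition', 'attachment', filename=f_name)
    msg.attach(part)
    return msg
```

Calling `attach_file_bytes(MIMEMultipart(), 'notes.txt', b'hello')` produces a `text/plain` part with the expected filename header.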
@@ -1,7 +1,15 b''
-#!/usr/bin/env python
-# encoding: utf-8
-# Utilities for RhodeCode
-# Copyright (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+# -*- coding: utf-8 -*-
+"""
+    rhodecode.lib.utils
+    ~~~~~~~~~~~~~~~~~~~
+
+    Utilities library for RhodeCode
+
+    :created_on: Apr 18, 2010
+    :author: marcink
+    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+    :license: GPLv3, see COPYING for more details.
+"""
 # This program is free software; you can redistribute it and/or
 # modify it under the terms of the GNU General Public License
 # as published by the Free Software Foundation; version 2
@@ -17,21 +25,28 b''
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
 # MA 02110-1301, USA.
 
-"""
-Created on April 18, 2010
-Utilities for RhodeCode
-@author: marcink
-"""
-from beaker.cache import cache_region
+import os
+import logging
+import datetime
+import traceback
+
+from UserDict import DictMixin
+
 from mercurial import ui, config, hg
 from mercurial.error import RepoError
-from rhodecode.model import meta
-from rhodecode.model.db import Repository, User, RhodeCodeUi, RhodeCodeSettings, UserLog
+
+import paste
+import beaker
+from paste.script.command import Command, BadCommand
+
 from vcs.backends.base import BaseChangeset
 from vcs.utils.lazy import LazyProperty
-import logging
-import datetime
-import os
+
+from rhodecode.model import meta
+from rhodecode.model.caching_query import FromCache
+from rhodecode.model.db import Repository, User, RhodeCodeUi, UserLog
+from rhodecode.model.repo import RepoModel
+from rhodecode.model.user import UserModel
 
 log = logging.getLogger(__name__)
 
@@ -39,72 +54,95 b' log = logging.getLogger(__name__)'
 def get_repo_slug(request):
     return request.environ['pylons.routes_dict'].get('repo_name')
 
-def is_mercurial(environ):
-    """
-    Returns True if request's target is mercurial server - header
-    ``HTTP_ACCEPT`` of such request would start with ``application/mercurial``.
-    """
-    http_accept = environ.get('HTTP_ACCEPT')
-    if http_accept and http_accept.startswith('application/mercurial'):
-        return True
-    return False
-
-def is_git(environ):
-    """
-    Returns True if request's target is git server. ``HTTP_USER_AGENT`` would
-    then have git client version given.
-
-    :param environ:
-    """
-    http_user_agent = environ.get('HTTP_USER_AGENT')
-    if http_user_agent.startswith('git'):
-        return True
-    return False
-
-def action_logger(user, action, repo, ipaddr, sa=None):
-    """
-    Action logger for various action made by users
-    """
+def action_logger(user, action, repo, ipaddr='', sa=None):
+    """
+    Action logger for various actions made by users
+
+    :param user: user that made this action, can be a unique username string or
+        object containing user_id attribute
+    :param action: action to log, should be on of predefined unique actions for
+        easy translations
+    :param repo: string name of repository or object containing repo_id,
+        that action was made on
+    :param ipaddr: optional ip address from what the action was made
+    :param sa: optional sqlalchemy session
+
+    """
 
     if not sa:
-        sa = meta.Session
+        sa = meta.Session()
 
     try:
+        um = UserModel()
         if hasattr(user, 'user_id'):
-            user_id = user.user_id
+            user_obj = user
         elif isinstance(user, basestring):
-            user_id = sa.query(User).filter(User.username == user).one()
+            user_obj = um.get_by_username(user, cache=False)
         else:
             raise Exception('You have to provide user object or username')
 
-        repo_name = repo.lstrip('/')
+
+        rm = RepoModel()
+        if hasattr(repo, 'repo_id'):
+            repo_obj = rm.get(repo.repo_id, cache=False)
+            repo_name = repo_obj.repo_name
+        elif isinstance(repo, basestring):
+            repo_name = repo.lstrip('/')
+            repo_obj = rm.get_by_repo_name(repo_name, cache=False)
+        else:
+            raise Exception('You have to provide repository to action logger')
+
+
         user_log = UserLog()
-        user_log.user_id = user_id
+        user_log.user_id = user_obj.user_id
         user_log.action = action
+
+        user_log.repository_id = repo_obj.repo_id
         user_log.repository_name = repo_name
-        user_log.repository = sa.query(Repository)\
-            .filter(Repository.repo_name == repo_name).one()
+
         user_log.action_date = datetime.datetime.now()
         user_log.user_ip = ipaddr
         sa.add(user_log)
         sa.commit()
 
-        log.info('Adding user %s, action %s on %s',
-                 user.username, action, repo)
-    except Exception, e:
+        log.info('Adding user %s, action %s on %s', user_obj, action, repo)
+    except:
+        log.error(traceback.format_exc())
         sa.rollback()
-        log.error('could not log user action:%s', str(e))
 
-def check_repo_dir(paths):
-    repos_path = paths[0][1].split('/')
-    if repos_path[-1] in ['*', '**']:
-        repos_path = repos_path[:-1]
-    if repos_path[0] != '/':
-        repos_path[0] = '/'
-    if not os.path.isdir(os.path.join(*repos_path)):
-        raise Exception('Not a valid repository in %s' % paths[0][1])
+def get_repos(path, recursive=False, initial=False):
+    """
+    Scans given path for repos and return (name,(type,path)) tuple
+    :param prefix:
+    :param path:
+    :param recursive:
+    :param initial:
+    """
+    from vcs.utils.helpers import get_scm
+    from vcs.exceptions import VCSError
+
+    try:
+        scm = get_scm(path)
+    except:
+        pass
+    else:
+        raise Exception('The given path %s should not be a repository got %s',
+                        path, scm)
+
+    for dirpath in os.listdir(path):
+        try:
+            yield dirpath, get_scm(os.path.join(path, dirpath))
+        except VCSError:
+            pass
 
 def check_repo_fast(repo_name, base_path):
+    """
+    Check given path for existance of directory
+    :param repo_name:
+    :param base_path:
+
+    :return False: if this directory is present
+    """
     if os.path.isdir(os.path.join(base_path, repo_name)):return False
     return True
 
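The new `get_repos` simply probes each direct child of a directory and yields the ones a VCS backend recognises. The same filter-by-probe shape, with the `vcs` library's `get_scm` probe swapped for a plain `.hg`/`.git` control-directory check so the sketch stays self-contained (function name and detection rule are mine, not RhodeCode's):

```python
import os

def scan_repos(path):
    """Yield (name, type) for each direct child of `path` that looks
    like a Mercurial or Git repository, detected by its control dir.
    Non-repository children are silently skipped, as in get_repos()."""
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if not os.path.isdir(full):
            continue
        if os.path.isdir(os.path.join(full, '.hg')):
            yield name, 'hg'
        elif os.path.isdir(os.path.join(full, '.git')):
            yield name, 'git'
```

Unlike `get_repos`, this sketch does not raise when `path` itself is a repository; the real function treats that as a configuration error.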
@@ -135,57 +173,6 b' def ask_ok(prompt, retries=4, complaint='
     if retries < 0: raise IOError
     print complaint
 
-@cache_region('super_short_term', 'cached_hg_ui')
-def get_hg_ui_cached():
-    try:
-        sa = meta.Session
-        ret = sa.query(RhodeCodeUi).all()
-    finally:
-        meta.Session.remove()
-    return ret
-
-
-def get_hg_settings():
-    try:
-        sa = meta.Session
-        ret = sa.query(RhodeCodeSettings).all()
-    finally:
-        meta.Session.remove()
-
-    if not ret:
-        raise Exception('Could not get application settings !')
-    settings = {}
-    for each in ret:
-        settings['rhodecode_' + each.app_settings_name] = each.app_settings_value
-
-    return settings
-
-def get_hg_ui_settings():
-    try:
-        sa = meta.Session
-        ret = sa.query(RhodeCodeUi).all()
-    finally:
-        meta.Session.remove()
-
-    if not ret:
-        raise Exception('Could not get application ui settings !')
-    settings = {}
-    for each in ret:
-        k = each.ui_key
-        v = each.ui_value
-        if k == '/':
-            k = 'root_path'
-
-        if k.find('.') != -1:
-            k = k.replace('.', '_')
-
-        if each.ui_section == 'hooks':
-            v = each.ui_active
-
-        settings[each.ui_section + '_' + k] = v
-
-    return settings
-
 #propagated from mercurial documentation
 ui_sections = ['alias', 'auth',
                'decode/encode', 'defaults',
@@ -210,6 +197,11 b" def make_ui(read_from='file', path=None,"
 
     baseui = ui.ui()
 
+    #clean the baseui object
+    baseui._ocfg = config.config()
+    baseui._ucfg = config.config()
+    baseui._tcfg = config.config()
+
     if read_from == 'file':
         if not os.path.isfile(path):
             log.warning('Unable to read config file %s' % path)
@@ -219,70 +211,69 b" def make_ui(read_from='file', path=None,"
         cfg.read(path)
         for section in ui_sections:
             for k, v in cfg.items(section):
+                log.debug('settings ui from file[%s]%s:%s', section, k, v)
                 baseui.setconfig(section, k, v)
-                log.debug('settings ui from file[%s]%s:%s', section, k, v)
-
-        for k, v in baseui.configitems('extensions'):
-            baseui.setconfig('extensions', k, '0')
-        #just enable mq
-        baseui.setconfig('extensions', 'mq', '1')
-        if checkpaths:check_repo_dir(cfg.items('paths'))
 
 
     elif read_from == 'db':
-        hg_ui = get_hg_ui_cached()
+        sa = meta.Session()
+        ret = sa.query(RhodeCodeUi)\
+            .options(FromCache("sql_cache_short",
+                               "get_hg_ui_settings")).all()
+
+        hg_ui = ret
         for ui_ in hg_ui:
             if ui_.ui_active:
-                log.debug('settings ui from db[%s]%s:%s', ui_.ui_section,
+                log.debug('settings ui from db[%s]%s:%s', ui_.ui_section,
+                          ui_.ui_key, ui_.ui_value)
                 baseui.setconfig(ui_.ui_section, ui_.ui_key, ui_.ui_value)
 
-
+        meta.Session.remove()
     return baseui
 
 
 def set_rhodecode_config(config):
-    hgsettings = get_hg_settings()
+    """
+    Updates pylons config with new settings from database
+    :param config:
+    """
+    from rhodecode.model.settings import SettingsModel
+    hgsettings = SettingsModel().get_app_settings()
 
     for k, v in hgsettings.items():
         config[k] = v
 
-def invalidate_cache(name, *args):
-    """Invalidates given name cache"""
-
-    from beaker.cache import region_invalidate
-    log.info('INVALIDATING CACHE FOR %s', name)
-
-    """propagate our arguments to make sure invalidation works. First
-    argument has to be the name of cached func name give to cache decorator
-    without that the invalidation would not work"""
-    tmp = [name]
-    tmp.extend(args)
-    args = tuple(tmp)
-
-    if name == 'cached_repo_list':
-        from rhodecode.model.hg_model import _get_repos_cached
-        region_invalidate(_get_repos_cached, None, *args)
-
-    if name == 'full_changelog':
-        from rhodecode.model.hg_model import _full_changelog_cached
-        region_invalidate(_full_changelog_cached, None, *args)
+def invalidate_cache(cache_key, *args):
+    """
+    Puts cache invalidation task into db for
+    further global cache invalidation
+    """
+    from rhodecode.model.scm import ScmModel
+
+    if cache_key.startswith('get_repo_cached_'):
+        name = cache_key.split('get_repo_cached_')[-1]
+        ScmModel().mark_for_invalidation(name)
 
 class EmptyChangeset(BaseChangeset):
     """
-    An dummy empty changeset.
+    An dummy empty changeset. It's possible to pass hash when creating
+    an EmptyChangeset
     """
 
-    revision = -1
-    message = ''
-    author = ''
-    date = ''
+    def __init__(self, cs='0' * 40):
+        self._empty_cs = cs
+        self.revision = -1
+        self.message = ''
+        self.author = ''
+        self.date = ''
+
     @LazyProperty
     def raw_id(self):
         """
-        Returns raw string identifing this changeset, useful for web
+        Returns raw string identifying this changeset, useful for web
         representation.
         """
-        return '0' * 40
+        return self._empty_cs
 
     @LazyProperty
     def short_id(self):
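`EmptyChangeset` is a null object: code that renders a changeset can receive it instead of `None` and keep reading `revision`, `raw_id` and `short_id` without guards; the change above additionally lets callers pick the hash it reports. A condensed sketch of the pattern, with a plain `property` standing in for vcs's `LazyProperty` and an assumed 12-character abbreviation for `short_id` (the real length comes from the vcs base class, not this diff):

```python
class EmptyChangeset(object):
    """Null-object stand-in for a real changeset; optionally carries
    the hash it should report as its raw id."""

    def __init__(self, cs='0' * 40):
        self._empty_cs = cs
        self.revision = -1
        self.message = ''
        self.author = ''
        self.date = ''

    @property
    def raw_id(self):
        # full 40-char identifier, e.g. for building URLs
        return self._empty_cs

    @property
    def short_id(self):
        # abbreviated form shown in listings (length is an assumption)
        return self.raw_id[:12]
```

With this shape, templates can format `cs.short_id` uniformly whether `cs` is a real changeset or the empty placeholder.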
@@ -301,26 +292,25 b' def repo2db_mapper(initial_repo_list, re'
     """
     maps all found repositories into db
     """
-    from rhodecode.model.repo_model import RepoModel
 
-    sa = meta.Session
+    sa = meta.Session()
+    rm = RepoModel()
     user = sa.query(User).filter(User.admin == True).first()
 
-    rm = RepoModel()
-
     for name, repo in initial_repo_list.items():
-        if not sa.query(Repository).filter(Repository.repo_name == name).scalar():
+        if not rm.get_by_repo_name(name, cache=False):
             log.info('repository %s not found creating default', name)
 
             form_data = {
                 'repo_name':name,
-                'description':repo.description if repo.description != 'unknown' else \
-                    '%s repository' % name,
+                'repo_type':repo.alias,
+                'description':repo.description \
+                    if repo.description != 'unknown' else \
+                    '%s repository' % name,
                 'private':False
             }
             rm.create(form_data, user, just_db=True)
 
-
     if remove_obsolete:
         #remove from database those repositories that are not in the filesystem
         for repo in sa.query(Repository).all():
@@ -328,11 +318,6 b' def repo2db_mapper(initial_repo_list, re'
             sa.delete(repo)
         sa.commit()
 
-
-    meta.Session.remove()
-
-from UserDict import DictMixin
-
 class OrderedDict(dict, DictMixin):
 
     def __init__(self, *args, **kwds):
@@ -433,8 +418,51 b' class OrderedDict(dict, DictMixin):'
         return not self == other
 
 
+#set cache regions for beaker so celery can utilise it
+def add_cache(settings):
+    cache_settings = {'regions':None}
+    for key in settings.keys():
+        for prefix in ['beaker.cache.', 'cache.']:
+            if key.startswith(prefix):
+                name = key.split(prefix)[1].strip()
+                cache_settings[name] = settings[key].strip()
+    if cache_settings['regions']:
+        for region in cache_settings['regions'].split(','):
+            region = region.strip()
+            region_settings = {}
+            for key, value in cache_settings.items():
+                if key.startswith(region):
+                    region_settings[key.split('.')[1]] = value
+            region_settings['expire'] = int(region_settings.get('expire',
+                                                                60))
+            region_settings.setdefault('lock_dir',
+                                       cache_settings.get('lock_dir'))
+            if 'type' not in region_settings:
+                region_settings['type'] = cache_settings.get('type',
+                                                             'memory')
+            beaker.cache.cache_regions[region] = region_settings
+
|
445 | def get_current_revision(): | |||
|
446 | """ | |||
|
447 | Returns tuple of (number, id) from repository containing this package | |||
|
448 | or None if repository could not be found. | |||
|
449 | """ | |||
|
450 | try: | |||
|
451 | from vcs import get_repo | |||
|
452 | from vcs.utils.helpers import get_scm | |||
|
453 | from vcs.exceptions import RepositoryError, VCSError | |||
|
454 | repopath = os.path.join(os.path.dirname(__file__), '..', '..') | |||
|
455 | scm = get_scm(repopath)[0] | |||
|
456 | repo = get_repo(path=repopath, alias=scm) | |||
|
457 | tip = repo.get_changeset() | |||
|
458 | return (tip.revision, tip.short_id) | |||
|
459 | except (ImportError, RepositoryError, VCSError), err: | |||
|
460 | logging.debug("Cannot retrieve rhodecode's revision. Original error " | |||
|
461 | "was: %s" % err) | |||
|
462 | return None | |||
|
463 | ||||
436 | #=============================================================================== |
|
464 | #=============================================================================== | |
437 | # TEST FUNCTIONS |
|
465 | # TEST FUNCTIONS AND CREATORS | |
438 | #=============================================================================== |
|
466 | #=============================================================================== | |
439 | def create_test_index(repo_location, full_index): |
|
467 | def create_test_index(repo_location, full_index): | |
440 | """Makes default test index |
|
468 | """Makes default test index | |
@@ -443,15 +471,16 @@ def create_test_index(repo_location, ful
     """
     from rhodecode.lib.indexers.daemon import WhooshIndexingDaemon
     from rhodecode.lib.pidlock import DaemonLock, LockHeld
-    from rhodecode.lib.indexers import IDX_LOCATION
     import shutil

-    if os.path.exists(IDX_LOCATION):
-        shutil.rmtree(IDX_LOCATION)
+    index_location = os.path.join(repo_location, 'index')
+    if os.path.exists(index_location):
+        shutil.rmtree(index_location)

     try:
         l = DaemonLock()
-        WhooshIndexingDaemon(repo_location=repo_location)\
+        WhooshIndexingDaemon(index_location=index_location,
+                             repo_location=repo_location)\
             .run(full_index=full_index)
         l.release()
     except LockHeld:
@@ -462,10 +491,11 @@ def create_test_env(repos_test_path, con
     install test repository into tmp dir
     """
     from rhodecode.lib.db_manage import DbManage
+    from rhodecode.tests import HG_REPO, GIT_REPO, NEW_HG_REPO, NEW_GIT_REPO, \
+        HG_FORK, GIT_FORK, TESTS_TMP_PATH
     import tarfile
     import shutil
     from os.path import dirname as dn, join as jn, abspath
-    from rhodecode.tests import REPO_PATH, NEW_REPO_PATH, FORK_REPO_PATH

     log = logging.getLogger('TestEnvCreator')
     # create logger
@@ -485,10 +515,10 @@ def create_test_env(repos_test_path, con
     log.addHandler(ch)

     #PART ONE create db
-    dbname = config['sqlalchemy.db1.url'].split('/')[-1]
-    log.debug('making test db %s', dbname)
+    dbconf = config['sqlalchemy.db1.url']
+    log.debug('making test db %s', dbconf)

-    dbmanage = DbManage(log_sql=True, dbname=dbname, root=config['here'],
+    dbmanage = DbManage(log_sql=True, dbconf=dbconf, root=config['here'],
                         tests=True)
     dbmanage.create_tables(override=True)
     dbmanage.config_prompt(repos_test_path)
@@ -498,18 +528,87 @@ def create_test_env(repos_test_path, con
     dbmanage.populate_default_permissions()

     #PART TWO make test repo
-    log.debug('making test vcs repo')
-    if os.path.isdir(REPO_PATH):
-        log.debug('REMOVING %s', REPO_PATH)
-        shutil.rmtree(REPO_PATH)
-    if os.path.isdir(NEW_REPO_PATH):
-        log.debug('REMOVING %s', NEW_REPO_PATH)
-        shutil.rmtree(NEW_REPO_PATH)
-    if os.path.isdir(FORK_REPO_PATH):
-        log.debug('REMOVING %s', FORK_REPO_PATH)
-        shutil.rmtree(FORK_REPO_PATH)
+    log.debug('making test vcs repositories')
+
+    #remove old one from previos tests
+    for r in [HG_REPO, GIT_REPO, NEW_HG_REPO, NEW_GIT_REPO, HG_FORK, GIT_FORK]:
+
+        if os.path.isdir(jn(TESTS_TMP_PATH, r)):
+            log.debug('removing %s', r)
+            shutil.rmtree(jn(TESTS_TMP_PATH, r))
+
+    #CREATE DEFAULT HG REPOSITORY
+    cur_dir = dn(dn(abspath(__file__)))
+    tar = tarfile.open(jn(cur_dir, 'tests', "vcs_test_hg.tar.gz"))
+    tar.extractall(jn(TESTS_TMP_PATH, HG_REPO))
+    tar.close()
+
+
+#==============================================================================
+# PASTER COMMANDS
+#==============================================================================
+
+class BasePasterCommand(Command):
+    """
+    Abstract Base Class for paster commands.
+
+    The celery commands are somewhat aggressive about loading
+    celery.conf, and since our module sets the `CELERY_LOADER`
+    environment variable to our loader, we have to bootstrap a bit and
+    make sure we've had a chance to load the pylons config off of the
+    command line, otherwise everything fails.
+    """
+    min_args = 1
+    min_args_error = "Please provide a paster config file as an argument."
+    takes_config_file = 1
+    requires_config_file = True

-    cur_dir = dn(dn(abspath(__file__)))
-    tar = tarfile.open(jn(cur_dir, 'tests', "vcs_test.tar.gz"))
-    tar.extractall('/tmp')
-    tar.close()
+    def notify_msg(self, msg, log=False):
+        """Make a notification to user, additionally if logger is passed
+        it logs this action using given logger
+
+        :param msg: message that will be printed to user
+        :param log: logging instance, to use to additionally log this message
+
+        """
+        print msg
+        if log and isinstance(log, logging):
+            log(msg)
+
+
+    def run(self, args):
+        """
+        Overrides Command.run
+
+        Checks for a config file argument and loads it.
+        """
+        if len(args) < self.min_args:
+            raise BadCommand(
+                self.min_args_error % {'min_args': self.min_args,
+                                       'actual_args': len(args)})
+
+        # Decrement because we're going to lob off the first argument.
+        # @@ This is hacky
+        self.min_args -= 1
+        self.bootstrap_config(args[0])
+        self.update_parser()
+        return super(BasePasterCommand, self).run(args[1:])
+
+    def update_parser(self):
+        """
+        Abstract method. Allows for the class's parser to be updated
+        before the superclass's `run` method is called. Necessary to
+        allow options/arguments to be passed through to the underlying
+        celery command.
+        """
+        raise NotImplementedError("Abstract Method.")
+
+    def bootstrap_config(self, conf):
+        """
+        Loads the pylons configuration.
+        """
+        from pylons import config as pylonsconfig
+
+        path_to_ini_file = os.path.realpath(conf)
+        conf = paste.deploy.appconfig('config:' + path_to_ini_file)
+        pylonsconfig.init_app(conf.global_conf, conf.local_conf)
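The `add_cache` helper added above turns flat ini-style keys into per-region beaker cache dicts. A standalone Python 3 sketch of that parsing step follows; the function name and the sample settings keys are illustrative only, and beaker itself is never touched here:

```python
# Illustrative re-implementation of the key-parsing performed by add_cache()
# above; it builds per-region dicts without registering them with beaker.
def parse_cache_settings(settings):
    cache_settings = {'regions': None}
    for key in settings:
        for prefix in ['beaker.cache.', 'cache.']:
            if key.startswith(prefix):
                name = key.split(prefix)[1].strip()
                cache_settings[name] = settings[key].strip()
    regions = {}
    if cache_settings['regions']:
        for region in cache_settings['regions'].split(','):
            region = region.strip()
            region_settings = {}
            for key, value in cache_settings.items():
                if key.startswith(region):
                    region_settings[key.split('.')[1]] = value
            # same defaults as the diff above: 60s expiry, shared lock_dir,
            # in-memory backend
            region_settings['expire'] = int(region_settings.get('expire', 60))
            region_settings.setdefault('lock_dir',
                                       cache_settings.get('lock_dir'))
            region_settings.setdefault('type',
                                       cache_settings.get('type', 'memory'))
            regions[region] = region_settings
    return regions

# hypothetical settings such as a pylons .ini file would yield
settings = {
    'beaker.cache.regions': 'short_term, long_term',
    'beaker.cache.short_term.expire': '60',
    'beaker.cache.long_term.expire': '36000',
    'beaker.cache.lock_dir': '/tmp/cache/lock',
}
regions = parse_cache_settings(settings)
```

The real `add_cache` would store each resulting dict in `beaker.cache.cache_regions`, so celery workers see the same regions as the web process.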
@@ -1,23 +1,71 @@
-"""The application's model objects"""
+# -*- coding: utf-8 -*-
+"""
+    rhodecode.model.__init__
+    ~~~~~~~~~~~~~~~~~~~~~~~~
+
+    The application's model objects
+
+    :created_on: Nov 25, 2010
+    :author: marcink
+    :copyright: (C) 2009-2010 Marcin Kuzminski <marcin@python-works.com>
+    :license: GPLv3, see COPYING for more details.
+
+
+    :example:
+
+        .. code-block:: python
+
+           from paste.deploy import appconfig
+           from pylons import config
+           from sqlalchemy import engine_from_config
+           from rhodecode.config.environment import load_environment
+
+           conf = appconfig('config:development.ini', relative_to = './../../')
+           load_environment(conf.global_conf, conf.local_conf)
+
+           engine = engine_from_config(config, 'sqlalchemy.')
+           init_model(engine)
+           # RUN YOUR CODE HERE
+
+"""
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; version 2
+# of the License or (at your opinion) any later version of the license.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software
+# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
+# MA 02110-1301, USA.
+
 import logging
 from rhodecode.model import meta
 log = logging.getLogger(__name__)

 def init_model(engine):
-    """Call me before using any of the tables or classes in the model"""
-    log.info("INITIALIZING DB MODELS")
+    """Initializes db session, bind the engine with the metadata,
+    Call this before using any of the tables or classes in the model, preferably
+    once in application start
+
+    :param engine: engine to bind to
+    """
+    log.info("initializing db models for %s", engine)
     meta.Base.metadata.bind = engine
-    #meta.Base2.metadata.bind = engine2

-#THIS IS A TEST FOR EXECUTING SCRIPT AND LOAD PYLONS APPLICATION GLOBALS
-#from paste.deploy import appconfig
-#from pylons import config
-#from sqlalchemy import engine_from_config
-#from rhodecode.config.environment import load_environment
-#
-#conf = appconfig('config:development.ini', relative_to = './../../')
-#load_environment(conf.global_conf, conf.local_conf)
-#
-#engine = engine_from_config(config, 'sqlalchemy.')
-#init_model(engine)
-# DO SOMETHING
+class BaseModel(object):
+    """Base Model for all RhodeCode models, it adds sql alchemy session
+    into instance of model
+
+    :param sa: If passed it reuses this session instead of creating a new one
+    """
+
+    def __init__(self, sa=None):
+        if sa is not None:
+            self.sa = sa
+        else:
+            self.sa = meta.Session()
@@ -55,11 +55,11 @@ class CachingQuery(Query):
     the "public" method of configuring this state upon the CachingQuery.

     """

     def __init__(self, manager, *args, **kw):
         self.cache_manager = manager
         Query.__init__(self, *args, **kw)

     def __iter__(self):
         """override __iter__ to pull results from Beaker
         if particular attributes have been configured.
@@ -101,7 +101,7 @@ class CachingQuery(Query):
         """Set the value in the cache for this query."""

         cache, cache_key = _get_cache_parameters(self)
         cache.put(cache_key, value)

 def query_callable(manager):
     def query(*arg, **kw):
@@ -110,11 +110,11 @@ def query_callable(manager):

 def get_cache_region(name, region):
     if region not in beaker.cache.cache_regions:
-        raise BeakerException('Cache region %s not configured '
+        raise BeakerException('Cache region `%s` not configured '
             'Check if proper cache settings are in the .ini files' % region)
     kw = beaker.cache.cache_regions[region]
     return beaker.cache.Cache._get_cache(name, kw)

 def _get_cache_parameters(query):
     """For a query with cache_region and cache_namespace configured,
     return the correspoinding Cache instance and cache key, based
@@ -125,7 +125,7 @@ def _get_cache_parameters(query):
         raise ValueError("This Query does not have caching parameters configured.")

     region, namespace, cache_key = query._cache_parameters

     namespace = _namespace_from_query(namespace, query)

     if cache_key is None:
@@ -153,15 +153,15 @@ def _namespace_from_query(namespace, que
     return namespace

 def _set_cache_parameters(query, region, namespace, cache_key):

     if hasattr(query, '_cache_parameters'):
         region, namespace, cache_key = query._cache_parameters
         raise ValueError("This query is already configured "
                         "for region %r namespace %r" %
                         (region, namespace)
                     )
     query._cache_parameters = region, namespace, cache_key

 class FromCache(MapperOption):
     """Specifies that a Query should load results from a cache."""

@@ -187,10 +187,10 @@ class FromCache(MapperOption):
         self.region = region
         self.namespace = namespace
         self.cache_key = cache_key

     def process_query(self, query):
         """Process a Query during normal loading operation."""

         _set_cache_parameters(query, self.region, self.namespace, self.cache_key)

 class RelationshipCache(MapperOption):
@@ -263,13 +263,13 @@ def _params_from_query(query):
     v = []
     def visit_bindparam(bind):
         value = query._params.get(bind.key, bind.value)

         # lazyloader may dig a callable in here, intended
         # to late-evaluate params after autoflush is called.
        # convert to a scalar value.
         if callable(value):
             value = value()

         v.append(value)
     if query._criterion is not None:
         visitors.traverse(query._criterion, {}, {'bindparam':visit_bindparam})
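The caching-query module above keys cached results on a namespace plus the query's bind-parameter values, so identical queries with identical parameters resolve to the same Beaker entry. A standalone sketch of that key derivation; the function name here is illustrative and not part of the module:

```python
def derive_cache_key(namespace, bind_values):
    # mirrors the idea behind _get_cache_parameters() / _params_from_query():
    # the key is the namespace joined with the str() of each bound value,
    # so equal (namespace, params) pairs always map to the same cache slot
    return " ".join([namespace] + [str(v) for v in bind_values])

k1 = derive_cache_key('get_user', ['marcin'])
k2 = derive_cache_key('get_user', ['marcin'])
k3 = derive_cache_key('get_user', ['lukasz'])
```

Because the key is deterministic in the bind parameters, a second identical query becomes a pure cache lookup, while a query for a different user misses and hits the database.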
NO CONTENT: file renamed from rhodecode/model/permission_model.py to rhodecode/model/permission.py
NO CONTENT: file renamed from rhodecode/model/repo_model.py to rhodecode/model/repo.py
NO CONTENT: file renamed from rhodecode/model/hg_model.py to rhodecode/model/scm.py
NO CONTENT: file renamed from rhodecode/model/user_model.py to rhodecode/model/user.py
NO CONTENT: file renamed from rhodecode/tests/functional/test_permissions.py to rhodecode/tests/functional/test_admin_permissions.py
NO CONTENT: file renamed from rhodecode/tests/functional/test_repos.py to rhodecode/tests/functional/test_admin_repos.py
NO CONTENT: file renamed from rhodecode/tests/functional/test_users.py to rhodecode/tests/functional/test_admin_users.py
NO CONTENT: file renamed from rhodecode/tests/functional/test_hg.py to rhodecode/tests/functional/test_home.py
NO CONTENT: several files were removed (binary diffs hidden)