@@ -1,397 +1,397 b''
.. _contributing:

=========================
Contributing to Kallithea
=========================

Kallithea is developed and maintained by its users. Please join us and scratch
your own itch.


Infrastructure
--------------

The main repository is hosted on Our Own Kallithea (aka OOK) at
https://kallithea-scm.org/repos/kallithea/, our self-hosted instance
of Kallithea.

Please use the `mailing list`_ to send patches or report issues.

We use Weblate_ to translate the user interface messages into languages other
than English. Join our project on `Hosted Weblate`_ to help us.
To register, you can use your Bitbucket or GitHub account. See :ref:`translations`
for more details.


Getting started
---------------

To get started with Kallithea development run the following commands in your
bash shell::

        hg clone https://kallithea-scm.org/repos/kallithea
        cd kallithea
        python3 -m venv venv
        . venv/bin/activate
        pip install --upgrade pip setuptools
        pip install --upgrade -e . -r dev_requirements.txt python-ldap python-pam
        kallithea-cli config-create my.ini
        kallithea-cli db-create -c my.ini --user=user --email=user@example.com --password=password --repos=/tmp
        kallithea-cli front-end-build
        gearbox serve -c my.ini --reload &
        firefox http://127.0.0.1:5000/


Contribution flow
-----------------

Starting from an existing Kallithea clone, make sure it is up to date with the
latest upstream changes::

        hg pull
        hg update

Review the :ref:`contributing-guidelines` and :ref:`coding-guidelines`.

If you are new to Mercurial, refer to Mercurial `Quick Start`_ and `Beginners
Guide`_ on the Mercurial wiki.

Now, make some changes and test them (see :ref:`contributing-tests`). Don't
forget to add new tests to cover new functionality or bug fixes.

For documentation changes, run ``make html`` from the ``docs`` directory to
generate the HTML result, then review them in your browser.

Before submitting any changes, run the cleanup script::

        ./scripts/run-all-cleanup

When you are completely ready, you can send your changes to the community for
review and inclusion, via the mailing list (via ``hg email``).

.. _contributing-tests:


Internal dependencies
---------------------

We try to keep the code base clean and modular and avoid circular dependencies.
Code should only invoke code in layers below itself.

Imports should import whole modules ``from`` their parent module, perhaps
``as`` a shortened name. Avoid imports ``from`` modules.
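
For illustration, the preferred pattern looks roughly like this (the names are
taken from the code touched in this changeset, trimmed down to a sketch)::

        from kallithea.controllers import base    # import the module ...

        class SimpleGit(base.BaseVCSController):  # ... and reference names through it
            ...

rather than ``from kallithea.controllers.base import BaseVCSController``.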

To avoid cycles and partially initialized modules, ``__init__.py`` should *not*
contain any non-trivial imports. The top level of a module should *not* be a
facade for the module functionality.

Common code for a module is often in ``base.py``.

The important part of the dependency graph is approximately linear. In the
following list, modules may only depend on modules below them:

``tests``
  Just get the job done - anything goes.

``bin/`` & ``config/`` & ``alembic/``
  The main entry points, defined in ``setup.py``. Note: The TurboGears template
  uses ``config`` for the high-level WSGI application - this is not for low-level
  configuration.

``controllers/``
  The top level web application, with TurboGears using the ``root`` controller
  as entry point, and ``routing`` dispatching to other controllers.

``templates/**.html``
  The "view", rendering to HTML. Invoked by controllers which can pass them
  anything from lower layers - especially ``helpers`` available as ``h`` will
  cut through all layers, and ``c`` gives access to global variables.

``lib/helpers.py``
  High level helpers, exposing everything to templates as ``h``. It depends on
  everything and has a huge dependency chain, so it should not be used for
  anything else. TODO.

-``controlles/base.py``
+``controllers/base.py``
  The base class of controllers, with lots of model knowledge.

``lib/auth.py``
  All things related to authentication. TODO.

``lib/utils.py``
  High level utils with lots of model knowledge. TODO.

``lib/hooks.py``
  Hooks into "everything" to give centralized logging to database, cache
  invalidation, and extension handling. TODO.

``model/``
  Convenience business logic wrappers around database models.

``model/db.py``
  Defines the database schema and provides some additional logic.

``model/scm.py``
  All things related to anything. TODO.

SQLAlchemy
  Database session and transaction in thread-local variables.

``lib/utils2.py``
  Low level utils specific to Kallithea.

``lib/webutils.py``
  Low level generic utils with awareness of the TurboGears environment.

TurboGears
  Request, response and state like i18n gettext in thread-local variables.
  External dependency with global state - usage should be minimized.

``lib/vcs/``
  Previously an independent library. No awareness of web, database, or state.

``lib/*``
  Various "pure" functionality not depending on anything else.

``__init__``
  Very basic Kallithea constants - some of them are set very early based on ``.ini``.

This is not exactly how it is right now, but we aim for something like that.
Especially the areas marked as TODO have some problems that need untangling.


Running tests
-------------

After finishing your changes, make sure all tests pass cleanly. Run the test
suite by invoking ``py.test`` from the project root::

        py.test

Note that on unix systems, the temporary directory (``/tmp`` or where
``$TMPDIR`` points) must allow executable files; Git hooks must be executable,
and the test suite creates repositories in the temporary directory. Linux
systems with /tmp mounted noexec will thus fail.

Tests can be run on PostgreSQL like::

        sudo -u postgres createuser 'kallithea-test' --pwprompt # password password
        sudo -u postgres createdb 'kallithea-test' --owner 'kallithea-test'
        REUSE_TEST_DB='postgresql://kallithea-test:password@localhost/kallithea-test' py.test

Tests can be run on MariaDB/MySQL like::

        echo "GRANT ALL PRIVILEGES ON \`kallithea-test\`.* TO 'kallithea-test'@'localhost' IDENTIFIED BY 'password'" | sudo -u mysql mysql
        TEST_DB='mysql://kallithea-test:password@localhost/kallithea-test?charset=utf8mb4' py.test

You can also use ``tox`` to run the tests with all supported Python versions.

When running tests, Kallithea generates a `test.ini` based on template values
in `kallithea/tests/conftest.py` and populates the SQLite database specified
there.

It is possible to avoid recreating the full test database on each invocation of
the tests, thus eliminating the initial delay. To achieve this, run the tests as::

        gearbox serve -c /tmp/kallithea-test-XXX/test.ini --pid-file=test.pid --daemon
        KALLITHEA_WHOOSH_TEST_DISABLE=1 KALLITHEA_NO_TMP_PATH=1 py.test
        kill -9 $(cat test.pid)

In these commands, the following variables are used::

        KALLITHEA_WHOOSH_TEST_DISABLE=1 - skip whoosh index building and tests
        KALLITHEA_NO_TMP_PATH=1 - disable new temp path for tests, used mostly for testing_vcs_operations

You can run individual tests by specifying their path as an argument to py.test.
py.test also has many more options, see `py.test -h`. Some useful options
are::

        -k EXPRESSION         only run tests which match the given substring
                              expression. An expression is a python evaluable
                              expression where all names are substring-matched
                              against test names and their parent classes. Example:
        -x, --exitfirst       exit instantly on first error or failed test.
        --lf                  rerun only the tests that failed at the last run (or
                              all if none failed)
        --ff                  run all tests but run the last failures first. This
                              may re-order tests and thus lead to repeated fixture
                              setup/teardown
        --pdb                 start the interactive Python debugger on errors.
        -s, --capture=no      don't capture stdout (any stdout output will be
                              printed immediately)

Performance tests
^^^^^^^^^^^^^^^^^

A number of performance tests are present in the test suite, but they are
not run in a standard test run. These tests are useful to
evaluate the impact of certain code changes with respect to performance.

To run these tests::

        env TEST_PERFORMANCE=1 py.test kallithea/tests/performance

To analyze performance, you could install pytest-profiling_, which enables the
--profile and --profile-svg options to py.test.

.. _pytest-profiling: https://github.com/manahl/pytest-plugins/tree/master/pytest-profiling

.. _contributing-guidelines:


Contribution guidelines
-----------------------

Kallithea is GPLv3 and we assume all contributions are made by the
committer/contributor and under GPLv3 unless explicitly stated. We do care a
lot about preservation of copyright and license information for existing code
that is brought into the project.

Contributions will be accepted in most formats -- such as commits hosted on your
own Kallithea instance, or patches sent by email to the `kallithea-general`_
mailing list.

Make sure to test your changes both manually and with the automatic tests
before posting.

We care about quality and review and keeping a clean repository history. We
might give feedback that requests polishing contributions until they are
"perfect". We might also rebase and collapse and make minor adjustments to your
changes when we apply them.

We try to make sure we have consensus on the direction the project is taking.
Everything non-sensitive should be discussed in public -- preferably on the
mailing list. We aim at having all non-trivial changes reviewed by at least
one other core developer before pushing. Obvious non-controversial changes will
be handled more casually.

There is a main development branch ("default") which is generally stable so that
it can be (and is) used in production. There is also a "stable" branch that is
almost exclusively reserved for bug fixes or trivial changes. Experimental
changes should live elsewhere (for example in a pull request) until they are
ready.

.. _coding-guidelines:


Coding guidelines
-----------------

We don't have a formal coding/formatting standard. We are currently using a mix
of Mercurial's (https://www.mercurial-scm.org/wiki/CodingStyle), pep8, and
consistency with existing code. Run ``scripts/run-all-cleanup`` before
committing to ensure some basic code formatting consistency.

We support Python 3.6 and later.

We try to support the most common modern web browsers. IE9 is still supported
to the extent it is feasible, IE8 is not.

We primarily support Linux and OS X on the server side but Windows should also work.

HTML templates should use 2 spaces for indentation ... but be pragmatic. We
should use templates cleverly and avoid duplication. We should use reasonable
semantic markup with element classes and IDs that can be used for styling and testing.
We should only use inline styles in places where it really is semantic (such as
``display: none``).

JavaScript must use ``;`` between/after statements. Indentation 4 spaces. Inline
multiline functions should be indented two levels -- one for the ``()`` and one for
``{}``.
Variables holding jQuery objects should be named with a leading ``$``.

Commit messages should have a leading short line summarizing the changes. For
bug fixes, put ``(Issue #123)`` at the end of this line.
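
For example, a summary line for a bug fix could look like this (an invented
message, only to illustrate the format)::

        tests: fix failure when /tmp is mounted noexec (Issue #123)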

Use American English grammar and spelling overall. Use `English title case`_ for
page titles, button labels, headers, and 'labels' for fields in forms.

.. _English title case: https://en.wikipedia.org/wiki/Capitalization#Title_case

Template helpers (that is, everything in ``kallithea.lib.helpers``)
should only be referenced from templates. If you need to call a
helper from the Python code, consider moving the function somewhere
else (e.g. to the model).

Notes on the SQLAlchemy session
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Each HTTP request runs inside an independent SQLAlchemy session (as well
as in an independent database transaction). ``Session`` is the session manager
and factory. ``Session()`` will create a new session on-demand or return the
current session for the active thread. Many database operations are methods on
such session instances. The session will generally be removed by
TurboGears automatically.
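
A minimal sketch of that behaviour (assuming the usual imports used by the
model code; not an excerpt from the code base)::

        from kallithea.model import db
        from kallithea.model.meta import Session

        session = Session()            # create a session, or reuse the one bound to this thread
        assert Session() is session    # later calls in the same request return the same session
        repos = session.query(db.Repository).all()   # database operations are session methods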

Database model objects
(almost) always belong to a particular SQLAlchemy session, which means
that SQLAlchemy will ensure that they're kept in sync with the database
(but also means that they cannot be shared across requests).

Objects can be added to the session using ``Session().add``, but this is
rarely needed:

* When creating a database object by calling the constructor directly,
  it must explicitly be added to the session.

* When creating an object using a factory function (like
  ``create_repo``), the returned object has already (by convention)
  been added to the session, and should not be added again.

* When getting an object from the session (via ``Session().query`` or
  any of the utility functions that look up objects in the database),
  it's already part of the session, and should not be added again.
  SQLAlchemy monitors attribute modifications automatically for all
  objects it knows about and syncs them to the database.

SQLAlchemy also flushes changes to the database automatically; manually
calling ``Session().flush`` is usually only necessary when the Python
code needs the database to assign an "auto-increment" primary key ID to
a freshly created model object (before flushing, the ID attribute will
be ``None``).
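
As a sketch, the constructor case from the first bullet point could look like
this (``SomeModel`` and its ``name`` column are invented placeholders, not real
Kallithea classes)::

        from kallithea.model.meta import Session

        obj = SomeModel(name='example')   # plain constructor: not yet tracked by any session
        Session().add(obj)                # must be added explicitly
        assert obj.some_model_id is None  # "auto-increment" ID not assigned yet
        Session().flush()                 # flush: the database assigns the primary key
        assert obj.some_model_id is not None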

Debugging
^^^^^^^^^

A good way to trace what Kallithea is doing is to keep an eye on the output on
stdout/stderr of the server process. Perhaps change ``my.ini`` to log at
``DEBUG`` or ``INFO`` level, especially ``[logger_kallithea]``, but perhaps
also other loggers. It is often easier to add additional ``log`` or ``print``
statements than to use a Python debugger.
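
For example, a temporary trace statement follows the same pattern the modules
in this changeset already use (the message and variables here are just
placeholders)::

        import logging
        log = logging.getLogger(__name__)

        log.debug('parsed request: repo_name=%s action=%s', repo_name, action)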

Sometimes it is simpler to disable ``errorpage.enabled`` and perhaps also
``trace_errors.enable`` to expose raw errors instead of adding extra
processing. Enabling ``debug`` can be helpful for showing and exploring
tracebacks in the browser, but is also insecure and will add extra processing.

TurboGears2 DebugBar
^^^^^^^^^^^^^^^^^^^^

It is possible to enable the TurboGears2-provided DebugBar_, a toolbar overlaid
on the Kallithea web interface, allowing you to see:

* timing information of the current request, including profiling information
* request data, including GET data, POST data, cookies, headers and environment
  variables
* a list of executed database queries, including timing and result values

DebugBar is only activated when ``debug = true`` is set in the configuration
file. This is important, because the DebugBar toolbar will be visible for all
users, and allow them to see information they should not be allowed to see. As
is anyway the case for ``debug = true``, do not use this in production!

To enable DebugBar, install ``tgext.debugbar`` and ``kajiki`` (typically via
``pip``) and restart Kallithea (in debug mode).


Thank you for your contribution!
--------------------------------


.. _Weblate: http://weblate.org/
.. _mailing list: http://lists.sfconservancy.org/mailman/listinfo/kallithea-general
.. _kallithea-general: http://lists.sfconservancy.org/mailman/listinfo/kallithea-general
.. _Hosted Weblate: https://hosted.weblate.org/projects/kallithea/kallithea/
.. _DebugBar: https://github.com/TurboGears/tgext.debugbar
.. _Quick Start: https://www.mercurial-scm.org/wiki/QuickStart
.. _Beginners Guide: https://www.mercurial-scm.org/wiki/BeginnersGuides
@@ -1,93 +1,93 b''
# -*- coding: utf-8 -*-
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
kallithea.config.middleware.simplegit
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

SimpleGit middleware for handling Git protocol requests (push/clone etc.)
It's implemented with basic auth function

This file was forked by the Kallithea project in July 2014.
Original author and date, and relevant copyright and licensing information is below:
:created_on: Apr 28, 2010
:author: marcink
:copyright: (c) 2013 RhodeCode GmbH, and others.
:license: GPLv3, see LICENSE.md for more details.

"""


import logging
import re

from kallithea.config.middleware.pygrack import make_wsgi_app
+from kallithea.controllers import base
from kallithea.lib import hooks
-from kallithea.lib.base import BaseVCSController, get_path_info


log = logging.getLogger(__name__)


GIT_PROTO_PAT = re.compile(r'^/(.+)/(info/refs|git-upload-pack|git-receive-pack)$')


cmd_mapping = {
    'git-receive-pack': 'push',
    'git-upload-pack': 'pull',
}


-class SimpleGit(BaseVCSController):
+class SimpleGit(base.BaseVCSController):

    scm_alias = 'git'

    @classmethod
    def parse_request(cls, environ):
-        path_info = get_path_info(environ)
+        path_info = base.get_path_info(environ)
        m = GIT_PROTO_PAT.match(path_info)
        if m is None:
            return None

        class parsed_request(object):
            # See https://git-scm.com/book/en/v2/Git-Internals-Transfer-Protocols#_the_smart_protocol
            repo_name = m.group(1).rstrip('/')
            cmd = m.group(2)

            query_string = environ['QUERY_STRING']
            if cmd == 'info/refs' and query_string.startswith('service='):
                service = query_string.split('=', 1)[1]
                action = cmd_mapping.get(service)
            else:
                service = None
                action = cmd_mapping.get(cmd)

        return parsed_request

    def _make_app(self, parsed_request):
        """
        Return a pygrack wsgi application.
        """
        pygrack_app = make_wsgi_app(parsed_request.repo_name, self.basepath)

        def wrapper_app(environ, start_response):
            if (parsed_request.cmd == 'info/refs' and
                parsed_request.service == 'git-upload-pack'
            ):
                # Run hooks like Mercurial outgoing.kallithea_pull_action does
                hooks.log_pull_action()
            # Note: push hooks are handled by post-receive hook

            return pygrack_app(environ, start_response)

        return wrapper_app
@@ -1,149 +1,149 b''
# -*- coding: utf-8 -*-
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
kallithea.config.middleware.simplehg
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

SimpleHg middleware for handling Mercurial protocol requests (push/clone etc.).
It's implemented with basic auth function

This file was forked by the Kallithea project in July 2014.
Original author and date, and relevant copyright and licensing information is below:
:created_on: Apr 28, 2010
:author: marcink
:copyright: (c) 2013 RhodeCode GmbH, and others.
:license: GPLv3, see LICENSE.md for more details.

"""


import logging
import os
import urllib.parse

import mercurial.hgweb

-from kallithea.lib.base import BaseVCSController, get_path_info
+from kallithea.controllers import base
from kallithea.lib.utils import make_ui
from kallithea.lib.utils2 import safe_bytes


log = logging.getLogger(__name__)


def get_header_hgarg(environ):
    """Decode the special Mercurial encoding of big requests over multiple headers.
    >>> get_header_hgarg({})
    ''
    >>> get_header_hgarg({'HTTP_X_HGARG_0': ' ', 'HTTP_X_HGARG_1': 'a','HTTP_X_HGARG_2': '','HTTP_X_HGARG_3': 'b+c %20'})
    'ab+c %20'
    """
    chunks = []
    i = 1
    while True:
        v = environ.get('HTTP_X_HGARG_%d' % i)
        if v is None:
            break
        chunks.append(v)
        i += 1
    return ''.join(chunks)


cmd_mapping = {
    # 'batch' is not in this list - it is handled explicitly
    'between': 'pull',
    'branches': 'pull',
    'branchmap': 'pull',
    'capabilities': 'pull',
    'changegroup': 'pull',
    'changegroupsubset': 'pull',
    'changesetdata': 'pull',
    'clonebundles': 'pull',
    'debugwireargs': 'pull',
    'filedata': 'pull',
    'getbundle': 'pull',
    'getlfile': 'pull',
    'heads': 'pull',
    'hello': 'pull',
    'known': 'pull',
    'lheads': 'pull',
    'listkeys': 'pull',
    'lookup': 'pull',
    'manifestdata': 'pull',
    'narrow_widen': 'pull',
    'protocaps': 'pull',
    'statlfile': 'pull',
    'stream_out': 'pull',
    'pushkey': 'push',
    'putlfile': 'push',
    'unbundle': 'push',
}


-class SimpleHg(BaseVCSController):
+class SimpleHg(base.BaseVCSController):

    scm_alias = 'hg'

    @classmethod
    def parse_request(cls, environ):
        http_accept = environ.get('HTTP_ACCEPT', '')
        if not http_accept.startswith('application/mercurial'):
            return None
-        path_info = get_path_info(environ)
+        path_info = base.get_path_info(environ)
        if not path_info.startswith('/'):  # it must!
            return None

        class parsed_request(object):
            repo_name = path_info[1:].rstrip('/')

            query_string = environ['QUERY_STRING']

            action = None
            for qry in query_string.split('&'):
                parts = qry.split('=', 1)
                if len(parts) == 2 and parts[0] == 'cmd':
                    cmd = parts[1]
                    if cmd == 'batch':
                        hgarg = get_header_hgarg(environ)
                        if not hgarg.startswith('cmds='):
                            action = 'push'  # paranoid and safe
                            break
                        action = 'pull'
                        for cmd_arg in hgarg[5:].split(';'):
                            cmd, _args = urllib.parse.unquote_plus(cmd_arg).split(' ', 1)
                            op = cmd_mapping.get(cmd, 'push')
                            if op != 'pull':
                                assert op == 'push'
                                action = 'push'
                                break
                    else:
                        action = cmd_mapping.get(cmd, 'push')
                    break  # only process one cmd

        return parsed_request

    def _make_app(self, parsed_request):
        """
        Make an hgweb wsgi application.
        """
        repo_name = parsed_request.repo_name
        repo_path = os.path.join(self.basepath, repo_name)
        baseui = make_ui(repo_path=repo_path)
        hgweb_app = mercurial.hgweb.hgweb(safe_bytes(repo_path), name=safe_bytes(repo_name), baseui=baseui)

        def wrapper_app(environ, start_response):
            environ['REPO_NAME'] = repo_name  # used by mercurial.hgweb.hgweb
            return hgweb_app(environ, start_response)

        return wrapper_app
@@ -1,102 +1,102 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.config.middleware.wrapper
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 Wrap app to measure request and response time ... all the way to the response
 WSGI iterator has been closed.

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: May 23, 2013
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 """

 import logging
 import time

-from kallithea.lib.base import get_ip_addr, get_path_info
+from kallithea.controllers import base


 log = logging.getLogger(__name__)


 class Meter:

     def __init__(self, start_response):
         self._start_response = start_response
         self._start = time.time()
         self.status = None
         self._size = 0

     def duration(self):
         return time.time() - self._start

     def start_response(self, status, response_headers, exc_info=None):
         self.status = status
         write = self._start_response(status, response_headers, exc_info)
         def metered_write(s):
             self.measure(s)
             write(s)
         return metered_write

     def measure(self, chunk):
         self._size += len(chunk)

     def size(self):
         return self._size


 class ResultIter:

     def __init__(self, result, meter, description):
         self._result_close = getattr(result, 'close', None) or (lambda: None)
         self._next = iter(result).__next__
         self._meter = meter
         self._description = description

     def __iter__(self):
         return self

     def __next__(self):
         chunk = self._next()
         self._meter.measure(chunk)
         return chunk

     def close(self):
         self._result_close()
         log.info("%s responded %r after %.3fs with %s bytes", self._description, self._meter.status, self._meter.duration(), self._meter.size())


 class RequestWrapper(object):

     def __init__(self, app, config):
         self.application = app
         self.config = config

     def __call__(self, environ, start_response):
         meter = Meter(start_response)
         description = "Request from %s for %s" % (
-            get_ip_addr(environ),
-            get_path_info(environ),
+            base.get_ip_addr(environ),
+            base.get_path_info(environ),
         )
         log.info("%s received", description)
         try:
             result = self.application(environ, meter.start_response)
         finally:
             log.info("%s responding %r after %.3fs", description, meter.status, meter.duration())
         return ResultIter(result, meter, description)
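The hunk above only switches ``get_ip_addr`` and ``get_path_info`` to the relocated ``base`` module; the timing middleware itself is unchanged. For readers unfamiliar with the pattern, here is a minimal sketch of wrapping a toy WSGI app the same way — the ``hello_app`` function and the empty ``config`` dict are invented for the example and are not part of Kallithea::

    import logging
    from wsgiref.simple_server import make_server

    from kallithea.config.middleware.wrapper import RequestWrapper

    logging.basicConfig(level=logging.INFO)

    def hello_app(environ, start_response):
        # Trivial stand-in for the real TurboGears application.
        body = b"hello"
        start_response('200 OK', [('Content-Type', 'text/plain'),
                                  ('Content-Length', str(len(body)))])
        return [body]

    # RequestWrapper hooks start_response through Meter and wraps the response
    # iterable in ResultIter, so the final log line (status, duration, byte
    # count) is only emitted once the server closes the response iterator.
    app = RequestWrapper(hello_app, config={})

    if __name__ == '__main__':
        make_server('127.0.0.1', 8080, app).serve_forever()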
@@ -1,147 +1,147 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.admin.admin
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 Controller for Admin panel of Kallithea

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: Apr 7, 2010
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 """


 import logging

 from sqlalchemy.orm import joinedload
 from sqlalchemy.sql.expression import and_, func, or_
 from tg import request
 from tg import tmpl_context as c
 from whoosh import query
 from whoosh.qparser.dateparse import DateParserPlugin
 from whoosh.qparser.default import QueryParser

+from kallithea.controllers import base
 from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired
-from kallithea.lib.base import BaseController, render
 from kallithea.lib.indexers import JOURNAL_SCHEMA
 from kallithea.lib.page import Page
 from kallithea.lib.utils2 import remove_prefix, remove_suffix, safe_int
 from kallithea.model import db


 log = logging.getLogger(__name__)


 def _journal_filter(user_log, search_term):
     """
     Filters sqlalchemy user_log based on search_term with whoosh Query language
     http://packages.python.org/Whoosh/querylang.html

     :param user_log:
     :param search_term:
     """
     log.debug('Initial search term: %r', search_term)
     qry = None
     if search_term:
         qp = QueryParser('repository', schema=JOURNAL_SCHEMA)
         qp.add_plugin(DateParserPlugin())
         qry = qp.parse(search_term)
         log.debug('Filtering using parsed query %r', qry)

     def wildcard_handler(col, wc_term):
         if wc_term.startswith('*') and not wc_term.endswith('*'):
             # postfix == endswith
             wc_term = remove_prefix(wc_term, prefix='*')
             return func.lower(col).endswith(func.lower(wc_term))
         elif wc_term.startswith('*') and wc_term.endswith('*'):
             # wildcard == ilike
             wc_term = remove_prefix(wc_term, prefix='*')
             wc_term = remove_suffix(wc_term, suffix='*')
             return func.lower(col).contains(func.lower(wc_term))

     def get_filterion(field, val, term):

         if field == 'repository':
             field = getattr(db.UserLog, 'repository_name')
         elif field == 'ip':
             field = getattr(db.UserLog, 'user_ip')
         elif field == 'date':
             field = getattr(db.UserLog, 'action_date')
         elif field == 'username':
             field = getattr(db.UserLog, 'username')
         else:
             field = getattr(db.UserLog, field)
         log.debug('filter field: %s val=>%s', field, val)

         # sql filtering
         if isinstance(term, query.Wildcard):
             return wildcard_handler(field, val)
         elif isinstance(term, query.Prefix):
             return func.lower(field).startswith(func.lower(val))
         elif isinstance(term, query.DateRange):
             return and_(field >= val[0], field <= val[1])
         return func.lower(field) == func.lower(val)

     if isinstance(qry, (query.And, query.Term, query.Prefix, query.Wildcard,
                         query.DateRange)):
         if not isinstance(qry, query.And):
             qry = [qry]
         for term in qry:
             field = term.fieldname
             val = (term.text if not isinstance(term, query.DateRange)
                    else [term.startdate, term.enddate])
             user_log = user_log.filter(get_filterion(field, val, term))
     elif isinstance(qry, query.Or):
         filters = []
         for term in qry:
             field = term.fieldname
             val = (term.text if not isinstance(term, query.DateRange)
                    else [term.startdate, term.enddate])
             filters.append(get_filterion(field, val, term))
         user_log = user_log.filter(or_(*filters))

     return user_log


-class AdminController(BaseController):
+class AdminController(base.BaseController):

     @LoginRequired(allow_default_user=True)
     def _before(self, *args, **kwargs):
         super(AdminController, self)._before(*args, **kwargs)

     @HasPermissionAnyDecorator('hg.admin')
     def index(self):
         users_log = db.UserLog.query() \
             .options(joinedload(db.UserLog.user)) \
             .options(joinedload(db.UserLog.repository))

         # FILTERING
         c.search_term = request.GET.get('filter')
         users_log = _journal_filter(users_log, c.search_term)

         users_log = users_log.order_by(db.UserLog.action_date.desc())

         p = safe_int(request.GET.get('page'), 1)

         c.users_log = Page(users_log, page=p, items_per_page=10,
                            filter=c.search_term)

         if request.environ.get('HTTP_X_PARTIAL_XHR'):
-            return render('admin/admin_log.html')
+            return base.render('admin/admin_log.html')

-        return render('admin/admin.html')
+        return base.render('admin/admin.html')
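The ``_journal_filter`` helper above translates a Whoosh query string from the admin journal's filter box into SQLAlchemy filters. A rough standalone sketch of what the parsing step produces — the schema below is a stand-in for illustration only; the real field definitions live in ``kallithea.lib.indexers.JOURNAL_SCHEMA`` and may differ::

    from whoosh.fields import DATETIME, ID, Schema
    from whoosh.qparser.dateparse import DateParserPlugin
    from whoosh.qparser.default import QueryParser

    # Hypothetical stand-in for JOURNAL_SCHEMA.
    demo_schema = Schema(username=ID(), repository=ID(), ip=ID(), date=DATETIME())

    qp = QueryParser('repository', schema=demo_schema)
    qp.add_plugin(DateParserPlugin())

    # A bare term searches the default 'repository' field; 'field:value' picks
    # another field; a trailing '*' yields a Prefix/Wildcard term, which
    # _journal_filter maps to SQL startswith/contains/endswith filters.
    parsed = qp.parse('username:admin kallithea*')
    print(repr(parsed))
    # roughly: And([Term('username', 'admin'), Prefix('repository', 'kallithea')])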
@@ -1,148 +1,148 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.admin.auth_settings
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 pluggable authentication controller for Kallithea

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: Nov 26, 2010
 :author: akesterson
 """

 import logging
 import traceback

 import formencode.htmlfill
 from tg import request
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _
 from webob.exc import HTTPFound

+from kallithea.controllers import base
 from kallithea.lib import auth_modules, webutils
 from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired
-from kallithea.lib.base import BaseController, render
 from kallithea.lib.webutils import url
 from kallithea.model import db, meta
 from kallithea.model.forms import AuthSettingsForm


 log = logging.getLogger(__name__)


-class AuthSettingsController(BaseController):
+class AuthSettingsController(base.BaseController):

     @LoginRequired()
     @HasPermissionAnyDecorator('hg.admin')
     def _before(self, *args, **kwargs):
         super(AuthSettingsController, self)._before(*args, **kwargs)

     def __load_defaults(self):
         c.available_plugins = [
             'kallithea.lib.auth_modules.auth_internal',
             'kallithea.lib.auth_modules.auth_container',
             'kallithea.lib.auth_modules.auth_ldap',
             'kallithea.lib.auth_modules.auth_crowd',
             'kallithea.lib.auth_modules.auth_pam'
         ]
         self.enabled_plugins = auth_modules.get_auth_plugins()
         c.enabled_plugin_names = [plugin.__class__.__module__ for plugin in self.enabled_plugins]

     def __render(self, defaults, errors):
         c.defaults = {}
         c.plugin_settings = {}
         c.plugin_shortnames = {}

         for plugin in self.enabled_plugins:
             module = plugin.__class__.__module__
             c.plugin_shortnames[module] = plugin.name
             c.plugin_settings[module] = plugin.plugin_settings()
             for v in c.plugin_settings[module]:
                 fullname = "auth_%s_%s" % (plugin.name, v["name"])
                 if "default" in v:
                     c.defaults[fullname] = v["default"]
                 # Current values will be the default on the form, if there are any
                 setting = db.Setting.get_by_name(fullname)
                 if setting is not None:
                     c.defaults[fullname] = setting.app_settings_value
         if defaults:
             c.defaults.update(defaults)

         # we want to show , separated list of enabled plugins
         c.defaults['auth_plugins'] = ','.join(c.enabled_plugin_names)

         log.debug('defaults: %s', defaults)
         return formencode.htmlfill.render(
-            render('admin/auth/auth_settings.html'),
+            base.render('admin/auth/auth_settings.html'),
             defaults=c.defaults,
             errors=errors,
             prefix_error=False,
             encoding="UTF-8",
             force_defaults=False)

     def index(self):
         self.__load_defaults()
         return self.__render(defaults=None, errors=None)

     def auth_settings(self):
         """POST create and store auth settings"""
         self.__load_defaults()
         log.debug("POST Result: %s", dict(request.POST))

         # First, parse only the plugin list (not the plugin settings).
         _auth_plugins_validator = AuthSettingsForm([]).fields['auth_plugins']
         try:
             new_enabled_plugins = _auth_plugins_validator.to_python(request.POST.get('auth_plugins'))
         except formencode.Invalid:
             # User provided an invalid plugin list. Just fall back to
             # the list of currently enabled plugins. (We'll re-validate
             # and show an error message to the user, below.)
             pass
         else:
             # Hide plugins that the user has asked to be disabled, but
             # do not show plugins that the user has asked to be enabled
             # (yet), since that'll cause validation errors and/or wrong
             # settings being applied (e.g. checkboxes being cleared),
             # since the plugin settings will not be in the POST data.
             c.enabled_plugin_names = [p for p in c.enabled_plugin_names if p in new_enabled_plugins]

         # Next, parse everything including plugin settings.
         _form = AuthSettingsForm(c.enabled_plugin_names)()

         try:
             form_result = _form.to_python(dict(request.POST))
             for k, v in form_result.items():
                 if k == 'auth_plugins':
                     # we want to store it comma separated inside our settings
                     v = ','.join(v)
                 log.debug("%s = %s", k, str(v))
                 setting = db.Setting.create_or_update(k, v)
             meta.Session().commit()
             webutils.flash(_('Auth settings updated successfully'),
                            category='success')
         except formencode.Invalid as errors:
             log.error(traceback.format_exc())
             e = errors.error_dict or {}
             return self.__render(
                 defaults=errors.value,
                 errors=e,
             )
         except Exception:
             log.error(traceback.format_exc())
             webutils.flash(_('error occurred during update of auth settings'),
                            category='error')

         raise HTTPFound(location=url('auth_home'))
@@ -1,91 +1,91 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.admin.defaults
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 default settings controller for Kallithea

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: Apr 27, 2010
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 """

 import logging
 import traceback

 import formencode
 from formencode import htmlfill
 from tg import request
 from tg.i18n import ugettext as _
 from webob.exc import HTTPFound

+from kallithea.controllers import base
 from kallithea.lib import webutils
 from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired
-from kallithea.lib.base import BaseController, render
 from kallithea.lib.webutils import url
 from kallithea.model import db, meta
 from kallithea.model.forms import DefaultsForm


 log = logging.getLogger(__name__)


-class DefaultsController(BaseController):
+class DefaultsController(base.BaseController):

     @LoginRequired()
     @HasPermissionAnyDecorator('hg.admin')
     def _before(self, *args, **kwargs):
         super(DefaultsController, self)._before(*args, **kwargs)

     def index(self, format='html'):
         defaults = db.Setting.get_default_repo_settings()

         return htmlfill.render(
-            render('admin/defaults/defaults.html'),
+            base.render('admin/defaults/defaults.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False
         )

     def update(self, id):
         _form = DefaultsForm()()

         try:
             form_result = _form.to_python(dict(request.POST))
             for k, v in form_result.items():
                 setting = db.Setting.create_or_update(k, v)
             meta.Session().commit()
             webutils.flash(_('Default settings updated successfully'),
                            category='success')

         except formencode.Invalid as errors:
             defaults = errors.value

             return htmlfill.render(
-                render('admin/defaults/defaults.html'),
+                base.render('admin/defaults/defaults.html'),
                 defaults=defaults,
                 errors=errors.error_dict or {},
                 prefix_error=False,
                 encoding="UTF-8",
                 force_defaults=False)
         except Exception:
             log.error(traceback.format_exc())
             webutils.flash(_('Error occurred during update of defaults'),
                            category='error')

         raise HTTPFound(location=url('defaults'))
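The controllers above all follow the same formencode pattern: render the template once, then let ``htmlfill`` fill the form back in with the submitted values and any validation errors. A minimal standalone illustration of that call follows — the ``<form>`` markup and field name are invented for the example; only the keyword arguments mirror the controllers above::

    from formencode import htmlfill

    form_html = '''
    <form action="/admin/defaults" method="post">
      <input type="text" name="default_repo_type" />
      <form:error name="default_repo_type" />
    </form>
    '''

    # Fills value="hg" into the input and replaces the <form:error> placeholder
    # with the formatted error message.
    print(htmlfill.render(
        form_html,
        defaults={'default_repo_type': 'hg'},
        errors={'default_repo_type': 'Invalid repository type'},
        prefix_error=False,
        encoding="UTF-8",
        force_defaults=False,
    ))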
@@ -1,265 +1,265 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.admin.gists |
|
15 | kallithea.controllers.admin.gists | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | gist controller for Kallithea |
|
18 | gist controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: May 9, 2013 |
|
22 | :created_on: May 9, 2013 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | import formencode.htmlfill |
|
31 | import formencode.htmlfill | |
32 | from sqlalchemy.sql.expression import or_ |
|
32 | from sqlalchemy.sql.expression import or_ | |
33 | from tg import request, response |
|
33 | from tg import request, response | |
34 | from tg import tmpl_context as c |
|
34 | from tg import tmpl_context as c | |
35 | from tg.i18n import ugettext as _ |
|
35 | from tg.i18n import ugettext as _ | |
36 | from webob.exc import HTTPForbidden, HTTPFound, HTTPNotFound |
|
36 | from webob.exc import HTTPForbidden, HTTPFound, HTTPNotFound | |
37 |
|
37 | |||
|
38 | from kallithea.controllers import base | |||
38 | from kallithea.lib import auth, webutils |
|
39 | from kallithea.lib import auth, webutils | |
39 | from kallithea.lib.auth import LoginRequired |
|
40 | from kallithea.lib.auth import LoginRequired | |
40 | from kallithea.lib.base import BaseController, jsonify, render |
|
|||
41 | from kallithea.lib.page import Page |
|
41 | from kallithea.lib.page import Page | |
42 | from kallithea.lib.utils2 import safe_int, safe_str, time_to_datetime |
|
42 | from kallithea.lib.utils2 import safe_int, safe_str, time_to_datetime | |
43 | from kallithea.lib.vcs.exceptions import NodeNotChangedError, VCSError |
|
43 | from kallithea.lib.vcs.exceptions import NodeNotChangedError, VCSError | |
44 | from kallithea.lib.webutils import url |
|
44 | from kallithea.lib.webutils import url | |
45 | from kallithea.model import db, meta |
|
45 | from kallithea.model import db, meta | |
46 | from kallithea.model.forms import GistForm |
|
46 | from kallithea.model.forms import GistForm | |
47 | from kallithea.model.gist import GistModel |
|
47 | from kallithea.model.gist import GistModel | |
48 |
|
48 | |||
49 |
|
49 | |||
50 | log = logging.getLogger(__name__) |
|
50 | log = logging.getLogger(__name__) | |
51 |
|
51 | |||
52 |
|
52 | |||
53 | class GistsController(BaseController): |
|
53 | class GistsController(base.BaseController): | |
54 |
|
54 | |||
55 | def __load_defaults(self, extra_values=None): |
|
55 | def __load_defaults(self, extra_values=None): | |
56 | c.lifetime_values = [ |
|
56 | c.lifetime_values = [ | |
57 | (str(-1), _('Forever')), |
|
57 | (str(-1), _('Forever')), | |
58 | (str(5), _('5 minutes')), |
|
58 | (str(5), _('5 minutes')), | |
59 | (str(60), _('1 hour')), |
|
59 | (str(60), _('1 hour')), | |
60 | (str(60 * 24), _('1 day')), |
|
60 | (str(60 * 24), _('1 day')), | |
61 | (str(60 * 24 * 30), _('1 month')), |
|
61 | (str(60 * 24 * 30), _('1 month')), | |
62 | ] |
|
62 | ] | |
63 | if extra_values: |
|
63 | if extra_values: | |
64 | c.lifetime_values.append(extra_values) |
|
64 | c.lifetime_values.append(extra_values) | |
65 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] |
|
65 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] | |
66 |
|
66 | |||
67 | @LoginRequired(allow_default_user=True) |
|
67 | @LoginRequired(allow_default_user=True) | |
68 | def index(self): |
|
68 | def index(self): | |
69 | not_default_user = not request.authuser.is_default_user |
|
69 | not_default_user = not request.authuser.is_default_user | |
70 | c.show_private = request.GET.get('private') and not_default_user |
|
70 | c.show_private = request.GET.get('private') and not_default_user | |
71 | c.show_public = request.GET.get('public') and not_default_user |
|
71 | c.show_public = request.GET.get('public') and not_default_user | |
72 | url_params = {} |
|
72 | url_params = {} | |
73 | if c.show_public: |
|
73 | if c.show_public: | |
74 | url_params['public'] = 1 |
|
74 | url_params['public'] = 1 | |
75 | elif c.show_private: |
|
75 | elif c.show_private: | |
76 | url_params['private'] = 1 |
|
76 | url_params['private'] = 1 | |
77 |
|
77 | |||
78 | gists = db.Gist().query() \ |
|
78 | gists = db.Gist().query() \ | |
79 | .filter_by(is_expired=False) \ |
|
79 | .filter_by(is_expired=False) \ | |
80 | .order_by(db.Gist.created_on.desc()) |
|
80 | .order_by(db.Gist.created_on.desc()) | |
81 |
|
81 | |||
82 | # MY private |
|
82 | # MY private | |
83 | if c.show_private and not c.show_public: |
|
83 | if c.show_private and not c.show_public: | |
84 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PRIVATE) \ |
|
84 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PRIVATE) \ | |
85 | .filter(db.Gist.owner_id == request.authuser.user_id) |
|
85 | .filter(db.Gist.owner_id == request.authuser.user_id) | |
86 | # MY public |
|
86 | # MY public | |
87 | elif c.show_public and not c.show_private: |
|
87 | elif c.show_public and not c.show_private: | |
88 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PUBLIC) \ |
|
88 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PUBLIC) \ | |
89 | .filter(db.Gist.owner_id == request.authuser.user_id) |
|
89 | .filter(db.Gist.owner_id == request.authuser.user_id) | |
90 |
|
90 | |||
91 | # MY public+private |
|
91 | # MY public+private | |
92 | elif c.show_private and c.show_public: |
|
92 | elif c.show_private and c.show_public: | |
93 | gists = gists.filter(or_(db.Gist.gist_type == db.Gist.GIST_PUBLIC, |
|
93 | gists = gists.filter(or_(db.Gist.gist_type == db.Gist.GIST_PUBLIC, | |
94 | db.Gist.gist_type == db.Gist.GIST_PRIVATE)) \ |
|
94 | db.Gist.gist_type == db.Gist.GIST_PRIVATE)) \ | |
95 | .filter(db.Gist.owner_id == request.authuser.user_id) |
|
95 | .filter(db.Gist.owner_id == request.authuser.user_id) | |
96 |
|
96 | |||
97 | # default show ALL public gists |
|
97 | # default show ALL public gists | |
98 | if not c.show_public and not c.show_private: |
|
98 | if not c.show_public and not c.show_private: | |
99 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PUBLIC) |
|
99 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PUBLIC) | |
100 |
|
100 | |||
101 | c.gists = gists |
|
101 | c.gists = gists | |
102 | p = safe_int(request.GET.get('page'), 1) |
|
102 | p = safe_int(request.GET.get('page'), 1) | |
103 | c.gists_pager = Page(c.gists, page=p, items_per_page=10, |
|
103 | c.gists_pager = Page(c.gists, page=p, items_per_page=10, | |
104 | **url_params) |
|
104 | **url_params) | |
105 | return render('admin/gists/index.html') |
|
105 | return base.render('admin/gists/index.html') | |
106 |
|
106 | |||
107 | @LoginRequired() |
|
107 | @LoginRequired() | |
108 | def create(self): |
|
108 | def create(self): | |
109 | self.__load_defaults() |
|
109 | self.__load_defaults() | |
110 | gist_form = GistForm([x[0] for x in c.lifetime_values])() |
|
110 | gist_form = GistForm([x[0] for x in c.lifetime_values])() | |
111 | try: |
|
111 | try: | |
112 | form_result = gist_form.to_python(dict(request.POST)) |
|
112 | form_result = gist_form.to_python(dict(request.POST)) | |
113 | # TODO: multiple files support, from the form |
|
113 | # TODO: multiple files support, from the form | |
114 | filename = form_result['filename'] or db.Gist.DEFAULT_FILENAME |
|
114 | filename = form_result['filename'] or db.Gist.DEFAULT_FILENAME | |
115 | nodes = { |
|
115 | nodes = { | |
116 | filename: { |
|
116 | filename: { | |
117 | 'content': form_result['content'], |
|
117 | 'content': form_result['content'], | |
118 | 'lexer': form_result['mimetype'] # None is autodetect |
|
118 | 'lexer': form_result['mimetype'] # None is autodetect | |
119 | } |
|
119 | } | |
120 | } |
|
120 | } | |
121 | _public = form_result['public'] |
|
121 | _public = form_result['public'] | |
122 | gist_type = db.Gist.GIST_PUBLIC if _public else db.Gist.GIST_PRIVATE |
|
122 | gist_type = db.Gist.GIST_PUBLIC if _public else db.Gist.GIST_PRIVATE | |
123 | gist = GistModel().create( |
|
123 | gist = GistModel().create( | |
124 | description=form_result['description'], |
|
124 | description=form_result['description'], | |
125 | owner=request.authuser.user_id, |
|
125 | owner=request.authuser.user_id, | |
126 | ip_addr=request.ip_addr, |
|
126 | ip_addr=request.ip_addr, | |
127 | gist_mapping=nodes, |
|
127 | gist_mapping=nodes, | |
128 | gist_type=gist_type, |
|
128 | gist_type=gist_type, | |
129 | lifetime=form_result['lifetime'] |
|
129 | lifetime=form_result['lifetime'] | |
130 | ) |
|
130 | ) | |
131 | meta.Session().commit() |
|
131 | meta.Session().commit() | |
132 | new_gist_id = gist.gist_access_id |
|
132 | new_gist_id = gist.gist_access_id | |
133 | except formencode.Invalid as errors: |
|
133 | except formencode.Invalid as errors: | |
134 | defaults = errors.value |
|
134 | defaults = errors.value | |
135 |
|
135 | |||
136 | return formencode.htmlfill.render( |
|
136 | return formencode.htmlfill.render( | |
137 | render('admin/gists/new.html'), |
|
137 | base.render('admin/gists/new.html'), | |
138 | defaults=defaults, |
|
138 | defaults=defaults, | |
139 | errors=errors.error_dict or {}, |
|
139 | errors=errors.error_dict or {}, | |
140 | prefix_error=False, |
|
140 | prefix_error=False, | |
141 | encoding="UTF-8", |
|
141 | encoding="UTF-8", | |
142 | force_defaults=False) |
|
142 | force_defaults=False) | |
143 |
|
143 | |||
144 | except Exception as e: |
|
144 | except Exception as e: | |
145 | log.error(traceback.format_exc()) |
|
145 | log.error(traceback.format_exc()) | |
146 | webutils.flash(_('Error occurred during gist creation'), category='error') |
|
146 | webutils.flash(_('Error occurred during gist creation'), category='error') | |
147 | raise HTTPFound(location=url('new_gist')) |
|
147 | raise HTTPFound(location=url('new_gist')) | |
148 | raise HTTPFound(location=url('gist', gist_id=new_gist_id)) |
|
148 | raise HTTPFound(location=url('gist', gist_id=new_gist_id)) | |
149 |
|
149 | |||
150 | @LoginRequired() |
|
150 | @LoginRequired() | |
151 | def new(self, format='html'): |
|
151 | def new(self, format='html'): | |
152 | self.__load_defaults() |
|
152 | self.__load_defaults() | |
153 | return render('admin/gists/new.html') |
|
153 | return base.render('admin/gists/new.html') | |
154 |
|
154 | |||
155 | @LoginRequired() |
|
155 | @LoginRequired() | |
156 | def delete(self, gist_id): |
|
156 | def delete(self, gist_id): | |
157 | gist = GistModel().get_gist(gist_id) |
|
157 | gist = GistModel().get_gist(gist_id) | |
158 | owner = gist.owner_id == request.authuser.user_id |
|
158 | owner = gist.owner_id == request.authuser.user_id | |
159 | if auth.HasPermissionAny('hg.admin')() or owner: |
|
159 | if auth.HasPermissionAny('hg.admin')() or owner: | |
160 | GistModel().delete(gist) |
|
160 | GistModel().delete(gist) | |
161 | meta.Session().commit() |
|
161 | meta.Session().commit() | |
162 | webutils.flash(_('Deleted gist %s') % gist.gist_access_id, category='success') |
|
162 | webutils.flash(_('Deleted gist %s') % gist.gist_access_id, category='success') | |
163 | else: |
|
163 | else: | |
164 | raise HTTPForbidden() |
|
164 | raise HTTPForbidden() | |
165 |
|
165 | |||
166 | raise HTTPFound(location=url('gists')) |
|
166 | raise HTTPFound(location=url('gists')) | |
167 |
|
167 | |||
168 | @LoginRequired(allow_default_user=True) |
|
168 | @LoginRequired(allow_default_user=True) | |
169 | def show(self, gist_id, revision='tip', format='html', f_path=None): |
|
169 | def show(self, gist_id, revision='tip', format='html', f_path=None): | |
170 | c.gist = db.Gist.get_or_404(gist_id) |
|
170 | c.gist = db.Gist.get_or_404(gist_id) | |
171 |
|
171 | |||
172 | if c.gist.is_expired: |
|
172 | if c.gist.is_expired: | |
173 | log.error('Gist expired at %s', |
|
173 | log.error('Gist expired at %s', | |
174 | time_to_datetime(c.gist.gist_expires)) |
|
174 | time_to_datetime(c.gist.gist_expires)) | |
175 | raise HTTPNotFound() |
|
175 | raise HTTPNotFound() | |
176 | try: |
|
176 | try: | |
177 | c.file_changeset, c.files = GistModel().get_gist_files(gist_id, |
|
177 | c.file_changeset, c.files = GistModel().get_gist_files(gist_id, | |
178 | revision=revision) |
|
178 | revision=revision) | |
179 | except VCSError: |
|
179 | except VCSError: | |
180 | log.error(traceback.format_exc()) |
|
180 | log.error(traceback.format_exc()) | |
181 | raise HTTPNotFound() |
|
181 | raise HTTPNotFound() | |
182 | if format == 'raw': |
|
182 | if format == 'raw': | |
183 | content = '\n\n'.join( |
|
183 | content = '\n\n'.join( | |
184 | safe_str(f.content) |
|
184 | safe_str(f.content) | |
185 | for f in c.files if (f_path is None or f.path == f_path) |
|
185 | for f in c.files if (f_path is None or f.path == f_path) | |
186 | ) |
|
186 | ) | |
187 | response.content_type = 'text/plain' |
|
187 | response.content_type = 'text/plain' | |
188 | return content |
|
188 | return content | |
189 | return render('admin/gists/show.html') |
|
189 | return base.render('admin/gists/show.html') | |
190 |
|
190 | |||
191 | @LoginRequired() |
|
191 | @LoginRequired() | |
192 | def edit(self, gist_id, format='html'): |
|
192 | def edit(self, gist_id, format='html'): | |
193 | c.gist = db.Gist.get_or_404(gist_id) |
|
193 | c.gist = db.Gist.get_or_404(gist_id) | |
194 |
|
194 | |||
195 | if c.gist.is_expired: |
|
195 | if c.gist.is_expired: | |
196 | log.error('Gist expired at %s', |
|
196 | log.error('Gist expired at %s', | |
197 | time_to_datetime(c.gist.gist_expires)) |
|
197 | time_to_datetime(c.gist.gist_expires)) | |
198 | raise HTTPNotFound() |
|
198 | raise HTTPNotFound() | |
199 | try: |
|
199 | try: | |
200 | c.file_changeset, c.files = GistModel().get_gist_files(gist_id) |
|
200 | c.file_changeset, c.files = GistModel().get_gist_files(gist_id) | |
201 | except VCSError: |
|
201 | except VCSError: | |
202 | log.error(traceback.format_exc()) |
|
202 | log.error(traceback.format_exc()) | |
203 | raise HTTPNotFound() |
|
203 | raise HTTPNotFound() | |
204 |
|
204 | |||
205 | self.__load_defaults(extra_values=('0', _('Unmodified'))) |
|
         self.__load_defaults(extra_values=('0', _('Unmodified')))
-        rendered = render('admin/gists/edit.html')
+        rendered = base.render('admin/gists/edit.html')
 
         if request.POST:
             rpost = request.POST
             nodes = {}
             for org_filename, filename, mimetype, content in zip(
                                                     rpost.getall('org_files'),
                                                     rpost.getall('files'),
                                                     rpost.getall('mimetypes'),
                                                     rpost.getall('contents')):
 
                 nodes[org_filename] = {
                     'org_filename': org_filename,
                     'filename': filename,
                     'content': content,
                     'lexer': mimetype,
                 }
             try:
                 GistModel().update(
                     gist=c.gist,
                     description=rpost['description'],
                     owner=c.gist.owner,  # FIXME: request.authuser.user_id ?
                     ip_addr=request.ip_addr,
                     gist_mapping=nodes,
                     gist_type=c.gist.gist_type,
                     lifetime=rpost['lifetime']
                 )
 
                 meta.Session().commit()
                 webutils.flash(_('Successfully updated gist content'), category='success')
             except NodeNotChangedError:
                 # raised if nothing was changed in repo itself. We anyway then
                 # store only DB stuff for gist
                 meta.Session().commit()
                 webutils.flash(_('Successfully updated gist data'), category='success')
             except Exception:
                 log.error(traceback.format_exc())
                 webutils.flash(_('Error occurred during update of gist %s') % gist_id,
                         category='error')
 
             raise HTTPFound(location=url('gist', gist_id=gist_id))
 
         return rendered
 
     @LoginRequired()
-    @jsonify
+    @base.jsonify
     def check_revision(self, gist_id):
         c.gist = db.Gist.get_or_404(gist_id)
         last_rev = c.gist.scm_instance.get_changeset()
         success = True
         revision = request.POST.get('revision')
 
         # TODO: maybe move this to model ?
         if revision != last_rev.raw_id:
             log.error('Last revision %s is different than submitted %s',
                       revision, last_rev)
             # our gist has newer version than we
             success = False
 
         return {'success': success}
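The change above is the same one applied to every controller in this changeset: instead of importing `render`, `jsonify`, `BaseController` and friends name by name from `kallithea.lib.base`, the controller imports the `base` module from `kallithea.controllers` once and spells the helpers with the module prefix. A minimal sketch of the resulting controller style, assuming the helpers visible in the diff (`base.BaseController`, `base.render`, `base.jsonify`); the `ExampleController` class and its template path are hypothetical, purely for illustration:

from kallithea.controllers import base  # one module import instead of name-by-name imports
from kallithea.lib.auth import LoginRequired


class ExampleController(base.BaseController):  # hypothetical controller, for illustration only

    @LoginRequired()
    def _before(self, *args, **kwargs):
        super(ExampleController, self)._before(*args, **kwargs)

    def show(self):
        # template rendering is now module-qualified
        return base.render('admin/example/example.html')  # hypothetical template path

    @LoginRequired()
    @base.jsonify  # previously imported bare as 'jsonify'
    def check(self):
        return {'success': True}

The module-qualified spelling keeps a single import line per controller and makes it obvious at each call site that the helper comes from the controllers' shared base module.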
@@ -1,289 +1,289 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.admin.my_account
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 my account controller for Kallithea admin
 
 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: August 20, 2013
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 """
 
 import logging
 import traceback
 
 import formencode
 from formencode import htmlfill
 from tg import request
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _
 from webob.exc import HTTPFound
 
+from kallithea.controllers import base
 from kallithea.lib import auth_modules, webutils
 from kallithea.lib.auth import AuthUser, LoginRequired
-from kallithea.lib.base import BaseController, IfSshEnabled, render
 from kallithea.lib.utils2 import generate_api_key, safe_int
 from kallithea.lib.webutils import url
 from kallithea.model import db, meta
 from kallithea.model.api_key import ApiKeyModel
 from kallithea.model.forms import PasswordChangeForm, UserForm
 from kallithea.model.repo import RepoModel
 from kallithea.model.ssh_key import SshKeyModel, SshKeyModelException
 from kallithea.model.user import UserModel
 
 
 log = logging.getLogger(__name__)
 
 
-class MyAccountController(BaseController):
+class MyAccountController(base.BaseController):
 
     @LoginRequired()
     def _before(self, *args, **kwargs):
         super(MyAccountController, self)._before(*args, **kwargs)
 
     def __load_data(self):
         c.user = db.User.get(request.authuser.user_id)
         if c.user.is_default_user:
             webutils.flash(_("You can't edit this user since it's"
                    " crucial for entire application"), category='warning')
             raise HTTPFound(location=url('users'))
 
     def _load_my_repos_data(self, watched=False):
         if watched:
             admin = False
             repos_list = meta.Session().query(db.Repository) \
                 .join(db.UserFollowing) \
                 .filter(db.UserFollowing.user_id ==
                         request.authuser.user_id).all()
         else:
             admin = True
             repos_list = meta.Session().query(db.Repository) \
                 .filter(db.Repository.owner_id ==
                         request.authuser.user_id).all()
 
         return RepoModel().get_repos_as_dict(repos_list, admin=admin)
 
     def my_account(self):
         c.active = 'profile'
         self.__load_data()
         c.perm_user = AuthUser(user_id=request.authuser.user_id)
         managed_fields = auth_modules.get_managed_fields(c.user)
         def_user_perms = AuthUser(dbuser=db.User.get_default_user()).global_permissions
         if 'hg.register.none' in def_user_perms:
             managed_fields.extend(['username', 'firstname', 'lastname', 'email'])
 
         c.readonly = lambda n: 'readonly' if n in managed_fields else None
 
         defaults = c.user.get_dict()
         update = False
         if request.POST:
             _form = UserForm(edit=True,
                              old_data={'user_id': request.authuser.user_id,
                                        'email': request.authuser.email})()
             form_result = {}
             try:
                 post_data = dict(request.POST)
                 post_data['new_password'] = ''
                 post_data['password_confirmation'] = ''
                 form_result = _form.to_python(post_data)
                 # skip updating those attrs for my account
                 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
                               'new_password', 'password_confirmation',
                              ] + managed_fields
 
                 UserModel().update(request.authuser.user_id, form_result,
                                    skip_attrs=skip_attrs)
                 webutils.flash(_('Your account was updated successfully'),
                         category='success')
                 meta.Session().commit()
                 update = True
 
             except formencode.Invalid as errors:
                 return htmlfill.render(
-                    render('admin/my_account/my_account.html'),
+                    base.render('admin/my_account/my_account.html'),
                     defaults=errors.value,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8",
                     force_defaults=False)
             except Exception:
                 log.error(traceback.format_exc())
                 webutils.flash(_('Error occurred during update of user %s')
                         % form_result.get('username'), category='error')
         if update:
             raise HTTPFound(location='my_account')
         return htmlfill.render(
-            render('admin/my_account/my_account.html'),
+            base.render('admin/my_account/my_account.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False)
 
     def my_account_password(self):
         c.active = 'password'
         self.__load_data()
 
         managed_fields = auth_modules.get_managed_fields(c.user)
         c.can_change_password = 'password' not in managed_fields
 
         if request.POST and c.can_change_password:
             _form = PasswordChangeForm(request.authuser.username)()
             try:
                 form_result = _form.to_python(request.POST)
                 UserModel().update(request.authuser.user_id, form_result)
                 meta.Session().commit()
                 webutils.flash(_("Successfully updated password"), category='success')
             except formencode.Invalid as errors:
                 return htmlfill.render(
-                    render('admin/my_account/my_account.html'),
+                    base.render('admin/my_account/my_account.html'),
                     defaults=errors.value,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8",
                     force_defaults=False)
             except Exception:
                 log.error(traceback.format_exc())
                 webutils.flash(_('Error occurred during update of user password'),
                         category='error')
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
     def my_account_repos(self):
         c.active = 'repos'
         self.__load_data()
 
         # data used to render the grid
         c.data = self._load_my_repos_data()
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
     def my_account_watched(self):
         c.active = 'watched'
         self.__load_data()
 
         # data used to render the grid
         c.data = self._load_my_repos_data(watched=True)
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
     def my_account_perms(self):
         c.active = 'perms'
         self.__load_data()
         c.perm_user = AuthUser(user_id=request.authuser.user_id)
 
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
     def my_account_emails(self):
         c.active = 'emails'
         self.__load_data()
 
         c.user_email_map = db.UserEmailMap.query() \
             .filter(db.UserEmailMap.user == c.user).all()
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
     def my_account_emails_add(self):
         email = request.POST.get('new_email')
 
         try:
             UserModel().add_extra_email(request.authuser.user_id, email)
             meta.Session().commit()
             webutils.flash(_("Added email %s to user") % email, category='success')
         except formencode.Invalid as error:
             msg = error.error_dict['email']
             webutils.flash(msg, category='error')
         except Exception:
             log.error(traceback.format_exc())
             webutils.flash(_('An error occurred during email saving'),
                     category='error')
         raise HTTPFound(location=url('my_account_emails'))
 
     def my_account_emails_delete(self):
         email_id = request.POST.get('del_email_id')
         user_model = UserModel()
         user_model.delete_extra_email(request.authuser.user_id, email_id)
         meta.Session().commit()
         webutils.flash(_("Removed email from user"), category='success')
         raise HTTPFound(location=url('my_account_emails'))
 
     def my_account_api_keys(self):
         c.active = 'api_keys'
         self.__load_data()
         show_expired = True
         c.lifetime_values = [
             (str(-1), _('Forever')),
             (str(5), _('5 minutes')),
             (str(60), _('1 hour')),
             (str(60 * 24), _('1 day')),
             (str(60 * 24 * 30), _('1 month')),
         ]
         c.lifetime_options = [(c.lifetime_values, _("Lifetime"))]
         c.user_api_keys = ApiKeyModel().get_api_keys(request.authuser.user_id,
                                                      show_expired=show_expired)
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
     def my_account_api_keys_add(self):
         lifetime = safe_int(request.POST.get('lifetime'), -1)
         description = request.POST.get('description')
         ApiKeyModel().create(request.authuser.user_id, description, lifetime)
         meta.Session().commit()
         webutils.flash(_("API key successfully created"), category='success')
         raise HTTPFound(location=url('my_account_api_keys'))
 
     def my_account_api_keys_delete(self):
         api_key = request.POST.get('del_api_key')
         if request.POST.get('del_api_key_builtin'):
             user = db.User.get(request.authuser.user_id)
             user.api_key = generate_api_key()
             meta.Session().commit()
             webutils.flash(_("API key successfully reset"), category='success')
         elif api_key:
             ApiKeyModel().delete(api_key, request.authuser.user_id)
             meta.Session().commit()
             webutils.flash(_("API key successfully deleted"), category='success')
 
         raise HTTPFound(location=url('my_account_api_keys'))
 
-    @IfSshEnabled
+    @base.IfSshEnabled
     def my_account_ssh_keys(self):
         c.active = 'ssh_keys'
         self.__load_data()
         c.user_ssh_keys = SshKeyModel().get_ssh_keys(request.authuser.user_id)
-        return render('admin/my_account/my_account.html')
+        return base.render('admin/my_account/my_account.html')
 
-    @IfSshEnabled
+    @base.IfSshEnabled
     def my_account_ssh_keys_add(self):
         description = request.POST.get('description')
         public_key = request.POST.get('public_key')
         try:
             new_ssh_key = SshKeyModel().create(request.authuser.user_id,
                                                description, public_key)
             meta.Session().commit()
             SshKeyModel().write_authorized_keys()
             webutils.flash(_("SSH key %s successfully added") % new_ssh_key.fingerprint, category='success')
         except SshKeyModelException as e:
             webutils.flash(e.args[0], category='error')
         raise HTTPFound(location=url('my_account_ssh_keys'))
 
-    @IfSshEnabled
+    @base.IfSshEnabled
     def my_account_ssh_keys_delete(self):
         fingerprint = request.POST.get('del_public_key_fingerprint')
         try:
             SshKeyModel().delete(fingerprint, request.authuser.user_id)
             meta.Session().commit()
             SshKeyModel().write_authorized_keys()
             webutils.flash(_("SSH key successfully deleted"), category='success')
         except SshKeyModelException as e:
             webutils.flash(e.args[0], category='error')
         raise HTTPFound(location=url('my_account_ssh_keys'))
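All of the POST handlers in this controller follow the same flow: build a formencode form, validate `request.POST` with `to_python()`, commit and flash on success, and on `formencode.Invalid` re-render the same template through `htmlfill.render` so the submitted values and per-field errors are filled back into the form. A condensed sketch of that flow, assuming the helpers used above; `form` and `template` stand in for a concrete form class (such as `UserForm`) and a template path, and the success message text is made up:

import formencode
from formencode import htmlfill
from tg import request
from tg.i18n import ugettext as _

from kallithea.controllers import base
from kallithea.lib import webutils
from kallithea.model import meta


def handle_post(form, template):
    # Validate the submitted data; formencode raises Invalid on any field error.
    try:
        form_result = form.to_python(dict(request.POST))
        # ... apply form_result through the relevant model here ...
        meta.Session().commit()
        webutils.flash(_('Updated successfully'), category='success')  # illustrative message
    except formencode.Invalid as errors:
        # Re-render the same page with the user's input and the field errors filled in.
        return htmlfill.render(
            base.render(template),
            defaults=errors.value,
            errors=errors.error_dict or {},
            prefix_error=False,
            encoding="UTF-8",
            force_defaults=False)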
@@ -1,182 +1,182 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.admin.permissions
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 permissions controller for Kallithea
 
 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: Apr 27, 2010
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 """
 
 
 import logging
 import traceback
 
 import formencode
 from formencode import htmlfill
 from tg import request
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _
 from webob.exc import HTTPFound
 
+from kallithea.controllers import base
 from kallithea.lib import webutils
 from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator, LoginRequired
-from kallithea.lib.base import BaseController, render
 from kallithea.lib.webutils import url
 from kallithea.model import db, meta
 from kallithea.model.forms import DefaultPermissionsForm
 from kallithea.model.permission import PermissionModel
 
 
 log = logging.getLogger(__name__)
 
 
-class PermissionsController(BaseController):
+class PermissionsController(base.BaseController):
 
     @LoginRequired()
     @HasPermissionAnyDecorator('hg.admin')
     def _before(self, *args, **kwargs):
         super(PermissionsController, self)._before(*args, **kwargs)
 
     def __load_data(self):
         # Permissions for the Default user on new repositories
         c.repo_perms_choices = [('repository.none', _('None'),),
                                 ('repository.read', _('Read'),),
                                 ('repository.write', _('Write'),),
                                 ('repository.admin', _('Admin'),)]
         # Permissions for the Default user on new repository groups
         c.group_perms_choices = [('group.none', _('None'),),
                                  ('group.read', _('Read'),),
                                  ('group.write', _('Write'),),
                                  ('group.admin', _('Admin'),)]
         # Permissions for the Default user on new user groups
         c.user_group_perms_choices = [('usergroup.none', _('None'),),
                                       ('usergroup.read', _('Read'),),
                                       ('usergroup.write', _('Write'),),
                                       ('usergroup.admin', _('Admin'),)]
         # Registration - allow new Users to create an account
         c.register_choices = [
             ('hg.register.none',
              _('Disabled')),
             ('hg.register.manual_activate',
              _('Allowed with manual account activation')),
             ('hg.register.auto_activate',
              _('Allowed with automatic account activation')), ]
         # External auth account activation
         c.extern_activate_choices = [
             ('hg.extern_activate.manual', _('Manual activation of external account')),
             ('hg.extern_activate.auto', _('Automatic activation of external account')),
         ]
         # Top level repository creation
         c.repo_create_choices = [('hg.create.none', _('Disabled')),
                                  ('hg.create.repository', _('Enabled'))]
         # User group creation
         c.user_group_create_choices = [('hg.usergroup.create.false', _('Disabled')),
                                        ('hg.usergroup.create.true', _('Enabled'))]
         # Repository forking:
         c.fork_choices = [('hg.fork.none', _('Disabled')),
                           ('hg.fork.repository', _('Enabled'))]
 
     def permission_globals(self):
         c.active = 'globals'
         self.__load_data()
         if request.POST:
             _form = DefaultPermissionsForm(
                 [x[0] for x in c.repo_perms_choices],
                 [x[0] for x in c.group_perms_choices],
                 [x[0] for x in c.user_group_perms_choices],
                 [x[0] for x in c.repo_create_choices],
                 [x[0] for x in c.user_group_create_choices],
                 [x[0] for x in c.fork_choices],
                 [x[0] for x in c.register_choices],
                 [x[0] for x in c.extern_activate_choices])()
 
             try:
                 form_result = _form.to_python(dict(request.POST))
                 form_result.update({'perm_user_name': 'default'})
                 PermissionModel().update(form_result)
                 meta.Session().commit()
                 webutils.flash(_('Global permissions updated successfully'),
                         category='success')
 
             except formencode.Invalid as errors:
                 defaults = errors.value
 
                 return htmlfill.render(
-                    render('admin/permissions/permissions.html'),
+                    base.render('admin/permissions/permissions.html'),
                     defaults=defaults,
                     errors=errors.error_dict or {},
                     prefix_error=False,
                     encoding="UTF-8",
                     force_defaults=False)
             except Exception:
                 log.error(traceback.format_exc())
                 webutils.flash(_('Error occurred during update of permissions'),
                         category='error')
 
             raise HTTPFound(location=url('admin_permissions'))
 
         c.user = db.User.get_default_user()
         defaults = {'anonymous': c.user.active}
 
         for p in c.user.user_perms:
             if p.permission.permission_name.startswith('repository.'):
                 defaults['default_repo_perm'] = p.permission.permission_name
 
             if p.permission.permission_name.startswith('group.'):
                 defaults['default_group_perm'] = p.permission.permission_name
 
             if p.permission.permission_name.startswith('usergroup.'):
                 defaults['default_user_group_perm'] = p.permission.permission_name
 
             elif p.permission.permission_name.startswith('hg.create.'):
                 defaults['default_repo_create'] = p.permission.permission_name
 
             if p.permission.permission_name.startswith('hg.usergroup.'):
                 defaults['default_user_group_create'] = p.permission.permission_name
 
             if p.permission.permission_name.startswith('hg.register.'):
                 defaults['default_register'] = p.permission.permission_name
 
             if p.permission.permission_name.startswith('hg.extern_activate.'):
                 defaults['default_extern_activate'] = p.permission.permission_name
 
             if p.permission.permission_name.startswith('hg.fork.'):
                 defaults['default_fork'] = p.permission.permission_name
 
         return htmlfill.render(
-            render('admin/permissions/permissions.html'),
+            base.render('admin/permissions/permissions.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False)
 
     def permission_ips(self):
         c.active = 'ips'
         c.user = db.User.get_default_user()
         c.user_ip_map = db.UserIpMap.query() \
             .filter(db.UserIpMap.user == c.user).all()
 
-        return render('admin/permissions/permissions.html')
+        return base.render('admin/permissions/permissions.html')
 
     def permission_perms(self):
         c.active = 'perms'
         c.user = db.User.get_default_user()
         c.perm_user = AuthUser(dbuser=c.user)
-        return render('admin/permissions/permissions.html')
+        return base.render('admin/permissions/permissions.html')
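The long chain of `startswith` checks in `permission_globals()` above only maps the default user's stored permission names onto the form defaults that `htmlfill.render` fills in. Read as a table, the mapping looks like the sketch below; this is an equivalent, condensed reading for clarity, not code from the changeset:

# Prefix of the stored permission name -> form default it populates.
PREFIX_TO_DEFAULT = {
    'repository.': 'default_repo_perm',
    'group.': 'default_group_perm',
    'usergroup.': 'default_user_group_perm',
    'hg.create.': 'default_repo_create',
    'hg.usergroup.': 'default_user_group_create',
    'hg.register.': 'default_register',
    'hg.extern_activate.': 'default_extern_activate',
    'hg.fork.': 'default_fork',
}


def defaults_from_user_perms(user_perms):
    """Bucket each of the default user's permission names into its form default."""
    defaults = {}
    for p in user_perms:
        name = p.permission.permission_name
        for prefix, key in PREFIX_TO_DEFAULT.items():
            if name.startswith(prefix):
                defaults[key] = name
                break  # a permission name matches at most one of the prefixes above
    return defaults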
@@ -1,402 +1,402 @@
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.admin.repo_groups |
|
15 | kallithea.controllers.admin.repo_groups | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Repository groups controller for Kallithea |
|
18 | Repository groups controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Mar 23, 2010 |
|
22 | :created_on: Mar 23, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | import formencode |
|
31 | import formencode | |
32 | from formencode import htmlfill |
|
32 | from formencode import htmlfill | |
33 | from tg import app_globals, request |
|
33 | from tg import app_globals, request | |
34 | from tg import tmpl_context as c |
|
34 | from tg import tmpl_context as c | |
35 | from tg.i18n import ugettext as _ |
|
35 | from tg.i18n import ugettext as _ | |
36 | from tg.i18n import ungettext |
|
36 | from tg.i18n import ungettext | |
37 | from webob.exc import HTTPForbidden, HTTPFound, HTTPInternalServerError, HTTPNotFound |
|
37 | from webob.exc import HTTPForbidden, HTTPFound, HTTPInternalServerError, HTTPNotFound | |
38 |
|
38 | |||
39 | import kallithea.lib.helpers as h |
|
39 | import kallithea.lib.helpers as h | |
|
40 | from kallithea.controllers import base | |||
40 | from kallithea.lib import webutils |
|
41 | from kallithea.lib import webutils | |
41 | from kallithea.lib.auth import HasPermissionAny, HasRepoGroupPermissionLevel, HasRepoGroupPermissionLevelDecorator, LoginRequired |
|
42 | from kallithea.lib.auth import HasPermissionAny, HasRepoGroupPermissionLevel, HasRepoGroupPermissionLevelDecorator, LoginRequired | |
42 | from kallithea.lib.base import BaseController, render |
|
|||
43 | from kallithea.lib.utils2 import safe_int |
|
43 | from kallithea.lib.utils2 import safe_int | |
44 | from kallithea.lib.webutils import url |
|
44 | from kallithea.lib.webutils import url | |
45 | from kallithea.model import db, meta |
|
45 | from kallithea.model import db, meta | |
46 | from kallithea.model.forms import RepoGroupForm, RepoGroupPermsForm |
|
46 | from kallithea.model.forms import RepoGroupForm, RepoGroupPermsForm | |
47 | from kallithea.model.repo import RepoModel |
|
47 | from kallithea.model.repo import RepoModel | |
48 | from kallithea.model.repo_group import RepoGroupModel |
|
48 | from kallithea.model.repo_group import RepoGroupModel | |
49 | from kallithea.model.scm import AvailableRepoGroupChoices, RepoGroupList |
|
49 | from kallithea.model.scm import AvailableRepoGroupChoices, RepoGroupList | |
50 |
|
50 | |||
51 |
|
51 | |||
52 | log = logging.getLogger(__name__) |
|
52 | log = logging.getLogger(__name__) | |
53 |
|
53 | |||
54 |
|
54 | |||
55 | class RepoGroupsController(BaseController): |
|
55 | class RepoGroupsController(base.BaseController): | |
56 |
|
56 | |||
57 | @LoginRequired(allow_default_user=True) |
|
57 | @LoginRequired(allow_default_user=True) | |
58 | def _before(self, *args, **kwargs): |
|
58 | def _before(self, *args, **kwargs): | |
59 | super(RepoGroupsController, self)._before(*args, **kwargs) |
|
59 | super(RepoGroupsController, self)._before(*args, **kwargs) | |
60 |
|
60 | |||
61 | def __load_defaults(self, extras=(), exclude=()): |
|
61 | def __load_defaults(self, extras=(), exclude=()): | |
62 | """extras is used for keeping current parent ignoring permissions |
|
62 | """extras is used for keeping current parent ignoring permissions | |
63 | exclude is used for not moving group to itself TODO: also exclude descendants |
|
63 | exclude is used for not moving group to itself TODO: also exclude descendants | |
64 | Note: only admin can create top level groups |
|
64 | Note: only admin can create top level groups | |
65 | """ |
|
65 | """ | |
66 | repo_groups = AvailableRepoGroupChoices('admin', extras) |
|
66 | repo_groups = AvailableRepoGroupChoices('admin', extras) | |
67 | exclude_group_ids = set(rg.group_id for rg in exclude) |
|
67 | exclude_group_ids = set(rg.group_id for rg in exclude) | |
68 | c.repo_groups = [rg for rg in repo_groups |
|
68 | c.repo_groups = [rg for rg in repo_groups | |
69 | if rg[0] not in exclude_group_ids] |
|
69 | if rg[0] not in exclude_group_ids] | |
70 |
|
70 | |||
71 | def __load_data(self, group_id): |
|
71 | def __load_data(self, group_id): | |
72 | """ |
|
72 | """ | |
73 | Load defaults settings for edit, and update |
|
73 | Load defaults settings for edit, and update | |
74 |
|
74 | |||
75 | :param group_id: |
|
75 | :param group_id: | |
76 | """ |
|
76 | """ | |
77 | repo_group = db.RepoGroup.get_or_404(group_id) |
|
77 | repo_group = db.RepoGroup.get_or_404(group_id) | |
78 | data = repo_group.get_dict() |
|
78 | data = repo_group.get_dict() | |
79 | data['group_name'] = repo_group.name |
|
79 | data['group_name'] = repo_group.name | |
80 |
|
80 | |||
81 | # fill repository group users |
|
81 | # fill repository group users | |
82 | for p in repo_group.repo_group_to_perm: |
|
82 | for p in repo_group.repo_group_to_perm: | |
83 | data.update({'u_perm_%s' % p.user.username: |
|
83 | data.update({'u_perm_%s' % p.user.username: | |
84 | p.permission.permission_name}) |
|
84 | p.permission.permission_name}) | |
85 |
|
85 | |||
86 | # fill repository group groups |
|
86 | # fill repository group groups | |
87 | for p in repo_group.users_group_to_perm: |
|
87 | for p in repo_group.users_group_to_perm: | |
88 | data.update({'g_perm_%s' % p.users_group.users_group_name: |
|
88 | data.update({'g_perm_%s' % p.users_group.users_group_name: | |
89 | p.permission.permission_name}) |
|
89 | p.permission.permission_name}) | |
90 |
|
90 | |||
91 | return data |
|
91 | return data | |
92 |
|
92 | |||
93 | def _revoke_perms_on_yourself(self, form_result): |
|
93 | def _revoke_perms_on_yourself(self, form_result): | |
94 | _up = [u for u in form_result['perms_updates'] if request.authuser.username == u[0]] |
|
94 | _up = [u for u in form_result['perms_updates'] if request.authuser.username == u[0]] | |
95 | _new = [u for u in form_result['perms_new'] if request.authuser.username == u[0]] |
|
95 | _new = [u for u in form_result['perms_new'] if request.authuser.username == u[0]] | |
96 | if _new and _new[0][1] != 'group.admin' or _up and _up[0][1] != 'group.admin': |
|
96 | if _new and _new[0][1] != 'group.admin' or _up and _up[0][1] != 'group.admin': | |
97 | return True |
|
97 | return True | |
98 | return False |
|
98 | return False | |
99 |
|
99 | |||
100 | def index(self, format='html'): |
|
100 | def index(self, format='html'): | |
101 | _list = db.RepoGroup.query(sorted=True).all() |
|
101 | _list = db.RepoGroup.query(sorted=True).all() | |
102 | group_iter = RepoGroupList(_list, perm_level='admin') |
|
102 | group_iter = RepoGroupList(_list, perm_level='admin') | |
103 | repo_groups_data = [] |
|
103 | repo_groups_data = [] | |
104 | _tmpl_lookup = app_globals.mako_lookup |
|
104 | _tmpl_lookup = app_globals.mako_lookup | |
105 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') |
|
105 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') | |
106 |
|
106 | |||
107 | def repo_group_name(repo_group_name, children_groups): |
|
107 | def repo_group_name(repo_group_name, children_groups): | |
108 | return template.get_def("repo_group_name") \ |
|
108 | return template.get_def("repo_group_name") \ | |
109 | .render_unicode(repo_group_name, children_groups, _=_, h=h, c=c) |
|
109 | .render_unicode(repo_group_name, children_groups, _=_, h=h, c=c) | |
110 |
|
110 | |||
111 | def repo_group_actions(repo_group_id, repo_group_name, gr_count): |
|
111 | def repo_group_actions(repo_group_id, repo_group_name, gr_count): | |
112 | return template.get_def("repo_group_actions") \ |
|
112 | return template.get_def("repo_group_actions") \ | |
113 | .render_unicode(repo_group_id, repo_group_name, gr_count, _=_, h=h, c=c, |
|
113 | .render_unicode(repo_group_id, repo_group_name, gr_count, _=_, h=h, c=c, | |
114 | ungettext=ungettext) |
|
114 | ungettext=ungettext) | |
115 |
|
115 | |||
116 | for repo_gr in group_iter: |
|
116 | for repo_gr in group_iter: | |
117 | children_groups = [g.name for g in repo_gr.parents] + [repo_gr.name] |
|
117 | children_groups = [g.name for g in repo_gr.parents] + [repo_gr.name] | |
118 | repo_count = repo_gr.repositories.count() |
|
118 | repo_count = repo_gr.repositories.count() | |
119 | repo_groups_data.append({ |
|
119 | repo_groups_data.append({ | |
120 | "raw_name": webutils.escape(repo_gr.group_name), |
|
120 | "raw_name": webutils.escape(repo_gr.group_name), | |
121 | "group_name": repo_group_name(repo_gr.group_name, children_groups), |
|
121 | "group_name": repo_group_name(repo_gr.group_name, children_groups), | |
122 | "desc": webutils.escape(repo_gr.group_description), |
|
122 | "desc": webutils.escape(repo_gr.group_description), | |
123 | "repos": repo_count, |
|
123 | "repos": repo_count, | |
124 | "owner": repo_gr.owner.username, |
|
124 | "owner": repo_gr.owner.username, | |
125 | "action": repo_group_actions(repo_gr.group_id, repo_gr.group_name, |
|
125 | "action": repo_group_actions(repo_gr.group_id, repo_gr.group_name, | |
126 | repo_count) |
|
126 | repo_count) | |
127 | }) |
|
127 | }) | |
128 |
|
128 | |||
129 | c.data = { |
|
129 | c.data = { | |
130 | "sort": None, |
|
130 | "sort": None, | |
131 | "dir": "asc", |
|
131 | "dir": "asc", | |
132 | "records": repo_groups_data |
|
132 | "records": repo_groups_data | |
133 | } |
|
133 | } | |
134 |
|
134 | |||
135 | return render('admin/repo_groups/repo_groups.html') |
|
135 | return base.render('admin/repo_groups/repo_groups.html') | |
136 |
|
136 | |||
137 | def create(self): |
|
137 | def create(self): | |
138 | self.__load_defaults() |
|
138 | self.__load_defaults() | |
139 |
|
139 | |||
140 | # permissions for can create group based on parent_id are checked |
|
140 | # permissions for can create group based on parent_id are checked | |
141 | # here in the Form |
|
141 | # here in the Form | |
142 | repo_group_form = RepoGroupForm(repo_groups=c.repo_groups) |
|
142 | repo_group_form = RepoGroupForm(repo_groups=c.repo_groups) | |
143 | form_result = None |
|
143 | form_result = None | |
144 | try: |
|
144 | try: | |
145 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
145 | form_result = repo_group_form.to_python(dict(request.POST)) | |
146 | gr = RepoGroupModel().create( |
|
146 | gr = RepoGroupModel().create( | |
147 | group_name=form_result['group_name'], |
|
147 | group_name=form_result['group_name'], | |
148 | group_description=form_result['group_description'], |
|
148 | group_description=form_result['group_description'], | |
149 | parent=form_result['parent_group_id'], |
|
149 | parent=form_result['parent_group_id'], | |
150 | owner=request.authuser.user_id, # TODO: make editable |
|
150 | owner=request.authuser.user_id, # TODO: make editable | |
151 | copy_permissions=form_result['group_copy_permissions'] |
|
151 | copy_permissions=form_result['group_copy_permissions'] | |
152 | ) |
|
152 | ) | |
153 | meta.Session().commit() |
|
153 | meta.Session().commit() | |
154 | # TODO: in future action_logger(, '', '', '') |
|
154 | # TODO: in future action_logger(, '', '', '') | |
155 | except formencode.Invalid as errors: |
|
155 | except formencode.Invalid as errors: | |
156 | return htmlfill.render( |
|
156 | return htmlfill.render( | |
157 | render('admin/repo_groups/repo_group_add.html'), |
|
157 | base.render('admin/repo_groups/repo_group_add.html'), | |
158 | defaults=errors.value, |
|
158 | defaults=errors.value, | |
159 | errors=errors.error_dict or {}, |
|
159 | errors=errors.error_dict or {}, | |
160 | prefix_error=False, |
|
160 | prefix_error=False, | |
161 | encoding="UTF-8", |
|
161 | encoding="UTF-8", | |
162 | force_defaults=False) |
|
162 | force_defaults=False) | |
163 | except Exception: |
|
163 | except Exception: | |
164 | log.error(traceback.format_exc()) |
|
164 | log.error(traceback.format_exc()) | |
165 | webutils.flash(_('Error occurred during creation of repository group %s') |
|
165 | webutils.flash(_('Error occurred during creation of repository group %s') | |
166 | % request.POST.get('group_name'), category='error') |
|
166 | % request.POST.get('group_name'), category='error') | |
167 | if form_result is None: |
|
167 | if form_result is None: | |
168 | raise |
|
168 | raise | |
169 | parent_group_id = form_result['parent_group_id'] |
|
169 | parent_group_id = form_result['parent_group_id'] | |
170 | # TODO: maybe we should get back to the main view, not the admin one |
|
170 | # TODO: maybe we should get back to the main view, not the admin one | |
171 | raise HTTPFound(location=url('repos_groups', parent_group=parent_group_id)) |
|
171 | raise HTTPFound(location=url('repos_groups', parent_group=parent_group_id)) | |
172 | webutils.flash(_('Created repository group %s') % gr.group_name, |
|
172 | webutils.flash(_('Created repository group %s') % gr.group_name, | |
173 | category='success') |
|
173 | category='success') | |
174 | raise HTTPFound(location=url('repos_group_home', group_name=gr.group_name)) |
|
174 | raise HTTPFound(location=url('repos_group_home', group_name=gr.group_name)) | |
175 |
|
175 | |||
176 | def new(self): |
|
176 | def new(self): | |
177 | parent_group_id = safe_int(request.GET.get('parent_group') or '-1') |
|
177 | parent_group_id = safe_int(request.GET.get('parent_group') or '-1') | |
178 | if HasPermissionAny('hg.admin')('group create'): |
|
178 | if HasPermissionAny('hg.admin')('group create'): | |
179 | # we're global admin, we're ok and we can create TOP level groups |
|
179 | # we're global admin, we're ok and we can create TOP level groups | |
180 | pass |
|
180 | pass | |
181 | else: |
|
181 | else: | |
182 | # we pass in parent group into creation form, thus we know |
|
182 | # we pass in parent group into creation form, thus we know | |
183 | # what would be the group, we can check perms here ! |
|
183 | # which group it will be and can check permissions here | |
184 | group = db.RepoGroup.get(parent_group_id) if parent_group_id else None |
|
184 | group = db.RepoGroup.get(parent_group_id) if parent_group_id else None | |
185 | group_name = group.group_name if group else None |
|
185 | group_name = group.group_name if group else None | |
186 | if HasRepoGroupPermissionLevel('admin')(group_name, 'group create'): |
|
186 | if HasRepoGroupPermissionLevel('admin')(group_name, 'group create'): | |
187 | pass |
|
187 | pass | |
188 | else: |
|
188 | else: | |
189 | raise HTTPForbidden() |
|
189 | raise HTTPForbidden() | |
190 |
|
190 | |||
191 | self.__load_defaults() |
|
191 | self.__load_defaults() | |
192 | return htmlfill.render( |
|
192 | return htmlfill.render( | |
193 | render('admin/repo_groups/repo_group_add.html'), |
|
193 | base.render('admin/repo_groups/repo_group_add.html'), | |
194 | defaults={'parent_group_id': parent_group_id}, |
|
194 | defaults={'parent_group_id': parent_group_id}, | |
195 | errors={}, |
|
195 | errors={}, | |
196 | prefix_error=False, |
|
196 | prefix_error=False, | |
197 | encoding="UTF-8", |
|
197 | encoding="UTF-8", | |
198 | force_defaults=False) |
|
198 | force_defaults=False) | |
199 |
|
199 | |||
200 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
200 | @HasRepoGroupPermissionLevelDecorator('admin') | |
201 | def update(self, group_name): |
|
201 | def update(self, group_name): | |
202 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
202 | c.repo_group = db.RepoGroup.guess_instance(group_name) | |
203 | self.__load_defaults(extras=[c.repo_group.parent_group], |
|
203 | self.__load_defaults(extras=[c.repo_group.parent_group], | |
204 | exclude=[c.repo_group]) |
|
204 | exclude=[c.repo_group]) | |
205 |
|
205 | |||
206 | # TODO: kill allow_empty_group - it is only used for redundant form validation! |
|
206 | # TODO: kill allow_empty_group - it is only used for redundant form validation! | |
207 | if HasPermissionAny('hg.admin')('group edit'): |
|
207 | if HasPermissionAny('hg.admin')('group edit'): | |
208 | # we're a global admin, so we can create top-level groups |
|
208 | # we're a global admin, so we can create top-level groups | |
209 | allow_empty_group = True |
|
209 | allow_empty_group = True | |
210 | elif not c.repo_group.parent_group: |
|
210 | elif not c.repo_group.parent_group: | |
211 | allow_empty_group = True |
|
211 | allow_empty_group = True | |
212 | else: |
|
212 | else: | |
213 | allow_empty_group = False |
|
213 | allow_empty_group = False | |
214 | repo_group_form = RepoGroupForm( |
|
214 | repo_group_form = RepoGroupForm( | |
215 | edit=True, |
|
215 | edit=True, | |
216 | old_data=c.repo_group.get_dict(), |
|
216 | old_data=c.repo_group.get_dict(), | |
217 | repo_groups=c.repo_groups, |
|
217 | repo_groups=c.repo_groups, | |
218 | can_create_in_root=allow_empty_group, |
|
218 | can_create_in_root=allow_empty_group, | |
219 | )() |
|
219 | )() | |
220 | try: |
|
220 | try: | |
221 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
221 | form_result = repo_group_form.to_python(dict(request.POST)) | |
222 |
|
222 | |||
223 | new_gr = RepoGroupModel().update(group_name, form_result) |
|
223 | new_gr = RepoGroupModel().update(group_name, form_result) | |
224 | meta.Session().commit() |
|
224 | meta.Session().commit() | |
225 | webutils.flash(_('Updated repository group %s') |
|
225 | webutils.flash(_('Updated repository group %s') | |
226 | % form_result['group_name'], category='success') |
|
226 | % form_result['group_name'], category='success') | |
227 | # we now have the new name |
|
227 | # we now have the new name | |
228 | group_name = new_gr.group_name |
|
228 | group_name = new_gr.group_name | |
229 | # TODO: in future action_logger(, '', '', '') |
|
229 | # TODO: in future action_logger(, '', '', '') | |
230 | except formencode.Invalid as errors: |
|
230 | except formencode.Invalid as errors: | |
231 | c.active = 'settings' |
|
231 | c.active = 'settings' | |
232 | return htmlfill.render( |
|
232 | return htmlfill.render( | |
233 | render('admin/repo_groups/repo_group_edit.html'), |
|
233 | base.render('admin/repo_groups/repo_group_edit.html'), | |
234 | defaults=errors.value, |
|
234 | defaults=errors.value, | |
235 | errors=errors.error_dict or {}, |
|
235 | errors=errors.error_dict or {}, | |
236 | prefix_error=False, |
|
236 | prefix_error=False, | |
237 | encoding="UTF-8", |
|
237 | encoding="UTF-8", | |
238 | force_defaults=False) |
|
238 | force_defaults=False) | |
239 | except Exception: |
|
239 | except Exception: | |
240 | log.error(traceback.format_exc()) |
|
240 | log.error(traceback.format_exc()) | |
241 | webutils.flash(_('Error occurred during update of repository group %s') |
|
241 | webutils.flash(_('Error occurred during update of repository group %s') | |
242 | % request.POST.get('group_name'), category='error') |
|
242 | % request.POST.get('group_name'), category='error') | |
243 |
|
243 | |||
244 | raise HTTPFound(location=url('edit_repo_group', group_name=group_name)) |
|
244 | raise HTTPFound(location=url('edit_repo_group', group_name=group_name)) | |
245 |
|
245 | |||
246 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
246 | @HasRepoGroupPermissionLevelDecorator('admin') | |
247 | def delete(self, group_name): |
|
247 | def delete(self, group_name): | |
248 | gr = c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
248 | gr = c.repo_group = db.RepoGroup.guess_instance(group_name) | |
249 | repos = gr.repositories.all() |
|
249 | repos = gr.repositories.all() | |
250 | if repos: |
|
250 | if repos: | |
251 | webutils.flash(_('This group contains %s repositories and cannot be ' |
|
251 | webutils.flash(_('This group contains %s repositories and cannot be ' | |
252 | 'deleted') % len(repos), category='warning') |
|
252 | 'deleted') % len(repos), category='warning') | |
253 | raise HTTPFound(location=url('repos_groups')) |
|
253 | raise HTTPFound(location=url('repos_groups')) | |
254 |
|
254 | |||
255 | children = gr.children.all() |
|
255 | children = gr.children.all() | |
256 | if children: |
|
256 | if children: | |
257 | webutils.flash(_('This group contains %s subgroups and cannot be deleted') |
|
257 | webutils.flash(_('This group contains %s subgroups and cannot be deleted') | |
258 | % len(children), category='warning') |
|
258 | % len(children), category='warning') | |
259 | raise HTTPFound(location=url('repos_groups')) |
|
259 | raise HTTPFound(location=url('repos_groups')) | |
260 |
|
260 | |||
261 | try: |
|
261 | try: | |
262 | RepoGroupModel().delete(group_name) |
|
262 | RepoGroupModel().delete(group_name) | |
263 | meta.Session().commit() |
|
263 | meta.Session().commit() | |
264 | webutils.flash(_('Removed repository group %s') % group_name, |
|
264 | webutils.flash(_('Removed repository group %s') % group_name, | |
265 | category='success') |
|
265 | category='success') | |
266 | # TODO: in future action_logger(, '', '', '') |
|
266 | # TODO: in future action_logger(, '', '', '') | |
267 | except Exception: |
|
267 | except Exception: | |
268 | log.error(traceback.format_exc()) |
|
268 | log.error(traceback.format_exc()) | |
269 | webutils.flash(_('Error occurred during deletion of repository group %s') |
|
269 | webutils.flash(_('Error occurred during deletion of repository group %s') | |
270 | % group_name, category='error') |
|
270 | % group_name, category='error') | |
271 |
|
271 | |||
272 | if gr.parent_group: |
|
272 | if gr.parent_group: | |
273 | raise HTTPFound(location=url('repos_group_home', group_name=gr.parent_group.group_name)) |
|
273 | raise HTTPFound(location=url('repos_group_home', group_name=gr.parent_group.group_name)) | |
274 | raise HTTPFound(location=url('repos_groups')) |
|
274 | raise HTTPFound(location=url('repos_groups')) | |
275 |
|
275 | |||
276 | def show_by_name(self, group_name): |
|
276 | def show_by_name(self, group_name): | |
277 | """ |
|
277 | """ | |
278 | This is a proxy that does a lookup group_name -> id, and shows |
|
278 | This is a proxy that does a lookup group_name -> id, and shows | |
279 | the group by id view instead |
|
279 | the group by id view instead | |
280 | """ |
|
280 | """ | |
281 | group_name = group_name.rstrip('/') |
|
281 | group_name = group_name.rstrip('/') | |
282 | id_ = db.RepoGroup.get_by_group_name(group_name) |
|
282 | id_ = db.RepoGroup.get_by_group_name(group_name) | |
283 | if id_: |
|
283 | if id_: | |
284 | return self.show(group_name) |
|
284 | return self.show(group_name) | |
285 | raise HTTPNotFound |
|
285 | raise HTTPNotFound | |
286 |
|
286 | |||
287 | @HasRepoGroupPermissionLevelDecorator('read') |
|
287 | @HasRepoGroupPermissionLevelDecorator('read') | |
288 | def show(self, group_name): |
|
288 | def show(self, group_name): | |
289 | c.active = 'settings' |
|
289 | c.active = 'settings' | |
290 |
|
290 | |||
291 | c.group = c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
291 | c.group = c.repo_group = db.RepoGroup.guess_instance(group_name) | |
292 |
|
292 | |||
293 | groups = db.RepoGroup.query(sorted=True).filter_by(parent_group=c.group).all() |
|
293 | groups = db.RepoGroup.query(sorted=True).filter_by(parent_group=c.group).all() | |
294 | repo_groups_list = self.scm_model.get_repo_groups(groups) |
|
294 | repo_groups_list = self.scm_model.get_repo_groups(groups) | |
295 |
|
295 | |||
296 | repos_list = db.Repository.query(sorted=True).filter_by(group=c.group).all() |
|
296 | repos_list = db.Repository.query(sorted=True).filter_by(group=c.group).all() | |
297 | c.data = RepoModel().get_repos_as_dict(repos_list, |
|
297 | c.data = RepoModel().get_repos_as_dict(repos_list, | |
298 | repo_groups_list=repo_groups_list, |
|
298 | repo_groups_list=repo_groups_list, | |
299 | short_name=True) |
|
299 | short_name=True) | |
300 |
|
300 | |||
301 | return render('admin/repo_groups/repo_group_show.html') |
|
301 | return base.render('admin/repo_groups/repo_group_show.html') | |
302 |
|
302 | |||
303 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
303 | @HasRepoGroupPermissionLevelDecorator('admin') | |
304 | def edit(self, group_name): |
|
304 | def edit(self, group_name): | |
305 | c.active = 'settings' |
|
305 | c.active = 'settings' | |
306 |
|
306 | |||
307 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
307 | c.repo_group = db.RepoGroup.guess_instance(group_name) | |
308 | self.__load_defaults(extras=[c.repo_group.parent_group], |
|
308 | self.__load_defaults(extras=[c.repo_group.parent_group], | |
309 | exclude=[c.repo_group]) |
|
309 | exclude=[c.repo_group]) | |
310 | defaults = self.__load_data(c.repo_group.group_id) |
|
310 | defaults = self.__load_data(c.repo_group.group_id) | |
311 |
|
311 | |||
312 | return htmlfill.render( |
|
312 | return htmlfill.render( | |
313 | render('admin/repo_groups/repo_group_edit.html'), |
|
313 | base.render('admin/repo_groups/repo_group_edit.html'), | |
314 | defaults=defaults, |
|
314 | defaults=defaults, | |
315 | encoding="UTF-8", |
|
315 | encoding="UTF-8", | |
316 | force_defaults=False |
|
316 | force_defaults=False | |
317 | ) |
|
317 | ) | |
318 |
|
318 | |||
319 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
319 | @HasRepoGroupPermissionLevelDecorator('admin') | |
320 | def edit_repo_group_advanced(self, group_name): |
|
320 | def edit_repo_group_advanced(self, group_name): | |
321 | c.active = 'advanced' |
|
321 | c.active = 'advanced' | |
322 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
322 | c.repo_group = db.RepoGroup.guess_instance(group_name) | |
323 |
|
323 | |||
324 | return render('admin/repo_groups/repo_group_edit.html') |
|
324 | return base.render('admin/repo_groups/repo_group_edit.html') | |
325 |
|
325 | |||
326 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
326 | @HasRepoGroupPermissionLevelDecorator('admin') | |
327 | def edit_repo_group_perms(self, group_name): |
|
327 | def edit_repo_group_perms(self, group_name): | |
328 | c.active = 'perms' |
|
328 | c.active = 'perms' | |
329 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
329 | c.repo_group = db.RepoGroup.guess_instance(group_name) | |
330 | self.__load_defaults() |
|
330 | self.__load_defaults() | |
331 | defaults = self.__load_data(c.repo_group.group_id) |
|
331 | defaults = self.__load_data(c.repo_group.group_id) | |
332 |
|
332 | |||
333 | return htmlfill.render( |
|
333 | return htmlfill.render( | |
334 | render('admin/repo_groups/repo_group_edit.html'), |
|
334 | base.render('admin/repo_groups/repo_group_edit.html'), | |
335 | defaults=defaults, |
|
335 | defaults=defaults, | |
336 | encoding="UTF-8", |
|
336 | encoding="UTF-8", | |
337 | force_defaults=False |
|
337 | force_defaults=False | |
338 | ) |
|
338 | ) | |
339 |
|
339 | |||
340 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
340 | @HasRepoGroupPermissionLevelDecorator('admin') | |
341 | def update_perms(self, group_name): |
|
341 | def update_perms(self, group_name): | |
342 | """ |
|
342 | """ | |
343 | Update permissions for given repository group |
|
343 | Update permissions for given repository group | |
344 |
|
344 | |||
345 | :param group_name: |
|
345 | :param group_name: | |
346 | """ |
|
346 | """ | |
347 |
|
347 | |||
348 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
348 | c.repo_group = db.RepoGroup.guess_instance(group_name) | |
349 | valid_recursive_choices = ['none', 'repos', 'groups', 'all'] |
|
349 | valid_recursive_choices = ['none', 'repos', 'groups', 'all'] | |
350 | form_result = RepoGroupPermsForm(valid_recursive_choices)().to_python(request.POST) |
|
350 | form_result = RepoGroupPermsForm(valid_recursive_choices)().to_python(request.POST) | |
351 | if not request.authuser.is_admin: |
|
351 | if not request.authuser.is_admin: | |
352 | if self._revoke_perms_on_yourself(form_result): |
|
352 | if self._revoke_perms_on_yourself(form_result): | |
353 | msg = _('Cannot revoke permission for yourself as admin') |
|
353 | msg = _('Cannot revoke permission for yourself as admin') | |
354 | webutils.flash(msg, category='warning') |
|
354 | webutils.flash(msg, category='warning') | |
355 | raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name)) |
|
355 | raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name)) | |
356 | recursive = form_result['recursive'] |
|
356 | recursive = form_result['recursive'] | |
357 | # iterate over all members of this group (if in recursive mode) and |
|
357 | # iterate over all members of this group (if in recursive mode) and | |
358 | # set the permissions |
|
358 | # set the permissions | |
359 | # this can potentially be a heavy operation |
|
359 | # this can potentially be a heavy operation | |
360 | RepoGroupModel()._update_permissions(c.repo_group, |
|
360 | RepoGroupModel()._update_permissions(c.repo_group, | |
361 | form_result['perms_new'], |
|
361 | form_result['perms_new'], | |
362 | form_result['perms_updates'], |
|
362 | form_result['perms_updates'], | |
363 | recursive) |
|
363 | recursive) | |
364 | # TODO: implement this |
|
364 | # TODO: implement this | |
365 | #action_logger(request.authuser, 'admin_changed_repo_permissions', |
|
365 | #action_logger(request.authuser, 'admin_changed_repo_permissions', | |
366 | # repo_name, request.ip_addr) |
|
366 | # repo_name, request.ip_addr) | |
367 | meta.Session().commit() |
|
367 | meta.Session().commit() | |
368 | webutils.flash(_('Repository group permissions updated'), category='success') |
|
368 | webutils.flash(_('Repository group permissions updated'), category='success') | |
369 | raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name)) |
|
369 | raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name)) | |
370 |
|
370 | |||
371 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
371 | @HasRepoGroupPermissionLevelDecorator('admin') | |
372 | def delete_perms(self, group_name): |
|
372 | def delete_perms(self, group_name): | |
373 | try: |
|
373 | try: | |
374 | obj_type = request.POST.get('obj_type') |
|
374 | obj_type = request.POST.get('obj_type') | |
375 | obj_id = None |
|
375 | obj_id = None | |
376 | if obj_type == 'user': |
|
376 | if obj_type == 'user': | |
377 | obj_id = safe_int(request.POST.get('user_id')) |
|
377 | obj_id = safe_int(request.POST.get('user_id')) | |
378 | elif obj_type == 'user_group': |
|
378 | elif obj_type == 'user_group': | |
379 | obj_id = safe_int(request.POST.get('user_group_id')) |
|
379 | obj_id = safe_int(request.POST.get('user_group_id')) | |
380 |
|
380 | |||
381 | if not request.authuser.is_admin: |
|
381 | if not request.authuser.is_admin: | |
382 | if obj_type == 'user' and request.authuser.user_id == obj_id: |
|
382 | if obj_type == 'user' and request.authuser.user_id == obj_id: | |
383 | msg = _('Cannot revoke permission for yourself as admin') |
|
383 | msg = _('Cannot revoke permission for yourself as admin') | |
384 | webutils.flash(msg, category='warning') |
|
384 | webutils.flash(msg, category='warning') | |
385 | raise Exception('revoke admin permission on self') |
|
385 | raise Exception('revoke admin permission on self') | |
386 | recursive = request.POST.get('recursive', 'none') |
|
386 | recursive = request.POST.get('recursive', 'none') | |
387 | if obj_type == 'user': |
|
387 | if obj_type == 'user': | |
388 | RepoGroupModel().delete_permission(repo_group=group_name, |
|
388 | RepoGroupModel().delete_permission(repo_group=group_name, | |
389 | obj=obj_id, obj_type='user', |
|
389 | obj=obj_id, obj_type='user', | |
390 | recursive=recursive) |
|
390 | recursive=recursive) | |
391 | elif obj_type == 'user_group': |
|
391 | elif obj_type == 'user_group': | |
392 | RepoGroupModel().delete_permission(repo_group=group_name, |
|
392 | RepoGroupModel().delete_permission(repo_group=group_name, | |
393 | obj=obj_id, |
|
393 | obj=obj_id, | |
394 | obj_type='user_group', |
|
394 | obj_type='user_group', | |
395 | recursive=recursive) |
|
395 | recursive=recursive) | |
396 |
|
396 | |||
397 | meta.Session().commit() |
|
397 | meta.Session().commit() | |
398 | except Exception: |
|
398 | except Exception: | |
399 | log.error(traceback.format_exc()) |
|
399 | log.error(traceback.format_exc()) | |
400 | webutils.flash(_('An error occurred during revoking of permission'), |
|
400 | webutils.flash(_('An error occurred during revoking of permission'), | |
401 | category='error') |
|
401 | category='error') | |
402 | raise HTTPInternalServerError() |
|
402 | raise HTTPInternalServerError() |
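
The repository-group controller above and the repositories controller below are updated with the same mechanical refactoring: the bare ``BaseRepoController``/``jsonify``/``render`` imports from ``kallithea.lib.base`` are dropped in favour of ``from kallithea.controllers import base``, and every use becomes module-qualified (``base.render``, ``base.jsonify``, ``base.BaseRepoController``). A minimal sketch of the resulting controller shape, assuming a Kallithea development checkout where these names are importable as the new side of the diff shows::

    # Illustrative outline only -- it mirrors the pattern visible in the hunks
    # above and below; it is not itself part of the changeset.
    from kallithea.controllers import base  # replaces: from kallithea.lib.base import BaseRepoController, jsonify, render


    class ReposController(base.BaseRepoController):  # was: BaseRepoController

        @base.jsonify  # was: @jsonify
        def repo_check(self, repo_name):
            ...

        def index(self, format='html'):
            # templates are rendered via the module-qualified helper
            return base.render('admin/repos/repos.html')  # was: render(...)
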
@@ -1,513 +1,513 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.admin.repos |
|
15 | kallithea.controllers.admin.repos | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Repositories controller for Kallithea |
|
18 | Repositories controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 7, 2010 |
|
22 | :created_on: Apr 7, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | import formencode |
|
31 | import formencode | |
32 | from formencode import htmlfill |
|
32 | from formencode import htmlfill | |
33 | from tg import request |
|
33 | from tg import request | |
34 | from tg import tmpl_context as c |
|
34 | from tg import tmpl_context as c | |
35 | from tg.i18n import ugettext as _ |
|
35 | from tg.i18n import ugettext as _ | |
36 | from webob.exc import HTTPForbidden, HTTPFound, HTTPInternalServerError, HTTPNotFound |
|
36 | from webob.exc import HTTPForbidden, HTTPFound, HTTPInternalServerError, HTTPNotFound | |
37 |
|
37 | |||
38 | import kallithea |
|
38 | import kallithea | |
|
39 | from kallithea.controllers import base | |||
39 | from kallithea.lib import webutils |
|
40 | from kallithea.lib import webutils | |
40 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired, NotAnonymous |
|
41 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired, NotAnonymous | |
41 | from kallithea.lib.base import BaseRepoController, jsonify, render |
|
|||
42 | from kallithea.lib.exceptions import AttachedForksError |
|
42 | from kallithea.lib.exceptions import AttachedForksError | |
43 | from kallithea.lib.utils2 import safe_int |
|
43 | from kallithea.lib.utils2 import safe_int | |
44 | from kallithea.lib.vcs import RepositoryError |
|
44 | from kallithea.lib.vcs import RepositoryError | |
45 | from kallithea.lib.webutils import url |
|
45 | from kallithea.lib.webutils import url | |
46 | from kallithea.model import db, meta, userlog |
|
46 | from kallithea.model import db, meta, userlog | |
47 | from kallithea.model.forms import RepoFieldForm, RepoForm, RepoPermsForm |
|
47 | from kallithea.model.forms import RepoFieldForm, RepoForm, RepoPermsForm | |
48 | from kallithea.model.repo import RepoModel |
|
48 | from kallithea.model.repo import RepoModel | |
49 | from kallithea.model.scm import AvailableRepoGroupChoices, RepoList, ScmModel |
|
49 | from kallithea.model.scm import AvailableRepoGroupChoices, RepoList, ScmModel | |
50 |
|
50 | |||
51 |
|
51 | |||
52 | log = logging.getLogger(__name__) |
|
52 | log = logging.getLogger(__name__) | |
53 |
|
53 | |||
54 |
|
54 | |||
55 | class ReposController(BaseRepoController): |
|
55 | class ReposController(base.BaseRepoController): | |
56 |
|
56 | |||
57 | @LoginRequired(allow_default_user=True) |
|
57 | @LoginRequired(allow_default_user=True) | |
58 | def _before(self, *args, **kwargs): |
|
58 | def _before(self, *args, **kwargs): | |
59 | super(ReposController, self)._before(*args, **kwargs) |
|
59 | super(ReposController, self)._before(*args, **kwargs) | |
60 |
|
60 | |||
61 | def _load_repo(self): |
|
61 | def _load_repo(self): | |
62 | repo_obj = c.db_repo |
|
62 | repo_obj = c.db_repo | |
63 |
|
63 | |||
64 | if repo_obj is None: |
|
64 | if repo_obj is None: | |
65 | raise HTTPNotFound() |
|
65 | raise HTTPNotFound() | |
66 |
|
66 | |||
67 | return repo_obj |
|
67 | return repo_obj | |
68 |
|
68 | |||
69 | def __load_defaults(self, repo=None): |
|
69 | def __load_defaults(self, repo=None): | |
70 | extras = [] if repo is None else [repo.group] |
|
70 | extras = [] if repo is None else [repo.group] | |
71 |
|
71 | |||
72 | c.repo_groups = AvailableRepoGroupChoices('write', extras) |
|
72 | c.repo_groups = AvailableRepoGroupChoices('write', extras) | |
73 |
|
73 | |||
74 | c.landing_revs_choices, c.landing_revs = ScmModel().get_repo_landing_revs(repo) |
|
74 | c.landing_revs_choices, c.landing_revs = ScmModel().get_repo_landing_revs(repo) | |
75 |
|
75 | |||
76 | def __load_data(self): |
|
76 | def __load_data(self): | |
77 | """ |
|
77 | """ | |
78 | Load default settings for edit and update |
|
78 | Load default settings for edit and update | |
79 | """ |
|
79 | """ | |
80 | c.repo_info = self._load_repo() |
|
80 | c.repo_info = self._load_repo() | |
81 | self.__load_defaults(c.repo_info) |
|
81 | self.__load_defaults(c.repo_info) | |
82 |
|
82 | |||
83 | defaults = RepoModel()._get_defaults(c.repo_name) |
|
83 | defaults = RepoModel()._get_defaults(c.repo_name) | |
84 | defaults['clone_uri'] = c.repo_info.clone_uri_hidden # don't show password |
|
84 | defaults['clone_uri'] = c.repo_info.clone_uri_hidden # don't show password | |
85 | defaults['permanent_url'] = c.repo_info.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=True) |
|
85 | defaults['permanent_url'] = c.repo_info.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=True) | |
86 |
|
86 | |||
87 | return defaults |
|
87 | return defaults | |
88 |
|
88 | |||
89 | def index(self, format='html'): |
|
89 | def index(self, format='html'): | |
90 | repos_list = RepoList(db.Repository.query(sorted=True).all(), perm_level='admin') |
|
90 | repos_list = RepoList(db.Repository.query(sorted=True).all(), perm_level='admin') | |
91 | # the repo list will be filtered to only show repos where the user has read permissions |
|
91 | # the repo list will be filtered to only show repos where the user has read permissions | |
92 | repos_data = RepoModel().get_repos_as_dict(repos_list, admin=True) |
|
92 | repos_data = RepoModel().get_repos_as_dict(repos_list, admin=True) | |
93 | # data used to render the grid |
|
93 | # data used to render the grid | |
94 | c.data = repos_data |
|
94 | c.data = repos_data | |
95 |
|
95 | |||
96 | return render('admin/repos/repos.html') |
|
96 | return base.render('admin/repos/repos.html') | |
97 |
|
97 | |||
98 | @NotAnonymous() |
|
98 | @NotAnonymous() | |
99 | def create(self): |
|
99 | def create(self): | |
100 | self.__load_defaults() |
|
100 | self.__load_defaults() | |
101 | try: |
|
101 | try: | |
102 | # the CanWriteGroup validator checks permissions for this POST |
|
102 | # the CanWriteGroup validator checks permissions for this POST | |
103 | form_result = RepoForm(repo_groups=c.repo_groups, |
|
103 | form_result = RepoForm(repo_groups=c.repo_groups, | |
104 | landing_revs=c.landing_revs_choices)() \ |
|
104 | landing_revs=c.landing_revs_choices)() \ | |
105 | .to_python(dict(request.POST)) |
|
105 | .to_python(dict(request.POST)) | |
106 | except formencode.Invalid as errors: |
|
106 | except formencode.Invalid as errors: | |
107 | log.info(errors) |
|
107 | log.info(errors) | |
108 | return htmlfill.render( |
|
108 | return htmlfill.render( | |
109 | render('admin/repos/repo_add.html'), |
|
109 | base.render('admin/repos/repo_add.html'), | |
110 | defaults=errors.value, |
|
110 | defaults=errors.value, | |
111 | errors=errors.error_dict or {}, |
|
111 | errors=errors.error_dict or {}, | |
112 | prefix_error=False, |
|
112 | prefix_error=False, | |
113 | force_defaults=False, |
|
113 | force_defaults=False, | |
114 | encoding="UTF-8") |
|
114 | encoding="UTF-8") | |
115 |
|
115 | |||
116 | try: |
|
116 | try: | |
117 | # creation is sometimes done asynchronously via Celery; db transaction |
|
117 | # creation is sometimes done asynchronously via Celery; db transaction | |
118 | # management is handled there. |
|
118 | # management is handled there. | |
119 | task = RepoModel().create(form_result, request.authuser.user_id) |
|
119 | task = RepoModel().create(form_result, request.authuser.user_id) | |
120 | task_id = task.task_id |
|
120 | task_id = task.task_id | |
121 | except Exception: |
|
121 | except Exception: | |
122 | log.error(traceback.format_exc()) |
|
122 | log.error(traceback.format_exc()) | |
123 | msg = (_('Error creating repository %s') |
|
123 | msg = (_('Error creating repository %s') | |
124 | % form_result.get('repo_name')) |
|
124 | % form_result.get('repo_name')) | |
125 | webutils.flash(msg, category='error') |
|
125 | webutils.flash(msg, category='error') | |
126 | raise HTTPFound(location=url('home')) |
|
126 | raise HTTPFound(location=url('home')) | |
127 |
|
127 | |||
128 | raise HTTPFound(location=webutils.url('repo_creating_home', |
|
128 | raise HTTPFound(location=webutils.url('repo_creating_home', | |
129 | repo_name=form_result['repo_name_full'], |
|
129 | repo_name=form_result['repo_name_full'], | |
130 | task_id=task_id)) |
|
130 | task_id=task_id)) | |
131 |
|
131 | |||
132 | @NotAnonymous() |
|
132 | @NotAnonymous() | |
133 | def create_repository(self): |
|
133 | def create_repository(self): | |
134 | self.__load_defaults() |
|
134 | self.__load_defaults() | |
135 | if not c.repo_groups: |
|
135 | if not c.repo_groups: | |
136 | raise HTTPForbidden |
|
136 | raise HTTPForbidden | |
137 | parent_group = request.GET.get('parent_group') |
|
137 | parent_group = request.GET.get('parent_group') | |
138 |
|
138 | |||
139 | ## apply the defaults from the defaults page |
|
139 | ## apply the defaults from the defaults page | |
140 | defaults = db.Setting.get_default_repo_settings(strip_prefix=True) |
|
140 | defaults = db.Setting.get_default_repo_settings(strip_prefix=True) | |
141 | if parent_group: |
|
141 | if parent_group: | |
142 | prg = db.RepoGroup.get(parent_group) |
|
142 | prg = db.RepoGroup.get(parent_group) | |
143 | if prg is None or not any(rgc[0] == prg.group_id |
|
143 | if prg is None or not any(rgc[0] == prg.group_id | |
144 | for rgc in c.repo_groups): |
|
144 | for rgc in c.repo_groups): | |
145 | raise HTTPForbidden |
|
145 | raise HTTPForbidden | |
146 | else: |
|
146 | else: | |
147 | parent_group = '-1' |
|
147 | parent_group = '-1' | |
148 | defaults.update({'repo_group': parent_group}) |
|
148 | defaults.update({'repo_group': parent_group}) | |
149 |
|
149 | |||
150 | return htmlfill.render( |
|
150 | return htmlfill.render( | |
151 | render('admin/repos/repo_add.html'), |
|
151 | base.render('admin/repos/repo_add.html'), | |
152 | defaults=defaults, |
|
152 | defaults=defaults, | |
153 | errors={}, |
|
153 | errors={}, | |
154 | prefix_error=False, |
|
154 | prefix_error=False, | |
155 | encoding="UTF-8", |
|
155 | encoding="UTF-8", | |
156 | force_defaults=False) |
|
156 | force_defaults=False) | |
157 |
|
157 | |||
158 | @LoginRequired() |
|
158 | @LoginRequired() | |
159 | def repo_creating(self, repo_name): |
|
159 | def repo_creating(self, repo_name): | |
160 | c.repo = repo_name |
|
160 | c.repo = repo_name | |
161 | c.task_id = request.GET.get('task_id') |
|
161 | c.task_id = request.GET.get('task_id') | |
162 | if not c.repo: |
|
162 | if not c.repo: | |
163 | raise HTTPNotFound() |
|
163 | raise HTTPNotFound() | |
164 | return render('admin/repos/repo_creating.html') |
|
164 | return base.render('admin/repos/repo_creating.html') | |
165 |
|
165 | |||
166 | @LoginRequired() |
|
166 | @LoginRequired() | |
167 | @jsonify |
|
167 | @base.jsonify | |
168 | def repo_check(self, repo_name): |
|
168 | def repo_check(self, repo_name): | |
169 | c.repo = repo_name |
|
169 | c.repo = repo_name | |
170 | task_id = request.GET.get('task_id') |
|
170 | task_id = request.GET.get('task_id') | |
171 |
|
171 | |||
172 | if task_id and task_id not in ['None']: |
|
172 | if task_id and task_id not in ['None']: | |
173 | if kallithea.CELERY_APP: |
|
173 | if kallithea.CELERY_APP: | |
174 | task_result = kallithea.CELERY_APP.AsyncResult(task_id) |
|
174 | task_result = kallithea.CELERY_APP.AsyncResult(task_id) | |
175 | if task_result.failed(): |
|
175 | if task_result.failed(): | |
176 | raise HTTPInternalServerError(task_result.traceback) |
|
176 | raise HTTPInternalServerError(task_result.traceback) | |
177 |
|
177 | |||
178 | repo = db.Repository.get_by_repo_name(repo_name) |
|
178 | repo = db.Repository.get_by_repo_name(repo_name) | |
179 | if repo and repo.repo_state == db.Repository.STATE_CREATED: |
|
179 | if repo and repo.repo_state == db.Repository.STATE_CREATED: | |
180 | if repo.clone_uri: |
|
180 | if repo.clone_uri: | |
181 | webutils.flash(_('Created repository %s from %s') |
|
181 | webutils.flash(_('Created repository %s from %s') | |
182 | % (repo.repo_name, repo.clone_uri_hidden), category='success') |
|
182 | % (repo.repo_name, repo.clone_uri_hidden), category='success') | |
183 | else: |
|
183 | else: | |
184 | repo_url = webutils.link_to(repo.repo_name, |
|
184 | repo_url = webutils.link_to(repo.repo_name, | |
185 | webutils.url('summary_home', |
|
185 | webutils.url('summary_home', | |
186 | repo_name=repo.repo_name)) |
|
186 | repo_name=repo.repo_name)) | |
187 | fork = repo.fork |
|
187 | fork = repo.fork | |
188 | if fork is not None: |
|
188 | if fork is not None: | |
189 | fork_name = fork.repo_name |
|
189 | fork_name = fork.repo_name | |
190 | webutils.flash(webutils.HTML(_('Forked repository %s as %s')) |
|
190 | webutils.flash(webutils.HTML(_('Forked repository %s as %s')) | |
191 | % (fork_name, repo_url), category='success') |
|
191 | % (fork_name, repo_url), category='success') | |
192 | else: |
|
192 | else: | |
193 | webutils.flash(webutils.HTML(_('Created repository %s')) % repo_url, |
|
193 | webutils.flash(webutils.HTML(_('Created repository %s')) % repo_url, | |
194 | category='success') |
|
194 | category='success') | |
195 | return {'result': True} |
|
195 | return {'result': True} | |
196 | return {'result': False} |
|
196 | return {'result': False} | |
197 |
|
197 | |||
198 | @HasRepoPermissionLevelDecorator('admin') |
|
198 | @HasRepoPermissionLevelDecorator('admin') | |
199 | def update(self, repo_name): |
|
199 | def update(self, repo_name): | |
200 | c.repo_info = self._load_repo() |
|
200 | c.repo_info = self._load_repo() | |
201 | self.__load_defaults(c.repo_info) |
|
201 | self.__load_defaults(c.repo_info) | |
202 | c.active = 'settings' |
|
202 | c.active = 'settings' | |
203 | c.repo_fields = db.RepositoryField.query() \ |
|
203 | c.repo_fields = db.RepositoryField.query() \ | |
204 | .filter(db.RepositoryField.repository == c.repo_info).all() |
|
204 | .filter(db.RepositoryField.repository == c.repo_info).all() | |
205 |
|
205 | |||
206 | repo_model = RepoModel() |
|
206 | repo_model = RepoModel() | |
207 | changed_name = repo_name |
|
207 | changed_name = repo_name | |
208 | repo = db.Repository.get_by_repo_name(repo_name) |
|
208 | repo = db.Repository.get_by_repo_name(repo_name) | |
209 | old_data = { |
|
209 | old_data = { | |
210 | 'repo_name': repo_name, |
|
210 | 'repo_name': repo_name, | |
211 | 'repo_group': repo.group.get_dict() if repo.group else {}, |
|
211 | 'repo_group': repo.group.get_dict() if repo.group else {}, | |
212 | 'repo_type': repo.repo_type, |
|
212 | 'repo_type': repo.repo_type, | |
213 | } |
|
213 | } | |
214 | _form = RepoForm(edit=True, old_data=old_data, |
|
214 | _form = RepoForm(edit=True, old_data=old_data, | |
215 | repo_groups=c.repo_groups, |
|
215 | repo_groups=c.repo_groups, | |
216 | landing_revs=c.landing_revs_choices)() |
|
216 | landing_revs=c.landing_revs_choices)() | |
217 |
|
217 | |||
218 | try: |
|
218 | try: | |
219 | form_result = _form.to_python(dict(request.POST)) |
|
219 | form_result = _form.to_python(dict(request.POST)) | |
220 | repo = repo_model.update(repo_name, **form_result) |
|
220 | repo = repo_model.update(repo_name, **form_result) | |
221 | ScmModel().mark_for_invalidation(repo_name) |
|
221 | ScmModel().mark_for_invalidation(repo_name) | |
222 | webutils.flash(_('Repository %s updated successfully') % repo_name, |
|
222 | webutils.flash(_('Repository %s updated successfully') % repo_name, | |
223 | category='success') |
|
223 | category='success') | |
224 | changed_name = repo.repo_name |
|
224 | changed_name = repo.repo_name | |
225 | userlog.action_logger(request.authuser, 'admin_updated_repo', |
|
225 | userlog.action_logger(request.authuser, 'admin_updated_repo', | |
226 | changed_name, request.ip_addr) |
|
226 | changed_name, request.ip_addr) | |
227 | meta.Session().commit() |
|
227 | meta.Session().commit() | |
228 | except formencode.Invalid as errors: |
|
228 | except formencode.Invalid as errors: | |
229 | log.info(errors) |
|
229 | log.info(errors) | |
230 | defaults = self.__load_data() |
|
230 | defaults = self.__load_data() | |
231 | defaults.update(errors.value) |
|
231 | defaults.update(errors.value) | |
232 | return htmlfill.render( |
|
232 | return htmlfill.render( | |
233 | render('admin/repos/repo_edit.html'), |
|
233 | base.render('admin/repos/repo_edit.html'), | |
234 | defaults=defaults, |
|
234 | defaults=defaults, | |
235 | errors=errors.error_dict or {}, |
|
235 | errors=errors.error_dict or {}, | |
236 | prefix_error=False, |
|
236 | prefix_error=False, | |
237 | encoding="UTF-8", |
|
237 | encoding="UTF-8", | |
238 | force_defaults=False) |
|
238 | force_defaults=False) | |
239 |
|
239 | |||
240 | except Exception: |
|
240 | except Exception: | |
241 | log.error(traceback.format_exc()) |
|
241 | log.error(traceback.format_exc()) | |
242 | webutils.flash(_('Error occurred during update of repository %s') |
|
242 | webutils.flash(_('Error occurred during update of repository %s') | |
243 | % repo_name, category='error') |
|
243 | % repo_name, category='error') | |
244 | raise HTTPFound(location=url('edit_repo', repo_name=changed_name)) |
|
244 | raise HTTPFound(location=url('edit_repo', repo_name=changed_name)) | |
245 |
|
245 | |||
246 | @HasRepoPermissionLevelDecorator('admin') |
|
246 | @HasRepoPermissionLevelDecorator('admin') | |
247 | def delete(self, repo_name): |
|
247 | def delete(self, repo_name): | |
248 | repo_model = RepoModel() |
|
248 | repo_model = RepoModel() | |
249 | repo = repo_model.get_by_repo_name(repo_name) |
|
249 | repo = repo_model.get_by_repo_name(repo_name) | |
250 | if not repo: |
|
250 | if not repo: | |
251 | raise HTTPNotFound() |
|
251 | raise HTTPNotFound() | |
252 | try: |
|
252 | try: | |
253 | _forks = repo.forks.count() |
|
253 | _forks = repo.forks.count() | |
254 | handle_forks = None |
|
254 | handle_forks = None | |
255 | if _forks and request.POST.get('forks'): |
|
255 | if _forks and request.POST.get('forks'): | |
256 | do = request.POST['forks'] |
|
256 | do = request.POST['forks'] | |
257 | if do == 'detach_forks': |
|
257 | if do == 'detach_forks': | |
258 | handle_forks = 'detach' |
|
258 | handle_forks = 'detach' | |
259 | webutils.flash(_('Detached %s forks') % _forks, category='success') |
|
259 | webutils.flash(_('Detached %s forks') % _forks, category='success') | |
260 | elif do == 'delete_forks': |
|
260 | elif do == 'delete_forks': | |
261 | handle_forks = 'delete' |
|
261 | handle_forks = 'delete' | |
262 | webutils.flash(_('Deleted %s forks') % _forks, category='success') |
|
262 | webutils.flash(_('Deleted %s forks') % _forks, category='success') | |
263 | repo_model.delete(repo, forks=handle_forks) |
|
263 | repo_model.delete(repo, forks=handle_forks) | |
264 | userlog.action_logger(request.authuser, 'admin_deleted_repo', |
|
264 | userlog.action_logger(request.authuser, 'admin_deleted_repo', | |
265 | repo_name, request.ip_addr) |
|
265 | repo_name, request.ip_addr) | |
266 | ScmModel().mark_for_invalidation(repo_name) |
|
266 | ScmModel().mark_for_invalidation(repo_name) | |
267 | webutils.flash(_('Deleted repository %s') % repo_name, category='success') |
|
267 | webutils.flash(_('Deleted repository %s') % repo_name, category='success') | |
268 | meta.Session().commit() |
|
268 | meta.Session().commit() | |
269 | except AttachedForksError: |
|
269 | except AttachedForksError: | |
270 | webutils.flash(_('Cannot delete repository %s which still has forks') |
|
270 | webutils.flash(_('Cannot delete repository %s which still has forks') | |
271 | % repo_name, category='warning') |
|
271 | % repo_name, category='warning') | |
272 |
|
272 | |||
273 | except Exception: |
|
273 | except Exception: | |
274 | log.error(traceback.format_exc()) |
|
274 | log.error(traceback.format_exc()) | |
275 | webutils.flash(_('An error occurred during deletion of %s') % repo_name, |
|
275 | webutils.flash(_('An error occurred during deletion of %s') % repo_name, | |
276 | category='error') |
|
276 | category='error') | |
277 |
|
277 | |||
278 | if repo.group: |
|
278 | if repo.group: | |
279 | raise HTTPFound(location=url('repos_group_home', group_name=repo.group.group_name)) |
|
279 | raise HTTPFound(location=url('repos_group_home', group_name=repo.group.group_name)) | |
280 | raise HTTPFound(location=url('repos')) |
|
280 | raise HTTPFound(location=url('repos')) | |
281 |
|
281 | |||
282 | @HasRepoPermissionLevelDecorator('admin') |
|
282 | @HasRepoPermissionLevelDecorator('admin') | |
283 | def edit(self, repo_name): |
|
283 | def edit(self, repo_name): | |
284 | defaults = self.__load_data() |
|
284 | defaults = self.__load_data() | |
285 | c.repo_fields = db.RepositoryField.query() \ |
|
285 | c.repo_fields = db.RepositoryField.query() \ | |
286 | .filter(db.RepositoryField.repository == c.repo_info).all() |
|
286 | .filter(db.RepositoryField.repository == c.repo_info).all() | |
287 | c.active = 'settings' |
|
287 | c.active = 'settings' | |
288 | return htmlfill.render( |
|
288 | return htmlfill.render( | |
289 | render('admin/repos/repo_edit.html'), |
|
289 | base.render('admin/repos/repo_edit.html'), | |
290 | defaults=defaults, |
|
290 | defaults=defaults, | |
291 | encoding="UTF-8", |
|
291 | encoding="UTF-8", | |
292 | force_defaults=False) |
|
292 | force_defaults=False) | |
293 |
|
293 | |||
294 | @HasRepoPermissionLevelDecorator('admin') |
|
294 | @HasRepoPermissionLevelDecorator('admin') | |
295 | def edit_permissions(self, repo_name): |
|
295 | def edit_permissions(self, repo_name): | |
296 | c.repo_info = self._load_repo() |
|
296 | c.repo_info = self._load_repo() | |
297 | c.active = 'permissions' |
|
297 | c.active = 'permissions' | |
298 | defaults = RepoModel()._get_defaults(repo_name) |
|
298 | defaults = RepoModel()._get_defaults(repo_name) | |
299 |
|
299 | |||
300 | return htmlfill.render( |
|
300 | return htmlfill.render( | |
301 | render('admin/repos/repo_edit.html'), |
|
301 | base.render('admin/repos/repo_edit.html'), | |
302 | defaults=defaults, |
|
302 | defaults=defaults, | |
303 | encoding="UTF-8", |
|
303 | encoding="UTF-8", | |
304 | force_defaults=False) |
|
304 | force_defaults=False) | |
305 |
|
305 | |||
306 | @HasRepoPermissionLevelDecorator('admin') |
|
306 | @HasRepoPermissionLevelDecorator('admin') | |
307 | def edit_permissions_update(self, repo_name): |
|
307 | def edit_permissions_update(self, repo_name): | |
308 | form = RepoPermsForm()().to_python(request.POST) |
|
308 | form = RepoPermsForm()().to_python(request.POST) | |
309 | RepoModel()._update_permissions(repo_name, form['perms_new'], |
|
309 | RepoModel()._update_permissions(repo_name, form['perms_new'], | |
310 | form['perms_updates']) |
|
310 | form['perms_updates']) | |
311 | # TODO: implement this |
|
311 | # TODO: implement this | |
312 | #action_logger(request.authuser, 'admin_changed_repo_permissions', |
|
312 | #action_logger(request.authuser, 'admin_changed_repo_permissions', | |
313 | # repo_name, request.ip_addr) |
|
313 | # repo_name, request.ip_addr) | |
314 | meta.Session().commit() |
|
314 | meta.Session().commit() | |
315 | webutils.flash(_('Repository permissions updated'), category='success') |
|
315 | webutils.flash(_('Repository permissions updated'), category='success') | |
316 | raise HTTPFound(location=url('edit_repo_perms', repo_name=repo_name)) |
|
316 | raise HTTPFound(location=url('edit_repo_perms', repo_name=repo_name)) | |
317 |
|
317 | |||
318 | @HasRepoPermissionLevelDecorator('admin') |
|
318 | @HasRepoPermissionLevelDecorator('admin') | |
319 | def edit_permissions_revoke(self, repo_name): |
|
319 | def edit_permissions_revoke(self, repo_name): | |
320 | try: |
|
320 | try: | |
321 | obj_type = request.POST.get('obj_type') |
|
321 | obj_type = request.POST.get('obj_type') | |
322 | obj_id = None |
|
322 | obj_id = None | |
323 | if obj_type == 'user': |
|
323 | if obj_type == 'user': | |
324 | obj_id = safe_int(request.POST.get('user_id')) |
|
324 | obj_id = safe_int(request.POST.get('user_id')) | |
325 | elif obj_type == 'user_group': |
|
325 | elif obj_type == 'user_group': | |
326 | obj_id = safe_int(request.POST.get('user_group_id')) |
|
326 | obj_id = safe_int(request.POST.get('user_group_id')) | |
327 | else: |
|
327 | else: | |
328 | assert False |
|
328 | assert False | |
329 |
|
329 | |||
330 | if obj_type == 'user': |
|
330 | if obj_type == 'user': | |
331 | RepoModel().revoke_user_permission(repo=repo_name, user=obj_id) |
|
331 | RepoModel().revoke_user_permission(repo=repo_name, user=obj_id) | |
332 | elif obj_type == 'user_group': |
|
332 | elif obj_type == 'user_group': | |
333 | RepoModel().revoke_user_group_permission( |
|
333 | RepoModel().revoke_user_group_permission( | |
334 | repo=repo_name, group_name=obj_id |
|
334 | repo=repo_name, group_name=obj_id | |
335 | ) |
|
335 | ) | |
336 | else: |
|
336 | else: | |
337 | assert False |
|
337 | assert False | |
338 | # TODO: implement this |
|
338 | # TODO: implement this | |
339 | #action_logger(request.authuser, 'admin_revoked_repo_permissions', |
|
339 | #action_logger(request.authuser, 'admin_revoked_repo_permissions', | |
340 | # repo_name, request.ip_addr) |
|
340 | # repo_name, request.ip_addr) | |
341 | meta.Session().commit() |
|
341 | meta.Session().commit() | |
342 | except Exception: |
|
342 | except Exception: | |
343 | log.error(traceback.format_exc()) |
|
343 | log.error(traceback.format_exc()) | |
344 | webutils.flash(_('An error occurred during revoking of permission'), |
|
344 | webutils.flash(_('An error occurred during revoking of permission'), | |
345 | category='error') |
|
345 | category='error') | |
346 | raise HTTPInternalServerError() |
|
346 | raise HTTPInternalServerError() | |
347 | return [] |
|
347 | return [] | |
348 |
|
348 | |||
349 | @HasRepoPermissionLevelDecorator('admin') |
|
349 | @HasRepoPermissionLevelDecorator('admin') | |
350 | def edit_fields(self, repo_name): |
|
350 | def edit_fields(self, repo_name): | |
351 | c.repo_info = self._load_repo() |
|
351 | c.repo_info = self._load_repo() | |
352 | c.repo_fields = db.RepositoryField.query() \ |
|
352 | c.repo_fields = db.RepositoryField.query() \ | |
353 | .filter(db.RepositoryField.repository == c.repo_info).all() |
|
353 | .filter(db.RepositoryField.repository == c.repo_info).all() | |
354 | c.active = 'fields' |
|
354 | c.active = 'fields' | |
355 | if request.POST: |
|
355 | if request.POST: | |
356 |
|
356 | |||
357 | raise HTTPFound(location=url('repo_edit_fields')) |
|
357 | raise HTTPFound(location=url('repo_edit_fields')) | |
358 | return render('admin/repos/repo_edit.html') |
|
358 | return base.render('admin/repos/repo_edit.html') | |
359 |
|
359 | |||
360 | @HasRepoPermissionLevelDecorator('admin') |
|
360 | @HasRepoPermissionLevelDecorator('admin') | |
361 | def create_repo_field(self, repo_name): |
|
361 | def create_repo_field(self, repo_name): | |
362 | try: |
|
362 | try: | |
363 | form_result = RepoFieldForm()().to_python(dict(request.POST)) |
|
363 | form_result = RepoFieldForm()().to_python(dict(request.POST)) | |
364 | new_field = db.RepositoryField() |
|
364 | new_field = db.RepositoryField() | |
365 | new_field.repository = db.Repository.get_by_repo_name(repo_name) |
|
365 | new_field.repository = db.Repository.get_by_repo_name(repo_name) | |
366 | new_field.field_key = form_result['new_field_key'] |
|
366 | new_field.field_key = form_result['new_field_key'] | |
367 | new_field.field_type = form_result['new_field_type'] # python type |
|
367 | new_field.field_type = form_result['new_field_type'] # python type | |
368 | new_field.field_value = form_result['new_field_value'] # set initial blank value |
|
368 | new_field.field_value = form_result['new_field_value'] # set initial blank value | |
369 | new_field.field_desc = form_result['new_field_desc'] |
|
369 | new_field.field_desc = form_result['new_field_desc'] | |
370 | new_field.field_label = form_result['new_field_label'] |
|
370 | new_field.field_label = form_result['new_field_label'] | |
371 | meta.Session().add(new_field) |
|
371 | meta.Session().add(new_field) | |
372 | meta.Session().commit() |
|
372 | meta.Session().commit() | |
373 | except formencode.Invalid as e: |
|
373 | except formencode.Invalid as e: | |
374 | webutils.flash(_('Field validation error: %s') % e.msg, category='error') |
|
374 | webutils.flash(_('Field validation error: %s') % e.msg, category='error') | |
375 | except Exception as e: |
|
375 | except Exception as e: | |
376 | log.error(traceback.format_exc()) |
|
376 | log.error(traceback.format_exc()) | |
377 | webutils.flash(_('An error occurred during creation of field: %r') % e, category='error') |
|
377 | webutils.flash(_('An error occurred during creation of field: %r') % e, category='error') | |
378 | raise HTTPFound(location=url('edit_repo_fields', repo_name=repo_name)) |
|
378 | raise HTTPFound(location=url('edit_repo_fields', repo_name=repo_name)) | |
379 |
|
379 | |||
380 | @HasRepoPermissionLevelDecorator('admin') |
|
380 | @HasRepoPermissionLevelDecorator('admin') | |
381 | def delete_repo_field(self, repo_name, field_id): |
|
381 | def delete_repo_field(self, repo_name, field_id): | |
382 | field = db.RepositoryField.get_or_404(field_id) |
|
382 | field = db.RepositoryField.get_or_404(field_id) | |
383 | try: |
|
383 | try: | |
384 | meta.Session().delete(field) |
|
384 | meta.Session().delete(field) | |
385 | meta.Session().commit() |
|
385 | meta.Session().commit() | |
386 | except Exception as e: |
|
386 | except Exception as e: | |
387 | log.error(traceback.format_exc()) |
|
387 | log.error(traceback.format_exc()) | |
388 | msg = _('An error occurred during removal of field') |
|
388 | msg = _('An error occurred during removal of field') | |
389 | webutils.flash(msg, category='error') |
|
389 | webutils.flash(msg, category='error') | |
390 | raise HTTPFound(location=url('edit_repo_fields', repo_name=repo_name)) |
|
390 | raise HTTPFound(location=url('edit_repo_fields', repo_name=repo_name)) | |
391 |
|
391 | |||
392 | @HasRepoPermissionLevelDecorator('admin') |
|
392 | @HasRepoPermissionLevelDecorator('admin') | |
393 | def edit_advanced(self, repo_name): |
|
393 | def edit_advanced(self, repo_name): | |
394 | c.repo_info = self._load_repo() |
|
394 | c.repo_info = self._load_repo() | |
395 | c.default_user_id = kallithea.DEFAULT_USER_ID |
|
395 | c.default_user_id = kallithea.DEFAULT_USER_ID | |
396 | c.in_public_journal = db.UserFollowing.query() \ |
|
396 | c.in_public_journal = db.UserFollowing.query() \ | |
397 | .filter(db.UserFollowing.user_id == c.default_user_id) \ |
|
397 | .filter(db.UserFollowing.user_id == c.default_user_id) \ | |
398 | .filter(db.UserFollowing.follows_repository == c.repo_info).scalar() |
|
398 | .filter(db.UserFollowing.follows_repository == c.repo_info).scalar() | |
399 |
|
399 | |||
400 | _repos = db.Repository.query(sorted=True).all() |
|
400 | _repos = db.Repository.query(sorted=True).all() | |
401 | read_access_repos = RepoList(_repos, perm_level='read') |
|
401 | read_access_repos = RepoList(_repos, perm_level='read') | |
402 | c.repos_list = [(None, _('-- Not a fork --'))] |
|
402 | c.repos_list = [(None, _('-- Not a fork --'))] | |
403 | c.repos_list += [(x.repo_id, x.repo_name) |
|
403 | c.repos_list += [(x.repo_id, x.repo_name) | |
404 | for x in read_access_repos |
|
404 | for x in read_access_repos | |
405 | if x.repo_id != c.repo_info.repo_id |
|
405 | if x.repo_id != c.repo_info.repo_id | |
406 | and x.repo_type == c.repo_info.repo_type] |
|
406 | and x.repo_type == c.repo_info.repo_type] | |
407 |
|
407 | |||
408 | defaults = { |
|
408 | defaults = { | |
409 | 'id_fork_of': c.repo_info.fork_id if c.repo_info.fork_id else '' |
|
409 | 'id_fork_of': c.repo_info.fork_id if c.repo_info.fork_id else '' | |
410 | } |
|
410 | } | |
411 |
|
411 | |||
412 | c.active = 'advanced' |
|
412 | c.active = 'advanced' | |
413 | if request.POST: |
|
413 | if request.POST: | |
414 | raise HTTPFound(location=url('repo_edit_advanced')) |
|
414 | raise HTTPFound(location=url('repo_edit_advanced')) | |
415 | return htmlfill.render( |
|
415 | return htmlfill.render( | |
416 | render('admin/repos/repo_edit.html'), |
|
416 | base.render('admin/repos/repo_edit.html'), | |
417 | defaults=defaults, |
|
417 | defaults=defaults, | |
418 | encoding="UTF-8", |
|
418 | encoding="UTF-8", | |
419 | force_defaults=False) |
|
419 | force_defaults=False) | |
420 |
|
420 | |||
421 | @HasRepoPermissionLevelDecorator('admin') |
|
421 | @HasRepoPermissionLevelDecorator('admin') | |
422 | def edit_advanced_journal(self, repo_name): |
|
422 | def edit_advanced_journal(self, repo_name): | |
423 | """ |
|
423 | """ | |
424 | Sets this repository to be visible in the public journal, |
|
424 | Sets this repository to be visible in the public journal, | |
425 | i.e. asks the default user to follow this repo |
|
425 | i.e. asks the default user to follow this repo | |
426 |
|
426 | |||
427 | :param repo_name: |
|
427 | :param repo_name: | |
428 | """ |
|
428 | """ | |
429 |
|
429 | |||
430 | try: |
|
430 | try: | |
431 | repo_id = db.Repository.get_by_repo_name(repo_name).repo_id |
|
431 | repo_id = db.Repository.get_by_repo_name(repo_name).repo_id | |
432 | user_id = kallithea.DEFAULT_USER_ID |
|
432 | user_id = kallithea.DEFAULT_USER_ID | |
433 | self.scm_model.toggle_following_repo(repo_id, user_id) |
|
433 | self.scm_model.toggle_following_repo(repo_id, user_id) | |
434 | webutils.flash(_('Updated repository visibility in public journal'), |
|
434 | webutils.flash(_('Updated repository visibility in public journal'), | |
435 | category='success') |
|
435 | category='success') | |
436 | meta.Session().commit() |
|
436 | meta.Session().commit() | |
437 | except Exception: |
|
437 | except Exception: | |
438 | webutils.flash(_('An error occurred during setting this' |
|
438 | webutils.flash(_('An error occurred during setting this' | |
439 | ' repository in public journal'), |
|
439 | ' repository in public journal'), | |
440 | category='error') |
|
440 | category='error') | |
441 | raise HTTPFound(location=url('edit_repo_advanced', repo_name=repo_name)) |
|
441 | raise HTTPFound(location=url('edit_repo_advanced', repo_name=repo_name)) | |
442 |
|
442 | |||
443 | @HasRepoPermissionLevelDecorator('admin') |
|
443 | @HasRepoPermissionLevelDecorator('admin') | |
444 | def edit_advanced_fork(self, repo_name): |
|
444 | def edit_advanced_fork(self, repo_name): | |
445 | """ |
|
445 | """ | |
446 | Mark given repository as a fork of another |
|
446 | Mark given repository as a fork of another | |
447 |
|
447 | |||
448 | :param repo_name: |
|
448 | :param repo_name: | |
449 | """ |
|
449 | """ | |
450 | try: |
|
450 | try: | |
451 | fork_id = request.POST.get('id_fork_of') |
|
451 | fork_id = request.POST.get('id_fork_of') | |
452 | repo = ScmModel().mark_as_fork(repo_name, fork_id, |
|
452 | repo = ScmModel().mark_as_fork(repo_name, fork_id, | |
453 | request.authuser.username) |
|
453 | request.authuser.username) | |
454 | fork = repo.fork.repo_name if repo.fork else _('Nothing') |
|
454 | fork = repo.fork.repo_name if repo.fork else _('Nothing') | |
455 | meta.Session().commit() |
|
455 | meta.Session().commit() | |
456 | webutils.flash(_('Marked repository %s as fork of %s') % (repo_name, fork), |
|
456 | webutils.flash(_('Marked repository %s as fork of %s') % (repo_name, fork), | |
457 | category='success') |
|
457 | category='success') | |
458 | except RepositoryError as e: |
|
458 | except RepositoryError as e: | |
459 | log.error(traceback.format_exc()) |
|
459 | log.error(traceback.format_exc()) | |
460 | webutils.flash(e, category='error') |
|
460 | webutils.flash(e, category='error') | |
461 | except Exception as e: |
|
461 | except Exception as e: | |
462 | log.error(traceback.format_exc()) |
|
462 | log.error(traceback.format_exc()) | |
463 | webutils.flash(_('An error occurred during this operation'), |
|
463 | webutils.flash(_('An error occurred during this operation'), | |
464 | category='error') |
|
464 | category='error') | |
465 |
|
465 | |||
466 | raise HTTPFound(location=url('edit_repo_advanced', repo_name=repo_name)) |
|
466 | raise HTTPFound(location=url('edit_repo_advanced', repo_name=repo_name)) | |
467 |
|
467 | |||
468 | @HasRepoPermissionLevelDecorator('admin') |
|
468 | @HasRepoPermissionLevelDecorator('admin') | |
469 | def edit_remote(self, repo_name): |
|
469 | def edit_remote(self, repo_name): | |
470 | c.repo_info = self._load_repo() |
|
470 | c.repo_info = self._load_repo() | |
471 | c.active = 'remote' |
|
471 | c.active = 'remote' | |
472 | if request.POST: |
|
472 | if request.POST: | |
473 | try: |
|
473 | try: | |
474 | ScmModel().pull_changes(repo_name, request.authuser.username, request.ip_addr) |
|
474 | ScmModel().pull_changes(repo_name, request.authuser.username, request.ip_addr) | |
475 | webutils.flash(_('Pulled from remote location'), category='success') |
|
475 | webutils.flash(_('Pulled from remote location'), category='success') | |
476 | except Exception as e: |
|
476 | except Exception as e: | |
477 | log.error(traceback.format_exc()) |
|
477 | log.error(traceback.format_exc()) | |
478 | webutils.flash(_('An error occurred during pull from remote location'), |
|
478 | webutils.flash(_('An error occurred during pull from remote location'), | |
479 | category='error') |
|
479 | category='error') | |
480 | raise HTTPFound(location=url('edit_repo_remote', repo_name=c.repo_name)) |
|
480 | raise HTTPFound(location=url('edit_repo_remote', repo_name=c.repo_name)) | |
481 | return render('admin/repos/repo_edit.html') |
|
481 | return base.render('admin/repos/repo_edit.html') | |
482 |
|
482 | |||
483 | @HasRepoPermissionLevelDecorator('admin') |
|
483 | @HasRepoPermissionLevelDecorator('admin') | |
484 | def edit_statistics(self, repo_name): |
|
484 | def edit_statistics(self, repo_name): | |
485 | c.repo_info = self._load_repo() |
|
485 | c.repo_info = self._load_repo() | |
486 | repo = c.repo_info.scm_instance |
|
486 | repo = c.repo_info.scm_instance | |
487 |
|
487 | |||
488 | if c.repo_info.stats: |
|
488 | if c.repo_info.stats: | |
489 | # this is on what revision we ended up so we add +1 for count |
|
489 | # this is on what revision we ended up so we add +1 for count | |
490 | last_rev = c.repo_info.stats.stat_on_revision + 1 |
|
490 | last_rev = c.repo_info.stats.stat_on_revision + 1 | |
491 | else: |
|
491 | else: | |
492 | last_rev = 0 |
|
492 | last_rev = 0 | |
493 | c.stats_revision = last_rev |
|
493 | c.stats_revision = last_rev | |
494 |
|
494 | |||
495 | c.repo_last_rev = repo.count() if repo.revisions else 0 |
|
495 | c.repo_last_rev = repo.count() if repo.revisions else 0 | |
496 |
|
496 | |||
497 | if last_rev == 0 or c.repo_last_rev == 0: |
|
497 | if last_rev == 0 or c.repo_last_rev == 0: | |
498 | c.stats_percentage = 0 |
|
498 | c.stats_percentage = 0 | |
499 | else: |
|
499 | else: | |
500 | c.stats_percentage = '%.2f' % ((float((last_rev)) / c.repo_last_rev) * 100) |
|
500 | c.stats_percentage = '%.2f' % ((float((last_rev)) / c.repo_last_rev) * 100) | |
501 |
|
501 | |||
502 | c.active = 'statistics' |
|
502 | c.active = 'statistics' | |
503 | if request.POST: |
|
503 | if request.POST: | |
504 | try: |
|
504 | try: | |
505 | RepoModel().delete_stats(repo_name) |
|
505 | RepoModel().delete_stats(repo_name) | |
506 | meta.Session().commit() |
|
506 | meta.Session().commit() | |
507 | except Exception as e: |
|
507 | except Exception as e: | |
508 | log.error(traceback.format_exc()) |
|
508 | log.error(traceback.format_exc()) | |
509 | webutils.flash(_('An error occurred during deletion of repository stats'), |
|
509 | webutils.flash(_('An error occurred during deletion of repository stats'), | |
510 | category='error') |
|
510 | category='error') | |
511 | raise HTTPFound(location=url('edit_repo_statistics', repo_name=c.repo_name)) |
|
511 | raise HTTPFound(location=url('edit_repo_statistics', repo_name=c.repo_name)) | |
512 |
|
512 | |||
513 | return render('admin/repos/repo_edit.html') |
|
513 | return base.render('admin/repos/repo_edit.html') |
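
The hunks above follow the pattern used throughout this changeset: the bare ``render`` helper imported from ``kallithea.lib.base`` is replaced by the same helper reached through the ``kallithea.controllers.base`` module. A minimal sketch of the resulting controller shape, assuming a Kallithea development environment (the class and action names are hypothetical; only the import and call style are taken from the diff)::

    from kallithea.controllers import base


    class ExampleController(base.BaseController):
        """Hypothetical controller illustrating the new import style."""

        def show_edit_page(self, repo_name):
            # before this changeset: return render('admin/repos/repo_edit.html')
            return base.render('admin/repos/repo_edit.html')
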
@@ -1,410 +1,410 b''
# -*- coding: utf-8 -*-
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
kallithea.controllers.admin.settings
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

settings controller for Kallithea admin

This file was forked by the Kallithea project in July 2014.
Original author and date, and relevant copyright and licensing information is below:
:created_on: Jul 14, 2010
:author: marcink
:copyright: (c) 2013 RhodeCode GmbH, and others.
:license: GPLv3, see LICENSE.md for more details.
"""

import logging
import traceback

import formencode
from formencode import htmlfill
from tg import config, request
from tg import tmpl_context as c
from tg.i18n import ugettext as _
from webob.exc import HTTPFound

import kallithea
+from kallithea.controllers import base
from kallithea.lib import webutils
from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired
-from kallithea.lib.base import BaseController, render
from kallithea.lib.utils import repo2db_mapper, set_app_settings
from kallithea.lib.utils2 import safe_str
from kallithea.lib.vcs import VCSError
from kallithea.lib.webutils import url
from kallithea.model import async_tasks, db, meta
from kallithea.model.forms import ApplicationSettingsForm, ApplicationUiSettingsForm, ApplicationVisualisationForm
from kallithea.model.notification import EmailNotificationModel
from kallithea.model.scm import ScmModel


log = logging.getLogger(__name__)


-class SettingsController(BaseController):
+class SettingsController(base.BaseController):

    @LoginRequired(allow_default_user=True)
    def _before(self, *args, **kwargs):
        super(SettingsController, self)._before(*args, **kwargs)

    def _get_hg_ui_settings(self):
        ret = db.Ui.query().all()

        settings = {}
        for each in ret:
            k = each.ui_section + '_' + each.ui_key
            v = each.ui_value
            if k == 'paths_/':
                k = 'paths_root_path'

            k = k.replace('.', '_')

            if each.ui_section in ['hooks', 'extensions']:
                v = each.ui_active

            settings[k] = v
        return settings

    @HasPermissionAnyDecorator('hg.admin')
    def settings_vcs(self):
        c.active = 'vcs'
        if request.POST:
            application_form = ApplicationUiSettingsForm()()
            try:
                form_result = application_form.to_python(dict(request.POST))
            except formencode.Invalid as errors:
                return htmlfill.render(
-                    render('admin/settings/settings.html'),
+                    base.render('admin/settings/settings.html'),
                    defaults=errors.value,
                    errors=errors.error_dict or {},
                    prefix_error=False,
                    encoding="UTF-8",
                    force_defaults=False)

            try:
                if c.visual.allow_repo_location_change:
                    sett = db.Ui.get_by_key('paths', '/')
                    sett.ui_value = form_result['paths_root_path']

                # HOOKS
                sett = db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE)
                sett.ui_active = form_result['hooks_changegroup_update']

                sett = db.Ui.get_by_key('hooks', db.Ui.HOOK_REPO_SIZE)
                sett.ui_active = form_result['hooks_changegroup_repo_size']

                ## EXTENSIONS
                sett = db.Ui.get_or_create('extensions', 'largefiles')
                sett.ui_active = form_result['extensions_largefiles']

                # sett = db.Ui.get_or_create('extensions', 'hggit')
                # sett.ui_active = form_result['extensions_hggit']

                meta.Session().commit()

                webutils.flash(_('Updated VCS settings'), category='success')

            except Exception:
                log.error(traceback.format_exc())
                webutils.flash(_('Error occurred while updating '
                                 'application settings'), category='error')

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_mapping(self):
        c.active = 'mapping'
        if request.POST:
            rm_obsolete = request.POST.get('destroy', False)
            install_git_hooks = request.POST.get('hooks', False)
            overwrite_git_hooks = request.POST.get('hooks_overwrite', False)
            invalidate_cache = request.POST.get('invalidate', False)
            log.debug('rescanning repo location with destroy obsolete=%s, '
                      'install git hooks=%s and '
                      'overwrite git hooks=%s' % (rm_obsolete, install_git_hooks, overwrite_git_hooks))

            filesystem_repos = ScmModel().repo_scan()
            added, removed = repo2db_mapper(filesystem_repos, rm_obsolete,
                                            install_git_hooks=install_git_hooks,
                                            user=request.authuser.username,
                                            overwrite_git_hooks=overwrite_git_hooks)
            added_msg = webutils.HTML(', ').join(
                webutils.link_to(safe_str(repo_name), webutils.url('summary_home', repo_name=repo_name)) for repo_name in added
            ) or '-'
            removed_msg = webutils.HTML(', ').join(
                safe_str(repo_name) for repo_name in removed
            ) or '-'
            webutils.flash(webutils.HTML(_('Repositories successfully rescanned. Added: %s. Removed: %s.')) %
                           (added_msg, removed_msg), category='success')

            if invalidate_cache:
                log.debug('invalidating all repositories cache')
                i = 0
                for repo in db.Repository.query():
                    try:
                        ScmModel().mark_for_invalidation(repo.repo_name)
                        i += 1
                    except VCSError as e:
                        log.warning('VCS error invalidating %s: %s', repo.repo_name, e)
                webutils.flash(_('Invalidated %s repositories') % i, category='success')

            raise HTTPFound(location=url('admin_settings_mapping'))

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_global(self):
        c.active = 'global'
        if request.POST:
            application_form = ApplicationSettingsForm()()
            try:
                form_result = application_form.to_python(dict(request.POST))
            except formencode.Invalid as errors:
                return htmlfill.render(
-                    render('admin/settings/settings.html'),
+                    base.render('admin/settings/settings.html'),
                    defaults=errors.value,
                    errors=errors.error_dict or {},
                    prefix_error=False,
                    encoding="UTF-8",
                    force_defaults=False)

            try:
                for setting in (
                    'title',
                    'realm',
                    'ga_code',
                    'captcha_public_key',
                    'captcha_private_key',
                ):
                    db.Setting.create_or_update(setting, form_result[setting])

                meta.Session().commit()
                set_app_settings(config)
                webutils.flash(_('Updated application settings'), category='success')

            except Exception:
                log.error(traceback.format_exc())
                webutils.flash(_('Error occurred while updating '
                                 'application settings'),
                               category='error')

            raise HTTPFound(location=url('admin_settings_global'))

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_visual(self):
        c.active = 'visual'
        if request.POST:
            application_form = ApplicationVisualisationForm()()
            try:
                form_result = application_form.to_python(dict(request.POST))
            except formencode.Invalid as errors:
                return htmlfill.render(
-                    render('admin/settings/settings.html'),
+                    base.render('admin/settings/settings.html'),
                    defaults=errors.value,
                    errors=errors.error_dict or {},
                    prefix_error=False,
                    encoding="UTF-8",
                    force_defaults=False)

            try:
                settings = [
                    ('show_public_icon', 'show_public_icon', 'bool'),
                    ('show_private_icon', 'show_private_icon', 'bool'),
                    ('stylify_metalabels', 'stylify_metalabels', 'bool'),
                    ('repository_fields', 'repository_fields', 'bool'),
                    ('dashboard_items', 'dashboard_items', 'int'),
                    ('admin_grid_items', 'admin_grid_items', 'int'),
                    ('show_version', 'show_version', 'bool'),
                    ('use_gravatar', 'use_gravatar', 'bool'),
                    ('gravatar_url', 'gravatar_url', 'unicode'),
                    ('clone_uri_tmpl', 'clone_uri_tmpl', 'unicode'),
                    ('clone_ssh_tmpl', 'clone_ssh_tmpl', 'unicode'),
                ]
                for setting, form_key, type_ in settings:
                    db.Setting.create_or_update(setting, form_result[form_key], type_)

                meta.Session().commit()
                set_app_settings(config)
                webutils.flash(_('Updated visualisation settings'),
                               category='success')

            except Exception:
                log.error(traceback.format_exc())
                webutils.flash(_('Error occurred during updating '
                                 'visualisation settings'),
                               category='error')

            raise HTTPFound(location=url('admin_settings_visual'))

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_email(self):
        c.active = 'email'
        if request.POST:
            test_email = request.POST.get('test_email')
            test_email_subj = 'Kallithea test email'
            test_body = ('Kallithea Email test, '
                         'Kallithea version: %s' % c.kallithea_version)
            if not test_email:
                webutils.flash(_('Please enter email address'), category='error')
                raise HTTPFound(location=url('admin_settings_email'))

            test_email_txt_body = EmailNotificationModel() \
                .get_email_tmpl(EmailNotificationModel.TYPE_DEFAULT,
                                'txt', body=test_body)
            test_email_html_body = EmailNotificationModel() \
                .get_email_tmpl(EmailNotificationModel.TYPE_DEFAULT,
                                'html', body=test_body)

            recipients = [test_email] if test_email else None

            async_tasks.send_email(recipients, test_email_subj,
                                   test_email_txt_body, test_email_html_body)

            webutils.flash(_('Send email task created'), category='success')
            raise HTTPFound(location=url('admin_settings_email'))

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        c.ini = kallithea.CONFIG

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_hooks(self):
        c.active = 'hooks'
        if request.POST:
            if c.visual.allow_custom_hooks_settings:
                ui_key = request.POST.get('new_hook_ui_key')
                ui_value = request.POST.get('new_hook_ui_value')

                hook_id = request.POST.get('hook_id')

                try:
                    ui_key = ui_key and ui_key.strip()
                    if ui_key in (x.ui_key for x in db.Ui.get_custom_hooks()):
                        webutils.flash(_('Hook already exists'), category='error')
                    elif ui_key in (x.ui_key for x in db.Ui.get_builtin_hooks()):
                        webutils.flash(_('Builtin hooks are read-only. Please use another hook name.'), category='error')
                    elif ui_value and ui_key:
                        db.Ui.create_or_update_hook(ui_key, ui_value)
                        webutils.flash(_('Added new hook'), category='success')
                    elif hook_id:
                        db.Ui.delete(hook_id)
                        meta.Session().commit()

                    # check for edits
                    update = False
                    _d = request.POST.dict_of_lists()
                    for k, v, ov in zip(_d.get('hook_ui_key', []),
                                        _d.get('hook_ui_value_new', []),
                                        _d.get('hook_ui_value', [])):
                        if v != ov:
                            db.Ui.create_or_update_hook(k, v)
                            update = True

                    if update:
                        webutils.flash(_('Updated hooks'), category='success')
                    meta.Session().commit()
                except Exception:
                    log.error(traceback.format_exc())
                    webutils.flash(_('Error occurred during hook creation'),
                                   category='error')

                raise HTTPFound(location=url('admin_settings_hooks'))

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        c.hooks = db.Ui.get_builtin_hooks()
        c.custom_hooks = db.Ui.get_custom_hooks()

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_search(self):
        c.active = 'search'
        if request.POST:
            repo_location = self._get_hg_ui_settings()['paths_root_path']
            full_index = request.POST.get('full_index', False)
            async_tasks.whoosh_index(repo_location, full_index)
            webutils.flash(_('Whoosh reindex task scheduled'), category='success')
            raise HTTPFound(location=url('admin_settings_search'))

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAnyDecorator('hg.admin')
    def settings_system(self):
        c.active = 'system'

        defaults = db.Setting.get_app_settings()
        defaults.update(self._get_hg_ui_settings())

        c.ini = kallithea.CONFIG
        server_info = db.Setting.get_server_info()
        for key, val in server_info.items():
            setattr(c, key, val)

        return htmlfill.render(
-            render('admin/settings/settings.html'),
+            base.render('admin/settings/settings.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)
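
All of the settings actions above share the same validate-or-refill flow built on formencode and htmlfill. A condensed sketch of that flow, using only calls that appear in the file above and assuming it runs inside a TurboGears request context as these controllers do (the standalone helper name is for illustration only)::

    import formencode
    from formencode import htmlfill
    from tg import request

    from kallithea.controllers import base
    from kallithea.model.forms import ApplicationSettingsForm


    def validate_or_refill():
        """Sketch of the pattern used by settings_vcs, settings_global and settings_visual."""
        # the form class is a schema factory: first call builds the schema class,
        # second call instantiates it
        application_form = ApplicationSettingsForm()()
        try:
            # validate the POSTed values against the form schema
            return application_form.to_python(dict(request.POST))
        except formencode.Invalid as errors:
            # on validation errors, re-render the page with the submitted
            # values and the per-field error messages filled back in
            return htmlfill.render(
                base.render('admin/settings/settings.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)
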
@@ -1,407 +1,407 b''
# -*- coding: utf-8 -*-
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
kallithea.controllers.admin.user_groups
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

User Groups crud controller

This file was forked by the Kallithea project in July 2014.
Original author and date, and relevant copyright and licensing information is below:
:created_on: Jan 25, 2011
:author: marcink
:copyright: (c) 2013 RhodeCode GmbH, and others.
:license: GPLv3, see LICENSE.md for more details.
"""

import logging
import traceback

import formencode
from formencode import htmlfill
from sqlalchemy.orm import joinedload
from sqlalchemy.sql.expression import func
from tg import app_globals, request
from tg import tmpl_context as c
from tg.i18n import ugettext as _
from webob.exc import HTTPFound, HTTPInternalServerError

import kallithea.lib.helpers as h
+from kallithea.controllers import base
from kallithea.lib import webutils
from kallithea.lib.auth import HasPermissionAnyDecorator, HasUserGroupPermissionLevelDecorator, LoginRequired
-from kallithea.lib.base import BaseController, render
from kallithea.lib.exceptions import RepoGroupAssignmentError, UserGroupsAssignedException
from kallithea.lib.utils2 import safe_int, safe_str
from kallithea.lib.webutils import url
from kallithea.model import db, meta, userlog
from kallithea.model.forms import CustomDefaultPermissionsForm, UserGroupForm, UserGroupPermsForm
from kallithea.model.scm import UserGroupList
from kallithea.model.user_group import UserGroupModel


log = logging.getLogger(__name__)


-class UserGroupsController(BaseController):
+class UserGroupsController(base.BaseController):

    @LoginRequired(allow_default_user=True)
    def _before(self, *args, **kwargs):
        super(UserGroupsController, self)._before(*args, **kwargs)

    def __load_data(self, user_group_id):
        c.group_members_obj = sorted((x.user for x in c.user_group.members),
                                     key=lambda u: u.username.lower())

        c.group_members = [(x.user_id, x.username) for x in c.group_members_obj]
        c.available_members = sorted(((x.user_id, x.username) for x in
                                      db.User.query().all()),
                                     key=lambda u: u[1].lower())

    def __load_defaults(self, user_group_id):
        """
        Load defaults settings for edit, and update

        :param user_group_id:
        """
        user_group = db.UserGroup.get_or_404(user_group_id)
        data = user_group.get_dict()
        return data

    def index(self, format='html'):
        _list = db.UserGroup.query() \
            .order_by(func.lower(db.UserGroup.users_group_name)) \
            .all()
        group_iter = UserGroupList(_list, perm_level='admin')
        user_groups_data = []
        _tmpl_lookup = app_globals.mako_lookup
        template = _tmpl_lookup.get_template('data_table/_dt_elements.html')

        def user_group_name(user_group_id, user_group_name):
            return template.get_def("user_group_name") \
                .render_unicode(user_group_id, user_group_name, _=_, h=h, c=c)

        def user_group_actions(user_group_id, user_group_name):
            return template.get_def("user_group_actions") \
                .render_unicode(user_group_id, user_group_name, _=_, h=h, c=c)

        for user_gr in group_iter:
            user_groups_data.append({
                "raw_name": user_gr.users_group_name,
                "group_name": user_group_name(user_gr.users_group_id,
                                              user_gr.users_group_name),
                "desc": webutils.escape(user_gr.user_group_description),
                "members": len(user_gr.members),
                "active": h.boolicon(user_gr.users_group_active),
                "owner": user_gr.owner.username,
                "action": user_group_actions(user_gr.users_group_id, user_gr.users_group_name)
            })

        c.data = {
            "sort": None,
            "dir": "asc",
            "records": user_groups_data
        }

-        return render('admin/user_groups/user_groups.html')
+        return base.render('admin/user_groups/user_groups.html')

    @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true')
    def create(self):
        users_group_form = UserGroupForm()()
        try:
            form_result = users_group_form.to_python(dict(request.POST))
            ug = UserGroupModel().create(name=form_result['users_group_name'],
                                         description=form_result['user_group_description'],
                                         owner=request.authuser.user_id,
                                         active=form_result['users_group_active'])

            gr = form_result['users_group_name']
            userlog.action_logger(request.authuser,
                                  'admin_created_users_group:%s' % gr,
                                  None, request.ip_addr)
            webutils.flash(webutils.HTML(_('Created user group %s')) % webutils.link_to(gr, url('edit_users_group', id=ug.users_group_id)),
                           category='success')
            meta.Session().commit()
        except formencode.Invalid as errors:
            return htmlfill.render(
-                render('admin/user_groups/user_group_add.html'),
+                base.render('admin/user_groups/user_group_add.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)
        except Exception:
            log.error(traceback.format_exc())
            webutils.flash(_('Error occurred during creation of user group %s')
                           % request.POST.get('users_group_name'), category='error')

        raise HTTPFound(location=url('users_groups'))

    @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true')
    def new(self, format='html'):
-        return render('admin/user_groups/user_group_add.html')
+        return base.render('admin/user_groups/user_group_add.html')

    @HasUserGroupPermissionLevelDecorator('admin')
    def update(self, id):
        c.user_group = db.UserGroup.get_or_404(id)
        c.active = 'settings'
        self.__load_data(id)

        available_members = [safe_str(x[0]) for x in c.available_members]

        users_group_form = UserGroupForm(edit=True,
                                         old_data=c.user_group.get_dict(),
                                         available_members=available_members)()

        try:
            form_result = users_group_form.to_python(request.POST)
            UserGroupModel().update(c.user_group, form_result)
            gr = form_result['users_group_name']
            userlog.action_logger(request.authuser,
                                  'admin_updated_users_group:%s' % gr,
                                  None, request.ip_addr)
            webutils.flash(_('Updated user group %s') % gr, category='success')
            meta.Session().commit()
        except formencode.Invalid as errors:
            ug_model = UserGroupModel()
            defaults = errors.value
            e = errors.error_dict or {}
            defaults.update({
                'create_repo_perm': ug_model.has_perm(id,
                                                      'hg.create.repository'),
                'fork_repo_perm': ug_model.has_perm(id,
                                                    'hg.fork.repository'),
            })

            return htmlfill.render(
-                render('admin/user_groups/user_group_edit.html'),
+                base.render('admin/user_groups/user_group_edit.html'),
                defaults=defaults,
                errors=e,
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)
        except Exception:
            log.error(traceback.format_exc())
            webutils.flash(_('Error occurred during update of user group %s')
                           % request.POST.get('users_group_name'), category='error')

        raise HTTPFound(location=url('edit_users_group', id=id))

    @HasUserGroupPermissionLevelDecorator('admin')
    def delete(self, id):
        usr_gr = db.UserGroup.get_or_404(id)
        try:
            UserGroupModel().delete(usr_gr)
            meta.Session().commit()
            webutils.flash(_('Successfully deleted user group'), category='success')
        except UserGroupsAssignedException as e:
            webutils.flash(e, category='error')
        except Exception:
            log.error(traceback.format_exc())
            webutils.flash(_('An error occurred during deletion of user group'),
                           category='error')
        raise HTTPFound(location=url('users_groups'))

    @HasUserGroupPermissionLevelDecorator('admin')
    def edit(self, id, format='html'):
        c.user_group = db.UserGroup.get_or_404(id)
        c.active = 'settings'
        self.__load_data(id)

        defaults = self.__load_defaults(id)

        return htmlfill.render(
-            render('admin/user_groups/user_group_edit.html'),
+            base.render('admin/user_groups/user_group_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )

    @HasUserGroupPermissionLevelDecorator('admin')
    def edit_perms(self, id):
        c.user_group = db.UserGroup.get_or_404(id)
        c.active = 'perms'

        defaults = {}
        # fill user group users
        for p in c.user_group.user_user_group_to_perm:
            defaults.update({'u_perm_%s' % p.user.username:
                             p.permission.permission_name})

        for p in c.user_group.user_group_user_group_to_perm:
            defaults.update({'g_perm_%s' % p.user_group.users_group_name:
                             p.permission.permission_name})

        return htmlfill.render(
-            render('admin/user_groups/user_group_edit.html'),
+            base.render('admin/user_groups/user_group_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )

    @HasUserGroupPermissionLevelDecorator('admin')
    def update_perms(self, id):
        """
        grant permission for given usergroup

        :param id:
        """
        user_group = db.UserGroup.get_or_404(id)
        form = UserGroupPermsForm()().to_python(request.POST)

        # set the permissions !
        try:
            UserGroupModel()._update_permissions(user_group, form['perms_new'],
                                                 form['perms_updates'])
        except RepoGroupAssignmentError:
            webutils.flash(_('Target group cannot be the same'), category='error')
            raise HTTPFound(location=url('edit_user_group_perms', id=id))
        # TODO: implement this
        #action_logger(request.authuser, 'admin_changed_repo_permissions',
        #              repo_name, request.ip_addr)
        meta.Session().commit()
        webutils.flash(_('User group permissions updated'), category='success')
        raise HTTPFound(location=url('edit_user_group_perms', id=id))

    @HasUserGroupPermissionLevelDecorator('admin')
    def delete_perms(self, id):
        try:
            obj_type = request.POST.get('obj_type')
            obj_id = None
            if obj_type == 'user':
                obj_id = safe_int(request.POST.get('user_id'))
            elif obj_type == 'user_group':
                obj_id = safe_int(request.POST.get('user_group_id'))

            if not request.authuser.is_admin:
                if obj_type == 'user' and request.authuser.user_id == obj_id:
                    msg = _('Cannot revoke permission for yourself as admin')
                    webutils.flash(msg, category='warning')
                    raise Exception('revoke admin permission on self')
|
290 | raise Exception('revoke admin permission on self') | |
291 | if obj_type == 'user': |
|
291 | if obj_type == 'user': | |
292 | UserGroupModel().revoke_user_permission(user_group=id, |
|
292 | UserGroupModel().revoke_user_permission(user_group=id, | |
293 | user=obj_id) |
|
293 | user=obj_id) | |
294 | elif obj_type == 'user_group': |
|
294 | elif obj_type == 'user_group': | |
295 | UserGroupModel().revoke_user_group_permission(target_user_group=id, |
|
295 | UserGroupModel().revoke_user_group_permission(target_user_group=id, | |
296 | user_group=obj_id) |
|
296 | user_group=obj_id) | |
297 | meta.Session().commit() |
|
297 | meta.Session().commit() | |
298 | except Exception: |
|
298 | except Exception: | |
299 | log.error(traceback.format_exc()) |
|
299 | log.error(traceback.format_exc()) | |
300 | webutils.flash(_('An error occurred during revoking of permission'), |
|
300 | webutils.flash(_('An error occurred during revoking of permission'), | |
301 | category='error') |
|
301 | category='error') | |
302 | raise HTTPInternalServerError() |
|
302 | raise HTTPInternalServerError() | |
303 |
|
303 | |||
304 | @HasUserGroupPermissionLevelDecorator('admin') |
|
304 | @HasUserGroupPermissionLevelDecorator('admin') | |
305 | def edit_default_perms(self, id): |
|
305 | def edit_default_perms(self, id): | |
306 | c.user_group = db.UserGroup.get_or_404(id) |
|
306 | c.user_group = db.UserGroup.get_or_404(id) | |
307 | c.active = 'default_perms' |
|
307 | c.active = 'default_perms' | |
308 |
|
308 | |||
309 | permissions = { |
|
309 | permissions = { | |
310 | 'repositories': {}, |
|
310 | 'repositories': {}, | |
311 | 'repositories_groups': {} |
|
311 | 'repositories_groups': {} | |
312 | } |
|
312 | } | |
313 | ugroup_repo_perms = db.UserGroupRepoToPerm.query() \ |
|
313 | ugroup_repo_perms = db.UserGroupRepoToPerm.query() \ | |
314 | .options(joinedload(db.UserGroupRepoToPerm.permission)) \ |
|
314 | .options(joinedload(db.UserGroupRepoToPerm.permission)) \ | |
315 | .options(joinedload(db.UserGroupRepoToPerm.repository)) \ |
|
315 | .options(joinedload(db.UserGroupRepoToPerm.repository)) \ | |
316 | .filter(db.UserGroupRepoToPerm.users_group_id == id) \ |
|
316 | .filter(db.UserGroupRepoToPerm.users_group_id == id) \ | |
317 | .all() |
|
317 | .all() | |
318 |
|
318 | |||
319 | for gr in ugroup_repo_perms: |
|
319 | for gr in ugroup_repo_perms: | |
320 | permissions['repositories'][gr.repository.repo_name] \ |
|
320 | permissions['repositories'][gr.repository.repo_name] \ | |
321 | = gr.permission.permission_name |
|
321 | = gr.permission.permission_name | |
322 |
|
322 | |||
323 | ugroup_group_perms = db.UserGroupRepoGroupToPerm.query() \ |
|
323 | ugroup_group_perms = db.UserGroupRepoGroupToPerm.query() \ | |
324 | .options(joinedload(db.UserGroupRepoGroupToPerm.permission)) \ |
|
324 | .options(joinedload(db.UserGroupRepoGroupToPerm.permission)) \ | |
325 | .options(joinedload(db.UserGroupRepoGroupToPerm.group)) \ |
|
325 | .options(joinedload(db.UserGroupRepoGroupToPerm.group)) \ | |
326 | .filter(db.UserGroupRepoGroupToPerm.users_group_id == id) \ |
|
326 | .filter(db.UserGroupRepoGroupToPerm.users_group_id == id) \ | |
327 | .all() |
|
327 | .all() | |
328 |
|
328 | |||
329 | for gr in ugroup_group_perms: |
|
329 | for gr in ugroup_group_perms: | |
330 | permissions['repositories_groups'][gr.group.group_name] \ |
|
330 | permissions['repositories_groups'][gr.group.group_name] \ | |
331 | = gr.permission.permission_name |
|
331 | = gr.permission.permission_name | |
332 | c.permissions = permissions |
|
332 | c.permissions = permissions | |
333 |
|
333 | |||
334 | ug_model = UserGroupModel() |
|
334 | ug_model = UserGroupModel() | |
335 |
|
335 | |||
336 | defaults = c.user_group.get_dict() |
|
336 | defaults = c.user_group.get_dict() | |
337 | defaults.update({ |
|
337 | defaults.update({ | |
338 | 'create_repo_perm': ug_model.has_perm(c.user_group, |
|
338 | 'create_repo_perm': ug_model.has_perm(c.user_group, | |
339 | 'hg.create.repository'), |
|
339 | 'hg.create.repository'), | |
340 | 'create_user_group_perm': ug_model.has_perm(c.user_group, |
|
340 | 'create_user_group_perm': ug_model.has_perm(c.user_group, | |
341 | 'hg.usergroup.create.true'), |
|
341 | 'hg.usergroup.create.true'), | |
342 | 'fork_repo_perm': ug_model.has_perm(c.user_group, |
|
342 | 'fork_repo_perm': ug_model.has_perm(c.user_group, | |
343 | 'hg.fork.repository'), |
|
343 | 'hg.fork.repository'), | |
344 | }) |
|
344 | }) | |
345 |
|
345 | |||
346 | return htmlfill.render( |
|
346 | return htmlfill.render( | |
347 | render('admin/user_groups/user_group_edit.html'), |
|
347 | base.render('admin/user_groups/user_group_edit.html'), | |
348 | defaults=defaults, |
|
348 | defaults=defaults, | |
349 | encoding="UTF-8", |
|
349 | encoding="UTF-8", | |
350 | force_defaults=False |
|
350 | force_defaults=False | |
351 | ) |
|
351 | ) | |
352 |
|
352 | |||
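``edit_default_perms`` above chains ``joinedload()`` options so the related permission and repository/group rows are fetched in the same SELECT as the ``*ToPerm`` rows, instead of triggering one lazy query per row. A self-contained sketch of the same idea, using made-up models rather than Kallithea's schema::

    from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
    from sqlalchemy.orm import Session, declarative_base, joinedload, relationship

    Base = declarative_base()

    class Permission(Base):
        __tablename__ = 'permission'
        permission_id = Column(Integer, primary_key=True)
        permission_name = Column(String)

    class RepoToPerm(Base):
        __tablename__ = 'repo_to_perm'
        repo_to_perm_id = Column(Integer, primary_key=True)
        permission_id = Column(Integer, ForeignKey('permission.permission_id'))
        permission = relationship(Permission)

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        # One SELECT with a JOIN instead of an extra query per returned row.
        rows = session.query(RepoToPerm) \
            .options(joinedload(RepoToPerm.permission)) \
            .all()
        for row in rows:
            print(row.permission)  # already loaded, no lazy-load query here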
353 | @HasUserGroupPermissionLevelDecorator('admin') |
|
353 | @HasUserGroupPermissionLevelDecorator('admin') | |
354 | def update_default_perms(self, id): |
|
354 | def update_default_perms(self, id): | |
355 | user_group = db.UserGroup.get_or_404(id) |
|
355 | user_group = db.UserGroup.get_or_404(id) | |
356 |
|
356 | |||
357 | try: |
|
357 | try: | |
358 | form = CustomDefaultPermissionsForm()() |
|
358 | form = CustomDefaultPermissionsForm()() | |
359 | form_result = form.to_python(request.POST) |
|
359 | form_result = form.to_python(request.POST) | |
360 |
|
360 | |||
361 | usergroup_model = UserGroupModel() |
|
361 | usergroup_model = UserGroupModel() | |
362 |
|
362 | |||
363 | defs = db.UserGroupToPerm.query() \ |
|
363 | defs = db.UserGroupToPerm.query() \ | |
364 | .filter(db.UserGroupToPerm.users_group == user_group) \ |
|
364 | .filter(db.UserGroupToPerm.users_group == user_group) \ | |
365 | .all() |
|
365 | .all() | |
366 | for ug in defs: |
|
366 | for ug in defs: | |
367 | meta.Session().delete(ug) |
|
367 | meta.Session().delete(ug) | |
368 |
|
368 | |||
369 | if form_result['create_repo_perm']: |
|
369 | if form_result['create_repo_perm']: | |
370 | usergroup_model.grant_perm(id, 'hg.create.repository') |
|
370 | usergroup_model.grant_perm(id, 'hg.create.repository') | |
371 | else: |
|
371 | else: | |
372 | usergroup_model.grant_perm(id, 'hg.create.none') |
|
372 | usergroup_model.grant_perm(id, 'hg.create.none') | |
373 | if form_result['create_user_group_perm']: |
|
373 | if form_result['create_user_group_perm']: | |
374 | usergroup_model.grant_perm(id, 'hg.usergroup.create.true') |
|
374 | usergroup_model.grant_perm(id, 'hg.usergroup.create.true') | |
375 | else: |
|
375 | else: | |
376 | usergroup_model.grant_perm(id, 'hg.usergroup.create.false') |
|
376 | usergroup_model.grant_perm(id, 'hg.usergroup.create.false') | |
377 | if form_result['fork_repo_perm']: |
|
377 | if form_result['fork_repo_perm']: | |
378 | usergroup_model.grant_perm(id, 'hg.fork.repository') |
|
378 | usergroup_model.grant_perm(id, 'hg.fork.repository') | |
379 | else: |
|
379 | else: | |
380 | usergroup_model.grant_perm(id, 'hg.fork.none') |
|
380 | usergroup_model.grant_perm(id, 'hg.fork.none') | |
381 |
|
381 | |||
382 | webutils.flash(_("Updated permissions"), category='success') |
|
382 | webutils.flash(_("Updated permissions"), category='success') | |
383 | meta.Session().commit() |
|
383 | meta.Session().commit() | |
384 | except Exception: |
|
384 | except Exception: | |
385 | log.error(traceback.format_exc()) |
|
385 | log.error(traceback.format_exc()) | |
386 | webutils.flash(_('An error occurred during permissions saving'), |
|
386 | webutils.flash(_('An error occurred during permissions saving'), | |
387 | category='error') |
|
387 | category='error') | |
388 |
|
388 | |||
389 | raise HTTPFound(location=url('edit_user_group_default_perms', id=id)) |
|
389 | raise HTTPFound(location=url('edit_user_group_default_perms', id=id)) | |
390 |
|
390 | |||
391 | @HasUserGroupPermissionLevelDecorator('admin') |
|
391 | @HasUserGroupPermissionLevelDecorator('admin') | |
392 | def edit_advanced(self, id): |
|
392 | def edit_advanced(self, id): | |
393 | c.user_group = db.UserGroup.get_or_404(id) |
|
393 | c.user_group = db.UserGroup.get_or_404(id) | |
394 | c.active = 'advanced' |
|
394 | c.active = 'advanced' | |
395 | c.group_members_obj = sorted((x.user for x in c.user_group.members), |
|
395 | c.group_members_obj = sorted((x.user for x in c.user_group.members), | |
396 | key=lambda u: u.username.lower()) |
|
396 | key=lambda u: u.username.lower()) | |
397 | return render('admin/user_groups/user_group_edit.html') |
|
397 | return base.render('admin/user_groups/user_group_edit.html') | |
398 |
|
398 | |||
399 | @HasUserGroupPermissionLevelDecorator('admin') |
|
399 | @HasUserGroupPermissionLevelDecorator('admin') | |
400 | def edit_members(self, id): |
|
400 | def edit_members(self, id): | |
401 | c.user_group = db.UserGroup.get_or_404(id) |
|
401 | c.user_group = db.UserGroup.get_or_404(id) | |
402 | c.active = 'members' |
|
402 | c.active = 'members' | |
403 | c.group_members_obj = sorted((x.user for x in c.user_group.members), |
|
403 | c.group_members_obj = sorted((x.user for x in c.user_group.members), | |
404 | key=lambda u: u.username.lower()) |
|
404 | key=lambda u: u.username.lower()) | |
405 |
|
405 | |||
406 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
406 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] | |
407 | return render('admin/user_groups/user_group_edit.html') |
|
407 | return base.render('admin/user_groups/user_group_edit.html') |
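A recurring pattern in this controller is to render the edit template to an HTML string (now via ``base.render``) and then run the result through ``formencode.htmlfill`` to pre-fill the form fields. A minimal standalone illustration of that second step; the form snippet and default values are invented for the example, not taken from a Kallithea template::

    from formencode import htmlfill

    form_html = '''
    <form method="post">
      <input type="text" name="users_group_name" />
      <input type="text" name="user_group_description" />
    </form>
    '''

    defaults = {'users_group_name': 'devs',
                'user_group_description': 'Development team'}

    # Same keyword arguments as the htmlfill.render() calls above.
    print(htmlfill.render(form_html,
                          defaults=defaults,
                          encoding="UTF-8",
                          force_defaults=False))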
@@ -1,472 +1,472 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.admin.users |
|
15 | kallithea.controllers.admin.users | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Users crud controller |
|
18 | Users crud controller | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 4, 2010 |
|
22 | :created_on: Apr 4, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | import formencode |
|
31 | import formencode | |
32 | from formencode import htmlfill |
|
32 | from formencode import htmlfill | |
33 | from sqlalchemy.sql.expression import func |
|
33 | from sqlalchemy.sql.expression import func | |
34 | from tg import app_globals, request |
|
34 | from tg import app_globals, request | |
35 | from tg import tmpl_context as c |
|
35 | from tg import tmpl_context as c | |
36 | from tg.i18n import ugettext as _ |
|
36 | from tg.i18n import ugettext as _ | |
37 | from webob.exc import HTTPFound, HTTPNotFound |
|
37 | from webob.exc import HTTPFound, HTTPNotFound | |
38 |
|
38 | |||
39 | import kallithea |
|
39 | import kallithea | |
40 | import kallithea.lib.helpers as h |
|
40 | import kallithea.lib.helpers as h | |
|
41 | from kallithea.controllers import base | |||
41 | from kallithea.lib import auth_modules, webutils |
|
42 | from kallithea.lib import auth_modules, webutils | |
42 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator, LoginRequired |
|
43 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator, LoginRequired | |
43 | from kallithea.lib.base import BaseController, IfSshEnabled, render |
|
|||
44 | from kallithea.lib.exceptions import DefaultUserException, UserCreationError, UserOwnsReposException |
|
44 | from kallithea.lib.exceptions import DefaultUserException, UserCreationError, UserOwnsReposException | |
45 | from kallithea.lib.utils2 import datetime_to_time, fmt_date, generate_api_key, safe_int |
|
45 | from kallithea.lib.utils2 import datetime_to_time, fmt_date, generate_api_key, safe_int | |
46 | from kallithea.lib.webutils import url |
|
46 | from kallithea.lib.webutils import url | |
47 | from kallithea.model import db, meta, userlog |
|
47 | from kallithea.model import db, meta, userlog | |
48 | from kallithea.model.api_key import ApiKeyModel |
|
48 | from kallithea.model.api_key import ApiKeyModel | |
49 | from kallithea.model.forms import CustomDefaultPermissionsForm, UserForm |
|
49 | from kallithea.model.forms import CustomDefaultPermissionsForm, UserForm | |
50 | from kallithea.model.ssh_key import SshKeyModel, SshKeyModelException |
|
50 | from kallithea.model.ssh_key import SshKeyModel, SshKeyModelException | |
51 | from kallithea.model.user import UserModel |
|
51 | from kallithea.model.user import UserModel | |
52 |
|
52 | |||
53 |
|
53 | |||
54 | log = logging.getLogger(__name__) |
|
54 | log = logging.getLogger(__name__) | |
55 |
|
55 | |||
56 |
|
56 | |||
57 | class UsersController(BaseController): |
|
57 | class UsersController(base.BaseController): | |
58 |
|
58 | |||
59 | @LoginRequired() |
|
59 | @LoginRequired() | |
60 | @HasPermissionAnyDecorator('hg.admin') |
|
60 | @HasPermissionAnyDecorator('hg.admin') | |
61 | def _before(self, *args, **kwargs): |
|
61 | def _before(self, *args, **kwargs): | |
62 | super(UsersController, self)._before(*args, **kwargs) |
|
62 | super(UsersController, self)._before(*args, **kwargs) | |
63 |
|
63 | |||
64 | def index(self, format='html'): |
|
64 | def index(self, format='html'): | |
65 | c.users_list = db.User.query().order_by(db.User.username) \ |
|
65 | c.users_list = db.User.query().order_by(db.User.username) \ | |
66 | .filter_by(is_default_user=False) \ |
|
66 | .filter_by(is_default_user=False) \ | |
67 | .order_by(func.lower(db.User.username)) \ |
|
67 | .order_by(func.lower(db.User.username)) \ | |
68 | .all() |
|
68 | .all() | |
69 |
|
69 | |||
70 | users_data = [] |
|
70 | users_data = [] | |
71 | _tmpl_lookup = app_globals.mako_lookup |
|
71 | _tmpl_lookup = app_globals.mako_lookup | |
72 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') |
|
72 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') | |
73 |
|
73 | |||
74 | grav_tmpl = '<div class="gravatar">%s</div>' |
|
74 | grav_tmpl = '<div class="gravatar">%s</div>' | |
75 |
|
75 | |||
76 | def username(user_id, username): |
|
76 | def username(user_id, username): | |
77 | return template.get_def("user_name") \ |
|
77 | return template.get_def("user_name") \ | |
78 | .render_unicode(user_id, username, _=_, h=h, c=c) |
|
78 | .render_unicode(user_id, username, _=_, h=h, c=c) | |
79 |
|
79 | |||
80 | def user_actions(user_id, username): |
|
80 | def user_actions(user_id, username): | |
81 | return template.get_def("user_actions") \ |
|
81 | return template.get_def("user_actions") \ | |
82 | .render_unicode(user_id, username, _=_, h=h, c=c) |
|
82 | .render_unicode(user_id, username, _=_, h=h, c=c) | |
83 |
|
83 | |||
84 | for user in c.users_list: |
|
84 | for user in c.users_list: | |
85 | users_data.append({ |
|
85 | users_data.append({ | |
86 | "gravatar": grav_tmpl % h.gravatar(user.email, size=20), |
|
86 | "gravatar": grav_tmpl % h.gravatar(user.email, size=20), | |
87 | "raw_name": user.username, |
|
87 | "raw_name": user.username, | |
88 | "username": username(user.user_id, user.username), |
|
88 | "username": username(user.user_id, user.username), | |
89 | "firstname": webutils.escape(user.name), |
|
89 | "firstname": webutils.escape(user.name), | |
90 | "lastname": webutils.escape(user.lastname), |
|
90 | "lastname": webutils.escape(user.lastname), | |
91 | "last_login": fmt_date(user.last_login), |
|
91 | "last_login": fmt_date(user.last_login), | |
92 | "last_login_raw": datetime_to_time(user.last_login), |
|
92 | "last_login_raw": datetime_to_time(user.last_login), | |
93 | "active": h.boolicon(user.active), |
|
93 | "active": h.boolicon(user.active), | |
94 | "admin": h.boolicon(user.admin), |
|
94 | "admin": h.boolicon(user.admin), | |
95 | "extern_type": user.extern_type, |
|
95 | "extern_type": user.extern_type, | |
96 | "extern_name": user.extern_name, |
|
96 | "extern_name": user.extern_name, | |
97 | "action": user_actions(user.user_id, user.username), |
|
97 | "action": user_actions(user.user_id, user.username), | |
98 | }) |
|
98 | }) | |
99 |
|
99 | |||
100 | c.data = { |
|
100 | c.data = { | |
101 | "sort": None, |
|
101 | "sort": None, | |
102 | "dir": "asc", |
|
102 | "dir": "asc", | |
103 | "records": users_data |
|
103 | "records": users_data | |
104 | } |
|
104 | } | |
105 |
|
105 | |||
106 | return render('admin/users/users.html') |
|
106 | return base.render('admin/users/users.html') | |
107 |
|
107 | |||
108 | def create(self): |
|
108 | def create(self): | |
109 | c.default_extern_type = db.User.DEFAULT_AUTH_TYPE |
|
109 | c.default_extern_type = db.User.DEFAULT_AUTH_TYPE | |
110 | c.default_extern_name = '' |
|
110 | c.default_extern_name = '' | |
111 | user_model = UserModel() |
|
111 | user_model = UserModel() | |
112 | user_form = UserForm()() |
|
112 | user_form = UserForm()() | |
113 | try: |
|
113 | try: | |
114 | form_result = user_form.to_python(dict(request.POST)) |
|
114 | form_result = user_form.to_python(dict(request.POST)) | |
115 | user = user_model.create(form_result) |
|
115 | user = user_model.create(form_result) | |
116 | userlog.action_logger(request.authuser, 'admin_created_user:%s' % user.username, |
|
116 | userlog.action_logger(request.authuser, 'admin_created_user:%s' % user.username, | |
117 | None, request.ip_addr) |
|
117 | None, request.ip_addr) | |
118 | webutils.flash(_('Created user %s') % user.username, |
|
118 | webutils.flash(_('Created user %s') % user.username, | |
119 | category='success') |
|
119 | category='success') | |
120 | meta.Session().commit() |
|
120 | meta.Session().commit() | |
121 | except formencode.Invalid as errors: |
|
121 | except formencode.Invalid as errors: | |
122 | return htmlfill.render( |
|
122 | return htmlfill.render( | |
123 | render('admin/users/user_add.html'), |
|
123 | base.render('admin/users/user_add.html'), | |
124 | defaults=errors.value, |
|
124 | defaults=errors.value, | |
125 | errors=errors.error_dict or {}, |
|
125 | errors=errors.error_dict or {}, | |
126 | prefix_error=False, |
|
126 | prefix_error=False, | |
127 | encoding="UTF-8", |
|
127 | encoding="UTF-8", | |
128 | force_defaults=False) |
|
128 | force_defaults=False) | |
129 | except UserCreationError as e: |
|
129 | except UserCreationError as e: | |
130 | webutils.flash(e, 'error') |
|
130 | webutils.flash(e, 'error') | |
131 | except Exception: |
|
131 | except Exception: | |
132 | log.error(traceback.format_exc()) |
|
132 | log.error(traceback.format_exc()) | |
133 | webutils.flash(_('Error occurred during creation of user %s') |
|
133 | webutils.flash(_('Error occurred during creation of user %s') | |
134 | % request.POST.get('username'), category='error') |
|
134 | % request.POST.get('username'), category='error') | |
135 | raise HTTPFound(location=url('edit_user', id=user.user_id)) |
|
135 | raise HTTPFound(location=url('edit_user', id=user.user_id)) | |
136 |
|
136 | |||
137 | def new(self, format='html'): |
|
137 | def new(self, format='html'): | |
138 | c.default_extern_type = db.User.DEFAULT_AUTH_TYPE |
|
138 | c.default_extern_type = db.User.DEFAULT_AUTH_TYPE | |
139 | c.default_extern_name = '' |
|
139 | c.default_extern_name = '' | |
140 | return render('admin/users/user_add.html') |
|
140 | return base.render('admin/users/user_add.html') | |
141 |
|
141 | |||
142 | def update(self, id): |
|
142 | def update(self, id): | |
143 | user_model = UserModel() |
|
143 | user_model = UserModel() | |
144 | user = user_model.get(id) |
|
144 | user = user_model.get(id) | |
145 | _form = UserForm(edit=True, old_data={'user_id': id, |
|
145 | _form = UserForm(edit=True, old_data={'user_id': id, | |
146 | 'email': user.email})() |
|
146 | 'email': user.email})() | |
147 | form_result = {} |
|
147 | form_result = {} | |
148 | try: |
|
148 | try: | |
149 | form_result = _form.to_python(dict(request.POST)) |
|
149 | form_result = _form.to_python(dict(request.POST)) | |
150 | skip_attrs = ['extern_type', 'extern_name', |
|
150 | skip_attrs = ['extern_type', 'extern_name', | |
151 | ] + auth_modules.get_managed_fields(user) |
|
151 | ] + auth_modules.get_managed_fields(user) | |
152 |
|
152 | |||
153 | user_model.update(id, form_result, skip_attrs=skip_attrs) |
|
153 | user_model.update(id, form_result, skip_attrs=skip_attrs) | |
154 | usr = form_result['username'] |
|
154 | usr = form_result['username'] | |
155 | userlog.action_logger(request.authuser, 'admin_updated_user:%s' % usr, |
|
155 | userlog.action_logger(request.authuser, 'admin_updated_user:%s' % usr, | |
156 | None, request.ip_addr) |
|
156 | None, request.ip_addr) | |
157 | webutils.flash(_('User updated successfully'), category='success') |
|
157 | webutils.flash(_('User updated successfully'), category='success') | |
158 | meta.Session().commit() |
|
158 | meta.Session().commit() | |
159 | except formencode.Invalid as errors: |
|
159 | except formencode.Invalid as errors: | |
160 | defaults = errors.value |
|
160 | defaults = errors.value | |
161 | e = errors.error_dict or {} |
|
161 | e = errors.error_dict or {} | |
162 | defaults.update({ |
|
162 | defaults.update({ | |
163 | 'create_repo_perm': user_model.has_perm(id, |
|
163 | 'create_repo_perm': user_model.has_perm(id, | |
164 | 'hg.create.repository'), |
|
164 | 'hg.create.repository'), | |
165 | 'fork_repo_perm': user_model.has_perm(id, 'hg.fork.repository'), |
|
165 | 'fork_repo_perm': user_model.has_perm(id, 'hg.fork.repository'), | |
166 | }) |
|
166 | }) | |
167 | return htmlfill.render( |
|
167 | return htmlfill.render( | |
168 | self._render_edit_profile(user), |
|
168 | self._render_edit_profile(user), | |
169 | defaults=defaults, |
|
169 | defaults=defaults, | |
170 | errors=e, |
|
170 | errors=e, | |
171 | prefix_error=False, |
|
171 | prefix_error=False, | |
172 | encoding="UTF-8", |
|
172 | encoding="UTF-8", | |
173 | force_defaults=False) |
|
173 | force_defaults=False) | |
174 | except Exception: |
|
174 | except Exception: | |
175 | log.error(traceback.format_exc()) |
|
175 | log.error(traceback.format_exc()) | |
176 | webutils.flash(_('Error occurred during update of user %s') |
|
176 | webutils.flash(_('Error occurred during update of user %s') | |
177 | % form_result.get('username'), category='error') |
|
177 | % form_result.get('username'), category='error') | |
178 | raise HTTPFound(location=url('edit_user', id=id)) |
|
178 | raise HTTPFound(location=url('edit_user', id=id)) | |
179 |
|
179 | |||
180 | def delete(self, id): |
|
180 | def delete(self, id): | |
181 | usr = db.User.get_or_404(id) |
|
181 | usr = db.User.get_or_404(id) | |
182 | has_ssh_keys = bool(usr.ssh_keys) |
|
182 | has_ssh_keys = bool(usr.ssh_keys) | |
183 | try: |
|
183 | try: | |
184 | UserModel().delete(usr) |
|
184 | UserModel().delete(usr) | |
185 | meta.Session().commit() |
|
185 | meta.Session().commit() | |
186 | webutils.flash(_('Successfully deleted user'), category='success') |
|
186 | webutils.flash(_('Successfully deleted user'), category='success') | |
187 | except (UserOwnsReposException, DefaultUserException) as e: |
|
187 | except (UserOwnsReposException, DefaultUserException) as e: | |
188 | webutils.flash(e, category='warning') |
|
188 | webutils.flash(e, category='warning') | |
189 | except Exception: |
|
189 | except Exception: | |
190 | log.error(traceback.format_exc()) |
|
190 | log.error(traceback.format_exc()) | |
191 | webutils.flash(_('An error occurred during deletion of user'), |
|
191 | webutils.flash(_('An error occurred during deletion of user'), | |
192 | category='error') |
|
192 | category='error') | |
193 | else: |
|
193 | else: | |
194 | if has_ssh_keys: |
|
194 | if has_ssh_keys: | |
195 | SshKeyModel().write_authorized_keys() |
|
195 | SshKeyModel().write_authorized_keys() | |
196 | raise HTTPFound(location=url('users')) |
|
196 | raise HTTPFound(location=url('users')) | |
197 |
|
197 | |||
198 | def _get_user_or_raise_if_default(self, id): |
|
198 | def _get_user_or_raise_if_default(self, id): | |
199 | try: |
|
199 | try: | |
200 | return db.User.get_or_404(id, allow_default=False) |
|
200 | return db.User.get_or_404(id, allow_default=False) | |
201 | except DefaultUserException: |
|
201 | except DefaultUserException: | |
202 | webutils.flash(_("The default user cannot be edited"), category='warning') |
|
202 | webutils.flash(_("The default user cannot be edited"), category='warning') | |
203 | raise HTTPNotFound |
|
203 | raise HTTPNotFound | |
204 |
|
204 | |||
205 | def _render_edit_profile(self, user): |
|
205 | def _render_edit_profile(self, user): | |
206 | c.user = user |
|
206 | c.user = user | |
207 | c.active = 'profile' |
|
207 | c.active = 'profile' | |
208 | c.perm_user = AuthUser(dbuser=user) |
|
208 | c.perm_user = AuthUser(dbuser=user) | |
209 | managed_fields = auth_modules.get_managed_fields(user) |
|
209 | managed_fields = auth_modules.get_managed_fields(user) | |
210 | c.readonly = lambda n: 'readonly' if n in managed_fields else None |
|
210 | c.readonly = lambda n: 'readonly' if n in managed_fields else None | |
211 | return render('admin/users/user_edit.html') |
|
211 | return base.render('admin/users/user_edit.html') | |
212 |
|
212 | |||
213 | def edit(self, id, format='html'): |
|
213 | def edit(self, id, format='html'): | |
214 | user = self._get_user_or_raise_if_default(id) |
|
214 | user = self._get_user_or_raise_if_default(id) | |
215 | defaults = user.get_dict() |
|
215 | defaults = user.get_dict() | |
216 |
|
216 | |||
217 | return htmlfill.render( |
|
217 | return htmlfill.render( | |
218 | self._render_edit_profile(user), |
|
218 | self._render_edit_profile(user), | |
219 | defaults=defaults, |
|
219 | defaults=defaults, | |
220 | encoding="UTF-8", |
|
220 | encoding="UTF-8", | |
221 | force_defaults=False) |
|
221 | force_defaults=False) | |
222 |
|
222 | |||
223 | def edit_advanced(self, id): |
|
223 | def edit_advanced(self, id): | |
224 | c.user = self._get_user_or_raise_if_default(id) |
|
224 | c.user = self._get_user_or_raise_if_default(id) | |
225 | c.active = 'advanced' |
|
225 | c.active = 'advanced' | |
226 | c.perm_user = AuthUser(dbuser=c.user) |
|
226 | c.perm_user = AuthUser(dbuser=c.user) | |
227 |
|
227 | |||
228 | umodel = UserModel() |
|
228 | umodel = UserModel() | |
229 | defaults = c.user.get_dict() |
|
229 | defaults = c.user.get_dict() | |
230 | defaults.update({ |
|
230 | defaults.update({ | |
231 | 'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'), |
|
231 | 'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'), | |
232 | 'create_user_group_perm': umodel.has_perm(c.user, |
|
232 | 'create_user_group_perm': umodel.has_perm(c.user, | |
233 | 'hg.usergroup.create.true'), |
|
233 | 'hg.usergroup.create.true'), | |
234 | 'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'), |
|
234 | 'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'), | |
235 | }) |
|
235 | }) | |
236 | return htmlfill.render( |
|
236 | return htmlfill.render( | |
237 | render('admin/users/user_edit.html'), |
|
237 | base.render('admin/users/user_edit.html'), | |
238 | defaults=defaults, |
|
238 | defaults=defaults, | |
239 | encoding="UTF-8", |
|
239 | encoding="UTF-8", | |
240 | force_defaults=False) |
|
240 | force_defaults=False) | |
241 |
|
241 | |||
242 | def edit_api_keys(self, id): |
|
242 | def edit_api_keys(self, id): | |
243 | c.user = self._get_user_or_raise_if_default(id) |
|
243 | c.user = self._get_user_or_raise_if_default(id) | |
244 | c.active = 'api_keys' |
|
244 | c.active = 'api_keys' | |
245 | show_expired = True |
|
245 | show_expired = True | |
246 | c.lifetime_values = [ |
|
246 | c.lifetime_values = [ | |
247 | (str(-1), _('Forever')), |
|
247 | (str(-1), _('Forever')), | |
248 | (str(5), _('5 minutes')), |
|
248 | (str(5), _('5 minutes')), | |
249 | (str(60), _('1 hour')), |
|
249 | (str(60), _('1 hour')), | |
250 | (str(60 * 24), _('1 day')), |
|
250 | (str(60 * 24), _('1 day')), | |
251 | (str(60 * 24 * 30), _('1 month')), |
|
251 | (str(60 * 24 * 30), _('1 month')), | |
252 | ] |
|
252 | ] | |
253 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] |
|
253 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] | |
254 | c.user_api_keys = ApiKeyModel().get_api_keys(c.user.user_id, |
|
254 | c.user_api_keys = ApiKeyModel().get_api_keys(c.user.user_id, | |
255 | show_expired=show_expired) |
|
255 | show_expired=show_expired) | |
256 | defaults = c.user.get_dict() |
|
256 | defaults = c.user.get_dict() | |
257 | return htmlfill.render( |
|
257 | return htmlfill.render( | |
258 | render('admin/users/user_edit.html'), |
|
258 | base.render('admin/users/user_edit.html'), | |
259 | defaults=defaults, |
|
259 | defaults=defaults, | |
260 | encoding="UTF-8", |
|
260 | encoding="UTF-8", | |
261 | force_defaults=False) |
|
261 | force_defaults=False) | |
262 |
|
262 | |||
263 | def add_api_key(self, id): |
|
263 | def add_api_key(self, id): | |
264 | c.user = self._get_user_or_raise_if_default(id) |
|
264 | c.user = self._get_user_or_raise_if_default(id) | |
265 |
|
265 | |||
266 | lifetime = safe_int(request.POST.get('lifetime'), -1) |
|
266 | lifetime = safe_int(request.POST.get('lifetime'), -1) | |
267 | description = request.POST.get('description') |
|
267 | description = request.POST.get('description') | |
268 | ApiKeyModel().create(c.user.user_id, description, lifetime) |
|
268 | ApiKeyModel().create(c.user.user_id, description, lifetime) | |
269 | meta.Session().commit() |
|
269 | meta.Session().commit() | |
270 | webutils.flash(_("API key successfully created"), category='success') |
|
270 | webutils.flash(_("API key successfully created"), category='success') | |
271 | raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id)) |
|
271 | raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id)) | |
272 |
|
272 | |||
273 | def delete_api_key(self, id): |
|
273 | def delete_api_key(self, id): | |
274 | c.user = self._get_user_or_raise_if_default(id) |
|
274 | c.user = self._get_user_or_raise_if_default(id) | |
275 |
|
275 | |||
276 | api_key = request.POST.get('del_api_key') |
|
276 | api_key = request.POST.get('del_api_key') | |
277 | if request.POST.get('del_api_key_builtin'): |
|
277 | if request.POST.get('del_api_key_builtin'): | |
278 | c.user.api_key = generate_api_key() |
|
278 | c.user.api_key = generate_api_key() | |
279 | meta.Session().commit() |
|
279 | meta.Session().commit() | |
280 | webutils.flash(_("API key successfully reset"), category='success') |
|
280 | webutils.flash(_("API key successfully reset"), category='success') | |
281 | elif api_key: |
|
281 | elif api_key: | |
282 | ApiKeyModel().delete(api_key, c.user.user_id) |
|
282 | ApiKeyModel().delete(api_key, c.user.user_id) | |
283 | meta.Session().commit() |
|
283 | meta.Session().commit() | |
284 | webutils.flash(_("API key successfully deleted"), category='success') |
|
284 | webutils.flash(_("API key successfully deleted"), category='success') | |
285 |
|
285 | |||
286 | raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id)) |
|
286 | raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id)) | |
287 |
|
287 | |||
288 | def update_account(self, id): |
|
288 | def update_account(self, id): | |
289 | pass |
|
289 | pass | |
290 |
|
290 | |||
291 | def edit_perms(self, id): |
|
291 | def edit_perms(self, id): | |
292 | c.user = self._get_user_or_raise_if_default(id) |
|
292 | c.user = self._get_user_or_raise_if_default(id) | |
293 | c.active = 'perms' |
|
293 | c.active = 'perms' | |
294 | c.perm_user = AuthUser(dbuser=c.user) |
|
294 | c.perm_user = AuthUser(dbuser=c.user) | |
295 |
|
295 | |||
296 | umodel = UserModel() |
|
296 | umodel = UserModel() | |
297 | defaults = c.user.get_dict() |
|
297 | defaults = c.user.get_dict() | |
298 | defaults.update({ |
|
298 | defaults.update({ | |
299 | 'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'), |
|
299 | 'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'), | |
300 | 'create_user_group_perm': umodel.has_perm(c.user, |
|
300 | 'create_user_group_perm': umodel.has_perm(c.user, | |
301 | 'hg.usergroup.create.true'), |
|
301 | 'hg.usergroup.create.true'), | |
302 | 'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'), |
|
302 | 'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'), | |
303 | }) |
|
303 | }) | |
304 | return htmlfill.render( |
|
304 | return htmlfill.render( | |
305 | render('admin/users/user_edit.html'), |
|
305 | base.render('admin/users/user_edit.html'), | |
306 | defaults=defaults, |
|
306 | defaults=defaults, | |
307 | encoding="UTF-8", |
|
307 | encoding="UTF-8", | |
308 | force_defaults=False) |
|
308 | force_defaults=False) | |
309 |
|
309 | |||
310 | def update_perms(self, id): |
|
310 | def update_perms(self, id): | |
311 | user = self._get_user_or_raise_if_default(id) |
|
311 | user = self._get_user_or_raise_if_default(id) | |
312 |
|
312 | |||
313 | try: |
|
313 | try: | |
314 | form = CustomDefaultPermissionsForm()() |
|
314 | form = CustomDefaultPermissionsForm()() | |
315 | form_result = form.to_python(request.POST) |
|
315 | form_result = form.to_python(request.POST) | |
316 |
|
316 | |||
317 | user_model = UserModel() |
|
317 | user_model = UserModel() | |
318 |
|
318 | |||
319 | defs = db.UserToPerm.query() \ |
|
319 | defs = db.UserToPerm.query() \ | |
320 | .filter(db.UserToPerm.user == user) \ |
|
320 | .filter(db.UserToPerm.user == user) \ | |
321 | .all() |
|
321 | .all() | |
322 | for ug in defs: |
|
322 | for ug in defs: | |
323 | meta.Session().delete(ug) |
|
323 | meta.Session().delete(ug) | |
324 |
|
324 | |||
325 | if form_result['create_repo_perm']: |
|
325 | if form_result['create_repo_perm']: | |
326 | user_model.grant_perm(id, 'hg.create.repository') |
|
326 | user_model.grant_perm(id, 'hg.create.repository') | |
327 | else: |
|
327 | else: | |
328 | user_model.grant_perm(id, 'hg.create.none') |
|
328 | user_model.grant_perm(id, 'hg.create.none') | |
329 | if form_result['create_user_group_perm']: |
|
329 | if form_result['create_user_group_perm']: | |
330 | user_model.grant_perm(id, 'hg.usergroup.create.true') |
|
330 | user_model.grant_perm(id, 'hg.usergroup.create.true') | |
331 | else: |
|
331 | else: | |
332 | user_model.grant_perm(id, 'hg.usergroup.create.false') |
|
332 | user_model.grant_perm(id, 'hg.usergroup.create.false') | |
333 | if form_result['fork_repo_perm']: |
|
333 | if form_result['fork_repo_perm']: | |
334 | user_model.grant_perm(id, 'hg.fork.repository') |
|
334 | user_model.grant_perm(id, 'hg.fork.repository') | |
335 | else: |
|
335 | else: | |
336 | user_model.grant_perm(id, 'hg.fork.none') |
|
336 | user_model.grant_perm(id, 'hg.fork.none') | |
337 | webutils.flash(_("Updated permissions"), category='success') |
|
337 | webutils.flash(_("Updated permissions"), category='success') | |
338 | meta.Session().commit() |
|
338 | meta.Session().commit() | |
339 | except Exception: |
|
339 | except Exception: | |
340 | log.error(traceback.format_exc()) |
|
340 | log.error(traceback.format_exc()) | |
341 | webutils.flash(_('An error occurred during permissions saving'), |
|
341 | webutils.flash(_('An error occurred during permissions saving'), | |
342 | category='error') |
|
342 | category='error') | |
343 | raise HTTPFound(location=url('edit_user_perms', id=id)) |
|
343 | raise HTTPFound(location=url('edit_user_perms', id=id)) | |
344 |
|
344 | |||
345 | def edit_emails(self, id): |
|
345 | def edit_emails(self, id): | |
346 | c.user = self._get_user_or_raise_if_default(id) |
|
346 | c.user = self._get_user_or_raise_if_default(id) | |
347 | c.active = 'emails' |
|
347 | c.active = 'emails' | |
348 | c.user_email_map = db.UserEmailMap.query() \ |
|
348 | c.user_email_map = db.UserEmailMap.query() \ | |
349 | .filter(db.UserEmailMap.user == c.user).all() |
|
349 | .filter(db.UserEmailMap.user == c.user).all() | |
350 |
|
350 | |||
351 | defaults = c.user.get_dict() |
|
351 | defaults = c.user.get_dict() | |
352 | return htmlfill.render( |
|
352 | return htmlfill.render( | |
353 | render('admin/users/user_edit.html'), |
|
353 | base.render('admin/users/user_edit.html'), | |
354 | defaults=defaults, |
|
354 | defaults=defaults, | |
355 | encoding="UTF-8", |
|
355 | encoding="UTF-8", | |
356 | force_defaults=False) |
|
356 | force_defaults=False) | |
357 |
|
357 | |||
358 | def add_email(self, id): |
|
358 | def add_email(self, id): | |
359 | user = self._get_user_or_raise_if_default(id) |
|
359 | user = self._get_user_or_raise_if_default(id) | |
360 | email = request.POST.get('new_email') |
|
360 | email = request.POST.get('new_email') | |
361 | user_model = UserModel() |
|
361 | user_model = UserModel() | |
362 |
|
362 | |||
363 | try: |
|
363 | try: | |
364 | user_model.add_extra_email(id, email) |
|
364 | user_model.add_extra_email(id, email) | |
365 | meta.Session().commit() |
|
365 | meta.Session().commit() | |
366 | webutils.flash(_("Added email %s to user") % email, category='success') |
|
366 | webutils.flash(_("Added email %s to user") % email, category='success') | |
367 | except formencode.Invalid as error: |
|
367 | except formencode.Invalid as error: | |
368 | msg = error.error_dict['email'] |
|
368 | msg = error.error_dict['email'] | |
369 | webutils.flash(msg, category='error') |
|
369 | webutils.flash(msg, category='error') | |
370 | except Exception: |
|
370 | except Exception: | |
371 | log.error(traceback.format_exc()) |
|
371 | log.error(traceback.format_exc()) | |
372 | webutils.flash(_('An error occurred during email saving'), |
|
372 | webutils.flash(_('An error occurred during email saving'), | |
373 | category='error') |
|
373 | category='error') | |
374 | raise HTTPFound(location=url('edit_user_emails', id=id)) |
|
374 | raise HTTPFound(location=url('edit_user_emails', id=id)) | |
375 |
|
375 | |||
376 | def delete_email(self, id): |
|
376 | def delete_email(self, id): | |
377 | user = self._get_user_or_raise_if_default(id) |
|
377 | user = self._get_user_or_raise_if_default(id) | |
378 | email_id = request.POST.get('del_email_id') |
|
378 | email_id = request.POST.get('del_email_id') | |
379 | user_model = UserModel() |
|
379 | user_model = UserModel() | |
380 | user_model.delete_extra_email(id, email_id) |
|
380 | user_model.delete_extra_email(id, email_id) | |
381 | meta.Session().commit() |
|
381 | meta.Session().commit() | |
382 | webutils.flash(_("Removed email from user"), category='success') |
|
382 | webutils.flash(_("Removed email from user"), category='success') | |
383 | raise HTTPFound(location=url('edit_user_emails', id=id)) |
|
383 | raise HTTPFound(location=url('edit_user_emails', id=id)) | |
384 |
|
384 | |||
385 | def edit_ips(self, id): |
|
385 | def edit_ips(self, id): | |
386 | c.user = self._get_user_or_raise_if_default(id) |
|
386 | c.user = self._get_user_or_raise_if_default(id) | |
387 | c.active = 'ips' |
|
387 | c.active = 'ips' | |
388 | c.user_ip_map = db.UserIpMap.query() \ |
|
388 | c.user_ip_map = db.UserIpMap.query() \ | |
389 | .filter(db.UserIpMap.user == c.user).all() |
|
389 | .filter(db.UserIpMap.user == c.user).all() | |
390 |
|
390 | |||
391 | c.default_user_ip_map = db.UserIpMap.query() \ |
|
391 | c.default_user_ip_map = db.UserIpMap.query() \ | |
392 | .filter(db.UserIpMap.user_id == kallithea.DEFAULT_USER_ID).all() |
|
392 | .filter(db.UserIpMap.user_id == kallithea.DEFAULT_USER_ID).all() | |
393 |
|
393 | |||
394 | defaults = c.user.get_dict() |
|
394 | defaults = c.user.get_dict() | |
395 | return htmlfill.render( |
|
395 | return htmlfill.render( | |
396 | render('admin/users/user_edit.html'), |
|
396 | base.render('admin/users/user_edit.html'), | |
397 | defaults=defaults, |
|
397 | defaults=defaults, | |
398 | encoding="UTF-8", |
|
398 | encoding="UTF-8", | |
399 | force_defaults=False) |
|
399 | force_defaults=False) | |
400 |
|
400 | |||
401 | def add_ip(self, id): |
|
401 | def add_ip(self, id): | |
402 | ip = request.POST.get('new_ip') |
|
402 | ip = request.POST.get('new_ip') | |
403 | user_model = UserModel() |
|
403 | user_model = UserModel() | |
404 |
|
404 | |||
405 | try: |
|
405 | try: | |
406 | user_model.add_extra_ip(id, ip) |
|
406 | user_model.add_extra_ip(id, ip) | |
407 | meta.Session().commit() |
|
407 | meta.Session().commit() | |
408 | webutils.flash(_("Added IP address %s to user whitelist") % ip, category='success') |
|
408 | webutils.flash(_("Added IP address %s to user whitelist") % ip, category='success') | |
409 | except formencode.Invalid as error: |
|
409 | except formencode.Invalid as error: | |
410 | msg = error.error_dict['ip'] |
|
410 | msg = error.error_dict['ip'] | |
411 | webutils.flash(msg, category='error') |
|
411 | webutils.flash(msg, category='error') | |
412 | except Exception: |
|
412 | except Exception: | |
413 | log.error(traceback.format_exc()) |
|
413 | log.error(traceback.format_exc()) | |
414 | webutils.flash(_('An error occurred while adding IP address'), |
|
414 | webutils.flash(_('An error occurred while adding IP address'), | |
415 | category='error') |
|
415 | category='error') | |
416 |
|
416 | |||
417 | if 'default_user' in request.POST: |
|
417 | if 'default_user' in request.POST: | |
418 | raise HTTPFound(location=url('admin_permissions_ips')) |
|
418 | raise HTTPFound(location=url('admin_permissions_ips')) | |
419 | raise HTTPFound(location=url('edit_user_ips', id=id)) |
|
419 | raise HTTPFound(location=url('edit_user_ips', id=id)) | |
420 |
|
420 | |||
421 | def delete_ip(self, id): |
|
421 | def delete_ip(self, id): | |
422 | ip_id = request.POST.get('del_ip_id') |
|
422 | ip_id = request.POST.get('del_ip_id') | |
423 | user_model = UserModel() |
|
423 | user_model = UserModel() | |
424 | user_model.delete_extra_ip(id, ip_id) |
|
424 | user_model.delete_extra_ip(id, ip_id) | |
425 | meta.Session().commit() |
|
425 | meta.Session().commit() | |
426 | webutils.flash(_("Removed IP address from user whitelist"), category='success') |
|
426 | webutils.flash(_("Removed IP address from user whitelist"), category='success') | |
427 |
|
427 | |||
428 | if 'default_user' in request.POST: |
|
428 | if 'default_user' in request.POST: | |
429 | raise HTTPFound(location=url('admin_permissions_ips')) |
|
429 | raise HTTPFound(location=url('admin_permissions_ips')) | |
430 | raise HTTPFound(location=url('edit_user_ips', id=id)) |
|
430 | raise HTTPFound(location=url('edit_user_ips', id=id)) | |
431 |
|
431 | |||
432 | @IfSshEnabled |
|
432 | @base.IfSshEnabled | |
433 | def edit_ssh_keys(self, id): |
|
433 | def edit_ssh_keys(self, id): | |
434 | c.user = self._get_user_or_raise_if_default(id) |
|
434 | c.user = self._get_user_or_raise_if_default(id) | |
435 | c.active = 'ssh_keys' |
|
435 | c.active = 'ssh_keys' | |
436 | c.user_ssh_keys = SshKeyModel().get_ssh_keys(c.user.user_id) |
|
436 | c.user_ssh_keys = SshKeyModel().get_ssh_keys(c.user.user_id) | |
437 | defaults = c.user.get_dict() |
|
437 | defaults = c.user.get_dict() | |
438 | return htmlfill.render( |
|
438 | return htmlfill.render( | |
439 | render('admin/users/user_edit.html'), |
|
439 | base.render('admin/users/user_edit.html'), | |
440 | defaults=defaults, |
|
440 | defaults=defaults, | |
441 | encoding="UTF-8", |
|
441 | encoding="UTF-8", | |
442 | force_defaults=False) |
|
442 | force_defaults=False) | |
443 |
|
443 | |||
444 | @IfSshEnabled |
|
444 | @base.IfSshEnabled | |
445 | def ssh_keys_add(self, id): |
|
445 | def ssh_keys_add(self, id): | |
446 | c.user = self._get_user_or_raise_if_default(id) |
|
446 | c.user = self._get_user_or_raise_if_default(id) | |
447 |
|
447 | |||
448 | description = request.POST.get('description') |
|
448 | description = request.POST.get('description') | |
449 | public_key = request.POST.get('public_key') |
|
449 | public_key = request.POST.get('public_key') | |
450 | try: |
|
450 | try: | |
451 | new_ssh_key = SshKeyModel().create(c.user.user_id, |
|
451 | new_ssh_key = SshKeyModel().create(c.user.user_id, | |
452 | description, public_key) |
|
452 | description, public_key) | |
453 | meta.Session().commit() |
|
453 | meta.Session().commit() | |
454 | SshKeyModel().write_authorized_keys() |
|
454 | SshKeyModel().write_authorized_keys() | |
455 | webutils.flash(_("SSH key %s successfully added") % new_ssh_key.fingerprint, category='success') |
|
455 | webutils.flash(_("SSH key %s successfully added") % new_ssh_key.fingerprint, category='success') | |
456 | except SshKeyModelException as e: |
|
456 | except SshKeyModelException as e: | |
457 | webutils.flash(e.args[0], category='error') |
|
457 | webutils.flash(e.args[0], category='error') | |
458 | raise HTTPFound(location=url('edit_user_ssh_keys', id=c.user.user_id)) |
|
458 | raise HTTPFound(location=url('edit_user_ssh_keys', id=c.user.user_id)) | |
459 |
|
459 | |||
460 | @IfSshEnabled |
|
460 | @base.IfSshEnabled | |
461 | def ssh_keys_delete(self, id): |
|
461 | def ssh_keys_delete(self, id): | |
462 | c.user = self._get_user_or_raise_if_default(id) |
|
462 | c.user = self._get_user_or_raise_if_default(id) | |
463 |
|
463 | |||
464 | fingerprint = request.POST.get('del_public_key_fingerprint') |
|
464 | fingerprint = request.POST.get('del_public_key_fingerprint') | |
465 | try: |
|
465 | try: | |
466 | SshKeyModel().delete(fingerprint, c.user.user_id) |
|
466 | SshKeyModel().delete(fingerprint, c.user.user_id) | |
467 | meta.Session().commit() |
|
467 | meta.Session().commit() | |
468 | SshKeyModel().write_authorized_keys() |
|
468 | SshKeyModel().write_authorized_keys() | |
469 | webutils.flash(_("SSH key successfully deleted"), category='success') |
|
469 | webutils.flash(_("SSH key successfully deleted"), category='success') | |
470 | except SshKeyModelException as e: |
|
470 | except SshKeyModelException as e: | |
471 | webutils.flash(e.args[0], category='error') |
|
471 | webutils.flash(e.args[0], category='error') | |
472 | raise HTTPFound(location=url('edit_user_ssh_keys', id=c.user.user_id)) |
|
472 | raise HTTPFound(location=url('edit_user_ssh_keys', id=c.user.user_id)) |
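The API key lifetimes offered in ``edit_api_keys`` above are expressed in minutes, with ``-1`` standing for "forever" (60 * 24 is one day, 60 * 24 * 30 roughly one month). A small illustrative helper for turning such a value into an absolute expiry timestamp; this is an assumption about the semantics, not the actual ``ApiKeyModel`` implementation::

    import time

    def expiry_from_lifetime(lifetime_minutes, now=None):
        """Return a UNIX timestamp when the key expires, or None for 'forever'."""
        if lifetime_minutes < 0:
            return None
        if now is None:
            now = time.time()
        return now + lifetime_minutes * 60

    print(expiry_from_lifetime(-1))       # None -- never expires
    print(expiry_from_lifetime(60 * 24))  # now + 86400 seconds (one day)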
@@ -1,265 +1,265 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.api |
|
15 | kallithea.controllers.api | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | JSON RPC controller |
|
18 | JSON RPC controller | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Aug 20, 2011 |
|
22 | :created_on: Aug 20, 2011 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import inspect |
|
28 | import inspect | |
29 | import itertools |
|
29 | import itertools | |
30 | import logging |
|
30 | import logging | |
31 | import time |
|
31 | import time | |
32 | import traceback |
|
32 | import traceback | |
33 | import types |
|
33 | import types | |
34 |
|
34 | |||
35 | from tg import Response, TGController, request, response |
|
35 | from tg import Response, TGController, request, response | |
36 | from webob.exc import HTTPError, HTTPException |
|
36 | from webob.exc import HTTPError, HTTPException | |
37 |
|
37 | |||
|
38 | from kallithea.controllers import base | |||
38 | from kallithea.lib import ext_json |
|
39 | from kallithea.lib import ext_json | |
39 | from kallithea.lib.auth import AuthUser |
|
40 | from kallithea.lib.auth import AuthUser | |
40 | from kallithea.lib.base import get_ip_addr, get_path_info |
|
|||
41 | from kallithea.lib.utils2 import ascii_bytes |
|
41 | from kallithea.lib.utils2 import ascii_bytes | |
42 | from kallithea.model import db |
|
42 | from kallithea.model import db | |
43 |
|
43 | |||
44 |
|
44 | |||
45 | log = logging.getLogger('JSONRPC') |
|
45 | log = logging.getLogger('JSONRPC') | |
46 |
|
46 | |||
47 |
|
47 | |||
48 | class JSONRPCError(BaseException): |
|
48 | class JSONRPCError(BaseException): | |
49 |
|
49 | |||
50 | def __init__(self, message): |
|
50 | def __init__(self, message): | |
51 | self.message = message |
|
51 | self.message = message | |
52 | super(JSONRPCError, self).__init__() |
|
52 | super(JSONRPCError, self).__init__() | |
53 |
|
53 | |||
54 | def __str__(self): |
|
54 | def __str__(self): | |
55 | return self.message |
|
55 | return self.message | |
56 |
|
56 | |||
57 |
|
57 | |||
58 | class JSONRPCErrorResponse(Response, HTTPException): |
|
58 | class JSONRPCErrorResponse(Response, HTTPException): | |
59 | """ |
|
59 | """ | |
60 | Generate a Response object with a JSON-RPC error body |
|
60 | Generate a Response object with a JSON-RPC error body | |
61 | """ |
|
61 | """ | |
62 |
|
62 | |||
63 | def __init__(self, message=None, retid=None, code=None): |
|
63 | def __init__(self, message=None, retid=None, code=None): | |
64 | HTTPException.__init__(self, message, self) |
|
64 | HTTPException.__init__(self, message, self) | |
65 | Response.__init__(self, |
|
65 | Response.__init__(self, | |
66 | json_body=dict(id=retid, result=None, error=message), |
|
66 | json_body=dict(id=retid, result=None, error=message), | |
67 | status=code, |
|
67 | status=code, | |
68 | content_type='application/json') |
|
68 | content_type='application/json') | |
69 |
|
69 | |||
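Raising ``JSONRPCErrorResponse`` short-circuits the request with a body of the same shape as a normal reply, just with ``result`` set to null. An illustrative example (not captured from a real server) of what a client would receive when the request has no Content-Length::

    # Mirrors dict(id=retid, result=None, error=message) above.
    error_body = {
        'id': None,          # retid - no request id could be parsed yet
        'result': None,
        'error': 'No Content-Length in request',
    }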
70 |
|
70 | |||
71 | class JSONRPCController(TGController): |
|
71 | class JSONRPCController(TGController): | |
72 | """ |
|
72 | """ | |
73 | A WSGI-speaking JSON-RPC controller class |
|
73 | A WSGI-speaking JSON-RPC controller class | |
74 |
|
74 | |||
75 | See the specification: |
|
75 | See the specification: | |
76 | <http://json-rpc.org/wiki/specification>. |
|
76 | <http://json-rpc.org/wiki/specification>. | |
77 |
|
77 | |||
78 | Valid controller return values should be json-serializable objects. |
|
78 | Valid controller return values should be json-serializable objects. | |
79 |
|
79 | |||
80 | Sub-classes should catch their exceptions and raise JSONRPCError |
|
80 | Sub-classes should catch their exceptions and raise JSONRPCError | |
81 | if they want to pass meaningful errors to the client. |
|
81 | if they want to pass meaningful errors to the client. | |
82 |
|
82 | |||
83 | """ |
|
83 | """ | |
84 |
|
84 | |||
85 | def _get_method_args(self): |
|
85 | def _get_method_args(self): | |
86 | """ |
|
86 | """ | |
87 | Return `self._rpc_args` to dispatched controller method |
|
87 | Return `self._rpc_args` to dispatched controller method | |
88 | chosen by __call__ |
|
88 | chosen by __call__ | |
89 | """ |
|
89 | """ | |
90 | return self._rpc_args |
|
90 | return self._rpc_args | |
91 |
|
91 | |||
92 | def _dispatch(self, state, remainder=None): |
|
92 | def _dispatch(self, state, remainder=None): | |
93 | """ |
|
93 | """ | |
94 | Parse the request body as JSON, look up the method on the |
|
94 | Parse the request body as JSON, look up the method on the | |
95 | controller and if it exists, dispatch to it. |
|
95 | controller and if it exists, dispatch to it. | |
96 | """ |
|
96 | """ | |
97 | # Since we are here we should respond as JSON |
|
97 | # Since we are here we should respond as JSON | |
98 | response.content_type = 'application/json' |
|
98 | response.content_type = 'application/json' | |
99 |
|
99 | |||
100 | environ = state.request.environ |
|
100 | environ = state.request.environ | |
101 | start = time.time() |
|
101 | start = time.time() | |
102 | ip_addr = get_ip_addr(environ) |
|
102 | ip_addr = base.get_ip_addr(environ) | |
103 | self._req_id = None |
|
103 | self._req_id = None | |
104 | if 'CONTENT_LENGTH' not in environ: |
|
104 | if 'CONTENT_LENGTH' not in environ: | |
105 | log.debug("No Content-Length") |
|
105 | log.debug("No Content-Length") | |
106 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
106 | raise JSONRPCErrorResponse(retid=self._req_id, | |
107 | message="No Content-Length in request") |
|
107 | message="No Content-Length in request") | |
108 | else: |
|
108 | else: | |
109 | length = environ['CONTENT_LENGTH'] or 0 |
|
109 | length = environ['CONTENT_LENGTH'] or 0 | |
110 | length = int(length) |
|
110 | length = int(length) | |
111 | log.debug('Content-Length: %s', length) |
|
111 | log.debug('Content-Length: %s', length) | |
112 |
|
112 | |||
113 | if length == 0: |
|
113 | if length == 0: | |
114 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
114 | raise JSONRPCErrorResponse(retid=self._req_id, | |
115 | message="Content-Length is 0") |
|
115 | message="Content-Length is 0") | |
116 |
|
116 | |||
117 | raw_body = environ['wsgi.input'].read(length) |
|
117 | raw_body = environ['wsgi.input'].read(length) | |
118 |
|
118 | |||
119 | try: |
|
119 | try: | |
120 | json_body = ext_json.loads(raw_body) |
|
120 | json_body = ext_json.loads(raw_body) | |
121 | except ValueError as e: |
|
121 | except ValueError as e: | |
122 | # catch JSON parse errors here |
|
122 | # catch JSON parse errors here | |
123 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
123 | raise JSONRPCErrorResponse(retid=self._req_id, | |
124 | message="JSON parse error ERR:%s RAW:%r" |
|
124 | message="JSON parse error ERR:%s RAW:%r" | |
125 | % (e, raw_body)) |
|
125 | % (e, raw_body)) | |
126 |
|
126 | |||
127 | # check AUTH based on API key |
|
127 | # check AUTH based on API key | |
128 | try: |
|
128 | try: | |
129 | self._req_api_key = json_body['api_key'] |
|
129 | self._req_api_key = json_body['api_key'] | |
130 | self._req_id = json_body['id'] |
|
130 | self._req_id = json_body['id'] | |
131 | self._req_method = json_body['method'] |
|
131 | self._req_method = json_body['method'] | |
132 | self._request_params = json_body['args'] |
|
132 | self._request_params = json_body['args'] | |
133 | if not isinstance(self._request_params, dict): |
|
133 | if not isinstance(self._request_params, dict): | |
134 | self._request_params = {} |
|
134 | self._request_params = {} | |
135 |
|
135 | |||
136 | log.debug('method: %s, params: %s', |
|
136 | log.debug('method: %s, params: %s', | |
137 | self._req_method, self._request_params) |
|
137 | self._req_method, self._request_params) | |
138 | except KeyError as e: |
|
138 | except KeyError as e: | |
139 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
139 | raise JSONRPCErrorResponse(retid=self._req_id, | |
140 | message='Incorrect JSON query missing %s' % e) |
|
140 | message='Incorrect JSON query missing %s' % e) | |
141 |
|
141 | |||
142 | # check if we can find this session using api_key |
|
142 | # check if we can find this session using api_key | |
143 | try: |
|
143 | try: | |
144 | u = db.User.get_by_api_key(self._req_api_key) |
|
144 | u = db.User.get_by_api_key(self._req_api_key) | |
145 | auth_user = AuthUser.make(dbuser=u, ip_addr=ip_addr) |
|
145 | auth_user = AuthUser.make(dbuser=u, ip_addr=ip_addr) | |
146 | if auth_user is None: |
|
146 | if auth_user is None: | |
147 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
147 | raise JSONRPCErrorResponse(retid=self._req_id, | |
148 | message='Invalid API key') |
|
148 | message='Invalid API key') | |
149 | except Exception as e: |
|
149 | except Exception as e: | |
150 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
150 | raise JSONRPCErrorResponse(retid=self._req_id, | |
151 | message='Invalid API key') |
|
151 | message='Invalid API key') | |
152 |
|
152 | |||
153 | request.authuser = auth_user |
|
153 | request.authuser = auth_user | |
154 | request.ip_addr = ip_addr |
|
154 | request.ip_addr = ip_addr | |
155 |
|
155 | |||
156 | self._error = None |
|
156 | self._error = None | |
157 | try: |
|
157 | try: | |
158 | self._func = self._find_method() |
|
158 | self._func = self._find_method() | |
159 | except AttributeError as e: |
|
159 | except AttributeError as e: | |
160 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
160 | raise JSONRPCErrorResponse(retid=self._req_id, | |
161 | message=str(e)) |
|
161 | message=str(e)) | |
162 |
|
162 | |||
163 | # now that we have a method, add self._request_params to |
|
163 | # now that we have a method, add self._request_params to | |
164 | # self._rpc_args and dispatch control to the controller |
|
164 | # self._rpc_args and dispatch control to the controller | |
165 | argspec = inspect.getfullargspec(self._func) |
|
165 | argspec = inspect.getfullargspec(self._func) | |
166 | arglist = argspec.args[1:] |
|
166 | arglist = argspec.args[1:] | |
167 | argtypes = [type(arg) for arg in argspec.defaults or []] |
|
167 | argtypes = [type(arg) for arg in argspec.defaults or []] | |
168 | default_empty = type(NotImplemented) |
|
168 | default_empty = type(NotImplemented) | |
169 |
|
169 | |||
170 | # kw arguments required by this method |
|
170 | # kw arguments required by this method | |
171 | func_kwargs = dict(itertools.zip_longest(reversed(arglist), reversed(argtypes), |
|
171 | func_kwargs = dict(itertools.zip_longest(reversed(arglist), reversed(argtypes), | |
172 | fillvalue=default_empty)) |
|
172 | fillvalue=default_empty)) | |
173 |
|
173 | |||
174 | # This attribute must be the first parameter of a method that uses |
|
174 | # This attribute must be the first parameter of a method that uses | |
175 | # api_key; it is translated to the corresponding user instance |
|
175 | # api_key; it is translated to the corresponding user instance | |
176 | USER_SESSION_ATTR = 'apiuser' |
|
176 | USER_SESSION_ATTR = 'apiuser' | |
177 |
|
177 | |||
178 | # get our arglist and check if we provided them as args |
|
178 | # get our arglist and check if we provided them as args | |
179 | for arg, default in func_kwargs.items(): |
|
179 | for arg, default in func_kwargs.items(): | |
180 | if arg == USER_SESSION_ATTR: |
|
180 | if arg == USER_SESSION_ATTR: | |
181 | # USER_SESSION_ATTR is something translated from API key and |
|
181 | # USER_SESSION_ATTR is something translated from API key and | |
182 | # this was checked earlier, so we don't need to validate it |
|
182 | # this was checked earlier, so we don't need to validate it | |
183 | continue |
|
183 | continue | |
184 |
|
184 | |||
185 | # skip the required param check if its default value is |
|
185 | # skip the required param check if its default value is | |
186 | # NotImplementedType (default_empty) |
|
186 | # NotImplementedType (default_empty) | |
187 | if default == default_empty and arg not in self._request_params: |
|
187 | if default == default_empty and arg not in self._request_params: | |
188 | raise JSONRPCErrorResponse( |
|
188 | raise JSONRPCErrorResponse( | |
189 | retid=self._req_id, |
|
189 | retid=self._req_id, | |
190 | message='Missing non optional `%s` arg in JSON DATA' % arg, |
|
190 | message='Missing non optional `%s` arg in JSON DATA' % arg, | |
191 | ) |
|
191 | ) | |
192 |
|
192 | |||
193 | extra = set(self._request_params).difference(func_kwargs) |
|
193 | extra = set(self._request_params).difference(func_kwargs) | |
194 | if extra: |
|
194 | if extra: | |
195 | raise JSONRPCErrorResponse( |
|
195 | raise JSONRPCErrorResponse( | |
196 | retid=self._req_id, |
|
196 | retid=self._req_id, | |
197 | message='Unknown %s arg in JSON DATA' % |
|
197 | message='Unknown %s arg in JSON DATA' % | |
198 | ', '.join('`%s`' % arg for arg in extra), |
|
198 | ', '.join('`%s`' % arg for arg in extra), | |
199 | ) |
|
199 | ) | |
200 |
|
200 | |||
201 | self._rpc_args = {} |
|
201 | self._rpc_args = {} | |
202 | self._rpc_args.update(self._request_params) |
|
202 | self._rpc_args.update(self._request_params) | |
203 | self._rpc_args['action'] = self._req_method |
|
203 | self._rpc_args['action'] = self._req_method | |
204 | self._rpc_args['environ'] = environ |
|
204 | self._rpc_args['environ'] = environ | |
205 |
|
205 | |||
206 | log.info('IP: %s Request to %s time: %.3fs' % ( |
|
206 | log.info('IP: %s Request to %s time: %.3fs' % ( | |
207 | get_ip_addr(environ), |
|
207 | base.get_ip_addr(environ), | |
208 | get_path_info(environ), time.time() - start) |
|
208 | base.get_path_info(environ), time.time() - start) | |
209 | ) |
|
209 | ) | |
210 |
|
210 | |||
211 | state.set_action(self._rpc_call, []) |
|
211 | state.set_action(self._rpc_call, []) | |
212 | state.set_params(self._rpc_args) |
|
212 | state.set_params(self._rpc_args) | |
213 | return state |
|
213 | return state | |
214 |
|
214 | |||
215 | def _rpc_call(self, action, environ, **rpc_args): |
|
215 | def _rpc_call(self, action, environ, **rpc_args): | |
216 | """ |
|
216 | """ | |
217 | Call the specified RPC Method |
|
217 | Call the specified RPC Method | |
218 | """ |
|
218 | """ | |
219 | raw_response = '' |
|
219 | raw_response = '' | |
220 | try: |
|
220 | try: | |
221 | raw_response = getattr(self, action)(**rpc_args) |
|
221 | raw_response = getattr(self, action)(**rpc_args) | |
222 | if isinstance(raw_response, HTTPError): |
|
222 | if isinstance(raw_response, HTTPError): | |
223 | self._error = str(raw_response) |
|
223 | self._error = str(raw_response) | |
224 | except JSONRPCError as e: |
|
224 | except JSONRPCError as e: | |
225 | self._error = str(e) |
|
225 | self._error = str(e) | |
226 | except Exception as e: |
|
226 | except Exception as e: | |
227 | log.error('Encountered unhandled exception: %s', |
|
227 | log.error('Encountered unhandled exception: %s', | |
228 | traceback.format_exc(),) |
|
228 | traceback.format_exc(),) | |
229 | json_exc = JSONRPCError('Internal server error') |
|
229 | json_exc = JSONRPCError('Internal server error') | |
230 | self._error = str(json_exc) |
|
230 | self._error = str(json_exc) | |
231 |
|
231 | |||
232 | if self._error is not None: |
|
232 | if self._error is not None: | |
233 | raw_response = None |
|
233 | raw_response = None | |
234 |
|
234 | |||
235 | response = dict(id=self._req_id, result=raw_response, error=self._error) |
|
235 | response = dict(id=self._req_id, result=raw_response, error=self._error) | |
236 | try: |
|
236 | try: | |
237 | return ascii_bytes(ext_json.dumps(response)) |
|
237 | return ascii_bytes(ext_json.dumps(response)) | |
238 | except TypeError as e: |
|
238 | except TypeError as e: | |
239 | log.error('API FAILED. Error encoding response for %s %s: %s\n%s', action, rpc_args, e, traceback.format_exc()) |
|
239 | log.error('API FAILED. Error encoding response for %s %s: %s\n%s', action, rpc_args, e, traceback.format_exc()) | |
240 | return ascii_bytes(ext_json.dumps( |
|
240 | return ascii_bytes(ext_json.dumps( | |
241 | dict( |
|
241 | dict( | |
242 | id=self._req_id, |
|
242 | id=self._req_id, | |
243 | result=None, |
|
243 | result=None, | |
244 | error="Error encoding response", |
|
244 | error="Error encoding response", | |
245 | ) |
|
245 | ) | |
246 | )) |
|
246 | )) | |
247 |
|
247 | |||
248 | def _find_method(self): |
|
248 | def _find_method(self): | |
249 | """ |
|
249 | """ | |
250 | Return method named by `self._req_method` in controller if able |
|
250 | Return method named by `self._req_method` in controller if able | |
251 | """ |
|
251 | """ | |
252 | log.debug('Trying to find JSON-RPC method: %s', self._req_method) |
|
252 | log.debug('Trying to find JSON-RPC method: %s', self._req_method) | |
253 | if self._req_method.startswith('_'): |
|
253 | if self._req_method.startswith('_'): | |
254 | raise AttributeError("Method not allowed") |
|
254 | raise AttributeError("Method not allowed") | |
255 |
|
255 | |||
256 | try: |
|
256 | try: | |
257 | func = getattr(self, self._req_method, None) |
|
257 | func = getattr(self, self._req_method, None) | |
258 | except UnicodeEncodeError: |
|
258 | except UnicodeEncodeError: | |
259 | raise AttributeError("Problem decoding unicode in requested " |
|
259 | raise AttributeError("Problem decoding unicode in requested " | |
260 | "method name.") |
|
260 | "method name.") | |
261 |
|
261 | |||
262 | if isinstance(func, types.MethodType): |
|
262 | if isinstance(func, types.MethodType): | |
263 | return func |
|
263 | return func | |
264 | else: |
|
264 | else: | |
265 | raise AttributeError("No such method: %s" % (self._req_method,)) |
|
265 | raise AttributeError("No such method: %s" % (self._req_method,)) |
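The ``_dispatch``/``_rpc_call`` pair above defines the whole wire protocol: the client POSTs a JSON body carrying ``api_key``, ``id``, ``method`` and ``args``, and always gets back a JSON object with ``id``, ``result`` and ``error``. A minimal client sketch using only the standard library (the URL and API key are placeholders, not taken from this changeset)::

    import json
    import urllib.request

    API_URL = 'https://kallithea.example.com/_admin/api'   # placeholder
    API_KEY = 'REPLACE_WITH_YOUR_API_KEY'                   # placeholder

    def api_call(method, **args):
        # Exactly the fields _dispatch() reads from the request body.
        payload = json.dumps({'api_key': API_KEY, 'id': 1,
                              'method': method, 'args': args}).encode('utf-8')
        req = urllib.request.Request(API_URL, data=payload,
                                     headers={'Content-Type': 'application/json'})
        with urllib.request.urlopen(req) as resp:
            reply = json.loads(resp.read())
        # _rpc_call() always answers with id/result/error; error is None on success.
        if reply['error'] is not None:
            raise RuntimeError(reply['error'])
        return reply['result']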
@@ -1,639 +1,639 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 |
|
14 | |||
15 | """ |
|
15 | """ | |
16 | kallithea.lib.base |
|
16 | kallithea.controllers.base | |
17 | ~~~~~~~~~~~~~~~~~~ |
|
17 | ~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
18 |
|
18 | |||
19 | The base Controller API |
|
19 | The base Controller API | |
20 | Provides the BaseController class for subclassing. And usage in different |
|
20 | Provides the BaseController class for subclassing. And usage in different | |
21 | controllers |
|
21 | controllers | |
22 |
|
22 | |||
23 | This file was forked by the Kallithea project in July 2014. |
|
23 | This file was forked by the Kallithea project in July 2014. | |
24 | Original author and date, and relevant copyright and licensing information is below: |
|
24 | Original author and date, and relevant copyright and licensing information is below: | |
25 | :created_on: Oct 06, 2010 |
|
25 | :created_on: Oct 06, 2010 | |
26 | :author: marcink |
|
26 | :author: marcink | |
27 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
27 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
28 | :license: GPLv3, see LICENSE.md for more details. |
|
28 | :license: GPLv3, see LICENSE.md for more details. | |
29 | """ |
|
29 | """ | |
30 |
|
30 | |||
31 | import base64 |
|
31 | import base64 | |
32 | import datetime |
|
32 | import datetime | |
33 | import logging |
|
33 | import logging | |
34 | import traceback |
|
34 | import traceback | |
35 | import warnings |
|
35 | import warnings | |
36 |
|
36 | |||
37 | import decorator |
|
37 | import decorator | |
38 | import paste.auth.basic |
|
38 | import paste.auth.basic | |
39 | import paste.httpexceptions |
|
39 | import paste.httpexceptions | |
40 | import paste.httpheaders |
|
40 | import paste.httpheaders | |
41 | import webob.exc |
|
41 | import webob.exc | |
42 | from tg import TGController, config, render_template, request, response, session |
|
42 | from tg import TGController, config, render_template, request, response, session | |
43 | from tg import tmpl_context as c |
|
43 | from tg import tmpl_context as c | |
44 | from tg.i18n import ugettext as _ |
|
44 | from tg.i18n import ugettext as _ | |
45 |
|
45 | |||
46 | import kallithea |
|
46 | import kallithea | |
47 | from kallithea.lib import auth_modules, ext_json, webutils |
|
47 | from kallithea.lib import auth_modules, ext_json, webutils | |
48 | from kallithea.lib.auth import AuthUser, HasPermissionAnyMiddleware |
|
48 | from kallithea.lib.auth import AuthUser, HasPermissionAnyMiddleware | |
49 | from kallithea.lib.exceptions import UserCreationError |
|
49 | from kallithea.lib.exceptions import UserCreationError | |
50 | from kallithea.lib.utils import get_repo_slug, is_valid_repo |
|
50 | from kallithea.lib.utils import get_repo_slug, is_valid_repo | |
51 | from kallithea.lib.utils2 import AttributeDict, asbool, ascii_bytes, safe_int, safe_str, set_hook_environment |
|
51 | from kallithea.lib.utils2 import AttributeDict, asbool, ascii_bytes, safe_int, safe_str, set_hook_environment | |
52 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError, RepositoryError |
|
52 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError, RepositoryError | |
53 | from kallithea.lib.webutils import url |
|
53 | from kallithea.lib.webutils import url | |
54 | from kallithea.model import db, meta |
|
54 | from kallithea.model import db, meta | |
55 | from kallithea.model.scm import ScmModel |
|
55 | from kallithea.model.scm import ScmModel | |
56 |
|
56 | |||
57 |
|
57 | |||
58 | log = logging.getLogger(__name__) |
|
58 | log = logging.getLogger(__name__) | |
59 |
|
59 | |||
60 |
|
60 | |||
61 | def render(template_path): |
|
61 | def render(template_path): | |
62 | return render_template({'url': url}, 'mako', template_path) |
|
62 | return render_template({'url': url}, 'mako', template_path) | |
63 |
|
63 | |||
64 |
|
64 | |||
65 | def _filter_proxy(ip): |
|
65 | def _filter_proxy(ip): | |
66 | """ |
|
66 | """ | |
67 | Proxy headers can carry multiple IPs; the left-most is the original |
|
67 | Proxy headers can carry multiple IPs; the left-most is the original | |
68 | client, and each successive proxy that passed the request appends the IP |
|
68 | client, and each successive proxy that passed the request appends the IP | |
69 | address it received the request from. |
|
69 | address it received the request from. | |
70 |
|
70 | |||
71 | :param ip: |
|
71 | :param ip: | |
72 | """ |
|
72 | """ | |
73 | if ',' in ip: |
|
73 | if ',' in ip: | |
74 | _ips = ip.split(',') |
|
74 | _ips = ip.split(',') | |
75 | _first_ip = _ips[0].strip() |
|
75 | _first_ip = _ips[0].strip() | |
76 | log.debug('Got multiple IPs %s, using %s', ','.join(_ips), _first_ip) |
|
76 | log.debug('Got multiple IPs %s, using %s', ','.join(_ips), _first_ip) | |
77 | return _first_ip |
|
77 | return _first_ip | |
78 | return ip |
|
78 | return ip | |
79 |
|
79 | |||
80 |
|
80 | |||
81 | def get_ip_addr(environ): |
|
81 | def get_ip_addr(environ): | |
82 | proxy_key = 'HTTP_X_REAL_IP' |
|
82 | proxy_key = 'HTTP_X_REAL_IP' | |
83 | proxy_key2 = 'HTTP_X_FORWARDED_FOR' |
|
83 | proxy_key2 = 'HTTP_X_FORWARDED_FOR' | |
84 | def_key = 'REMOTE_ADDR' |
|
84 | def_key = 'REMOTE_ADDR' | |
85 |
|
85 | |||
86 | ip = environ.get(proxy_key) |
|
86 | ip = environ.get(proxy_key) | |
87 | if ip: |
|
87 | if ip: | |
88 | return _filter_proxy(ip) |
|
88 | return _filter_proxy(ip) | |
89 |
|
89 | |||
90 | ip = environ.get(proxy_key2) |
|
90 | ip = environ.get(proxy_key2) | |
91 | if ip: |
|
91 | if ip: | |
92 | return _filter_proxy(ip) |
|
92 | return _filter_proxy(ip) | |
93 |
|
93 | |||
94 | ip = environ.get(def_key, '0.0.0.0') |
|
94 | ip = environ.get(def_key, '0.0.0.0') | |
95 | return _filter_proxy(ip) |
|
95 | return _filter_proxy(ip) | |
96 |
|
96 | |||
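``get_ip_addr`` prefers ``X-Real-IP``, then ``X-Forwarded-For``, then ``REMOTE_ADDR``, and ``_filter_proxy`` keeps only the left-most (client) address when a header carries a whole proxy chain. A small sketch of that precedence, assuming the module is importable under its new name from this changeset::

    from kallithea.controllers.base import get_ip_addr

    # X-Forwarded-For carries the proxy chain; the left-most entry is the client.
    environ = {
        'HTTP_X_FORWARDED_FOR': '203.0.113.7, 10.0.0.1',
        'REMOTE_ADDR': '10.0.0.1',
    }
    assert get_ip_addr(environ) == '203.0.113.7'

    # X-Real-IP, when present, wins over both.
    environ['HTTP_X_REAL_IP'] = '198.51.100.9'
    assert get_ip_addr(environ) == '198.51.100.9'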
97 |
|
97 | |||
98 | def get_path_info(environ): |
|
98 | def get_path_info(environ): | |
99 | """Return PATH_INFO from environ ... using tg.original_request if available. |
|
99 | """Return PATH_INFO from environ ... using tg.original_request if available. | |
100 |
|
100 | |||
101 | In Python 3 WSGI, PATH_INFO is a unicode str, but kind of contains encoded |
|
101 | In Python 3 WSGI, PATH_INFO is a unicode str, but kind of contains encoded | |
102 | bytes. The code points are guaranteed to only use the lower 8 bits, and |
|
102 | bytes. The code points are guaranteed to only use the lower 8 bits, and | |
103 | encoding the string with the 1:1 encoding latin1 will give the |
|
103 | encoding the string with the 1:1 encoding latin1 will give the | |
104 | corresponding byte string ... which then can be decoded to proper unicode. |
|
104 | corresponding byte string ... which then can be decoded to proper unicode. | |
105 | """ |
|
105 | """ | |
106 | org_req = environ.get('tg.original_request') |
|
106 | org_req = environ.get('tg.original_request') | |
107 | if org_req is not None: |
|
107 | if org_req is not None: | |
108 | environ = org_req.environ |
|
108 | environ = org_req.environ | |
109 | return safe_str(environ['PATH_INFO'].encode('latin1')) |
|
109 | return safe_str(environ['PATH_INFO'].encode('latin1')) | |
110 |
|
110 | |||
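The latin1 round-trip that the docstring describes is plain PEP 3333 behaviour and can be shown without any Kallithea code: the WSGI server decodes the raw request bytes as latin1, so re-encoding with latin1 recovers the original bytes, which then decode cleanly as UTF-8::

    # A client requests a path containing 'æ' (UTF-8 on the wire).
    raw = '/repo/æ'.encode('utf-8')        # b'/repo/\xc3\xa6'

    # The WSGI server exposes it as a latin1-decoded str in PATH_INFO ...
    path_info = raw.decode('latin1')       # looks like mojibake: '/repo/Ã¦'

    # ... and encoding with latin1 gives the original bytes back.
    assert path_info.encode('latin1') == raw
    assert path_info.encode('latin1').decode('utf-8') == '/repo/æ'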
111 |
|
111 | |||
112 | def log_in_user(user, remember, is_external_auth, ip_addr): |
|
112 | def log_in_user(user, remember, is_external_auth, ip_addr): | |
113 | """ |
|
113 | """ | |
114 | Log a `User` in and update session and cookies. If `remember` is True, |
|
114 | Log a `User` in and update session and cookies. If `remember` is True, | |
115 | the session cookie is set to expire in a year; otherwise, it expires at |
|
115 | the session cookie is set to expire in a year; otherwise, it expires at | |
116 | the end of the browser session. |
|
116 | the end of the browser session. | |
117 |
|
117 | |||
118 | Returns populated `AuthUser` object. |
|
118 | Returns populated `AuthUser` object. | |
119 | """ |
|
119 | """ | |
120 | # It should not be possible to explicitly log in as the default user. |
|
120 | # It should not be possible to explicitly log in as the default user. | |
121 | assert not user.is_default_user, user |
|
121 | assert not user.is_default_user, user | |
122 |
|
122 | |||
123 | auth_user = AuthUser.make(dbuser=user, is_external_auth=is_external_auth, ip_addr=ip_addr) |
|
123 | auth_user = AuthUser.make(dbuser=user, is_external_auth=is_external_auth, ip_addr=ip_addr) | |
124 | if auth_user is None: |
|
124 | if auth_user is None: | |
125 | return None |
|
125 | return None | |
126 |
|
126 | |||
127 | user.update_lastlogin() |
|
127 | user.update_lastlogin() | |
128 | meta.Session().commit() |
|
128 | meta.Session().commit() | |
129 |
|
129 | |||
130 | # Start new session to prevent session fixation attacks. |
|
130 | # Start new session to prevent session fixation attacks. | |
131 | session.invalidate() |
|
131 | session.invalidate() | |
132 | session['authuser'] = cookie = auth_user.to_cookie() |
|
132 | session['authuser'] = cookie = auth_user.to_cookie() | |
133 |
|
133 | |||
134 | # If they want to be remembered, update the cookie. |
|
134 | # If they want to be remembered, update the cookie. | |
135 | # NOTE: Assumes that beaker defaults to browser session cookie. |
|
135 | # NOTE: Assumes that beaker defaults to browser session cookie. | |
136 | if remember: |
|
136 | if remember: | |
137 | t = datetime.datetime.now() + datetime.timedelta(days=365) |
|
137 | t = datetime.datetime.now() + datetime.timedelta(days=365) | |
138 | session._set_cookie_expires(t) |
|
138 | session._set_cookie_expires(t) | |
139 |
|
139 | |||
140 | session.save() |
|
140 | session.save() | |
141 |
|
141 | |||
142 | log.info('user %s is now authenticated and stored in ' |
|
142 | log.info('user %s is now authenticated and stored in ' | |
143 | 'session, session attrs %s', user.username, cookie) |
|
143 | 'session, session attrs %s', user.username, cookie) | |
144 |
|
144 | |||
145 | # dumps session attrs back to cookie |
|
145 | # dumps session attrs back to cookie | |
146 | session._update_cookie_out() |
|
146 | session._update_cookie_out() | |
147 |
|
147 | |||
148 | return auth_user |
|
148 | return auth_user | |
149 |
|
149 | |||
150 |
|
150 | |||
151 | class BasicAuth(paste.auth.basic.AuthBasicAuthenticator): |
|
151 | class BasicAuth(paste.auth.basic.AuthBasicAuthenticator): | |
152 |
|
152 | |||
153 | def __init__(self, realm, authfunc, auth_http_code=None): |
|
153 | def __init__(self, realm, authfunc, auth_http_code=None): | |
154 | self.realm = realm |
|
154 | self.realm = realm | |
155 | self.authfunc = authfunc |
|
155 | self.authfunc = authfunc | |
156 | self._rc_auth_http_code = auth_http_code |
|
156 | self._rc_auth_http_code = auth_http_code | |
157 |
|
157 | |||
158 | def build_authentication(self, environ): |
|
158 | def build_authentication(self, environ): | |
159 | head = paste.httpheaders.WWW_AUTHENTICATE.tuples('Basic realm="%s"' % self.realm) |
|
159 | head = paste.httpheaders.WWW_AUTHENTICATE.tuples('Basic realm="%s"' % self.realm) | |
160 | # Consume the whole body before sending a response |
|
160 | # Consume the whole body before sending a response | |
161 | try: |
|
161 | try: | |
162 | request_body_size = int(environ.get('CONTENT_LENGTH', 0)) |
|
162 | request_body_size = int(environ.get('CONTENT_LENGTH', 0)) | |
163 | except ValueError: |
|
163 | except ValueError: | |
164 | request_body_size = 0 |
|
164 | request_body_size = 0 | |
165 | environ['wsgi.input'].read(request_body_size) |
|
165 | environ['wsgi.input'].read(request_body_size) | |
166 | if self._rc_auth_http_code and self._rc_auth_http_code == '403': |
|
166 | if self._rc_auth_http_code and self._rc_auth_http_code == '403': | |
167 | # return 403 if alternative http return code is specified in |
|
167 | # return 403 if alternative http return code is specified in | |
168 | # Kallithea config |
|
168 | # Kallithea config | |
169 | return paste.httpexceptions.HTTPForbidden(headers=head) |
|
169 | return paste.httpexceptions.HTTPForbidden(headers=head) | |
170 | return paste.httpexceptions.HTTPUnauthorized(headers=head) |
|
170 | return paste.httpexceptions.HTTPUnauthorized(headers=head) | |
171 |
|
171 | |||
172 | def authenticate(self, environ): |
|
172 | def authenticate(self, environ): | |
173 | authorization = paste.httpheaders.AUTHORIZATION(environ) |
|
173 | authorization = paste.httpheaders.AUTHORIZATION(environ) | |
174 | if not authorization: |
|
174 | if not authorization: | |
175 | return self.build_authentication(environ) |
|
175 | return self.build_authentication(environ) | |
176 | (authmeth, auth) = authorization.split(' ', 1) |
|
176 | (authmeth, auth) = authorization.split(' ', 1) | |
177 | if 'basic' != authmeth.lower(): |
|
177 | if 'basic' != authmeth.lower(): | |
178 | return self.build_authentication(environ) |
|
178 | return self.build_authentication(environ) | |
179 | auth = safe_str(base64.b64decode(auth.strip())) |
|
179 | auth = safe_str(base64.b64decode(auth.strip())) | |
180 | _parts = auth.split(':', 1) |
|
180 | _parts = auth.split(':', 1) | |
181 | if len(_parts) == 2: |
|
181 | if len(_parts) == 2: | |
182 | username, password = _parts |
|
182 | username, password = _parts | |
183 | if self.authfunc(username, password, environ) is not None: |
|
183 | if self.authfunc(username, password, environ) is not None: | |
184 | return username |
|
184 | return username | |
185 | return self.build_authentication(environ) |
|
185 | return self.build_authentication(environ) | |
186 |
|
186 | |||
187 | __call__ = authenticate |
|
187 | __call__ = authenticate | |
188 |
|
188 | |||
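``BasicAuth.authenticate`` expects the standard ``Authorization: Basic <base64>`` header, where the decoded value is ``username:password`` split on the first colon. For reference, this is how a client or a test could build such a header (the credentials are placeholders)::

    import base64

    def basic_auth_header(username, password):
        # 'Basic ' + base64('username:password') - exactly what
        # BasicAuth.authenticate() decodes and splits on the first ':'.
        token = base64.b64encode(('%s:%s' % (username, password)).encode('utf-8'))
        return 'Basic ' + token.decode('ascii')

    environ = {'HTTP_AUTHORIZATION': basic_auth_header('jane', 's3cret')}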
189 |
|
189 | |||
190 | class BaseVCSController(object): |
|
190 | class BaseVCSController(object): | |
191 | """Base controller for handling Mercurial/Git protocol requests |
|
191 | """Base controller for handling Mercurial/Git protocol requests | |
192 | (coming from a VCS client, and not a browser). |
|
192 | (coming from a VCS client, and not a browser). | |
193 | """ |
|
193 | """ | |
194 |
|
194 | |||
195 | scm_alias = None # 'hg' / 'git' |
|
195 | scm_alias = None # 'hg' / 'git' | |
196 |
|
196 | |||
197 | def __init__(self, application, config): |
|
197 | def __init__(self, application, config): | |
198 | self.application = application |
|
198 | self.application = application | |
199 | self.config = config |
|
199 | self.config = config | |
200 | # base path of repo locations |
|
200 | # base path of repo locations | |
201 | self.basepath = self.config['base_path'] |
|
201 | self.basepath = self.config['base_path'] | |
202 | # authenticate this VCS request using the authentication modules |
|
202 | # authenticate this VCS request using the authentication modules | |
203 | self.authenticate = BasicAuth('', auth_modules.authenticate, |
|
203 | self.authenticate = BasicAuth('', auth_modules.authenticate, | |
204 | config.get('auth_ret_code')) |
|
204 | config.get('auth_ret_code')) | |
205 |
|
205 | |||
206 | @classmethod |
|
206 | @classmethod | |
207 | def parse_request(cls, environ): |
|
207 | def parse_request(cls, environ): | |
208 | """If request is parsed as a request for this VCS, return a namespace with the parsed request. |
|
208 | """If request is parsed as a request for this VCS, return a namespace with the parsed request. | |
209 | If the request is unknown, return None. |
|
209 | If the request is unknown, return None. | |
210 | """ |
|
210 | """ | |
211 | raise NotImplementedError() |
|
211 | raise NotImplementedError() | |
212 |
|
212 | |||
213 | def _authorize(self, environ, action, repo_name, ip_addr): |
|
213 | def _authorize(self, environ, action, repo_name, ip_addr): | |
214 | """Authenticate and authorize user. |
|
214 | """Authenticate and authorize user. | |
215 |
|
215 | |||
216 | Since we're dealing with a VCS client and not a browser, we only |
|
216 | Since we're dealing with a VCS client and not a browser, we only | |
217 | support HTTP basic authentication, either directly via raw header |
|
217 | support HTTP basic authentication, either directly via raw header | |
218 | inspection, or by using container authentication to delegate the |
|
218 | inspection, or by using container authentication to delegate the | |
219 | authentication to the web server. |
|
219 | authentication to the web server. | |
220 |
|
220 | |||
221 | Returns (user, None) on successful authentication and authorization. |
|
221 | Returns (user, None) on successful authentication and authorization. | |
222 | Returns (None, wsgi_app) to send the wsgi_app response to the client. |
|
222 | Returns (None, wsgi_app) to send the wsgi_app response to the client. | |
223 | """ |
|
223 | """ | |
224 | # Use anonymous access if allowed for action on repo. |
|
224 | # Use anonymous access if allowed for action on repo. | |
225 | default_user = db.User.get_default_user() |
|
225 | default_user = db.User.get_default_user() | |
226 | default_authuser = AuthUser.make(dbuser=default_user, ip_addr=ip_addr) |
|
226 | default_authuser = AuthUser.make(dbuser=default_user, ip_addr=ip_addr) | |
227 | if default_authuser is None: |
|
227 | if default_authuser is None: | |
228 | log.debug('No anonymous access at all') # move on to proper user auth |
|
228 | log.debug('No anonymous access at all') # move on to proper user auth | |
229 | else: |
|
229 | else: | |
230 | if self._check_permission(action, default_authuser, repo_name): |
|
230 | if self._check_permission(action, default_authuser, repo_name): | |
231 | return default_authuser, None |
|
231 | return default_authuser, None | |
232 | log.debug('Not authorized to access this repository as anonymous user') |
|
232 | log.debug('Not authorized to access this repository as anonymous user') | |
233 |
|
233 | |||
234 | username = None |
|
234 | username = None | |
235 | #============================================================== |
|
235 | #============================================================== | |
236 | # DEFAULT PERM FAILED OR ANONYMOUS ACCESS IS DISABLED SO WE |
|
236 | # DEFAULT PERM FAILED OR ANONYMOUS ACCESS IS DISABLED SO WE | |
237 | # NEED TO AUTHENTICATE AND ASK FOR AUTH USER PERMISSIONS |
|
237 | # NEED TO AUTHENTICATE AND ASK FOR AUTH USER PERMISSIONS | |
238 | #============================================================== |
|
238 | #============================================================== | |
239 |
|
239 | |||
240 | # try to auth based on environ, container auth methods |
|
240 | # try to auth based on environ, container auth methods | |
241 | log.debug('Running PRE-AUTH for container based authentication') |
|
241 | log.debug('Running PRE-AUTH for container based authentication') | |
242 | pre_auth = auth_modules.authenticate('', '', environ) |
|
242 | pre_auth = auth_modules.authenticate('', '', environ) | |
243 | if pre_auth is not None and pre_auth.get('username'): |
|
243 | if pre_auth is not None and pre_auth.get('username'): | |
244 | username = pre_auth['username'] |
|
244 | username = pre_auth['username'] | |
245 | log.debug('PRE-AUTH got %s as username', username) |
|
245 | log.debug('PRE-AUTH got %s as username', username) | |
246 |
|
246 | |||
247 | # If not authenticated by the container, running basic auth |
|
247 | # If not authenticated by the container, running basic auth | |
248 | if not username: |
|
248 | if not username: | |
249 | self.authenticate.realm = self.config['realm'] |
|
249 | self.authenticate.realm = self.config['realm'] | |
250 | result = self.authenticate(environ) |
|
250 | result = self.authenticate(environ) | |
251 | if isinstance(result, str): |
|
251 | if isinstance(result, str): | |
252 | paste.httpheaders.AUTH_TYPE.update(environ, 'basic') |
|
252 | paste.httpheaders.AUTH_TYPE.update(environ, 'basic') | |
253 | paste.httpheaders.REMOTE_USER.update(environ, result) |
|
253 | paste.httpheaders.REMOTE_USER.update(environ, result) | |
254 | username = result |
|
254 | username = result | |
255 | else: |
|
255 | else: | |
256 | return None, result.wsgi_application |
|
256 | return None, result.wsgi_application | |
257 |
|
257 | |||
258 | #============================================================== |
|
258 | #============================================================== | |
259 | # CHECK PERMISSIONS FOR THIS REQUEST USING GIVEN USERNAME |
|
259 | # CHECK PERMISSIONS FOR THIS REQUEST USING GIVEN USERNAME | |
260 | #============================================================== |
|
260 | #============================================================== | |
261 | try: |
|
261 | try: | |
262 | user = db.User.get_by_username_or_email(username) |
|
262 | user = db.User.get_by_username_or_email(username) | |
263 | except Exception: |
|
263 | except Exception: | |
264 | log.error(traceback.format_exc()) |
|
264 | log.error(traceback.format_exc()) | |
265 | return None, webob.exc.HTTPInternalServerError() |
|
265 | return None, webob.exc.HTTPInternalServerError() | |
266 |
|
266 | |||
267 | authuser = AuthUser.make(dbuser=user, ip_addr=ip_addr) |
|
267 | authuser = AuthUser.make(dbuser=user, ip_addr=ip_addr) | |
268 | if authuser is None: |
|
268 | if authuser is None: | |
269 | return None, webob.exc.HTTPForbidden() |
|
269 | return None, webob.exc.HTTPForbidden() | |
270 | if not self._check_permission(action, authuser, repo_name): |
|
270 | if not self._check_permission(action, authuser, repo_name): | |
271 | return None, webob.exc.HTTPForbidden() |
|
271 | return None, webob.exc.HTTPForbidden() | |
272 |
|
272 | |||
273 | return user, None |
|
273 | return user, None | |
274 |
|
274 | |||
275 | def _handle_request(self, environ, start_response): |
|
275 | def _handle_request(self, environ, start_response): | |
276 | raise NotImplementedError() |
|
276 | raise NotImplementedError() | |
277 |
|
277 | |||
278 | def _check_permission(self, action, authuser, repo_name): |
|
278 | def _check_permission(self, action, authuser, repo_name): | |
279 | """ |
|
279 | """ | |
280 | :param action: 'push' or 'pull' |
|
280 | :param action: 'push' or 'pull' | |
281 | :param user: `AuthUser` instance |
|
281 | :param user: `AuthUser` instance | |
282 | :param repo_name: repository name |
|
282 | :param repo_name: repository name | |
283 | """ |
|
283 | """ | |
284 | if action == 'push': |
|
284 | if action == 'push': | |
285 | if not HasPermissionAnyMiddleware('repository.write', |
|
285 | if not HasPermissionAnyMiddleware('repository.write', | |
286 | 'repository.admin')(authuser, |
|
286 | 'repository.admin')(authuser, | |
287 | repo_name): |
|
287 | repo_name): | |
288 | return False |
|
288 | return False | |
289 |
|
289 | |||
290 | elif action == 'pull': |
|
290 | elif action == 'pull': | |
291 | # any other action needs at least read permission |
|
291 | # any other action needs at least read permission | |
292 | if not HasPermissionAnyMiddleware('repository.read', |
|
292 | if not HasPermissionAnyMiddleware('repository.read', | |
293 | 'repository.write', |
|
293 | 'repository.write', | |
294 | 'repository.admin')(authuser, |
|
294 | 'repository.admin')(authuser, | |
295 | repo_name): |
|
295 | repo_name): | |
296 | return False |
|
296 | return False | |
297 |
|
297 | |||
298 | else: |
|
298 | else: | |
299 | assert False, action |
|
299 | assert False, action | |
300 |
|
300 | |||
301 | return True |
|
301 | return True | |
302 |
|
302 | |||
303 | def __call__(self, environ, start_response): |
|
303 | def __call__(self, environ, start_response): | |
304 | try: |
|
304 | try: | |
305 | # try parsing a request for this VCS - if it fails, call the wrapped app |
|
305 | # try parsing a request for this VCS - if it fails, call the wrapped app | |
306 | parsed_request = self.parse_request(environ) |
|
306 | parsed_request = self.parse_request(environ) | |
307 | if parsed_request is None: |
|
307 | if parsed_request is None: | |
308 | return self.application(environ, start_response) |
|
308 | return self.application(environ, start_response) | |
309 |
|
309 | |||
310 | # skip passing error to error controller |
|
310 | # skip passing error to error controller | |
311 | environ['pylons.status_code_redirect'] = True |
|
311 | environ['pylons.status_code_redirect'] = True | |
312 |
|
312 | |||
313 | # quick check if repo exists... |
|
313 | # quick check if repo exists... | |
314 | if not is_valid_repo(parsed_request.repo_name, self.basepath, self.scm_alias): |
|
314 | if not is_valid_repo(parsed_request.repo_name, self.basepath, self.scm_alias): | |
315 | raise webob.exc.HTTPNotFound() |
|
315 | raise webob.exc.HTTPNotFound() | |
316 |
|
316 | |||
317 | if parsed_request.action is None: |
|
317 | if parsed_request.action is None: | |
318 | # Note: the client doesn't get the helpful error message |
|
318 | # Note: the client doesn't get the helpful error message | |
319 | raise webob.exc.HTTPBadRequest('Unable to detect pull/push action for %r! Are you using a nonstandard command or client?' % parsed_request.repo_name) |
|
319 | raise webob.exc.HTTPBadRequest('Unable to detect pull/push action for %r! Are you using a nonstandard command or client?' % parsed_request.repo_name) | |
320 |
|
320 | |||
321 | #====================================================================== |
|
321 | #====================================================================== | |
322 | # CHECK PERMISSIONS |
|
322 | # CHECK PERMISSIONS | |
323 | #====================================================================== |
|
323 | #====================================================================== | |
324 | ip_addr = get_ip_addr(environ) |
|
324 | ip_addr = get_ip_addr(environ) | |
325 | user, response_app = self._authorize(environ, parsed_request.action, parsed_request.repo_name, ip_addr) |
|
325 | user, response_app = self._authorize(environ, parsed_request.action, parsed_request.repo_name, ip_addr) | |
326 | if response_app is not None: |
|
326 | if response_app is not None: | |
327 | return response_app(environ, start_response) |
|
327 | return response_app(environ, start_response) | |
328 |
|
328 | |||
329 | #====================================================================== |
|
329 | #====================================================================== | |
330 | # REQUEST HANDLING |
|
330 | # REQUEST HANDLING | |
331 | #====================================================================== |
|
331 | #====================================================================== | |
332 | set_hook_environment(user.username, ip_addr, |
|
332 | set_hook_environment(user.username, ip_addr, | |
333 | parsed_request.repo_name, self.scm_alias, parsed_request.action) |
|
333 | parsed_request.repo_name, self.scm_alias, parsed_request.action) | |
334 |
|
334 | |||
335 | try: |
|
335 | try: | |
336 | log.info('%s action on %s repo "%s" by "%s" from %s', |
|
336 | log.info('%s action on %s repo "%s" by "%s" from %s', | |
337 | parsed_request.action, self.scm_alias, parsed_request.repo_name, user.username, ip_addr) |
|
337 | parsed_request.action, self.scm_alias, parsed_request.repo_name, user.username, ip_addr) | |
338 | app = self._make_app(parsed_request) |
|
338 | app = self._make_app(parsed_request) | |
339 | return app(environ, start_response) |
|
339 | return app(environ, start_response) | |
340 | except Exception: |
|
340 | except Exception: | |
341 | log.error(traceback.format_exc()) |
|
341 | log.error(traceback.format_exc()) | |
342 | raise webob.exc.HTTPInternalServerError() |
|
342 | raise webob.exc.HTTPInternalServerError() | |
343 |
|
343 | |||
344 | except webob.exc.HTTPException as e: |
|
344 | except webob.exc.HTTPException as e: | |
345 | return e(environ, start_response) |
|
345 | return e(environ, start_response) | |
346 |
|
346 | |||
347 |
|
347 | |||
348 | class BaseController(TGController): |
|
348 | class BaseController(TGController): | |
349 |
|
349 | |||
350 | def _before(self, *args, **kwargs): |
|
350 | def _before(self, *args, **kwargs): | |
351 | """ |
|
351 | """ | |
352 | _before is called before controller methods and after __call__ |
|
352 | _before is called before controller methods and after __call__ | |
353 | """ |
|
353 | """ | |
354 | if request.needs_csrf_check: |
|
354 | if request.needs_csrf_check: | |
355 | # CSRF protection: Whenever a request has ambient authority (whether |
|
355 | # CSRF protection: Whenever a request has ambient authority (whether | |
356 | # through a session cookie or its origin IP address), it must include |
|
356 | # through a session cookie or its origin IP address), it must include | |
357 | # the correct token, unless the HTTP method is GET or HEAD (and thus |
|
357 | # the correct token, unless the HTTP method is GET or HEAD (and thus | |
358 | # guaranteed to be side effect free. In practice, the only situation |
|
358 | # guaranteed to be side effect free. In practice, the only situation | |
359 | # where we allow side effects without ambient authority is when the |
|
359 | # where we allow side effects without ambient authority is when the | |
360 | # authority comes from an API key; and that is handled above. |
|
360 | # authority comes from an API key; and that is handled above. | |
361 | token = request.POST.get(webutils.session_csrf_secret_name) |
|
361 | token = request.POST.get(webutils.session_csrf_secret_name) | |
362 | if not token or token != webutils.session_csrf_secret_token(): |
|
362 | if not token or token != webutils.session_csrf_secret_token(): | |
363 | log.error('CSRF check failed') |
|
363 | log.error('CSRF check failed') | |
364 | raise webob.exc.HTTPForbidden() |
|
364 | raise webob.exc.HTTPForbidden() | |
365 |
|
365 | |||
366 | c.kallithea_version = kallithea.__version__ |
|
366 | c.kallithea_version = kallithea.__version__ | |
367 | settings = db.Setting.get_app_settings() |
|
367 | settings = db.Setting.get_app_settings() | |
368 |
|
368 | |||
369 | # Visual options |
|
369 | # Visual options | |
370 | c.visual = AttributeDict({}) |
|
370 | c.visual = AttributeDict({}) | |
371 |
|
371 | |||
372 | ## DB stored |
|
372 | ## DB stored | |
373 | c.visual.show_public_icon = asbool(settings.get('show_public_icon')) |
|
373 | c.visual.show_public_icon = asbool(settings.get('show_public_icon')) | |
374 | c.visual.show_private_icon = asbool(settings.get('show_private_icon')) |
|
374 | c.visual.show_private_icon = asbool(settings.get('show_private_icon')) | |
375 | c.visual.stylify_metalabels = asbool(settings.get('stylify_metalabels')) |
|
375 | c.visual.stylify_metalabels = asbool(settings.get('stylify_metalabels')) | |
376 | c.visual.page_size = safe_int(settings.get('dashboard_items', 100)) |
|
376 | c.visual.page_size = safe_int(settings.get('dashboard_items', 100)) | |
377 | c.visual.admin_grid_items = safe_int(settings.get('admin_grid_items', 100)) |
|
377 | c.visual.admin_grid_items = safe_int(settings.get('admin_grid_items', 100)) | |
378 | c.visual.repository_fields = asbool(settings.get('repository_fields')) |
|
378 | c.visual.repository_fields = asbool(settings.get('repository_fields')) | |
379 | c.visual.show_version = asbool(settings.get('show_version')) |
|
379 | c.visual.show_version = asbool(settings.get('show_version')) | |
380 | c.visual.use_gravatar = asbool(settings.get('use_gravatar')) |
|
380 | c.visual.use_gravatar = asbool(settings.get('use_gravatar')) | |
381 | c.visual.gravatar_url = settings.get('gravatar_url') |
|
381 | c.visual.gravatar_url = settings.get('gravatar_url') | |
382 |
|
382 | |||
383 | c.ga_code = settings.get('ga_code') |
|
383 | c.ga_code = settings.get('ga_code') | |
384 | # TODO: replace undocumented backwards compatibility hack with db upgrade and rename ga_code |
|
384 | # TODO: replace undocumented backwards compatibility hack with db upgrade and rename ga_code | |
385 | if c.ga_code and '<' not in c.ga_code: |
|
385 | if c.ga_code and '<' not in c.ga_code: | |
386 | c.ga_code = '''<script type="text/javascript"> |
|
386 | c.ga_code = '''<script type="text/javascript"> | |
387 | var _gaq = _gaq || []; |
|
387 | var _gaq = _gaq || []; | |
388 | _gaq.push(['_setAccount', '%s']); |
|
388 | _gaq.push(['_setAccount', '%s']); | |
389 | _gaq.push(['_trackPageview']); |
|
389 | _gaq.push(['_trackPageview']); | |
390 |
|
390 | |||
391 | (function() { |
|
391 | (function() { | |
392 | var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; |
|
392 | var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; | |
393 | ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; |
|
393 | ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; | |
394 | var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); |
|
394 | var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); | |
395 | })(); |
|
395 | })(); | |
396 | </script>''' % c.ga_code |
|
396 | </script>''' % c.ga_code | |
397 | c.site_name = settings.get('title') |
|
397 | c.site_name = settings.get('title') | |
398 | c.clone_uri_tmpl = settings.get('clone_uri_tmpl') or db.Repository.DEFAULT_CLONE_URI |
|
398 | c.clone_uri_tmpl = settings.get('clone_uri_tmpl') or db.Repository.DEFAULT_CLONE_URI | |
399 | c.clone_ssh_tmpl = settings.get('clone_ssh_tmpl') or db.Repository.DEFAULT_CLONE_SSH |
|
399 | c.clone_ssh_tmpl = settings.get('clone_ssh_tmpl') or db.Repository.DEFAULT_CLONE_SSH | |
400 |
|
400 | |||
401 | ## INI stored |
|
401 | ## INI stored | |
402 | c.visual.allow_repo_location_change = asbool(config.get('allow_repo_location_change', True)) |
|
402 | c.visual.allow_repo_location_change = asbool(config.get('allow_repo_location_change', True)) | |
403 | c.visual.allow_custom_hooks_settings = asbool(config.get('allow_custom_hooks_settings', True)) |
|
403 | c.visual.allow_custom_hooks_settings = asbool(config.get('allow_custom_hooks_settings', True)) | |
404 | c.ssh_enabled = asbool(config.get('ssh_enabled', False)) |
|
404 | c.ssh_enabled = asbool(config.get('ssh_enabled', False)) | |
405 |
|
405 | |||
406 | c.instance_id = config.get('instance_id') |
|
406 | c.instance_id = config.get('instance_id') | |
407 | c.issues_url = config.get('bugtracker', url('issues_url')) |
|
407 | c.issues_url = config.get('bugtracker', url('issues_url')) | |
408 | # END CONFIG VARS |
|
408 | # END CONFIG VARS | |
409 |
|
409 | |||
410 | c.repo_name = get_repo_slug(request) # can be empty |
|
410 | c.repo_name = get_repo_slug(request) # can be empty | |
411 | c.backends = list(kallithea.BACKENDS) |
|
411 | c.backends = list(kallithea.BACKENDS) | |
412 |
|
412 | |||
413 | self.cut_off_limit = safe_int(config.get('cut_off_limit')) |
|
413 | self.cut_off_limit = safe_int(config.get('cut_off_limit')) | |
414 |
|
414 | |||
415 | c.my_pr_count = db.PullRequest.query(reviewer_id=request.authuser.user_id, include_closed=False).count() |
|
415 | c.my_pr_count = db.PullRequest.query(reviewer_id=request.authuser.user_id, include_closed=False).count() | |
416 |
|
416 | |||
417 | self.scm_model = ScmModel() |
|
417 | self.scm_model = ScmModel() | |
418 |
|
418 | |||
419 | @staticmethod |
|
419 | @staticmethod | |
420 | def _determine_auth_user(session_authuser, ip_addr): |
|
420 | def _determine_auth_user(session_authuser, ip_addr): | |
421 | """ |
|
421 | """ | |
422 | Create an `AuthUser` object given the API key/bearer token |
|
422 | Create an `AuthUser` object given the API key/bearer token | |
423 | (if any) and the value of the authuser session cookie. |
|
423 | (if any) and the value of the authuser session cookie. | |
424 | Returns None if no valid user is found (like not active or no access for IP). |
|
424 | Returns None if no valid user is found (like not active or no access for IP). | |
425 | """ |
|
425 | """ | |
426 |
|
426 | |||
427 | # Authenticate by session cookie |
|
427 | # Authenticate by session cookie | |
428 | # In ancient login sessions, 'authuser' may not be a dict. |
|
428 | # In ancient login sessions, 'authuser' may not be a dict. | |
429 | # In that case, the user will have to log in again. |
|
429 | # In that case, the user will have to log in again. | |
430 | # v0.3 and earlier included an 'is_authenticated' key; if present, |
|
430 | # v0.3 and earlier included an 'is_authenticated' key; if present, | |
431 | # this must be True. |
|
431 | # this must be True. | |
432 | if isinstance(session_authuser, dict) and session_authuser.get('is_authenticated', True): |
|
432 | if isinstance(session_authuser, dict) and session_authuser.get('is_authenticated', True): | |
433 | return AuthUser.from_cookie(session_authuser, ip_addr=ip_addr) |
|
433 | return AuthUser.from_cookie(session_authuser, ip_addr=ip_addr) | |
434 |
|
434 | |||
435 | # Authenticate by auth_container plugin (if enabled) |
|
435 | # Authenticate by auth_container plugin (if enabled) | |
436 | if any( |
|
436 | if any( | |
437 | plugin.is_container_auth |
|
437 | plugin.is_container_auth | |
438 | for plugin in auth_modules.get_auth_plugins() |
|
438 | for plugin in auth_modules.get_auth_plugins() | |
439 | ): |
|
439 | ): | |
440 | try: |
|
440 | try: | |
441 | user_info = auth_modules.authenticate('', '', request.environ) |
|
441 | user_info = auth_modules.authenticate('', '', request.environ) | |
442 | except UserCreationError as e: |
|
442 | except UserCreationError as e: | |
443 | webutils.flash(e, 'error', logf=log.error) |
|
443 | webutils.flash(e, 'error', logf=log.error) | |
444 | else: |
|
444 | else: | |
445 | if user_info is not None: |
|
445 | if user_info is not None: | |
446 | username = user_info['username'] |
|
446 | username = user_info['username'] | |
447 | user = db.User.get_by_username(username, case_insensitive=True) |
|
447 | user = db.User.get_by_username(username, case_insensitive=True) | |
448 | return log_in_user(user, remember=False, is_external_auth=True, ip_addr=ip_addr) |
|
448 | return log_in_user(user, remember=False, is_external_auth=True, ip_addr=ip_addr) | |
449 |
|
449 | |||
450 | # User is default user (if active) or anonymous |
|
450 | # User is default user (if active) or anonymous | |
451 | default_user = db.User.get_default_user() |
|
451 | default_user = db.User.get_default_user() | |
452 | authuser = AuthUser.make(dbuser=default_user, ip_addr=ip_addr) |
|
452 | authuser = AuthUser.make(dbuser=default_user, ip_addr=ip_addr) | |
453 | if authuser is None: # fall back to anonymous |
|
453 | if authuser is None: # fall back to anonymous | |
454 | authuser = AuthUser(dbuser=default_user) # TODO: somehow use .make? |
|
454 | authuser = AuthUser(dbuser=default_user) # TODO: somehow use .make? | |
455 | return authuser |
|
455 | return authuser | |
456 |
|
456 | |||
457 | @staticmethod |
|
457 | @staticmethod | |
458 | def _basic_security_checks(): |
|
458 | def _basic_security_checks(): | |
459 | """Perform basic security/sanity checks before processing the request.""" |
|
459 | """Perform basic security/sanity checks before processing the request.""" | |
460 |
|
460 | |||
461 | # Only allow the following HTTP request methods. |
|
461 | # Only allow the following HTTP request methods. | |
462 | if request.method not in ['GET', 'HEAD', 'POST']: |
|
462 | if request.method not in ['GET', 'HEAD', 'POST']: | |
463 | raise webob.exc.HTTPMethodNotAllowed() |
|
463 | raise webob.exc.HTTPMethodNotAllowed() | |
464 |
|
464 | |||
465 | # Also verify the _method override - no longer allowed. |
|
465 | # Also verify the _method override - no longer allowed. | |
466 | if request.params.get('_method') is None: |
|
466 | if request.params.get('_method') is None: | |
467 | pass # no override, no problem |
|
467 | pass # no override, no problem | |
468 | else: |
|
468 | else: | |
469 | raise webob.exc.HTTPMethodNotAllowed() |
|
469 | raise webob.exc.HTTPMethodNotAllowed() | |
470 |
|
470 | |||
471 | # Make sure CSRF token never appears in the URL. If so, invalidate it. |
|
471 | # Make sure CSRF token never appears in the URL. If so, invalidate it. | |
472 | if webutils.session_csrf_secret_name in request.GET: |
|
472 | if webutils.session_csrf_secret_name in request.GET: | |
473 | log.error('CSRF key leak detected') |
|
473 | log.error('CSRF key leak detected') | |
474 | session.pop(webutils.session_csrf_secret_name, None) |
|
474 | session.pop(webutils.session_csrf_secret_name, None) | |
475 | session.save() |
|
475 | session.save() | |
476 | webutils.flash(_('CSRF token leak has been detected - all form tokens have been expired'), |
|
476 | webutils.flash(_('CSRF token leak has been detected - all form tokens have been expired'), | |
477 | category='error') |
|
477 | category='error') | |
478 |
|
478 | |||
479 | # WebOb already ignores request payload parameters for anything other |
|
479 | # WebOb already ignores request payload parameters for anything other | |
480 | # than POST/PUT, but double-check since other Kallithea code relies on |
|
480 | # than POST/PUT, but double-check since other Kallithea code relies on | |
481 | # this assumption. |
|
481 | # this assumption. | |
482 | if request.method not in ['POST', 'PUT'] and request.POST: |
|
482 | if request.method not in ['POST', 'PUT'] and request.POST: | |
483 | log.error('%r request with payload parameters; WebOb should have stopped this', request.method) |
|
483 | log.error('%r request with payload parameters; WebOb should have stopped this', request.method) | |
484 | raise webob.exc.HTTPBadRequest() |
|
484 | raise webob.exc.HTTPBadRequest() | |
485 |
|
485 | |||
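
To illustrate the CSRF rule enforced above: the session CSRF token should only ever travel in the request body, never in the query string. A minimal sketch, using the ``webutils.session_csrf_secret_name`` symbol referenced by the check and a placeholder token value::

    from kallithea.lib import webutils

    token = 'abc123'  # placeholder value of the session CSRF secret

    # Fine: the token is sent in the POST body of a form submission.
    post_body = {webutils.session_csrf_secret_name: token, 'field': 'value'}

    # Not fine: the token appears in the query string. _basic_security_checks()
    # then logs 'CSRF key leak detected', drops the secret from the session and
    # flashes an error saying that all form tokens have been expired.
    bad_url = '/some/page?%s=%s' % (webutils.session_csrf_secret_name, token)
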
486 | def __call__(self, environ, context): |
|
486 | def __call__(self, environ, context): | |
487 | try: |
|
487 | try: | |
488 | ip_addr = get_ip_addr(environ) |
|
488 | ip_addr = get_ip_addr(environ) | |
489 | self._basic_security_checks() |
|
489 | self._basic_security_checks() | |
490 |
|
490 | |||
491 | api_key = request.GET.get('api_key') |
|
491 | api_key = request.GET.get('api_key') | |
492 | try: |
|
492 | try: | |
493 | # Request.authorization may raise ValueError on invalid input |
|
493 | # Request.authorization may raise ValueError on invalid input | |
494 | type, params = request.authorization |
|
494 | type, params = request.authorization | |
495 | except (ValueError, TypeError): |
|
495 | except (ValueError, TypeError): | |
496 | pass |
|
496 | pass | |
497 | else: |
|
497 | else: | |
498 | if type.lower() == 'bearer': |
|
498 | if type.lower() == 'bearer': | |
499 | api_key = params # bearer token is an api key too |
|
499 | api_key = params # bearer token is an api key too | |
500 |
|
500 | |||
501 | if api_key is None: |
|
501 | if api_key is None: | |
502 | authuser = self._determine_auth_user( |
|
502 | authuser = self._determine_auth_user( | |
503 | session.get('authuser'), |
|
503 | session.get('authuser'), | |
504 | ip_addr=ip_addr, |
|
504 | ip_addr=ip_addr, | |
505 | ) |
|
505 | ) | |
506 | needs_csrf_check = request.method not in ['GET', 'HEAD'] |
|
506 | needs_csrf_check = request.method not in ['GET', 'HEAD'] | |
507 |
|
507 | |||
508 | else: |
|
508 | else: | |
509 | dbuser = db.User.get_by_api_key(api_key) |
|
509 | dbuser = db.User.get_by_api_key(api_key) | |
510 | if dbuser is None: |
|
510 | if dbuser is None: | |
511 | log.info('No db user found for authentication with API key ****%s from %s', |
|
511 | log.info('No db user found for authentication with API key ****%s from %s', | |
512 | api_key[-4:], ip_addr) |
|
512 | api_key[-4:], ip_addr) | |
513 | authuser = AuthUser.make(dbuser=dbuser, is_external_auth=True, ip_addr=ip_addr) |
|
513 | authuser = AuthUser.make(dbuser=dbuser, is_external_auth=True, ip_addr=ip_addr) | |
514 | needs_csrf_check = False # API key provides CSRF protection |
|
514 | needs_csrf_check = False # API key provides CSRF protection | |
515 |
|
515 | |||
516 | if authuser is None: |
|
516 | if authuser is None: | |
517 | log.info('No valid user found') |
|
517 | log.info('No valid user found') | |
518 | raise webob.exc.HTTPForbidden() |
|
518 | raise webob.exc.HTTPForbidden() | |
519 |
|
519 | |||
520 | # set globals for auth user |
|
520 | # set globals for auth user | |
521 | request.authuser = authuser |
|
521 | request.authuser = authuser | |
522 | request.ip_addr = ip_addr |
|
522 | request.ip_addr = ip_addr | |
523 | request.needs_csrf_check = needs_csrf_check |
|
523 | request.needs_csrf_check = needs_csrf_check | |
524 |
|
524 | |||
525 | log.info('IP: %s User: %s Request: %s', |
|
525 | log.info('IP: %s User: %s Request: %s', | |
526 | request.ip_addr, request.authuser, |
|
526 | request.ip_addr, request.authuser, | |
527 | get_path_info(environ), |
|
527 | get_path_info(environ), | |
528 | ) |
|
528 | ) | |
529 | return super(BaseController, self).__call__(environ, context) |
|
529 | return super(BaseController, self).__call__(environ, context) | |
530 | except webob.exc.HTTPException as e: |
|
530 | except webob.exc.HTTPException as e: | |
531 | return e |
|
531 | return e | |
532 |
|
532 | |||
533 |
|
533 | |||
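
To make the authentication branches in ``__call__`` above concrete, here is a rough client-side sketch of the two ways an API key can be supplied; the instance URL, path and token are placeholders, and the use of the ``requests`` library is purely illustrative::

    import requests

    BASE = 'https://kallithea.example.com'  # placeholder instance URL
    TOKEN = '0123456789abcdef'              # placeholder API key

    # 1) API key as a query parameter, picked up via request.GET.get('api_key')
    requests.get(BASE + '/some/page', params={'api_key': TOKEN})

    # 2) The same key sent as a bearer token; __call__ treats it as an API key too
    requests.get(BASE + '/some/page', headers={'Authorization': 'Bearer ' + TOKEN})

    # Either form authenticates the request via db.User.get_by_api_key() and sets
    # needs_csrf_check to False, since the API key itself protects against CSRF.
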
534 | class BaseRepoController(BaseController): |
|
534 | class BaseRepoController(BaseController): | |
535 | """ |
|
535 | """ | |
536 | Base class for controllers responsible for loading all data needed for a |
|
536 | Base class for controllers responsible for loading all data needed for a | |
537 | repository. Loaded items are: |
|
537 | repository. Loaded items are: | |
538 |
|
538 | |||
539 | c.db_repo_scm_instance: instance of scm repository |
|
539 | c.db_repo_scm_instance: instance of scm repository | |
540 | c.db_repo: instance of db |
|
540 | c.db_repo: instance of db | |
541 | c.repository_followers: number of followers |
|
541 | c.repository_followers: number of followers | |
542 | c.repository_forks: number of forks |
|
542 | c.repository_forks: number of forks | |
543 | c.repository_following: whether the current user is following the current repo |
|
543 | c.repository_following: whether the current user is following the current repo | |
544 | """ |
|
544 | """ | |
545 |
|
545 | |||
546 | def _before(self, *args, **kwargs): |
|
546 | def _before(self, *args, **kwargs): | |
547 | super(BaseRepoController, self)._before(*args, **kwargs) |
|
547 | super(BaseRepoController, self)._before(*args, **kwargs) | |
548 | if c.repo_name: # extracted from request by base-base BaseController._before |
|
548 | if c.repo_name: # extracted from request by base-base BaseController._before | |
549 | _dbr = db.Repository.get_by_repo_name(c.repo_name) |
|
549 | _dbr = db.Repository.get_by_repo_name(c.repo_name) | |
550 | if not _dbr: |
|
550 | if not _dbr: | |
551 | return |
|
551 | return | |
552 |
|
552 | |||
553 | log.debug('Found repository in database %s with state `%s`', |
|
553 | log.debug('Found repository in database %s with state `%s`', | |
554 | _dbr, _dbr.repo_state) |
|
554 | _dbr, _dbr.repo_state) | |
555 | route = getattr(request.environ.get('routes.route'), 'name', '') |
|
555 | route = getattr(request.environ.get('routes.route'), 'name', '') | |
556 |
|
556 | |||
557 | # allow deleting repos that are somehow damaged in the filesystem |
|
557 | # allow deleting repos that are somehow damaged in the filesystem | |
558 | if route in ['delete_repo']: |
|
558 | if route in ['delete_repo']: | |
559 | return |
|
559 | return | |
560 |
|
560 | |||
561 | if _dbr.repo_state in [db.Repository.STATE_PENDING]: |
|
561 | if _dbr.repo_state in [db.Repository.STATE_PENDING]: | |
562 | if route in ['repo_creating_home']: |
|
562 | if route in ['repo_creating_home']: | |
563 | return |
|
563 | return | |
564 | check_url = url('repo_creating_home', repo_name=c.repo_name) |
|
564 | check_url = url('repo_creating_home', repo_name=c.repo_name) | |
565 | raise webob.exc.HTTPFound(location=check_url) |
|
565 | raise webob.exc.HTTPFound(location=check_url) | |
566 |
|
566 | |||
567 | dbr = c.db_repo = _dbr |
|
567 | dbr = c.db_repo = _dbr | |
568 | c.db_repo_scm_instance = c.db_repo.scm_instance |
|
568 | c.db_repo_scm_instance = c.db_repo.scm_instance | |
569 | if c.db_repo_scm_instance is None: |
|
569 | if c.db_repo_scm_instance is None: | |
570 | log.error('%s this repository is present in database but it ' |
|
570 | log.error('%s this repository is present in database but it ' | |
571 | 'cannot be created as an scm instance', c.repo_name) |
|
571 | 'cannot be created as an scm instance', c.repo_name) | |
572 | webutils.flash(_('Repository not found in the filesystem'), |
|
572 | webutils.flash(_('Repository not found in the filesystem'), | |
573 | category='error') |
|
573 | category='error') | |
574 | raise webob.exc.HTTPNotFound() |
|
574 | raise webob.exc.HTTPNotFound() | |
575 |
|
575 | |||
576 | # some global counters for the menu |
|
576 | # some global counters for the menu | |
577 | c.repository_followers = self.scm_model.get_followers(dbr) |
|
577 | c.repository_followers = self.scm_model.get_followers(dbr) | |
578 | c.repository_forks = self.scm_model.get_forks(dbr) |
|
578 | c.repository_forks = self.scm_model.get_forks(dbr) | |
579 | c.repository_pull_requests = self.scm_model.get_pull_requests(dbr) |
|
579 | c.repository_pull_requests = self.scm_model.get_pull_requests(dbr) | |
580 | c.repository_following = self.scm_model.is_following_repo( |
|
580 | c.repository_following = self.scm_model.is_following_repo( | |
581 | c.repo_name, request.authuser.user_id) |
|
581 | c.repo_name, request.authuser.user_id) | |
582 |
|
582 | |||
583 | @staticmethod |
|
583 | @staticmethod | |
584 | def _get_ref_rev(repo, ref_type, ref_name, returnempty=False): |
|
584 | def _get_ref_rev(repo, ref_type, ref_name, returnempty=False): | |
585 | """ |
|
585 | """ | |
586 | Safe way to get a changeset. If an error occurs, show an error. |
|
586 | Safe way to get a changeset. If an error occurs, show an error. | |
587 | """ |
|
587 | """ | |
588 | try: |
|
588 | try: | |
589 | return repo.scm_instance.get_ref_revision(ref_type, ref_name) |
|
589 | return repo.scm_instance.get_ref_revision(ref_type, ref_name) | |
590 | except EmptyRepositoryError as e: |
|
590 | except EmptyRepositoryError as e: | |
591 | if returnempty: |
|
591 | if returnempty: | |
592 | return repo.scm_instance.EMPTY_CHANGESET |
|
592 | return repo.scm_instance.EMPTY_CHANGESET | |
593 | webutils.flash(_('There are no changesets yet'), category='error') |
|
593 | webutils.flash(_('There are no changesets yet'), category='error') | |
594 | raise webob.exc.HTTPNotFound() |
|
594 | raise webob.exc.HTTPNotFound() | |
595 | except ChangesetDoesNotExistError as e: |
|
595 | except ChangesetDoesNotExistError as e: | |
596 | webutils.flash(_('Changeset for %s %s not found in %s') % |
|
596 | webutils.flash(_('Changeset for %s %s not found in %s') % | |
597 | (ref_type, ref_name, repo.repo_name), |
|
597 | (ref_type, ref_name, repo.repo_name), | |
598 | category='error') |
|
598 | category='error') | |
599 | raise webob.exc.HTTPNotFound() |
|
599 | raise webob.exc.HTTPNotFound() | |
600 | except RepositoryError as e: |
|
600 | except RepositoryError as e: | |
601 | log.error(traceback.format_exc()) |
|
601 | log.error(traceback.format_exc()) | |
602 | webutils.flash(e, category='error') |
|
602 | webutils.flash(e, category='error') | |
603 | raise webob.exc.HTTPBadRequest() |
|
603 | raise webob.exc.HTTPBadRequest() | |
604 |
|
604 | |||
605 |
|
605 | |||
606 | @decorator.decorator |
|
606 | @decorator.decorator | |
607 | def jsonify(func, *args, **kwargs): |
|
607 | def jsonify(func, *args, **kwargs): | |
608 | """Action decorator that formats output for JSON |
|
608 | """Action decorator that formats output for JSON | |
609 |
|
609 | |||
610 | Given a function that will return content, this decorator will turn |
|
610 | Given a function that will return content, this decorator will turn | |
611 | the result into JSON, with a content-type of 'application/json' and |
|
611 | the result into JSON, with a content-type of 'application/json' and | |
612 | output it. |
|
612 | output it. | |
613 | """ |
|
613 | """ | |
614 | response.headers['Content-Type'] = 'application/json; charset=utf-8' |
|
614 | response.headers['Content-Type'] = 'application/json; charset=utf-8' | |
615 | data = func(*args, **kwargs) |
|
615 | data = func(*args, **kwargs) | |
616 | if isinstance(data, (list, tuple)): |
|
616 | if isinstance(data, (list, tuple)): | |
617 | # A JSON list response is syntactically valid JavaScript and can be |
|
617 | # A JSON list response is syntactically valid JavaScript and can be | |
618 | # loaded and executed as JavaScript by a malicious third-party site |
|
618 | # loaded and executed as JavaScript by a malicious third-party site | |
619 | # using <script>, which can lead to cross-site data leaks. |
|
619 | # using <script>, which can lead to cross-site data leaks. | |
620 | # JSON responses should therefore be scalars or objects (i.e. Python |
|
620 | # JSON responses should therefore be scalars or objects (i.e. Python | |
621 | # dicts), because a JSON object is a syntax error if interpreted as JS. |
|
621 | # dicts), because a JSON object is a syntax error if interpreted as JS. | |
622 | msg = "JSON responses with Array envelopes are susceptible to " \ |
|
622 | msg = "JSON responses with Array envelopes are susceptible to " \ | |
623 | "cross-site data leak attacks, see " \ |
|
623 | "cross-site data leak attacks, see " \ | |
624 | "https://web.archive.org/web/20120519231904/http://wiki.pylonshq.com/display/pylonsfaq/Warnings" |
|
624 | "https://web.archive.org/web/20120519231904/http://wiki.pylonshq.com/display/pylonsfaq/Warnings" | |
625 | warnings.warn(msg, Warning, 2) |
|
625 | warnings.warn(msg, Warning, 2) | |
626 | log.warning(msg) |
|
626 | log.warning(msg) | |
627 | log.debug("Returning JSON wrapped action output") |
|
627 | log.debug("Returning JSON wrapped action output") | |
628 | return ascii_bytes(ext_json.dumps(data)) |
|
628 | return ascii_bytes(ext_json.dumps(data)) | |
629 |
|
629 | |||
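
As a brief illustration of the warning above, an action decorated with ``jsonify`` should wrap list data in an object envelope instead of returning a bare list; the controller, action and data below are invented, but the ``{'results': ...}`` envelope mirrors the pattern used by the changeset controllers later in this changeset::

    from kallithea.controllers import base

    class TagsController(base.BaseRepoController):  # hypothetical controller

        @base.jsonify
        def repo_tags(self):
            tags = ['v1.0', 'v1.1']  # hypothetical data
            # Returning `tags` directly would emit a top-level JSON array and
            # trigger the warning above; wrapping it in a dict keeps the response
            # from being loadable as JavaScript by a third-party <script> tag.
            return {'results': tags}
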
630 | @decorator.decorator |
|
630 | @decorator.decorator | |
631 | def IfSshEnabled(func, *args, **kwargs): |
|
631 | def IfSshEnabled(func, *args, **kwargs): | |
632 | """Decorator for functions that can only be called if SSH access is enabled. |
|
632 | """Decorator for functions that can only be called if SSH access is enabled. | |
633 |
|
633 | |||
634 | If SSH access is disabled in the configuration file, HTTPNotFound is raised. |
|
634 | If SSH access is disabled in the configuration file, HTTPNotFound is raised. | |
635 | """ |
|
635 | """ | |
636 | if not c.ssh_enabled: |
|
636 | if not c.ssh_enabled: | |
637 | webutils.flash(_("SSH access is disabled."), category='warning') |
|
637 | webutils.flash(_("SSH access is disabled."), category='warning') | |
638 | raise webob.exc.HTTPNotFound() |
|
638 | raise webob.exc.HTTPNotFound() | |
639 | return func(*args, **kwargs) |
|
639 | return func(*args, **kwargs) |
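
For context, a sketch of how ``IfSshEnabled`` might be applied, assuming the decorator, base class and ``render`` are importable from ``kallithea.controllers.base`` as in the imports used elsewhere in this changeset; the controller, action and template names are invented::

    from kallithea.controllers import base

    class SshKeysController(base.BaseController):  # hypothetical controller

        @base.IfSshEnabled
        def index(self):
            # Reached only when `ssh_enabled` is set in the configuration;
            # otherwise the decorator flashes a warning and raises HTTPNotFound.
            return base.render('ssh/keys.html')  # hypothetical template path
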
@@ -1,157 +1,157 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.changelog |
|
15 | kallithea.controllers.changelog | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | changelog controller for Kallithea |
|
18 | changelog controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 21, 2010 |
|
22 | :created_on: Apr 21, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | from tg import request, session |
|
31 | from tg import request, session | |
32 | from tg import tmpl_context as c |
|
32 | from tg import tmpl_context as c | |
33 | from tg.i18n import ugettext as _ |
|
33 | from tg.i18n import ugettext as _ | |
34 | from webob.exc import HTTPBadRequest, HTTPFound, HTTPNotFound |
|
34 | from webob.exc import HTTPBadRequest, HTTPFound, HTTPNotFound | |
35 |
|
35 | |||
|
36 | from kallithea.controllers import base | |||
36 | from kallithea.lib import webutils |
|
37 | from kallithea.lib import webutils | |
37 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
38 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired | |
38 | from kallithea.lib.base import BaseRepoController, render |
|
|||
39 | from kallithea.lib.graphmod import graph_data |
|
39 | from kallithea.lib.graphmod import graph_data | |
40 | from kallithea.lib.page import Page |
|
40 | from kallithea.lib.page import Page | |
41 | from kallithea.lib.utils2 import safe_int |
|
41 | from kallithea.lib.utils2 import safe_int | |
42 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, ChangesetError, EmptyRepositoryError, NodeDoesNotExistError, RepositoryError |
|
42 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, ChangesetError, EmptyRepositoryError, NodeDoesNotExistError, RepositoryError | |
43 | from kallithea.lib.webutils import url |
|
43 | from kallithea.lib.webutils import url | |
44 |
|
44 | |||
45 |
|
45 | |||
46 | log = logging.getLogger(__name__) |
|
46 | log = logging.getLogger(__name__) | |
47 |
|
47 | |||
48 |
|
48 | |||
49 | class ChangelogController(BaseRepoController): |
|
49 | class ChangelogController(base.BaseRepoController): | |
50 |
|
50 | |||
51 | def _before(self, *args, **kwargs): |
|
51 | def _before(self, *args, **kwargs): | |
52 | super(ChangelogController, self)._before(*args, **kwargs) |
|
52 | super(ChangelogController, self)._before(*args, **kwargs) | |
53 | c.affected_files_cut_off = 60 |
|
53 | c.affected_files_cut_off = 60 | |
54 |
|
54 | |||
55 | @staticmethod |
|
55 | @staticmethod | |
56 | def __get_cs(rev, repo): |
|
56 | def __get_cs(rev, repo): | |
57 | """ |
|
57 | """ | |
58 | Safe way to get a changeset. If an error occurs, fail with an error message. |
|
58 | Safe way to get a changeset. If an error occurs, fail with an error message. | |
59 |
|
59 | |||
60 | :param rev: revision to fetch |
|
60 | :param rev: revision to fetch | |
61 | :param repo: repo instance |
|
61 | :param repo: repo instance | |
62 | """ |
|
62 | """ | |
63 |
|
63 | |||
64 | try: |
|
64 | try: | |
65 | return c.db_repo_scm_instance.get_changeset(rev) |
|
65 | return c.db_repo_scm_instance.get_changeset(rev) | |
66 | except EmptyRepositoryError as e: |
|
66 | except EmptyRepositoryError as e: | |
67 | webutils.flash(_('There are no changesets yet'), category='error') |
|
67 | webutils.flash(_('There are no changesets yet'), category='error') | |
68 | except RepositoryError as e: |
|
68 | except RepositoryError as e: | |
69 | log.error(traceback.format_exc()) |
|
69 | log.error(traceback.format_exc()) | |
70 | webutils.flash(e, category='error') |
|
70 | webutils.flash(e, category='error') | |
71 | raise HTTPBadRequest() |
|
71 | raise HTTPBadRequest() | |
72 |
|
72 | |||
73 | @LoginRequired(allow_default_user=True) |
|
73 | @LoginRequired(allow_default_user=True) | |
74 | @HasRepoPermissionLevelDecorator('read') |
|
74 | @HasRepoPermissionLevelDecorator('read') | |
75 | def index(self, repo_name, revision=None, f_path=None): |
|
75 | def index(self, repo_name, revision=None, f_path=None): | |
76 | limit = 2000 |
|
76 | limit = 2000 | |
77 | default = 100 |
|
77 | default = 100 | |
78 | if request.GET.get('size'): |
|
78 | if request.GET.get('size'): | |
79 | c.size = max(min(safe_int(request.GET.get('size')), limit), 1) |
|
79 | c.size = max(min(safe_int(request.GET.get('size')), limit), 1) | |
80 | session['changelog_size'] = c.size |
|
80 | session['changelog_size'] = c.size | |
81 | session.save() |
|
81 | session.save() | |
82 | else: |
|
82 | else: | |
83 | c.size = int(session.get('changelog_size', default)) |
|
83 | c.size = int(session.get('changelog_size', default)) | |
84 | # min size must be 1 |
|
84 | # min size must be 1 | |
85 | c.size = max(c.size, 1) |
|
85 | c.size = max(c.size, 1) | |
86 | p = safe_int(request.GET.get('page'), 1) |
|
86 | p = safe_int(request.GET.get('page'), 1) | |
87 | branch_name = request.GET.get('branch', None) |
|
87 | branch_name = request.GET.get('branch', None) | |
88 | if (branch_name and |
|
88 | if (branch_name and | |
89 | branch_name not in c.db_repo_scm_instance.branches and |
|
89 | branch_name not in c.db_repo_scm_instance.branches and | |
90 | branch_name not in c.db_repo_scm_instance.closed_branches and |
|
90 | branch_name not in c.db_repo_scm_instance.closed_branches and | |
91 | not revision |
|
91 | not revision | |
92 | ): |
|
92 | ): | |
93 | raise HTTPFound(location=url('changelog_file_home', repo_name=c.repo_name, |
|
93 | raise HTTPFound(location=url('changelog_file_home', repo_name=c.repo_name, | |
94 | revision=branch_name, f_path=f_path or '')) |
|
94 | revision=branch_name, f_path=f_path or '')) | |
95 |
|
95 | |||
96 | if revision == 'tip': |
|
96 | if revision == 'tip': | |
97 | revision = None |
|
97 | revision = None | |
98 |
|
98 | |||
99 | c.changelog_for_path = f_path |
|
99 | c.changelog_for_path = f_path | |
100 | try: |
|
100 | try: | |
101 |
|
101 | |||
102 | if f_path: |
|
102 | if f_path: | |
103 | log.debug('generating changelog for path %s', f_path) |
|
103 | log.debug('generating changelog for path %s', f_path) | |
104 | # get the history for the file ! |
|
104 | # get the history for the file ! | |
105 | tip_cs = c.db_repo_scm_instance.get_changeset() |
|
105 | tip_cs = c.db_repo_scm_instance.get_changeset() | |
106 | try: |
|
106 | try: | |
107 | collection = tip_cs.get_file_history(f_path) |
|
107 | collection = tip_cs.get_file_history(f_path) | |
108 | except (NodeDoesNotExistError, ChangesetError): |
|
108 | except (NodeDoesNotExistError, ChangesetError): | |
109 | # this node is not present at tip ! |
|
109 | # this node is not present at tip ! | |
110 | try: |
|
110 | try: | |
111 | cs = self.__get_cs(revision, repo_name) |
|
111 | cs = self.__get_cs(revision, repo_name) | |
112 | collection = cs.get_file_history(f_path) |
|
112 | collection = cs.get_file_history(f_path) | |
113 | except RepositoryError as e: |
|
113 | except RepositoryError as e: | |
114 | webutils.flash(e, category='warning') |
|
114 | webutils.flash(e, category='warning') | |
115 | raise HTTPFound(location=webutils.url('changelog_home', repo_name=repo_name)) |
|
115 | raise HTTPFound(location=webutils.url('changelog_home', repo_name=repo_name)) | |
116 | else: |
|
116 | else: | |
117 | collection = c.db_repo_scm_instance.get_changesets(start=0, end=revision, |
|
117 | collection = c.db_repo_scm_instance.get_changesets(start=0, end=revision, | |
118 | branch_name=branch_name, reverse=True) |
|
118 | branch_name=branch_name, reverse=True) | |
119 | c.total_cs = len(collection) |
|
119 | c.total_cs = len(collection) | |
120 |
|
120 | |||
121 | c.cs_pagination = Page(collection, page=p, item_count=c.total_cs, items_per_page=c.size, |
|
121 | c.cs_pagination = Page(collection, page=p, item_count=c.total_cs, items_per_page=c.size, | |
122 | branch=branch_name) |
|
122 | branch=branch_name) | |
123 |
|
123 | |||
124 | page_revisions = [x.raw_id for x in c.cs_pagination] |
|
124 | page_revisions = [x.raw_id for x in c.cs_pagination] | |
125 | c.cs_comments = c.db_repo.get_comments(page_revisions) |
|
125 | c.cs_comments = c.db_repo.get_comments(page_revisions) | |
126 | c.cs_statuses = c.db_repo.statuses(page_revisions) |
|
126 | c.cs_statuses = c.db_repo.statuses(page_revisions) | |
127 | except EmptyRepositoryError as e: |
|
127 | except EmptyRepositoryError as e: | |
128 | webutils.flash(e, category='warning') |
|
128 | webutils.flash(e, category='warning') | |
129 | raise HTTPFound(location=url('summary_home', repo_name=c.repo_name)) |
|
129 | raise HTTPFound(location=url('summary_home', repo_name=c.repo_name)) | |
130 | except (RepositoryError, ChangesetDoesNotExistError, Exception) as e: |
|
130 | except (RepositoryError, ChangesetDoesNotExistError, Exception) as e: | |
131 | log.error(traceback.format_exc()) |
|
131 | log.error(traceback.format_exc()) | |
132 | webutils.flash(e, category='error') |
|
132 | webutils.flash(e, category='error') | |
133 | raise HTTPFound(location=url('changelog_home', repo_name=c.repo_name)) |
|
133 | raise HTTPFound(location=url('changelog_home', repo_name=c.repo_name)) | |
134 |
|
134 | |||
135 | c.branch_name = branch_name |
|
135 | c.branch_name = branch_name | |
136 | c.branch_filters = [('', _('None'))] + \ |
|
136 | c.branch_filters = [('', _('None'))] + \ | |
137 | [(k, k) for k in c.db_repo_scm_instance.branches] |
|
137 | [(k, k) for k in c.db_repo_scm_instance.branches] | |
138 | if c.db_repo_scm_instance.closed_branches: |
|
138 | if c.db_repo_scm_instance.closed_branches: | |
139 | prefix = _('(closed)') + ' ' |
|
139 | prefix = _('(closed)') + ' ' | |
140 | c.branch_filters += [('-', '-')] + \ |
|
140 | c.branch_filters += [('-', '-')] + \ | |
141 | [(k, prefix + k) for k in c.db_repo_scm_instance.closed_branches] |
|
141 | [(k, prefix + k) for k in c.db_repo_scm_instance.closed_branches] | |
142 | revs = [] |
|
142 | revs = [] | |
143 | if not f_path: |
|
143 | if not f_path: | |
144 | revs = [x.revision for x in c.cs_pagination] |
|
144 | revs = [x.revision for x in c.cs_pagination] | |
145 | c.jsdata = graph_data(c.db_repo_scm_instance, revs) |
|
145 | c.jsdata = graph_data(c.db_repo_scm_instance, revs) | |
146 |
|
146 | |||
147 | c.revision = revision # requested revision ref |
|
147 | c.revision = revision # requested revision ref | |
148 | c.first_revision = c.cs_pagination[0] # pagination is never empty here! |
|
148 | c.first_revision = c.cs_pagination[0] # pagination is never empty here! | |
149 | return render('changelog/changelog.html') |
|
149 | return base.render('changelog/changelog.html') | |
150 |
|
150 | |||
151 | @LoginRequired(allow_default_user=True) |
|
151 | @LoginRequired(allow_default_user=True) | |
152 | @HasRepoPermissionLevelDecorator('read') |
|
152 | @HasRepoPermissionLevelDecorator('read') | |
153 | def changelog_details(self, cs): |
|
153 | def changelog_details(self, cs): | |
154 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
154 | if request.environ.get('HTTP_X_PARTIAL_XHR'): | |
155 | c.cs = c.db_repo_scm_instance.get_changeset(cs) |
|
155 | c.cs = c.db_repo_scm_instance.get_changeset(cs) | |
156 | return render('changelog/changelog_details.html') |
|
156 | return base.render('changelog/changelog_details.html') | |
157 | raise HTTPNotFound() |
|
157 | raise HTTPNotFound() |
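
A small worked example of the page-size handling in ``index`` above; the requested value is hypothetical::

    limit, default = 2000, 100
    requested = 7000                      # hypothetical ?size= value from the URL
    size = max(min(requested, limit), 1)  # -> 2000, clamped into the 1..2000 range
    # The clamped value is stored in the session as 'changelog_size' and reused
    # on later requests that do not pass ?size= at all.
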
@@ -1,370 +1,370 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.changeset |
|
15 | kallithea.controllers.changeset | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | changeset controller showing changes between revisions |
|
18 | changeset controller showing changes between revisions | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 25, 2010 |
|
22 | :created_on: Apr 25, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import binascii |
|
28 | import binascii | |
29 | import logging |
|
29 | import logging | |
30 | import traceback |
|
30 | import traceback | |
31 | from collections import OrderedDict |
|
31 | from collections import OrderedDict | |
32 |
|
32 | |||
33 | from tg import request, response |
|
33 | from tg import request, response | |
34 | from tg import tmpl_context as c |
|
34 | from tg import tmpl_context as c | |
35 | from tg.i18n import ugettext as _ |
|
35 | from tg.i18n import ugettext as _ | |
36 | from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPNotFound |
|
36 | from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPNotFound | |
37 |
|
37 | |||
38 | import kallithea.lib.helpers as h |
|
38 | import kallithea.lib.helpers as h | |
|
39 | from kallithea.controllers import base | |||
39 | from kallithea.lib import auth, diffs, webutils |
|
40 | from kallithea.lib import auth, diffs, webutils | |
40 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
41 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired | |
41 | from kallithea.lib.base import BaseRepoController, jsonify, render |
|
|||
42 | from kallithea.lib.graphmod import graph_data |
|
42 | from kallithea.lib.graphmod import graph_data | |
43 | from kallithea.lib.utils2 import ascii_str, safe_str |
|
43 | from kallithea.lib.utils2 import ascii_str, safe_str | |
44 | from kallithea.lib.vcs.backends.base import EmptyChangeset |
|
44 | from kallithea.lib.vcs.backends.base import EmptyChangeset | |
45 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError, RepositoryError |
|
45 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError, RepositoryError | |
46 | from kallithea.model import db, meta, userlog |
|
46 | from kallithea.model import db, meta, userlog | |
47 | from kallithea.model.changeset_status import ChangesetStatusModel |
|
47 | from kallithea.model.changeset_status import ChangesetStatusModel | |
48 | from kallithea.model.comment import ChangesetCommentsModel |
|
48 | from kallithea.model.comment import ChangesetCommentsModel | |
49 | from kallithea.model.pull_request import PullRequestModel |
|
49 | from kallithea.model.pull_request import PullRequestModel | |
50 |
|
50 | |||
51 |
|
51 | |||
52 | log = logging.getLogger(__name__) |
|
52 | log = logging.getLogger(__name__) | |
53 |
|
53 | |||
54 |
|
54 | |||
55 | def create_cs_pr_comment(repo_name, revision=None, pull_request=None, allowed_to_change_status=True): |
|
55 | def create_cs_pr_comment(repo_name, revision=None, pull_request=None, allowed_to_change_status=True): | |
56 | """ |
|
56 | """ | |
57 | Add a comment to the specified changeset or pull request, using POST values |
|
57 | Add a comment to the specified changeset or pull request, using POST values | |
58 | from the request. |
|
58 | from the request. | |
59 |
|
59 | |||
60 | Comments can be inline (when a file path and line number are specified in |
|
60 | Comments can be inline (when a file path and line number are specified in | |
61 | POST) or general comments. |
|
61 | POST) or general comments. | |
62 | A comment can be accompanied by a review status change (accepted, rejected, |
|
62 | A comment can be accompanied by a review status change (accepted, rejected, | |
63 | etc.). Pull requests can be closed or deleted. |
|
63 | etc.). Pull requests can be closed or deleted. | |
64 |
|
64 | |||
65 | Parameter 'allowed_to_change_status' is used for both status changes and |
|
65 | Parameter 'allowed_to_change_status' is used for both status changes and | |
66 | closing of pull requests. For deleting of pull requests, more specific |
|
66 | closing of pull requests. For deleting of pull requests, more specific | |
67 | checks are done. |
|
67 | checks are done. | |
68 | """ |
|
68 | """ | |
69 |
|
69 | |||
70 | assert request.environ.get('HTTP_X_PARTIAL_XHR') |
|
70 | assert request.environ.get('HTTP_X_PARTIAL_XHR') | |
71 | if pull_request: |
|
71 | if pull_request: | |
72 | pull_request_id = pull_request.pull_request_id |
|
72 | pull_request_id = pull_request.pull_request_id | |
73 | else: |
|
73 | else: | |
74 | pull_request_id = None |
|
74 | pull_request_id = None | |
75 |
|
75 | |||
76 | status = request.POST.get('changeset_status') |
|
76 | status = request.POST.get('changeset_status') | |
77 | close_pr = request.POST.get('save_close') |
|
77 | close_pr = request.POST.get('save_close') | |
78 | delete = request.POST.get('save_delete') |
|
78 | delete = request.POST.get('save_delete') | |
79 | f_path = request.POST.get('f_path') |
|
79 | f_path = request.POST.get('f_path') | |
80 | line_no = request.POST.get('line') |
|
80 | line_no = request.POST.get('line') | |
81 |
|
81 | |||
82 | if (status or close_pr or delete) and (f_path or line_no): |
|
82 | if (status or close_pr or delete) and (f_path or line_no): | |
83 | # status votes and closing is only possible in general comments |
|
83 | # status votes and closing is only possible in general comments | |
84 | raise HTTPBadRequest() |
|
84 | raise HTTPBadRequest() | |
85 |
|
85 | |||
86 | if not allowed_to_change_status: |
|
86 | if not allowed_to_change_status: | |
87 | if status or close_pr: |
|
87 | if status or close_pr: | |
88 | webutils.flash(_('No permission to change status'), 'error') |
|
88 | webutils.flash(_('No permission to change status'), 'error') | |
89 | raise HTTPForbidden() |
|
89 | raise HTTPForbidden() | |
90 |
|
90 | |||
91 | if pull_request and delete == "delete": |
|
91 | if pull_request and delete == "delete": | |
92 | if (pull_request.owner_id == request.authuser.user_id or |
|
92 | if (pull_request.owner_id == request.authuser.user_id or | |
93 | auth.HasPermissionAny('hg.admin')() or |
|
93 | auth.HasPermissionAny('hg.admin')() or | |
94 | auth.HasRepoPermissionLevel('admin')(pull_request.org_repo.repo_name) or |
|
94 | auth.HasRepoPermissionLevel('admin')(pull_request.org_repo.repo_name) or | |
95 | auth.HasRepoPermissionLevel('admin')(pull_request.other_repo.repo_name) |
|
95 | auth.HasRepoPermissionLevel('admin')(pull_request.other_repo.repo_name) | |
96 | ) and not pull_request.is_closed(): |
|
96 | ) and not pull_request.is_closed(): | |
97 | PullRequestModel().delete(pull_request) |
|
97 | PullRequestModel().delete(pull_request) | |
98 | meta.Session().commit() |
|
98 | meta.Session().commit() | |
99 | webutils.flash(_('Successfully deleted pull request %s') % pull_request_id, |
|
99 | webutils.flash(_('Successfully deleted pull request %s') % pull_request_id, | |
100 | category='success') |
|
100 | category='success') | |
101 | return { |
|
101 | return { | |
102 | 'location': webutils.url('my_pullrequests'), # or repo pr list? |
|
102 | 'location': webutils.url('my_pullrequests'), # or repo pr list? | |
103 | } |
|
103 | } | |
104 | raise HTTPForbidden() |
|
104 | raise HTTPForbidden() | |
105 |
|
105 | |||
106 | text = request.POST.get('text', '').strip() |
|
106 | text = request.POST.get('text', '').strip() | |
107 |
|
107 | |||
108 | comment = ChangesetCommentsModel().create( |
|
108 | comment = ChangesetCommentsModel().create( | |
109 | text=text, |
|
109 | text=text, | |
110 | repo=c.db_repo.repo_id, |
|
110 | repo=c.db_repo.repo_id, | |
111 | author=request.authuser.user_id, |
|
111 | author=request.authuser.user_id, | |
112 | revision=revision, |
|
112 | revision=revision, | |
113 | pull_request=pull_request_id, |
|
113 | pull_request=pull_request_id, | |
114 | f_path=f_path or None, |
|
114 | f_path=f_path or None, | |
115 | line_no=line_no or None, |
|
115 | line_no=line_no or None, | |
116 | status_change=db.ChangesetStatus.get_status_lbl(status) if status else None, |
|
116 | status_change=db.ChangesetStatus.get_status_lbl(status) if status else None, | |
117 | closing_pr=close_pr, |
|
117 | closing_pr=close_pr, | |
118 | ) |
|
118 | ) | |
119 |
|
119 | |||
120 | if status: |
|
120 | if status: | |
121 | ChangesetStatusModel().set_status( |
|
121 | ChangesetStatusModel().set_status( | |
122 | c.db_repo.repo_id, |
|
122 | c.db_repo.repo_id, | |
123 | status, |
|
123 | status, | |
124 | request.authuser.user_id, |
|
124 | request.authuser.user_id, | |
125 | comment, |
|
125 | comment, | |
126 | revision=revision, |
|
126 | revision=revision, | |
127 | pull_request=pull_request_id, |
|
127 | pull_request=pull_request_id, | |
128 | ) |
|
128 | ) | |
129 |
|
129 | |||
130 | if pull_request: |
|
130 | if pull_request: | |
131 | action = 'user_commented_pull_request:%s' % pull_request_id |
|
131 | action = 'user_commented_pull_request:%s' % pull_request_id | |
132 | else: |
|
132 | else: | |
133 | action = 'user_commented_revision:%s' % revision |
|
133 | action = 'user_commented_revision:%s' % revision | |
134 | userlog.action_logger(request.authuser, action, c.db_repo, request.ip_addr) |
|
134 | userlog.action_logger(request.authuser, action, c.db_repo, request.ip_addr) | |
135 |
|
135 | |||
136 | if pull_request and close_pr: |
|
136 | if pull_request and close_pr: | |
137 | PullRequestModel().close_pull_request(pull_request_id) |
|
137 | PullRequestModel().close_pull_request(pull_request_id) | |
138 | userlog.action_logger(request.authuser, |
|
138 | userlog.action_logger(request.authuser, | |
139 | 'user_closed_pull_request:%s' % pull_request_id, |
|
139 | 'user_closed_pull_request:%s' % pull_request_id, | |
140 | c.db_repo, request.ip_addr) |
|
140 | c.db_repo, request.ip_addr) | |
141 |
|
141 | |||
142 | meta.Session().commit() |
|
142 | meta.Session().commit() | |
143 |
|
143 | |||
144 | data = { |
|
144 | data = { | |
145 | 'target_id': webutils.safeid(request.POST.get('f_path')), |
|
145 | 'target_id': webutils.safeid(request.POST.get('f_path')), | |
146 | } |
|
146 | } | |
147 | if comment is not None: |
|
147 | if comment is not None: | |
148 | c.comment = comment |
|
148 | c.comment = comment | |
149 | data.update(comment.get_dict()) |
|
149 | data.update(comment.get_dict()) | |
150 | data.update({'rendered_text': |
|
150 | data.update({'rendered_text': | |
151 | render('changeset/changeset_comment_block.html')}) |
|
151 | base.render('changeset/changeset_comment_block.html')}) | |
152 |
|
152 | |||
153 | return data |
|
153 | return data | |
154 |
|
154 | |||
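
As a rough illustration of the POST values ``create_cs_pr_comment`` reads, a partial-XHR form submission might carry fields along these lines; the concrete values are invented::

    # General comment with a status vote on a changeset or pull request.
    status_vote = {'text': 'Looks good to me', 'changeset_status': 'approved'}

    # Inline comment attached to a file and line; combining this with a status
    # vote, close or delete is rejected with HTTPBadRequest.
    inline_note = {'text': 'Typo on this line', 'f_path': 'setup.py', 'line': 'n10'}

    # Close the pull request together with the comment (pull request view only).
    close_comment = {'text': 'Merged, thanks!', 'save_close': 'close'}
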
155 | def delete_cs_pr_comment(repo_name, comment_id): |
|
155 | def delete_cs_pr_comment(repo_name, comment_id): | |
156 | """Delete a comment from a changeset or pull request""" |
|
156 | """Delete a comment from a changeset or pull request""" | |
157 | co = db.ChangesetComment.get_or_404(comment_id) |
|
157 | co = db.ChangesetComment.get_or_404(comment_id) | |
158 | if co.repo.repo_name != repo_name: |
|
158 | if co.repo.repo_name != repo_name: | |
159 | raise HTTPNotFound() |
|
159 | raise HTTPNotFound() | |
160 | if co.pull_request and co.pull_request.is_closed(): |
|
160 | if co.pull_request and co.pull_request.is_closed(): | |
161 | # don't allow deleting comments on closed pull request |
|
161 | # don't allow deleting comments on closed pull request | |
162 | raise HTTPForbidden() |
|
162 | raise HTTPForbidden() | |
163 |
|
163 | |||
164 | owner = co.author_id == request.authuser.user_id |
|
164 | owner = co.author_id == request.authuser.user_id | |
165 | repo_admin = auth.HasRepoPermissionLevel('admin')(repo_name) |
|
165 | repo_admin = auth.HasRepoPermissionLevel('admin')(repo_name) | |
166 | if auth.HasPermissionAny('hg.admin')() or repo_admin or owner: |
|
166 | if auth.HasPermissionAny('hg.admin')() or repo_admin or owner: | |
167 | ChangesetCommentsModel().delete(comment=co) |
|
167 | ChangesetCommentsModel().delete(comment=co) | |
168 | meta.Session().commit() |
|
168 | meta.Session().commit() | |
169 | return True |
|
169 | return True | |
170 | else: |
|
170 | else: | |
171 | raise HTTPForbidden() |
|
171 | raise HTTPForbidden() | |
172 |
|
172 | |||
173 | class ChangesetController(BaseRepoController): |
|
173 | class ChangesetController(base.BaseRepoController): | |
174 |
|
174 | |||
175 | def _before(self, *args, **kwargs): |
|
175 | def _before(self, *args, **kwargs): | |
176 | super(ChangesetController, self)._before(*args, **kwargs) |
|
176 | super(ChangesetController, self)._before(*args, **kwargs) | |
177 | c.affected_files_cut_off = 60 |
|
177 | c.affected_files_cut_off = 60 | |
178 |
|
178 | |||
179 | def _index(self, revision, method): |
|
179 | def _index(self, revision, method): | |
180 | c.pull_request = None |
|
180 | c.pull_request = None | |
181 | c.fulldiff = request.GET.get('fulldiff') # for reporting number of changed files |
|
181 | c.fulldiff = request.GET.get('fulldiff') # for reporting number of changed files | |
182 | # get ranges of revisions if present |
|
182 | # get ranges of revisions if present | |
183 | rev_range = revision.split('...')[:2] |
|
183 | rev_range = revision.split('...')[:2] | |
184 | c.cs_repo = c.db_repo |
|
184 | c.cs_repo = c.db_repo | |
185 | try: |
|
185 | try: | |
186 | if len(rev_range) == 2: |
|
186 | if len(rev_range) == 2: | |
187 | rev_start = rev_range[0] |
|
187 | rev_start = rev_range[0] | |
188 | rev_end = rev_range[1] |
|
188 | rev_end = rev_range[1] | |
189 | rev_ranges = c.db_repo_scm_instance.get_changesets(start=rev_start, |
|
189 | rev_ranges = c.db_repo_scm_instance.get_changesets(start=rev_start, | |
190 | end=rev_end) |
|
190 | end=rev_end) | |
191 | else: |
|
191 | else: | |
192 | rev_ranges = [c.db_repo_scm_instance.get_changeset(revision)] |
|
192 | rev_ranges = [c.db_repo_scm_instance.get_changeset(revision)] | |
193 |
|
193 | |||
194 | c.cs_ranges = list(rev_ranges) |
|
194 | c.cs_ranges = list(rev_ranges) | |
195 | if not c.cs_ranges: |
|
195 | if not c.cs_ranges: | |
196 | raise RepositoryError('Changeset range returned empty result') |
|
196 | raise RepositoryError('Changeset range returned empty result') | |
197 |
|
197 | |||
198 | except (ChangesetDoesNotExistError, EmptyRepositoryError): |
|
198 | except (ChangesetDoesNotExistError, EmptyRepositoryError): | |
199 | log.debug(traceback.format_exc()) |
|
199 | log.debug(traceback.format_exc()) | |
200 | msg = _('Such revision does not exist for this repository') |
|
200 | msg = _('Such revision does not exist for this repository') | |
201 | webutils.flash(msg, category='error') |
|
201 | webutils.flash(msg, category='error') | |
202 | raise HTTPNotFound() |
|
202 | raise HTTPNotFound() | |
203 |
|
203 | |||
204 | c.changes = OrderedDict() |
|
204 | c.changes = OrderedDict() | |
205 |
|
205 | |||
206 | c.lines_added = 0 # count of lines added |
|
206 | c.lines_added = 0 # count of lines added | |
207 | c.lines_deleted = 0 # count of lines removed |
|
207 | c.lines_deleted = 0 # count of lines removed | |
208 |
|
208 | |||
209 | c.changeset_statuses = db.ChangesetStatus.STATUSES |
|
209 | c.changeset_statuses = db.ChangesetStatus.STATUSES | |
210 | comments = dict() |
|
210 | comments = dict() | |
211 | c.statuses = [] |
|
211 | c.statuses = [] | |
212 | c.inline_comments = [] |
|
212 | c.inline_comments = [] | |
213 | c.inline_cnt = 0 |
|
213 | c.inline_cnt = 0 | |
214 |
|
214 | |||
215 | # Iterate over ranges (default changeset view is always one changeset) |
|
215 | # Iterate over ranges (default changeset view is always one changeset) | |
216 | for changeset in c.cs_ranges: |
|
216 | for changeset in c.cs_ranges: | |
217 | if method == 'show': |
|
217 | if method == 'show': | |
218 | c.statuses.extend([ChangesetStatusModel().get_status( |
|
218 | c.statuses.extend([ChangesetStatusModel().get_status( | |
219 | c.db_repo.repo_id, changeset.raw_id)]) |
|
219 | c.db_repo.repo_id, changeset.raw_id)]) | |
220 |
|
220 | |||
221 | # Changeset comments |
|
221 | # Changeset comments | |
222 | comments.update((com.comment_id, com) |
|
222 | comments.update((com.comment_id, com) | |
223 | for com in ChangesetCommentsModel() |
|
223 | for com in ChangesetCommentsModel() | |
224 | .get_comments(c.db_repo.repo_id, |
|
224 | .get_comments(c.db_repo.repo_id, | |
225 | revision=changeset.raw_id)) |
|
225 | revision=changeset.raw_id)) | |
226 |
|
226 | |||
227 | # Status change comments - mostly from pull requests |
|
227 | # Status change comments - mostly from pull requests | |
228 | comments.update((st.comment_id, st.comment) |
|
228 | comments.update((st.comment_id, st.comment) | |
229 | for st in ChangesetStatusModel() |
|
229 | for st in ChangesetStatusModel() | |
230 | .get_statuses(c.db_repo.repo_id, |
|
230 | .get_statuses(c.db_repo.repo_id, | |
231 | changeset.raw_id, with_revisions=True) |
|
231 | changeset.raw_id, with_revisions=True) | |
232 | if st.comment_id is not None) |
|
232 | if st.comment_id is not None) | |
233 |
|
233 | |||
234 | inlines = ChangesetCommentsModel() \ |
|
234 | inlines = ChangesetCommentsModel() \ | |
235 | .get_inline_comments(c.db_repo.repo_id, |
|
235 | .get_inline_comments(c.db_repo.repo_id, | |
236 | revision=changeset.raw_id) |
|
236 | revision=changeset.raw_id) | |
237 | c.inline_comments.extend(inlines) |
|
237 | c.inline_comments.extend(inlines) | |
238 |
|
238 | |||
239 | cs2 = changeset.raw_id |
|
239 | cs2 = changeset.raw_id | |
240 | cs1 = changeset.parents[0].raw_id if changeset.parents else EmptyChangeset().raw_id |
|
240 | cs1 = changeset.parents[0].raw_id if changeset.parents else EmptyChangeset().raw_id | |
241 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
241 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) | |
242 | diff_context_size = h.get_diff_context_size(request.GET) |
|
242 | diff_context_size = h.get_diff_context_size(request.GET) | |
243 | raw_diff = diffs.get_diff(c.db_repo_scm_instance, cs1, cs2, |
|
243 | raw_diff = diffs.get_diff(c.db_repo_scm_instance, cs1, cs2, | |
244 | ignore_whitespace=ignore_whitespace_diff, context=diff_context_size) |
|
244 | ignore_whitespace=ignore_whitespace_diff, context=diff_context_size) | |
245 | diff_limit = None if c.fulldiff else self.cut_off_limit |
|
245 | diff_limit = None if c.fulldiff else self.cut_off_limit | |
246 | file_diff_data = [] |
|
246 | file_diff_data = [] | |
247 | if method == 'show': |
|
247 | if method == 'show': | |
248 | diff_processor = diffs.DiffProcessor(raw_diff, |
|
248 | diff_processor = diffs.DiffProcessor(raw_diff, | |
249 | vcs=c.db_repo_scm_instance.alias, |
|
249 | vcs=c.db_repo_scm_instance.alias, | |
250 | diff_limit=diff_limit) |
|
250 | diff_limit=diff_limit) | |
251 | c.limited_diff = diff_processor.limited_diff |
|
251 | c.limited_diff = diff_processor.limited_diff | |
252 | for f in diff_processor.parsed: |
|
252 | for f in diff_processor.parsed: | |
253 | st = f['stats'] |
|
253 | st = f['stats'] | |
254 | c.lines_added += st['added'] |
|
254 | c.lines_added += st['added'] | |
255 | c.lines_deleted += st['deleted'] |
|
255 | c.lines_deleted += st['deleted'] | |
256 | filename = f['filename'] |
|
256 | filename = f['filename'] | |
257 | fid = h.FID(changeset.raw_id, filename) |
|
257 | fid = h.FID(changeset.raw_id, filename) | |
258 | url_fid = h.FID('', filename) |
|
258 | url_fid = h.FID('', filename) | |
259 | html_diff = diffs.as_html(parsed_lines=[f]) |
|
259 | html_diff = diffs.as_html(parsed_lines=[f]) | |
260 | file_diff_data.append((fid, url_fid, f['operation'], f['old_filename'], filename, html_diff, st)) |
|
260 | file_diff_data.append((fid, url_fid, f['operation'], f['old_filename'], filename, html_diff, st)) | |
261 | else: |
|
261 | else: | |
262 | # for downloads/raw we only need the raw diff, nothing else |
|
262 | # for downloads/raw we only need the raw diff, nothing else | |
263 | file_diff_data.append(('', None, None, None, raw_diff, None)) |
|
263 | file_diff_data.append(('', None, None, None, raw_diff, None)) | |
264 | c.changes[changeset.raw_id] = (cs1, cs2, file_diff_data) |
|
264 | c.changes[changeset.raw_id] = (cs1, cs2, file_diff_data) | |
265 |
|
265 | |||
266 | # sort comments in creation order |
|
266 | # sort comments in creation order | |
267 | c.comments = [com for com_id, com in sorted(comments.items())] |
|
267 | c.comments = [com for com_id, com in sorted(comments.items())] | |
268 |
|
268 | |||
269 | # count inline comments |
|
269 | # count inline comments | |
270 | for __, lines in c.inline_comments: |
|
270 | for __, lines in c.inline_comments: | |
271 | for comments in lines.values(): |
|
271 | for comments in lines.values(): | |
272 | c.inline_cnt += len(comments) |
|
272 | c.inline_cnt += len(comments) | |
273 |
|
273 | |||
274 | if len(c.cs_ranges) == 1: |
|
274 | if len(c.cs_ranges) == 1: | |
275 | c.changeset = c.cs_ranges[0] |
|
275 | c.changeset = c.cs_ranges[0] | |
276 | c.parent_tmpl = ''.join(['# Parent %s\n' % x.raw_id |
|
276 | c.parent_tmpl = ''.join(['# Parent %s\n' % x.raw_id | |
277 | for x in c.changeset.parents]) |
|
277 | for x in c.changeset.parents]) | |
278 | c.changeset_graft_source_hash = ascii_str(c.changeset.extra.get(b'source', b'')) |
|
278 | c.changeset_graft_source_hash = ascii_str(c.changeset.extra.get(b'source', b'')) | |
279 | c.changeset_transplant_source_hash = ascii_str(binascii.hexlify(c.changeset.extra.get(b'transplant_source', b''))) |
|
279 | c.changeset_transplant_source_hash = ascii_str(binascii.hexlify(c.changeset.extra.get(b'transplant_source', b''))) | |
280 | if method == 'download': |
|
280 | if method == 'download': | |
281 | response.content_type = 'text/plain' |
|
281 | response.content_type = 'text/plain' | |
282 | response.content_disposition = 'attachment; filename=%s.diff' \ |
|
282 | response.content_disposition = 'attachment; filename=%s.diff' \ | |
283 | % revision[:12] |
|
283 | % revision[:12] | |
284 | return raw_diff |
|
284 | return raw_diff | |
285 | elif method == 'patch': |
|
285 | elif method == 'patch': | |
286 | response.content_type = 'text/plain' |
|
286 | response.content_type = 'text/plain' | |
287 | c.diff = safe_str(raw_diff) |
|
287 | c.diff = safe_str(raw_diff) | |
288 | return render('changeset/patch_changeset.html') |
|
288 | return base.render('changeset/patch_changeset.html') | |
289 | elif method == 'raw': |
|
289 | elif method == 'raw': | |
290 | response.content_type = 'text/plain' |
|
290 | response.content_type = 'text/plain' | |
291 | return raw_diff |
|
291 | return raw_diff | |
292 | elif method == 'show': |
|
292 | elif method == 'show': | |
293 | if len(c.cs_ranges) == 1: |
|
293 | if len(c.cs_ranges) == 1: | |
294 | return render('changeset/changeset.html') |
|
294 | return base.render('changeset/changeset.html') | |
295 | else: |
|
295 | else: | |
296 | c.cs_ranges_org = None |
|
296 | c.cs_ranges_org = None | |
297 | c.cs_comments = {} |
|
297 | c.cs_comments = {} | |
298 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] |
|
298 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] | |
299 | c.jsdata = graph_data(c.db_repo_scm_instance, revs) |
|
299 | c.jsdata = graph_data(c.db_repo_scm_instance, revs) | |
300 | return render('changeset/changeset_range.html') |
|
+        return base.render('changeset/changeset_range.html')

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def index(self, revision, method='show'):
         return self._index(revision, method=method)

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def changeset_raw(self, revision):
         return self._index(revision, method='raw')

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def changeset_patch(self, revision):
         return self._index(revision, method='patch')

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def changeset_download(self, revision):
         return self._index(revision, method='download')

     @LoginRequired()
     @HasRepoPermissionLevelDecorator('read')
-    @jsonify
+    @base.jsonify
     def comment(self, repo_name, revision):
         return create_cs_pr_comment(repo_name, revision=revision)

     @LoginRequired()
     @HasRepoPermissionLevelDecorator('read')
-    @jsonify
+    @base.jsonify
     def delete_comment(self, repo_name, comment_id):
         return delete_cs_pr_comment(repo_name, comment_id)

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
-    @jsonify
+    @base.jsonify
     def changeset_info(self, repo_name, revision):
         if request.is_xhr:
             try:
                 return c.db_repo_scm_instance.get_changeset(revision)
             except ChangesetDoesNotExistError as e:
                 return EmptyChangeset(message=str(e))
         else:
             raise HTTPBadRequest()

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
-    @jsonify
+    @base.jsonify
     def changeset_children(self, repo_name, revision):
         if request.is_xhr:
             changeset = c.db_repo_scm_instance.get_changeset(revision)
             result = {"results": []}
             if changeset.children:
                 result = {"results": changeset.children}
             return result
         else:
             raise HTTPBadRequest()

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
-    @jsonify
+    @base.jsonify
     def changeset_parents(self, repo_name, revision):
         if request.is_xhr:
             changeset = c.db_repo_scm_instance.get_changeset(revision)
             result = {"results": []}
             if changeset.parents:
                 result = {"results": changeset.parents}
             return result
         else:
             raise HTTPBadRequest()
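The pattern in the hunk above repeats for every controller touched by this changeset: instead of importing render, jsonify and the controller base classes as individual names from kallithea.lib.base, the controllers import a single base module and qualify those names through it. A minimal sketch of how a controller reads after the change (ExampleController and its methods are made up for illustration; base.BaseRepoController, base.jsonify and base.render are the names that actually appear in the hunks)::

    from kallithea.controllers import base  # replaces: from kallithea.lib.base import BaseRepoController, jsonify, render

    class ExampleController(base.BaseRepoController):  # hypothetical controller, for illustration only

        @base.jsonify  # was: @jsonify
        def info(self, repo_name, revision):
            # dicts returned from a @base.jsonify method are serialized as JSON
            return {'repo': repo_name, 'revision': revision}

        def show(self, repo_name, revision):
            return base.render('changeset/changeset.html')  # was: render(...)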
--- a/kallithea/controllers/compare.py
+++ b/kallithea/controllers/compare.py
@@ -1,188 +1,188 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.compare
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 compare controller showing differences between two
 repos, branches, bookmarks or tips

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
     :created_on: May 6, 2012
     :author: marcink
     :copyright: (c) 2013 RhodeCode GmbH, and others.
     :license: GPLv3, see LICENSE.md for more details.
 """


 import logging

 from tg import request
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _
 from webob.exc import HTTPBadRequest, HTTPFound, HTTPNotFound

 import kallithea.lib.helpers as h
+from kallithea.controllers import base
 from kallithea.lib import diffs, webutils
 from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired
-from kallithea.lib.base import BaseRepoController, render
 from kallithea.lib.graphmod import graph_data
 from kallithea.lib.webutils import url
 from kallithea.model import db


 log = logging.getLogger(__name__)


-class CompareController(BaseRepoController):
+class CompareController(base.BaseRepoController):

     def _before(self, *args, **kwargs):
         super(CompareController, self)._before(*args, **kwargs)

         # The base repository has already been retrieved.
         c.a_repo = c.db_repo

         # Retrieve the "changeset" repository (default: same as base).
         other_repo = request.GET.get('other_repo', None)
         if other_repo is None:
             c.cs_repo = c.a_repo
         else:
             c.cs_repo = db.Repository.get_by_repo_name(other_repo)
             if c.cs_repo is None:
                 msg = _('Could not find other repository %s') % other_repo
                 webutils.flash(msg, category='error')
                 raise HTTPFound(location=url('compare_home', repo_name=c.a_repo.repo_name))

         # Verify that it's even possible to compare these two repositories.
         if c.a_repo.scm_instance.alias != c.cs_repo.scm_instance.alias:
             msg = _('Cannot compare repositories of different types')
             webutils.flash(msg, category='error')
             raise HTTPFound(location=url('compare_home', repo_name=c.a_repo.repo_name))

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def index(self, repo_name):
         c.compare_home = True
         c.a_ref_name = c.cs_ref_name = None
-        return render('compare/compare_diff.html')
+        return base.render('compare/compare_diff.html')

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def compare(self, repo_name, org_ref_type, org_ref_name, other_ref_type, other_ref_name):
         org_ref_name = org_ref_name.strip()
         other_ref_name = other_ref_name.strip()

         # If merge is True:
         #   Show what org would get if merged with other:
         #   List changesets that are ancestors of other but not of org.
         #   New changesets in org is thus ignored.
         #   Diff will be from common ancestor, and merges of org to other will thus be ignored.
         # If merge is False:
         #   Make a raw diff from org to other, no matter if related or not.
         #   Changesets in one and not in the other will be ignored
         merge = bool(request.GET.get('merge'))
         # fulldiff disables cut_off_limit
         fulldiff = request.GET.get('fulldiff')
         # partial uses compare_cs.html template directly
         partial = request.environ.get('HTTP_X_PARTIAL_XHR')
         # is_ajax_preview puts hidden input field with changeset revisions
         c.is_ajax_preview = partial and request.GET.get('is_ajax_preview')
         # swap url for compare_diff page - never partial and never is_ajax_preview
         c.swap_url = webutils.url('compare_url',
             repo_name=c.cs_repo.repo_name,
             org_ref_type=other_ref_type, org_ref_name=other_ref_name,
             other_repo=c.a_repo.repo_name,
             other_ref_type=org_ref_type, other_ref_name=org_ref_name,
             merge=merge or '')
         ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET)
         diff_context_size = h.get_diff_context_size(request.GET)

         c.a_rev = self._get_ref_rev(c.a_repo, org_ref_type, org_ref_name,
             returnempty=True)
         c.cs_rev = self._get_ref_rev(c.cs_repo, other_ref_type, other_ref_name)

         c.compare_home = False
         c.a_ref_name = org_ref_name
         c.a_ref_type = org_ref_type
         c.cs_ref_name = other_ref_name
         c.cs_ref_type = other_ref_type

         c.cs_ranges, c.cs_ranges_org, c.ancestors = c.a_repo.scm_instance.get_diff_changesets(
             c.a_rev, c.cs_repo.scm_instance, c.cs_rev)
         raw_ids = [x.raw_id for x in c.cs_ranges]
         c.cs_comments = c.cs_repo.get_comments(raw_ids)
         c.cs_statuses = c.cs_repo.statuses(raw_ids)

         revs = [ctx.revision for ctx in reversed(c.cs_ranges)]
         c.jsdata = graph_data(c.cs_repo.scm_instance, revs)

         if partial:
-            return render('compare/compare_cs.html')
+            return base.render('compare/compare_cs.html')

         org_repo = c.a_repo
         other_repo = c.cs_repo

         if merge:
             rev1 = msg = None
             if not c.cs_ranges:
                 msg = _('Cannot show empty diff')
             elif not c.ancestors:
                 msg = _('No ancestor found for merge diff')
             elif len(c.ancestors) == 1:
                 rev1 = c.ancestors[0]
             else:
                 msg = _('Multiple merge ancestors found for merge compare')
             if rev1 is None:
                 webutils.flash(msg, category='error')
                 log.error(msg)
                 raise HTTPNotFound

             # case we want a simple diff without incoming changesets,
             # previewing what will be merged.
             # Make the diff on the other repo (which is known to have other_rev)
             log.debug('Using ancestor %s as rev1 instead of %s',
                       rev1, c.a_rev)
             org_repo = other_repo
         else: # comparing tips, not necessarily linearly related
             if org_repo != other_repo:
                 # TODO: we could do this by using hg unionrepo
                 log.error('cannot compare across repos %s and %s', org_repo, other_repo)
                 webutils.flash(_('Cannot compare repositories without using common ancestor'), category='error')
                 raise HTTPBadRequest
             rev1 = c.a_rev

         diff_limit = None if fulldiff else self.cut_off_limit

         log.debug('running diff between %s and %s in %s',
                   rev1, c.cs_rev, org_repo.scm_instance.path)
         raw_diff = diffs.get_diff(org_repo.scm_instance, rev1=rev1, rev2=c.cs_rev,
                                   ignore_whitespace=ignore_whitespace_diff,
                                   context=diff_context_size)

         diff_processor = diffs.DiffProcessor(raw_diff, diff_limit=diff_limit)
         c.limited_diff = diff_processor.limited_diff
         c.file_diff_data = []
         c.lines_added = 0
         c.lines_deleted = 0
         for f in diff_processor.parsed:
             st = f['stats']
             c.lines_added += st['added']
             c.lines_deleted += st['deleted']
             filename = f['filename']
             fid = h.FID('', filename)
             html_diff = diffs.as_html(parsed_lines=[f])
             c.file_diff_data.append((fid, None, f['operation'], f['old_filename'], filename, html_diff, st))

-        return render('compare/compare_diff.html')
+        return base.render('compare/compare_diff.html')
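The comment block in compare() above describes two modes: with merge the diff base is the single common ancestor of the two refs, without merge it is a raw diff taken from the org revision, which is only supported within one repository. A standalone sketch of that base-revision rule (plain Python, not the Kallithea API; the function and its parameter names are made up)::

    def pick_diff_base(merge, org_rev, ancestors, same_repo):
        """Mirror of the rev1 selection in compare() above."""
        if merge:
            if len(ancestors) != 1:
                # compare() flashes 'No ancestor found' / 'Multiple merge ancestors found' and returns 404
                raise ValueError('need exactly one merge ancestor')
            return ancestors[0]
        if not same_repo:
            # compare() flashes 'Cannot compare repositories without using common ancestor' and returns 400
            raise ValueError('raw compare requires a single repository')
        return org_rev

    # e.g. pick_diff_base(merge=True, org_rev='aaa', ancestors=['ccc'], same_repo=False) == 'ccc'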
--- a/kallithea/controllers/error.py
+++ b/kallithea/controllers/error.py
@@ -1,91 +1,91 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.error
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~

 Kallithea error controller

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
     :created_on: Dec 8, 2010
     :author: marcink
     :copyright: (c) 2013 RhodeCode GmbH, and others.
     :license: GPLv3, see LICENSE.md for more details.
 """

 import html
 import logging

 from tg import config, expose, request
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _

-from kallithea.lib.base import BaseController
+from kallithea.controllers import base


 log = logging.getLogger(__name__)


-class ErrorController(BaseController):
+class ErrorController(base.BaseController):
     """Generates error documents as and when they are required.

     The errorpage middleware renders /error/document when error
     related status codes are returned from the application.
     """

     def _before(self, *args, **kwargs):
         # disable all base actions since we don't need them here
         pass

     @expose('/errors/error_document.html')
     def document(self, *args, **kwargs):
         resp = request.environ.get('tg.original_response')
         c.site_name = config.get('title')

         log.debug('### %s ###', resp and resp.status or 'no response')

         e = request.environ
         c.serv_p = r'%(protocol)s://%(host)s/' % {
             'protocol': e.get('wsgi.url_scheme'),
             'host': e.get('HTTP_HOST'), }
         if resp:
             c.error_message = html.escape(request.GET.get('code', str(resp.status)))
             c.error_explanation = self.get_error_explanation(resp.status_int)
         else:
             c.error_message = _('No response')
             c.error_explanation = _('Unknown error')

         return dict()

     def get_error_explanation(self, code):
         """ get the error explanations of int codes
             [400, 401, 403, 404, 500]"""
         try:
             code = int(code)
         except ValueError:
             code = 500

         if code == 400:
             return _('The request could not be understood by the server'
                      ' due to malformed syntax.')
         if code == 401:
             return _('Unauthorized access to resource')
         if code == 403:
             return _("You don't have permission to view this page")
         if code == 404:
             return _('The resource could not be found')
         if code == 500:
             return _('The server encountered an unexpected condition'
                      ' which prevented it from fulfilling the request.')
--- a/kallithea/controllers/feed.py
+++ b/kallithea/controllers/feed.py
@@ -1,134 +1,134 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.feed
 ~~~~~~~~~~~~~~~~~~~~~~~~~~

 Feed controller for Kallithea

 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
     :created_on: Apr 23, 2010
     :author: marcink
     :copyright: (c) 2013 RhodeCode GmbH, and others.
     :license: GPLv3, see LICENSE.md for more details.
 """


 import logging

 from beaker.cache import cache_region
 from tg import response
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _

 import kallithea
 import kallithea.lib.helpers as h
+from kallithea.controllers import base
 from kallithea.lib import feeds, webutils
 from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired
-from kallithea.lib.base import BaseRepoController
 from kallithea.lib.diffs import DiffProcessor
 from kallithea.lib.utils2 import asbool, fmt_date, safe_int, safe_str, shorter


 log = logging.getLogger(__name__)


-class FeedController(BaseRepoController):
+class FeedController(base.BaseRepoController):

     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def _before(self, *args, **kwargs):
         super(FeedController, self)._before(*args, **kwargs)

     def _get_title(self, cs):
         return shorter(cs.message, 160)

     def __get_desc(self, cs):
         desc_msg = [(_('%s committed on %s')
                      % (h.person(cs.author), fmt_date(cs.date))) + '<br/>']
         # branches, tags, bookmarks
         for branch in cs.branches:
             desc_msg.append('branch: %s<br/>' % branch)
         for book in cs.bookmarks:
             desc_msg.append('bookmark: %s<br/>' % book)
         for tag in cs.tags:
             desc_msg.append('tag: %s<br/>' % tag)

         changes = []
         diff_limit = safe_int(kallithea.CONFIG.get('rss_cut_off_limit', 32 * 1024))
         raw_diff = cs.diff()
         diff_processor = DiffProcessor(raw_diff,
                                        diff_limit=diff_limit,
                                        inline_diff=False)

         for st in diff_processor.parsed:
             st.update({'added': st['stats']['added'],
                        'removed': st['stats']['deleted']})
             changes.append('\n %(operation)s %(filename)s '
                            '(%(added)s lines added, %(removed)s lines removed)'
                            % st)
         if diff_processor.limited_diff:
             changes = changes + ['\n ' +
                                  _('Changeset was too big and was cut off...')]

         # rev link
         _url = webutils.canonical_url('changeset_home', repo_name=c.db_repo.repo_name,
                                       revision=cs.raw_id)
         desc_msg.append('changeset: <a href="%s">%s</a>' % (_url, cs.raw_id[:8]))

         desc_msg.append('<pre>')
         desc_msg.append(webutils.urlify_text(cs.message))
         desc_msg.append('\n')
         desc_msg.extend(changes)
         if asbool(kallithea.CONFIG.get('rss_include_diff', False)):
             desc_msg.append('\n\n')
             desc_msg.append(safe_str(raw_diff))
         desc_msg.append('</pre>')
         return desc_msg

     def _feed(self, repo_name, feeder):
         """Produce a simple feed"""

         @cache_region('long_term_file', '_get_feed_from_cache')
         def _get_feed_from_cache(*_cache_keys):  # parameters are not really used - only as caching key
             header = dict(
                 title=_('%s %s feed') % (c.site_name, repo_name),
                 link=webutils.canonical_url('summary_home', repo_name=repo_name),
                 description=_('Changes on %s repository') % repo_name,
             )

             rss_items_per_page = safe_int(kallithea.CONFIG.get('rss_items_per_page', 20))
             entries=[]
             for cs in reversed(list(c.db_repo_scm_instance[-rss_items_per_page:])):
                 entries.append(dict(
                     title=self._get_title(cs),
                     link=webutils.canonical_url('changeset_home', repo_name=repo_name, revision=cs.raw_id),
                     author_email=cs.author_email,
                     author_name=cs.author_name,
                     description=''.join(self.__get_desc(cs)),
                     pubdate=cs.date,
                 ))
             return feeder.render(header, entries)

         response.content_type = feeder.content_type
         return _get_feed_from_cache(repo_name, feeder.__name__)

     def atom(self, repo_name):
         """Produce a simple atom-1.0 feed"""
         return self._feed(repo_name, feeds.AtomFeed)

     def rss(self, repo_name):
         """Produce a simple rss2 feed"""
         return self._feed(repo_name, feeds.RssFeed)
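_feed() above caches the rendered feed with a beaker cache region: the decorated helper ignores its arguments at runtime, so they act purely as the cache key and there is one cached entry per (repository, feed class) pair. A small sketch of that pattern, configuring the region in code where Kallithea would normally get it from the .ini (region settings here are illustrative, not Kallithea's defaults)::

    from beaker.cache import cache_region, cache_regions

    # stand-in for the long_term_file region normally defined in the Kallithea .ini
    cache_regions['long_term_file'] = {'type': 'memory', 'expire': 600}

    @cache_region('long_term_file', '_get_feed_from_cache')
    def _get_feed_from_cache(*_cache_keys):  # parameters are only used as the caching key
        print('rendering feed for %r' % (_cache_keys,))
        return '<feed/>'

    _get_feed_from_cache('myrepo', 'AtomFeed')  # rendered and stored
    _get_feed_from_cache('myrepo', 'AtomFeed')  # served from the cache, no re-render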
@@ -1,745 +1,745 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.files |
|
15 | kallithea.controllers.files | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Files controller for Kallithea |
|
18 | Files controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 21, 2010 |
|
22 | :created_on: Apr 21, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import os |
|
29 | import os | |
30 | import posixpath |
|
30 | import posixpath | |
31 | import shutil |
|
31 | import shutil | |
32 | import tempfile |
|
32 | import tempfile | |
33 | import traceback |
|
33 | import traceback | |
34 | from collections import OrderedDict |
|
34 | from collections import OrderedDict | |
35 |
|
35 | |||
36 | from tg import request, response |
|
36 | from tg import request, response | |
37 | from tg import tmpl_context as c |
|
37 | from tg import tmpl_context as c | |
38 | from tg.i18n import ugettext as _ |
|
38 | from tg.i18n import ugettext as _ | |
39 | from webob.exc import HTTPFound, HTTPNotFound |
|
39 | from webob.exc import HTTPFound, HTTPNotFound | |
40 |
|
40 | |||
41 | import kallithea |
|
41 | import kallithea | |
42 | import kallithea.lib.helpers as h |
|
42 | import kallithea.lib.helpers as h | |
|
43 | from kallithea.controllers import base | |||
43 | from kallithea.lib import diffs, webutils |
|
44 | from kallithea.lib import diffs, webutils | |
44 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
45 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired | |
45 | from kallithea.lib.base import BaseRepoController, jsonify, render |
|
|||
46 | from kallithea.lib.exceptions import NonRelativePathError |
|
46 | from kallithea.lib.exceptions import NonRelativePathError | |
47 | from kallithea.lib.utils2 import asbool, convert_line_endings, detect_mode, safe_str |
|
47 | from kallithea.lib.utils2 import asbool, convert_line_endings, detect_mode, safe_str | |
48 | from kallithea.lib.vcs.backends.base import EmptyChangeset |
|
48 | from kallithea.lib.vcs.backends.base import EmptyChangeset | |
49 | from kallithea.lib.vcs.conf import settings |
|
49 | from kallithea.lib.vcs.conf import settings | |
50 | from kallithea.lib.vcs.exceptions import (ChangesetDoesNotExistError, ChangesetError, EmptyRepositoryError, ImproperArchiveTypeError, NodeAlreadyExistsError, |
|
50 | from kallithea.lib.vcs.exceptions import (ChangesetDoesNotExistError, ChangesetError, EmptyRepositoryError, ImproperArchiveTypeError, NodeAlreadyExistsError, | |
51 | NodeDoesNotExistError, NodeError, RepositoryError, VCSError) |
|
51 | NodeDoesNotExistError, NodeError, RepositoryError, VCSError) | |
52 | from kallithea.lib.vcs.nodes import FileNode |
|
52 | from kallithea.lib.vcs.nodes import FileNode | |
53 | from kallithea.lib.vcs.utils import author_email |
|
53 | from kallithea.lib.vcs.utils import author_email | |
54 | from kallithea.lib.webutils import url |
|
54 | from kallithea.lib.webutils import url | |
55 | from kallithea.model import userlog |
|
55 | from kallithea.model import userlog | |
56 | from kallithea.model.repo import RepoModel |
|
56 | from kallithea.model.repo import RepoModel | |
57 | from kallithea.model.scm import ScmModel |
|
57 | from kallithea.model.scm import ScmModel | |
58 |
|
58 | |||
59 |
|
59 | |||
60 | log = logging.getLogger(__name__) |
|
60 | log = logging.getLogger(__name__) | |
61 |
|
61 | |||
62 |
|
62 | |||
63 | class FilesController(BaseRepoController): |
|
63 | class FilesController(base.BaseRepoController): | |
64 |
|
64 | |||
65 | def _before(self, *args, **kwargs): |
|
65 | def _before(self, *args, **kwargs): | |
66 | super(FilesController, self)._before(*args, **kwargs) |
|
66 | super(FilesController, self)._before(*args, **kwargs) | |
67 |
|
67 | |||
68 | def __get_cs(self, rev, silent_empty=False): |
|
68 | def __get_cs(self, rev, silent_empty=False): | |
69 | """ |
|
69 | """ | |
70 | Safe way to get changeset if error occur it redirects to tip with |
|
70 | Safe way to get changeset if error occur it redirects to tip with | |
71 | proper message |
|
71 | proper message | |
72 |
|
72 | |||
73 | :param rev: revision to fetch |
|
73 | :param rev: revision to fetch | |
74 | :silent_empty: return None if repository is empty |
|
74 | :silent_empty: return None if repository is empty | |
75 | """ |
|
75 | """ | |
76 |
|
76 | |||
77 | try: |
|
77 | try: | |
78 | return c.db_repo_scm_instance.get_changeset(rev) |
|
78 | return c.db_repo_scm_instance.get_changeset(rev) | |
79 | except EmptyRepositoryError as e: |
|
79 | except EmptyRepositoryError as e: | |
80 | if silent_empty: |
|
80 | if silent_empty: | |
81 | return None |
|
81 | return None | |
82 | url_ = url('files_add_home', |
|
82 | url_ = url('files_add_home', | |
83 | repo_name=c.repo_name, |
|
83 | repo_name=c.repo_name, | |
84 | revision=0, f_path='', anchor='edit') |
|
84 | revision=0, f_path='', anchor='edit') | |
85 | add_new = webutils.link_to(_('Click here to add new file'), url_, class_="alert-link") |
|
85 | add_new = webutils.link_to(_('Click here to add new file'), url_, class_="alert-link") | |
86 | webutils.flash(_('There are no files yet.') + ' ' + add_new, category='warning') |
|
86 | webutils.flash(_('There are no files yet.') + ' ' + add_new, category='warning') | |
87 | raise HTTPNotFound() |
|
87 | raise HTTPNotFound() | |
88 | except (ChangesetDoesNotExistError, LookupError): |
|
88 | except (ChangesetDoesNotExistError, LookupError): | |
89 | msg = _('Such revision does not exist for this repository') |
|
89 | msg = _('Such revision does not exist for this repository') | |
90 | webutils.flash(msg, category='error') |
|
90 | webutils.flash(msg, category='error') | |
91 | raise HTTPNotFound() |
|
91 | raise HTTPNotFound() | |
92 | except RepositoryError as e: |
|
92 | except RepositoryError as e: | |
93 | webutils.flash(e, category='error') |
|
93 | webutils.flash(e, category='error') | |
94 | raise HTTPNotFound() |
|
94 | raise HTTPNotFound() | |
95 |
|
95 | |||
96 | def __get_filenode(self, cs, path): |
|
96 | def __get_filenode(self, cs, path): | |
97 | """ |
|
97 | """ | |
98 | Returns file_node or raise HTTP error. |
|
98 | Returns file_node or raise HTTP error. | |
99 |
|
99 | |||
100 | :param cs: given changeset |
|
100 | :param cs: given changeset | |
101 | :param path: path to lookup |
|
101 | :param path: path to lookup | |
102 | """ |
|
102 | """ | |
103 |
|
103 | |||
104 | try: |
|
104 | try: | |
105 | file_node = cs.get_node(path) |
|
105 | file_node = cs.get_node(path) | |
106 | if file_node.is_dir(): |
|
106 | if file_node.is_dir(): | |
107 | raise RepositoryError('given path is a directory') |
|
107 | raise RepositoryError('given path is a directory') | |
108 | except ChangesetDoesNotExistError: |
|
108 | except ChangesetDoesNotExistError: | |
109 | msg = _('Such revision does not exist for this repository') |
|
109 | msg = _('Such revision does not exist for this repository') | |
110 | webutils.flash(msg, category='error') |
|
110 | webutils.flash(msg, category='error') | |
111 | raise HTTPNotFound() |
|
111 | raise HTTPNotFound() | |
112 | except RepositoryError as e: |
|
112 | except RepositoryError as e: | |
113 | webutils.flash(e, category='error') |
|
113 | webutils.flash(e, category='error') | |
114 | raise HTTPNotFound() |
|
114 | raise HTTPNotFound() | |
115 |
|
115 | |||
116 | return file_node |
|
116 | return file_node | |
117 |
|
117 | |||
118 | @LoginRequired(allow_default_user=True) |
|
118 | @LoginRequired(allow_default_user=True) | |
119 | @HasRepoPermissionLevelDecorator('read') |
|
119 | @HasRepoPermissionLevelDecorator('read') | |
120 | def index(self, repo_name, revision, f_path, annotate=False): |
|
120 | def index(self, repo_name, revision, f_path, annotate=False): | |
121 | # redirect to given revision from form if given |
|
121 | # redirect to given revision from form if given | |
122 | post_revision = request.POST.get('at_rev', None) |
|
122 | post_revision = request.POST.get('at_rev', None) | |
123 | if post_revision: |
|
123 | if post_revision: | |
124 | cs = self.__get_cs(post_revision) # FIXME - unused! |
|
124 | cs = self.__get_cs(post_revision) # FIXME - unused! | |
125 |
|
125 | |||
126 | c.revision = revision |
|
126 | c.revision = revision | |
127 | c.changeset = self.__get_cs(revision) |
|
127 | c.changeset = self.__get_cs(revision) | |
128 | c.branch = request.GET.get('branch', None) |
|
128 | c.branch = request.GET.get('branch', None) | |
129 | c.f_path = f_path |
|
129 | c.f_path = f_path | |
130 | c.annotate = annotate |
|
130 | c.annotate = annotate | |
131 | cur_rev = c.changeset.revision |
|
131 | cur_rev = c.changeset.revision | |
132 | # used in files_source.html: |
|
132 | # used in files_source.html: | |
133 | c.cut_off_limit = self.cut_off_limit |
|
133 | c.cut_off_limit = self.cut_off_limit | |
134 | c.fulldiff = request.GET.get('fulldiff') |
|
134 | c.fulldiff = request.GET.get('fulldiff') | |
135 |
|
135 | |||
136 | # prev link |
|
136 | # prev link | |
137 | try: |
|
137 | try: | |
138 | prev_rev = c.db_repo_scm_instance.get_changeset(cur_rev).prev(c.branch) |
|
138 | prev_rev = c.db_repo_scm_instance.get_changeset(cur_rev).prev(c.branch) | |
139 | c.url_prev = url('files_home', repo_name=c.repo_name, |
|
139 | c.url_prev = url('files_home', repo_name=c.repo_name, | |
140 | revision=prev_rev.raw_id, f_path=f_path) |
|
140 | revision=prev_rev.raw_id, f_path=f_path) | |
141 | if c.branch: |
|
141 | if c.branch: | |
142 | c.url_prev += '?branch=%s' % c.branch |
|
142 | c.url_prev += '?branch=%s' % c.branch | |
143 | except (ChangesetDoesNotExistError, VCSError): |
|
143 | except (ChangesetDoesNotExistError, VCSError): | |
144 | c.url_prev = '#' |
|
144 | c.url_prev = '#' | |
145 |
|
145 | |||
146 | # next link |
|
146 | # next link | |
147 | try: |
|
147 | try: | |
148 | next_rev = c.db_repo_scm_instance.get_changeset(cur_rev).next(c.branch) |
|
148 | next_rev = c.db_repo_scm_instance.get_changeset(cur_rev).next(c.branch) | |
149 | c.url_next = url('files_home', repo_name=c.repo_name, |
|
149 | c.url_next = url('files_home', repo_name=c.repo_name, | |
150 | revision=next_rev.raw_id, f_path=f_path) |
|
150 | revision=next_rev.raw_id, f_path=f_path) | |
151 | if c.branch: |
|
151 | if c.branch: | |
152 | c.url_next += '?branch=%s' % c.branch |
|
152 | c.url_next += '?branch=%s' % c.branch | |
153 | except (ChangesetDoesNotExistError, VCSError): |
|
153 | except (ChangesetDoesNotExistError, VCSError): | |
154 | c.url_next = '#' |
|
154 | c.url_next = '#' | |
155 |
|
155 | |||
156 | # files or dirs |
|
156 | # files or dirs | |
157 | try: |
|
157 | try: | |
158 | c.file = c.changeset.get_node(f_path) |
|
158 | c.file = c.changeset.get_node(f_path) | |
159 |
|
159 | |||
160 | if c.file.is_submodule(): |
|
160 | if c.file.is_submodule(): | |
161 | raise HTTPFound(location=c.file.url) |
|
161 | raise HTTPFound(location=c.file.url) | |
162 | elif c.file.is_file(): |
|
162 | elif c.file.is_file(): | |
163 | c.load_full_history = False |
|
163 | c.load_full_history = False | |
164 | # determine if we're on branch head |
|
164 | # determine if we're on branch head | |
165 | _branches = c.db_repo_scm_instance.branches |
|
165 | _branches = c.db_repo_scm_instance.branches | |
166 | c.on_branch_head = revision in _branches or revision in _branches.values() |
|
166 | c.on_branch_head = revision in _branches or revision in _branches.values() | |
167 | _hist = [] |
|
167 | _hist = [] | |
168 | c.file_history = [] |
|
168 | c.file_history = [] | |
169 | if c.load_full_history: |
|
169 | if c.load_full_history: | |
170 | c.file_history, _hist = self._get_node_history(c.changeset, f_path) |
|
170 | c.file_history, _hist = self._get_node_history(c.changeset, f_path) | |
171 |
|
171 | |||
172 | c.authors = [] |
|
172 | c.authors = [] | |
173 | for a in set([x.author for x in _hist]): |
|
173 | for a in set([x.author for x in _hist]): | |
174 | c.authors.append((author_email(a), h.person(a))) |
|
174 | c.authors.append((author_email(a), h.person(a))) | |
175 | else: |
|
175 | else: | |
176 | c.authors = c.file_history = [] |
|
176 | c.authors = c.file_history = [] | |
177 | except RepositoryError as e: |
|
177 | except RepositoryError as e: | |
178 | webutils.flash(e, category='error') |
|
178 | webutils.flash(e, category='error') | |
179 | raise HTTPNotFound() |
|
179 | raise HTTPNotFound() | |
180 |
|
180 | |||
181 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
181 | if request.environ.get('HTTP_X_PARTIAL_XHR'): | |
182 | return render('files/files_ypjax.html') |
|
182 | return base.render('files/files_ypjax.html') | |
183 |
|
183 | |||
184 | # TODO: tags and bookmarks? |
|
184 | # TODO: tags and bookmarks? | |
185 | c.revision_options = [(c.changeset.raw_id, |
|
185 | c.revision_options = [(c.changeset.raw_id, | |
186 | _('%s at %s') % (b, c.changeset.short_id)) for b in c.changeset.branches] + \ |
|
186 | _('%s at %s') % (b, c.changeset.short_id)) for b in c.changeset.branches] + \ | |
187 | [(n, b) for b, n in c.db_repo_scm_instance.branches.items()] |
|
187 | [(n, b) for b, n in c.db_repo_scm_instance.branches.items()] | |
188 | if c.db_repo_scm_instance.closed_branches: |
|
188 | if c.db_repo_scm_instance.closed_branches: | |
189 | prefix = _('(closed)') + ' ' |
|
189 | prefix = _('(closed)') + ' ' | |
190 | c.revision_options += [('-', '-')] + \ |
|
190 | c.revision_options += [('-', '-')] + \ | |
191 | [(n, prefix + b) for b, n in c.db_repo_scm_instance.closed_branches.items()] |
|
191 | [(n, prefix + b) for b, n in c.db_repo_scm_instance.closed_branches.items()] | |
192 |
|
192 | |||
193 | return render('files/files.html') |
|
193 | return base.render('files/files.html') | |
194 |
|
194 | |||
195 | @LoginRequired(allow_default_user=True) |
|
195 | @LoginRequired(allow_default_user=True) | |
196 | @HasRepoPermissionLevelDecorator('read') |
|
196 | @HasRepoPermissionLevelDecorator('read') | |
197 | @jsonify |
|
197 | @base.jsonify | |
198 | def history(self, repo_name, revision, f_path): |
|
198 | def history(self, repo_name, revision, f_path): | |
199 | changeset = self.__get_cs(revision) |
|
199 | changeset = self.__get_cs(revision) | |
200 | _file = changeset.get_node(f_path) |
|
200 | _file = changeset.get_node(f_path) | |
201 | if _file.is_file(): |
|
201 | if _file.is_file(): | |
202 | file_history, _hist = self._get_node_history(changeset, f_path) |
|
202 | file_history, _hist = self._get_node_history(changeset, f_path) | |
203 |
|
203 | |||
204 | res = [] |
|
204 | res = [] | |
205 | for obj in file_history: |
|
205 | for obj in file_history: | |
206 | res.append({ |
|
206 | res.append({ | |
207 | 'text': obj[1], |
|
207 | 'text': obj[1], | |
208 | 'children': [{'id': o[0], 'text': o[1]} for o in obj[0]] |
|
208 | 'children': [{'id': o[0], 'text': o[1]} for o in obj[0]] | |
209 | }) |
|
209 | }) | |
210 |
|
210 | |||
211 | data = { |
|
211 | data = { | |
212 | 'more': False, |
|
212 | 'more': False, | |
213 | 'results': res |
|
213 | 'results': res | |
214 | } |
|
214 | } | |
215 | return data |
|
215 | return data | |
216 |
|
216 | |||
217 | @LoginRequired(allow_default_user=True) |
|
217 | @LoginRequired(allow_default_user=True) | |
218 | @HasRepoPermissionLevelDecorator('read') |
|
218 | @HasRepoPermissionLevelDecorator('read') | |
219 | def authors(self, repo_name, revision, f_path): |
|
219 | def authors(self, repo_name, revision, f_path): | |
220 | changeset = self.__get_cs(revision) |
|
220 | changeset = self.__get_cs(revision) | |
221 | _file = changeset.get_node(f_path) |
|
221 | _file = changeset.get_node(f_path) | |
222 | if _file.is_file(): |
|
222 | if _file.is_file(): | |
223 | file_history, _hist = self._get_node_history(changeset, f_path) |
|
223 | file_history, _hist = self._get_node_history(changeset, f_path) | |
224 | c.authors = [] |
|
224 | c.authors = [] | |
225 | for a in set([x.author for x in _hist]): |
|
225 | for a in set([x.author for x in _hist]): | |
226 | c.authors.append((author_email(a), h.person(a))) |
|
226 | c.authors.append((author_email(a), h.person(a))) | |
227 | return render('files/files_history_box.html') |
|
227 | return base.render('files/files_history_box.html') | |
228 |
|
228 | |||
229 | @LoginRequired(allow_default_user=True) |
|
229 | @LoginRequired(allow_default_user=True) | |
230 | @HasRepoPermissionLevelDecorator('read') |
|
230 | @HasRepoPermissionLevelDecorator('read') | |
231 | def rawfile(self, repo_name, revision, f_path): |
|
231 | def rawfile(self, repo_name, revision, f_path): | |
232 | cs = self.__get_cs(revision) |
|
232 | cs = self.__get_cs(revision) | |
233 | file_node = self.__get_filenode(cs, f_path) |
|
233 | file_node = self.__get_filenode(cs, f_path) | |
234 |
|
234 | |||
235 | response.content_disposition = \ |
|
235 | response.content_disposition = \ | |
236 | 'attachment; filename=%s' % f_path.split(kallithea.URL_SEP)[-1] |
|
236 | 'attachment; filename=%s' % f_path.split(kallithea.URL_SEP)[-1] | |
237 |
|
237 | |||
238 | response.content_type = file_node.mimetype |
|
238 | response.content_type = file_node.mimetype | |
239 | return file_node.content |
|
239 | return file_node.content | |
240 |
|
240 | |||
241 | @LoginRequired(allow_default_user=True) |
|
241 | @LoginRequired(allow_default_user=True) | |
242 | @HasRepoPermissionLevelDecorator('read') |
|
242 | @HasRepoPermissionLevelDecorator('read') | |
243 | def raw(self, repo_name, revision, f_path): |
|
243 | def raw(self, repo_name, revision, f_path): | |
244 | cs = self.__get_cs(revision) |
|
244 | cs = self.__get_cs(revision) | |
245 | file_node = self.__get_filenode(cs, f_path) |
|
245 | file_node = self.__get_filenode(cs, f_path) | |
246 |
|
246 | |||
247 | raw_mimetype_mapping = { |
|
247 | raw_mimetype_mapping = { | |
248 | # map original mimetype to a mimetype used for "show as raw" |
|
248 | # map original mimetype to a mimetype used for "show as raw" | |
249 | # you can also provide a content-disposition to override the |
|
249 | # you can also provide a content-disposition to override the | |
250 | # default "attachment" disposition. |
|
250 | # default "attachment" disposition. | |
251 | # orig_type: (new_type, new_dispo) |
|
251 | # orig_type: (new_type, new_dispo) | |
252 |
|
252 | |||
253 | # show images inline: |
|
253 | # show images inline: | |
254 | 'image/x-icon': ('image/x-icon', 'inline'), |
|
254 | 'image/x-icon': ('image/x-icon', 'inline'), | |
255 | 'image/png': ('image/png', 'inline'), |
|
255 | 'image/png': ('image/png', 'inline'), | |
256 | 'image/gif': ('image/gif', 'inline'), |
|
256 | 'image/gif': ('image/gif', 'inline'), | |
257 | 'image/jpeg': ('image/jpeg', 'inline'), |
|
257 | 'image/jpeg': ('image/jpeg', 'inline'), | |
258 | 'image/svg+xml': ('image/svg+xml', 'inline'), |
|
258 | 'image/svg+xml': ('image/svg+xml', 'inline'), | |
259 | } |
|
259 | } | |
260 |
|
260 | |||
261 | mimetype = file_node.mimetype |
|
261 | mimetype = file_node.mimetype | |
262 | try: |
|
262 | try: | |
263 | mimetype, dispo = raw_mimetype_mapping[mimetype] |
|
263 | mimetype, dispo = raw_mimetype_mapping[mimetype] | |
264 | except KeyError: |
|
264 | except KeyError: | |
265 | # we don't know anything special about this, handle it safely |
|
265 | # we don't know anything special about this, handle it safely | |
266 | if file_node.is_binary: |
|
266 | if file_node.is_binary: | |
267 | # do same as download raw for binary files |
|
267 | # do same as download raw for binary files | |
268 | mimetype, dispo = 'application/octet-stream', 'attachment' |
|
268 | mimetype, dispo = 'application/octet-stream', 'attachment' | |
269 | else: |
|
269 | else: | |
270 | # do not just use the original mimetype, but force text/plain, |
|
270 | # do not just use the original mimetype, but force text/plain, | |
271 | # otherwise it would serve text/html and that might be unsafe. |
|
271 | # otherwise it would serve text/html and that might be unsafe. | |
272 | # Note: underlying vcs library fakes text/plain mimetype if the |
|
272 | # Note: underlying vcs library fakes text/plain mimetype if the | |
273 | # mimetype can not be determined and it thinks it is not |
|
273 | # mimetype can not be determined and it thinks it is not | |
274 | # binary. This might lead to erroneous text display in some |
|
274 | # binary. This might lead to erroneous text display in some | |
275 | # cases, but helps in other cases, like with text files |
|
275 | # cases, but helps in other cases, like with text files | |
276 | # without extension. |
|
276 | # without extension. | |
277 | mimetype, dispo = 'text/plain', 'inline' |
|
277 | mimetype, dispo = 'text/plain', 'inline' | |
278 |
|
278 | |||
279 | if dispo == 'attachment': |
|
279 | if dispo == 'attachment': | |
280 | dispo = 'attachment; filename=%s' % f_path.split(os.sep)[-1] |
|
280 | dispo = 'attachment; filename=%s' % f_path.split(os.sep)[-1] | |
281 |
|
281 | |||
282 | response.content_disposition = dispo |
|
282 | response.content_disposition = dispo | |
283 | response.content_type = mimetype |
|
283 | response.content_type = mimetype | |
284 | return file_node.content |
|
284 | return file_node.content | |
285 |
|
285 | |||
286 | @LoginRequired() |
|
286 | @LoginRequired() | |
287 | @HasRepoPermissionLevelDecorator('write') |
|
287 | @HasRepoPermissionLevelDecorator('write') | |
288 | def delete(self, repo_name, revision, f_path): |
|
288 | def delete(self, repo_name, revision, f_path): | |
289 | repo = c.db_repo |
|
289 | repo = c.db_repo | |
290 | # check if revision is a branch identifier - basically we cannot |
|
290 | # check if revision is a branch identifier - basically we cannot | |
291 | # create multiple heads via file editing |
|
291 | # create multiple heads via file editing | |
292 | _branches = repo.scm_instance.branches |
|
292 | _branches = repo.scm_instance.branches | |
293 | # check if revision is a branch name or branch hash |
|
293 | # check if revision is a branch name or branch hash | |
294 | if revision not in _branches and revision not in _branches.values(): |
|
294 | if revision not in _branches and revision not in _branches.values(): | |
295 | webutils.flash(_('You can only delete files with revision ' |
|
295 | webutils.flash(_('You can only delete files with revision ' | |
296 | 'being a valid branch'), category='warning') |
|
296 | 'being a valid branch'), category='warning') | |
297 | raise HTTPFound(location=webutils.url('files_home', |
|
297 | raise HTTPFound(location=webutils.url('files_home', | |
298 | repo_name=repo_name, revision='tip', |
|
298 | repo_name=repo_name, revision='tip', | |
299 | f_path=f_path)) |
|
299 | f_path=f_path)) | |
300 |
|
300 | |||
301 | r_post = request.POST |
|
301 | r_post = request.POST | |
302 |
|
302 | |||
303 | c.cs = self.__get_cs(revision) |
|
303 | c.cs = self.__get_cs(revision) | |
304 | c.file = self.__get_filenode(c.cs, f_path) |
|
304 | c.file = self.__get_filenode(c.cs, f_path) | |
305 |
|
305 | |||
306 | c.default_message = _('Deleted file %s via Kallithea') % (f_path) |
|
306 | c.default_message = _('Deleted file %s via Kallithea') % (f_path) | |
307 | c.f_path = f_path |
|
307 | c.f_path = f_path | |
308 | node_path = f_path |
|
308 | node_path = f_path | |
309 | author = request.authuser.full_contact |
|
309 | author = request.authuser.full_contact | |
310 |
|
310 | |||
311 | if r_post: |
|
311 | if r_post: | |
312 | message = r_post.get('message') or c.default_message |
|
312 | message = r_post.get('message') or c.default_message | |
313 |
|
313 | |||
314 | try: |
|
314 | try: | |
315 | nodes = { |
|
315 | nodes = { | |
316 | node_path: { |
|
316 | node_path: { | |
317 | 'content': '' |
|
317 | 'content': '' | |
318 | } |
|
318 | } | |
319 | } |
|
319 | } | |
320 | self.scm_model.delete_nodes( |
|
320 | self.scm_model.delete_nodes( | |
321 | user=request.authuser.user_id, |
|
321 | user=request.authuser.user_id, | |
322 | ip_addr=request.ip_addr, |
|
322 | ip_addr=request.ip_addr, | |
323 | repo=c.db_repo, |
|
323 | repo=c.db_repo, | |
324 | message=message, |
|
324 | message=message, | |
325 | nodes=nodes, |
|
325 | nodes=nodes, | |
326 | parent_cs=c.cs, |
|
326 | parent_cs=c.cs, | |
327 | author=author, |
|
327 | author=author, | |
328 | ) |
|
328 | ) | |
329 |
|
329 | |||
330 | webutils.flash(_('Successfully deleted file %s') % f_path, |
|
330 | webutils.flash(_('Successfully deleted file %s') % f_path, | |
331 | category='success') |
|
331 | category='success') | |
332 | except Exception: |
|
332 | except Exception: | |
333 | log.error(traceback.format_exc()) |
|
333 | log.error(traceback.format_exc()) | |
334 | webutils.flash(_('Error occurred during commit'), category='error') |
|
334 | webutils.flash(_('Error occurred during commit'), category='error') | |
335 | raise HTTPFound(location=url('changeset_home', |
|
335 | raise HTTPFound(location=url('changeset_home', | |
336 | repo_name=c.repo_name, revision='tip')) |
|
336 | repo_name=c.repo_name, revision='tip')) | |
337 |
|
337 | |||
338 | return render('files/files_delete.html') |
|
338 | return base.render('files/files_delete.html') | |
339 |
|
339 | |||
340 | @LoginRequired() |
|
340 | @LoginRequired() | |
341 | @HasRepoPermissionLevelDecorator('write') |
|
341 | @HasRepoPermissionLevelDecorator('write') | |
342 | def edit(self, repo_name, revision, f_path): |
|
342 | def edit(self, repo_name, revision, f_path): | |
343 | repo = c.db_repo |
|
343 | repo = c.db_repo | |
344 | # check if revision is a branch identifier - basically we cannot |
|
344 | # check if revision is a branch identifier - basically we cannot | |
345 | # create multiple heads via file editing |
|
345 | # create multiple heads via file editing | |
346 | _branches = repo.scm_instance.branches |
|
346 | _branches = repo.scm_instance.branches | |
347 | # check if revision is a branch name or branch hash |
|
347 | # check if revision is a branch name or branch hash | |
348 | if revision not in _branches and revision not in _branches.values(): |
|
348 | if revision not in _branches and revision not in _branches.values(): | |
349 | webutils.flash(_('You can only edit files with revision ' |
|
349 | webutils.flash(_('You can only edit files with revision ' | |
350 | 'being a valid branch'), category='warning') |
|
350 | 'being a valid branch'), category='warning') | |
351 | raise HTTPFound(location=webutils.url('files_home', |
|
351 | raise HTTPFound(location=webutils.url('files_home', | |
352 | repo_name=repo_name, revision='tip', |
|
352 | repo_name=repo_name, revision='tip', | |
353 | f_path=f_path)) |
|
353 | f_path=f_path)) | |
354 |
|
354 | |||
355 | r_post = request.POST |
|
355 | r_post = request.POST | |
356 |
|
356 | |||
357 | c.cs = self.__get_cs(revision) |
|
357 | c.cs = self.__get_cs(revision) | |
358 | c.file = self.__get_filenode(c.cs, f_path) |
|
358 | c.file = self.__get_filenode(c.cs, f_path) | |
359 |
|
359 | |||
360 | if c.file.is_binary: |
|
360 | if c.file.is_binary: | |
361 | raise HTTPFound(location=url('files_home', repo_name=c.repo_name, |
|
361 | raise HTTPFound(location=url('files_home', repo_name=c.repo_name, | |
362 | revision=c.cs.raw_id, f_path=f_path)) |
|
362 | revision=c.cs.raw_id, f_path=f_path)) | |
363 | c.default_message = _('Edited file %s via Kallithea') % (f_path) |
|
363 | c.default_message = _('Edited file %s via Kallithea') % (f_path) | |
364 | c.f_path = f_path |
|
364 | c.f_path = f_path | |
365 |
|
365 | |||
366 | if r_post: |
|
366 | if r_post: | |
367 | old_content = safe_str(c.file.content) |
|
367 | old_content = safe_str(c.file.content) | |
368 | sl = old_content.splitlines(1) |
|
368 | sl = old_content.splitlines(1) | |
369 | first_line = sl[0] if sl else '' |
|
369 | first_line = sl[0] if sl else '' | |
370 | # modes: 0 - Unix, 1 - Mac, 2 - DOS |
|
370 | # modes: 0 - Unix, 1 - Mac, 2 - DOS | |
371 | mode = detect_mode(first_line, 0) |
|
371 | mode = detect_mode(first_line, 0) | |
372 | content = convert_line_endings(r_post.get('content', ''), mode) |
|
372 | content = convert_line_endings(r_post.get('content', ''), mode) | |
373 |
|
373 | |||
374 | message = r_post.get('message') or c.default_message |
|
374 | message = r_post.get('message') or c.default_message | |
375 | author = request.authuser.full_contact |
|
375 | author = request.authuser.full_contact | |
376 |
|
376 | |||
377 | if content == old_content: |
|
377 | if content == old_content: | |
378 | webutils.flash(_('No changes'), category='warning') |
|
378 | webutils.flash(_('No changes'), category='warning') | |
379 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
379 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, | |
380 | revision='tip')) |
|
380 | revision='tip')) | |
381 | try: |
|
381 | try: | |
382 | self.scm_model.commit_change(repo=c.db_repo_scm_instance, |
|
382 | self.scm_model.commit_change(repo=c.db_repo_scm_instance, | |
383 | repo_name=repo_name, cs=c.cs, |
|
383 | repo_name=repo_name, cs=c.cs, | |
384 | user=request.authuser.user_id, |
|
384 | user=request.authuser.user_id, | |
385 | ip_addr=request.ip_addr, |
|
385 | ip_addr=request.ip_addr, | |
386 | author=author, message=message, |
|
386 | author=author, message=message, | |
387 | content=content, f_path=f_path) |
|
387 | content=content, f_path=f_path) | |
388 | webutils.flash(_('Successfully committed to %s') % f_path, |
|
388 | webutils.flash(_('Successfully committed to %s') % f_path, | |
389 | category='success') |
|
389 | category='success') | |
390 | except Exception: |
|
390 | except Exception: | |
391 | log.error(traceback.format_exc()) |
|
391 | log.error(traceback.format_exc()) | |
392 | webutils.flash(_('Error occurred during commit'), category='error') |
|
392 | webutils.flash(_('Error occurred during commit'), category='error') | |
393 | raise HTTPFound(location=url('changeset_home', |
|
393 | raise HTTPFound(location=url('changeset_home', | |
394 | repo_name=c.repo_name, revision='tip')) |
|
394 | repo_name=c.repo_name, revision='tip')) | |
395 |
|
395 | |||
396 | return render('files/files_edit.html') |
|
396 | return base.render('files/files_edit.html') | |
397 |
|
397 | |||
398 | @LoginRequired() |
|
398 | @LoginRequired() | |
399 | @HasRepoPermissionLevelDecorator('write') |
|
399 | @HasRepoPermissionLevelDecorator('write') | |
400 | def add(self, repo_name, revision, f_path): |
|
400 | def add(self, repo_name, revision, f_path): | |
401 |
|
401 | |||
402 | repo = c.db_repo |
|
402 | repo = c.db_repo | |
403 | r_post = request.POST |
|
403 | r_post = request.POST | |
404 | c.cs = self.__get_cs(revision, silent_empty=True) |
|
404 | c.cs = self.__get_cs(revision, silent_empty=True) | |
405 | if c.cs is None: |
|
405 | if c.cs is None: | |
406 | c.cs = EmptyChangeset(alias=c.db_repo_scm_instance.alias) |
|
406 | c.cs = EmptyChangeset(alias=c.db_repo_scm_instance.alias) | |
407 | c.default_message = (_('Added file via Kallithea')) |
|
407 | c.default_message = (_('Added file via Kallithea')) | |
408 | c.f_path = f_path |
|
408 | c.f_path = f_path | |
409 |
|
409 | |||
410 | if r_post: |
|
410 | if r_post: | |
411 | unix_mode = 0 |
|
411 | unix_mode = 0 | |
412 | content = convert_line_endings(r_post.get('content', ''), unix_mode) |
|
412 | content = convert_line_endings(r_post.get('content', ''), unix_mode) | |
413 |
|
413 | |||
414 | message = r_post.get('message') or c.default_message |
|
414 | message = r_post.get('message') or c.default_message | |
415 | filename = r_post.get('filename') |
|
415 | filename = r_post.get('filename') | |
416 | location = r_post.get('location', '') |
|
416 | location = r_post.get('location', '') | |
417 | file_obj = r_post.get('upload_file', None) |
|
417 | file_obj = r_post.get('upload_file', None) | |
418 |
|
418 | |||
419 | if file_obj is not None and hasattr(file_obj, 'filename'): |
|
419 | if file_obj is not None and hasattr(file_obj, 'filename'): | |
420 | filename = file_obj.filename |
|
420 | filename = file_obj.filename | |
421 | content = file_obj.file |
|
421 | content = file_obj.file | |
422 |
|
422 | |||
423 | if hasattr(content, 'file'): |
|
423 | if hasattr(content, 'file'): | |
424 | # non-POSIX systems store the real file under the file attr |
|
424 | # non-POSIX systems store the real file under the file attr | |
425 | content = content.file |
|
425 | content = content.file | |
426 |
|
426 | |||
427 | if not content: |
|
427 | if not content: | |
428 | webutils.flash(_('No content'), category='warning') |
|
428 | webutils.flash(_('No content'), category='warning') | |
429 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
429 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, | |
430 | revision='tip')) |
|
430 | revision='tip')) | |
431 | if not filename: |
|
431 | if not filename: | |
432 | webutils.flash(_('No filename'), category='warning') |
|
432 | webutils.flash(_('No filename'), category='warning') | |
433 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
433 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, | |
434 | revision='tip')) |
|
434 | revision='tip')) | |
435 | # strip any path components from the filename, keep only the basename |
|
435 | # strip any path components from the filename, keep only the basename | |
436 | filename = os.path.basename(filename) |
|
436 | filename = os.path.basename(filename) | |
437 | node_path = posixpath.join(location, filename) |
|
437 | node_path = posixpath.join(location, filename) | |
438 | author = request.authuser.full_contact |
|
438 | author = request.authuser.full_contact | |
439 |
|
439 | |||
440 | try: |
|
440 | try: | |
441 | nodes = { |
|
441 | nodes = { | |
442 | node_path: { |
|
442 | node_path: { | |
443 | 'content': content |
|
443 | 'content': content | |
444 | } |
|
444 | } | |
445 | } |
|
445 | } | |
446 | self.scm_model.create_nodes( |
|
446 | self.scm_model.create_nodes( | |
447 | user=request.authuser.user_id, |
|
447 | user=request.authuser.user_id, | |
448 | ip_addr=request.ip_addr, |
|
448 | ip_addr=request.ip_addr, | |
449 | repo=c.db_repo, |
|
449 | repo=c.db_repo, | |
450 | message=message, |
|
450 | message=message, | |
451 | nodes=nodes, |
|
451 | nodes=nodes, | |
452 | parent_cs=c.cs, |
|
452 | parent_cs=c.cs, | |
453 | author=author, |
|
453 | author=author, | |
454 | ) |
|
454 | ) | |
455 |
|
455 | |||
456 | webutils.flash(_('Successfully committed to %s') % node_path, |
|
456 | webutils.flash(_('Successfully committed to %s') % node_path, | |
457 | category='success') |
|
457 | category='success') | |
458 | except NonRelativePathError as e: |
|
458 | except NonRelativePathError as e: | |
459 | webutils.flash(_('Location must be relative path and must not ' |
|
459 | webutils.flash(_('Location must be relative path and must not ' | |
460 | 'contain .. in path'), category='warning') |
|
460 | 'contain .. in path'), category='warning') | |
461 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
461 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, | |
462 | revision='tip')) |
|
462 | revision='tip')) | |
463 | except (NodeError, NodeAlreadyExistsError) as e: |
|
463 | except (NodeError, NodeAlreadyExistsError) as e: | |
464 | webutils.flash(_(e), category='error') |
|
464 | webutils.flash(_(e), category='error') | |
465 | except Exception: |
|
465 | except Exception: | |
466 | log.error(traceback.format_exc()) |
|
466 | log.error(traceback.format_exc()) | |
467 | webutils.flash(_('Error occurred during commit'), category='error') |
|
467 | webutils.flash(_('Error occurred during commit'), category='error') | |
468 | raise HTTPFound(location=url('changeset_home', |
|
468 | raise HTTPFound(location=url('changeset_home', | |
469 | repo_name=c.repo_name, revision='tip')) |
|
469 | repo_name=c.repo_name, revision='tip')) | |
470 |
|
470 | |||
471 | return render('files/files_add.html') |
|
471 | return base.render('files/files_add.html') | |
472 |
|
472 | |||
473 | @LoginRequired(allow_default_user=True) |
|
473 | @LoginRequired(allow_default_user=True) | |
474 | @HasRepoPermissionLevelDecorator('read') |
|
474 | @HasRepoPermissionLevelDecorator('read') | |
475 | def archivefile(self, repo_name, fname): |
|
475 | def archivefile(self, repo_name, fname): | |
476 | fileformat = None |
|
476 | fileformat = None | |
477 | revision = None |
|
477 | revision = None | |
478 | ext = None |
|
478 | ext = None | |
479 | subrepos = request.GET.get('subrepos') == 'true' |
|
479 | subrepos = request.GET.get('subrepos') == 'true' | |
480 |
|
480 | |||
481 | for a_type, ext_data in settings.ARCHIVE_SPECS.items(): |
|
481 | for a_type, ext_data in settings.ARCHIVE_SPECS.items(): | |
482 | archive_spec = fname.split(ext_data[1]) |
|
482 | archive_spec = fname.split(ext_data[1]) | |
483 | if len(archive_spec) == 2 and archive_spec[1] == '': |
|
483 | if len(archive_spec) == 2 and archive_spec[1] == '': | |
484 | fileformat = a_type or ext_data[1] |
|
484 | fileformat = a_type or ext_data[1] | |
485 | revision = archive_spec[0] |
|
485 | revision = archive_spec[0] | |
486 | ext = ext_data[1] |
|
486 | ext = ext_data[1] | |
487 |
|
487 | |||
488 | try: |
|
488 | try: | |
489 | dbrepo = RepoModel().get_by_repo_name(repo_name) |
|
489 | dbrepo = RepoModel().get_by_repo_name(repo_name) | |
490 | if not dbrepo.enable_downloads: |
|
490 | if not dbrepo.enable_downloads: | |
491 | return _('Downloads disabled') # TODO: do something else? |
|
491 | return _('Downloads disabled') # TODO: do something else? | |
492 |
|
492 | |||
493 | if c.db_repo_scm_instance.alias == 'hg': |
|
493 | if c.db_repo_scm_instance.alias == 'hg': | |
494 | # patch and reset hooks section of UI config to not run any |
|
494 | # patch and reset hooks section of UI config to not run any | |
495 | # hooks on fetching archives with subrepos |
|
495 | # hooks on fetching archives with subrepos | |
496 | for k, v in c.db_repo_scm_instance._repo.ui.configitems('hooks'): |
|
496 | for k, v in c.db_repo_scm_instance._repo.ui.configitems('hooks'): | |
497 | c.db_repo_scm_instance._repo.ui.setconfig('hooks', k, None) |
|
497 | c.db_repo_scm_instance._repo.ui.setconfig('hooks', k, None) | |
498 |
|
498 | |||
499 | cs = c.db_repo_scm_instance.get_changeset(revision) |
|
499 | cs = c.db_repo_scm_instance.get_changeset(revision) | |
500 | content_type = settings.ARCHIVE_SPECS[fileformat][0] |
|
500 | content_type = settings.ARCHIVE_SPECS[fileformat][0] | |
501 | except ChangesetDoesNotExistError: |
|
501 | except ChangesetDoesNotExistError: | |
502 | return _('Unknown revision %s') % revision |
|
502 | return _('Unknown revision %s') % revision | |
503 | except EmptyRepositoryError: |
|
503 | except EmptyRepositoryError: | |
504 | return _('Empty repository') |
|
504 | return _('Empty repository') | |
505 | except (ImproperArchiveTypeError, KeyError): |
|
505 | except (ImproperArchiveTypeError, KeyError): | |
506 | return _('Unknown archive type') |
|
506 | return _('Unknown archive type') | |
507 |
|
507 | |||
508 | rev_name = cs.raw_id[:12] |
|
508 | rev_name = cs.raw_id[:12] | |
509 | archive_name = '%s-%s%s' % (repo_name.replace('/', '_'), rev_name, ext) |
|
509 | archive_name = '%s-%s%s' % (repo_name.replace('/', '_'), rev_name, ext) | |
510 |
|
510 | |||
511 | archive_path = None |
|
511 | archive_path = None | |
512 | cached_archive_path = None |
|
512 | cached_archive_path = None | |
513 | archive_cache_dir = kallithea.CONFIG.get('archive_cache_dir') |
|
513 | archive_cache_dir = kallithea.CONFIG.get('archive_cache_dir') | |
514 | if archive_cache_dir and not subrepos: # TODO: subrepo caching? |
|
514 | if archive_cache_dir and not subrepos: # TODO: subrepo caching? | |
515 | if not os.path.isdir(archive_cache_dir): |
|
515 | if not os.path.isdir(archive_cache_dir): | |
516 | os.makedirs(archive_cache_dir) |
|
516 | os.makedirs(archive_cache_dir) | |
517 | cached_archive_path = os.path.join(archive_cache_dir, archive_name) |
|
517 | cached_archive_path = os.path.join(archive_cache_dir, archive_name) | |
518 | if os.path.isfile(cached_archive_path): |
|
518 | if os.path.isfile(cached_archive_path): | |
519 | log.debug('Found cached archive in %s', cached_archive_path) |
|
519 | log.debug('Found cached archive in %s', cached_archive_path) | |
520 | archive_path = cached_archive_path |
|
520 | archive_path = cached_archive_path | |
521 | else: |
|
521 | else: | |
522 | log.debug('Archive %s is not yet cached', archive_name) |
|
522 | log.debug('Archive %s is not yet cached', archive_name) | |
523 |
|
523 | |||
524 | if archive_path is None: |
|
524 | if archive_path is None: | |
525 | # generate new archive |
|
525 | # generate new archive | |
526 | fd, archive_path = tempfile.mkstemp() |
|
526 | fd, archive_path = tempfile.mkstemp() | |
527 | log.debug('Creating new temp archive in %s', archive_path) |
|
527 | log.debug('Creating new temp archive in %s', archive_path) | |
528 | with os.fdopen(fd, 'wb') as stream: |
|
528 | with os.fdopen(fd, 'wb') as stream: | |
529 | cs.fill_archive(stream=stream, kind=fileformat, subrepos=subrepos) |
|
529 | cs.fill_archive(stream=stream, kind=fileformat, subrepos=subrepos) | |
530 | # stream (and thus fd) has been closed by cs.fill_archive |
|
530 | # stream (and thus fd) has been closed by cs.fill_archive | |
531 | if cached_archive_path is not None: |
|
531 | if cached_archive_path is not None: | |
532 | # we generated the archive - move it to cache |
|
532 | # we generated the archive - move it to cache | |
533 | log.debug('Storing new archive in %s', cached_archive_path) |
|
533 | log.debug('Storing new archive in %s', cached_archive_path) | |
534 | shutil.move(archive_path, cached_archive_path) |
|
534 | shutil.move(archive_path, cached_archive_path) | |
535 | archive_path = cached_archive_path |
|
535 | archive_path = cached_archive_path | |
536 |
|
536 | |||
537 | def get_chunked_archive(archive_path): |
|
537 | def get_chunked_archive(archive_path): | |
538 | stream = open(archive_path, 'rb') |
|
538 | stream = open(archive_path, 'rb') | |
539 | while True: |
|
539 | while True: | |
540 | data = stream.read(16 * 1024) |
|
540 | data = stream.read(16 * 1024) | |
541 | if not data: |
|
541 | if not data: | |
542 | break |
|
542 | break | |
543 | yield data |
|
543 | yield data | |
544 | stream.close() |
|
544 | stream.close() | |
545 | if archive_path != cached_archive_path: |
|
545 | if archive_path != cached_archive_path: | |
546 | log.debug('Destroying temp archive %s', archive_path) |
|
546 | log.debug('Destroying temp archive %s', archive_path) | |
547 | os.remove(archive_path) |
|
547 | os.remove(archive_path) | |
548 |
|
548 | |||
549 | userlog.action_logger(user=request.authuser, |
|
549 | userlog.action_logger(user=request.authuser, | |
550 | action='user_downloaded_archive:%s' % (archive_name), |
|
550 | action='user_downloaded_archive:%s' % (archive_name), | |
551 | repo=repo_name, ipaddr=request.ip_addr, commit=True) |
|
551 | repo=repo_name, ipaddr=request.ip_addr, commit=True) | |
552 |
|
552 | |||
553 | response.content_disposition = str('attachment; filename=%s' % (archive_name)) |
|
553 | response.content_disposition = str('attachment; filename=%s' % (archive_name)) | |
554 | response.content_type = str(content_type) |
|
554 | response.content_type = str(content_type) | |
555 | return get_chunked_archive(archive_path) |
|
555 | return get_chunked_archive(archive_path) | |
556 |
|
556 | |||
557 | @LoginRequired(allow_default_user=True) |
|
557 | @LoginRequired(allow_default_user=True) | |
558 | @HasRepoPermissionLevelDecorator('read') |
|
558 | @HasRepoPermissionLevelDecorator('read') | |
559 | def diff(self, repo_name, f_path): |
|
559 | def diff(self, repo_name, f_path): | |
560 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
560 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) | |
561 | diff_context_size = h.get_diff_context_size(request.GET) |
|
561 | diff_context_size = h.get_diff_context_size(request.GET) | |
562 | diff2 = request.GET.get('diff2', '') |
|
562 | diff2 = request.GET.get('diff2', '') | |
563 | diff1 = request.GET.get('diff1', '') or diff2 |
|
563 | diff1 = request.GET.get('diff1', '') or diff2 | |
564 | c.action = request.GET.get('diff') |
|
564 | c.action = request.GET.get('diff') | |
565 | c.no_changes = diff1 == diff2 |
|
565 | c.no_changes = diff1 == diff2 | |
566 | c.f_path = f_path |
|
566 | c.f_path = f_path | |
567 | c.big_diff = False |
|
567 | c.big_diff = False | |
568 | fulldiff = request.GET.get('fulldiff') |
|
568 | fulldiff = request.GET.get('fulldiff') | |
569 | c.changes = OrderedDict() |
|
569 | c.changes = OrderedDict() | |
570 | c.changes[diff2] = [] |
|
570 | c.changes[diff2] = [] | |
571 |
|
571 | |||
572 | # special case: if we only want to show a revision, it is handled here |
|
572 | # special case: if we only want to show a revision, it is handled here | |
573 | # to reduce JS and callbacks |
|
573 | # to reduce JS and callbacks | |
574 |
|
574 | |||
575 | if request.GET.get('show_rev'): |
|
575 | if request.GET.get('show_rev'): | |
576 | if asbool(request.GET.get('annotate', 'False')): |
|
576 | if asbool(request.GET.get('annotate', 'False')): | |
577 | _url = url('files_annotate_home', repo_name=c.repo_name, |
|
577 | _url = url('files_annotate_home', repo_name=c.repo_name, | |
578 | revision=diff1, f_path=c.f_path) |
|
578 | revision=diff1, f_path=c.f_path) | |
579 | else: |
|
579 | else: | |
580 | _url = url('files_home', repo_name=c.repo_name, |
|
580 | _url = url('files_home', repo_name=c.repo_name, | |
581 | revision=diff1, f_path=c.f_path) |
|
581 | revision=diff1, f_path=c.f_path) | |
582 |
|
582 | |||
583 | raise HTTPFound(location=_url) |
|
583 | raise HTTPFound(location=_url) | |
584 | try: |
|
584 | try: | |
585 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
585 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
586 | c.changeset_1 = c.db_repo_scm_instance.get_changeset(diff1) |
|
586 | c.changeset_1 = c.db_repo_scm_instance.get_changeset(diff1) | |
587 | try: |
|
587 | try: | |
588 | node1 = c.changeset_1.get_node(f_path) |
|
588 | node1 = c.changeset_1.get_node(f_path) | |
589 | if node1.is_dir(): |
|
589 | if node1.is_dir(): | |
590 | raise NodeError('%s path is a %s not a file' |
|
590 | raise NodeError('%s path is a %s not a file' | |
591 | % (node1, type(node1))) |
|
591 | % (node1, type(node1))) | |
592 | except NodeDoesNotExistError: |
|
592 | except NodeDoesNotExistError: | |
593 | c.changeset_1 = EmptyChangeset(cs=diff1, |
|
593 | c.changeset_1 = EmptyChangeset(cs=diff1, | |
594 | revision=c.changeset_1.revision, |
|
594 | revision=c.changeset_1.revision, | |
595 | repo=c.db_repo_scm_instance) |
|
595 | repo=c.db_repo_scm_instance) | |
596 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
596 | node1 = FileNode(f_path, '', changeset=c.changeset_1) | |
597 | else: |
|
597 | else: | |
598 | c.changeset_1 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
598 | c.changeset_1 = EmptyChangeset(repo=c.db_repo_scm_instance) | |
599 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
599 | node1 = FileNode(f_path, '', changeset=c.changeset_1) | |
600 |
|
600 | |||
601 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
601 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
602 | c.changeset_2 = c.db_repo_scm_instance.get_changeset(diff2) |
|
602 | c.changeset_2 = c.db_repo_scm_instance.get_changeset(diff2) | |
603 | try: |
|
603 | try: | |
604 | node2 = c.changeset_2.get_node(f_path) |
|
604 | node2 = c.changeset_2.get_node(f_path) | |
605 | if node2.is_dir(): |
|
605 | if node2.is_dir(): | |
606 | raise NodeError('%s path is a %s not a file' |
|
606 | raise NodeError('%s path is a %s not a file' | |
607 | % (node2, type(node2))) |
|
607 | % (node2, type(node2))) | |
608 | except NodeDoesNotExistError: |
|
608 | except NodeDoesNotExistError: | |
609 | c.changeset_2 = EmptyChangeset(cs=diff2, |
|
609 | c.changeset_2 = EmptyChangeset(cs=diff2, | |
610 | revision=c.changeset_2.revision, |
|
610 | revision=c.changeset_2.revision, | |
611 | repo=c.db_repo_scm_instance) |
|
611 | repo=c.db_repo_scm_instance) | |
612 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
612 | node2 = FileNode(f_path, '', changeset=c.changeset_2) | |
613 | else: |
|
613 | else: | |
614 | c.changeset_2 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
614 | c.changeset_2 = EmptyChangeset(repo=c.db_repo_scm_instance) | |
615 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
615 | node2 = FileNode(f_path, '', changeset=c.changeset_2) | |
616 | except (RepositoryError, NodeError): |
|
616 | except (RepositoryError, NodeError): | |
617 | log.error(traceback.format_exc()) |
|
617 | log.error(traceback.format_exc()) | |
618 | raise HTTPFound(location=url('files_home', repo_name=c.repo_name, |
|
618 | raise HTTPFound(location=url('files_home', repo_name=c.repo_name, | |
619 | f_path=f_path)) |
|
619 | f_path=f_path)) | |
620 |
|
620 | |||
621 | if c.action == 'download': |
|
621 | if c.action == 'download': | |
622 | raw_diff = diffs.get_gitdiff(node1, node2, |
|
622 | raw_diff = diffs.get_gitdiff(node1, node2, | |
623 | ignore_whitespace=ignore_whitespace_diff, |
|
623 | ignore_whitespace=ignore_whitespace_diff, | |
624 | context=diff_context_size) |
|
624 | context=diff_context_size) | |
625 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) |
|
625 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) | |
626 | response.content_type = 'text/plain' |
|
626 | response.content_type = 'text/plain' | |
627 | response.content_disposition = ( |
|
627 | response.content_disposition = ( | |
628 | 'attachment; filename=%s' % diff_name |
|
628 | 'attachment; filename=%s' % diff_name | |
629 | ) |
|
629 | ) | |
630 | return raw_diff |
|
630 | return raw_diff | |
631 |
|
631 | |||
632 | elif c.action == 'raw': |
|
632 | elif c.action == 'raw': | |
633 | raw_diff = diffs.get_gitdiff(node1, node2, |
|
633 | raw_diff = diffs.get_gitdiff(node1, node2, | |
634 | ignore_whitespace=ignore_whitespace_diff, |
|
634 | ignore_whitespace=ignore_whitespace_diff, | |
635 | context=diff_context_size) |
|
635 | context=diff_context_size) | |
636 | response.content_type = 'text/plain' |
|
636 | response.content_type = 'text/plain' | |
637 | return raw_diff |
|
637 | return raw_diff | |
638 |
|
638 | |||
639 | else: |
|
639 | else: | |
640 | fid = h.FID(diff2, node2.path) |
|
640 | fid = h.FID(diff2, node2.path) | |
641 | diff_limit = None if fulldiff else self.cut_off_limit |
|
641 | diff_limit = None if fulldiff else self.cut_off_limit | |
642 | c.a_rev, c.cs_rev, a_path, diff, st, op = diffs.wrapped_diff(filenode_old=node1, |
|
642 | c.a_rev, c.cs_rev, a_path, diff, st, op = diffs.wrapped_diff(filenode_old=node1, | |
643 | filenode_new=node2, |
|
643 | filenode_new=node2, | |
644 | diff_limit=diff_limit, |
|
644 | diff_limit=diff_limit, | |
645 | ignore_whitespace=ignore_whitespace_diff, |
|
645 | ignore_whitespace=ignore_whitespace_diff, | |
646 | line_context=diff_context_size) |
|
646 | line_context=diff_context_size) | |
647 | c.file_diff_data = [(fid, fid, op, a_path, node2.path, diff, st)] |
|
647 | c.file_diff_data = [(fid, fid, op, a_path, node2.path, diff, st)] | |
648 | return render('files/file_diff.html') |
|
648 | return base.render('files/file_diff.html') | |
649 |
|
649 | |||
650 | @LoginRequired(allow_default_user=True) |
|
650 | @LoginRequired(allow_default_user=True) | |
651 | @HasRepoPermissionLevelDecorator('read') |
|
651 | @HasRepoPermissionLevelDecorator('read') | |
652 | def diff_2way(self, repo_name, f_path): |
|
652 | def diff_2way(self, repo_name, f_path): | |
653 | diff1 = request.GET.get('diff1', '') |
|
653 | diff1 = request.GET.get('diff1', '') | |
654 | diff2 = request.GET.get('diff2', '') |
|
654 | diff2 = request.GET.get('diff2', '') | |
655 | try: |
|
655 | try: | |
656 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
656 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
657 | c.changeset_1 = c.db_repo_scm_instance.get_changeset(diff1) |
|
657 | c.changeset_1 = c.db_repo_scm_instance.get_changeset(diff1) | |
658 | try: |
|
658 | try: | |
659 | node1 = c.changeset_1.get_node(f_path) |
|
659 | node1 = c.changeset_1.get_node(f_path) | |
660 | if node1.is_dir(): |
|
660 | if node1.is_dir(): | |
661 | raise NodeError('%s path is a %s not a file' |
|
661 | raise NodeError('%s path is a %s not a file' | |
662 | % (node1, type(node1))) |
|
662 | % (node1, type(node1))) | |
663 | except NodeDoesNotExistError: |
|
663 | except NodeDoesNotExistError: | |
664 | c.changeset_1 = EmptyChangeset(cs=diff1, |
|
664 | c.changeset_1 = EmptyChangeset(cs=diff1, | |
665 | revision=c.changeset_1.revision, |
|
665 | revision=c.changeset_1.revision, | |
666 | repo=c.db_repo_scm_instance) |
|
666 | repo=c.db_repo_scm_instance) | |
667 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
667 | node1 = FileNode(f_path, '', changeset=c.changeset_1) | |
668 | else: |
|
668 | else: | |
669 | c.changeset_1 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
669 | c.changeset_1 = EmptyChangeset(repo=c.db_repo_scm_instance) | |
670 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
670 | node1 = FileNode(f_path, '', changeset=c.changeset_1) | |
671 |
|
671 | |||
672 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
672 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: | |
673 | c.changeset_2 = c.db_repo_scm_instance.get_changeset(diff2) |
|
673 | c.changeset_2 = c.db_repo_scm_instance.get_changeset(diff2) | |
674 | try: |
|
674 | try: | |
675 | node2 = c.changeset_2.get_node(f_path) |
|
675 | node2 = c.changeset_2.get_node(f_path) | |
676 | if node2.is_dir(): |
|
676 | if node2.is_dir(): | |
677 | raise NodeError('%s path is a %s not a file' |
|
677 | raise NodeError('%s path is a %s not a file' | |
678 | % (node2, type(node2))) |
|
678 | % (node2, type(node2))) | |
679 | except NodeDoesNotExistError: |
|
679 | except NodeDoesNotExistError: | |
680 | c.changeset_2 = EmptyChangeset(cs=diff2, |
|
680 | c.changeset_2 = EmptyChangeset(cs=diff2, | |
681 | revision=c.changeset_2.revision, |
|
681 | revision=c.changeset_2.revision, | |
682 | repo=c.db_repo_scm_instance) |
|
682 | repo=c.db_repo_scm_instance) | |
683 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
683 | node2 = FileNode(f_path, '', changeset=c.changeset_2) | |
684 | else: |
|
684 | else: | |
685 | c.changeset_2 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
685 | c.changeset_2 = EmptyChangeset(repo=c.db_repo_scm_instance) | |
686 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
686 | node2 = FileNode(f_path, '', changeset=c.changeset_2) | |
687 | except ChangesetDoesNotExistError as e: |
|
687 | except ChangesetDoesNotExistError as e: | |
688 | msg = _('Such revision does not exist for this repository') |
|
688 | msg = _('Such revision does not exist for this repository') | |
689 | webutils.flash(msg, category='error') |
|
689 | webutils.flash(msg, category='error') | |
690 | raise HTTPNotFound() |
|
690 | raise HTTPNotFound() | |
691 | c.node1 = node1 |
|
691 | c.node1 = node1 | |
692 | c.node2 = node2 |
|
692 | c.node2 = node2 | |
693 | c.cs1 = c.changeset_1 |
|
693 | c.cs1 = c.changeset_1 | |
694 | c.cs2 = c.changeset_2 |
|
694 | c.cs2 = c.changeset_2 | |
695 |
|
695 | |||
696 | return render('files/diff_2way.html') |
|
696 | return base.render('files/diff_2way.html') | |
697 |
|
697 | |||
698 | def _get_node_history(self, cs, f_path, changesets=None): |
|
698 | def _get_node_history(self, cs, f_path, changesets=None): | |
699 | """ |
|
699 | """ | |
700 | get changesets history for given node |
|
700 | get changesets history for given node | |
701 |
|
701 | |||
702 | :param cs: changeset to calculate history |
|
702 | :param cs: changeset to calculate history | |
703 | :param f_path: path for node to calculate history for |
|
703 | :param f_path: path for node to calculate history for | |
704 | :param changesets: if passed don't calculate history and take |
|
704 | :param changesets: if passed don't calculate history and take | |
705 | changesets defined in this list |
|
705 | changesets defined in this list | |
706 | """ |
|
706 | """ | |
707 | # calculate history based on tip |
|
707 | # calculate history based on tip | |
708 | tip_cs = c.db_repo_scm_instance.get_changeset() |
|
708 | tip_cs = c.db_repo_scm_instance.get_changeset() | |
709 | if changesets is None: |
|
709 | if changesets is None: | |
710 | try: |
|
710 | try: | |
711 | changesets = tip_cs.get_file_history(f_path) |
|
711 | changesets = tip_cs.get_file_history(f_path) | |
712 | except (NodeDoesNotExistError, ChangesetError): |
|
712 | except (NodeDoesNotExistError, ChangesetError): | |
713 | # this node is not present at tip! |
|
713 | # this node is not present at tip! | |
714 | changesets = cs.get_file_history(f_path) |
|
714 | changesets = cs.get_file_history(f_path) | |
715 | hist_l = [] |
|
715 | hist_l = [] | |
716 |
|
716 | |||
717 | changesets_group = ([], _("Changesets")) |
|
717 | changesets_group = ([], _("Changesets")) | |
718 | branches_group = ([], _("Branches")) |
|
718 | branches_group = ([], _("Branches")) | |
719 | tags_group = ([], _("Tags")) |
|
719 | tags_group = ([], _("Tags")) | |
720 | for chs in changesets: |
|
720 | for chs in changesets: | |
721 | # TODO: loop over chs.branches ... but that will not give all the bogus None branches for Git ... |
|
721 | # TODO: loop over chs.branches ... but that will not give all the bogus None branches for Git ... | |
722 | _branch = chs.branch |
|
722 | _branch = chs.branch | |
723 | n_desc = '%s (%s)' % (h.show_id(chs), _branch) |
|
723 | n_desc = '%s (%s)' % (h.show_id(chs), _branch) | |
724 | changesets_group[0].append((chs.raw_id, n_desc,)) |
|
724 | changesets_group[0].append((chs.raw_id, n_desc,)) | |
725 | hist_l.append(changesets_group) |
|
725 | hist_l.append(changesets_group) | |
726 |
|
726 | |||
727 | for name, chs in c.db_repo_scm_instance.branches.items(): |
|
727 | for name, chs in c.db_repo_scm_instance.branches.items(): | |
728 | branches_group[0].append((chs, name),) |
|
728 | branches_group[0].append((chs, name),) | |
729 | hist_l.append(branches_group) |
|
729 | hist_l.append(branches_group) | |
730 |
|
730 | |||
731 | for name, chs in c.db_repo_scm_instance.tags.items(): |
|
731 | for name, chs in c.db_repo_scm_instance.tags.items(): | |
732 | tags_group[0].append((chs, name),) |
|
732 | tags_group[0].append((chs, name),) | |
733 | hist_l.append(tags_group) |
|
733 | hist_l.append(tags_group) | |
734 |
|
734 | |||
735 | return hist_l, changesets |
|
735 | return hist_l, changesets | |
736 |
|
736 | |||
737 | @LoginRequired(allow_default_user=True) |
|
737 | @LoginRequired(allow_default_user=True) | |
738 | @HasRepoPermissionLevelDecorator('read') |
|
738 | @HasRepoPermissionLevelDecorator('read') | |
739 | @jsonify |
|
739 | @base.jsonify | |
740 | def nodelist(self, repo_name, revision, f_path): |
|
740 | def nodelist(self, repo_name, revision, f_path): | |
741 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
741 | if request.environ.get('HTTP_X_PARTIAL_XHR'): | |
742 | cs = self.__get_cs(revision) |
|
742 | cs = self.__get_cs(revision) | |
743 | _d, _f = ScmModel().get_nodes(repo_name, cs.raw_id, f_path, |
|
743 | _d, _f = ScmModel().get_nodes(repo_name, cs.raw_id, f_path, | |
744 | flat=False) |
|
744 | flat=False) | |
745 | return {'nodes': _d + _f} |
|
745 | return {'nodes': _d + _f} |
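
The raw() handler in the hunk above resolves a content type and disposition before serving file content: a small whitelist of image types is served inline, unknown binary content is downloaded as application/octet-stream, and any other unknown type is forced to text/plain so repository files are never interpreted as HTML by the browser. A minimal standalone sketch of that resolution step, assuming only the mapping and comments shown above (the helper name and signature are illustrative, not part of the codebase)::

    # Sketch only: the mapping mirrors raw_mimetype_mapping in the hunk above;
    # resolve_raw_headers() is a hypothetical helper for illustration.
    RAW_MIMETYPE_MAPPING = {
        # show images inline; everything else goes through the fallback below
        'image/x-icon': ('image/x-icon', 'inline'),
        'image/png': ('image/png', 'inline'),
        'image/gif': ('image/gif', 'inline'),
        'image/jpeg': ('image/jpeg', 'inline'),
        'image/svg+xml': ('image/svg+xml', 'inline'),
    }

    def resolve_raw_headers(mimetype, is_binary, filename):
        """Return (content_type, content_disposition) for serving a raw file."""
        try:
            content_type, dispo = RAW_MIMETYPE_MAPPING[mimetype]
        except KeyError:
            if is_binary:
                # unknown binary content: force a download
                content_type, dispo = 'application/octet-stream', 'attachment'
            else:
                # force text/plain so e.g. an HTML file is never rendered inline
                content_type, dispo = 'text/plain', 'inline'
        if dispo == 'attachment':
            dispo = 'attachment; filename=%s' % filename
        return content_type, dispo

    # example: an unknown text type is downgraded to text/plain
    assert resolve_raw_headers('text/html', False, 'page.html') == ('text/plain', 'inline')
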
@@ -1,57 +1,57 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.followers |
|
15 | kallithea.controllers.followers | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Followers controller for Kallithea |
|
18 | Followers controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 23, 2011 |
|
22 | :created_on: Apr 23, 2011 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 |
|
29 | |||
30 | from tg import request |
|
30 | from tg import request | |
31 | from tg import tmpl_context as c |
|
31 | from tg import tmpl_context as c | |
32 |
|
32 | |||
|
33 | from kallithea.controllers import base | |||
33 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
34 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired | |
34 | from kallithea.lib.base import BaseRepoController, render |
|
|||
35 | from kallithea.lib.page import Page |
|
35 | from kallithea.lib.page import Page | |
36 | from kallithea.lib.utils2 import safe_int |
|
36 | from kallithea.lib.utils2 import safe_int | |
37 | from kallithea.model import db |
|
37 | from kallithea.model import db | |
38 |
|
38 | |||
39 |
|
39 | |||
40 | log = logging.getLogger(__name__) |
|
40 | log = logging.getLogger(__name__) | |
41 |
|
41 | |||
42 |
|
42 | |||
43 | class FollowersController(BaseRepoController): |
|
43 | class FollowersController(base.BaseRepoController): | |
44 |
|
44 | |||
45 | @LoginRequired(allow_default_user=True) |
|
45 | @LoginRequired(allow_default_user=True) | |
46 | @HasRepoPermissionLevelDecorator('read') |
|
46 | @HasRepoPermissionLevelDecorator('read') | |
47 | def followers(self, repo_name): |
|
47 | def followers(self, repo_name): | |
48 | p = safe_int(request.GET.get('page'), 1) |
|
48 | p = safe_int(request.GET.get('page'), 1) | |
49 | repo_id = c.db_repo.repo_id |
|
49 | repo_id = c.db_repo.repo_id | |
50 | d = db.UserFollowing.get_repo_followers(repo_id) \ |
|
50 | d = db.UserFollowing.get_repo_followers(repo_id) \ | |
51 | .order_by(db.UserFollowing.follows_from) |
|
51 | .order_by(db.UserFollowing.follows_from) | |
52 | c.followers_pager = Page(d, page=p, items_per_page=20) |
|
52 | c.followers_pager = Page(d, page=p, items_per_page=20) | |
53 |
|
53 | |||
54 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
54 | if request.environ.get('HTTP_X_PARTIAL_XHR'): | |
55 | return render('/followers/followers_data.html') |
|
55 | return base.render('/followers/followers_data.html') | |
56 |
|
56 | |||
57 | return render('/followers/followers.html') |
|
57 | return base.render('/followers/followers.html') |
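
The followers hunk above also shows the import change that drives most of this diff: controllers no longer import BaseRepoController and render directly from kallithea.lib.base, but instead import the base module from kallithea.controllers and qualify every helper (base.BaseRepoController, base.render, base.jsonify). A short sketch of the resulting controller shape, assuming the module layout shown in the diff; the method body simply mirrors followers() above::

    # Sketch of the post-refactoring controller style; the imports follow the
    # diff above, the controller itself is illustrative.
    from tg import request

    from kallithea.controllers import base
    from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired

    class ExampleController(base.BaseRepoController):

        @LoginRequired(allow_default_user=True)
        @HasRepoPermissionLevelDecorator('read')
        def followers(self, repo_name):
            # partial AJAX requests get only the data fragment,
            # a normal page load gets the full template
            if request.environ.get('HTTP_X_PARTIAL_XHR'):
                return base.render('/followers/followers_data.html')
            return base.render('/followers/followers.html')
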
@@ -1,173 +1,173 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.forks |
|
15 | kallithea.controllers.forks | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | forks controller for Kallithea |
|
18 | forks controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 23, 2011 |
|
22 | :created_on: Apr 23, 2011 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | import formencode |
|
31 | import formencode | |
32 | from formencode import htmlfill |
|
32 | from formencode import htmlfill | |
33 | from tg import request |
|
33 | from tg import request | |
34 | from tg import tmpl_context as c |
|
34 | from tg import tmpl_context as c | |
35 | from tg.i18n import ugettext as _ |
|
35 | from tg.i18n import ugettext as _ | |
36 | from webob.exc import HTTPFound, HTTPNotFound |
|
36 | from webob.exc import HTTPFound, HTTPNotFound | |
37 |
|
37 | |||
38 | import kallithea |
|
38 | import kallithea | |
|
39 | from kallithea.controllers import base | |||
39 | from kallithea.lib import webutils |
|
40 | from kallithea.lib import webutils | |
40 | from kallithea.lib.auth import HasPermissionAnyDecorator, HasRepoPermissionLevel, HasRepoPermissionLevelDecorator, LoginRequired |
|
41 | from kallithea.lib.auth import HasPermissionAnyDecorator, HasRepoPermissionLevel, HasRepoPermissionLevelDecorator, LoginRequired | |
41 | from kallithea.lib.base import BaseRepoController, render |
|
|||
42 | from kallithea.lib.page import Page |
|
42 | from kallithea.lib.page import Page | |
43 | from kallithea.lib.utils2 import safe_int |
|
43 | from kallithea.lib.utils2 import safe_int | |
44 | from kallithea.model import db |
|
44 | from kallithea.model import db | |
45 | from kallithea.model.forms import RepoForkForm |
|
45 | from kallithea.model.forms import RepoForkForm | |
46 | from kallithea.model.repo import RepoModel |
|
46 | from kallithea.model.repo import RepoModel | |
47 | from kallithea.model.scm import AvailableRepoGroupChoices, ScmModel |
|
47 | from kallithea.model.scm import AvailableRepoGroupChoices, ScmModel | |
48 |
|
48 | |||
49 |
|
49 | |||
50 | log = logging.getLogger(__name__) |
|
50 | log = logging.getLogger(__name__) | |
51 |
|
51 | |||
52 |
|
52 | |||
53 | class ForksController(BaseRepoController): |
|
53 | class ForksController(base.BaseRepoController): | |
54 |
|
54 | |||
55 | def __load_defaults(self): |
|
55 | def __load_defaults(self): | |
56 | c.repo_groups = AvailableRepoGroupChoices('write') |
|
56 | c.repo_groups = AvailableRepoGroupChoices('write') | |
57 |
|
57 | |||
58 | c.landing_revs_choices, c.landing_revs = ScmModel().get_repo_landing_revs() |
|
58 | c.landing_revs_choices, c.landing_revs = ScmModel().get_repo_landing_revs() | |
59 |
|
59 | |||
60 | c.can_update = db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE).ui_active |
|
60 | c.can_update = db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE).ui_active | |
61 |
|
61 | |||
62 | def __load_data(self): |
|
62 | def __load_data(self): | |
63 | """ |
|
63 | """ | |
64 | Load default settings for edit and update |
|
64 | Load default settings for edit and update | |
65 | """ |
|
65 | """ | |
66 | self.__load_defaults() |
|
66 | self.__load_defaults() | |
67 |
|
67 | |||
68 | c.repo_info = c.db_repo |
|
68 | c.repo_info = c.db_repo | |
69 | repo = c.db_repo.scm_instance |
|
69 | repo = c.db_repo.scm_instance | |
70 |
|
70 | |||
71 | if c.repo_info is None: |
|
71 | if c.repo_info is None: | |
72 | raise HTTPNotFound() |
|
72 | raise HTTPNotFound() | |
73 |
|
73 | |||
74 | c.default_user_id = kallithea.DEFAULT_USER_ID |
|
74 | c.default_user_id = kallithea.DEFAULT_USER_ID | |
75 | c.in_public_journal = db.UserFollowing.query() \ |
|
75 | c.in_public_journal = db.UserFollowing.query() \ | |
76 | .filter(db.UserFollowing.user_id == c.default_user_id) \ |
|
76 | .filter(db.UserFollowing.user_id == c.default_user_id) \ | |
77 | .filter(db.UserFollowing.follows_repository == c.repo_info).scalar() |
|
77 | .filter(db.UserFollowing.follows_repository == c.repo_info).scalar() | |
78 |
|
78 | |||
79 | if c.repo_info.stats: |
|
79 | if c.repo_info.stats: | |
80 | last_rev = c.repo_info.stats.stat_on_revision + 1 |
|
             last_rev = c.repo_info.stats.stat_on_revision + 1
         else:
             last_rev = 0
         c.stats_revision = last_rev
 
         c.repo_last_rev = repo.count() if repo.revisions else 0
 
         if last_rev == 0 or c.repo_last_rev == 0:
             c.stats_percentage = 0
         else:
             c.stats_percentage = '%.2f' % ((float((last_rev)) /
                                             c.repo_last_rev) * 100)
 
         defaults = RepoModel()._get_defaults(c.repo_name)
         # alter the description to indicate a fork
         defaults['description'] = ('fork of repository: %s \n%s'
                                    % (defaults['repo_name'],
                                       defaults['description']))
         # add suffix to fork
         defaults['repo_name'] = '%s-fork' % defaults['repo_name']
 
         return defaults
 
     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
     def forks(self, repo_name):
         p = safe_int(request.GET.get('page'), 1)
         repo_id = c.db_repo.repo_id
         d = []
         for r in db.Repository.get_repo_forks(repo_id):
             if not HasRepoPermissionLevel('read')(r.repo_name, 'get forks check'):
                 continue
             d.append(r)
         c.forks_pager = Page(d, page=p, items_per_page=20)
 
         if request.environ.get('HTTP_X_PARTIAL_XHR'):
-            return render('/forks/forks_data.html')
+            return base.render('/forks/forks_data.html')
 
-        return render('/forks/forks.html')
+        return base.render('/forks/forks.html')
 
     @LoginRequired()
     @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository')
     @HasRepoPermissionLevelDecorator('read')
     def fork(self, repo_name):
         c.repo_info = db.Repository.get_by_repo_name(repo_name)
         if not c.repo_info:
             raise HTTPNotFound()
 
         defaults = self.__load_data()
 
         return htmlfill.render(
-            render('forks/fork.html'),
+            base.render('forks/fork.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False)
 
     @LoginRequired()
     @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository')
     @HasRepoPermissionLevelDecorator('read')
     def fork_create(self, repo_name):
         self.__load_defaults()
         c.repo_info = db.Repository.get_by_repo_name(repo_name)
         _form = RepoForkForm(old_data={'repo_type': c.repo_info.repo_type},
                              repo_groups=c.repo_groups,
                              landing_revs=c.landing_revs_choices)()
         form_result = {}
         task_id = None
         try:
             form_result = _form.to_python(dict(request.POST))
 
             # an approximation that is better than nothing
             if not db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE).ui_active:
                 form_result['update_after_clone'] = False
 
             # create fork is done sometimes async on celery, db transaction
             # management is handled there.
             task = RepoModel().create_fork(form_result, request.authuser.user_id)
             task_id = task.task_id
         except formencode.Invalid as errors:
             return htmlfill.render(
-                render('forks/fork.html'),
+                base.render('forks/fork.html'),
                 defaults=errors.value,
                 errors=errors.error_dict or {},
                 prefix_error=False,
                 encoding="UTF-8",
                 force_defaults=False)
         except Exception:
             log.error(traceback.format_exc())
             webutils.flash(_('An error occurred during repository forking %s') %
                            repo_name, category='error')
 
         raise HTTPFound(location=webutils.url('repo_creating_home',
                                               repo_name=form_result['repo_name_full'],
                                               task_id=task_id))
@@ -1,211 +1,211 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.home
 ~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 Home controller for Kallithea
 
 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: Feb 18, 2010
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 
 """
 
 import logging
 
 from sqlalchemy import or_
 from tg import request
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _
 from webob.exc import HTTPBadRequest
 
 import kallithea.lib.helpers as h
+from kallithea.controllers import base
 from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired
-from kallithea.lib.base import BaseController, jsonify, render
 from kallithea.lib.utils2 import safe_str
 from kallithea.model import db
 from kallithea.model.repo import RepoModel
 from kallithea.model.scm import UserGroupList
 
 
 log = logging.getLogger(__name__)
 
 
-class HomeController(BaseController):
+class HomeController(base.BaseController):
 
     def about(self):
-        return render('/about.html')
+        return base.render('/about.html')
 
     @LoginRequired(allow_default_user=True)
     def index(self):
         c.group = None
 
         repo_groups_list = self.scm_model.get_repo_groups()
         repos_list = db.Repository.query(sorted=True).filter_by(group=None).all()
 
         c.data = RepoModel().get_repos_as_dict(repos_list,
                                                repo_groups_list=repo_groups_list,
                                                short_name=True)
 
-        return render('/index.html')
+        return base.render('/index.html')
 
     @LoginRequired(allow_default_user=True)
-    @jsonify
+    @base.jsonify
     def repo_switcher_data(self):
         if request.is_xhr:
             all_repos = db.Repository.query(sorted=True).all()
             repo_iter = self.scm_model.get_repos(all_repos)
             all_groups = db.RepoGroup.query(sorted=True).all()
             repo_groups_iter = self.scm_model.get_repo_groups(all_groups)
 
             res = [{
                 'text': _('Groups'),
                 'children': [
                     {'id': obj.group_name,
                      'text': obj.group_name,
                      'type': 'group',
                      'obj': {}}
                     for obj in repo_groups_iter
                 ],
             },
             {
                 'text': _('Repositories'),
                 'children': [
                     {'id': obj.repo_name,
                      'text': obj.repo_name,
                      'type': 'repo',
                      'obj': obj.get_dict()}
                     for obj in repo_iter
                 ],
             }]
 
             for res_dict in res:
                 for child in (res_dict['children']):
                     child['obj'].pop('_changeset_cache', None)  # bytes cannot be encoded in json ... but this value isn't relevant on client side at all ...
 
             data = {
                 'more': False,
                 'results': res,
             }
             return data
 
         else:
             raise HTTPBadRequest()
 
     @LoginRequired(allow_default_user=True)
     @HasRepoPermissionLevelDecorator('read')
-    @jsonify
+    @base.jsonify
     def repo_refs_data(self, repo_name):
         repo = db.Repository.get_by_repo_name(repo_name).scm_instance
         res = []
         _branches = repo.branches.items()
         if _branches:
             res.append({
                 'text': _('Branch'),
                 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'branch'} for name, rev in _branches]
             })
         _closed_branches = repo.closed_branches.items()
         if _closed_branches:
             res.append({
                 'text': _('Closed Branches'),
                 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'closed-branch'} for name, rev in _closed_branches]
             })
         _tags = repo.tags.items()
         if _tags:
             res.append({
                 'text': _('Tag'),
                 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'tag'} for name, rev in _tags]
             })
         _bookmarks = repo.bookmarks.items()
         if _bookmarks:
             res.append({
                 'text': _('Bookmark'),
                 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'book'} for name, rev in _bookmarks]
             })
         data = {
             'more': False,
             'results': res
         }
         return data
 
     @LoginRequired()
-    @jsonify
+    @base.jsonify
     def users_and_groups_data(self):
         """
         Returns 'results' with a list of users and user groups.
 
         You can either use the 'key' GET parameter to get a user by providing
         the exact user key or you can use the 'query' parameter to
         search for users by user key, first name and last name.
         'types' defaults to just 'users' but can be set to 'users,groups' to
         get both users and groups.
         No more than 500 results (of each kind) will be returned.
         """
         types = request.GET.get('types', 'users').split(',')
         key = request.GET.get('key', '')
         query = request.GET.get('query', '')
         results = []
         if 'users' in types:
             user_list = []
             if key:
                 u = db.User.get_by_username(key)
                 if u:
                     user_list = [u]
             elif query:
                 user_list = db.User.query() \
                     .filter(db.User.is_default_user == False) \
                     .filter(db.User.active == True) \
                     .filter(or_(
                         db.User.username.ilike("%%" + query + "%%"),
                         db.User.name.concat(' ').concat(db.User.lastname).ilike("%%" + query + "%%"),
                         db.User.lastname.concat(' ').concat(db.User.name).ilike("%%" + query + "%%"),
                         db.User.email.ilike("%%" + query + "%%"),
                     )) \
                     .order_by(db.User.username) \
                     .limit(500) \
                     .all()
             for u in user_list:
                 results.append({
                     'type': 'user',
                     'id': u.user_id,
                     'nname': u.username,
                     'fname': u.name,
                     'lname': u.lastname,
                     'gravatar_lnk': h.gravatar_url(u.email, size=28, default='default'),
                     'gravatar_size': 14,
                 })
         if 'groups' in types:
             grp_list = []
             if key:
                 grp = db.UserGroup.get_by_group_name(key)
                 if grp:
                     grp_list = [grp]
             elif query:
                 grp_list = db.UserGroup.query() \
                     .filter(db.UserGroup.users_group_name.ilike("%%" + query + "%%")) \
                     .filter(db.UserGroup.users_group_active == True) \
                     .order_by(db.UserGroup.users_group_name) \
                     .limit(500) \
                     .all()
             for g in UserGroupList(grp_list, perm_level='read'):
                 results.append({
                     'type': 'group',
                     'id': g.users_group_id,
                     'grname': g.users_group_name,
                 })
         return dict(results=results)
@@ -1,275 +1,275 @@
 # -*- coding: utf-8 -*-
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 """
 kallithea.controllers.journal
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 Journal controller
 
 This file was forked by the Kallithea project in July 2014.
 Original author and date, and relevant copyright and licensing information is below:
 :created_on: Nov 21, 2010
 :author: marcink
 :copyright: (c) 2013 RhodeCode GmbH, and others.
 :license: GPLv3, see LICENSE.md for more details.
 """
 
 import logging
 import traceback
 from itertools import groupby
 
 from sqlalchemy import or_
 from sqlalchemy.orm import joinedload
 from tg import request, response
 from tg import tmpl_context as c
 from tg.i18n import ugettext as _
 from webob.exc import HTTPBadRequest
 
 import kallithea.lib.helpers as h
+from kallithea.controllers import base
 from kallithea.controllers.admin.admin import _journal_filter
 from kallithea.lib import feeds, webutils
 from kallithea.lib.auth import LoginRequired
-from kallithea.lib.base import BaseController, render
 from kallithea.lib.page import Page
 from kallithea.lib.utils2 import AttributeDict, safe_int
 from kallithea.model import db, meta
 from kallithea.model.repo import RepoModel
 
 
 log = logging.getLogger(__name__)
 
 
 language = 'en-us'
 ttl = "5"
 feed_nr = 20
 
 
-class JournalController(BaseController):
+class JournalController(base.BaseController):
 
     def _before(self, *args, **kwargs):
         super(JournalController, self)._before(*args, **kwargs)
         c.search_term = request.GET.get('filter')
 
     def _get_daily_aggregate(self, journal):
         groups = []
         for k, g in groupby(journal, lambda x: x.action_as_day):
             user_group = []
             # groupby username if it's a present value, else fallback to journal username
             for _unused, g2 in groupby(list(g), lambda x: x.user.username if x.user else x.username):
                 l = list(g2)
                 user_group.append((l[0].user, l))
 
             groups.append((k, user_group,))
 
         return groups
 
     def _get_journal_data(self, following_repos):
         repo_ids = [x.follows_repository_id for x in following_repos
                     if x.follows_repository_id is not None]
         user_ids = [x.follows_user_id for x in following_repos
                     if x.follows_user_id is not None]
 
         filtering_criterion = None
 
         if repo_ids and user_ids:
             filtering_criterion = or_(db.UserLog.repository_id.in_(repo_ids),
                                       db.UserLog.user_id.in_(user_ids))
         if repo_ids and not user_ids:
             filtering_criterion = db.UserLog.repository_id.in_(repo_ids)
         if not repo_ids and user_ids:
             filtering_criterion = db.UserLog.user_id.in_(user_ids)
         if filtering_criterion is not None:
             journal = db.UserLog.query() \
                 .options(joinedload(db.UserLog.user)) \
                 .options(joinedload(db.UserLog.repository))
             # filter
             journal = _journal_filter(journal, c.search_term)
             journal = journal.filter(filtering_criterion) \
                 .order_by(db.UserLog.action_date.desc())
         else:
             journal = []
 
         return journal
 
     def _feed(self, repos, feeder, link, desc):
         response.content_type = feeder.content_type
         journal = self._get_journal_data(repos)
 
         header = dict(
             title=desc,
             link=link,
             description=desc,
         )
 
         entries=[]
         for entry in journal[:feed_nr]:
             user = entry.user
             if user is None:
                 # fix deleted users
                 user = AttributeDict({'short_contact': entry.username,
                                       'email': '',
                                       'full_contact': ''})
             action, action_extra, ico = h.action_parser(entry, feed=True)
             title = "%s - %s %s" % (user.short_contact, action(),
                                     entry.repository.repo_name)
             _url = None
             if entry.repository is not None:
                 _url = webutils.canonical_url('changelog_home',
                                               repo_name=entry.repository.repo_name)
 
             entries.append(dict(
                 title=title,
                 pubdate=entry.action_date,
                 link=_url or webutils.canonical_url(''),
                 author_email=user.email,
                 author_name=user.full_name_or_username,
                 description=action_extra(),
             ))
 
         return feeder.render(header, entries)
 
     def _atom_feed(self, repos, public=True):
         if public:
             link = webutils.canonical_url('public_journal_atom')
             desc = '%s %s %s' % (c.site_name, _('Public Journal'),
                                  'atom feed')
         else:
             link = webutils.canonical_url('journal_atom')
             desc = '%s %s %s' % (c.site_name, _('Journal'), 'atom feed')
 
         return self._feed(repos, feeds.AtomFeed, link, desc)
 
     def _rss_feed(self, repos, public=True):
         if public:
             link = webutils.canonical_url('public_journal_atom')
             desc = '%s %s %s' % (c.site_name, _('Public Journal'),
                                  'rss feed')
         else:
             link = webutils.canonical_url('journal_atom')
             desc = '%s %s %s' % (c.site_name, _('Journal'), 'rss feed')
 
         return self._feed(repos, feeds.RssFeed, link, desc)
 
     @LoginRequired()
     def index(self):
         # Return a rendered template
         p = safe_int(request.GET.get('page'), 1)
         c.user = db.User.get(request.authuser.user_id)
         c.following = db.UserFollowing.query() \
             .filter(db.UserFollowing.user_id == request.authuser.user_id) \
             .options(joinedload(db.UserFollowing.follows_repository)) \
             .all()
 
         journal = self._get_journal_data(c.following)
 
         c.journal_pager = Page(journal, page=p, items_per_page=20,
                                filter=c.search_term)
         c.journal_day_aggregate = self._get_daily_aggregate(c.journal_pager)
 
         if request.environ.get('HTTP_X_PARTIAL_XHR'):
-            return render('journal/journal_data.html')
+            return base.render('journal/journal_data.html')
 
         repos_list = db.Repository.query(sorted=True) \
             .filter_by(owner_id=request.authuser.user_id).all()
 
         repos_data = RepoModel().get_repos_as_dict(repos_list, admin=True)
         # data used to render the grid
         c.data = repos_data
 
-        return render('journal/journal.html')
+        return base.render('journal/journal.html')
 
     @LoginRequired()
     def journal_atom(self):
         """Produce a simple atom-1.0 feed"""
         following = db.UserFollowing.query() \
             .filter(db.UserFollowing.user_id == request.authuser.user_id) \
             .options(joinedload(db.UserFollowing.follows_repository)) \
             .all()
         return self._atom_feed(following, public=False)
 
     @LoginRequired()
     def journal_rss(self):
         """Produce a simple rss2 feed"""
         following = db.UserFollowing.query() \
             .filter(db.UserFollowing.user_id == request.authuser.user_id) \
             .options(joinedload(db.UserFollowing.follows_repository)) \
             .all()
         return self._rss_feed(following, public=False)
 
     @LoginRequired()
     def toggle_following(self):
         user_id = request.POST.get('follows_user_id')
         if user_id:
             try:
                 self.scm_model.toggle_following_user(user_id,
                                                      request.authuser.user_id)
                 meta.Session().commit()
                 return 'ok'
             except Exception:
                 log.error(traceback.format_exc())
                 raise HTTPBadRequest()
 
         repo_id = request.POST.get('follows_repository_id')
         if repo_id:
             try:
                 self.scm_model.toggle_following_repo(repo_id,
                                                      request.authuser.user_id)
                 meta.Session().commit()
                 return 'ok'
             except Exception:
                 log.error(traceback.format_exc())
                 raise HTTPBadRequest()
 
         raise HTTPBadRequest()
 
     @LoginRequired(allow_default_user=True)
     def public_journal(self):
         # Return a rendered template
         p = safe_int(request.GET.get('page'), 1)
 
         c.following = db.UserFollowing.query() \
             .filter(db.UserFollowing.user_id == request.authuser.user_id) \
             .options(joinedload(db.UserFollowing.follows_repository)) \
             .all()
 
         journal = self._get_journal_data(c.following)
 
         c.journal_pager = Page(journal, page=p, items_per_page=20)
 
         c.journal_day_aggregate = self._get_daily_aggregate(c.journal_pager)
 
         if request.environ.get('HTTP_X_PARTIAL_XHR'):
-            return render('journal/journal_data.html')
+            return base.render('journal/journal_data.html')
 
-        return render('journal/public_journal.html')
+        return base.render('journal/public_journal.html')
 
     @LoginRequired(allow_default_user=True)
     def public_journal_atom(self):
         """Produce a simple atom-1.0 feed"""
         c.following = db.UserFollowing.query() \
             .filter(db.UserFollowing.user_id == request.authuser.user_id) \
             .options(joinedload(db.UserFollowing.follows_repository)) \
             .all()
 
         return self._atom_feed(c.following)
 
     @LoginRequired(allow_default_user=True)
     def public_journal_rss(self):
         """Produce a simple rss2 feed"""
         c.following = db.UserFollowing.query() \
             .filter(db.UserFollowing.user_id == request.authuser.user_id) \
             .options(joinedload(db.UserFollowing.follows_repository)) \
             .all()
 
         return self._rss_feed(c.following)
@@ -1,256 +1,256 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.login |
|
15 | kallithea.controllers.login | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Login controller for Kallithea |
|
18 | Login controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 22, 2010 |
|
22 | :created_on: Apr 22, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 |
|
28 | |||
29 | import logging |
|
29 | import logging | |
30 | import re |
|
30 | import re | |
31 |
|
31 | |||
32 | import formencode |
|
32 | import formencode | |
33 | from formencode import htmlfill |
|
33 | from formencode import htmlfill | |
34 | from tg import request, session |
|
34 | from tg import request, session | |
35 | from tg import tmpl_context as c |
|
35 | from tg import tmpl_context as c | |
36 | from tg.i18n import ugettext as _ |
|
36 | from tg.i18n import ugettext as _ | |
37 | from webob.exc import HTTPBadRequest, HTTPFound |
|
37 | from webob.exc import HTTPBadRequest, HTTPFound | |
38 |
|
38 | |||
|
39 | from kallithea.controllers import base | |||
39 | from kallithea.lib import webutils |
|
40 | from kallithea.lib import webutils | |
40 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator |
|
41 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator | |
41 | from kallithea.lib.base import BaseController, log_in_user, render |
|
|||
42 | from kallithea.lib.exceptions import UserCreationError |
|
42 | from kallithea.lib.exceptions import UserCreationError | |
43 | from kallithea.lib.recaptcha import submit |
|
43 | from kallithea.lib.recaptcha import submit | |
44 | from kallithea.lib.webutils import url |
|
44 | from kallithea.lib.webutils import url | |
45 | from kallithea.model import db, meta |
|
45 | from kallithea.model import db, meta | |
46 | from kallithea.model.forms import LoginForm, PasswordResetConfirmationForm, PasswordResetRequestForm, RegisterForm |
|
46 | from kallithea.model.forms import LoginForm, PasswordResetConfirmationForm, PasswordResetRequestForm, RegisterForm | |
47 | from kallithea.model.user import UserModel |
|
47 | from kallithea.model.user import UserModel | |
48 |
|
48 | |||
49 |
|
49 | |||
50 | log = logging.getLogger(__name__) |
|
50 | log = logging.getLogger(__name__) | |
51 |
|
51 | |||
52 |
|
52 | |||
53 | class LoginController(BaseController): |
|
53 | class LoginController(base.BaseController): | |
54 |
|
54 | |||
55 | def _validate_came_from(self, came_from, |
|
55 | def _validate_came_from(self, came_from, | |
56 | _re=re.compile(r"/(?!/)[-!#$%&'()*+,./:;=?@_~0-9A-Za-z]*$")): |
|
56 | _re=re.compile(r"/(?!/)[-!#$%&'()*+,./:;=?@_~0-9A-Za-z]*$")): | |
57 | """Return True if came_from is valid and can and should be used. |
|
57 | """Return True if came_from is valid and can and should be used. | |
58 |
|
58 | |||
59 | Determines if a URI reference is valid and relative to the origin; |
|
59 | Determines if a URI reference is valid and relative to the origin; | |
60 | or in RFC 3986 terms, whether it matches this production: |
|
60 | or in RFC 3986 terms, whether it matches this production: | |
61 |
|
61 | |||
62 | origin-relative-ref = path-absolute [ "?" query ] [ "#" fragment ] |
|
62 | origin-relative-ref = path-absolute [ "?" query ] [ "#" fragment ] | |
63 |
|
63 | |||
64 | with the exception that '%' escapes are not validated and '#' is |
|
64 | with the exception that '%' escapes are not validated and '#' is | |
65 | allowed inside the fragment part. |
|
65 | allowed inside the fragment part. | |
66 | """ |
|
66 | """ | |
67 | return _re.match(came_from) is not None |
|
67 | return _re.match(came_from) is not None | |
68 |
|
68 | |||
69 | def index(self): |
|
69 | def index(self): | |
70 | c.came_from = request.GET.get('came_from', '') |
|
70 | c.came_from = request.GET.get('came_from', '') | |
71 | if c.came_from: |
|
71 | if c.came_from: | |
72 | if not self._validate_came_from(c.came_from): |
|
72 | if not self._validate_came_from(c.came_from): | |
73 | log.error('Invalid came_from (not server-relative): %r', c.came_from) |
|
73 | log.error('Invalid came_from (not server-relative): %r', c.came_from) | |
74 | raise HTTPBadRequest() |
|
74 | raise HTTPBadRequest() | |
75 | else: |
|
75 | else: | |
76 | c.came_from = url('home') |
|
76 | c.came_from = url('home') | |
77 |
|
77 | |||
78 | if request.POST: |
|
78 | if request.POST: | |
79 | # import Login Form validator class |
|
79 | # import Login Form validator class | |
80 | login_form = LoginForm()() |
|
80 | login_form = LoginForm()() | |
81 | try: |
|
81 | try: | |
82 | # login_form will check username/password using ValidAuth and report failure to the user |
|
82 | # login_form will check username/password using ValidAuth and report failure to the user | |
83 | c.form_result = login_form.to_python(dict(request.POST)) |
|
83 | c.form_result = login_form.to_python(dict(request.POST)) | |
84 | username = c.form_result['username'] |
|
84 | username = c.form_result['username'] | |
85 | user = db.User.get_by_username_or_email(username) |
|
85 | user = db.User.get_by_username_or_email(username) | |
86 | assert user is not None # the same user get just passed in the form validation |
|
86 | assert user is not None # the same user get just passed in the form validation | |
87 | except formencode.Invalid as errors: |
|
87 | except formencode.Invalid as errors: | |
88 | defaults = errors.value |
|
88 | defaults = errors.value | |
89 | # remove password from filling in form again |
|
89 | # remove password from filling in form again | |
90 | defaults.pop('password', None) |
|
90 | defaults.pop('password', None) | |
91 | return htmlfill.render( |
|
91 | return htmlfill.render( | |
92 | render('/login.html'), |
|
92 | base.render('/login.html'), | |
93 | defaults=errors.value, |
|
93 | defaults=errors.value, | |
94 | errors=errors.error_dict or {}, |
|
94 | errors=errors.error_dict or {}, | |
95 | prefix_error=False, |
|
95 | prefix_error=False, | |
96 | encoding="UTF-8", |
|
96 | encoding="UTF-8", | |
97 | force_defaults=False) |
|
97 | force_defaults=False) | |
98 | except UserCreationError as e: |
|
98 | except UserCreationError as e: | |
99 | # container auth or other auth functions that create users on |
|
99 | # container auth or other auth functions that create users on | |
100 | # the fly can throw this exception signaling that there's issue |
|
100 | # the fly can throw this exception signaling that there's issue | |
101 | # with user creation, explanation should be provided in |
|
101 | # with user creation, explanation should be provided in | |
102 | # Exception itself |
|
102 | # Exception itself | |
103 | webutils.flash(e, 'error') |
|
103 | webutils.flash(e, 'error') | |
104 | else: |
|
104 | else: | |
105 | # login_form already validated the password - now set the session cookie accordingly |
|
105 | # login_form already validated the password - now set the session cookie accordingly | |
106 | auth_user = log_in_user(user, c.form_result['remember'], is_external_auth=False, ip_addr=request.ip_addr) |
|
106 | auth_user = base.log_in_user(user, c.form_result['remember'], is_external_auth=False, ip_addr=request.ip_addr) | |
107 | if auth_user: |
|
107 | if auth_user: | |
108 | raise HTTPFound(location=c.came_from) |
|
108 | raise HTTPFound(location=c.came_from) | |
109 | webutils.flash(_('Authentication failed.'), 'error') |
|
109 | webutils.flash(_('Authentication failed.'), 'error') | |
110 | else: |
|
110 | else: | |
111 | # redirect if already logged in |
|
111 | # redirect if already logged in | |
112 | if not request.authuser.is_anonymous: |
|
112 | if not request.authuser.is_anonymous: | |
113 | raise HTTPFound(location=c.came_from) |
|
113 | raise HTTPFound(location=c.came_from) | |
114 | # continue to show login to default user |
|
114 | # continue to show login to default user | |
115 |
|
115 | |||
116 | return render('/login.html') |
|
116 | return base.render('/login.html') | |
117 |
|
117 | |||
118 | @HasPermissionAnyDecorator('hg.admin', 'hg.register.auto_activate', |
|
118 | @HasPermissionAnyDecorator('hg.admin', 'hg.register.auto_activate', | |
119 | 'hg.register.manual_activate') |
|
119 | 'hg.register.manual_activate') | |
120 | def register(self): |
|
120 | def register(self): | |
121 | def_user_perms = AuthUser(dbuser=db.User.get_default_user()).global_permissions |
|
121 | def_user_perms = AuthUser(dbuser=db.User.get_default_user()).global_permissions | |
122 | c.auto_active = 'hg.register.auto_activate' in def_user_perms |
|
122 | c.auto_active = 'hg.register.auto_activate' in def_user_perms | |
123 |
|
123 | |||
124 | settings = db.Setting.get_app_settings() |
|
124 | settings = db.Setting.get_app_settings() | |
125 | captcha_private_key = settings.get('captcha_private_key') |
|
125 | captcha_private_key = settings.get('captcha_private_key') | |
126 | c.captcha_active = bool(captcha_private_key) |
|
126 | c.captcha_active = bool(captcha_private_key) | |
127 | c.captcha_public_key = settings.get('captcha_public_key') |
|
127 | c.captcha_public_key = settings.get('captcha_public_key') | |
128 |
|
128 | |||
129 | if request.POST: |
|
129 | if request.POST: | |
130 | register_form = RegisterForm()() |
|
130 | register_form = RegisterForm()() | |
131 | try: |
|
131 | try: | |
132 | form_result = register_form.to_python(dict(request.POST)) |
|
132 | form_result = register_form.to_python(dict(request.POST)) | |
133 | form_result['active'] = c.auto_active |
|
133 | form_result['active'] = c.auto_active | |
134 |
|
134 | |||
135 | if c.captcha_active: |
|
135 | if c.captcha_active: | |
136 | response = submit(request.POST.get('g-recaptcha-response'), |
|
136 | response = submit(request.POST.get('g-recaptcha-response'), | |
137 | private_key=captcha_private_key, |
|
137 | private_key=captcha_private_key, | |
138 | remoteip=request.ip_addr) |
|
138 | remoteip=request.ip_addr) | |
139 | if not response.is_valid: |
|
139 | if not response.is_valid: | |
140 | _value = form_result |
|
140 | _value = form_result | |
141 | _msg = _('Bad captcha') |
|
141 | _msg = _('Bad captcha') | |
142 | error_dict = {'recaptcha_field': _msg} |
|
142 | error_dict = {'recaptcha_field': _msg} | |
143 | raise formencode.Invalid(_msg, _value, None, |
|
143 | raise formencode.Invalid(_msg, _value, None, | |
144 | error_dict=error_dict) |
|
144 | error_dict=error_dict) | |
145 |
|
145 | |||
146 | UserModel().create_registration(form_result) |
|
146 | UserModel().create_registration(form_result) | |
147 | webutils.flash(_('You have successfully registered with %s') % (c.site_name or 'Kallithea'), |
|
147 | webutils.flash(_('You have successfully registered with %s') % (c.site_name or 'Kallithea'), | |
148 | category='success') |
|
148 | category='success') | |
149 | meta.Session().commit() |
|
149 | meta.Session().commit() | |
150 | raise HTTPFound(location=url('login_home')) |
|
150 | raise HTTPFound(location=url('login_home')) | |
151 |
|
151 | |||
152 | except formencode.Invalid as errors: |
|
152 | except formencode.Invalid as errors: | |
153 | return htmlfill.render( |
|
153 | return htmlfill.render( | |
154 | render('/register.html'), |
|
154 | base.render('/register.html'), | |
155 | defaults=errors.value, |
|
155 | defaults=errors.value, | |
156 | errors=errors.error_dict or {}, |
|
156 | errors=errors.error_dict or {}, | |
157 | prefix_error=False, |
|
157 | prefix_error=False, | |
158 | encoding="UTF-8", |
|
158 | encoding="UTF-8", | |
159 | force_defaults=False) |
|
159 | force_defaults=False) | |
160 | except UserCreationError as e: |
|
160 | except UserCreationError as e: | |
161 | # container auth or other auth functions that create users on |
|
161 | # container auth or other auth functions that create users on | |
162 | # the fly can throw this exception signaling that there's issue |
|
162 | # the fly can throw this exception signaling that there's issue | |
163 | # with user creation, explanation should be provided in |
|
163 | # with user creation, explanation should be provided in | |
164 | # Exception itself |
|
164 | # Exception itself | |
165 | webutils.flash(e, 'error') |
|
165 | webutils.flash(e, 'error') | |
166 |
|
166 | |||
167 | return render('/register.html') |
|
167 | return base.render('/register.html') | |
168 |
|
168 | |||
169 | def password_reset(self): |
|
169 | def password_reset(self): | |
170 | settings = db.Setting.get_app_settings() |
|
170 | settings = db.Setting.get_app_settings() | |
171 | captcha_private_key = settings.get('captcha_private_key') |
|
171 | captcha_private_key = settings.get('captcha_private_key') | |
172 | c.captcha_active = bool(captcha_private_key) |
|
172 | c.captcha_active = bool(captcha_private_key) | |
173 | c.captcha_public_key = settings.get('captcha_public_key') |
|
173 | c.captcha_public_key = settings.get('captcha_public_key') | |
174 |
|
174 | |||
175 | if request.POST: |
|
175 | if request.POST: | |
176 | password_reset_form = PasswordResetRequestForm()() |
|
176 | password_reset_form = PasswordResetRequestForm()() | |
177 | try: |
|
177 | try: | |
178 | form_result = password_reset_form.to_python(dict(request.POST)) |
|
178 | form_result = password_reset_form.to_python(dict(request.POST)) | |
179 | if c.captcha_active: |
|
179 | if c.captcha_active: | |
180 | response = submit(request.POST.get('g-recaptcha-response'), |
|
180 | response = submit(request.POST.get('g-recaptcha-response'), | |
181 | private_key=captcha_private_key, |
|
181 | private_key=captcha_private_key, | |
182 | remoteip=request.ip_addr) |
|
182 | remoteip=request.ip_addr) | |
183 | if not response.is_valid: |
|
183 | if not response.is_valid: | |
184 | _value = form_result |
|
184 | _value = form_result | |
185 | _msg = _('Bad captcha') |
|
185 | _msg = _('Bad captcha') | |
186 | error_dict = {'recaptcha_field': _msg} |
|
186 | error_dict = {'recaptcha_field': _msg} | |
187 | raise formencode.Invalid(_msg, _value, None, |
|
187 | raise formencode.Invalid(_msg, _value, None, | |
188 | error_dict=error_dict) |
|
188 | error_dict=error_dict) | |
189 | redirect_link = UserModel().send_reset_password_email(form_result) |
|
189 | redirect_link = UserModel().send_reset_password_email(form_result) | |
190 | webutils.flash(_('A password reset confirmation code has been sent'), |
|
190 | webutils.flash(_('A password reset confirmation code has been sent'), | |
191 | category='success') |
|
191 | category='success') | |
192 | raise HTTPFound(location=redirect_link) |
|
192 | raise HTTPFound(location=redirect_link) | |
193 |
|
193 | |||
194 | except formencode.Invalid as errors: |
|
194 | except formencode.Invalid as errors: | |
195 | return htmlfill.render( |
|
195 | return htmlfill.render( | |
196 | render('/password_reset.html'), |
|
196 | base.render('/password_reset.html'), | |
197 | defaults=errors.value, |
|
197 | defaults=errors.value, | |
198 | errors=errors.error_dict or {}, |
|
198 | errors=errors.error_dict or {}, | |
199 | prefix_error=False, |
|
199 | prefix_error=False, | |
200 | encoding="UTF-8", |
|
200 | encoding="UTF-8", | |
201 | force_defaults=False) |
|
201 | force_defaults=False) | |
202 |
|
202 | |||
203 | return render('/password_reset.html') |
|
203 | return base.render('/password_reset.html') | |
204 |
|
204 | |||
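
The password reset form above is gated behind reCAPTCHA whenever a
``captcha_private_key`` is configured: the ``submit()`` helper checks the
``g-recaptcha-response`` value against that key, and a failed check is turned
into a ``formencode.Invalid`` on ``recaptcha_field`` so it renders like any
other form error. Such helpers are normally backed by an HTTP POST to Google's
``siteverify`` endpoint; a stdlib-only sketch of that call, independent of
Kallithea's own helper (which may differ in details)::

    import json
    import urllib.parse
    import urllib.request

    def verify_recaptcha(response_token, private_key, remote_ip=None):
        """Return True if the reCAPTCHA siteverify endpoint accepts the token."""
        fields = {'secret': private_key, 'response': response_token}
        if remote_ip:
            fields['remoteip'] = remote_ip
        req = urllib.request.Request(
            'https://www.google.com/recaptcha/api/siteverify',
            data=urllib.parse.urlencode(fields).encode('utf-8'))
        with urllib.request.urlopen(req) as resp:
            return json.load(resp).get('success', False)
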
205 | def password_reset_confirmation(self): |
|
205 | def password_reset_confirmation(self): | |
206 | # This controller handles both GET and POST requests, though we |
|
206 | # This controller handles both GET and POST requests, though we | |
207 | # only ever perform the actual password change on POST (since |
|
207 | # only ever perform the actual password change on POST (since | |
208 | # GET requests are not allowed to have side effects, and do not |
|
208 | # GET requests are not allowed to have side effects, and do not | |
209 | # receive automatic CSRF protection). |
|
209 | # receive automatic CSRF protection). | |
210 |
|
210 | |||
211 | # The template needs the email address outside of the form. |
|
211 | # The template needs the email address outside of the form. | |
212 | c.email = request.params.get('email') |
|
212 | c.email = request.params.get('email') | |
213 | c.timestamp = request.params.get('timestamp') or '' |
|
213 | c.timestamp = request.params.get('timestamp') or '' | |
214 | c.token = request.params.get('token') or '' |
|
214 | c.token = request.params.get('token') or '' | |
215 | if not request.POST: |
|
215 | if not request.POST: | |
216 | return render('/password_reset_confirmation.html') |
|
216 | return base.render('/password_reset_confirmation.html') | |
217 |
|
217 | |||
218 | form = PasswordResetConfirmationForm()() |
|
218 | form = PasswordResetConfirmationForm()() | |
219 | try: |
|
219 | try: | |
220 | form_result = form.to_python(dict(request.POST)) |
|
220 | form_result = form.to_python(dict(request.POST)) | |
221 | except formencode.Invalid as errors: |
|
221 | except formencode.Invalid as errors: | |
222 | return htmlfill.render( |
|
222 | return htmlfill.render( | |
223 | render('/password_reset_confirmation.html'), |
|
223 | base.render('/password_reset_confirmation.html'), | |
224 | defaults=errors.value, |
|
224 | defaults=errors.value, | |
225 | errors=errors.error_dict or {}, |
|
225 | errors=errors.error_dict or {}, | |
226 | prefix_error=False, |
|
226 | prefix_error=False, | |
227 | encoding='UTF-8') |
|
227 | encoding='UTF-8') | |
228 |
|
228 | |||
229 | if not UserModel().verify_reset_password_token( |
|
229 | if not UserModel().verify_reset_password_token( | |
230 | form_result['email'], |
|
230 | form_result['email'], | |
231 | form_result['timestamp'], |
|
231 | form_result['timestamp'], | |
232 | form_result['token'], |
|
232 | form_result['token'], | |
233 | ): |
|
233 | ): | |
234 | return htmlfill.render( |
|
234 | return htmlfill.render( | |
235 | render('/password_reset_confirmation.html'), |
|
235 | base.render('/password_reset_confirmation.html'), | |
236 | defaults=form_result, |
|
236 | defaults=form_result, | |
237 | errors={'token': _('Invalid password reset token')}, |
|
237 | errors={'token': _('Invalid password reset token')}, | |
238 | prefix_error=False, |
|
238 | prefix_error=False, | |
239 | encoding='UTF-8') |
|
239 | encoding='UTF-8') | |
240 |
|
240 | |||
241 | UserModel().reset_password(form_result['email'], form_result['password']) |
|
241 | UserModel().reset_password(form_result['email'], form_result['password']) | |
242 | webutils.flash(_('Successfully updated password'), category='success') |
|
242 | webutils.flash(_('Successfully updated password'), category='success') | |
243 | raise HTTPFound(location=url('login_home')) |
|
243 | raise HTTPFound(location=url('login_home')) | |
244 |
|
244 | |||
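
The confirmation step above only trusts the ``(email, timestamp, token)``
triple if ``UserModel().verify_reset_password_token()`` accepts it, which lets
the server reject forged or stale reset links without keeping any per-request
state. The actual token construction lives in ``UserModel``; purely as an
illustration of the general technique (not Kallithea's algorithm or secret
handling), an HMAC over the email and timestamp with an expiry check could
look like this::

    import hashlib
    import hmac
    import time

    APP_SECRET = b'replace-with-application-secret'  # placeholder secret

    def make_reset_token(email, timestamp):
        msg = ('%s|%d' % (email, int(timestamp))).encode('utf-8')
        return hmac.new(APP_SECRET, msg, hashlib.sha256).hexdigest()

    def verify_reset_token(email, timestamp, token, max_age=86400):
        if time.time() - int(timestamp) > max_age:
            return False  # the reset link has expired
        expected = make_reset_token(email, timestamp)
        return hmac.compare_digest(expected, token)

    ts = int(time.time())
    token = make_reset_token('user@example.com', ts)
    assert verify_reset_token('user@example.com', ts, token)
    assert not verify_reset_token('user@example.com', ts, token + '0')
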
245 | def logout(self): |
|
245 | def logout(self): | |
246 | session.delete() |
|
246 | session.delete() | |
247 | log.info('Logging out and deleting session for user') |
|
247 | log.info('Logging out and deleting session for user') | |
248 | raise HTTPFound(location=url('home')) |
|
248 | raise HTTPFound(location=url('home')) | |
249 |
|
249 | |||
250 | def session_csrf_secret_token(self): |
|
250 | def session_csrf_secret_token(self): | |
251 | """Return the CSRF protection token for the session - just like it |
|
251 | """Return the CSRF protection token for the session - just like it | |
252 | could have been screen scraped from a page with a form. |
|
252 | could have been screen scraped from a page with a form. | |
253 | Only intended for testing but might also be useful for other kinds |
|
253 | Only intended for testing but might also be useful for other kinds | |
254 | of automation. |
|
254 | of automation. | |
255 | """ |
|
255 | """ | |
256 | return webutils.session_csrf_secret_token() |
|
256 | return webutils.session_csrf_secret_token() |
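
``session_csrf_secret_token`` exists so tests and scripts can obtain the
session's CSRF secret without parsing it out of a rendered form. A rough
automation sketch is below; the route path, the form field name, and the use
of the third-party ``requests`` library are all assumptions made for
illustration - check the routing configuration and ``webutils`` for the real
names before relying on it::

    import requests

    BASE_URL = 'http://127.0.0.1:5000'
    TOKEN_PATH = '/_session_csrf_secret_token'   # hypothetical route path
    TOKEN_FIELD = '_session_csrf_secret_token'   # hypothetical form field name

    def post_with_csrf(session, path, data):
        # fetch the CSRF secret for this session - just like it could have
        # been screen scraped from a form - and include it in the POST
        token = session.get(BASE_URL + TOKEN_PATH).text.strip()
        payload = dict(data)
        payload[TOKEN_FIELD] = token
        return session.post(BASE_URL + path, data=payload)

    with requests.Session() as s:
        # ... log in first so the session cookie is set ...
        response = post_with_csrf(s, '/some/form/action', {'field': 'value'})
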
@@ -1,638 +1,638 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.pullrequests |
|
15 | kallithea.controllers.pullrequests | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | pull requests controller for Kallithea for initializing pull requests |
|
18 | pull requests controller for Kallithea for initializing pull requests | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: May 7, 2012 |
|
22 | :created_on: May 7, 2012 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | import formencode |
|
31 | import formencode | |
32 | import mercurial.unionrepo |
|
32 | import mercurial.unionrepo | |
33 | from tg import request |
|
33 | from tg import request | |
34 | from tg import tmpl_context as c |
|
34 | from tg import tmpl_context as c | |
35 | from tg.i18n import ugettext as _ |
|
35 | from tg.i18n import ugettext as _ | |
36 | from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPFound, HTTPNotFound |
|
36 | from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPFound, HTTPNotFound | |
37 |
|
37 | |||
38 | import kallithea.lib.helpers as h |
|
38 | import kallithea.lib.helpers as h | |
|
39 | from kallithea.controllers import base | |||
39 | from kallithea.controllers.changeset import create_cs_pr_comment, delete_cs_pr_comment |
|
40 | from kallithea.controllers.changeset import create_cs_pr_comment, delete_cs_pr_comment | |
40 | from kallithea.lib import auth, diffs, webutils |
|
41 | from kallithea.lib import auth, diffs, webutils | |
41 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
42 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired | |
42 | from kallithea.lib.base import BaseRepoController, jsonify, render |
|
|||
43 | from kallithea.lib.graphmod import graph_data |
|
43 | from kallithea.lib.graphmod import graph_data | |
44 | from kallithea.lib.page import Page |
|
44 | from kallithea.lib.page import Page | |
45 | from kallithea.lib.utils2 import ascii_bytes, safe_bytes, safe_int |
|
45 | from kallithea.lib.utils2 import ascii_bytes, safe_bytes, safe_int | |
46 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError |
|
46 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError | |
47 | from kallithea.lib.webutils import url |
|
47 | from kallithea.lib.webutils import url | |
48 | from kallithea.model import db, meta |
|
48 | from kallithea.model import db, meta | |
49 | from kallithea.model.changeset_status import ChangesetStatusModel |
|
49 | from kallithea.model.changeset_status import ChangesetStatusModel | |
50 | from kallithea.model.comment import ChangesetCommentsModel |
|
50 | from kallithea.model.comment import ChangesetCommentsModel | |
51 | from kallithea.model.forms import PullRequestForm, PullRequestPostForm |
|
51 | from kallithea.model.forms import PullRequestForm, PullRequestPostForm | |
52 | from kallithea.model.pull_request import CreatePullRequestAction, CreatePullRequestIterationAction, PullRequestModel |
|
52 | from kallithea.model.pull_request import CreatePullRequestAction, CreatePullRequestIterationAction, PullRequestModel | |
53 |
|
53 | |||
54 |
|
54 | |||
55 | log = logging.getLogger(__name__) |
|
55 | log = logging.getLogger(__name__) | |
56 |
|
56 | |||
57 |
|
57 | |||
58 | def _get_reviewer(user_id): |
|
58 | def _get_reviewer(user_id): | |
59 | """Look up user by ID and validate it as a potential reviewer.""" |
|
59 | """Look up user by ID and validate it as a potential reviewer.""" | |
60 | try: |
|
60 | try: | |
61 | user = db.User.get(int(user_id)) |
|
61 | user = db.User.get(int(user_id)) | |
62 | except ValueError: |
|
62 | except ValueError: | |
63 | user = None |
|
63 | user = None | |
64 |
|
64 | |||
65 | if user is None or user.is_default_user: |
|
65 | if user is None or user.is_default_user: | |
66 | webutils.flash(_('Invalid reviewer "%s" specified') % user_id, category='error') |
|
66 | webutils.flash(_('Invalid reviewer "%s" specified') % user_id, category='error') | |
67 | raise HTTPBadRequest() |
|
67 | raise HTTPBadRequest() | |
68 |
|
68 | |||
69 | return user |
|
69 | return user | |
70 |
|
70 | |||
71 |
|
71 | |||
72 | class PullrequestsController(BaseRepoController): |
|
72 | class PullrequestsController(base.BaseRepoController): | |
73 |
|
73 | |||
74 | def _get_repo_refs(self, repo, rev=None, branch=None, branch_rev=None): |
|
74 | def _get_repo_refs(self, repo, rev=None, branch=None, branch_rev=None): | |
75 | """return a structure with scm repo's interesting changesets, suitable for |
|
75 | """return a structure with scm repo's interesting changesets, suitable for | |
76 | the selectors in pullrequest.html |
|
76 | the selectors in pullrequest.html | |
77 |
|
77 | |||
78 | rev: a revision that must be in the list somehow and selected by default |
|
78 | rev: a revision that must be in the list somehow and selected by default | |
79 | branch: a branch that must be in the list and selected by default - even if closed |
|
79 | branch: a branch that must be in the list and selected by default - even if closed | |
80 | branch_rev: a revision of which peers should be preferred and available.""" |
|
80 | branch_rev: a revision of which peers should be preferred and available.""" | |
81 | # list named branches that have been merged into this named branch - it should probably merge back |
|
81 | # list named branches that have been merged into this named branch - it should probably merge back | |
82 | peers = [] |
|
82 | peers = [] | |
83 |
|
83 | |||
84 | if branch_rev: |
|
84 | if branch_rev: | |
85 | # a revset not restricting to merge() would be better |
|
85 | # a revset not restricting to merge() would be better | |
86 | # (especially because it would get the branch point) |
|
86 | # (especially because it would get the branch point) | |
87 | # ... but is currently too expensive |
|
87 | # ... but is currently too expensive | |
88 | # including branches of children could be nice too |
|
88 | # including branches of children could be nice too | |
89 | peerbranches = set() |
|
89 | peerbranches = set() | |
90 | for i in repo._repo.revs( |
|
90 | for i in repo._repo.revs( | |
91 | b"sort(parents(branch(id(%s)) and merge()) - branch(id(%s)), -rev)", |
|
91 | b"sort(parents(branch(id(%s)) and merge()) - branch(id(%s)), -rev)", | |
92 | ascii_bytes(branch_rev), ascii_bytes(branch_rev), |
|
92 | ascii_bytes(branch_rev), ascii_bytes(branch_rev), | |
93 | ): |
|
93 | ): | |
94 | for abranch in repo.get_changeset(i).branches: |
|
94 | for abranch in repo.get_changeset(i).branches: | |
95 | if abranch not in peerbranches: |
|
95 | if abranch not in peerbranches: | |
96 | n = 'branch:%s:%s' % (abranch, repo.get_changeset(abranch).raw_id) |
|
96 | n = 'branch:%s:%s' % (abranch, repo.get_changeset(abranch).raw_id) | |
97 | peers.append((n, abranch)) |
|
97 | peers.append((n, abranch)) | |
98 | peerbranches.add(abranch) |
|
98 | peerbranches.add(abranch) | |
99 |
|
99 | |||
100 | selected = None |
|
100 | selected = None | |
101 | tiprev = repo.tags.get('tip') |
|
101 | tiprev = repo.tags.get('tip') | |
102 | tipbranch = None |
|
102 | tipbranch = None | |
103 |
|
103 | |||
104 | branches = [] |
|
104 | branches = [] | |
105 | for abranch, branchrev in repo.branches.items(): |
|
105 | for abranch, branchrev in repo.branches.items(): | |
106 | n = 'branch:%s:%s' % (abranch, branchrev) |
|
106 | n = 'branch:%s:%s' % (abranch, branchrev) | |
107 | desc = abranch |
|
107 | desc = abranch | |
108 | if branchrev == tiprev: |
|
108 | if branchrev == tiprev: | |
109 | tipbranch = abranch |
|
109 | tipbranch = abranch | |
110 | desc = '%s (current tip)' % desc |
|
110 | desc = '%s (current tip)' % desc | |
111 | branches.append((n, desc)) |
|
111 | branches.append((n, desc)) | |
112 | if rev == branchrev: |
|
112 | if rev == branchrev: | |
113 | selected = n |
|
113 | selected = n | |
114 | if branch == abranch: |
|
114 | if branch == abranch: | |
115 | if not rev: |
|
115 | if not rev: | |
116 | selected = n |
|
116 | selected = n | |
117 | branch = None |
|
117 | branch = None | |
118 | if branch: # branch not in list - it is probably closed |
|
118 | if branch: # branch not in list - it is probably closed | |
119 | branchrev = repo.closed_branches.get(branch) |
|
119 | branchrev = repo.closed_branches.get(branch) | |
120 | if branchrev: |
|
120 | if branchrev: | |
121 | n = 'branch:%s:%s' % (branch, branchrev) |
|
121 | n = 'branch:%s:%s' % (branch, branchrev) | |
122 | branches.append((n, _('%s (closed)') % branch)) |
|
122 | branches.append((n, _('%s (closed)') % branch)) | |
123 | selected = n |
|
123 | selected = n | |
124 | branch = None |
|
124 | branch = None | |
125 | if branch: |
|
125 | if branch: | |
126 | log.debug('branch %r not found in %s', branch, repo) |
|
126 | log.debug('branch %r not found in %s', branch, repo) | |
127 |
|
127 | |||
128 | bookmarks = [] |
|
128 | bookmarks = [] | |
129 | for bookmark, bookmarkrev in repo.bookmarks.items(): |
|
129 | for bookmark, bookmarkrev in repo.bookmarks.items(): | |
130 | n = 'book:%s:%s' % (bookmark, bookmarkrev) |
|
130 | n = 'book:%s:%s' % (bookmark, bookmarkrev) | |
131 | bookmarks.append((n, bookmark)) |
|
131 | bookmarks.append((n, bookmark)) | |
132 | if rev == bookmarkrev: |
|
132 | if rev == bookmarkrev: | |
133 | selected = n |
|
133 | selected = n | |
134 |
|
134 | |||
135 | tags = [] |
|
135 | tags = [] | |
136 | for tag, tagrev in repo.tags.items(): |
|
136 | for tag, tagrev in repo.tags.items(): | |
137 | if tag == 'tip': |
|
137 | if tag == 'tip': | |
138 | continue |
|
138 | continue | |
139 | n = 'tag:%s:%s' % (tag, tagrev) |
|
139 | n = 'tag:%s:%s' % (tag, tagrev) | |
140 | tags.append((n, tag)) |
|
140 | tags.append((n, tag)) | |
141 | # note: even if rev == tagrev, don't select the static tag - it must be chosen explicitly |
|
141 | # note: even if rev == tagrev, don't select the static tag - it must be chosen explicitly | |
142 |
|
142 | |||
143 | # prio 1: rev was selected as existing entry above |
|
143 | # prio 1: rev was selected as existing entry above | |
144 |
|
144 | |||
145 | # prio 2: create special entry for rev; rev _must_ be used |
|
145 | # prio 2: create special entry for rev; rev _must_ be used | |
146 | specials = [] |
|
146 | specials = [] | |
147 | if rev and selected is None: |
|
147 | if rev and selected is None: | |
148 | selected = 'rev:%s:%s' % (rev, rev) |
|
148 | selected = 'rev:%s:%s' % (rev, rev) | |
149 | specials = [(selected, '%s: %s' % (_("Changeset"), rev[:12]))] |
|
149 | specials = [(selected, '%s: %s' % (_("Changeset"), rev[:12]))] | |
150 |
|
150 | |||
151 | # prio 3: most recent peer branch |
|
151 | # prio 3: most recent peer branch | |
152 | if peers and not selected: |
|
152 | if peers and not selected: | |
153 | selected = peers[0][0] |
|
153 | selected = peers[0][0] | |
154 |
|
154 | |||
155 | # prio 4: tip revision |
|
155 | # prio 4: tip revision | |
156 | if not selected: |
|
156 | if not selected: | |
157 | if repo.alias == 'hg': |
|
157 | if repo.alias == 'hg': | |
158 | if tipbranch: |
|
158 | if tipbranch: | |
159 | selected = 'branch:%s:%s' % (tipbranch, tiprev) |
|
159 | selected = 'branch:%s:%s' % (tipbranch, tiprev) | |
160 | else: |
|
160 | else: | |
161 | selected = 'tag:null:' + repo.EMPTY_CHANGESET |
|
161 | selected = 'tag:null:' + repo.EMPTY_CHANGESET | |
162 | tags.append((selected, 'null')) |
|
162 | tags.append((selected, 'null')) | |
163 | else: # Git |
|
163 | else: # Git | |
164 | assert repo.alias == 'git' |
|
164 | assert repo.alias == 'git' | |
165 | if not repo.branches: |
|
165 | if not repo.branches: | |
166 | selected = '' # doesn't make sense, but better than nothing |
|
166 | selected = '' # doesn't make sense, but better than nothing | |
167 | elif 'master' in repo.branches: |
|
167 | elif 'master' in repo.branches: | |
168 | selected = 'branch:master:%s' % repo.branches['master'] |
|
168 | selected = 'branch:master:%s' % repo.branches['master'] | |
169 | else: |
|
169 | else: | |
170 | k, v = list(repo.branches.items())[0] |
|
170 | k, v = list(repo.branches.items())[0] | |
171 | selected = 'branch:%s:%s' % (k, v) |
|
171 | selected = 'branch:%s:%s' % (k, v) | |
172 |
|
172 | |||
173 | groups = [(specials, _("Special")), |
|
173 | groups = [(specials, _("Special")), | |
174 | (peers, _("Peer branches")), |
|
174 | (peers, _("Peer branches")), | |
175 | (bookmarks, _("Bookmarks")), |
|
175 | (bookmarks, _("Bookmarks")), | |
176 | (branches, _("Branches")), |
|
176 | (branches, _("Branches")), | |
177 | (tags, _("Tags")), |
|
177 | (tags, _("Tags")), | |
178 | ] |
|
178 | ] | |
179 | return [g for g in groups if g[0]], selected |
|
179 | return [g for g in groups if g[0]], selected | |
180 |
|
180 | |||
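
``_get_repo_refs`` returns ``(groups, selected)``: each group is a list of
``(value, label)`` pairs for one section of the selector, and every value
encodes a ref as ``type:name:revision`` - the same triple the rest of the
controller later takes apart with ``split(':')``. A small sketch of producing
and consuming that encoding (the sample revisions are made up)::

    def make_ref(ref_type, name, rev):
        return '%s:%s:%s' % (ref_type, name, rev)

    def parse_ref(ref):
        ref_type, name, rev = ref.split(':')
        return ref_type, name, rev

    groups = [
        ([(make_ref('branch', 'default', 'deadbeefcafe'), 'default (current tip)')], 'Branches'),
        ([(make_ref('tag', 'v1.0', '0123456789ab'), 'v1.0')], 'Tags'),
    ]
    selected = groups[0][0][0][0]           # 'branch:default:deadbeefcafe'
    assert parse_ref(selected) == ('branch', 'default', 'deadbeefcafe')
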
181 | def _is_allowed_to_change_status(self, pull_request): |
|
181 | def _is_allowed_to_change_status(self, pull_request): | |
182 | if pull_request.is_closed(): |
|
182 | if pull_request.is_closed(): | |
183 | return False |
|
183 | return False | |
184 |
|
184 | |||
185 | owner = request.authuser.user_id == pull_request.owner_id |
|
185 | owner = request.authuser.user_id == pull_request.owner_id | |
186 | reviewer = db.PullRequestReviewer.query() \ |
|
186 | reviewer = db.PullRequestReviewer.query() \ | |
187 | .filter(db.PullRequestReviewer.pull_request == pull_request) \ |
|
187 | .filter(db.PullRequestReviewer.pull_request == pull_request) \ | |
188 | .filter(db.PullRequestReviewer.user_id == request.authuser.user_id) \ |
|
188 | .filter(db.PullRequestReviewer.user_id == request.authuser.user_id) \ | |
189 | .count() != 0 |
|
189 | .count() != 0 | |
190 |
|
190 | |||
191 | return request.authuser.admin or owner or reviewer |
|
191 | return request.authuser.admin or owner or reviewer | |
192 |
|
192 | |||
193 | @LoginRequired(allow_default_user=True) |
|
193 | @LoginRequired(allow_default_user=True) | |
194 | @HasRepoPermissionLevelDecorator('read') |
|
194 | @HasRepoPermissionLevelDecorator('read') | |
195 | def show_all(self, repo_name): |
|
195 | def show_all(self, repo_name): | |
196 | c.from_ = request.GET.get('from_') or '' |
|
196 | c.from_ = request.GET.get('from_') or '' | |
197 | c.closed = request.GET.get('closed') or '' |
|
197 | c.closed = request.GET.get('closed') or '' | |
198 | url_params = {} |
|
198 | url_params = {} | |
199 | if c.from_: |
|
199 | if c.from_: | |
200 | url_params['from_'] = 1 |
|
200 | url_params['from_'] = 1 | |
201 | if c.closed: |
|
201 | if c.closed: | |
202 | url_params['closed'] = 1 |
|
202 | url_params['closed'] = 1 | |
203 | p = safe_int(request.GET.get('page'), 1) |
|
203 | p = safe_int(request.GET.get('page'), 1) | |
204 |
|
204 | |||
205 | q = db.PullRequest.query(include_closed=c.closed, sorted=True) |
|
205 | q = db.PullRequest.query(include_closed=c.closed, sorted=True) | |
206 | if c.from_: |
|
206 | if c.from_: | |
207 | q = q.filter_by(org_repo=c.db_repo) |
|
207 | q = q.filter_by(org_repo=c.db_repo) | |
208 | else: |
|
208 | else: | |
209 | q = q.filter_by(other_repo=c.db_repo) |
|
209 | q = q.filter_by(other_repo=c.db_repo) | |
210 | c.pull_requests = q.all() |
|
210 | c.pull_requests = q.all() | |
211 |
|
211 | |||
212 | c.pullrequests_pager = Page(c.pull_requests, page=p, items_per_page=100, **url_params) |
|
212 | c.pullrequests_pager = Page(c.pull_requests, page=p, items_per_page=100, **url_params) | |
213 |
|
213 | |||
214 | return render('/pullrequests/pullrequest_show_all.html') |
|
214 | return base.render('/pullrequests/pullrequest_show_all.html') | |
215 |
|
215 | |||
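
``show_all`` filters on one side of the pull request (source repository when
``from_`` is set, target repository otherwise) and hands the full result list
to ``Page`` with 100 items per page. The slicing behind such a pager is
straightforward; a minimal stand-in (not the actual ``kallithea.lib.page.Page``
API) would be::

    def page_slice(items, page, items_per_page=100):
        """Return the items shown on a 1-based page number."""
        start = (page - 1) * items_per_page
        return items[start:start + items_per_page]

    prs = list(range(250))                 # pretend these are pull requests
    assert len(page_slice(prs, 1)) == 100
    assert len(page_slice(prs, 3)) == 50   # last, partial page
    assert page_slice(prs, 4) == []        # pages out of range are simply empty
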
216 | @LoginRequired() |
|
216 | @LoginRequired() | |
217 | def show_my(self): |
|
217 | def show_my(self): | |
218 | c.closed = request.GET.get('closed') or '' |
|
218 | c.closed = request.GET.get('closed') or '' | |
219 |
|
219 | |||
220 | c.my_pull_requests = db.PullRequest.query( |
|
220 | c.my_pull_requests = db.PullRequest.query( | |
221 | include_closed=c.closed, |
|
221 | include_closed=c.closed, | |
222 | sorted=True, |
|
222 | sorted=True, | |
223 | ).filter_by(owner_id=request.authuser.user_id).all() |
|
223 | ).filter_by(owner_id=request.authuser.user_id).all() | |
224 |
|
224 | |||
225 | c.participate_in_pull_requests = [] |
|
225 | c.participate_in_pull_requests = [] | |
226 | c.participate_in_pull_requests_todo = [] |
|
226 | c.participate_in_pull_requests_todo = [] | |
227 | done_status = set([db.ChangesetStatus.STATUS_APPROVED, db.ChangesetStatus.STATUS_REJECTED]) |
|
227 | done_status = set([db.ChangesetStatus.STATUS_APPROVED, db.ChangesetStatus.STATUS_REJECTED]) | |
228 | for pr in db.PullRequest.query( |
|
228 | for pr in db.PullRequest.query( | |
229 | include_closed=c.closed, |
|
229 | include_closed=c.closed, | |
230 | reviewer_id=request.authuser.user_id, |
|
230 | reviewer_id=request.authuser.user_id, | |
231 | sorted=True, |
|
231 | sorted=True, | |
232 | ): |
|
232 | ): | |
233 | status = pr.user_review_status(request.authuser.user_id) # very inefficient!!! |
|
233 | status = pr.user_review_status(request.authuser.user_id) # very inefficient!!! | |
234 | if status in done_status: |
|
234 | if status in done_status: | |
235 | c.participate_in_pull_requests.append(pr) |
|
235 | c.participate_in_pull_requests.append(pr) | |
236 | else: |
|
236 | else: | |
237 | c.participate_in_pull_requests_todo.append(pr) |
|
237 | c.participate_in_pull_requests_todo.append(pr) | |
238 |
|
238 | |||
239 | return render('/pullrequests/pullrequest_show_my.html') |
|
239 | return base.render('/pullrequests/pullrequest_show_my.html') | |
240 |
|
240 | |||
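
``show_my`` splits the pull requests the user reviews into a finished list and
a to-do list by checking each review status against the two terminal states.
The partition is plain set membership; schematically (the status strings stand
in for the ``db.ChangesetStatus`` constants)::

    done_status = {'approved', 'rejected'}   # stand-ins for the real constants

    reviews = [('pr-1', 'approved'), ('pr-2', 'under_review'), ('pr-3', 'rejected')]

    done = [pr for pr, status in reviews if status in done_status]
    todo = [pr for pr, status in reviews if status not in done_status]

    assert done == ['pr-1', 'pr-3']
    assert todo == ['pr-2']
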
241 | @LoginRequired() |
|
241 | @LoginRequired() | |
242 | @HasRepoPermissionLevelDecorator('read') |
|
242 | @HasRepoPermissionLevelDecorator('read') | |
243 | def index(self): |
|
243 | def index(self): | |
244 | org_repo = c.db_repo |
|
244 | org_repo = c.db_repo | |
245 | org_scm_instance = org_repo.scm_instance |
|
245 | org_scm_instance = org_repo.scm_instance | |
246 | try: |
|
246 | try: | |
247 | org_scm_instance.get_changeset() |
|
247 | org_scm_instance.get_changeset() | |
248 | except EmptyRepositoryError as e: |
|
248 | except EmptyRepositoryError as e: | |
249 | webutils.flash(_('There are no changesets yet'), |
|
249 | webutils.flash(_('There are no changesets yet'), | |
250 | category='warning') |
|
250 | category='warning') | |
251 | raise HTTPFound(location=url('summary_home', repo_name=org_repo.repo_name)) |
|
251 | raise HTTPFound(location=url('summary_home', repo_name=org_repo.repo_name)) | |
252 |
|
252 | |||
253 | org_rev = request.GET.get('rev_end') |
|
253 | org_rev = request.GET.get('rev_end') | |
254 | # rev_start is not directly useful - its parent could however be used |
|
254 | # rev_start is not directly useful - its parent could however be used | |
255 | # as default for other and thus give a simple compare view |
|
255 | # as default for other and thus give a simple compare view | |
256 | rev_start = request.GET.get('rev_start') |
|
256 | rev_start = request.GET.get('rev_start') | |
257 | other_rev = None |
|
257 | other_rev = None | |
258 | if rev_start: |
|
258 | if rev_start: | |
259 | starters = org_repo.get_changeset(rev_start).parents |
|
259 | starters = org_repo.get_changeset(rev_start).parents | |
260 | if starters: |
|
260 | if starters: | |
261 | other_rev = starters[0].raw_id |
|
261 | other_rev = starters[0].raw_id | |
262 | else: |
|
262 | else: | |
263 | other_rev = org_repo.scm_instance.EMPTY_CHANGESET |
|
263 | other_rev = org_repo.scm_instance.EMPTY_CHANGESET | |
264 | branch = request.GET.get('branch') |
|
264 | branch = request.GET.get('branch') | |
265 |
|
265 | |||
266 | c.cs_repos = [(org_repo.repo_name, org_repo.repo_name)] |
|
266 | c.cs_repos = [(org_repo.repo_name, org_repo.repo_name)] | |
267 | c.default_cs_repo = org_repo.repo_name |
|
267 | c.default_cs_repo = org_repo.repo_name | |
268 | c.cs_refs, c.default_cs_ref = self._get_repo_refs(org_scm_instance, rev=org_rev, branch=branch) |
|
268 | c.cs_refs, c.default_cs_ref = self._get_repo_refs(org_scm_instance, rev=org_rev, branch=branch) | |
269 |
|
269 | |||
270 | default_cs_ref_type, default_cs_branch, default_cs_rev = c.default_cs_ref.split(':') |
|
270 | default_cs_ref_type, default_cs_branch, default_cs_rev = c.default_cs_ref.split(':') | |
271 | if default_cs_ref_type != 'branch': |
|
271 | if default_cs_ref_type != 'branch': | |
272 | default_cs_branch = org_repo.get_changeset(default_cs_rev).branch |
|
272 | default_cs_branch = org_repo.get_changeset(default_cs_rev).branch | |
273 |
|
273 | |||
274 | # add org repo to other so we can open pull request against peer branches on itself |
|
274 | # add org repo to other so we can open pull request against peer branches on itself | |
275 | c.a_repos = [(org_repo.repo_name, '%s (self)' % org_repo.repo_name)] |
|
275 | c.a_repos = [(org_repo.repo_name, '%s (self)' % org_repo.repo_name)] | |
276 |
|
276 | |||
277 | if org_repo.parent: |
|
277 | if org_repo.parent: | |
278 | # add parent of this fork also and select it. |
|
278 | # add parent of this fork also and select it. | |
279 | # use the same branch on destination as on source, if available. |
|
279 | # use the same branch on destination as on source, if available. | |
280 | c.a_repos.append((org_repo.parent.repo_name, '%s (parent)' % org_repo.parent.repo_name)) |
|
280 | c.a_repos.append((org_repo.parent.repo_name, '%s (parent)' % org_repo.parent.repo_name)) | |
281 | c.a_repo = org_repo.parent |
|
281 | c.a_repo = org_repo.parent | |
282 | c.a_refs, c.default_a_ref = self._get_repo_refs( |
|
282 | c.a_refs, c.default_a_ref = self._get_repo_refs( | |
283 | org_repo.parent.scm_instance, branch=default_cs_branch, rev=other_rev) |
|
283 | org_repo.parent.scm_instance, branch=default_cs_branch, rev=other_rev) | |
284 |
|
284 | |||
285 | else: |
|
285 | else: | |
286 | c.a_repo = org_repo |
|
286 | c.a_repo = org_repo | |
287 | c.a_refs, c.default_a_ref = self._get_repo_refs(org_scm_instance, rev=other_rev) |
|
287 | c.a_refs, c.default_a_ref = self._get_repo_refs(org_scm_instance, rev=other_rev) | |
288 |
|
288 | |||
289 | # gather forks and add to this list ... even though it is rare to |
|
289 | # gather forks and add to this list ... even though it is rare to | |
290 | # request forks to pull from their parent |
|
290 | # request forks to pull from their parent | |
291 | for fork in org_repo.forks: |
|
291 | for fork in org_repo.forks: | |
292 | c.a_repos.append((fork.repo_name, fork.repo_name)) |
|
292 | c.a_repos.append((fork.repo_name, fork.repo_name)) | |
293 |
|
293 | |||
294 | return render('/pullrequests/pullrequest.html') |
|
294 | return base.render('/pullrequests/pullrequest.html') | |
295 |
|
295 | |||
296 | @LoginRequired() |
|
296 | @LoginRequired() | |
297 | @HasRepoPermissionLevelDecorator('read') |
|
297 | @HasRepoPermissionLevelDecorator('read') | |
298 | @jsonify |
|
298 | @base.jsonify | |
299 | def repo_info(self, repo_name): |
|
299 | def repo_info(self, repo_name): | |
300 | repo = c.db_repo |
|
300 | repo = c.db_repo | |
301 | refs, selected_ref = self._get_repo_refs(repo.scm_instance) |
|
301 | refs, selected_ref = self._get_repo_refs(repo.scm_instance) | |
302 | return { |
|
302 | return { | |
303 | 'description': repo.description.split('\n', 1)[0], |
|
303 | 'description': repo.description.split('\n', 1)[0], | |
304 | 'selected_ref': selected_ref, |
|
304 | 'selected_ref': selected_ref, | |
305 | 'refs': refs, |
|
305 | 'refs': refs, | |
306 | } |
|
306 | } | |
307 |
|
307 | |||
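
``repo_info`` is a small JSON endpoint: ``@base.jsonify`` serializes the dict
returned by the action, presumably so the pull request form can repopulate its
ref selectors when another repository is picked. The general shape of such a
decorator, as a standalone sketch (not Kallithea's actual implementation,
which among other things also has to set the JSON content type on the
response)::

    import functools
    import json

    def jsonify_sketch(func):
        """Serialize whatever dict the wrapped controller action returns."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return json.dumps(func(*args, **kwargs), default=str)
        return wrapper

    @jsonify_sketch
    def repo_info_demo():
        return {
            'description': 'first line of the repository description',
            'selected_ref': 'branch:default:deadbeefcafe',
            'refs': [],
        }

    print(repo_info_demo())
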
308 | @LoginRequired() |
|
308 | @LoginRequired() | |
309 | @HasRepoPermissionLevelDecorator('read') |
|
309 | @HasRepoPermissionLevelDecorator('read') | |
310 | def create(self, repo_name): |
|
310 | def create(self, repo_name): | |
311 | repo = c.db_repo |
|
311 | repo = c.db_repo | |
312 | try: |
|
312 | try: | |
313 | _form = PullRequestForm(repo.repo_id)().to_python(request.POST) |
|
313 | _form = PullRequestForm(repo.repo_id)().to_python(request.POST) | |
314 | except formencode.Invalid as errors: |
|
314 | except formencode.Invalid as errors: | |
315 | log.error(traceback.format_exc()) |
|
315 | log.error(traceback.format_exc()) | |
316 | log.error(str(errors)) |
|
316 | log.error(str(errors)) | |
317 | msg = _('Error creating pull request: %s') % errors.msg |
|
317 | msg = _('Error creating pull request: %s') % errors.msg | |
318 | webutils.flash(msg, 'error') |
|
318 | webutils.flash(msg, 'error') | |
319 | raise HTTPBadRequest |
|
319 | raise HTTPBadRequest | |
320 |
|
320 | |||
321 | # heads up: org and other might seem backward here ... |
|
321 | # heads up: org and other might seem backward here ... | |
322 | org_ref = _form['org_ref'] # will have merge_rev as rev but symbolic name |
|
322 | org_ref = _form['org_ref'] # will have merge_rev as rev but symbolic name | |
323 | org_repo = db.Repository.guess_instance(_form['org_repo']) |
|
323 | org_repo = db.Repository.guess_instance(_form['org_repo']) | |
324 |
|
324 | |||
325 | other_ref = _form['other_ref'] # will have symbolic name and head revision |
|
325 | other_ref = _form['other_ref'] # will have symbolic name and head revision | |
326 | other_repo = db.Repository.guess_instance(_form['other_repo']) |
|
326 | other_repo = db.Repository.guess_instance(_form['other_repo']) | |
327 |
|
327 | |||
328 | reviewers = [] |
|
328 | reviewers = [] | |
329 |
|
329 | |||
330 | title = _form['pullrequest_title'] |
|
330 | title = _form['pullrequest_title'] | |
331 | description = _form['pullrequest_desc'].strip() |
|
331 | description = _form['pullrequest_desc'].strip() | |
332 | owner = db.User.get(request.authuser.user_id) |
|
332 | owner = db.User.get(request.authuser.user_id) | |
333 |
|
333 | |||
334 | try: |
|
334 | try: | |
335 | cmd = CreatePullRequestAction(org_repo, other_repo, org_ref, other_ref, title, description, owner, reviewers) |
|
335 | cmd = CreatePullRequestAction(org_repo, other_repo, org_ref, other_ref, title, description, owner, reviewers) | |
336 | except CreatePullRequestAction.ValidationError as e: |
|
336 | except CreatePullRequestAction.ValidationError as e: | |
337 | webutils.flash(e, category='error', logf=log.error) |
|
337 | webutils.flash(e, category='error', logf=log.error) | |
338 | raise HTTPNotFound |
|
338 | raise HTTPNotFound | |
339 |
|
339 | |||
340 | try: |
|
340 | try: | |
341 | pull_request = cmd.execute() |
|
341 | pull_request = cmd.execute() | |
342 | meta.Session().commit() |
|
342 | meta.Session().commit() | |
343 | except Exception: |
|
343 | except Exception: | |
344 | webutils.flash(_('Error occurred while creating pull request'), |
|
344 | webutils.flash(_('Error occurred while creating pull request'), | |
345 | category='error') |
|
345 | category='error') | |
346 | log.error(traceback.format_exc()) |
|
346 | log.error(traceback.format_exc()) | |
347 | raise HTTPFound(location=url('pullrequest_home', repo_name=repo_name)) |
|
347 | raise HTTPFound(location=url('pullrequest_home', repo_name=repo_name)) | |
348 |
|
348 | |||
349 | webutils.flash(_('Successfully opened new pull request'), |
|
349 | webutils.flash(_('Successfully opened new pull request'), | |
350 | category='success') |
|
350 | category='success') | |
351 | raise HTTPFound(location=pull_request.url()) |
|
351 | raise HTTPFound(location=pull_request.url()) | |
352 |
|
352 | |||
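
``create`` delegates the work to ``CreatePullRequestAction``: validation
happens while the action object is constructed, so a ``ValidationError`` can
be turned into a flash message before anything touches the database, and
``execute()`` performs the creation while the caller decides when to call
``Session().commit()``. A schematic, made-up example of that split::

    class ValidationError(Exception):
        pass

    class CreateThingAction(object):
        """Made-up stand-in for an action object: validate early, execute later."""

        def __init__(self, title, owner):
            if not title.strip():
                raise ValidationError('Title must not be empty')
            self.title = title.strip()
            self.owner = owner

        def execute(self):
            # a real action would create and return the persistent object here;
            # committing the session is left to the caller
            return {'title': self.title, 'owner': self.owner}

    try:
        cmd = CreateThingAction('A pull request', 'alice')
    except ValidationError as e:
        print('refused: %s' % e)
    else:
        print(cmd.execute())
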
353 | def create_new_iteration(self, old_pull_request, new_rev, title, description, reviewers): |
|
353 | def create_new_iteration(self, old_pull_request, new_rev, title, description, reviewers): | |
354 | owner = db.User.get(request.authuser.user_id) |
|
354 | owner = db.User.get(request.authuser.user_id) | |
355 | new_org_rev = self._get_ref_rev(old_pull_request.org_repo, 'rev', new_rev) |
|
355 | new_org_rev = self._get_ref_rev(old_pull_request.org_repo, 'rev', new_rev) | |
356 | new_other_rev = self._get_ref_rev(old_pull_request.other_repo, old_pull_request.other_ref_parts[0], old_pull_request.other_ref_parts[1]) |
|
356 | new_other_rev = self._get_ref_rev(old_pull_request.other_repo, old_pull_request.other_ref_parts[0], old_pull_request.other_ref_parts[1]) | |
357 | try: |
|
357 | try: | |
358 | cmd = CreatePullRequestIterationAction(old_pull_request, new_org_rev, new_other_rev, title, description, owner, reviewers) |
|
358 | cmd = CreatePullRequestIterationAction(old_pull_request, new_org_rev, new_other_rev, title, description, owner, reviewers) | |
359 | except CreatePullRequestAction.ValidationError as e: |
|
359 | except CreatePullRequestAction.ValidationError as e: | |
360 | webutils.flash(e, category='error', logf=log.error) |
|
360 | webutils.flash(e, category='error', logf=log.error) | |
361 | raise HTTPNotFound |
|
361 | raise HTTPNotFound | |
362 |
|
362 | |||
363 | try: |
|
363 | try: | |
364 | pull_request = cmd.execute() |
|
364 | pull_request = cmd.execute() | |
365 | meta.Session().commit() |
|
365 | meta.Session().commit() | |
366 | except Exception: |
|
366 | except Exception: | |
367 | webutils.flash(_('Error occurred while creating pull request'), |
|
367 | webutils.flash(_('Error occurred while creating pull request'), | |
368 | category='error') |
|
368 | category='error') | |
369 | log.error(traceback.format_exc()) |
|
369 | log.error(traceback.format_exc()) | |
370 | raise HTTPFound(location=old_pull_request.url()) |
|
370 | raise HTTPFound(location=old_pull_request.url()) | |
371 |
|
371 | |||
372 | webutils.flash(_('New pull request iteration created'), |
|
372 | webutils.flash(_('New pull request iteration created'), | |
373 | category='success') |
|
373 | category='success') | |
374 | raise HTTPFound(location=pull_request.url()) |
|
374 | raise HTTPFound(location=pull_request.url()) | |
375 |
|
375 | |||
376 | # pullrequest_post for PR editing |
|
376 | # pullrequest_post for PR editing | |
377 | @LoginRequired() |
|
377 | @LoginRequired() | |
378 | @HasRepoPermissionLevelDecorator('read') |
|
378 | @HasRepoPermissionLevelDecorator('read') | |
379 | def post(self, repo_name, pull_request_id): |
|
379 | def post(self, repo_name, pull_request_id): | |
380 | pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
380 | pull_request = db.PullRequest.get_or_404(pull_request_id) | |
381 | if pull_request.is_closed(): |
|
381 | if pull_request.is_closed(): | |
382 | raise HTTPForbidden() |
|
382 | raise HTTPForbidden() | |
383 | assert pull_request.other_repo.repo_name == repo_name |
|
383 | assert pull_request.other_repo.repo_name == repo_name | |
384 | # only owner or admin can update it |
|
384 | # only owner or admin can update it | |
385 | owner = pull_request.owner_id == request.authuser.user_id |
|
385 | owner = pull_request.owner_id == request.authuser.user_id | |
386 | repo_admin = auth.HasRepoPermissionLevel('admin')(c.repo_name) |
|
386 | repo_admin = auth.HasRepoPermissionLevel('admin')(c.repo_name) | |
387 | if not (auth.HasPermissionAny('hg.admin')() or repo_admin or owner): |
|
387 | if not (auth.HasPermissionAny('hg.admin')() or repo_admin or owner): | |
388 | raise HTTPForbidden() |
|
388 | raise HTTPForbidden() | |
389 |
|
389 | |||
390 | _form = PullRequestPostForm()().to_python(request.POST) |
|
390 | _form = PullRequestPostForm()().to_python(request.POST) | |
391 |
|
391 | |||
392 | cur_reviewers = set(pull_request.get_reviewer_users()) |
|
392 | cur_reviewers = set(pull_request.get_reviewer_users()) | |
393 | new_reviewers = set(_get_reviewer(s) for s in _form['review_members']) |
|
393 | new_reviewers = set(_get_reviewer(s) for s in _form['review_members']) | |
394 | old_reviewers = set(_get_reviewer(s) for s in _form['org_review_members']) |
|
394 | old_reviewers = set(_get_reviewer(s) for s in _form['org_review_members']) | |
395 |
|
395 | |||
396 | other_added = cur_reviewers - old_reviewers |
|
396 | other_added = cur_reviewers - old_reviewers | |
397 | other_removed = old_reviewers - cur_reviewers |
|
397 | other_removed = old_reviewers - cur_reviewers | |
398 |
|
398 | |||
399 | if other_added: |
|
399 | if other_added: | |
400 | webutils.flash(_('Meanwhile, the following reviewers have been added: %s') % |
|
400 | webutils.flash(_('Meanwhile, the following reviewers have been added: %s') % | |
401 | (', '.join(u.username for u in other_added)), |
|
401 | (', '.join(u.username for u in other_added)), | |
402 | category='warning') |
|
402 | category='warning') | |
403 | if other_removed: |
|
403 | if other_removed: | |
404 | webutils.flash(_('Meanwhile, the following reviewers have been removed: %s') % |
|
404 | webutils.flash(_('Meanwhile, the following reviewers have been removed: %s') % | |
405 | (', '.join(u.username for u in other_removed)), |
|
405 | (', '.join(u.username for u in other_removed)), | |
406 | category='warning') |
|
406 | category='warning') | |
407 |
|
407 | |||
408 | if _form['updaterev']: |
|
408 | if _form['updaterev']: | |
409 | return self.create_new_iteration(pull_request, |
|
409 | return self.create_new_iteration(pull_request, | |
410 | _form['updaterev'], |
|
410 | _form['updaterev'], | |
411 | _form['pullrequest_title'], |
|
411 | _form['pullrequest_title'], | |
412 | _form['pullrequest_desc'], |
|
412 | _form['pullrequest_desc'], | |
413 | new_reviewers) |
|
413 | new_reviewers) | |
414 |
|
414 | |||
415 | added_reviewers = new_reviewers - old_reviewers - cur_reviewers |
|
415 | added_reviewers = new_reviewers - old_reviewers - cur_reviewers | |
416 | removed_reviewers = (old_reviewers - new_reviewers) & cur_reviewers |
|
416 | removed_reviewers = (old_reviewers - new_reviewers) & cur_reviewers | |
417 |
|
417 | |||
418 | old_description = pull_request.description |
|
418 | old_description = pull_request.description | |
419 | pull_request.title = _form['pullrequest_title'] |
|
419 | pull_request.title = _form['pullrequest_title'] | |
420 | pull_request.description = _form['pullrequest_desc'].strip() or _('No description') |
|
420 | pull_request.description = _form['pullrequest_desc'].strip() or _('No description') | |
421 | pull_request.owner = db.User.get_by_username(_form['owner']) |
|
421 | pull_request.owner = db.User.get_by_username(_form['owner']) | |
422 | user = db.User.get(request.authuser.user_id) |
|
422 | user = db.User.get(request.authuser.user_id) | |
423 |
|
423 | |||
424 | PullRequestModel().mention_from_description(user, pull_request, old_description) |
|
424 | PullRequestModel().mention_from_description(user, pull_request, old_description) | |
425 | PullRequestModel().add_reviewers(user, pull_request, added_reviewers) |
|
425 | PullRequestModel().add_reviewers(user, pull_request, added_reviewers) | |
426 | PullRequestModel().remove_reviewers(user, pull_request, removed_reviewers) |
|
426 | PullRequestModel().remove_reviewers(user, pull_request, removed_reviewers) | |
427 |
|
427 | |||
428 | meta.Session().commit() |
|
428 | meta.Session().commit() | |
429 | webutils.flash(_('Pull request updated'), category='success') |
|
429 | webutils.flash(_('Pull request updated'), category='success') | |
430 |
|
430 | |||
431 | raise HTTPFound(location=pull_request.url()) |
|
431 | raise HTTPFound(location=pull_request.url()) | |
432 |
|
432 | |||
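
The reviewer bookkeeping in ``post`` above is plain set arithmetic over three
snapshots: what is in the database now (``cur``), what the form was generated
from (``old``), and what was just submitted (``new``). A worked example with
strings in place of user objects::

    cur = {'alice', 'bob', 'carol'}   # current reviewers; carol was added meanwhile
    old = {'alice', 'bob', 'dave'}    # reviewers when the form was rendered; dave since removed
    new = {'alice', 'eve'}            # this submission drops bob and adds eve

    other_added = cur - old           # {'carol'}: added by someone else -> warn
    other_removed = old - cur         # {'dave'}:  removed by someone else -> warn
    added = new - old - cur           # {'eve'}:   actually add
    removed = (old - new) & cur       # {'bob'}:   actually remove

    assert (other_added, other_removed) == ({'carol'}, {'dave'})
    assert (added, removed) == ({'eve'}, {'bob'})
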
433 | @LoginRequired() |
|
433 | @LoginRequired() | |
434 | @HasRepoPermissionLevelDecorator('read') |
|
434 | @HasRepoPermissionLevelDecorator('read') | |
435 | @jsonify |
|
435 | @base.jsonify | |
436 | def delete(self, repo_name, pull_request_id): |
|
436 | def delete(self, repo_name, pull_request_id): | |
437 | pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
437 | pull_request = db.PullRequest.get_or_404(pull_request_id) | |
438 | # only owner can delete it ! |
|
438 | # only owner can delete it ! | |
439 | if pull_request.owner_id == request.authuser.user_id: |
|
439 | if pull_request.owner_id == request.authuser.user_id: | |
440 | PullRequestModel().delete(pull_request) |
|
440 | PullRequestModel().delete(pull_request) | |
441 | meta.Session().commit() |
|
441 | meta.Session().commit() | |
442 | webutils.flash(_('Successfully deleted pull request'), |
|
442 | webutils.flash(_('Successfully deleted pull request'), | |
443 | category='success') |
|
443 | category='success') | |
444 | raise HTTPFound(location=url('my_pullrequests')) |
|
444 | raise HTTPFound(location=url('my_pullrequests')) | |
445 | raise HTTPForbidden() |
|
445 | raise HTTPForbidden() | |
446 |
|
446 | |||
447 | @LoginRequired(allow_default_user=True) |
|
447 | @LoginRequired(allow_default_user=True) | |
448 | @HasRepoPermissionLevelDecorator('read') |
|
448 | @HasRepoPermissionLevelDecorator('read') | |
449 | def show(self, repo_name, pull_request_id, extra=None): |
|
449 | def show(self, repo_name, pull_request_id, extra=None): | |
450 | c.pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
450 | c.pull_request = db.PullRequest.get_or_404(pull_request_id) | |
451 | c.allowed_to_change_status = self._is_allowed_to_change_status(c.pull_request) |
|
451 | c.allowed_to_change_status = self._is_allowed_to_change_status(c.pull_request) | |
452 | cc_model = ChangesetCommentsModel() |
|
452 | cc_model = ChangesetCommentsModel() | |
453 | cs_model = ChangesetStatusModel() |
|
453 | cs_model = ChangesetStatusModel() | |
454 |
|
454 | |||
455 | # repo_name is the repo the pull request was opened against, |
|
455 | # repo_name is the repo the pull request was opened against, | |
456 | # i.e. other_repo must match |
|
456 | # i.e. other_repo must match | |
457 | if repo_name != c.pull_request.other_repo.repo_name: |
|
457 | if repo_name != c.pull_request.other_repo.repo_name: | |
458 | raise HTTPNotFound |
|
458 | raise HTTPNotFound | |
459 |
|
459 | |||
460 | # load compare data into template context |
|
460 | # load compare data into template context | |
461 | c.cs_repo = c.pull_request.org_repo |
|
461 | c.cs_repo = c.pull_request.org_repo | |
462 | (c.cs_ref_type, |
|
462 | (c.cs_ref_type, | |
463 | c.cs_ref_name, |
|
463 | c.cs_ref_name, | |
464 | c.cs_rev) = c.pull_request.org_ref.split(':') |
|
464 | c.cs_rev) = c.pull_request.org_ref.split(':') | |
465 |
|
465 | |||
466 | c.a_repo = c.pull_request.other_repo |
|
466 | c.a_repo = c.pull_request.other_repo | |
467 | (c.a_ref_type, |
|
467 | (c.a_ref_type, | |
468 | c.a_ref_name, |
|
468 | c.a_ref_name, | |
469 | c.a_rev) = c.pull_request.other_ref.split(':') # a_rev is ancestor |
|
469 | c.a_rev) = c.pull_request.other_ref.split(':') # a_rev is ancestor | |
470 |
|
470 | |||
471 | org_scm_instance = c.cs_repo.scm_instance # property with expensive cache invalidation check!!! |
|
471 | org_scm_instance = c.cs_repo.scm_instance # property with expensive cache invalidation check!!! | |
472 | c.cs_ranges = [] |
|
472 | c.cs_ranges = [] | |
473 | for x in c.pull_request.revisions: |
|
473 | for x in c.pull_request.revisions: | |
474 | try: |
|
474 | try: | |
475 | c.cs_ranges.append(org_scm_instance.get_changeset(x)) |
|
475 | c.cs_ranges.append(org_scm_instance.get_changeset(x)) | |
476 | except ChangesetDoesNotExistError: |
|
476 | except ChangesetDoesNotExistError: | |
477 | c.cs_ranges = [] |
|
477 | c.cs_ranges = [] | |
478 | webutils.flash(_('Revision %s not found in %s') % (x, c.cs_repo.repo_name), |
|
478 | webutils.flash(_('Revision %s not found in %s') % (x, c.cs_repo.repo_name), | |
479 | 'error') |
|
479 | 'error') | |
480 | break |
|
480 | break | |
481 | c.cs_ranges_org = None # not stored and not important and moving target - could be calculated ... |
|
481 | c.cs_ranges_org = None # not stored and not important and moving target - could be calculated ... | |
482 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] |
|
482 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] | |
483 | c.jsdata = graph_data(org_scm_instance, revs) |
|
483 | c.jsdata = graph_data(org_scm_instance, revs) | |
484 |
|
484 | |||
485 | c.is_range = False |
|
485 | c.is_range = False | |
486 | try: |
|
486 | try: | |
487 | if c.a_ref_type == 'rev': # this looks like a free range where target is ancestor |
|
487 | if c.a_ref_type == 'rev': # this looks like a free range where target is ancestor | |
488 | cs_a = org_scm_instance.get_changeset(c.a_rev) |
|
488 | cs_a = org_scm_instance.get_changeset(c.a_rev) | |
489 | root_parents = c.cs_ranges[0].parents |
|
489 | root_parents = c.cs_ranges[0].parents | |
490 | c.is_range = cs_a in root_parents |
|
490 | c.is_range = cs_a in root_parents | |
491 | #c.merge_root = len(root_parents) > 1 # a range starting with a merge might deserve a warning |
|
491 | #c.merge_root = len(root_parents) > 1 # a range starting with a merge might deserve a warning | |
492 | except ChangesetDoesNotExistError: # probably because c.a_rev not found |
|
492 | except ChangesetDoesNotExistError: # probably because c.a_rev not found | |
493 | pass |
|
493 | pass | |
494 | except IndexError: # probably because c.cs_ranges is empty, probably because revisions are missing |
|
494 | except IndexError: # probably because c.cs_ranges is empty, probably because revisions are missing | |
495 | pass |
|
495 | pass | |
496 |
|
496 | |||
497 | avail_revs = set() |
|
497 | avail_revs = set() | |
498 | avail_show = [] |
|
498 | avail_show = [] | |
499 | c.cs_branch_name = c.cs_ref_name |
|
499 | c.cs_branch_name = c.cs_ref_name | |
500 | c.a_branch_name = None |
|
500 | c.a_branch_name = None | |
501 | other_scm_instance = c.a_repo.scm_instance |
|
501 | other_scm_instance = c.a_repo.scm_instance | |
502 | c.update_msg = "" |
|
502 | c.update_msg = "" | |
503 | c.update_msg_other = "" |
|
503 | c.update_msg_other = "" | |
504 | try: |
|
504 | try: | |
505 | if not c.cs_ranges: |
|
505 | if not c.cs_ranges: | |
506 | c.update_msg = _('Error: changesets not found when displaying pull request from %s.') % c.cs_rev |
|
506 | c.update_msg = _('Error: changesets not found when displaying pull request from %s.') % c.cs_rev | |
507 | elif org_scm_instance.alias == 'hg' and c.a_ref_name != 'ancestor': |
|
507 | elif org_scm_instance.alias == 'hg' and c.a_ref_name != 'ancestor': | |
508 | if c.cs_ref_type != 'branch': |
|
508 | if c.cs_ref_type != 'branch': | |
509 | c.cs_branch_name = org_scm_instance.get_changeset(c.cs_ref_name).branch # use ref_type ? |
|
509 | c.cs_branch_name = org_scm_instance.get_changeset(c.cs_ref_name).branch # use ref_type ? | |
510 | c.a_branch_name = c.a_ref_name |
|
510 | c.a_branch_name = c.a_ref_name | |
511 | if c.a_ref_type != 'branch': |
|
511 | if c.a_ref_type != 'branch': | |
512 | try: |
|
512 | try: | |
513 | c.a_branch_name = other_scm_instance.get_changeset(c.a_ref_name).branch # use ref_type ? |
|
513 | c.a_branch_name = other_scm_instance.get_changeset(c.a_ref_name).branch # use ref_type ? | |
514 | except EmptyRepositoryError: |
|
514 | except EmptyRepositoryError: | |
515 | c.a_branch_name = 'null' # not a branch name ... but close enough |
|
515 | c.a_branch_name = 'null' # not a branch name ... but close enough | |
516 | # candidates: descendants of old head that are on the right branch |
|
516 | # candidates: descendants of old head that are on the right branch | |
517 | # and are not the old head itself ... |
|
517 | # and are not the old head itself ... | |
518 | # and nothing at all if old head is a descendant of target ref name |
|
518 | # and nothing at all if old head is a descendant of target ref name | |
519 | if not c.is_range and other_scm_instance._repo.revs('present(%s)::&%s', c.cs_ranges[-1].raw_id, c.a_branch_name): |
|
519 | if not c.is_range and other_scm_instance._repo.revs('present(%s)::&%s', c.cs_ranges[-1].raw_id, c.a_branch_name): | |
520 | c.update_msg = _('This pull request has already been merged to %s.') % c.a_branch_name |
|
520 | c.update_msg = _('This pull request has already been merged to %s.') % c.a_branch_name | |
521 | elif c.pull_request.is_closed(): |
|
521 | elif c.pull_request.is_closed(): | |
522 | c.update_msg = _('This pull request has been closed and can not be updated.') |
|
522 | c.update_msg = _('This pull request has been closed and can not be updated.') | |
523 | else: # look for descendants of PR head on source branch in org repo |
|
523 | else: # look for descendants of PR head on source branch in org repo | |
524 | avail_revs = org_scm_instance._repo.revs('%s:: & branch(%s)', |
|
524 | avail_revs = org_scm_instance._repo.revs('%s:: & branch(%s)', | |
525 | revs[0], c.cs_branch_name) |
|
525 | revs[0], c.cs_branch_name) | |
526 | if len(avail_revs) > 1: # more than just revs[0] |
|
526 | if len(avail_revs) > 1: # more than just revs[0] | |
527 | # also show changesets that are not descendants but would be merged in |
|
527 | # also show changesets that are not descendants but would be merged in | |
528 | targethead = other_scm_instance.get_changeset(c.a_branch_name).raw_id |
|
528 | targethead = other_scm_instance.get_changeset(c.a_branch_name).raw_id | |
529 | if org_scm_instance.path != other_scm_instance.path: |
|
529 | if org_scm_instance.path != other_scm_instance.path: | |
530 | # Note: org_scm_instance.path must come first so all |
|
530 | # Note: org_scm_instance.path must come first so all | |
531 | # valid revision numbers are 100% org_scm compatible |
|
531 | # valid revision numbers are 100% org_scm compatible | |
532 | # - both for avail_revs and for revset results |
|
532 | # - both for avail_revs and for revset results | |
533 | hgrepo = mercurial.unionrepo.makeunionrepository(org_scm_instance.baseui, |
|
533 | hgrepo = mercurial.unionrepo.makeunionrepository(org_scm_instance.baseui, | |
534 | safe_bytes(org_scm_instance.path), |
|
534 | safe_bytes(org_scm_instance.path), | |
535 | safe_bytes(other_scm_instance.path)) |
|
535 | safe_bytes(other_scm_instance.path)) | |
536 | else: |
|
536 | else: | |
537 | hgrepo = org_scm_instance._repo |
|
537 | hgrepo = org_scm_instance._repo | |
538 | show = set(hgrepo.revs('::%ld & !::parents(%s) & !::%s', |
|
538 | show = set(hgrepo.revs('::%ld & !::parents(%s) & !::%s', | |
539 | avail_revs, revs[0], targethead)) |
|
539 | avail_revs, revs[0], targethead)) | |
540 | if show: |
|
540 | if show: | |
541 | c.update_msg = _('The following additional changes are available on %s:') % c.cs_branch_name |
|
541 | c.update_msg = _('The following additional changes are available on %s:') % c.cs_branch_name | |
542 | else: |
|
542 | else: | |
543 | c.update_msg = _('No additional changesets found for iterating on this pull request.') |
|
543 | c.update_msg = _('No additional changesets found for iterating on this pull request.') | |
544 | else: |
|
544 | else: | |
545 | show = set() |
|
545 | show = set() | |
546 | avail_revs = set() # drop revs[0] |
|
546 | avail_revs = set() # drop revs[0] | |
547 | c.update_msg = _('No additional changesets found for iterating on this pull request.') |
|
547 | c.update_msg = _('No additional changesets found for iterating on this pull request.') | |
548 |
|
548 | |||
549 | # TODO: handle branch heads that are not tip-most |
|
549 | # TODO: handle branch heads that are not tip-most | |
550 | brevs = org_scm_instance._repo.revs('%s - %ld - %s', c.cs_branch_name, avail_revs, revs[0]) |
|
550 | brevs = org_scm_instance._repo.revs('%s - %ld - %s', c.cs_branch_name, avail_revs, revs[0]) | |
551 | if brevs: |
|
551 | if brevs: | |
552 | # also show changesets that are on the branch but are neither ancestors nor descendants |
|
552 | # also show changesets that are on the branch but are neither ancestors nor descendants | |
553 | show.update(org_scm_instance._repo.revs('::%ld - ::%ld - ::%s', brevs, avail_revs, c.a_branch_name)) |
|
553 | show.update(org_scm_instance._repo.revs('::%ld - ::%ld - ::%s', brevs, avail_revs, c.a_branch_name)) | |
554 | show.add(revs[0]) # make sure graph shows this so we can see how they relate |
|
554 | show.add(revs[0]) # make sure graph shows this so we can see how they relate | |
555 | c.update_msg_other = _('Note: Branch %s has another head: %s.') % (c.cs_branch_name, |
|
555 | c.update_msg_other = _('Note: Branch %s has another head: %s.') % (c.cs_branch_name, | |
556 | org_scm_instance.get_changeset(max(brevs)).short_id) |
|
556 | org_scm_instance.get_changeset(max(brevs)).short_id) | |
557 |
|
557 | |||
558 | avail_show = sorted(show, reverse=True) |
|
558 | avail_show = sorted(show, reverse=True) | |
559 |
|
559 | |||
560 | elif org_scm_instance.alias == 'git': |
|
560 | elif org_scm_instance.alias == 'git': | |
561 | c.cs_repo.scm_instance.get_changeset(c.cs_rev) # check it exists - raise ChangesetDoesNotExistError if not |
|
561 | c.cs_repo.scm_instance.get_changeset(c.cs_rev) # check it exists - raise ChangesetDoesNotExistError if not | |
562 | c.update_msg = _("Git pull requests don't support iterating yet.") |
|
562 | c.update_msg = _("Git pull requests don't support iterating yet.") | |
563 | except ChangesetDoesNotExistError: |
|
563 | except ChangesetDoesNotExistError: | |
564 | c.update_msg = _('Error: some changesets not found when displaying pull request from %s.') % c.cs_rev |
|
564 | c.update_msg = _('Error: some changesets not found when displaying pull request from %s.') % c.cs_rev | |
565 |
|
565 | |||
566 | c.avail_revs = avail_revs |
|
566 | c.avail_revs = avail_revs | |
567 | c.avail_cs = [org_scm_instance.get_changeset(r) for r in avail_show] |
|
567 | c.avail_cs = [org_scm_instance.get_changeset(r) for r in avail_show] | |
568 | c.avail_jsdata = graph_data(org_scm_instance, avail_show) |
|
568 | c.avail_jsdata = graph_data(org_scm_instance, avail_show) | |
569 |
|
569 | |||
570 | raw_ids = [x.raw_id for x in c.cs_ranges] |
|
570 | raw_ids = [x.raw_id for x in c.cs_ranges] | |
571 | c.cs_comments = c.cs_repo.get_comments(raw_ids) |
|
571 | c.cs_comments = c.cs_repo.get_comments(raw_ids) | |
572 | c.cs_statuses = c.cs_repo.statuses(raw_ids) |
|
572 | c.cs_statuses = c.cs_repo.statuses(raw_ids) | |
573 |
|
573 | |||
574 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
574 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) | |
575 | diff_context_size = h.get_diff_context_size(request.GET) |
|
575 | diff_context_size = h.get_diff_context_size(request.GET) | |
576 | fulldiff = request.GET.get('fulldiff') |
|
576 | fulldiff = request.GET.get('fulldiff') | |
577 | diff_limit = None if fulldiff else self.cut_off_limit |
|
577 | diff_limit = None if fulldiff else self.cut_off_limit | |
578 |
|
578 | |||
579 | # we swap org/other ref since we run a simple diff on one repo |
|
579 | # we swap org/other ref since we run a simple diff on one repo | |
580 | log.debug('running diff between %s and %s in %s', |
|
580 | log.debug('running diff between %s and %s in %s', | |
581 | c.a_rev, c.cs_rev, org_scm_instance.path) |
|
581 | c.a_rev, c.cs_rev, org_scm_instance.path) | |
582 | try: |
|
582 | try: | |
583 | raw_diff = diffs.get_diff(org_scm_instance, rev1=c.a_rev, rev2=c.cs_rev, |
|
583 | raw_diff = diffs.get_diff(org_scm_instance, rev1=c.a_rev, rev2=c.cs_rev, | |
584 | ignore_whitespace=ignore_whitespace_diff, context=diff_context_size) |
|
584 | ignore_whitespace=ignore_whitespace_diff, context=diff_context_size) | |
585 | except ChangesetDoesNotExistError: |
|
585 | except ChangesetDoesNotExistError: | |
586 | raw_diff = safe_bytes(_("The diff can't be shown - the PR revisions could not be found.")) |
|
586 | raw_diff = safe_bytes(_("The diff can't be shown - the PR revisions could not be found.")) | |
587 | diff_processor = diffs.DiffProcessor(raw_diff, diff_limit=diff_limit) |
|
587 | diff_processor = diffs.DiffProcessor(raw_diff, diff_limit=diff_limit) | |
588 | c.limited_diff = diff_processor.limited_diff |
|
588 | c.limited_diff = diff_processor.limited_diff | |
589 | c.file_diff_data = [] |
|
589 | c.file_diff_data = [] | |
590 | c.lines_added = 0 |
|
590 | c.lines_added = 0 | |
591 | c.lines_deleted = 0 |
|
591 | c.lines_deleted = 0 | |
592 |
|
592 | |||
593 | for f in diff_processor.parsed: |
|
593 | for f in diff_processor.parsed: | |
594 | st = f['stats'] |
|
594 | st = f['stats'] | |
595 | c.lines_added += st['added'] |
|
595 | c.lines_added += st['added'] | |
596 | c.lines_deleted += st['deleted'] |
|
596 | c.lines_deleted += st['deleted'] | |
597 | filename = f['filename'] |
|
597 | filename = f['filename'] | |
598 | fid = h.FID('', filename) |
|
598 | fid = h.FID('', filename) | |
599 | html_diff = diffs.as_html(parsed_lines=[f]) |
|
599 | html_diff = diffs.as_html(parsed_lines=[f]) | |
600 | c.file_diff_data.append((fid, None, f['operation'], f['old_filename'], filename, html_diff, st)) |
|
600 | c.file_diff_data.append((fid, None, f['operation'], f['old_filename'], filename, html_diff, st)) | |
601 |
|
601 | |||
602 | # inline comments |
|
602 | # inline comments | |
603 | c.inline_cnt = 0 |
|
603 | c.inline_cnt = 0 | |
604 | c.inline_comments = cc_model.get_inline_comments( |
|
604 | c.inline_comments = cc_model.get_inline_comments( | |
605 | c.db_repo.repo_id, |
|
605 | c.db_repo.repo_id, | |
606 | pull_request=pull_request_id) |
|
606 | pull_request=pull_request_id) | |
607 | # count inline comments |
|
607 | # count inline comments | |
608 | for __, lines in c.inline_comments: |
|
608 | for __, lines in c.inline_comments: | |
609 | for comments in lines.values(): |
|
609 | for comments in lines.values(): | |
610 | c.inline_cnt += len(comments) |
|
610 | c.inline_cnt += len(comments) | |
611 | # comments |
|
611 | # comments | |
612 | c.comments = cc_model.get_comments(c.db_repo.repo_id, pull_request=pull_request_id) |
|
612 | c.comments = cc_model.get_comments(c.db_repo.repo_id, pull_request=pull_request_id) | |
613 |
|
613 | |||
614 | # (badly named) pull-request status calculation based on reviewer votes |
|
614 | # (badly named) pull-request status calculation based on reviewer votes | |
615 | (c.pull_request_reviewers, |
|
615 | (c.pull_request_reviewers, | |
616 | c.pull_request_pending_reviewers, |
|
616 | c.pull_request_pending_reviewers, | |
617 | c.current_voting_result, |
|
617 | c.current_voting_result, | |
618 | ) = cs_model.calculate_pull_request_result(c.pull_request) |
|
618 | ) = cs_model.calculate_pull_request_result(c.pull_request) | |
619 | c.changeset_statuses = db.ChangesetStatus.STATUSES |
|
619 | c.changeset_statuses = db.ChangesetStatus.STATUSES | |
620 |
|
620 | |||
621 | c.is_ajax_preview = False |
|
621 | c.is_ajax_preview = False | |
622 | c.ancestors = None # [c.a_rev] ... but that is shown in an other way |
|
622 | c.ancestors = None # [c.a_rev] ... but that is shown in an other way | |
623 | return render('/pullrequests/pullrequest_show.html') |
|
623 | return base.render('/pullrequests/pullrequest_show.html') | |
624 |
|
624 | |||
625 | @LoginRequired() |
|
625 | @LoginRequired() | |
626 | @HasRepoPermissionLevelDecorator('read') |
|
626 | @HasRepoPermissionLevelDecorator('read') | |
627 | @jsonify |
|
627 | @base.jsonify | |
628 | def comment(self, repo_name, pull_request_id): |
|
628 | def comment(self, repo_name, pull_request_id): | |
629 | pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
629 | pull_request = db.PullRequest.get_or_404(pull_request_id) | |
630 | allowed_to_change_status = self._is_allowed_to_change_status(pull_request) |
|
630 | allowed_to_change_status = self._is_allowed_to_change_status(pull_request) | |
631 | return create_cs_pr_comment(repo_name, pull_request=pull_request, |
|
631 | return create_cs_pr_comment(repo_name, pull_request=pull_request, | |
632 | allowed_to_change_status=allowed_to_change_status) |
|
632 | allowed_to_change_status=allowed_to_change_status) | |
633 |
|
633 | |||
634 | @LoginRequired() |
|
634 | @LoginRequired() | |
635 | @HasRepoPermissionLevelDecorator('read') |
|
635 | @HasRepoPermissionLevelDecorator('read') | |
636 | @jsonify |
|
636 | @base.jsonify | |
637 | def delete_comment(self, repo_name, comment_id): |
|
637 | def delete_comment(self, repo_name, comment_id): | |
638 | return delete_cs_pr_comment(repo_name, comment_id) |
|
638 | return delete_cs_pr_comment(repo_name, comment_id) |
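
The hunks above all follow the same refactoring pattern: instead of importing ``render`` and ``jsonify`` individually from ``kallithea.lib.base``, the controller imports the ``kallithea.controllers.base`` module and calls its helpers through the module name. A minimal sketch of a controller written against the new layout; the controller class, action names and template path are invented for illustration::

    from kallithea.controllers import base


    class ExampleController(base.BaseRepoController):
        """Hypothetical controller, only to show the module-qualified calls."""

        @base.jsonify
        def status(self, repo_name):
            # dicts returned from a @base.jsonify action are serialized to JSON
            return {'repo_name': repo_name}

        def show(self, repo_name):
            # templates are rendered through the same module, not a bare render()
            return base.render('/example/example.html')
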
@@ -1,35 +1,35 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | from tg import config |
|
14 | from tg import config | |
15 | from tgext.routes import RoutedController |
|
15 | from tgext.routes import RoutedController | |
16 |
|
16 | |||
|
17 | from kallithea.controllers import base | |||
17 | from kallithea.controllers.error import ErrorController |
|
18 | from kallithea.controllers.error import ErrorController | |
18 | from kallithea.controllers.routing import make_map |
|
19 | from kallithea.controllers.routing import make_map | |
19 | from kallithea.lib.base import BaseController |
|
|||
20 |
|
20 | |||
21 |
|
21 | |||
22 | # This is the main Kallithea entry point; TurboGears will forward all requests |
|
22 | # This is the main Kallithea entry point; TurboGears will forward all requests | |
23 | # to an instance of 'controller.root.RootController' in the configured |
|
23 | # to an instance of 'controller.root.RootController' in the configured | |
24 | # 'application' module (set by app_cfg.py). Requests are forwarded to |
|
24 | # 'application' module (set by app_cfg.py). Requests are forwarded to | |
25 | # controllers based on the routing mapper that lives in this root instance. |
|
25 | # controllers based on the routing mapper that lives in this root instance. | |
26 | # The mapper is configured using routes defined in routing.py. This use of the |
|
26 | # The mapper is configured using routes defined in routing.py. This use of the | |
27 | # 'mapper' attribute is a feature of tgext.routes, which is activated by |
|
27 | # 'mapper' attribute is a feature of tgext.routes, which is activated by | |
28 | # inheriting from its RoutedController class. |
|
28 | # inheriting from its RoutedController class. | |
29 | class RootController(RoutedController, BaseController): |
|
29 | class RootController(RoutedController, base.BaseController): | |
30 |
|
30 | |||
31 | def __init__(self): |
|
31 | def __init__(self): | |
32 | self.mapper = make_map(config) |
|
32 | self.mapper = make_map(config) | |
33 |
|
33 | |||
34 | # The URL '/error/document' (the default TG errorpage.path) should be handled by ErrorController.document |
|
34 | # The URL '/error/document' (the default TG errorpage.path) should be handled by ErrorController.document | |
35 | self.error = ErrorController() |
|
35 | self.error = ErrorController() |
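
As the comment in ``root.py`` notes, ``tgext.routes`` dispatches requests through the ``mapper`` attribute that ``make_map(config)`` returns. A rough sketch of what such a mapper contains, using plain Routes definitions; the route name and URL pattern below are invented, the real ones live in ``kallithea/controllers/routing.py``::

    from routes import Mapper


    def make_example_map():
        # each route maps a URL pattern to a controller and action;
        # tgext.routes looks these up on RootController.mapper per request
        rmap = Mapper()
        rmap.connect('summary_home', '/{repo_name}/summary',
                     controller='summary', action='index')
        return rmap
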
@@ -1,142 +1,142 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.search |
|
15 | kallithea.controllers.search | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Search controller for Kallithea |
|
18 | Search controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Aug 7, 2010 |
|
22 | :created_on: Aug 7, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import logging |
|
28 | import logging | |
29 | import traceback |
|
29 | import traceback | |
30 |
|
30 | |||
31 | from tg import config, request |
|
31 | from tg import config, request | |
32 | from tg import tmpl_context as c |
|
32 | from tg import tmpl_context as c | |
33 | from tg.i18n import ugettext as _ |
|
33 | from tg.i18n import ugettext as _ | |
34 | from whoosh.index import EmptyIndexError, exists_in, open_dir |
|
34 | from whoosh.index import EmptyIndexError, exists_in, open_dir | |
35 | from whoosh.qparser import QueryParser, QueryParserError |
|
35 | from whoosh.qparser import QueryParser, QueryParserError | |
36 | from whoosh.query import Phrase, Prefix |
|
36 | from whoosh.query import Phrase, Prefix | |
37 |
|
37 | |||
|
38 | from kallithea.controllers import base | |||
38 | from kallithea.lib.auth import LoginRequired |
|
39 | from kallithea.lib.auth import LoginRequired | |
39 | from kallithea.lib.base import BaseRepoController, render |
|
|||
40 | from kallithea.lib.indexers import CHGSET_IDX_NAME, CHGSETS_SCHEMA, IDX_NAME, SCHEMA, WhooshResultWrapper |
|
40 | from kallithea.lib.indexers import CHGSET_IDX_NAME, CHGSETS_SCHEMA, IDX_NAME, SCHEMA, WhooshResultWrapper | |
41 | from kallithea.lib.page import Page |
|
41 | from kallithea.lib.page import Page | |
42 | from kallithea.lib.utils2 import safe_int |
|
42 | from kallithea.lib.utils2 import safe_int | |
43 | from kallithea.model.repo import RepoModel |
|
43 | from kallithea.model.repo import RepoModel | |
44 |
|
44 | |||
45 |
|
45 | |||
46 | log = logging.getLogger(__name__) |
|
46 | log = logging.getLogger(__name__) | |
47 |
|
47 | |||
48 |
|
48 | |||
49 | class SearchController(BaseRepoController): |
|
49 | class SearchController(base.BaseRepoController): | |
50 |
|
50 | |||
51 | @LoginRequired(allow_default_user=True) |
|
51 | @LoginRequired(allow_default_user=True) | |
52 | def index(self, repo_name=None): |
|
52 | def index(self, repo_name=None): | |
53 | c.repo_name = repo_name |
|
53 | c.repo_name = repo_name | |
54 | c.formated_results = [] |
|
54 | c.formated_results = [] | |
55 | c.runtime = '' |
|
55 | c.runtime = '' | |
56 | c.cur_query = request.GET.get('q', None) |
|
56 | c.cur_query = request.GET.get('q', None) | |
57 | c.cur_type = request.GET.get('type', 'content') |
|
57 | c.cur_type = request.GET.get('type', 'content') | |
58 | c.cur_search = search_type = {'content': 'content', |
|
58 | c.cur_search = search_type = {'content': 'content', | |
59 | 'commit': 'message', |
|
59 | 'commit': 'message', | |
60 | 'path': 'path', |
|
60 | 'path': 'path', | |
61 | 'repository': 'repository' |
|
61 | 'repository': 'repository' | |
62 | }.get(c.cur_type, 'content') |
|
62 | }.get(c.cur_type, 'content') | |
63 |
|
63 | |||
64 | index_name = { |
|
64 | index_name = { | |
65 | 'content': IDX_NAME, |
|
65 | 'content': IDX_NAME, | |
66 | 'commit': CHGSET_IDX_NAME, |
|
66 | 'commit': CHGSET_IDX_NAME, | |
67 | 'path': IDX_NAME |
|
67 | 'path': IDX_NAME | |
68 | }.get(c.cur_type, IDX_NAME) |
|
68 | }.get(c.cur_type, IDX_NAME) | |
69 |
|
69 | |||
70 | schema_defn = { |
|
70 | schema_defn = { | |
71 | 'content': SCHEMA, |
|
71 | 'content': SCHEMA, | |
72 | 'commit': CHGSETS_SCHEMA, |
|
72 | 'commit': CHGSETS_SCHEMA, | |
73 | 'path': SCHEMA |
|
73 | 'path': SCHEMA | |
74 | }.get(c.cur_type, SCHEMA) |
|
74 | }.get(c.cur_type, SCHEMA) | |
75 |
|
75 | |||
76 | log.debug('IDX: %s', index_name) |
|
76 | log.debug('IDX: %s', index_name) | |
77 | log.debug('SCHEMA: %s', schema_defn) |
|
77 | log.debug('SCHEMA: %s', schema_defn) | |
78 |
|
78 | |||
79 | if c.cur_query: |
|
79 | if c.cur_query: | |
80 | cur_query = c.cur_query.lower() |
|
80 | cur_query = c.cur_query.lower() | |
81 | log.debug(cur_query) |
|
81 | log.debug(cur_query) | |
82 |
|
82 | |||
83 | if c.cur_query: |
|
83 | if c.cur_query: | |
84 | p = safe_int(request.GET.get('page'), 1) |
|
84 | p = safe_int(request.GET.get('page'), 1) | |
85 | highlight_items = set() |
|
85 | highlight_items = set() | |
86 | index_dir = config['index_dir'] |
|
86 | index_dir = config['index_dir'] | |
87 | try: |
|
87 | try: | |
88 | if not exists_in(index_dir, index_name): |
|
88 | if not exists_in(index_dir, index_name): | |
89 | raise EmptyIndexError |
|
89 | raise EmptyIndexError | |
90 | idx = open_dir(index_dir, indexname=index_name) |
|
90 | idx = open_dir(index_dir, indexname=index_name) | |
91 | searcher = idx.searcher() |
|
91 | searcher = idx.searcher() | |
92 |
|
92 | |||
93 | qp = QueryParser(search_type, schema=schema_defn) |
|
93 | qp = QueryParser(search_type, schema=schema_defn) | |
94 | if c.repo_name: |
|
94 | if c.repo_name: | |
95 | # use "repository_rawname:" instead of "repository:" |
|
95 | # use "repository_rawname:" instead of "repository:" | |
96 | # for case-sensitive matching |
|
96 | # for case-sensitive matching | |
97 | cur_query = 'repository_rawname:%s %s' % (c.repo_name, cur_query) |
|
97 | cur_query = 'repository_rawname:%s %s' % (c.repo_name, cur_query) | |
98 | try: |
|
98 | try: | |
99 | query = qp.parse(cur_query) |
|
99 | query = qp.parse(cur_query) | |
100 | # extract words for highlight |
|
100 | # extract words for highlight | |
101 | if isinstance(query, Phrase): |
|
101 | if isinstance(query, Phrase): | |
102 | highlight_items.update(query.words) |
|
102 | highlight_items.update(query.words) | |
103 | elif isinstance(query, Prefix): |
|
103 | elif isinstance(query, Prefix): | |
104 | highlight_items.add(query.text) |
|
104 | highlight_items.add(query.text) | |
105 | else: |
|
105 | else: | |
106 | for i in query.all_terms(): |
|
106 | for i in query.all_terms(): | |
107 | if i[0] in ['content', 'message']: |
|
107 | if i[0] in ['content', 'message']: | |
108 | highlight_items.add(i[1]) |
|
108 | highlight_items.add(i[1]) | |
109 |
|
109 | |||
110 | matcher = query.matcher(searcher) |
|
110 | matcher = query.matcher(searcher) | |
111 |
|
111 | |||
112 | log.debug('query: %s', query) |
|
112 | log.debug('query: %s', query) | |
113 | log.debug('hl terms: %s', highlight_items) |
|
113 | log.debug('hl terms: %s', highlight_items) | |
114 | results = searcher.search(query) |
|
114 | results = searcher.search(query) | |
115 | res_ln = len(results) |
|
115 | res_ln = len(results) | |
116 | c.runtime = '%s results (%.3f seconds)' % ( |
|
116 | c.runtime = '%s results (%.3f seconds)' % ( | |
117 | res_ln, results.runtime |
|
117 | res_ln, results.runtime | |
118 | ) |
|
118 | ) | |
119 |
|
119 | |||
120 | repo_location = RepoModel().repos_path |
|
120 | repo_location = RepoModel().repos_path | |
121 | c.formated_results = Page( |
|
121 | c.formated_results = Page( | |
122 | WhooshResultWrapper(search_type, searcher, matcher, |
|
122 | WhooshResultWrapper(search_type, searcher, matcher, | |
123 | highlight_items, repo_location), |
|
123 | highlight_items, repo_location), | |
124 | page=p, |
|
124 | page=p, | |
125 | item_count=res_ln, |
|
125 | item_count=res_ln, | |
126 | items_per_page=10, |
|
126 | items_per_page=10, | |
127 | type=c.cur_type, |
|
127 | type=c.cur_type, | |
128 | q=c.cur_query, |
|
128 | q=c.cur_query, | |
129 | ) |
|
129 | ) | |
130 |
|
130 | |||
131 | except QueryParserError: |
|
131 | except QueryParserError: | |
132 | c.runtime = _('Invalid search query. Try quoting it.') |
|
132 | c.runtime = _('Invalid search query. Try quoting it.') | |
133 | searcher.close() |
|
133 | searcher.close() | |
134 | except EmptyIndexError: |
|
134 | except EmptyIndexError: | |
135 | log.error("Empty search index - run 'kallithea-cli index-create' regularly") |
|
135 | log.error("Empty search index - run 'kallithea-cli index-create' regularly") | |
136 | c.runtime = _('The server has no search index.') |
|
136 | c.runtime = _('The server has no search index.') | |
137 | except Exception: |
|
137 | except Exception: | |
138 | log.error(traceback.format_exc()) |
|
138 | log.error(traceback.format_exc()) | |
139 | c.runtime = _('An error occurred during search operation.') |
|
139 | c.runtime = _('An error occurred during search operation.') | |
140 |
|
140 | |||
141 | # Return a rendered template |
|
141 | # Return a rendered template | |
142 | return render('/search/search.html') |
|
142 | return base.render('/search/search.html') |
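
The ``index`` action above is a fairly direct use of the Whoosh API: open an index directory, parse the query string against the schema, and run the search. A self-contained sketch of that flow with a throw-away index and made-up documents; Kallithea's real schemas and index names come from ``kallithea.lib.indexers``::

    import tempfile

    from whoosh.fields import ID, TEXT, Schema
    from whoosh.index import create_in
    from whoosh.qparser import QueryParser

    # build a small temporary index with one stored document
    schema = Schema(path=ID(stored=True), content=TEXT(stored=True))
    index_dir = tempfile.mkdtemp()
    ix = create_in(index_dir, schema)

    writer = ix.writer()
    writer.add_document(path='docs/usage.rst',
                        content='how to search repositories')
    writer.commit()

    # parse a query against the 'content' field and run the search
    with ix.searcher() as searcher:
        query = QueryParser('content', schema=ix.schema).parse('search')
        results = searcher.search(query)
        for hit in results:
            print(hit['path'])
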
@@ -1,212 +1,212 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 | # This program is free software: you can redistribute it and/or modify |
|
2 | # This program is free software: you can redistribute it and/or modify | |
3 | # it under the terms of the GNU General Public License as published by |
|
3 | # it under the terms of the GNU General Public License as published by | |
4 | # the Free Software Foundation, either version 3 of the License, or |
|
4 | # the Free Software Foundation, either version 3 of the License, or | |
5 | # (at your option) any later version. |
|
5 | # (at your option) any later version. | |
6 | # |
|
6 | # | |
7 | # This program is distributed in the hope that it will be useful, |
|
7 | # This program is distributed in the hope that it will be useful, | |
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
10 | # GNU General Public License for more details. |
|
10 | # GNU General Public License for more details. | |
11 | # |
|
11 | # | |
12 | # You should have received a copy of the GNU General Public License |
|
12 | # You should have received a copy of the GNU General Public License | |
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
14 | """ |
|
14 | """ | |
15 | kallithea.controllers.summary |
|
15 | kallithea.controllers.summary | |
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
17 |
|
17 | |||
18 | Summary controller for Kallithea |
|
18 | Summary controller for Kallithea | |
19 |
|
19 | |||
20 | This file was forked by the Kallithea project in July 2014. |
|
20 | This file was forked by the Kallithea project in July 2014. | |
21 | Original author and date, and relevant copyright and licensing information is below: |
|
21 | Original author and date, and relevant copyright and licensing information is below: | |
22 | :created_on: Apr 18, 2010 |
|
22 | :created_on: Apr 18, 2010 | |
23 | :author: marcink |
|
23 | :author: marcink | |
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
24 | :copyright: (c) 2013 RhodeCode GmbH, and others. | |
25 | :license: GPLv3, see LICENSE.md for more details. |
|
25 | :license: GPLv3, see LICENSE.md for more details. | |
26 | """ |
|
26 | """ | |
27 |
|
27 | |||
28 | import calendar |
|
28 | import calendar | |
29 | import itertools |
|
29 | import itertools | |
30 | import logging |
|
30 | import logging | |
31 | import traceback |
|
31 | import traceback | |
32 | from datetime import date, timedelta |
|
32 | from datetime import date, timedelta | |
33 | from time import mktime |
|
33 | from time import mktime | |
34 |
|
34 | |||
35 | from beaker.cache import cache_region |
|
35 | from beaker.cache import cache_region | |
36 | from tg import request |
|
36 | from tg import request | |
37 | from tg import tmpl_context as c |
|
37 | from tg import tmpl_context as c | |
38 | from tg.i18n import ugettext as _ |
|
38 | from tg.i18n import ugettext as _ | |
39 | from webob.exc import HTTPBadRequest |
|
39 | from webob.exc import HTTPBadRequest | |
40 |
|
40 | |||
|
41 | from kallithea.controllers import base | |||
41 | from kallithea.lib import ext_json, webutils |
|
42 | from kallithea.lib import ext_json, webutils | |
42 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
43 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired | |
43 | from kallithea.lib.base import BaseRepoController, jsonify, render |
|
|||
44 | from kallithea.lib.conf import ALL_EXTS, ALL_READMES, LANGUAGES_EXTENSIONS_MAP |
|
44 | from kallithea.lib.conf import ALL_EXTS, ALL_READMES, LANGUAGES_EXTENSIONS_MAP | |
45 | from kallithea.lib.markup_renderer import MarkupRenderer |
|
45 | from kallithea.lib.markup_renderer import MarkupRenderer | |
46 | from kallithea.lib.page import Page |
|
46 | from kallithea.lib.page import Page | |
47 | from kallithea.lib.utils2 import safe_int, safe_str |
|
47 | from kallithea.lib.utils2 import safe_int, safe_str | |
48 | from kallithea.lib.vcs.backends.base import EmptyChangeset |
|
48 | from kallithea.lib.vcs.backends.base import EmptyChangeset | |
49 | from kallithea.lib.vcs.exceptions import ChangesetError, EmptyRepositoryError, NodeDoesNotExistError |
|
49 | from kallithea.lib.vcs.exceptions import ChangesetError, EmptyRepositoryError, NodeDoesNotExistError | |
50 | from kallithea.lib.vcs.nodes import FileNode |
|
50 | from kallithea.lib.vcs.nodes import FileNode | |
51 | from kallithea.model import async_tasks, db |
|
51 | from kallithea.model import async_tasks, db | |
52 |
|
52 | |||
53 |
|
53 | |||
54 | log = logging.getLogger(__name__) |
|
54 | log = logging.getLogger(__name__) | |
55 |
|
55 | |||
56 | README_FILES = [''.join([x[0][0], x[1][0]]) for x in |
|
56 | README_FILES = [''.join([x[0][0], x[1][0]]) for x in | |
57 | sorted(list(itertools.product(ALL_READMES, ALL_EXTS)), |
|
57 | sorted(list(itertools.product(ALL_READMES, ALL_EXTS)), | |
58 | key=lambda y:y[0][1] + y[1][1])] |
|
58 | key=lambda y:y[0][1] + y[1][1])] | |
59 |
|
59 | |||
60 |
|
60 | |||
61 | class SummaryController(BaseRepoController): |
|
61 | class SummaryController(base.BaseRepoController): | |
62 |
|
62 | |||
63 | def __get_readme_data(self, db_repo): |
|
63 | def __get_readme_data(self, db_repo): | |
64 | repo_name = db_repo.repo_name |
|
64 | repo_name = db_repo.repo_name | |
65 | log.debug('Looking for README file') |
|
65 | log.debug('Looking for README file') | |
66 |
|
66 | |||
67 | @cache_region('long_term_file', '_get_readme_from_cache') |
|
67 | @cache_region('long_term_file', '_get_readme_from_cache') | |
68 | def _get_readme_from_cache(*_cache_keys): # parameters are not really used - only as caching key |
|
68 | def _get_readme_from_cache(*_cache_keys): # parameters are not really used - only as caching key | |
69 | readme_data = None |
|
69 | readme_data = None | |
70 | readme_file = None |
|
70 | readme_file = None | |
71 | try: |
|
71 | try: | |
72 | # gets the landing revision! or tip if fails |
|
72 | # gets the landing revision! or tip if fails | |
73 | cs = db_repo.get_landing_changeset() |
|
73 | cs = db_repo.get_landing_changeset() | |
74 | if isinstance(cs, EmptyChangeset): |
|
74 | if isinstance(cs, EmptyChangeset): | |
75 | raise EmptyRepositoryError() |
|
75 | raise EmptyRepositoryError() | |
76 | renderer = MarkupRenderer() |
|
76 | renderer = MarkupRenderer() | |
77 | for f in README_FILES: |
|
77 | for f in README_FILES: | |
78 | try: |
|
78 | try: | |
79 | readme = cs.get_node(f) |
|
79 | readme = cs.get_node(f) | |
80 | if not isinstance(readme, FileNode): |
|
80 | if not isinstance(readme, FileNode): | |
81 | continue |
|
81 | continue | |
82 | readme_file = f |
|
82 | readme_file = f | |
83 | log.debug('Found README file `%s` rendering...', |
|
83 | log.debug('Found README file `%s` rendering...', | |
84 | readme_file) |
|
84 | readme_file) | |
85 | readme_data = renderer.render(safe_str(readme.content), |
|
85 | readme_data = renderer.render(safe_str(readme.content), | |
86 | filename=f) |
|
86 | filename=f) | |
87 | break |
|
87 | break | |
88 | except NodeDoesNotExistError: |
|
88 | except NodeDoesNotExistError: | |
89 | continue |
|
89 | continue | |
90 | except ChangesetError: |
|
90 | except ChangesetError: | |
91 | log.error(traceback.format_exc()) |
|
91 | log.error(traceback.format_exc()) | |
92 | pass |
|
92 | pass | |
93 | except EmptyRepositoryError: |
|
93 | except EmptyRepositoryError: | |
94 | pass |
|
94 | pass | |
95 |
|
95 | |||
96 | return readme_data, readme_file |
|
96 | return readme_data, readme_file | |
97 |
|
97 | |||
98 | kind = 'README' |
|
98 | kind = 'README' | |
99 | return _get_readme_from_cache(repo_name, kind, c.db_repo.changeset_cache.get('raw_id')) |
|
99 | return _get_readme_from_cache(repo_name, kind, c.db_repo.changeset_cache.get('raw_id')) | |
100 |
|
100 | |||
101 | @LoginRequired(allow_default_user=True) |
|
101 | @LoginRequired(allow_default_user=True) | |
102 | @HasRepoPermissionLevelDecorator('read') |
|
102 | @HasRepoPermissionLevelDecorator('read') | |
103 | def index(self, repo_name): |
|
103 | def index(self, repo_name): | |
104 | p = safe_int(request.GET.get('page'), 1) |
|
104 | p = safe_int(request.GET.get('page'), 1) | |
105 | size = safe_int(request.GET.get('size'), 10) |
|
105 | size = safe_int(request.GET.get('size'), 10) | |
106 | try: |
|
106 | try: | |
107 | collection = c.db_repo_scm_instance.get_changesets(reverse=True) |
|
107 | collection = c.db_repo_scm_instance.get_changesets(reverse=True) | |
108 | except EmptyRepositoryError as e: |
|
108 | except EmptyRepositoryError as e: | |
109 | webutils.flash(e, category='warning') |
|
109 | webutils.flash(e, category='warning') | |
110 | collection = [] |
|
110 | collection = [] | |
111 | c.cs_pagination = Page(collection, page=p, items_per_page=size) |
|
111 | c.cs_pagination = Page(collection, page=p, items_per_page=size) | |
112 | page_revisions = [x.raw_id for x in list(c.cs_pagination)] |
|
112 | page_revisions = [x.raw_id for x in list(c.cs_pagination)] | |
113 | c.cs_comments = c.db_repo.get_comments(page_revisions) |
|
113 | c.cs_comments = c.db_repo.get_comments(page_revisions) | |
114 | c.cs_statuses = c.db_repo.statuses(page_revisions) |
|
114 | c.cs_statuses = c.db_repo.statuses(page_revisions) | |
115 |
|
115 | |||
116 | c.ssh_repo_url = None |
|
116 | c.ssh_repo_url = None | |
117 | if request.authuser.is_default_user: |
|
117 | if request.authuser.is_default_user: | |
118 | username = None |
|
118 | username = None | |
119 | else: |
|
119 | else: | |
120 | username = request.authuser.username |
|
120 | username = request.authuser.username | |
121 | if c.ssh_enabled: |
|
121 | if c.ssh_enabled: | |
122 | c.ssh_repo_url = c.db_repo.clone_url(clone_uri_tmpl=c.clone_ssh_tmpl) |
|
122 | c.ssh_repo_url = c.db_repo.clone_url(clone_uri_tmpl=c.clone_ssh_tmpl) | |
123 |
|
123 | |||
124 | c.clone_repo_url = c.db_repo.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=False, username=username) |
|
124 | c.clone_repo_url = c.db_repo.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=False, username=username) | |
125 | c.clone_repo_url_id = c.db_repo.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=True, username=username) |
|
125 | c.clone_repo_url_id = c.db_repo.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=True, username=username) | |
126 |
|
126 | |||
127 | if c.db_repo.enable_statistics: |
|
127 | if c.db_repo.enable_statistics: | |
128 | c.show_stats = True |
|
128 | c.show_stats = True | |
129 | else: |
|
129 | else: | |
130 | c.show_stats = False |
|
130 | c.show_stats = False | |
131 |
|
131 | |||
132 | stats = db.Statistics.query() \ |
|
132 | stats = db.Statistics.query() \ | |
133 | .filter(db.Statistics.repository == c.db_repo) \ |
|
133 | .filter(db.Statistics.repository == c.db_repo) \ | |
134 | .scalar() |
|
134 | .scalar() | |
135 |
|
135 | |||
136 | c.stats_percentage = 0 |
|
136 | c.stats_percentage = 0 | |
137 |
|
137 | |||
138 | if stats and stats.languages: |
|
138 | if stats and stats.languages: | |
139 | lang_stats_d = ext_json.loads(stats.languages) |
|
139 | lang_stats_d = ext_json.loads(stats.languages) | |
140 | lang_stats = [(x, {"count": y, |
|
140 | lang_stats = [(x, {"count": y, | |
141 | "desc": LANGUAGES_EXTENSIONS_MAP.get(x, '?')}) |
|
141 | "desc": LANGUAGES_EXTENSIONS_MAP.get(x, '?')}) | |
142 | for x, y in lang_stats_d.items()] |
|
142 | for x, y in lang_stats_d.items()] | |
143 | lang_stats.sort(key=lambda k: (-k[1]['count'], k[0])) |
|
143 | lang_stats.sort(key=lambda k: (-k[1]['count'], k[0])) | |
144 | c.trending_languages = lang_stats[:10] |
|
144 | c.trending_languages = lang_stats[:10] | |
145 | else: |
|
145 | else: | |
146 | c.trending_languages = [] |
|
146 | c.trending_languages = [] | |
147 |
|
147 | |||
148 | c.enable_downloads = c.db_repo.enable_downloads |
|
148 | c.enable_downloads = c.db_repo.enable_downloads | |
149 | c.readme_data, c.readme_file = \ |
|
149 | c.readme_data, c.readme_file = \ | |
150 | self.__get_readme_data(c.db_repo) |
|
150 | self.__get_readme_data(c.db_repo) | |
151 | return render('summary/summary.html') |
|
151 | return base.render('summary/summary.html') | |
152 |
|
152 | |||
153 | @LoginRequired() |
|
153 | @LoginRequired() | |
154 | @HasRepoPermissionLevelDecorator('read') |
|
154 | @HasRepoPermissionLevelDecorator('read') | |
155 | @jsonify |
|
155 | @base.jsonify | |
156 | def repo_size(self, repo_name): |
|
156 | def repo_size(self, repo_name): | |
157 | if request.is_xhr: |
|
157 | if request.is_xhr: | |
158 | return c.db_repo._repo_size() |
|
158 | return c.db_repo._repo_size() | |
159 | else: |
|
159 | else: | |
160 | raise HTTPBadRequest() |
|
160 | raise HTTPBadRequest() | |
161 |
|
161 | |||
162 | @LoginRequired(allow_default_user=True) |
|
162 | @LoginRequired(allow_default_user=True) | |
163 | @HasRepoPermissionLevelDecorator('read') |
|
163 | @HasRepoPermissionLevelDecorator('read') | |
164 | def statistics(self, repo_name): |
|
164 | def statistics(self, repo_name): | |
165 | if c.db_repo.enable_statistics: |
|
165 | if c.db_repo.enable_statistics: | |
166 | c.show_stats = True |
|
166 | c.show_stats = True | |
167 | c.no_data_msg = _('No data ready yet') |
|
167 | c.no_data_msg = _('No data ready yet') | |
168 | else: |
|
168 | else: | |
169 | c.show_stats = False |
|
169 | c.show_stats = False | |
170 | c.no_data_msg = _('Statistics are disabled for this repository') |
|
170 | c.no_data_msg = _('Statistics are disabled for this repository') | |
171 |
|
171 | |||
172 | td = date.today() + timedelta(days=1) |
|
172 | td = date.today() + timedelta(days=1) | |
173 | td_1m = td - timedelta(days=calendar.monthrange(td.year, td.month)[1]) |
|
173 | td_1m = td - timedelta(days=calendar.monthrange(td.year, td.month)[1]) | |
174 | td_1y = td - timedelta(days=365) |
|
174 | td_1y = td - timedelta(days=365) | |
175 |
|
175 | |||
176 | ts_min_m = mktime(td_1m.timetuple()) |
|
176 | ts_min_m = mktime(td_1m.timetuple()) | |
177 | ts_min_y = mktime(td_1y.timetuple()) |
|
177 | ts_min_y = mktime(td_1y.timetuple()) | |
178 | ts_max_y = mktime(td.timetuple()) |
|
178 | ts_max_y = mktime(td.timetuple()) | |
179 | c.ts_min = ts_min_m |
|
179 | c.ts_min = ts_min_m | |
180 | c.ts_max = ts_max_y |
|
180 | c.ts_max = ts_max_y | |
181 |
|
181 | |||
182 | stats = db.Statistics.query() \ |
|
182 | stats = db.Statistics.query() \ | |
183 | .filter(db.Statistics.repository == c.db_repo) \ |
|
183 | .filter(db.Statistics.repository == c.db_repo) \ | |
184 | .scalar() |
|
184 | .scalar() | |
185 | c.stats_percentage = 0 |
|
185 | c.stats_percentage = 0 | |
186 | if stats and stats.languages: |
|
186 | if stats and stats.languages: | |
187 | c.commit_data = ext_json.loads(stats.commit_activity) |
|
187 | c.commit_data = ext_json.loads(stats.commit_activity) | |
188 | c.overview_data = ext_json.loads(stats.commit_activity_combined) |
|
188 | c.overview_data = ext_json.loads(stats.commit_activity_combined) | |
189 |
|
189 | |||
190 | lang_stats_d = ext_json.loads(stats.languages) |
|
190 | lang_stats_d = ext_json.loads(stats.languages) | |
191 | lang_stats = [(x, {"count": y, |
|
191 | lang_stats = [(x, {"count": y, | |
192 | "desc": LANGUAGES_EXTENSIONS_MAP.get(x, '?')}) |
|
192 | "desc": LANGUAGES_EXTENSIONS_MAP.get(x, '?')}) | |
193 | for x, y in lang_stats_d.items()] |
|
193 | for x, y in lang_stats_d.items()] | |
194 | lang_stats.sort(key=lambda k: (-k[1]['count'], k[0])) |
|
194 | lang_stats.sort(key=lambda k: (-k[1]['count'], k[0])) | |
195 | c.trending_languages = lang_stats[:10] |
|
195 | c.trending_languages = lang_stats[:10] | |
196 |
|
196 | |||
197 | last_rev = stats.stat_on_revision + 1 |
|
197 | last_rev = stats.stat_on_revision + 1 | |
198 | c.repo_last_rev = c.db_repo_scm_instance.count() \ |
|
198 | c.repo_last_rev = c.db_repo_scm_instance.count() \ | |
199 | if c.db_repo_scm_instance.revisions else 0 |
|
199 | if c.db_repo_scm_instance.revisions else 0 | |
200 | if last_rev == 0 or c.repo_last_rev == 0: |
|
200 | if last_rev == 0 or c.repo_last_rev == 0: | |
201 | pass |
|
201 | pass | |
202 | else: |
|
202 | else: | |
203 | c.stats_percentage = '%.2f' % ((float((last_rev)) / |
|
203 | c.stats_percentage = '%.2f' % ((float((last_rev)) / | |
204 | c.repo_last_rev) * 100) |
|
204 | c.repo_last_rev) * 100) | |
205 | else: |
|
205 | else: | |
206 | c.commit_data = {} |
|
206 | c.commit_data = {} | |
207 | c.overview_data = ([[ts_min_y, 0], [ts_max_y, 10]]) |
|
207 | c.overview_data = ([[ts_min_y, 0], [ts_max_y, 10]]) | |
208 | c.trending_languages = [] |
|
208 | c.trending_languages = [] | |
209 |
|
209 | |||
210 | recurse_limit = 500 # don't recurse more than 500 times when parsing |
|
210 | recurse_limit = 500 # don't recurse more than 500 times when parsing | |
211 | async_tasks.get_commits_stats(c.db_repo.repo_name, ts_min_y, ts_max_y, recurse_limit) |
|
211 | async_tasks.get_commits_stats(c.db_repo.repo_name, ts_min_y, ts_max_y, recurse_limit) | |
212 | return render('summary/statistics.html') |
|
212 | return base.render('summary/statistics.html') |
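
``README_FILES`` above is built by combining every candidate basename with every candidate extension and sorting by the summed priorities, so the most preferred combinations are probed first; ``ALL_READMES`` and ``ALL_EXTS`` appear to be ``(text, priority)`` pairs. A toy version of the same construction with invented lists::

    import itertools

    readmes = [('readme', 0), ('README', 0), ('Readme', 1)]  # (basename, priority)
    exts = [('.rst', 0), ('.md', 1), ('', 2)]                # (extension, priority)

    # every basename/extension combination, cheapest priority sum first
    candidates = [''.join([name[0], ext[0]])
                  for name, ext in sorted(itertools.product(readmes, exts),
                                          key=lambda pair: pair[0][1] + pair[1][1])]
    print(candidates)  # ['readme.rst', 'README.rst', 'readme.md', ...]
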
@@ -1,295 +1,295 b'' | |||||
1 | #!/usr/bin/env python3 |
|
1 | #!/usr/bin/env python3 | |
2 |
|
2 | |||
3 |
|
3 | |||
4 | import re |
|
4 | import re | |
5 | import sys |
|
5 | import sys | |
6 |
|
6 | |||
7 |
|
7 | |||
8 | ignored_modules = set(''' |
|
8 | ignored_modules = set(''' | |
9 | argparse |
|
9 | argparse | |
10 | base64 |
|
10 | base64 | |
11 | bcrypt |
|
11 | bcrypt | |
12 | binascii |
|
12 | binascii | |
13 | bleach |
|
13 | bleach | |
14 | calendar |
|
14 | calendar | |
15 | celery |
|
15 | celery | |
16 | celery |
|
16 | celery | |
17 | chardet |
|
17 | chardet | |
18 | click |
|
18 | click | |
19 | collections |
|
19 | collections | |
20 | configparser |
|
20 | configparser | |
21 | copy |
|
21 | copy | |
22 | csv |
|
22 | csv | |
23 | ctypes |
|
23 | ctypes | |
24 | datetime |
|
24 | datetime | |
25 | dateutil |
|
25 | dateutil | |
26 | decimal |
|
26 | decimal | |
27 | decorator |
|
27 | decorator | |
28 | difflib |
|
28 | difflib | |
29 | distutils |
|
29 | distutils | |
30 | docutils |
|
30 | docutils | |
31 |
|
31 | |||
32 | errno |
|
32 | errno | |
33 | fileinput |
|
33 | fileinput | |
34 | functools |
|
34 | functools | |
35 | getpass |
|
35 | getpass | |
36 | grp |
|
36 | grp | |
37 | hashlib |
|
37 | hashlib | |
38 | hmac |
|
38 | hmac | |
39 | html |
|
39 | html | |
40 | http |
|
40 | http | |
41 | imp |
|
41 | imp | |
42 | importlib |
|
42 | importlib | |
43 | inspect |
|
43 | inspect | |
44 | io |
|
44 | io | |
45 | ipaddr |
|
45 | ipaddr | |
46 | IPython |
|
46 | IPython | |
47 | isapi_wsgi |
|
47 | isapi_wsgi | |
48 | itertools |
|
48 | itertools | |
49 | json |
|
49 | json | |
50 | kajiki |
|
50 | kajiki | |
51 | ldap |
|
51 | ldap | |
52 | logging |
|
52 | logging | |
53 | mako |
|
53 | mako | |
54 | markdown |
|
54 | markdown | |
55 | mimetypes |
|
55 | mimetypes | |
56 | mock |
|
56 | mock | |
57 | msvcrt |
|
57 | msvcrt | |
58 | multiprocessing |
|
58 | multiprocessing | |
59 | operator |
|
59 | operator | |
60 | os |
|
60 | os | |
61 | paginate |
|
61 | paginate | |
62 | paginate_sqlalchemy |
|
62 | paginate_sqlalchemy | |
63 | pam |
|
63 | pam | |
64 | paste |
|
64 | paste | |
65 | pkg_resources |
|
65 | pkg_resources | |
66 | platform |
|
66 | platform | |
67 | posixpath |
|
67 | posixpath | |
68 | pprint |
|
68 | pprint | |
69 | pwd |
|
69 | pwd | |
70 | pyflakes |
|
70 | pyflakes | |
71 | pytest |
|
71 | pytest | |
72 | pytest_localserver |
|
72 | pytest_localserver | |
73 | random |
|
73 | random | |
74 | re |
|
74 | re | |
75 | routes |
|
75 | routes | |
76 | setuptools |
|
76 | setuptools | |
77 | shlex |
|
77 | shlex | |
78 | shutil |
|
78 | shutil | |
79 | smtplib |
|
79 | smtplib | |
80 | socket |
|
80 | socket | |
81 | ssl |
|
81 | ssl | |
82 | stat |
|
82 | stat | |
83 | string |
|
83 | string | |
84 | struct |
|
84 | struct | |
85 | subprocess |
|
85 | subprocess | |
86 | sys |
|
86 | sys | |
87 | tarfile |
|
87 | tarfile | |
88 | tempfile |
|
88 | tempfile | |
89 | textwrap |
|
89 | textwrap | |
90 | tgext |
|
90 | tgext | |
91 | threading |
|
91 | threading | |
92 | time |
|
92 | time | |
93 | traceback |
|
93 | traceback | |
94 | traitlets |
|
94 | traitlets | |
95 | types |
|
95 | types | |
96 | urllib |
|
96 | urllib | |
97 | urlobject |
|
97 | urlobject | |
98 | uuid |
|
98 | uuid | |
99 | warnings |
|
99 | warnings | |
100 | webhelpers2 |
|
100 | webhelpers2 | |
101 | webob |
|
101 | webob | |
102 | webtest |
|
102 | webtest | |
103 | whoosh |
|
103 | whoosh | |
104 | win32traceutil |
|
104 | win32traceutil | |
105 | zipfile |
|
105 | zipfile | |
106 | '''.split()) |
|
106 | '''.split()) | |
107 |
|
107 | |||
108 | top_modules = set(''' |
|
108 | top_modules = set(''' | |
109 | kallithea.alembic |
|
109 | kallithea.alembic | |
110 | kallithea.bin |
|
110 | kallithea.bin | |
111 | kallithea.config |
|
111 | kallithea.config | |
112 | kallithea.controllers |
|
112 | kallithea.controllers | |
113 | kallithea.templates.py |
|
113 | kallithea.templates.py | |
114 | scripts |
|
114 | scripts | |
115 | '''.split()) |
|
115 | '''.split()) | |
116 |
|
116 | |||
117 | bottom_external_modules = set(''' |
|
117 | bottom_external_modules = set(''' | |
118 | tg |
|
118 | tg | |
119 | mercurial |
|
119 | mercurial | |
120 | sqlalchemy |
|
120 | sqlalchemy | |
121 | alembic |
|
121 | alembic | |
122 | formencode |
|
122 | formencode | |
123 | pygments |
|
123 | pygments | |
124 | dulwich |
|
124 | dulwich | |
125 | beaker |
|
125 | beaker | |
126 | psycopg2 |
|
126 | psycopg2 | |
127 | docs |
|
127 | docs | |
128 | setup |
|
128 | setup | |
129 | conftest |
|
129 | conftest | |
130 | '''.split()) |
|
130 | '''.split()) | |
131 |
|
131 | |||
132 | normal_modules = set(''' |
|
132 | normal_modules = set(''' | |
133 | kallithea |
|
133 | kallithea | |
|
134 | kallithea.controllers.base | |||
134 | kallithea.lib |
|
135 | kallithea.lib | |
135 | kallithea.lib.auth |
|
136 | kallithea.lib.auth | |
136 | kallithea.lib.auth_modules |
|
137 | kallithea.lib.auth_modules | |
137 | kallithea.lib.base |
|
|||
138 | kallithea.lib.celerylib |
|
138 | kallithea.lib.celerylib | |
139 | kallithea.lib.db_manage |
|
139 | kallithea.lib.db_manage | |
140 | kallithea.lib.helpers |
|
140 | kallithea.lib.helpers | |
141 | kallithea.lib.hooks |
|
141 | kallithea.lib.hooks | |
142 | kallithea.lib.indexers |
|
142 | kallithea.lib.indexers | |
143 | kallithea.lib.utils |
|
143 | kallithea.lib.utils | |
144 | kallithea.lib.utils2 |
|
144 | kallithea.lib.utils2 | |
145 | kallithea.lib.vcs |
|
145 | kallithea.lib.vcs | |
146 | kallithea.lib.webutils |
|
146 | kallithea.lib.webutils | |
147 | kallithea.model |
|
147 | kallithea.model | |
148 | kallithea.model.async_tasks |
|
148 | kallithea.model.async_tasks | |
149 | kallithea.model.scm |
|
149 | kallithea.model.scm | |
150 | kallithea.templates.py |
|
150 | kallithea.templates.py | |
151 | '''.split()) |
|
151 | '''.split()) | |
152 |
|
152 | |||
153 | shown_modules = normal_modules | top_modules |
|
153 | shown_modules = normal_modules | top_modules | |
154 |
|
154 | |||
155 | # break the chains somehow - this is a cleanup TODO list |
|
155 | # break the chains somehow - this is a cleanup TODO list | |
156 | known_violations = [ |
|
156 | known_violations = [ | |
157 | ('kallithea.lib.auth_modules', 'kallithea.lib.auth'), # needs base&facade |
|
157 | ('kallithea.lib.auth_modules', 'kallithea.lib.auth'), # needs base&facade | |
158 | ('kallithea.lib.utils', 'kallithea.model'), # clean up utils |
|
158 | ('kallithea.lib.utils', 'kallithea.model'), # clean up utils | |
159 | ('kallithea.lib.utils', 'kallithea.model.db'), |
|
159 | ('kallithea.lib.utils', 'kallithea.model.db'), | |
160 | ('kallithea.lib.utils', 'kallithea.model.scm'), |
|
160 | ('kallithea.lib.utils', 'kallithea.model.scm'), | |
161 | ('kallithea.model.async_tasks', 'kallithea.lib.helpers'), |
|
161 | ('kallithea.model.async_tasks', 'kallithea.lib.helpers'), | |
162 | ('kallithea.model.async_tasks', 'kallithea.lib.hooks'), |
|
162 | ('kallithea.model.async_tasks', 'kallithea.lib.hooks'), | |
163 | ('kallithea.model.async_tasks', 'kallithea.lib.indexers'), |
|
163 | ('kallithea.model.async_tasks', 'kallithea.lib.indexers'), | |
164 | ('kallithea.model.async_tasks', 'kallithea.model'), |
|
164 | ('kallithea.model.async_tasks', 'kallithea.model'), | |
165 | ('kallithea.model', 'kallithea.lib.auth'), # auth.HasXXX |
|
165 | ('kallithea.model', 'kallithea.lib.auth'), # auth.HasXXX | |
166 | ('kallithea.model', 'kallithea.lib.auth_modules'), # validators |
|
166 | ('kallithea.model', 'kallithea.lib.auth_modules'), # validators | |
167 | ('kallithea.model', 'kallithea.lib.helpers'), |
|
167 | ('kallithea.model', 'kallithea.lib.helpers'), | |
168 | ('kallithea.model', 'kallithea.lib.hooks'), # clean up hooks |
|
168 | ('kallithea.model', 'kallithea.lib.hooks'), # clean up hooks | |
169 | ('kallithea.model', 'kallithea.model.scm'), |
|
169 | ('kallithea.model', 'kallithea.model.scm'), | |
170 | ('kallithea.model.scm', 'kallithea.lib.hooks'), |
|
170 | ('kallithea.model.scm', 'kallithea.lib.hooks'), | |
171 | ] |
|
171 | ] | |
172 |
|
172 | |||
173 | extra_edges = [ |
|
173 | extra_edges = [ | |
174 | ('kallithea.config', 'kallithea.controllers'), # through TG |
|
174 | ('kallithea.config', 'kallithea.controllers'), # through TG | |
175 | ('kallithea.lib.auth', 'kallithea.lib.auth_modules'), # custom loader |
|
175 | ('kallithea.lib.auth', 'kallithea.lib.auth_modules'), # custom loader | |
176 | ] |
|
176 | ] | |
177 |
|
177 | |||
178 |
|
178 | |||
179 | def normalize(s): |
|
179 | def normalize(s): | |
180 | """Given a string with dot path, return the string it should be shown as.""" |
|
180 | """Given a string with dot path, return the string it should be shown as.""" | |
181 | parts = s.replace('.__init__', '').split('.') |
|
181 | parts = s.replace('.__init__', '').split('.') | |
182 | short_2 = '.'.join(parts[:2]) |
|
182 | short_2 = '.'.join(parts[:2]) | |
183 | short_3 = '.'.join(parts[:3]) |
|
183 | short_3 = '.'.join(parts[:3]) | |
184 | short_4 = '.'.join(parts[:4]) |
|
184 | short_4 = '.'.join(parts[:4]) | |
185 | if parts[0] in ['scripts', 'contributor_data', 'i18n_utils']: |
|
185 | if parts[0] in ['scripts', 'contributor_data', 'i18n_utils']: | |
186 | return 'scripts' |
|
186 | return 'scripts' | |
187 | if short_3 == 'kallithea.model.meta': |
|
187 | if short_3 == 'kallithea.model.meta': | |
188 | return 'kallithea.model.db' |
|
188 | return 'kallithea.model.db' | |
189 | if parts[:4] == ['kallithea', 'lib', 'vcs', 'ssh']: |
|
189 | if parts[:4] == ['kallithea', 'lib', 'vcs', 'ssh']: | |
190 | return 'kallithea.lib.vcs.ssh' |
|
190 | return 'kallithea.lib.vcs.ssh' | |
191 | if short_4 in shown_modules: |
|
191 | if short_4 in shown_modules: | |
192 | return short_4 |
|
192 | return short_4 | |
193 | if short_3 in shown_modules: |
|
193 | if short_3 in shown_modules: | |
194 | return short_3 |
|
194 | return short_3 | |
195 | if short_2 in shown_modules: |
|
195 | if short_2 in shown_modules: | |
196 | return short_2 |
|
196 | return short_2 | |
197 | if short_2 == 'kallithea.tests': |
|
197 | if short_2 == 'kallithea.tests': | |
198 | return None |
|
198 | return None | |
199 | if parts[0] in ignored_modules: |
|
199 | if parts[0] in ignored_modules: | |
200 | return None |
|
200 | return None | |
201 | assert parts[0] in bottom_external_modules, parts |
|
201 | assert parts[0] in bottom_external_modules, parts | |
202 | return parts[0] |
|
202 | return parts[0] | |
203 |
|
203 | |||
204 |
|
204 | |||
205 | def main(filenames): |
|
205 | def main(filenames): | |
206 | if not filenames or filenames[0].startswith('-'): |
|
206 | if not filenames or filenames[0].startswith('-'): | |
207 | print('''\ |
|
207 | print('''\ | |
208 | Usage: |
|
208 | Usage: | |
209 | hg files 'set:!binary()&grep("^#!.*python")' 'set:**.py' | xargs scripts/deps.py |
|
209 | hg files 'set:!binary()&grep("^#!.*python")' 'set:**.py' | xargs scripts/deps.py | |
210 | dot -Tsvg deps.dot > deps.svg |
|
210 | dot -Tsvg deps.dot > deps.svg | |
211 | ''') |
|
211 | ''') | |
212 | raise SystemExit(1) |
|
212 | raise SystemExit(1) | |
213 |
|
213 | |||
214 | files_imports = dict() # map filenames to its imports |
|
214 | files_imports = dict() # map filenames to its imports | |
215 | import_deps = set() # set of tuples with module name and its imports |
|
215 | import_deps = set() # set of tuples with module name and its imports | |
216 | for fn in filenames: |
|
216 | for fn in filenames: | |
217 | with open(fn) as f: |
|
217 | with open(fn) as f: | |
218 | s = f.read() |
|
218 | s = f.read() | |
219 |
|
219 | |||
220 | dot_name = (fn[:-3] if fn.endswith('.py') else fn).replace('/', '.') |
|
220 | dot_name = (fn[:-3] if fn.endswith('.py') else fn).replace('/', '.') | |
221 | file_imports = set() |
|
221 | file_imports = set() | |
222 | for m in re.finditer(r'^ *(?:from ([^ ]*) import (?:([a-zA-Z].*)|\(([^)]*)\))|import (.*))$', s, re.MULTILINE): |
|
222 | for m in re.finditer(r'^ *(?:from ([^ ]*) import (?:([a-zA-Z].*)|\(([^)]*)\))|import (.*))$', s, re.MULTILINE): | |
223 | m_from, m_from_import, m_from_import2, m_import = m.groups() |
|
223 | m_from, m_from_import, m_from_import2, m_import = m.groups() | |
224 | if m_from: |
|
224 | if m_from: | |
225 | pre = m_from + '.' |
|
225 | pre = m_from + '.' | |
226 | if pre.startswith('.'): |
|
226 | if pre.startswith('.'): | |
227 | pre = dot_name.rsplit('.', 1)[0] + pre |
|
227 | pre = dot_name.rsplit('.', 1)[0] + pre | |
228 | importlist = m_from_import or m_from_import2 |
|
228 | importlist = m_from_import or m_from_import2 | |
229 | else: |
|
229 | else: | |
230 | pre = '' |
|
230 | pre = '' | |
231 | importlist = m_import |
|
231 | importlist = m_import | |
232 | for imp in importlist.split('#', 1)[0].split(','): |
|
232 | for imp in importlist.split('#', 1)[0].split(','): | |
233 | full_imp = pre + imp.strip().split(' as ', 1)[0] |
|
233 | full_imp = pre + imp.strip().split(' as ', 1)[0] | |
234 | file_imports.add(full_imp) |
|
234 | file_imports.add(full_imp) | |
235 | import_deps.add((dot_name, full_imp)) |
|
235 | import_deps.add((dot_name, full_imp)) | |
236 | files_imports[fn] = file_imports |
|
236 | files_imports[fn] = file_imports | |
237 |
|
237 | |||
238 | # dump out all deps for debugging and analysis |
|
238 | # dump out all deps for debugging and analysis | |
239 | with open('deps.txt', 'w') as f: |
|
239 | with open('deps.txt', 'w') as f: | |
240 | for fn, file_imports in sorted(files_imports.items()): |
|
240 | for fn, file_imports in sorted(files_imports.items()): | |
241 | for file_import in sorted(file_imports): |
|
241 | for file_import in sorted(file_imports): | |
242 | if file_import.split('.', 1)[0] in ignored_modules: |
|
242 | if file_import.split('.', 1)[0] in ignored_modules: | |
243 | continue |
|
243 | continue | |
244 | f.write('%s: %s\n' % (fn, file_import)) |
|
244 | f.write('%s: %s\n' % (fn, file_import)) | |
245 |
|
245 | |||
246 | # find leafs that haven't been ignored - they are the important external dependencies and shown in the bottom row |
|
246 | # find leafs that haven't been ignored - they are the important external dependencies and shown in the bottom row | |
247 | only_imported = set( |
|
247 | only_imported = set( | |
248 | set(normalize(b) for a, b in import_deps) - |
|
248 | set(normalize(b) for a, b in import_deps) - | |
249 | set(normalize(a) for a, b in import_deps) - |
|
249 | set(normalize(a) for a, b in import_deps) - | |
250 | set([None, 'kallithea']) |
|
250 | set([None, 'kallithea']) | |
251 | ) |
|
251 | ) | |
252 |
|
252 | |||
253 | normalized_dep_edges = set() |
|
253 | normalized_dep_edges = set() | |
254 | for dot_name, full_imp in import_deps: |
|
254 | for dot_name, full_imp in import_deps: | |
255 | a = normalize(dot_name) |
|
255 | a = normalize(dot_name) | |
256 | b = normalize(full_imp) |
|
256 | b = normalize(full_imp) | |
257 | if a is None or b is None or a == b: |
|
257 | if a is None or b is None or a == b: | |
258 | continue |
|
258 | continue | |
259 | normalized_dep_edges.add((a, b)) |
|
259 | normalized_dep_edges.add((a, b)) | |
260 | #print((dot_name, full_imp, a, b)) |
|
260 | #print((dot_name, full_imp, a, b)) | |
261 | normalized_dep_edges.update(extra_edges) |
|
261 | normalized_dep_edges.update(extra_edges) | |
262 |
|
262 | |||
263 | unseen_shown_modules = shown_modules.difference(a for a, b in normalized_dep_edges).difference(b for a, b in normalized_dep_edges) |
|
263 | unseen_shown_modules = shown_modules.difference(a for a, b in normalized_dep_edges).difference(b for a, b in normalized_dep_edges) | |
264 | assert not unseen_shown_modules, unseen_shown_modules |
|
264 | assert not unseen_shown_modules, unseen_shown_modules | |
265 |
|
265 | |||
266 | with open('deps.dot', 'w') as f: |
|
266 | with open('deps.dot', 'w') as f: | |
267 | f.write('digraph {\n') |
|
267 | f.write('digraph {\n') | |
268 | f.write('subgraph { rank = same; %s}\n' % ''.join('"%s"; ' % s for s in sorted(top_modules))) |
|
268 | f.write('subgraph { rank = same; %s}\n' % ''.join('"%s"; ' % s for s in sorted(top_modules))) | |
269 | f.write('subgraph { rank = same; %s}\n' % ''.join('"%s"; ' % s for s in sorted(only_imported))) |
|
269 | f.write('subgraph { rank = same; %s}\n' % ''.join('"%s"; ' % s for s in sorted(only_imported))) | |
270 | for a, b in sorted(normalized_dep_edges): |
|
270 | for a, b in sorted(normalized_dep_edges): | |
271 | f.write(' "%s" -> "%s"%s\n' % (a, b, ' [color=red]' if (a, b) in known_violations else ' [color=green]' if (a, b) in extra_edges else '')) |
|
271 | f.write(' "%s" -> "%s"%s\n' % (a, b, ' [color=red]' if (a, b) in known_violations else ' [color=green]' if (a, b) in extra_edges else '')) | |
272 | f.write('}\n') |
|
272 | f.write('}\n') | |
273 |
|
273 | |||
274 | # verify dependencies by untangling dependency chain bottom-up: |
|
274 | # verify dependencies by untangling dependency chain bottom-up: | |
275 | todo = set(normalized_dep_edges) |
|
275 | todo = set(normalized_dep_edges) | |
276 | for x in known_violations: |
|
276 | for x in known_violations: | |
277 | todo.remove(x) |
|
277 | todo.remove(x) | |
278 |
|
278 | |||
279 | while todo: |
|
279 | while todo: | |
280 | depending = set(a for a, b in todo) |
|
280 | depending = set(a for a, b in todo) | |
281 | depended = set(b for a, b in todo) |
|
281 | depended = set(b for a, b in todo) | |
282 | drop = depended - depending |
|
282 | drop = depended - depending | |
283 | if not drop: |
|
283 | if not drop: | |
284 | print('ERROR: cycles:', len(todo)) |
|
284 | print('ERROR: cycles:', len(todo)) | |
285 | for x in sorted(todo): |
|
285 | for x in sorted(todo): | |
286 | print('%s,' % (x,)) |
|
286 | print('%s,' % (x,)) | |
287 | raise SystemExit(1) |
|
287 | raise SystemExit(1) | |
288 | #for do_b in sorted(drop): |
|
288 | #for do_b in sorted(drop): | |
289 | # print('Picking', do_b, '- unblocks:', ' '.join(a for a, b in sorted((todo)) if b == do_b)) |
|
289 | # print('Picking', do_b, '- unblocks:', ' '.join(a for a, b in sorted((todo)) if b == do_b)) | |
290 | todo = set((a, b) for a, b in todo if b in depending) |
|
290 | todo = set((a, b) for a, b in todo if b in depending) | |
291 | #print() |
|
291 | #print() | |
292 |
|
292 | |||
293 |
|
293 | |||
294 | if __name__ == '__main__': |
|
294 | if __name__ == '__main__': | |
295 | main(sys.argv[1:]) |
|
295 | main(sys.argv[1:]) |
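
The loop at the end of ``main()`` verifies the dependency graph by repeatedly peeling off edges whose target nothing else depends on; if no edge can be dropped and edges remain, the leftovers (the cycle itself plus anything blocked by it) are reported. A standalone illustration of that check on an invented edge set::

    # edges as (depending module, depended-upon module); contains a cycle
    edges = {('app.web', 'app.lib'), ('app.lib', 'app.db'),
             ('app.model', 'app.lib'), ('app.lib', 'app.model')}

    todo = set(edges)
    while todo:
        depending = set(a for a, b in todo)
        depended = set(b for a, b in todo)
        if not (depended - depending):
            # nothing left that is only depended upon: the rest is cyclic
            print('cycles:', sorted(todo))
            break
        # keep only edges whose target still depends on something itself
        todo = set((a, b) for a, b in todo if b in depending)
    else:
        print('no cycles')
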