@@ -1,397 +1,397 b'' | |||
|
1 | 1 | .. _contributing: |
|
2 | 2 | |
|
3 | 3 | ========================= |
|
4 | 4 | Contributing to Kallithea |
|
5 | 5 | ========================= |
|
6 | 6 | |
|
7 | 7 | Kallithea is developed and maintained by its users. Please join us and scratch |
|
8 | 8 | your own itch. |
|
9 | 9 | |
|
10 | 10 | |
|
11 | 11 | Infrastructure |
|
12 | 12 | -------------- |
|
13 | 13 | |
|
14 | 14 | The main repository is hosted on Our Own Kallithea (aka OOK) at |
|
15 | 15 | https://kallithea-scm.org/repos/kallithea/, our self-hosted instance |
|
16 | 16 | of Kallithea. |
|
17 | 17 | |
|
18 | 18 | Please use the `mailing list`_ to send patches or report issues. |
|
19 | 19 | |
|
20 | 20 | We use Weblate_ to translate the user interface messages into languages other |
|
21 | 21 | than English. Join our project on `Hosted Weblate`_ to help us. |
|
22 | 22 | To register, you can use your Bitbucket or GitHub account. See :ref:`translations` |
|
23 | 23 | for more details. |
|
24 | 24 | |
|
25 | 25 | |
|
26 | 26 | Getting started |
|
27 | 27 | --------------- |
|
28 | 28 | |
|
29 | 29 | To get started with Kallithea development run the following commands in your |
|
30 | 30 | bash shell:: |
|
31 | 31 | |
|
32 | 32 | hg clone https://kallithea-scm.org/repos/kallithea |
|
33 | 33 | cd kallithea |
|
34 | 34 | python3 -m venv venv |
|
35 | 35 | . venv/bin/activate |
|
36 | 36 | pip install --upgrade pip setuptools |
|
37 | 37 | pip install --upgrade -e . -r dev_requirements.txt python-ldap python-pam |
|
38 | 38 | kallithea-cli config-create my.ini |
|
39 | 39 | kallithea-cli db-create -c my.ini --user=user --email=user@example.com --password=password --repos=/tmp |
|
40 | 40 | kallithea-cli front-end-build |
|
41 | 41 | gearbox serve -c my.ini --reload & |
|
42 | 42 | firefox http://127.0.0.1:5000/ |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | Contribution flow |
|
46 | 46 | ----------------- |
|
47 | 47 | |
|
48 | 48 | Starting from an existing Kallithea clone, make sure it is up to date with the |
|
49 | 49 | latest upstream changes:: |
|
50 | 50 | |
|
51 | 51 | hg pull |
|
52 | 52 | hg update |
|
53 | 53 | |
|
54 | 54 | Review the :ref:`contributing-guidelines` and :ref:`coding-guidelines`. |
|
55 | 55 | |
|
56 | 56 | If you are new to Mercurial, refer to Mercurial `Quick Start`_ and `Beginners |
|
57 | 57 | Guide`_ on the Mercurial wiki. |
|
58 | 58 | |
|
59 | 59 | Now, make some changes and test them (see :ref:`contributing-tests`). Don't |
|
60 | 60 | forget to add new tests to cover new functionality or bug fixes. |
|
61 | 61 | |
|
62 | 62 | For documentation changes, run ``make html`` from the ``docs`` directory to |
|
63 | 63 | generate the HTML result, then review them in your browser. |
|
64 | 64 | |
|
65 | 65 | Before submitting any changes, run the cleanup script:: |
|
66 | 66 | |
|
67 | 67 | ./scripts/run-all-cleanup |
|
68 | 68 | |
|
69 | 69 | When you are completely ready, you can send your changes to the community for |
|
70 | 70 | review and inclusion, via the mailing list (via ``hg email``). |
|
71 | 71 | |
|
72 | 72 | .. _contributing-tests: |
|
73 | 73 | |
|
74 | 74 | |
|
75 | 75 | Internal dependencies |
|
76 | 76 | --------------------- |
|
77 | 77 | |
|
78 | 78 | We try to keep the code base clean and modular and avoid circular dependencies. |
|
79 | 79 | Code should only invoke code in layers below itself. |
|
80 | 80 | |
|
81 | 81 | Imports should import whole modules ``from`` their parent module, perhaps |
|
82 | 82 | ``as`` a shortened name. Avoid imports ``from`` modules. |
|
83 | 83 | |
|
84 | 84 | To avoid cycles and partially initialized modules, ``__init__.py`` should *not* |
|
85 | 85 | contain any non-trivial imports. The top level of a module should *not* be a |
|
86 | 86 | facade for the module functionality. |
|
87 | 87 | |
|
88 | 88 | Common code for a module is often in ``base.py``. |
|
89 | 89 | |
|
90 | 90 | The important part of the dependency graph is approximately linear. In the |
|
91 | 91 | following list, modules may only depend on modules below them: |
|
92 | 92 | |
|
93 | 93 | ``tests`` |
|
94 | 94 | Just get the job done - anything goes. |
|
95 | 95 | |
|
96 | 96 | ``bin/`` & ``config/`` & ``alembic/`` |
|
97 | 97 | The main entry points, defined in ``setup.py``. Note: The TurboGears template |
|
98 | 98 | use ``config`` for the high WSGI application - this is not for low level |
|
99 | 99 | configuration. |
|
100 | 100 | |
|
101 | 101 | ``controllers/`` |
|
102 | 102 | The top level web application, with TurboGears using the ``root`` controller |
|
103 | 103 | as entry point, and ``routing`` dispatching to other controllers. |
|
104 | 104 | |
|
105 | 105 | ``templates/**.html`` |
|
106 | 106 | The "view", rendering to HTML. Invoked by controllers which can pass them |
|
107 | 107 | anything from lower layers - especially ``helpers`` available as ``h`` will |
|
108 | 108 | cut through all layers, and ``c`` gives access to global variables. |
|
109 | 109 | |
|
110 | 110 | ``lib/helpers.py`` |
|
111 | 111 | High level helpers, exposing everything to templates as ``h``. It depends on |
|
112 | 112 | everything and has a huge dependency chain, so it should not be used for |
|
113 | 113 | anything else. TODO. |
|
114 | 114 | |
|
115 | ``controlles/base.py`` | |
|
115 | ``controllers/base.py`` | |
|
116 | 116 | The base class of controllers, with lots of model knowledge. |
|
117 | 117 | |
|
118 | 118 | ``lib/auth.py`` |
|
119 | 119 | All things related to authentication. TODO. |
|
120 | 120 | |
|
121 | 121 | ``lib/utils.py`` |
|
122 | 122 | High level utils with lots of model knowledge. TODO. |
|
123 | 123 | |
|
124 | 124 | ``lib/hooks.py`` |
|
125 | 125 | Hooks into "everything" to give centralized logging to database, cache |
|
126 | 126 | invalidation, and extension handling. TODO. |
|
127 | 127 | |
|
128 | 128 | ``model/`` |
|
129 | 129 | Convenience business logic wrappers around database models. |
|
130 | 130 | |
|
131 | 131 | ``model/db.py`` |
|
132 | 132 | Defines the database schema and provides some additional logic. |
|
133 | 133 | |
|
134 | 134 | ``model/scm.py`` |
|
135 | 135 | All things related to anything. TODO. |
|
136 | 136 | |
|
137 | 137 | SQLAlchemy |
|
138 | 138 | Database session and transaction in thread-local variables. |
|
139 | 139 | |
|
140 | 140 | ``lib/utils2.py`` |
|
141 | 141 | Low level utils specific to Kallithea. |
|
142 | 142 | |
|
143 | 143 | ``lib/webutils.py`` |
|
144 | 144 | Low level generic utils with awareness of the TurboGears environment. |
|
145 | 145 | |
|
146 | 146 | TurboGears |
|
147 | 147 | Request, response and state like i18n gettext in thread-local variables. |
|
148 | 148 | External dependency with global state - usage should be minimized. |
|
149 | 149 | |
|
150 | 150 | ``lib/vcs/`` |
|
151 | 151 | Previously an independent library. No awareness of web, database, or state. |
|
152 | 152 | |
|
153 | 153 | ``lib/*`` |
|
154 | 154 | Various "pure" functionality not depending on anything else. |
|
155 | 155 | |
|
156 | 156 | ``__init__`` |
|
157 | 157 | Very basic Kallithea constants - some of them are set very early based on ``.ini``. |
|
158 | 158 | |
|
159 | 159 | This is not exactly how it is right now, but we aim for something like that. |
|
160 | 160 | Especially the areas marked as TODO have some problems that need untangling. |
|
161 | 161 | |
|
162 | 162 | |
|
163 | 163 | Running tests |
|
164 | 164 | ------------- |
|
165 | 165 | |
|
166 | 166 | After finishing your changes make sure all tests pass cleanly. Run the testsuite |
|
167 | 167 | by invoking ``py.test`` from the project root:: |
|
168 | 168 | |
|
169 | 169 | py.test |
|
170 | 170 | |
|
171 | 171 | Note that on unix systems, the temporary directory (``/tmp`` or where |
|
172 | 172 | ``$TMPDIR`` points) must allow executable files; Git hooks must be executable, |
|
173 | 173 | and the test suite creates repositories in the temporary directory. Linux |
|
174 | 174 | systems with /tmp mounted noexec will thus fail. |
|
175 | 175 | |
|
176 | 176 | Tests can be run on PostgreSQL like:: |
|
177 | 177 | |
|
178 | 178 | sudo -u postgres createuser 'kallithea-test' --pwprompt # password password |
|
179 | 179 | sudo -u postgres createdb 'kallithea-test' --owner 'kallithea-test' |
|
180 | 180 | REUSE_TEST_DB='postgresql://kallithea-test:password@localhost/kallithea-test' py.test |
|
181 | 181 | |
|
182 | 182 | Tests can be run on MariaDB/MySQL like:: |
|
183 | 183 | |
|
184 | 184 | echo "GRANT ALL PRIVILEGES ON \`kallithea-test\`.* TO 'kallithea-test'@'localhost' IDENTIFIED BY 'password'" | sudo -u mysql mysql |
|
185 | 185 | TEST_DB='mysql://kallithea-test:password@localhost/kallithea-test?charset=utf8mb4' py.test |
|
186 | 186 | |
|
187 | 187 | You can also use ``tox`` to run the tests with all supported Python versions. |
|
188 | 188 | |
|
189 | 189 | When running tests, Kallithea generates a `test.ini` based on template values |
|
190 | 190 | in `kallithea/tests/conftest.py` and populates the SQLite database specified |
|
191 | 191 | there. |
|
192 | 192 | |
|
193 | 193 | It is possible to avoid recreating the full test database on each invocation of |
|
194 | 194 | the tests, thus eliminating the initial delay. To achieve this, run the tests as:: |
|
195 | 195 | |
|
196 | 196 | gearbox serve -c /tmp/kallithea-test-XXX/test.ini --pid-file=test.pid --daemon |
|
197 | 197 | KALLITHEA_WHOOSH_TEST_DISABLE=1 KALLITHEA_NO_TMP_PATH=1 py.test |
|
198 | 198 | kill -9 $(cat test.pid) |
|
199 | 199 | |
|
200 | 200 | In these commands, the following variables are used:: |
|
201 | 201 | |
|
202 | 202 | KALLITHEA_WHOOSH_TEST_DISABLE=1 - skip whoosh index building and tests |
|
203 | 203 | KALLITHEA_NO_TMP_PATH=1 - disable new temp path for tests, used mostly for testing_vcs_operations |
|
204 | 204 | |
|
205 | 205 | You can run individual tests by specifying their path as argument to py.test. |
|
206 | 206 | py.test also has many more options, see `py.test -h`. Some useful options |
|
207 | 207 | are:: |
|
208 | 208 | |
|
209 | 209 | -k EXPRESSION only run tests which match the given substring |
|
210 | 210 | expression. An expression is a python evaluable |
|
211 | 211 | expression where all names are substring-matched |
|
212 | 212 | against test names and their parent classes. Example: |
|
213 | 213 | -x, --exitfirst exit instantly on first error or failed test. |
|
214 | 214 | --lf rerun only the tests that failed at the last run (or |
|
215 | 215 | all if none failed) |
|
216 | 216 | --ff run all tests but run the last failures first. This |
|
217 | 217 | may re-order tests and thus lead to repeated fixture |
|
218 | 218 | setup/teardown |
|
219 | 219 | --pdb start the interactive Python debugger on errors. |
|
220 | 220 | -s, --capture=no don't capture stdout (any stdout output will be |
|
221 | 221 | printed immediately) |
|
222 | 222 | |
|
223 | 223 | Performance tests |
|
224 | 224 | ^^^^^^^^^^^^^^^^^ |
|
225 | 225 | |
|
226 | 226 | A number of performance tests are present in the test suite, but they are |
|
227 | 227 | not run in a standard test run. These tests are useful to |
|
228 | 228 | evaluate the impact of certain code changes with respect to performance. |
|
229 | 229 | |
|
230 | 230 | To run these tests:: |
|
231 | 231 | |
|
232 | 232 | env TEST_PERFORMANCE=1 py.test kallithea/tests/performance |
|
233 | 233 | |
|
234 | 234 | To analyze performance, you could install pytest-profiling_, which enables the |
|
235 | 235 | --profile and --profile-svg options to py.test. |
|
236 | 236 | |
|
237 | 237 | .. _pytest-profiling: https://github.com/manahl/pytest-plugins/tree/master/pytest-profiling |
|
238 | 238 | |
|
239 | 239 | .. _contributing-guidelines: |
|
240 | 240 | |
|
241 | 241 | |
|
242 | 242 | Contribution guidelines |
|
243 | 243 | ----------------------- |
|
244 | 244 | |
|
245 | 245 | Kallithea is GPLv3 and we assume all contributions are made by the |
|
246 | 246 | committer/contributor and under GPLv3 unless explicitly stated. We do care a |
|
247 | 247 | lot about preservation of copyright and license information for existing code |
|
248 | 248 | that is brought into the project. |
|
249 | 249 | |
|
250 | 250 | Contributions will be accepted in most formats -- such as commits hosted on your |
|
251 | 251 | own Kallithea instance, or patches sent by email to the `kallithea-general`_ |
|
252 | 252 | mailing list. |
|
253 | 253 | |
|
254 | 254 | Make sure to test your changes both manually and with the automatic tests |
|
255 | 255 | before posting. |
|
256 | 256 | |
|
257 | 257 | We care about quality and review and keeping a clean repository history. We |
|
258 | 258 | might give feedback that requests polishing contributions until they are |
|
259 | 259 | "perfect". We might also rebase and collapse and make minor adjustments to your |
|
260 | 260 | changes when we apply them. |
|
261 | 261 | |
|
262 | 262 | We try to make sure we have consensus on the direction the project is taking. |
|
263 | 263 | Everything non-sensitive should be discussed in public -- preferably on the |
|
264 | 264 | mailing list. We aim at having all non-trivial changes reviewed by at least |
|
265 | 265 | one other core developer before pushing. Obvious non-controversial changes will |
|
266 | 266 | be handled more casually. |
|
267 | 267 | |
|
268 | 268 | There is a main development branch ("default") which is generally stable so that |
|
269 | 269 | it can be (and is) used in production. There is also a "stable" branch that is |
|
270 | 270 | almost exclusively reserved for bug fixes or trivial changes. Experimental |
|
271 | 271 | changes should live elsewhere (for example in a pull request) until they are |
|
272 | 272 | ready. |
|
273 | 273 | |
|
274 | 274 | .. _coding-guidelines: |
|
275 | 275 | |
|
276 | 276 | |
|
277 | 277 | Coding guidelines |
|
278 | 278 | ----------------- |
|
279 | 279 | |
|
280 | 280 | We don't have a formal coding/formatting standard. We are currently using a mix |
|
281 | 281 | of Mercurial's (https://www.mercurial-scm.org/wiki/CodingStyle), pep8, and |
|
282 | 282 | consistency with existing code. Run ``scripts/run-all-cleanup`` before |
|
283 | 283 | committing to ensure some basic code formatting consistency. |
|
284 | 284 | |
|
285 | 285 | We support Python 3.6 and later. |
|
286 | 286 | |
|
287 | 287 | We try to support the most common modern web browsers. IE9 is still supported |
|
288 | 288 | to the extent it is feasible, IE8 is not. |
|
289 | 289 | |
|
290 | 290 | We primarily support Linux and OS X on the server side but Windows should also work. |
|
291 | 291 | |
|
292 | 292 | HTML templates should use 2 spaces for indentation ... but be pragmatic. We |
|
293 | 293 | should use templates cleverly and avoid duplication. We should use reasonable |
|
294 | 294 | semantic markup with element classes and IDs that can be used for styling and testing. |
|
295 | 295 | We should only use inline styles in places where it really is semantic (such as |
|
296 | 296 | ``display: none``). |
|
297 | 297 | |
|
298 | 298 | JavaScript must use ``;`` between/after statements. Indentation 4 spaces. Inline |
|
299 | 299 | multiline functions should be indented two levels -- one for the ``()`` and one for |
|
300 | 300 | ``{}``. |
|
301 | 301 | Variables holding jQuery objects should be named with a leading ``$``. |
|
302 | 302 | |
|
303 | 303 | Commit messages should have a leading short line summarizing the changes. For |
|
304 | 304 | bug fixes, put ``(Issue #123)`` at the end of this line. |
|
305 | 305 | |
|
306 | 306 | Use American English grammar and spelling overall. Use `English title case`_ for |
|
307 | 307 | page titles, button labels, headers, and 'labels' for fields in forms. |
|
308 | 308 | |
|
309 | 309 | .. _English title case: https://en.wikipedia.org/wiki/Capitalization#Title_case |
|
310 | 310 | |
|
311 | 311 | Template helpers (that is, everything in ``kallithea.lib.helpers``) |
|
312 | 312 | should only be referenced from templates. If you need to call a |
|
313 | 313 | helper from the Python code, consider moving the function somewhere |
|
314 | 314 | else (e.g. to the model). |
|
315 | 315 | |
|
316 | 316 | Notes on the SQLAlchemy session |
|
317 | 317 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
318 | 318 | |
|
319 | 319 | Each HTTP request runs inside an independent SQLAlchemy session (as well |
|
320 | 320 | as in an independent database transaction). ``Session`` is the session manager |
|
321 | 321 | and factory. ``Session()`` will create a new session on-demand or return the |
|
322 | 322 | current session for the active thread. Many database operations are methods on |
|
323 | 323 | such session instances. The session will generally be removed by |
|
324 | 324 | TurboGears automatically. |
|
325 | 325 | |
|
326 | 326 | Database model objects |
|
327 | 327 | (almost) always belong to a particular SQLAlchemy session, which means |
|
328 | 328 | that SQLAlchemy will ensure that they're kept in sync with the database |
|
329 | 329 | (but also means that they cannot be shared across requests). |
|
330 | 330 | |
|
331 | 331 | Objects can be added to the session using ``Session().add``, but this is |
|
332 | 332 | rarely needed: |
|
333 | 333 | |
|
334 | 334 | * When creating a database object by calling the constructor directly, |
|
335 | 335 | it must explicitly be added to the session. |
|
336 | 336 | |
|
337 | 337 | * When creating an object using a factory function (like |
|
338 | 338 | ``create_repo``), the returned object has already (by convention) |
|
339 | 339 | been added to the session, and should not be added again. |
|
340 | 340 | |
|
341 | 341 | * When getting an object from the session (via ``Session().query`` or |
|
342 | 342 | any of the utility functions that look up objects in the database), |
|
343 | 343 | it's already part of the session, and should not be added again. |
|
344 | 344 | SQLAlchemy monitors attribute modifications automatically for all |
|
345 | 345 | objects it knows about and syncs them to the database. |
|
346 | 346 | |
|
347 | 347 | SQLAlchemy also flushes changes to the database automatically; manually |
|
348 | 348 | calling ``Session().flush`` is usually only necessary when the Python |
|
349 | 349 | code needs the database to assign an "auto-increment" primary key ID to |
|
350 | 350 | a freshly created model object (before flushing, the ID attribute will |
|
351 | 351 | be ``None``). |
|
352 | 352 | |
|
353 | 353 | Debugging |
|
354 | 354 | ^^^^^^^^^ |
|
355 | 355 | |
|
356 | 356 | A good way to trace what Kallithea is doing is to keep an eye on the output on |
|
357 | 357 | stdout/stderr of the server process. Perhaps change ``my.ini`` to log at |
|
358 | 358 | ``DEBUG`` or ``INFO`` level, especially ``[logger_kallithea]``, but perhaps |
|
359 | 359 | also other loggers. It is often easier to add additional ``log`` or ``print`` |
|
360 | 360 | statements than to use a Python debugger. |
|
361 | 361 | |
|
362 | 362 | Sometimes it is simpler to disable ``errorpage.enabled`` and perhaps also |
|
363 | 363 | ``trace_errors.enable`` to expose raw errors instead of adding extra |
|
364 | 364 | processing. Enabling ``debug`` can be helpful for showing and exploring |
|
365 | 365 | tracebacks in the browser, but is also insecure and will add extra processing. |
|
366 | 366 | |
|
367 | 367 | TurboGears2 DebugBar |
|
368 | 368 | ^^^^^^^^^^^^^^^^^^^^ |
|
369 | 369 | |
|
370 | 370 | It is possible to enable the TurboGears2-provided DebugBar_, a toolbar overlayed |
|
371 | 371 | over the Kallithea web interface, allowing you to see: |
|
372 | 372 | |
|
373 | 373 | * timing information of the current request, including profiling information |
|
374 | 374 | * request data, including GET data, POST data, cookies, headers and environment |
|
375 | 375 | variables |
|
376 | 376 | * a list of executed database queries, including timing and result values |
|
377 | 377 | |
|
378 | 378 | DebugBar is only activated when ``debug = true`` is set in the configuration |
|
379 | 379 | file. This is important, because the DebugBar toolbar will be visible for all |
|
380 | 380 | users, and allow them to see information they should not be allowed to see. Like |
|
381 | 381 | is anyway the case for ``debug = true``, do not use this in production! |
|
382 | 382 | |
|
383 | 383 | To enable DebugBar, install ``tgext.debugbar`` and ``kajiki`` (typically via |
|
384 | 384 | ``pip``) and restart Kallithea (in debug mode). |
|
385 | 385 | |
|
386 | 386 | |
|
387 | 387 | Thank you for your contribution! |
|
388 | 388 | -------------------------------- |
|
389 | 389 | |
|
390 | 390 | |
|
391 | 391 | .. _Weblate: http://weblate.org/ |
|
392 | 392 | .. _mailing list: http://lists.sfconservancy.org/mailman/listinfo/kallithea-general |
|
393 | 393 | .. _kallithea-general: http://lists.sfconservancy.org/mailman/listinfo/kallithea-general |
|
394 | 394 | .. _Hosted Weblate: https://hosted.weblate.org/projects/kallithea/kallithea/ |
|
395 | 395 | .. _DebugBar: https://github.com/TurboGears/tgext.debugbar |
|
396 | 396 | .. _Quick Start: https://www.mercurial-scm.org/wiki/QuickStart |
|
397 | 397 | .. _Beginners Guide: https://www.mercurial-scm.org/wiki/BeginnersGuides |
@@ -1,93 +1,93 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.config.middleware.simplegit |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | SimpleGit middleware for handling Git protocol requests (push/clone etc.) |
|
19 | 19 | It's implemented with basic auth function |
|
20 | 20 | |
|
21 | 21 | This file was forked by the Kallithea project in July 2014. |
|
22 | 22 | Original author and date, and relevant copyright and licensing information is below: |
|
23 | 23 | :created_on: Apr 28, 2010 |
|
24 | 24 | :author: marcink |
|
25 | 25 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
26 | 26 | :license: GPLv3, see LICENSE.md for more details. |
|
27 | 27 | |
|
28 | 28 | """ |
|
29 | 29 | |
|
30 | 30 | |
|
31 | 31 | import logging |
|
32 | 32 | import re |
|
33 | 33 | |
|
34 | 34 | from kallithea.config.middleware.pygrack import make_wsgi_app |
|
35 | from kallithea.controllers import base | |
|
35 | 36 | from kallithea.lib import hooks |
|
36 | from kallithea.lib.base import BaseVCSController, get_path_info | |
|
37 | 37 | |
|
38 | 38 | |
|
39 | 39 | log = logging.getLogger(__name__) |
|
40 | 40 | |
|
41 | 41 | |
|
42 | 42 | GIT_PROTO_PAT = re.compile(r'^/(.+)/(info/refs|git-upload-pack|git-receive-pack)$') |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | cmd_mapping = { |
|
46 | 46 | 'git-receive-pack': 'push', |
|
47 | 47 | 'git-upload-pack': 'pull', |
|
48 | 48 | } |
|
49 | 49 | |
|
50 | 50 | |
|
51 | class SimpleGit(BaseVCSController): | |
|
51 | class SimpleGit(base.BaseVCSController): | |
|
52 | 52 | |
|
53 | 53 | scm_alias = 'git' |
|
54 | 54 | |
|
55 | 55 | @classmethod |
|
56 | 56 | def parse_request(cls, environ): |
|
57 | path_info = get_path_info(environ) | |
|
57 | path_info = base.get_path_info(environ) | |
|
58 | 58 | m = GIT_PROTO_PAT.match(path_info) |
|
59 | 59 | if m is None: |
|
60 | 60 | return None |
|
61 | 61 | |
|
62 | 62 | class parsed_request(object): |
|
63 | 63 | # See https://git-scm.com/book/en/v2/Git-Internals-Transfer-Protocols#_the_smart_protocol |
|
64 | 64 | repo_name = m.group(1).rstrip('/') |
|
65 | 65 | cmd = m.group(2) |
|
66 | 66 | |
|
67 | 67 | query_string = environ['QUERY_STRING'] |
|
68 | 68 | if cmd == 'info/refs' and query_string.startswith('service='): |
|
69 | 69 | service = query_string.split('=', 1)[1] |
|
70 | 70 | action = cmd_mapping.get(service) |
|
71 | 71 | else: |
|
72 | 72 | service = None |
|
73 | 73 | action = cmd_mapping.get(cmd) |
|
74 | 74 | |
|
75 | 75 | return parsed_request |
|
76 | 76 | |
|
77 | 77 | def _make_app(self, parsed_request): |
|
78 | 78 | """ |
|
79 | 79 | Return a pygrack wsgi application. |
|
80 | 80 | """ |
|
81 | 81 | pygrack_app = make_wsgi_app(parsed_request.repo_name, self.basepath) |
|
82 | 82 | |
|
83 | 83 | def wrapper_app(environ, start_response): |
|
84 | 84 | if (parsed_request.cmd == 'info/refs' and |
|
85 | 85 | parsed_request.service == 'git-upload-pack' |
|
86 | 86 | ): |
|
87 | 87 | # Run hooks like Mercurial outgoing.kallithea_pull_action does |
|
88 | 88 | hooks.log_pull_action() |
|
89 | 89 | # Note: push hooks are handled by post-receive hook |
|
90 | 90 | |
|
91 | 91 | return pygrack_app(environ, start_response) |
|
92 | 92 | |
|
93 | 93 | return wrapper_app |
@@ -1,149 +1,149 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.config.middleware.simplehg |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | SimpleHg middleware for handling Mercurial protocol requests (push/clone etc.). |
|
19 | 19 | It's implemented with basic auth function |
|
20 | 20 | |
|
21 | 21 | This file was forked by the Kallithea project in July 2014. |
|
22 | 22 | Original author and date, and relevant copyright and licensing information is below: |
|
23 | 23 | :created_on: Apr 28, 2010 |
|
24 | 24 | :author: marcink |
|
25 | 25 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
26 | 26 | :license: GPLv3, see LICENSE.md for more details. |
|
27 | 27 | |
|
28 | 28 | """ |
|
29 | 29 | |
|
30 | 30 | |
|
31 | 31 | import logging |
|
32 | 32 | import os |
|
33 | 33 | import urllib.parse |
|
34 | 34 | |
|
35 | 35 | import mercurial.hgweb |
|
36 | 36 | |
|
37 | from kallithea.lib.base import BaseVCSController, get_path_info | |
|
37 | from kallithea.controllers import base | |
|
38 | 38 | from kallithea.lib.utils import make_ui |
|
39 | 39 | from kallithea.lib.utils2 import safe_bytes |
|
40 | 40 | |
|
41 | 41 | |
|
42 | 42 | log = logging.getLogger(__name__) |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | def get_header_hgarg(environ): |
|
46 | 46 | """Decode the special Mercurial encoding of big requests over multiple headers. |
|
47 | 47 | >>> get_header_hgarg({}) |
|
48 | 48 | '' |
|
49 | 49 | >>> get_header_hgarg({'HTTP_X_HGARG_0': ' ', 'HTTP_X_HGARG_1': 'a','HTTP_X_HGARG_2': '','HTTP_X_HGARG_3': 'b+c %20'}) |
|
50 | 50 | 'ab+c %20' |
|
51 | 51 | """ |
|
52 | 52 | chunks = [] |
|
53 | 53 | i = 1 |
|
54 | 54 | while True: |
|
55 | 55 | v = environ.get('HTTP_X_HGARG_%d' % i) |
|
56 | 56 | if v is None: |
|
57 | 57 | break |
|
58 | 58 | chunks.append(v) |
|
59 | 59 | i += 1 |
|
60 | 60 | return ''.join(chunks) |
|
61 | 61 | |
|
62 | 62 | |
|
63 | 63 | cmd_mapping = { |
|
64 | 64 | # 'batch' is not in this list - it is handled explicitly |
|
65 | 65 | 'between': 'pull', |
|
66 | 66 | 'branches': 'pull', |
|
67 | 67 | 'branchmap': 'pull', |
|
68 | 68 | 'capabilities': 'pull', |
|
69 | 69 | 'changegroup': 'pull', |
|
70 | 70 | 'changegroupsubset': 'pull', |
|
71 | 71 | 'changesetdata': 'pull', |
|
72 | 72 | 'clonebundles': 'pull', |
|
73 | 73 | 'debugwireargs': 'pull', |
|
74 | 74 | 'filedata': 'pull', |
|
75 | 75 | 'getbundle': 'pull', |
|
76 | 76 | 'getlfile': 'pull', |
|
77 | 77 | 'heads': 'pull', |
|
78 | 78 | 'hello': 'pull', |
|
79 | 79 | 'known': 'pull', |
|
80 | 80 | 'lheads': 'pull', |
|
81 | 81 | 'listkeys': 'pull', |
|
82 | 82 | 'lookup': 'pull', |
|
83 | 83 | 'manifestdata': 'pull', |
|
84 | 84 | 'narrow_widen': 'pull', |
|
85 | 85 | 'protocaps': 'pull', |
|
86 | 86 | 'statlfile': 'pull', |
|
87 | 87 | 'stream_out': 'pull', |
|
88 | 88 | 'pushkey': 'push', |
|
89 | 89 | 'putlfile': 'push', |
|
90 | 90 | 'unbundle': 'push', |
|
91 | 91 | } |
|
92 | 92 | |
|
93 | 93 | |
|
94 | class SimpleHg(BaseVCSController): | |
|
94 | class SimpleHg(base.BaseVCSController): | |
|
95 | 95 | |
|
96 | 96 | scm_alias = 'hg' |
|
97 | 97 | |
|
98 | 98 | @classmethod |
|
99 | 99 | def parse_request(cls, environ): |
|
100 | 100 | http_accept = environ.get('HTTP_ACCEPT', '') |
|
101 | 101 | if not http_accept.startswith('application/mercurial'): |
|
102 | 102 | return None |
|
103 | path_info = get_path_info(environ) | |
|
103 | path_info = base.get_path_info(environ) | |
|
104 | 104 | if not path_info.startswith('/'): # it must! |
|
105 | 105 | return None |
|
106 | 106 | |
|
107 | 107 | class parsed_request(object): |
|
108 | 108 | repo_name = path_info[1:].rstrip('/') |
|
109 | 109 | |
|
110 | 110 | query_string = environ['QUERY_STRING'] |
|
111 | 111 | |
|
112 | 112 | action = None |
|
113 | 113 | for qry in query_string.split('&'): |
|
114 | 114 | parts = qry.split('=', 1) |
|
115 | 115 | if len(parts) == 2 and parts[0] == 'cmd': |
|
116 | 116 | cmd = parts[1] |
|
117 | 117 | if cmd == 'batch': |
|
118 | 118 | hgarg = get_header_hgarg(environ) |
|
119 | 119 | if not hgarg.startswith('cmds='): |
|
120 | 120 | action = 'push' # paranoid and safe |
|
121 | 121 | break |
|
122 | 122 | action = 'pull' |
|
123 | 123 | for cmd_arg in hgarg[5:].split(';'): |
|
124 | 124 | cmd, _args = urllib.parse.unquote_plus(cmd_arg).split(' ', 1) |
|
125 | 125 | op = cmd_mapping.get(cmd, 'push') |
|
126 | 126 | if op != 'pull': |
|
127 | 127 | assert op == 'push' |
|
128 | 128 | action = 'push' |
|
129 | 129 | break |
|
130 | 130 | else: |
|
131 | 131 | action = cmd_mapping.get(cmd, 'push') |
|
132 | 132 | break # only process one cmd |
|
133 | 133 | |
|
134 | 134 | return parsed_request |
|
135 | 135 | |
|
136 | 136 | def _make_app(self, parsed_request): |
|
137 | 137 | """ |
|
138 | 138 | Make an hgweb wsgi application. |
|
139 | 139 | """ |
|
140 | 140 | repo_name = parsed_request.repo_name |
|
141 | 141 | repo_path = os.path.join(self.basepath, repo_name) |
|
142 | 142 | baseui = make_ui(repo_path=repo_path) |
|
143 | 143 | hgweb_app = mercurial.hgweb.hgweb(safe_bytes(repo_path), name=safe_bytes(repo_name), baseui=baseui) |
|
144 | 144 | |
|
145 | 145 | def wrapper_app(environ, start_response): |
|
146 | 146 | environ['REPO_NAME'] = repo_name # used by mercurial.hgweb.hgweb |
|
147 | 147 | return hgweb_app(environ, start_response) |
|
148 | 148 | |
|
149 | 149 | return wrapper_app |
@@ -1,102 +1,102 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.config.middleware.wrapper |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Wrap app to measure request and response time ... all the way to the response |
|
19 | 19 | WSGI iterator has been closed. |
|
20 | 20 | |
|
21 | 21 | This file was forked by the Kallithea project in July 2014. |
|
22 | 22 | Original author and date, and relevant copyright and licensing information is below: |
|
23 | 23 | :created_on: May 23, 2013 |
|
24 | 24 | :author: marcink |
|
25 | 25 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
26 | 26 | :license: GPLv3, see LICENSE.md for more details. |
|
27 | 27 | """ |
|
28 | 28 | |
|
29 | 29 | import logging |
|
30 | 30 | import time |
|
31 | 31 | |
|
32 | from kallithea.lib.base import get_ip_addr, get_path_info | |
|
32 | from kallithea.controllers import base | |
|
33 | 33 | |
|
34 | 34 | |
|
35 | 35 | log = logging.getLogger(__name__) |
|
36 | 36 | |
|
37 | 37 | |
|
38 | 38 | class Meter: |
|
39 | 39 | |
|
40 | 40 | def __init__(self, start_response): |
|
41 | 41 | self._start_response = start_response |
|
42 | 42 | self._start = time.time() |
|
43 | 43 | self.status = None |
|
44 | 44 | self._size = 0 |
|
45 | 45 | |
|
46 | 46 | def duration(self): |
|
47 | 47 | return time.time() - self._start |
|
48 | 48 | |
|
49 | 49 | def start_response(self, status, response_headers, exc_info=None): |
|
50 | 50 | self.status = status |
|
51 | 51 | write = self._start_response(status, response_headers, exc_info) |
|
52 | 52 | def metered_write(s): |
|
53 | 53 | self.measure(s) |
|
54 | 54 | write(s) |
|
55 | 55 | return metered_write |
|
56 | 56 | |
|
57 | 57 | def measure(self, chunk): |
|
58 | 58 | self._size += len(chunk) |
|
59 | 59 | |
|
60 | 60 | def size(self): |
|
61 | 61 | return self._size |
|
62 | 62 | |
|
63 | 63 | |
|
64 | 64 | class ResultIter: |
|
65 | 65 | |
|
66 | 66 | def __init__(self, result, meter, description): |
|
67 | 67 | self._result_close = getattr(result, 'close', None) or (lambda: None) |
|
68 | 68 | self._next = iter(result).__next__ |
|
69 | 69 | self._meter = meter |
|
70 | 70 | self._description = description |
|
71 | 71 | |
|
72 | 72 | def __iter__(self): |
|
73 | 73 | return self |
|
74 | 74 | |
|
75 | 75 | def __next__(self): |
|
76 | 76 | chunk = self._next() |
|
77 | 77 | self._meter.measure(chunk) |
|
78 | 78 | return chunk |
|
79 | 79 | |
|
80 | 80 | def close(self): |
|
81 | 81 | self._result_close() |
|
82 | 82 | log.info("%s responded %r after %.3fs with %s bytes", self._description, self._meter.status, self._meter.duration(), self._meter.size()) |
|
83 | 83 | |
|
84 | 84 | |
|
85 | 85 | class RequestWrapper(object): |
|
86 | 86 | |
|
87 | 87 | def __init__(self, app, config): |
|
88 | 88 | self.application = app |
|
89 | 89 | self.config = config |
|
90 | 90 | |
|
91 | 91 | def __call__(self, environ, start_response): |
|
92 | 92 | meter = Meter(start_response) |
|
93 | 93 | description = "Request from %s for %s" % ( |
|
94 | get_ip_addr(environ), | |
|
95 | get_path_info(environ), | |
|
94 | base.get_ip_addr(environ), | |
|
95 | base.get_path_info(environ), | |
|
96 | 96 | ) |
|
97 | 97 | log.info("%s received", description) |
|
98 | 98 | try: |
|
99 | 99 | result = self.application(environ, meter.start_response) |
|
100 | 100 | finally: |
|
101 | 101 | log.info("%s responding %r after %.3fs", description, meter.status, meter.duration()) |
|
102 | 102 | return ResultIter(result, meter, description) |
@@ -1,147 +1,147 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.admin |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Controller for Admin panel of Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 7, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | |
|
29 | 29 | import logging |
|
30 | 30 | |
|
31 | 31 | from sqlalchemy.orm import joinedload |
|
32 | 32 | from sqlalchemy.sql.expression import and_, func, or_ |
|
33 | 33 | from tg import request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from whoosh import query |
|
36 | 36 | from whoosh.qparser.dateparse import DateParserPlugin |
|
37 | 37 | from whoosh.qparser.default import QueryParser |
|
38 | 38 | |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired |
|
40 | from kallithea.lib.base import BaseController, render | |
|
41 | 41 | from kallithea.lib.indexers import JOURNAL_SCHEMA |
|
42 | 42 | from kallithea.lib.page import Page |
|
43 | 43 | from kallithea.lib.utils2 import remove_prefix, remove_suffix, safe_int |
|
44 | 44 | from kallithea.model import db |
|
45 | 45 | |
|
46 | 46 | |
|
47 | 47 | log = logging.getLogger(__name__) |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | def _journal_filter(user_log, search_term): |
|
51 | 51 | """ |
|
52 | 52 | Filters sqlalchemy user_log based on search_term with whoosh Query language |
|
53 | 53 | http://packages.python.org/Whoosh/querylang.html |
|
54 | 54 | |
|
55 | 55 | :param user_log: |
|
56 | 56 | :param search_term: |
|
57 | 57 | """ |
|
58 | 58 | log.debug('Initial search term: %r', search_term) |
|
59 | 59 | qry = None |
|
60 | 60 | if search_term: |
|
61 | 61 | qp = QueryParser('repository', schema=JOURNAL_SCHEMA) |
|
62 | 62 | qp.add_plugin(DateParserPlugin()) |
|
63 | 63 | qry = qp.parse(search_term) |
|
64 | 64 | log.debug('Filtering using parsed query %r', qry) |
|
65 | 65 | |
|
66 | 66 | def wildcard_handler(col, wc_term): |
|
67 | 67 | if wc_term.startswith('*') and not wc_term.endswith('*'): |
|
68 | 68 | # postfix == endswith |
|
69 | 69 | wc_term = remove_prefix(wc_term, prefix='*') |
|
70 | 70 | return func.lower(col).endswith(func.lower(wc_term)) |
|
71 | 71 | elif wc_term.startswith('*') and wc_term.endswith('*'): |
|
72 | 72 | # wildcard == ilike |
|
73 | 73 | wc_term = remove_prefix(wc_term, prefix='*') |
|
74 | 74 | wc_term = remove_suffix(wc_term, suffix='*') |
|
75 | 75 | return func.lower(col).contains(func.lower(wc_term)) |
|
76 | 76 | |
|
77 | 77 | def get_filterion(field, val, term): |
|
78 | 78 | |
|
79 | 79 | if field == 'repository': |
|
80 | 80 | field = getattr(db.UserLog, 'repository_name') |
|
81 | 81 | elif field == 'ip': |
|
82 | 82 | field = getattr(db.UserLog, 'user_ip') |
|
83 | 83 | elif field == 'date': |
|
84 | 84 | field = getattr(db.UserLog, 'action_date') |
|
85 | 85 | elif field == 'username': |
|
86 | 86 | field = getattr(db.UserLog, 'username') |
|
87 | 87 | else: |
|
88 | 88 | field = getattr(db.UserLog, field) |
|
89 | 89 | log.debug('filter field: %s val=>%s', field, val) |
|
90 | 90 | |
|
91 | 91 | # sql filtering |
|
92 | 92 | if isinstance(term, query.Wildcard): |
|
93 | 93 | return wildcard_handler(field, val) |
|
94 | 94 | elif isinstance(term, query.Prefix): |
|
95 | 95 | return func.lower(field).startswith(func.lower(val)) |
|
96 | 96 | elif isinstance(term, query.DateRange): |
|
97 | 97 | return and_(field >= val[0], field <= val[1]) |
|
98 | 98 | return func.lower(field) == func.lower(val) |
|
99 | 99 | |
|
100 | 100 | if isinstance(qry, (query.And, query.Term, query.Prefix, query.Wildcard, |
|
101 | 101 | query.DateRange)): |
|
102 | 102 | if not isinstance(qry, query.And): |
|
103 | 103 | qry = [qry] |
|
104 | 104 | for term in qry: |
|
105 | 105 | field = term.fieldname |
|
106 | 106 | val = (term.text if not isinstance(term, query.DateRange) |
|
107 | 107 | else [term.startdate, term.enddate]) |
|
108 | 108 | user_log = user_log.filter(get_filterion(field, val, term)) |
|
109 | 109 | elif isinstance(qry, query.Or): |
|
110 | 110 | filters = [] |
|
111 | 111 | for term in qry: |
|
112 | 112 | field = term.fieldname |
|
113 | 113 | val = (term.text if not isinstance(term, query.DateRange) |
|
114 | 114 | else [term.startdate, term.enddate]) |
|
115 | 115 | filters.append(get_filterion(field, val, term)) |
|
116 | 116 | user_log = user_log.filter(or_(*filters)) |
|
117 | 117 | |
|
118 | 118 | return user_log |
|
119 | 119 | |
|
120 | 120 | |
|
121 | class AdminController(BaseController): | |
|
121 | class AdminController(base.BaseController): | |
|
122 | 122 | |
|
123 | 123 | @LoginRequired(allow_default_user=True) |
|
124 | 124 | def _before(self, *args, **kwargs): |
|
125 | 125 | super(AdminController, self)._before(*args, **kwargs) |
|
126 | 126 | |
|
127 | 127 | @HasPermissionAnyDecorator('hg.admin') |
|
128 | 128 | def index(self): |
|
129 | 129 | users_log = db.UserLog.query() \ |
|
130 | 130 | .options(joinedload(db.UserLog.user)) \ |
|
131 | 131 | .options(joinedload(db.UserLog.repository)) |
|
132 | 132 | |
|
133 | 133 | # FILTERING |
|
134 | 134 | c.search_term = request.GET.get('filter') |
|
135 | 135 | users_log = _journal_filter(users_log, c.search_term) |
|
136 | 136 | |
|
137 | 137 | users_log = users_log.order_by(db.UserLog.action_date.desc()) |
|
138 | 138 | |
|
139 | 139 | p = safe_int(request.GET.get('page'), 1) |
|
140 | 140 | |
|
141 | 141 | c.users_log = Page(users_log, page=p, items_per_page=10, |
|
142 | 142 | filter=c.search_term) |
|
143 | 143 | |
|
144 | 144 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
145 | return render('admin/admin_log.html') | |
|
145 | return base.render('admin/admin_log.html') | |
|
146 | 146 | |
|
147 | return render('admin/admin.html') | |
|
147 | return base.render('admin/admin.html') |
@@ -1,148 +1,148 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.auth_settings |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | pluggable authentication controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Nov 26, 2010 |
|
23 | 23 | :author: akesterson |
|
24 | 24 | """ |
|
25 | 25 | |
|
26 | 26 | import logging |
|
27 | 27 | import traceback |
|
28 | 28 | |
|
29 | 29 | import formencode.htmlfill |
|
30 | 30 | from tg import request |
|
31 | 31 | from tg import tmpl_context as c |
|
32 | 32 | from tg.i18n import ugettext as _ |
|
33 | 33 | from webob.exc import HTTPFound |
|
34 | 34 | |
|
35 | from kallithea.controllers import base | |
|
35 | 36 | from kallithea.lib import auth_modules, webutils |
|
36 | 37 | from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired |
|
37 | from kallithea.lib.base import BaseController, render | |
|
38 | 38 | from kallithea.lib.webutils import url |
|
39 | 39 | from kallithea.model import db, meta |
|
40 | 40 | from kallithea.model.forms import AuthSettingsForm |
|
41 | 41 | |
|
42 | 42 | |
|
43 | 43 | log = logging.getLogger(__name__) |
|
44 | 44 | |
|
45 | 45 | |
|
46 | class AuthSettingsController(BaseController): | |
|
46 | class AuthSettingsController(base.BaseController): | |
|
47 | 47 | |
|
48 | 48 | @LoginRequired() |
|
49 | 49 | @HasPermissionAnyDecorator('hg.admin') |
|
50 | 50 | def _before(self, *args, **kwargs): |
|
51 | 51 | super(AuthSettingsController, self)._before(*args, **kwargs) |
|
52 | 52 | |
|
53 | 53 | def __load_defaults(self): |
|
54 | 54 | c.available_plugins = [ |
|
55 | 55 | 'kallithea.lib.auth_modules.auth_internal', |
|
56 | 56 | 'kallithea.lib.auth_modules.auth_container', |
|
57 | 57 | 'kallithea.lib.auth_modules.auth_ldap', |
|
58 | 58 | 'kallithea.lib.auth_modules.auth_crowd', |
|
59 | 59 | 'kallithea.lib.auth_modules.auth_pam' |
|
60 | 60 | ] |
|
61 | 61 | self.enabled_plugins = auth_modules.get_auth_plugins() |
|
62 | 62 | c.enabled_plugin_names = [plugin.__class__.__module__ for plugin in self.enabled_plugins] |
|
63 | 63 | |
|
64 | 64 | def __render(self, defaults, errors): |
|
65 | 65 | c.defaults = {} |
|
66 | 66 | c.plugin_settings = {} |
|
67 | 67 | c.plugin_shortnames = {} |
|
68 | 68 | |
|
69 | 69 | for plugin in self.enabled_plugins: |
|
70 | 70 | module = plugin.__class__.__module__ |
|
71 | 71 | c.plugin_shortnames[module] = plugin.name |
|
72 | 72 | c.plugin_settings[module] = plugin.plugin_settings() |
|
73 | 73 | for v in c.plugin_settings[module]: |
|
74 | 74 | fullname = "auth_%s_%s" % (plugin.name, v["name"]) |
|
75 | 75 | if "default" in v: |
|
76 | 76 | c.defaults[fullname] = v["default"] |
|
77 | 77 | # Current values will be the default on the form, if there are any |
|
78 | 78 | setting = db.Setting.get_by_name(fullname) |
|
79 | 79 | if setting is not None: |
|
80 | 80 | c.defaults[fullname] = setting.app_settings_value |
|
81 | 81 | if defaults: |
|
82 | 82 | c.defaults.update(defaults) |
|
83 | 83 | |
|
84 | 84 | # we want to show , separated list of enabled plugins |
|
85 | 85 | c.defaults['auth_plugins'] = ','.join(c.enabled_plugin_names) |
|
86 | 86 | |
|
87 | 87 | log.debug('defaults: %s', defaults) |
|
88 | 88 | return formencode.htmlfill.render( |
|
89 | render('admin/auth/auth_settings.html'), | |
|
89 | base.render('admin/auth/auth_settings.html'), | |
|
90 | 90 | defaults=c.defaults, |
|
91 | 91 | errors=errors, |
|
92 | 92 | prefix_error=False, |
|
93 | 93 | encoding="UTF-8", |
|
94 | 94 | force_defaults=False) |
|
95 | 95 | |
|
96 | 96 | def index(self): |
|
97 | 97 | self.__load_defaults() |
|
98 | 98 | return self.__render(defaults=None, errors=None) |
|
99 | 99 | |
|
100 | 100 | def auth_settings(self): |
|
101 | 101 | """POST create and store auth settings""" |
|
102 | 102 | self.__load_defaults() |
|
103 | 103 | log.debug("POST Result: %s", dict(request.POST)) |
|
104 | 104 | |
|
105 | 105 | # First, parse only the plugin list (not the plugin settings). |
|
106 | 106 | _auth_plugins_validator = AuthSettingsForm([]).fields['auth_plugins'] |
|
107 | 107 | try: |
|
108 | 108 | new_enabled_plugins = _auth_plugins_validator.to_python(request.POST.get('auth_plugins')) |
|
109 | 109 | except formencode.Invalid: |
|
110 | 110 | # User provided an invalid plugin list. Just fall back to |
|
111 | 111 | # the list of currently enabled plugins. (We'll re-validate |
|
112 | 112 | # and show an error message to the user, below.) |
|
113 | 113 | pass |
|
114 | 114 | else: |
|
115 | 115 | # Hide plugins that the user has asked to be disabled, but |
|
116 | 116 | # do not show plugins that the user has asked to be enabled |
|
117 | 117 | # (yet), since that'll cause validation errors and/or wrong |
|
118 | 118 | # settings being applied (e.g. checkboxes being cleared), |
|
119 | 119 | # since the plugin settings will not be in the POST data. |
|
120 | 120 | c.enabled_plugin_names = [p for p in c.enabled_plugin_names if p in new_enabled_plugins] |
|
121 | 121 | |
|
122 | 122 | # Next, parse everything including plugin settings. |
|
123 | 123 | _form = AuthSettingsForm(c.enabled_plugin_names)() |
|
124 | 124 | |
|
125 | 125 | try: |
|
126 | 126 | form_result = _form.to_python(dict(request.POST)) |
|
127 | 127 | for k, v in form_result.items(): |
|
128 | 128 | if k == 'auth_plugins': |
|
129 | 129 | # we want to store it comma separated inside our settings |
|
130 | 130 | v = ','.join(v) |
|
131 | 131 | log.debug("%s = %s", k, str(v)) |
|
132 | 132 | setting = db.Setting.create_or_update(k, v) |
|
133 | 133 | meta.Session().commit() |
|
134 | 134 | webutils.flash(_('Auth settings updated successfully'), |
|
135 | 135 | category='success') |
|
136 | 136 | except formencode.Invalid as errors: |
|
137 | 137 | log.error(traceback.format_exc()) |
|
138 | 138 | e = errors.error_dict or {} |
|
139 | 139 | return self.__render( |
|
140 | 140 | defaults=errors.value, |
|
141 | 141 | errors=e, |
|
142 | 142 | ) |
|
143 | 143 | except Exception: |
|
144 | 144 | log.error(traceback.format_exc()) |
|
145 | 145 | webutils.flash(_('error occurred during update of auth settings'), |
|
146 | 146 | category='error') |
|
147 | 147 | |
|
148 | 148 | raise HTTPFound(location=url('auth_home')) |
@@ -1,91 +1,91 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.defaults |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | default settings controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 27, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from tg import request |
|
34 | 34 | from tg.i18n import ugettext as _ |
|
35 | 35 | from webob.exc import HTTPFound |
|
36 | 36 | |
|
37 | from kallithea.controllers import base | |
|
37 | 38 | from kallithea.lib import webutils |
|
38 | 39 | from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired |
|
39 | from kallithea.lib.base import BaseController, render | |
|
40 | 40 | from kallithea.lib.webutils import url |
|
41 | 41 | from kallithea.model import db, meta |
|
42 | 42 | from kallithea.model.forms import DefaultsForm |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | log = logging.getLogger(__name__) |
|
46 | 46 | |
|
47 | 47 | |
|
48 | class DefaultsController(BaseController): | |
|
48 | class DefaultsController(base.BaseController): | |
|
49 | 49 | |
|
50 | 50 | @LoginRequired() |
|
51 | 51 | @HasPermissionAnyDecorator('hg.admin') |
|
52 | 52 | def _before(self, *args, **kwargs): |
|
53 | 53 | super(DefaultsController, self)._before(*args, **kwargs) |
|
54 | 54 | |
|
55 | 55 | def index(self, format='html'): |
|
56 | 56 | defaults = db.Setting.get_default_repo_settings() |
|
57 | 57 | |
|
58 | 58 | return htmlfill.render( |
|
59 | render('admin/defaults/defaults.html'), | |
|
59 | base.render('admin/defaults/defaults.html'), | |
|
60 | 60 | defaults=defaults, |
|
61 | 61 | encoding="UTF-8", |
|
62 | 62 | force_defaults=False |
|
63 | 63 | ) |
|
64 | 64 | |
|
65 | 65 | def update(self, id): |
|
66 | 66 | _form = DefaultsForm()() |
|
67 | 67 | |
|
68 | 68 | try: |
|
69 | 69 | form_result = _form.to_python(dict(request.POST)) |
|
70 | 70 | for k, v in form_result.items(): |
|
71 | 71 | setting = db.Setting.create_or_update(k, v) |
|
72 | 72 | meta.Session().commit() |
|
73 | 73 | webutils.flash(_('Default settings updated successfully'), |
|
74 | 74 | category='success') |
|
75 | 75 | |
|
76 | 76 | except formencode.Invalid as errors: |
|
77 | 77 | defaults = errors.value |
|
78 | 78 | |
|
79 | 79 | return htmlfill.render( |
|
80 | render('admin/defaults/defaults.html'), | |
|
80 | base.render('admin/defaults/defaults.html'), | |
|
81 | 81 | defaults=defaults, |
|
82 | 82 | errors=errors.error_dict or {}, |
|
83 | 83 | prefix_error=False, |
|
84 | 84 | encoding="UTF-8", |
|
85 | 85 | force_defaults=False) |
|
86 | 86 | except Exception: |
|
87 | 87 | log.error(traceback.format_exc()) |
|
88 | 88 | webutils.flash(_('Error occurred during update of defaults'), |
|
89 | 89 | category='error') |
|
90 | 90 | |
|
91 | 91 | raise HTTPFound(location=url('defaults')) |
@@ -1,265 +1,265 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.gists |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | gist controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: May 9, 2013 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode.htmlfill |
|
32 | 32 | from sqlalchemy.sql.expression import or_ |
|
33 | 33 | from tg import request, response |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPForbidden, HTTPFound, HTTPNotFound |
|
37 | 37 | |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib import auth, webutils |
|
39 | 40 | from kallithea.lib.auth import LoginRequired |
|
40 | from kallithea.lib.base import BaseController, jsonify, render | |
|
41 | 41 | from kallithea.lib.page import Page |
|
42 | 42 | from kallithea.lib.utils2 import safe_int, safe_str, time_to_datetime |
|
43 | 43 | from kallithea.lib.vcs.exceptions import NodeNotChangedError, VCSError |
|
44 | 44 | from kallithea.lib.webutils import url |
|
45 | 45 | from kallithea.model import db, meta |
|
46 | 46 | from kallithea.model.forms import GistForm |
|
47 | 47 | from kallithea.model.gist import GistModel |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | log = logging.getLogger(__name__) |
|
51 | 51 | |
|
52 | 52 | |
|
53 | class GistsController(BaseController): | |
|
53 | class GistsController(base.BaseController): | |
|
54 | 54 | |
|
55 | 55 | def __load_defaults(self, extra_values=None): |
|
56 | 56 | c.lifetime_values = [ |
|
57 | 57 | (str(-1), _('Forever')), |
|
58 | 58 | (str(5), _('5 minutes')), |
|
59 | 59 | (str(60), _('1 hour')), |
|
60 | 60 | (str(60 * 24), _('1 day')), |
|
61 | 61 | (str(60 * 24 * 30), _('1 month')), |
|
62 | 62 | ] |
|
63 | 63 | if extra_values: |
|
64 | 64 | c.lifetime_values.append(extra_values) |
|
65 | 65 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] |
|
66 | 66 | |
|
67 | 67 | @LoginRequired(allow_default_user=True) |
|
68 | 68 | def index(self): |
|
69 | 69 | not_default_user = not request.authuser.is_default_user |
|
70 | 70 | c.show_private = request.GET.get('private') and not_default_user |
|
71 | 71 | c.show_public = request.GET.get('public') and not_default_user |
|
72 | 72 | url_params = {} |
|
73 | 73 | if c.show_public: |
|
74 | 74 | url_params['public'] = 1 |
|
75 | 75 | elif c.show_private: |
|
76 | 76 | url_params['private'] = 1 |
|
77 | 77 | |
|
78 | 78 | gists = db.Gist().query() \ |
|
79 | 79 | .filter_by(is_expired=False) \ |
|
80 | 80 | .order_by(db.Gist.created_on.desc()) |
|
81 | 81 | |
|
82 | 82 | # MY private |
|
83 | 83 | if c.show_private and not c.show_public: |
|
84 | 84 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PRIVATE) \ |
|
85 | 85 | .filter(db.Gist.owner_id == request.authuser.user_id) |
|
86 | 86 | # MY public |
|
87 | 87 | elif c.show_public and not c.show_private: |
|
88 | 88 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PUBLIC) \ |
|
89 | 89 | .filter(db.Gist.owner_id == request.authuser.user_id) |
|
90 | 90 | |
|
91 | 91 | # MY public+private |
|
92 | 92 | elif c.show_private and c.show_public: |
|
93 | 93 | gists = gists.filter(or_(db.Gist.gist_type == db.Gist.GIST_PUBLIC, |
|
94 | 94 | db.Gist.gist_type == db.Gist.GIST_PRIVATE)) \ |
|
95 | 95 | .filter(db.Gist.owner_id == request.authuser.user_id) |
|
96 | 96 | |
|
97 | 97 | # default show ALL public gists |
|
98 | 98 | if not c.show_public and not c.show_private: |
|
99 | 99 | gists = gists.filter(db.Gist.gist_type == db.Gist.GIST_PUBLIC) |
|
100 | 100 | |
|
101 | 101 | c.gists = gists |
|
102 | 102 | p = safe_int(request.GET.get('page'), 1) |
|
103 | 103 | c.gists_pager = Page(c.gists, page=p, items_per_page=10, |
|
104 | 104 | **url_params) |
|
105 | return render('admin/gists/index.html') | |
|
105 | return base.render('admin/gists/index.html') | |
|
106 | 106 | |
|
107 | 107 | @LoginRequired() |
|
108 | 108 | def create(self): |
|
109 | 109 | self.__load_defaults() |
|
110 | 110 | gist_form = GistForm([x[0] for x in c.lifetime_values])() |
|
111 | 111 | try: |
|
112 | 112 | form_result = gist_form.to_python(dict(request.POST)) |
|
113 | 113 | # TODO: multiple files support, from the form |
|
114 | 114 | filename = form_result['filename'] or db.Gist.DEFAULT_FILENAME |
|
115 | 115 | nodes = { |
|
116 | 116 | filename: { |
|
117 | 117 | 'content': form_result['content'], |
|
118 | 118 | 'lexer': form_result['mimetype'] # None is autodetect |
|
119 | 119 | } |
|
120 | 120 | } |
|
121 | 121 | _public = form_result['public'] |
|
122 | 122 | gist_type = db.Gist.GIST_PUBLIC if _public else db.Gist.GIST_PRIVATE |
|
123 | 123 | gist = GistModel().create( |
|
124 | 124 | description=form_result['description'], |
|
125 | 125 | owner=request.authuser.user_id, |
|
126 | 126 | ip_addr=request.ip_addr, |
|
127 | 127 | gist_mapping=nodes, |
|
128 | 128 | gist_type=gist_type, |
|
129 | 129 | lifetime=form_result['lifetime'] |
|
130 | 130 | ) |
|
131 | 131 | meta.Session().commit() |
|
132 | 132 | new_gist_id = gist.gist_access_id |
|
133 | 133 | except formencode.Invalid as errors: |
|
134 | 134 | defaults = errors.value |
|
135 | 135 | |
|
136 | 136 | return formencode.htmlfill.render( |
|
137 | render('admin/gists/new.html'), | |
|
137 | base.render('admin/gists/new.html'), | |
|
138 | 138 | defaults=defaults, |
|
139 | 139 | errors=errors.error_dict or {}, |
|
140 | 140 | prefix_error=False, |
|
141 | 141 | encoding="UTF-8", |
|
142 | 142 | force_defaults=False) |
|
143 | 143 | |
|
144 | 144 | except Exception as e: |
|
145 | 145 | log.error(traceback.format_exc()) |
|
146 | 146 | webutils.flash(_('Error occurred during gist creation'), category='error') |
|
147 | 147 | raise HTTPFound(location=url('new_gist')) |
|
148 | 148 | raise HTTPFound(location=url('gist', gist_id=new_gist_id)) |
|
149 | 149 | |
|
150 | 150 | @LoginRequired() |
|
151 | 151 | def new(self, format='html'): |
|
152 | 152 | self.__load_defaults() |
|
153 | return render('admin/gists/new.html') | |
|
153 | return base.render('admin/gists/new.html') | |
|
154 | 154 | |
|
155 | 155 | @LoginRequired() |
|
156 | 156 | def delete(self, gist_id): |
|
157 | 157 | gist = GistModel().get_gist(gist_id) |
|
158 | 158 | owner = gist.owner_id == request.authuser.user_id |
|
159 | 159 | if auth.HasPermissionAny('hg.admin')() or owner: |
|
160 | 160 | GistModel().delete(gist) |
|
161 | 161 | meta.Session().commit() |
|
162 | 162 | webutils.flash(_('Deleted gist %s') % gist.gist_access_id, category='success') |
|
163 | 163 | else: |
|
164 | 164 | raise HTTPForbidden() |
|
165 | 165 | |
|
166 | 166 | raise HTTPFound(location=url('gists')) |
|
167 | 167 | |
|
168 | 168 | @LoginRequired(allow_default_user=True) |
|
169 | 169 | def show(self, gist_id, revision='tip', format='html', f_path=None): |
|
170 | 170 | c.gist = db.Gist.get_or_404(gist_id) |
|
171 | 171 | |
|
172 | 172 | if c.gist.is_expired: |
|
173 | 173 | log.error('Gist expired at %s', |
|
174 | 174 | time_to_datetime(c.gist.gist_expires)) |
|
175 | 175 | raise HTTPNotFound() |
|
176 | 176 | try: |
|
177 | 177 | c.file_changeset, c.files = GistModel().get_gist_files(gist_id, |
|
178 | 178 | revision=revision) |
|
179 | 179 | except VCSError: |
|
180 | 180 | log.error(traceback.format_exc()) |
|
181 | 181 | raise HTTPNotFound() |
|
182 | 182 | if format == 'raw': |
|
183 | 183 | content = '\n\n'.join( |
|
184 | 184 | safe_str(f.content) |
|
185 | 185 | for f in c.files if (f_path is None or f.path == f_path) |
|
186 | 186 | ) |
|
187 | 187 | response.content_type = 'text/plain' |
|
188 | 188 | return content |
|
189 | return render('admin/gists/show.html') | |
|
189 | return base.render('admin/gists/show.html') | |
|
190 | 190 | |
|
191 | 191 | @LoginRequired() |
|
192 | 192 | def edit(self, gist_id, format='html'): |
|
193 | 193 | c.gist = db.Gist.get_or_404(gist_id) |
|
194 | 194 | |
|
195 | 195 | if c.gist.is_expired: |
|
196 | 196 | log.error('Gist expired at %s', |
|
197 | 197 | time_to_datetime(c.gist.gist_expires)) |
|
198 | 198 | raise HTTPNotFound() |
|
199 | 199 | try: |
|
200 | 200 | c.file_changeset, c.files = GistModel().get_gist_files(gist_id) |
|
201 | 201 | except VCSError: |
|
202 | 202 | log.error(traceback.format_exc()) |
|
203 | 203 | raise HTTPNotFound() |
|
204 | 204 | |
|
205 | 205 | self.__load_defaults(extra_values=('0', _('Unmodified'))) |
|
206 | rendered = render('admin/gists/edit.html') | |
|
206 | rendered = base.render('admin/gists/edit.html') | |
|
207 | 207 | |
|
208 | 208 | if request.POST: |
|
209 | 209 | rpost = request.POST |
|
210 | 210 | nodes = {} |
|
211 | 211 | for org_filename, filename, mimetype, content in zip( |
|
212 | 212 | rpost.getall('org_files'), |
|
213 | 213 | rpost.getall('files'), |
|
214 | 214 | rpost.getall('mimetypes'), |
|
215 | 215 | rpost.getall('contents')): |
|
216 | 216 | |
|
217 | 217 | nodes[org_filename] = { |
|
218 | 218 | 'org_filename': org_filename, |
|
219 | 219 | 'filename': filename, |
|
220 | 220 | 'content': content, |
|
221 | 221 | 'lexer': mimetype, |
|
222 | 222 | } |
|
223 | 223 | try: |
|
224 | 224 | GistModel().update( |
|
225 | 225 | gist=c.gist, |
|
226 | 226 | description=rpost['description'], |
|
227 | 227 | owner=c.gist.owner, # FIXME: request.authuser.user_id ? |
|
228 | 228 | ip_addr=request.ip_addr, |
|
229 | 229 | gist_mapping=nodes, |
|
230 | 230 | gist_type=c.gist.gist_type, |
|
231 | 231 | lifetime=rpost['lifetime'] |
|
232 | 232 | ) |
|
233 | 233 | |
|
234 | 234 | meta.Session().commit() |
|
235 | 235 | webutils.flash(_('Successfully updated gist content'), category='success') |
|
236 | 236 | except NodeNotChangedError: |
|
237 | 237 | # raised if nothing was changed in repo itself. We anyway then |
|
238 | 238 | # store only DB stuff for gist |
|
239 | 239 | meta.Session().commit() |
|
240 | 240 | webutils.flash(_('Successfully updated gist data'), category='success') |
|
241 | 241 | except Exception: |
|
242 | 242 | log.error(traceback.format_exc()) |
|
243 | 243 | webutils.flash(_('Error occurred during update of gist %s') % gist_id, |
|
244 | 244 | category='error') |
|
245 | 245 | |
|
246 | 246 | raise HTTPFound(location=url('gist', gist_id=gist_id)) |
|
247 | 247 | |
|
248 | 248 | return rendered |
|
249 | 249 | |
|
250 | 250 | @LoginRequired() |
|
251 | @jsonify | |
|
251 | @base.jsonify | |
|
252 | 252 | def check_revision(self, gist_id): |
|
253 | 253 | c.gist = db.Gist.get_or_404(gist_id) |
|
254 | 254 | last_rev = c.gist.scm_instance.get_changeset() |
|
255 | 255 | success = True |
|
256 | 256 | revision = request.POST.get('revision') |
|
257 | 257 | |
|
258 | 258 | # TODO: maybe move this to model ? |
|
259 | 259 | if revision != last_rev.raw_id: |
|
260 | 260 | log.error('Last revision %s is different than submitted %s', |
|
261 | 261 | revision, last_rev) |
|
262 | 262 | # our gist has newer version than we |
|
263 | 263 | success = False |
|
264 | 264 | |
|
265 | 265 | return {'success': success} |
@@ -1,289 +1,289 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.my_account |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | my account controller for Kallithea admin |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: August 20, 2013 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from tg import request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPFound |
|
37 | 37 | |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib import auth_modules, webutils |
|
39 | 40 | from kallithea.lib.auth import AuthUser, LoginRequired |
|
40 | from kallithea.lib.base import BaseController, IfSshEnabled, render | |
|
41 | 41 | from kallithea.lib.utils2 import generate_api_key, safe_int |
|
42 | 42 | from kallithea.lib.webutils import url |
|
43 | 43 | from kallithea.model import db, meta |
|
44 | 44 | from kallithea.model.api_key import ApiKeyModel |
|
45 | 45 | from kallithea.model.forms import PasswordChangeForm, UserForm |
|
46 | 46 | from kallithea.model.repo import RepoModel |
|
47 | 47 | from kallithea.model.ssh_key import SshKeyModel, SshKeyModelException |
|
48 | 48 | from kallithea.model.user import UserModel |
|
49 | 49 | |
|
50 | 50 | |
|
51 | 51 | log = logging.getLogger(__name__) |
|
52 | 52 | |
|
53 | 53 | |
|
54 | class MyAccountController(BaseController): | |
|
54 | class MyAccountController(base.BaseController): | |
|
55 | 55 | |
|
56 | 56 | @LoginRequired() |
|
57 | 57 | def _before(self, *args, **kwargs): |
|
58 | 58 | super(MyAccountController, self)._before(*args, **kwargs) |
|
59 | 59 | |
|
60 | 60 | def __load_data(self): |
|
61 | 61 | c.user = db.User.get(request.authuser.user_id) |
|
62 | 62 | if c.user.is_default_user: |
|
63 | 63 | webutils.flash(_("You can't edit this user since it's" |
|
64 | 64 | " crucial for entire application"), category='warning') |
|
65 | 65 | raise HTTPFound(location=url('users')) |
|
66 | 66 | |
|
67 | 67 | def _load_my_repos_data(self, watched=False): |
|
68 | 68 | if watched: |
|
69 | 69 | admin = False |
|
70 | 70 | repos_list = meta.Session().query(db.Repository) \ |
|
71 | 71 | .join(db.UserFollowing) \ |
|
72 | 72 | .filter(db.UserFollowing.user_id == |
|
73 | 73 | request.authuser.user_id).all() |
|
74 | 74 | else: |
|
75 | 75 | admin = True |
|
76 | 76 | repos_list = meta.Session().query(db.Repository) \ |
|
77 | 77 | .filter(db.Repository.owner_id == |
|
78 | 78 | request.authuser.user_id).all() |
|
79 | 79 | |
|
80 | 80 | return RepoModel().get_repos_as_dict(repos_list, admin=admin) |
|
81 | 81 | |
|
82 | 82 | def my_account(self): |
|
83 | 83 | c.active = 'profile' |
|
84 | 84 | self.__load_data() |
|
85 | 85 | c.perm_user = AuthUser(user_id=request.authuser.user_id) |
|
86 | 86 | managed_fields = auth_modules.get_managed_fields(c.user) |
|
87 | 87 | def_user_perms = AuthUser(dbuser=db.User.get_default_user()).global_permissions |
|
88 | 88 | if 'hg.register.none' in def_user_perms: |
|
89 | 89 | managed_fields.extend(['username', 'firstname', 'lastname', 'email']) |
|
90 | 90 | |
|
91 | 91 | c.readonly = lambda n: 'readonly' if n in managed_fields else None |
|
92 | 92 | |
|
93 | 93 | defaults = c.user.get_dict() |
|
94 | 94 | update = False |
|
95 | 95 | if request.POST: |
|
96 | 96 | _form = UserForm(edit=True, |
|
97 | 97 | old_data={'user_id': request.authuser.user_id, |
|
98 | 98 | 'email': request.authuser.email})() |
|
99 | 99 | form_result = {} |
|
100 | 100 | try: |
|
101 | 101 | post_data = dict(request.POST) |
|
102 | 102 | post_data['new_password'] = '' |
|
103 | 103 | post_data['password_confirmation'] = '' |
|
104 | 104 | form_result = _form.to_python(post_data) |
|
105 | 105 | # skip updating those attrs for my account |
|
106 | 106 | skip_attrs = ['admin', 'active', 'extern_type', 'extern_name', |
|
107 | 107 | 'new_password', 'password_confirmation', |
|
108 | 108 | ] + managed_fields |
|
109 | 109 | |
|
110 | 110 | UserModel().update(request.authuser.user_id, form_result, |
|
111 | 111 | skip_attrs=skip_attrs) |
|
112 | 112 | webutils.flash(_('Your account was updated successfully'), |
|
113 | 113 | category='success') |
|
114 | 114 | meta.Session().commit() |
|
115 | 115 | update = True |
|
116 | 116 | |
|
117 | 117 | except formencode.Invalid as errors: |
|
118 | 118 | return htmlfill.render( |
|
119 | render('admin/my_account/my_account.html'), | |
|
119 | base.render('admin/my_account/my_account.html'), | |
|
120 | 120 | defaults=errors.value, |
|
121 | 121 | errors=errors.error_dict or {}, |
|
122 | 122 | prefix_error=False, |
|
123 | 123 | encoding="UTF-8", |
|
124 | 124 | force_defaults=False) |
|
125 | 125 | except Exception: |
|
126 | 126 | log.error(traceback.format_exc()) |
|
127 | 127 | webutils.flash(_('Error occurred during update of user %s') |
|
128 | 128 | % form_result.get('username'), category='error') |
|
129 | 129 | if update: |
|
130 | 130 | raise HTTPFound(location='my_account') |
|
131 | 131 | return htmlfill.render( |
|
132 | render('admin/my_account/my_account.html'), | |
|
132 | base.render('admin/my_account/my_account.html'), | |
|
133 | 133 | defaults=defaults, |
|
134 | 134 | encoding="UTF-8", |
|
135 | 135 | force_defaults=False) |
|
136 | 136 | |
|
137 | 137 | def my_account_password(self): |
|
138 | 138 | c.active = 'password' |
|
139 | 139 | self.__load_data() |
|
140 | 140 | |
|
141 | 141 | managed_fields = auth_modules.get_managed_fields(c.user) |
|
142 | 142 | c.can_change_password = 'password' not in managed_fields |
|
143 | 143 | |
|
144 | 144 | if request.POST and c.can_change_password: |
|
145 | 145 | _form = PasswordChangeForm(request.authuser.username)() |
|
146 | 146 | try: |
|
147 | 147 | form_result = _form.to_python(request.POST) |
|
148 | 148 | UserModel().update(request.authuser.user_id, form_result) |
|
149 | 149 | meta.Session().commit() |
|
150 | 150 | webutils.flash(_("Successfully updated password"), category='success') |
|
151 | 151 | except formencode.Invalid as errors: |
|
152 | 152 | return htmlfill.render( |
|
153 | render('admin/my_account/my_account.html'), | |
|
153 | base.render('admin/my_account/my_account.html'), | |
|
154 | 154 | defaults=errors.value, |
|
155 | 155 | errors=errors.error_dict or {}, |
|
156 | 156 | prefix_error=False, |
|
157 | 157 | encoding="UTF-8", |
|
158 | 158 | force_defaults=False) |
|
159 | 159 | except Exception: |
|
160 | 160 | log.error(traceback.format_exc()) |
|
161 | 161 | webutils.flash(_('Error occurred during update of user password'), |
|
162 | 162 | category='error') |
|
163 | return render('admin/my_account/my_account.html') | |
|
163 | return base.render('admin/my_account/my_account.html') | |
|
164 | 164 | |
|
165 | 165 | def my_account_repos(self): |
|
166 | 166 | c.active = 'repos' |
|
167 | 167 | self.__load_data() |
|
168 | 168 | |
|
169 | 169 | # data used to render the grid |
|
170 | 170 | c.data = self._load_my_repos_data() |
|
171 | return render('admin/my_account/my_account.html') | |
|
171 | return base.render('admin/my_account/my_account.html') | |
|
172 | 172 | |
|
173 | 173 | def my_account_watched(self): |
|
174 | 174 | c.active = 'watched' |
|
175 | 175 | self.__load_data() |
|
176 | 176 | |
|
177 | 177 | # data used to render the grid |
|
178 | 178 | c.data = self._load_my_repos_data(watched=True) |
|
179 | return render('admin/my_account/my_account.html') | |
|
179 | return base.render('admin/my_account/my_account.html') | |
|
180 | 180 | |
|
181 | 181 | def my_account_perms(self): |
|
182 | 182 | c.active = 'perms' |
|
183 | 183 | self.__load_data() |
|
184 | 184 | c.perm_user = AuthUser(user_id=request.authuser.user_id) |
|
185 | 185 | |
|
186 | return render('admin/my_account/my_account.html') | |
|
186 | return base.render('admin/my_account/my_account.html') | |
|
187 | 187 | |
|
188 | 188 | def my_account_emails(self): |
|
189 | 189 | c.active = 'emails' |
|
190 | 190 | self.__load_data() |
|
191 | 191 | |
|
192 | 192 | c.user_email_map = db.UserEmailMap.query() \ |
|
193 | 193 | .filter(db.UserEmailMap.user == c.user).all() |
|
194 | return render('admin/my_account/my_account.html') | |
|
194 | return base.render('admin/my_account/my_account.html') | |
|
195 | 195 | |
|
196 | 196 | def my_account_emails_add(self): |
|
197 | 197 | email = request.POST.get('new_email') |
|
198 | 198 | |
|
199 | 199 | try: |
|
200 | 200 | UserModel().add_extra_email(request.authuser.user_id, email) |
|
201 | 201 | meta.Session().commit() |
|
202 | 202 | webutils.flash(_("Added email %s to user") % email, category='success') |
|
203 | 203 | except formencode.Invalid as error: |
|
204 | 204 | msg = error.error_dict['email'] |
|
205 | 205 | webutils.flash(msg, category='error') |
|
206 | 206 | except Exception: |
|
207 | 207 | log.error(traceback.format_exc()) |
|
208 | 208 | webutils.flash(_('An error occurred during email saving'), |
|
209 | 209 | category='error') |
|
210 | 210 | raise HTTPFound(location=url('my_account_emails')) |
|
211 | 211 | |
|
212 | 212 | def my_account_emails_delete(self): |
|
213 | 213 | email_id = request.POST.get('del_email_id') |
|
214 | 214 | user_model = UserModel() |
|
215 | 215 | user_model.delete_extra_email(request.authuser.user_id, email_id) |
|
216 | 216 | meta.Session().commit() |
|
217 | 217 | webutils.flash(_("Removed email from user"), category='success') |
|
218 | 218 | raise HTTPFound(location=url('my_account_emails')) |
|
219 | 219 | |
|
220 | 220 | def my_account_api_keys(self): |
|
221 | 221 | c.active = 'api_keys' |
|
222 | 222 | self.__load_data() |
|
223 | 223 | show_expired = True |
|
224 | 224 | c.lifetime_values = [ |
|
225 | 225 | (str(-1), _('Forever')), |
|
226 | 226 | (str(5), _('5 minutes')), |
|
227 | 227 | (str(60), _('1 hour')), |
|
228 | 228 | (str(60 * 24), _('1 day')), |
|
229 | 229 | (str(60 * 24 * 30), _('1 month')), |
|
230 | 230 | ] |
|
231 | 231 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] |
|
232 | 232 | c.user_api_keys = ApiKeyModel().get_api_keys(request.authuser.user_id, |
|
233 | 233 | show_expired=show_expired) |
|
234 | return render('admin/my_account/my_account.html') | |
|
234 | return base.render('admin/my_account/my_account.html') | |
|
235 | 235 | |
|
236 | 236 | def my_account_api_keys_add(self): |
|
237 | 237 | lifetime = safe_int(request.POST.get('lifetime'), -1) |
|
238 | 238 | description = request.POST.get('description') |
|
239 | 239 | ApiKeyModel().create(request.authuser.user_id, description, lifetime) |
|
240 | 240 | meta.Session().commit() |
|
241 | 241 | webutils.flash(_("API key successfully created"), category='success') |
|
242 | 242 | raise HTTPFound(location=url('my_account_api_keys')) |
|
243 | 243 | |
|
244 | 244 | def my_account_api_keys_delete(self): |
|
245 | 245 | api_key = request.POST.get('del_api_key') |
|
246 | 246 | if request.POST.get('del_api_key_builtin'): |
|
247 | 247 | user = db.User.get(request.authuser.user_id) |
|
248 | 248 | user.api_key = generate_api_key() |
|
249 | 249 | meta.Session().commit() |
|
250 | 250 | webutils.flash(_("API key successfully reset"), category='success') |
|
251 | 251 | elif api_key: |
|
252 | 252 | ApiKeyModel().delete(api_key, request.authuser.user_id) |
|
253 | 253 | meta.Session().commit() |
|
254 | 254 | webutils.flash(_("API key successfully deleted"), category='success') |
|
255 | 255 | |
|
256 | 256 | raise HTTPFound(location=url('my_account_api_keys')) |
|
257 | 257 | |
|
258 | @IfSshEnabled | |
|
258 | @base.IfSshEnabled | |
|
259 | 259 | def my_account_ssh_keys(self): |
|
260 | 260 | c.active = 'ssh_keys' |
|
261 | 261 | self.__load_data() |
|
262 | 262 | c.user_ssh_keys = SshKeyModel().get_ssh_keys(request.authuser.user_id) |
|
263 | return render('admin/my_account/my_account.html') | |
|
263 | return base.render('admin/my_account/my_account.html') | |
|
264 | 264 | |
|
265 | @IfSshEnabled | |
|
265 | @base.IfSshEnabled | |
|
266 | 266 | def my_account_ssh_keys_add(self): |
|
267 | 267 | description = request.POST.get('description') |
|
268 | 268 | public_key = request.POST.get('public_key') |
|
269 | 269 | try: |
|
270 | 270 | new_ssh_key = SshKeyModel().create(request.authuser.user_id, |
|
271 | 271 | description, public_key) |
|
272 | 272 | meta.Session().commit() |
|
273 | 273 | SshKeyModel().write_authorized_keys() |
|
274 | 274 | webutils.flash(_("SSH key %s successfully added") % new_ssh_key.fingerprint, category='success') |
|
275 | 275 | except SshKeyModelException as e: |
|
276 | 276 | webutils.flash(e.args[0], category='error') |
|
277 | 277 | raise HTTPFound(location=url('my_account_ssh_keys')) |
|
278 | 278 | |
|
279 | @IfSshEnabled | |
|
279 | @base.IfSshEnabled | |
|
280 | 280 | def my_account_ssh_keys_delete(self): |
|
281 | 281 | fingerprint = request.POST.get('del_public_key_fingerprint') |
|
282 | 282 | try: |
|
283 | 283 | SshKeyModel().delete(fingerprint, request.authuser.user_id) |
|
284 | 284 | meta.Session().commit() |
|
285 | 285 | SshKeyModel().write_authorized_keys() |
|
286 | 286 | webutils.flash(_("SSH key successfully deleted"), category='success') |
|
287 | 287 | except SshKeyModelException as e: |
|
288 | 288 | webutils.flash(e.args[0], category='error') |
|
289 | 289 | raise HTTPFound(location=url('my_account_ssh_keys')) |
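
A recurring detail in my_account (kallithea/controllers/admin/my_account.py) and the other controllers in this changeset: when form validation raises formencode.Invalid, the template is rendered first and then passed through htmlfill.render, which injects the submitted values and the error messages back into the generated form HTML. A tiny standalone illustration of that mechanism, independent of Kallithea — the HTML snippet and the error text are invented for the example, while the keyword arguments mirror the calls above:

    from formencode import htmlfill

    form_html = '<form><input type="text" name="email" /></form>'
    print(htmlfill.render(
        form_html,
        defaults={'email': 'not-an-email'},         # re-populate what the user typed
        errors={'email': 'Invalid email address'},  # shown next to the field
        prefix_error=False,                         # place the error after the field
        encoding="UTF-8",
        force_defaults=False))
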
@@ -1,182 +1,182 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.permissions |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | permissions controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 27, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | |
|
29 | 29 | import logging |
|
30 | 30 | import traceback |
|
31 | 31 | |
|
32 | 32 | import formencode |
|
33 | 33 | from formencode import htmlfill |
|
34 | 34 | from tg import request |
|
35 | 35 | from tg import tmpl_context as c |
|
36 | 36 | from tg.i18n import ugettext as _ |
|
37 | 37 | from webob.exc import HTTPFound |
|
38 | 38 | |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib import webutils |
|
40 | 41 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator, LoginRequired |
|
41 | from kallithea.lib.base import BaseController, render | |
|
42 | 42 | from kallithea.lib.webutils import url |
|
43 | 43 | from kallithea.model import db, meta |
|
44 | 44 | from kallithea.model.forms import DefaultPermissionsForm |
|
45 | 45 | from kallithea.model.permission import PermissionModel |
|
46 | 46 | |
|
47 | 47 | |
|
48 | 48 | log = logging.getLogger(__name__) |
|
49 | 49 | |
|
50 | 50 | |
|
51 | class PermissionsController(BaseController): | |
|
51 | class PermissionsController(base.BaseController): | |
|
52 | 52 | |
|
53 | 53 | @LoginRequired() |
|
54 | 54 | @HasPermissionAnyDecorator('hg.admin') |
|
55 | 55 | def _before(self, *args, **kwargs): |
|
56 | 56 | super(PermissionsController, self)._before(*args, **kwargs) |
|
57 | 57 | |
|
58 | 58 | def __load_data(self): |
|
59 | 59 | # Permissions for the Default user on new repositories |
|
60 | 60 | c.repo_perms_choices = [('repository.none', _('None'),), |
|
61 | 61 | ('repository.read', _('Read'),), |
|
62 | 62 | ('repository.write', _('Write'),), |
|
63 | 63 | ('repository.admin', _('Admin'),)] |
|
64 | 64 | # Permissions for the Default user on new repository groups |
|
65 | 65 | c.group_perms_choices = [('group.none', _('None'),), |
|
66 | 66 | ('group.read', _('Read'),), |
|
67 | 67 | ('group.write', _('Write'),), |
|
68 | 68 | ('group.admin', _('Admin'),)] |
|
69 | 69 | # Permissions for the Default user on new user groups |
|
70 | 70 | c.user_group_perms_choices = [('usergroup.none', _('None'),), |
|
71 | 71 | ('usergroup.read', _('Read'),), |
|
72 | 72 | ('usergroup.write', _('Write'),), |
|
73 | 73 | ('usergroup.admin', _('Admin'),)] |
|
74 | 74 | # Registration - allow new Users to create an account |
|
75 | 75 | c.register_choices = [ |
|
76 | 76 | ('hg.register.none', |
|
77 | 77 | _('Disabled')), |
|
78 | 78 | ('hg.register.manual_activate', |
|
79 | 79 | _('Allowed with manual account activation')), |
|
80 | 80 | ('hg.register.auto_activate', |
|
81 | 81 | _('Allowed with automatic account activation')), ] |
|
82 | 82 | # External auth account activation |
|
83 | 83 | c.extern_activate_choices = [ |
|
84 | 84 | ('hg.extern_activate.manual', _('Manual activation of external account')), |
|
85 | 85 | ('hg.extern_activate.auto', _('Automatic activation of external account')), |
|
86 | 86 | ] |
|
87 | 87 | # Top level repository creation |
|
88 | 88 | c.repo_create_choices = [('hg.create.none', _('Disabled')), |
|
89 | 89 | ('hg.create.repository', _('Enabled'))] |
|
90 | 90 | # User group creation |
|
91 | 91 | c.user_group_create_choices = [('hg.usergroup.create.false', _('Disabled')), |
|
92 | 92 | ('hg.usergroup.create.true', _('Enabled'))] |
|
93 | 93 | # Repository forking: |
|
94 | 94 | c.fork_choices = [('hg.fork.none', _('Disabled')), |
|
95 | 95 | ('hg.fork.repository', _('Enabled'))] |
|
96 | 96 | |
|
97 | 97 | def permission_globals(self): |
|
98 | 98 | c.active = 'globals' |
|
99 | 99 | self.__load_data() |
|
100 | 100 | if request.POST: |
|
101 | 101 | _form = DefaultPermissionsForm( |
|
102 | 102 | [x[0] for x in c.repo_perms_choices], |
|
103 | 103 | [x[0] for x in c.group_perms_choices], |
|
104 | 104 | [x[0] for x in c.user_group_perms_choices], |
|
105 | 105 | [x[0] for x in c.repo_create_choices], |
|
106 | 106 | [x[0] for x in c.user_group_create_choices], |
|
107 | 107 | [x[0] for x in c.fork_choices], |
|
108 | 108 | [x[0] for x in c.register_choices], |
|
109 | 109 | [x[0] for x in c.extern_activate_choices])() |
|
110 | 110 | |
|
111 | 111 | try: |
|
112 | 112 | form_result = _form.to_python(dict(request.POST)) |
|
113 | 113 | form_result.update({'perm_user_name': 'default'}) |
|
114 | 114 | PermissionModel().update(form_result) |
|
115 | 115 | meta.Session().commit() |
|
116 | 116 | webutils.flash(_('Global permissions updated successfully'), |
|
117 | 117 | category='success') |
|
118 | 118 | |
|
119 | 119 | except formencode.Invalid as errors: |
|
120 | 120 | defaults = errors.value |
|
121 | 121 | |
|
122 | 122 | return htmlfill.render( |
|
123 | render('admin/permissions/permissions.html'), | |
|
123 | base.render('admin/permissions/permissions.html'), | |
|
124 | 124 | defaults=defaults, |
|
125 | 125 | errors=errors.error_dict or {}, |
|
126 | 126 | prefix_error=False, |
|
127 | 127 | encoding="UTF-8", |
|
128 | 128 | force_defaults=False) |
|
129 | 129 | except Exception: |
|
130 | 130 | log.error(traceback.format_exc()) |
|
131 | 131 | webutils.flash(_('Error occurred during update of permissions'), |
|
132 | 132 | category='error') |
|
133 | 133 | |
|
134 | 134 | raise HTTPFound(location=url('admin_permissions')) |
|
135 | 135 | |
|
136 | 136 | c.user = db.User.get_default_user() |
|
137 | 137 | defaults = {'anonymous': c.user.active} |
|
138 | 138 | |
|
139 | 139 | for p in c.user.user_perms: |
|
140 | 140 | if p.permission.permission_name.startswith('repository.'): |
|
141 | 141 | defaults['default_repo_perm'] = p.permission.permission_name |
|
142 | 142 | |
|
143 | 143 | if p.permission.permission_name.startswith('group.'): |
|
144 | 144 | defaults['default_group_perm'] = p.permission.permission_name |
|
145 | 145 | |
|
146 | 146 | if p.permission.permission_name.startswith('usergroup.'): |
|
147 | 147 | defaults['default_user_group_perm'] = p.permission.permission_name |
|
148 | 148 | |
|
149 | 149 | elif p.permission.permission_name.startswith('hg.create.'): |
|
150 | 150 | defaults['default_repo_create'] = p.permission.permission_name |
|
151 | 151 | |
|
152 | 152 | if p.permission.permission_name.startswith('hg.usergroup.'): |
|
153 | 153 | defaults['default_user_group_create'] = p.permission.permission_name |
|
154 | 154 | |
|
155 | 155 | if p.permission.permission_name.startswith('hg.register.'): |
|
156 | 156 | defaults['default_register'] = p.permission.permission_name |
|
157 | 157 | |
|
158 | 158 | if p.permission.permission_name.startswith('hg.extern_activate.'): |
|
159 | 159 | defaults['default_extern_activate'] = p.permission.permission_name |
|
160 | 160 | |
|
161 | 161 | if p.permission.permission_name.startswith('hg.fork.'): |
|
162 | 162 | defaults['default_fork'] = p.permission.permission_name |
|
163 | 163 | |
|
164 | 164 | return htmlfill.render( |
|
165 | render('admin/permissions/permissions.html'), | |
|
165 | base.render('admin/permissions/permissions.html'), | |
|
166 | 166 | defaults=defaults, |
|
167 | 167 | encoding="UTF-8", |
|
168 | 168 | force_defaults=False) |
|
169 | 169 | |
|
170 | 170 | def permission_ips(self): |
|
171 | 171 | c.active = 'ips' |
|
172 | 172 | c.user = db.User.get_default_user() |
|
173 | 173 | c.user_ip_map = db.UserIpMap.query() \ |
|
174 | 174 | .filter(db.UserIpMap.user == c.user).all() |
|
175 | 175 | |
|
176 | return render('admin/permissions/permissions.html') | |
|
176 | return base.render('admin/permissions/permissions.html') | |
|
177 | 177 | |
|
178 | 178 | def permission_perms(self): |
|
179 | 179 | c.active = 'perms' |
|
180 | 180 | c.user = db.User.get_default_user() |
|
181 | 181 | c.perm_user = AuthUser(dbuser=c.user) |
|
182 | return render('admin/permissions/permissions.html') | |
|
182 | return base.render('admin/permissions/permissions.html') |
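
The second half of permission_globals above reads the default user's current permissions back into form defaults by matching permission-name prefixes (repository., group., hg.register., and so on). A standalone sketch of that prefix-to-field mapping, with the prefixes and field names copied from the hunk above and everything else (the helper name, the dict layout) purely illustrative:

    # hypothetical helper: map names like 'repository.read' or 'hg.fork.none'
    # onto the form-default keys used by the permissions template
    PREFIX_TO_FIELD = {
        'repository.': 'default_repo_perm',
        'group.': 'default_group_perm',
        'usergroup.': 'default_user_group_perm',
        'hg.create.': 'default_repo_create',
        'hg.usergroup.': 'default_user_group_create',
        'hg.register.': 'default_register',
        'hg.extern_activate.': 'default_extern_activate',
        'hg.fork.': 'default_fork',
    }

    def permission_defaults(permission_names):
        defaults = {}
        for name in permission_names:
            for prefix, field in PREFIX_TO_FIELD.items():
                if name.startswith(prefix):
                    defaults[field] = name
        return defaults

    # permission_defaults(['repository.read', 'hg.fork.none'])
    # -> {'default_repo_perm': 'repository.read', 'default_fork': 'hg.fork.none'}
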
@@ -1,402 +1,402 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.repo_groups |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Repository groups controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Mar 23, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from tg import app_globals, request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from tg.i18n import ungettext |
|
37 | 37 | from webob.exc import HTTPForbidden, HTTPFound, HTTPInternalServerError, HTTPNotFound |
|
38 | 38 | |
|
39 | 39 | import kallithea.lib.helpers as h |
|
40 | from kallithea.controllers import base | |
|
40 | 41 | from kallithea.lib import webutils |
|
41 | 42 | from kallithea.lib.auth import HasPermissionAny, HasRepoGroupPermissionLevel, HasRepoGroupPermissionLevelDecorator, LoginRequired |
|
42 | from kallithea.lib.base import BaseController, render | |
|
43 | 43 | from kallithea.lib.utils2 import safe_int |
|
44 | 44 | from kallithea.lib.webutils import url |
|
45 | 45 | from kallithea.model import db, meta |
|
46 | 46 | from kallithea.model.forms import RepoGroupForm, RepoGroupPermsForm |
|
47 | 47 | from kallithea.model.repo import RepoModel |
|
48 | 48 | from kallithea.model.repo_group import RepoGroupModel |
|
49 | 49 | from kallithea.model.scm import AvailableRepoGroupChoices, RepoGroupList |
|
50 | 50 | |
|
51 | 51 | |
|
52 | 52 | log = logging.getLogger(__name__) |
|
53 | 53 | |
|
54 | 54 | |
|
55 | class RepoGroupsController(BaseController): | |
|
55 | class RepoGroupsController(base.BaseController): | |
|
56 | 56 | |
|
57 | 57 | @LoginRequired(allow_default_user=True) |
|
58 | 58 | def _before(self, *args, **kwargs): |
|
59 | 59 | super(RepoGroupsController, self)._before(*args, **kwargs) |
|
60 | 60 | |
|
61 | 61 | def __load_defaults(self, extras=(), exclude=()): |
|
62 | 62 | """extras is used for keeping current parent ignoring permissions |
|
63 | 63 | exclude is used for not moving group to itself TODO: also exclude descendants |
|
64 | 64 | Note: only admin can create top level groups |
|
65 | 65 | """ |
|
66 | 66 | repo_groups = AvailableRepoGroupChoices('admin', extras) |
|
67 | 67 | exclude_group_ids = set(rg.group_id for rg in exclude) |
|
68 | 68 | c.repo_groups = [rg for rg in repo_groups |
|
69 | 69 | if rg[0] not in exclude_group_ids] |
|
70 | 70 | |
|
71 | 71 | def __load_data(self, group_id): |
|
72 | 72 | """ |
|
73 | 73 | Load defaults settings for edit, and update |
|
74 | 74 | |
|
75 | 75 | :param group_id: |
|
76 | 76 | """ |
|
77 | 77 | repo_group = db.RepoGroup.get_or_404(group_id) |
|
78 | 78 | data = repo_group.get_dict() |
|
79 | 79 | data['group_name'] = repo_group.name |
|
80 | 80 | |
|
81 | 81 | # fill repository group users |
|
82 | 82 | for p in repo_group.repo_group_to_perm: |
|
83 | 83 | data.update({'u_perm_%s' % p.user.username: |
|
84 | 84 | p.permission.permission_name}) |
|
85 | 85 | |
|
86 | 86 | # fill repository group groups |
|
87 | 87 | for p in repo_group.users_group_to_perm: |
|
88 | 88 | data.update({'g_perm_%s' % p.users_group.users_group_name: |
|
89 | 89 | p.permission.permission_name}) |
|
90 | 90 | |
|
91 | 91 | return data |
|
92 | 92 | |
|
93 | 93 | def _revoke_perms_on_yourself(self, form_result): |
|
94 | 94 | _up = [u for u in form_result['perms_updates'] if request.authuser.username == u[0]] |
|
95 | 95 | _new = [u for u in form_result['perms_new'] if request.authuser.username == u[0]] |
|
96 | 96 | if _new and _new[0][1] != 'group.admin' or _up and _up[0][1] != 'group.admin': |
|
97 | 97 | return True |
|
98 | 98 | return False |
|
99 | 99 | |
|
100 | 100 | def index(self, format='html'): |
|
101 | 101 | _list = db.RepoGroup.query(sorted=True).all() |
|
102 | 102 | group_iter = RepoGroupList(_list, perm_level='admin') |
|
103 | 103 | repo_groups_data = [] |
|
104 | 104 | _tmpl_lookup = app_globals.mako_lookup |
|
105 | 105 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') |
|
106 | 106 | |
|
107 | 107 | def repo_group_name(repo_group_name, children_groups): |
|
108 | 108 | return template.get_def("repo_group_name") \ |
|
109 | 109 | .render_unicode(repo_group_name, children_groups, _=_, h=h, c=c) |
|
110 | 110 | |
|
111 | 111 | def repo_group_actions(repo_group_id, repo_group_name, gr_count): |
|
112 | 112 | return template.get_def("repo_group_actions") \ |
|
113 | 113 | .render_unicode(repo_group_id, repo_group_name, gr_count, _=_, h=h, c=c, |
|
114 | 114 | ungettext=ungettext) |
|
115 | 115 | |
|
116 | 116 | for repo_gr in group_iter: |
|
117 | 117 | children_groups = [g.name for g in repo_gr.parents] + [repo_gr.name] |
|
118 | 118 | repo_count = repo_gr.repositories.count() |
|
119 | 119 | repo_groups_data.append({ |
|
120 | 120 | "raw_name": webutils.escape(repo_gr.group_name), |
|
121 | 121 | "group_name": repo_group_name(repo_gr.group_name, children_groups), |
|
122 | 122 | "desc": webutils.escape(repo_gr.group_description), |
|
123 | 123 | "repos": repo_count, |
|
124 | 124 | "owner": repo_gr.owner.username, |
|
125 | 125 | "action": repo_group_actions(repo_gr.group_id, repo_gr.group_name, |
|
126 | 126 | repo_count) |
|
127 | 127 | }) |
|
128 | 128 | |
|
129 | 129 | c.data = { |
|
130 | 130 | "sort": None, |
|
131 | 131 | "dir": "asc", |
|
132 | 132 | "records": repo_groups_data |
|
133 | 133 | } |
|
134 | 134 | |
|
135 | return render('admin/repo_groups/repo_groups.html') | |
|
135 | return base.render('admin/repo_groups/repo_groups.html') | |
|
136 | 136 | |
|
137 | 137 | def create(self): |
|
138 | 138 | self.__load_defaults() |
|
139 | 139 | |
|
140 | 140 | # permissions for can create group based on parent_id are checked |
|
141 | 141 | # here in the Form |
|
142 | 142 | repo_group_form = RepoGroupForm(repo_groups=c.repo_groups) |
|
143 | 143 | form_result = None |
|
144 | 144 | try: |
|
145 | 145 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
146 | 146 | gr = RepoGroupModel().create( |
|
147 | 147 | group_name=form_result['group_name'], |
|
148 | 148 | group_description=form_result['group_description'], |
|
149 | 149 | parent=form_result['parent_group_id'], |
|
150 | 150 | owner=request.authuser.user_id, # TODO: make editable |
|
151 | 151 | copy_permissions=form_result['group_copy_permissions'] |
|
152 | 152 | ) |
|
153 | 153 | meta.Session().commit() |
|
154 | 154 | # TODO: in future action_logger(, '', '', '') |
|
155 | 155 | except formencode.Invalid as errors: |
|
156 | 156 | return htmlfill.render( |
|
157 | render('admin/repo_groups/repo_group_add.html'), | |
|
157 | base.render('admin/repo_groups/repo_group_add.html'), | |
|
158 | 158 | defaults=errors.value, |
|
159 | 159 | errors=errors.error_dict or {}, |
|
160 | 160 | prefix_error=False, |
|
161 | 161 | encoding="UTF-8", |
|
162 | 162 | force_defaults=False) |
|
163 | 163 | except Exception: |
|
164 | 164 | log.error(traceback.format_exc()) |
|
165 | 165 | webutils.flash(_('Error occurred during creation of repository group %s') |
|
166 | 166 | % request.POST.get('group_name'), category='error') |
|
167 | 167 | if form_result is None: |
|
168 | 168 | raise |
|
169 | 169 | parent_group_id = form_result['parent_group_id'] |
|
170 | 170 | # TODO: maybe we should get back to the main view, not the admin one |
|
171 | 171 | raise HTTPFound(location=url('repos_groups', parent_group=parent_group_id)) |
|
172 | 172 | webutils.flash(_('Created repository group %s') % gr.group_name, |
|
173 | 173 | category='success') |
|
174 | 174 | raise HTTPFound(location=url('repos_group_home', group_name=gr.group_name)) |
|
175 | 175 | |
|
176 | 176 | def new(self): |
|
177 | 177 | parent_group_id = safe_int(request.GET.get('parent_group') or '-1') |
|
178 | 178 | if HasPermissionAny('hg.admin')('group create'): |
|
179 | 179 | # we're global admin, we're ok and we can create TOP level groups |
|
180 | 180 | pass |
|
181 | 181 | else: |
|
182 | 182 | # we pass in parent group into creation form, thus we know |
|
183 | 183 | # what would be the group, we can check perms here ! |
|
184 | 184 | group = db.RepoGroup.get(parent_group_id) if parent_group_id else None |
|
185 | 185 | group_name = group.group_name if group else None |
|
186 | 186 | if HasRepoGroupPermissionLevel('admin')(group_name, 'group create'): |
|
187 | 187 | pass |
|
188 | 188 | else: |
|
189 | 189 | raise HTTPForbidden() |
|
190 | 190 | |
|
191 | 191 | self.__load_defaults() |
|
192 | 192 | return htmlfill.render( |
|
193 | render('admin/repo_groups/repo_group_add.html'), | |
|
193 | base.render('admin/repo_groups/repo_group_add.html'), | |
|
194 | 194 | defaults={'parent_group_id': parent_group_id}, |
|
195 | 195 | errors={}, |
|
196 | 196 | prefix_error=False, |
|
197 | 197 | encoding="UTF-8", |
|
198 | 198 | force_defaults=False) |
|
199 | 199 | |
|
200 | 200 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
201 | 201 | def update(self, group_name): |
|
202 | 202 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
203 | 203 | self.__load_defaults(extras=[c.repo_group.parent_group], |
|
204 | 204 | exclude=[c.repo_group]) |
|
205 | 205 | |
|
206 | 206 | # TODO: kill allow_empty_group - it is only used for redundant form validation! |
|
207 | 207 | if HasPermissionAny('hg.admin')('group edit'): |
|
208 | 208 | # we're global admin, we're ok and we can create TOP level groups |
|
209 | 209 | allow_empty_group = True |
|
210 | 210 | elif not c.repo_group.parent_group: |
|
211 | 211 | allow_empty_group = True |
|
212 | 212 | else: |
|
213 | 213 | allow_empty_group = False |
|
214 | 214 | repo_group_form = RepoGroupForm( |
|
215 | 215 | edit=True, |
|
216 | 216 | old_data=c.repo_group.get_dict(), |
|
217 | 217 | repo_groups=c.repo_groups, |
|
218 | 218 | can_create_in_root=allow_empty_group, |
|
219 | 219 | )() |
|
220 | 220 | try: |
|
221 | 221 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
222 | 222 | |
|
223 | 223 | new_gr = RepoGroupModel().update(group_name, form_result) |
|
224 | 224 | meta.Session().commit() |
|
225 | 225 | webutils.flash(_('Updated repository group %s') |
|
226 | 226 | % form_result['group_name'], category='success') |
|
227 | 227 | # we now have new name ! |
|
228 | 228 | group_name = new_gr.group_name |
|
229 | 229 | # TODO: in future action_logger(, '', '', '') |
|
230 | 230 | except formencode.Invalid as errors: |
|
231 | 231 | c.active = 'settings' |
|
232 | 232 | return htmlfill.render( |
|
233 | render('admin/repo_groups/repo_group_edit.html'), | |
|
233 | base.render('admin/repo_groups/repo_group_edit.html'), | |
|
234 | 234 | defaults=errors.value, |
|
235 | 235 | errors=errors.error_dict or {}, |
|
236 | 236 | prefix_error=False, |
|
237 | 237 | encoding="UTF-8", |
|
238 | 238 | force_defaults=False) |
|
239 | 239 | except Exception: |
|
240 | 240 | log.error(traceback.format_exc()) |
|
241 | 241 | webutils.flash(_('Error occurred during update of repository group %s') |
|
242 | 242 | % request.POST.get('group_name'), category='error') |
|
243 | 243 | |
|
244 | 244 | raise HTTPFound(location=url('edit_repo_group', group_name=group_name)) |
|
245 | 245 | |
|
246 | 246 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
247 | 247 | def delete(self, group_name): |
|
248 | 248 | gr = c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
249 | 249 | repos = gr.repositories.all() |
|
250 | 250 | if repos: |
|
251 | 251 | webutils.flash(_('This group contains %s repositories and cannot be ' |
|
252 | 252 | 'deleted') % len(repos), category='warning') |
|
253 | 253 | raise HTTPFound(location=url('repos_groups')) |
|
254 | 254 | |
|
255 | 255 | children = gr.children.all() |
|
256 | 256 | if children: |
|
257 | 257 | webutils.flash(_('This group contains %s subgroups and cannot be deleted' |
|
258 | 258 | % (len(children))), category='warning') |
|
259 | 259 | raise HTTPFound(location=url('repos_groups')) |
|
260 | 260 | |
|
261 | 261 | try: |
|
262 | 262 | RepoGroupModel().delete(group_name) |
|
263 | 263 | meta.Session().commit() |
|
264 | 264 | webutils.flash(_('Removed repository group %s') % group_name, |
|
265 | 265 | category='success') |
|
266 | 266 | # TODO: in future action_logger(, '', '', '') |
|
267 | 267 | except Exception: |
|
268 | 268 | log.error(traceback.format_exc()) |
|
269 | 269 | webutils.flash(_('Error occurred during deletion of repository group %s') |
|
270 | 270 | % group_name, category='error') |
|
271 | 271 | |
|
272 | 272 | if gr.parent_group: |
|
273 | 273 | raise HTTPFound(location=url('repos_group_home', group_name=gr.parent_group.group_name)) |
|
274 | 274 | raise HTTPFound(location=url('repos_groups')) |
|
275 | 275 | |
|
276 | 276 | def show_by_name(self, group_name): |
|
277 | 277 | """ |
|
278 | 278 | This is a proxy that does a lookup group_name -> id, and shows |
|
279 | 279 | the group by id view instead |
|
280 | 280 | """ |
|
281 | 281 | group_name = group_name.rstrip('/') |
|
282 | 282 | id_ = db.RepoGroup.get_by_group_name(group_name) |
|
283 | 283 | if id_: |
|
284 | 284 | return self.show(group_name) |
|
285 | 285 | raise HTTPNotFound |
|
286 | 286 | |
|
287 | 287 | @HasRepoGroupPermissionLevelDecorator('read') |
|
288 | 288 | def show(self, group_name): |
|
289 | 289 | c.active = 'settings' |
|
290 | 290 | |
|
291 | 291 | c.group = c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
292 | 292 | |
|
293 | 293 | groups = db.RepoGroup.query(sorted=True).filter_by(parent_group=c.group).all() |
|
294 | 294 | repo_groups_list = self.scm_model.get_repo_groups(groups) |
|
295 | 295 | |
|
296 | 296 | repos_list = db.Repository.query(sorted=True).filter_by(group=c.group).all() |
|
297 | 297 | c.data = RepoModel().get_repos_as_dict(repos_list, |
|
298 | 298 | repo_groups_list=repo_groups_list, |
|
299 | 299 | short_name=True) |
|
300 | 300 | |
|
301 | return render('admin/repo_groups/repo_group_show.html') | |
|
301 | return base.render('admin/repo_groups/repo_group_show.html') | |
|
302 | 302 | |
|
303 | 303 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
304 | 304 | def edit(self, group_name): |
|
305 | 305 | c.active = 'settings' |
|
306 | 306 | |
|
307 | 307 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
308 | 308 | self.__load_defaults(extras=[c.repo_group.parent_group], |
|
309 | 309 | exclude=[c.repo_group]) |
|
310 | 310 | defaults = self.__load_data(c.repo_group.group_id) |
|
311 | 311 | |
|
312 | 312 | return htmlfill.render( |
|
313 | render('admin/repo_groups/repo_group_edit.html'), | |
|
313 | base.render('admin/repo_groups/repo_group_edit.html'), | |
|
314 | 314 | defaults=defaults, |
|
315 | 315 | encoding="UTF-8", |
|
316 | 316 | force_defaults=False |
|
317 | 317 | ) |
|
318 | 318 | |
|
319 | 319 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
320 | 320 | def edit_repo_group_advanced(self, group_name): |
|
321 | 321 | c.active = 'advanced' |
|
322 | 322 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
323 | 323 | |
|
324 | return render('admin/repo_groups/repo_group_edit.html') | |
|
324 | return base.render('admin/repo_groups/repo_group_edit.html') | |
|
325 | 325 | |
|
326 | 326 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
327 | 327 | def edit_repo_group_perms(self, group_name): |
|
328 | 328 | c.active = 'perms' |
|
329 | 329 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
330 | 330 | self.__load_defaults() |
|
331 | 331 | defaults = self.__load_data(c.repo_group.group_id) |
|
332 | 332 | |
|
333 | 333 | return htmlfill.render( |
|
334 | render('admin/repo_groups/repo_group_edit.html'), | |
|
334 | base.render('admin/repo_groups/repo_group_edit.html'), | |
|
335 | 335 | defaults=defaults, |
|
336 | 336 | encoding="UTF-8", |
|
337 | 337 | force_defaults=False |
|
338 | 338 | ) |
|
339 | 339 | |
|
340 | 340 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
341 | 341 | def update_perms(self, group_name): |
|
342 | 342 | """ |
|
343 | 343 | Update permissions for given repository group |
|
344 | 344 | |
|
345 | 345 | :param group_name: |
|
346 | 346 | """ |
|
347 | 347 | |
|
348 | 348 | c.repo_group = db.RepoGroup.guess_instance(group_name) |
|
349 | 349 | valid_recursive_choices = ['none', 'repos', 'groups', 'all'] |
|
350 | 350 | form_result = RepoGroupPermsForm(valid_recursive_choices)().to_python(request.POST) |
|
351 | 351 | if not request.authuser.is_admin: |
|
352 | 352 | if self._revoke_perms_on_yourself(form_result): |
|
353 | 353 | msg = _('Cannot revoke permission for yourself as admin') |
|
354 | 354 | webutils.flash(msg, category='warning') |
|
355 | 355 | raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name)) |
|
356 | 356 | recursive = form_result['recursive'] |
|
357 | 357 | # iterate over all members(if in recursive mode) of this groups and |
|
358 | 358 | # set the permissions ! |
|
359 | 359 | # this can be potentially heavy operation |
|
360 | 360 | RepoGroupModel()._update_permissions(c.repo_group, |
|
361 | 361 | form_result['perms_new'], |
|
362 | 362 | form_result['perms_updates'], |
|
363 | 363 | recursive) |
|
364 | 364 | # TODO: implement this |
|
365 | 365 | #action_logger(request.authuser, 'admin_changed_repo_permissions', |
|
366 | 366 | # repo_name, request.ip_addr) |
|
367 | 367 | meta.Session().commit() |
|
368 | 368 | webutils.flash(_('Repository group permissions updated'), category='success') |
|
369 | 369 | raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name)) |
|
370 | 370 | |
|
371 | 371 | @HasRepoGroupPermissionLevelDecorator('admin') |
|
372 | 372 | def delete_perms(self, group_name): |
|
373 | 373 | try: |
|
374 | 374 | obj_type = request.POST.get('obj_type') |
|
375 | 375 | obj_id = None |
|
376 | 376 | if obj_type == 'user': |
|
377 | 377 | obj_id = safe_int(request.POST.get('user_id')) |
|
378 | 378 | elif obj_type == 'user_group': |
|
379 | 379 | obj_id = safe_int(request.POST.get('user_group_id')) |
|
380 | 380 | |
|
381 | 381 | if not request.authuser.is_admin: |
|
382 | 382 | if obj_type == 'user' and request.authuser.user_id == obj_id: |
|
383 | 383 | msg = _('Cannot revoke permission for yourself as admin') |
|
384 | 384 | webutils.flash(msg, category='warning') |
|
385 | 385 | raise Exception('revoke admin permission on self') |
|
386 | 386 | recursive = request.POST.get('recursive', 'none') |
|
387 | 387 | if obj_type == 'user': |
|
388 | 388 | RepoGroupModel().delete_permission(repo_group=group_name, |
|
389 | 389 | obj=obj_id, obj_type='user', |
|
390 | 390 | recursive=recursive) |
|
391 | 391 | elif obj_type == 'user_group': |
|
392 | 392 | RepoGroupModel().delete_permission(repo_group=group_name, |
|
393 | 393 | obj=obj_id, |
|
394 | 394 | obj_type='user_group', |
|
395 | 395 | recursive=recursive) |
|
396 | 396 | |
|
397 | 397 | meta.Session().commit() |
|
398 | 398 | except Exception: |
|
399 | 399 | log.error(traceback.format_exc()) |
|
400 | 400 | webutils.flash(_('An error occurred during revoking of permission'), |
|
401 | 401 | category='error') |
|
402 | 402 | raise HTTPInternalServerError() |
@@ -1,513 +1,513 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.repos |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Repositories controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 7, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from tg import request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPForbidden, HTTPFound, HTTPInternalServerError, HTTPNotFound |
|
37 | 37 | |
|
38 | 38 | import kallithea |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib import webutils |
|
40 | 41 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired, NotAnonymous |
|
41 | from kallithea.lib.base import BaseRepoController, jsonify, render | |
|
42 | 42 | from kallithea.lib.exceptions import AttachedForksError |
|
43 | 43 | from kallithea.lib.utils2 import safe_int |
|
44 | 44 | from kallithea.lib.vcs import RepositoryError |
|
45 | 45 | from kallithea.lib.webutils import url |
|
46 | 46 | from kallithea.model import db, meta, userlog |
|
47 | 47 | from kallithea.model.forms import RepoFieldForm, RepoForm, RepoPermsForm |
|
48 | 48 | from kallithea.model.repo import RepoModel |
|
49 | 49 | from kallithea.model.scm import AvailableRepoGroupChoices, RepoList, ScmModel |
|
50 | 50 | |
|
51 | 51 | |
|
52 | 52 | log = logging.getLogger(__name__) |
|
53 | 53 | |
|
54 | 54 | |
|
55 | class ReposController(BaseRepoController): | |
|
55 | class ReposController(base.BaseRepoController): | |
|
56 | 56 | |
|
57 | 57 | @LoginRequired(allow_default_user=True) |
|
58 | 58 | def _before(self, *args, **kwargs): |
|
59 | 59 | super(ReposController, self)._before(*args, **kwargs) |
|
60 | 60 | |
|
61 | 61 | def _load_repo(self): |
|
62 | 62 | repo_obj = c.db_repo |
|
63 | 63 | |
|
64 | 64 | if repo_obj is None: |
|
65 | 65 | raise HTTPNotFound() |
|
66 | 66 | |
|
67 | 67 | return repo_obj |
|
68 | 68 | |
|
69 | 69 | def __load_defaults(self, repo=None): |
|
70 | 70 | extras = [] if repo is None else [repo.group] |
|
71 | 71 | |
|
72 | 72 | c.repo_groups = AvailableRepoGroupChoices('write', extras) |
|
73 | 73 | |
|
74 | 74 | c.landing_revs_choices, c.landing_revs = ScmModel().get_repo_landing_revs(repo) |
|
75 | 75 | |
|
76 | 76 | def __load_data(self): |
|
77 | 77 | """ |
|
78 | 78 | Load defaults settings for edit, and update |
|
79 | 79 | """ |
|
80 | 80 | c.repo_info = self._load_repo() |
|
81 | 81 | self.__load_defaults(c.repo_info) |
|
82 | 82 | |
|
83 | 83 | defaults = RepoModel()._get_defaults(c.repo_name) |
|
84 | 84 | defaults['clone_uri'] = c.repo_info.clone_uri_hidden # don't show password |
|
85 | 85 | defaults['permanent_url'] = c.repo_info.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=True) |
|
86 | 86 | |
|
87 | 87 | return defaults |
|
88 | 88 | |
|
89 | 89 | def index(self, format='html'): |
|
90 | 90 | repos_list = RepoList(db.Repository.query(sorted=True).all(), perm_level='admin') |
|
91 | 91 | # the repo list will be filtered to only show repos where the user has read permissions |
|
92 | 92 | repos_data = RepoModel().get_repos_as_dict(repos_list, admin=True) |
|
93 | 93 | # data used to render the grid |
|
94 | 94 | c.data = repos_data |
|
95 | 95 | |
|
96 | return render('admin/repos/repos.html') | |
|
96 | return base.render('admin/repos/repos.html') | |
|
97 | 97 | |
|
98 | 98 | @NotAnonymous() |
|
99 | 99 | def create(self): |
|
100 | 100 | self.__load_defaults() |
|
101 | 101 | try: |
|
102 | 102 | # CanWriteGroup validators checks permissions of this POST |
|
103 | 103 | form_result = RepoForm(repo_groups=c.repo_groups, |
|
104 | 104 | landing_revs=c.landing_revs_choices)() \ |
|
105 | 105 | .to_python(dict(request.POST)) |
|
106 | 106 | except formencode.Invalid as errors: |
|
107 | 107 | log.info(errors) |
|
108 | 108 | return htmlfill.render( |
|
109 | render('admin/repos/repo_add.html'), | |
|
109 | base.render('admin/repos/repo_add.html'), | |
|
110 | 110 | defaults=errors.value, |
|
111 | 111 | errors=errors.error_dict or {}, |
|
112 | 112 | prefix_error=False, |
|
113 | 113 | force_defaults=False, |
|
114 | 114 | encoding="UTF-8") |
|
115 | 115 | |
|
116 | 116 | try: |
|
117 | 117 | # create is done sometimes async on celery, db transaction |
|
118 | 118 | # management is handled there. |
|
119 | 119 | task = RepoModel().create(form_result, request.authuser.user_id) |
|
120 | 120 | task_id = task.task_id |
|
121 | 121 | except Exception: |
|
122 | 122 | log.error(traceback.format_exc()) |
|
123 | 123 | msg = (_('Error creating repository %s') |
|
124 | 124 | % form_result.get('repo_name')) |
|
125 | 125 | webutils.flash(msg, category='error') |
|
126 | 126 | raise HTTPFound(location=url('home')) |
|
127 | 127 | |
|
128 | 128 | raise HTTPFound(location=webutils.url('repo_creating_home', |
|
129 | 129 | repo_name=form_result['repo_name_full'], |
|
130 | 130 | task_id=task_id)) |
|
131 | 131 | |
|
132 | 132 | @NotAnonymous() |
|
133 | 133 | def create_repository(self): |
|
134 | 134 | self.__load_defaults() |
|
135 | 135 | if not c.repo_groups: |
|
136 | 136 | raise HTTPForbidden |
|
137 | 137 | parent_group = request.GET.get('parent_group') |
|
138 | 138 | |
|
139 | 139 | ## apply the defaults from defaults page |
|
140 | 140 | defaults = db.Setting.get_default_repo_settings(strip_prefix=True) |
|
141 | 141 | if parent_group: |
|
142 | 142 | prg = db.RepoGroup.get(parent_group) |
|
143 | 143 | if prg is None or not any(rgc[0] == prg.group_id |
|
144 | 144 | for rgc in c.repo_groups): |
|
145 | 145 | raise HTTPForbidden |
|
146 | 146 | else: |
|
147 | 147 | parent_group = '-1' |
|
148 | 148 | defaults.update({'repo_group': parent_group}) |
|
149 | 149 | |
|
150 | 150 | return htmlfill.render( |
|
151 | render('admin/repos/repo_add.html'), | |
|
151 | base.render('admin/repos/repo_add.html'), | |
|
152 | 152 | defaults=defaults, |
|
153 | 153 | errors={}, |
|
154 | 154 | prefix_error=False, |
|
155 | 155 | encoding="UTF-8", |
|
156 | 156 | force_defaults=False) |
|
157 | 157 | |
|
158 | 158 | @LoginRequired() |
|
159 | 159 | def repo_creating(self, repo_name): |
|
160 | 160 | c.repo = repo_name |
|
161 | 161 | c.task_id = request.GET.get('task_id') |
|
162 | 162 | if not c.repo: |
|
163 | 163 | raise HTTPNotFound() |
|
164 | return render('admin/repos/repo_creating.html') | |
|
164 | return base.render('admin/repos/repo_creating.html') | |
|
165 | 165 | |
|
166 | 166 | @LoginRequired() |
|
167 | @jsonify | |
|
167 | @base.jsonify | |
|
168 | 168 | def repo_check(self, repo_name): |
|
169 | 169 | c.repo = repo_name |
|
170 | 170 | task_id = request.GET.get('task_id') |
|
171 | 171 | |
|
172 | 172 | if task_id and task_id not in ['None']: |
|
173 | 173 | if kallithea.CELERY_APP: |
|
174 | 174 | task_result = kallithea.CELERY_APP.AsyncResult(task_id) |
|
175 | 175 | if task_result.failed(): |
|
176 | 176 | raise HTTPInternalServerError(task_result.traceback) |
|
177 | 177 | |
|
178 | 178 | repo = db.Repository.get_by_repo_name(repo_name) |
|
179 | 179 | if repo and repo.repo_state == db.Repository.STATE_CREATED: |
|
180 | 180 | if repo.clone_uri: |
|
181 | 181 | webutils.flash(_('Created repository %s from %s') |
|
182 | 182 | % (repo.repo_name, repo.clone_uri_hidden), category='success') |
|
183 | 183 | else: |
|
184 | 184 | repo_url = webutils.link_to(repo.repo_name, |
|
185 | 185 | webutils.url('summary_home', |
|
186 | 186 | repo_name=repo.repo_name)) |
|
187 | 187 | fork = repo.fork |
|
188 | 188 | if fork is not None: |
|
189 | 189 | fork_name = fork.repo_name |
|
190 | 190 | webutils.flash(webutils.HTML(_('Forked repository %s as %s')) |
|
191 | 191 | % (fork_name, repo_url), category='success') |
|
192 | 192 | else: |
|
193 | 193 | webutils.flash(webutils.HTML(_('Created repository %s')) % repo_url, |
|
194 | 194 | category='success') |
|
195 | 195 | return {'result': True} |
|
196 | 196 | return {'result': False} |
|
197 | 197 | |
|
198 | 198 | @HasRepoPermissionLevelDecorator('admin') |
|
199 | 199 | def update(self, repo_name): |
|
200 | 200 | c.repo_info = self._load_repo() |
|
201 | 201 | self.__load_defaults(c.repo_info) |
|
202 | 202 | c.active = 'settings' |
|
203 | 203 | c.repo_fields = db.RepositoryField.query() \ |
|
204 | 204 | .filter(db.RepositoryField.repository == c.repo_info).all() |
|
205 | 205 | |
|
206 | 206 | repo_model = RepoModel() |
|
207 | 207 | changed_name = repo_name |
|
208 | 208 | repo = db.Repository.get_by_repo_name(repo_name) |
|
209 | 209 | old_data = { |
|
210 | 210 | 'repo_name': repo_name, |
|
211 | 211 | 'repo_group': repo.group.get_dict() if repo.group else {}, |
|
212 | 212 | 'repo_type': repo.repo_type, |
|
213 | 213 | } |
|
214 | 214 | _form = RepoForm(edit=True, old_data=old_data, |
|
215 | 215 | repo_groups=c.repo_groups, |
|
216 | 216 | landing_revs=c.landing_revs_choices)() |
|
217 | 217 | |
|
218 | 218 | try: |
|
219 | 219 | form_result = _form.to_python(dict(request.POST)) |
|
220 | 220 | repo = repo_model.update(repo_name, **form_result) |
|
221 | 221 | ScmModel().mark_for_invalidation(repo_name) |
|
222 | 222 | webutils.flash(_('Repository %s updated successfully') % repo_name, |
|
223 | 223 | category='success') |
|
224 | 224 | changed_name = repo.repo_name |
|
225 | 225 | userlog.action_logger(request.authuser, 'admin_updated_repo', |
|
226 | 226 | changed_name, request.ip_addr) |
|
227 | 227 | meta.Session().commit() |
|
228 | 228 | except formencode.Invalid as errors: |
|
229 | 229 | log.info(errors) |
|
230 | 230 | defaults = self.__load_data() |
|
231 | 231 | defaults.update(errors.value) |
|
232 | 232 | return htmlfill.render( |
|
233 | render('admin/repos/repo_edit.html'), | |
|
233 | base.render('admin/repos/repo_edit.html'), | |
|
234 | 234 | defaults=defaults, |
|
235 | 235 | errors=errors.error_dict or {}, |
|
236 | 236 | prefix_error=False, |
|
237 | 237 | encoding="UTF-8", |
|
238 | 238 | force_defaults=False) |
|
239 | 239 | |
|
240 | 240 | except Exception: |
|
241 | 241 | log.error(traceback.format_exc()) |
|
242 | 242 | webutils.flash(_('Error occurred during update of repository %s') |
|
243 | 243 | % repo_name, category='error') |
|
244 | 244 | raise HTTPFound(location=url('edit_repo', repo_name=changed_name)) |
|
245 | 245 | |
|
246 | 246 | @HasRepoPermissionLevelDecorator('admin') |
|
247 | 247 | def delete(self, repo_name): |
|
248 | 248 | repo_model = RepoModel() |
|
249 | 249 | repo = repo_model.get_by_repo_name(repo_name) |
|
250 | 250 | if not repo: |
|
251 | 251 | raise HTTPNotFound() |
|
252 | 252 | try: |
|
253 | 253 | _forks = repo.forks.count() |
|
254 | 254 | handle_forks = None |
|
255 | 255 | if _forks and request.POST.get('forks'): |
|
256 | 256 | do = request.POST['forks'] |
|
257 | 257 | if do == 'detach_forks': |
|
258 | 258 | handle_forks = 'detach' |
|
259 | 259 | webutils.flash(_('Detached %s forks') % _forks, category='success') |
|
260 | 260 | elif do == 'delete_forks': |
|
261 | 261 | handle_forks = 'delete' |
|
262 | 262 | webutils.flash(_('Deleted %s forks') % _forks, category='success') |
|
263 | 263 | repo_model.delete(repo, forks=handle_forks) |
|
264 | 264 | userlog.action_logger(request.authuser, 'admin_deleted_repo', |
|
265 | 265 | repo_name, request.ip_addr) |
|
266 | 266 | ScmModel().mark_for_invalidation(repo_name) |
|
267 | 267 | webutils.flash(_('Deleted repository %s') % repo_name, category='success') |
|
268 | 268 | meta.Session().commit() |
|
269 | 269 | except AttachedForksError: |
|
270 | 270 | webutils.flash(_('Cannot delete repository %s which still has forks') |
|
271 | 271 | % repo_name, category='warning') |
|
272 | 272 | |
|
273 | 273 | except Exception: |
|
274 | 274 | log.error(traceback.format_exc()) |
|
275 | 275 | webutils.flash(_('An error occurred during deletion of %s') % repo_name, |
|
276 | 276 | category='error') |
|
277 | 277 | |
|
278 | 278 | if repo.group: |
|
279 | 279 | raise HTTPFound(location=url('repos_group_home', group_name=repo.group.group_name)) |
|
280 | 280 | raise HTTPFound(location=url('repos')) |
|
281 | 281 | |
|
282 | 282 | @HasRepoPermissionLevelDecorator('admin') |
|
283 | 283 | def edit(self, repo_name): |
|
284 | 284 | defaults = self.__load_data() |
|
285 | 285 | c.repo_fields = db.RepositoryField.query() \ |
|
286 | 286 | .filter(db.RepositoryField.repository == c.repo_info).all() |
|
287 | 287 | c.active = 'settings' |
|
288 | 288 | return htmlfill.render( |
|
289 | render('admin/repos/repo_edit.html'), | |
|
289 | base.render('admin/repos/repo_edit.html'), | |
|
290 | 290 | defaults=defaults, |
|
291 | 291 | encoding="UTF-8", |
|
292 | 292 | force_defaults=False) |
|
293 | 293 | |
|
294 | 294 | @HasRepoPermissionLevelDecorator('admin') |
|
295 | 295 | def edit_permissions(self, repo_name): |
|
296 | 296 | c.repo_info = self._load_repo() |
|
297 | 297 | c.active = 'permissions' |
|
298 | 298 | defaults = RepoModel()._get_defaults(repo_name) |
|
299 | 299 | |
|
300 | 300 | return htmlfill.render( |
|
301 | render('admin/repos/repo_edit.html'), | |
|
301 | base.render('admin/repos/repo_edit.html'), | |
|
302 | 302 | defaults=defaults, |
|
303 | 303 | encoding="UTF-8", |
|
304 | 304 | force_defaults=False) |
|
305 | 305 | |
|
306 | 306 | @HasRepoPermissionLevelDecorator('admin') |
|
307 | 307 | def edit_permissions_update(self, repo_name): |
|
308 | 308 | form = RepoPermsForm()().to_python(request.POST) |
|
309 | 309 | RepoModel()._update_permissions(repo_name, form['perms_new'], |
|
310 | 310 | form['perms_updates']) |
|
311 | 311 | # TODO: implement this |
|
312 | 312 | #action_logger(request.authuser, 'admin_changed_repo_permissions', |
|
313 | 313 | # repo_name, request.ip_addr) |
|
314 | 314 | meta.Session().commit() |
|
315 | 315 | webutils.flash(_('Repository permissions updated'), category='success') |
|
316 | 316 | raise HTTPFound(location=url('edit_repo_perms', repo_name=repo_name)) |
|
317 | 317 | |
|
318 | 318 | @HasRepoPermissionLevelDecorator('admin') |
|
319 | 319 | def edit_permissions_revoke(self, repo_name): |
|
320 | 320 | try: |
|
321 | 321 | obj_type = request.POST.get('obj_type') |
|
322 | 322 | obj_id = None |
|
323 | 323 | if obj_type == 'user': |
|
324 | 324 | obj_id = safe_int(request.POST.get('user_id')) |
|
325 | 325 | elif obj_type == 'user_group': |
|
326 | 326 | obj_id = safe_int(request.POST.get('user_group_id')) |
|
327 | 327 | else: |
|
328 | 328 | assert False |
|
329 | 329 | |
|
330 | 330 | if obj_type == 'user': |
|
331 | 331 | RepoModel().revoke_user_permission(repo=repo_name, user=obj_id) |
|
332 | 332 | elif obj_type == 'user_group': |
|
333 | 333 | RepoModel().revoke_user_group_permission( |
|
334 | 334 | repo=repo_name, group_name=obj_id |
|
335 | 335 | ) |
|
336 | 336 | else: |
|
337 | 337 | assert False |
|
338 | 338 | # TODO: implement this |
|
339 | 339 | #action_logger(request.authuser, 'admin_revoked_repo_permissions', |
|
340 | 340 | # repo_name, request.ip_addr) |
|
341 | 341 | meta.Session().commit() |
|
342 | 342 | except Exception: |
|
343 | 343 | log.error(traceback.format_exc()) |
|
344 | 344 | webutils.flash(_('An error occurred during revoking of permission'), |
|
345 | 345 | category='error') |
|
346 | 346 | raise HTTPInternalServerError() |
|
347 | 347 | return [] |
|
348 | 348 | |
|
349 | 349 | @HasRepoPermissionLevelDecorator('admin') |
|
350 | 350 | def edit_fields(self, repo_name): |
|
351 | 351 | c.repo_info = self._load_repo() |
|
352 | 352 | c.repo_fields = db.RepositoryField.query() \ |
|
353 | 353 | .filter(db.RepositoryField.repository == c.repo_info).all() |
|
354 | 354 | c.active = 'fields' |
|
355 | 355 | if request.POST: |
|
356 | 356 | |
|
357 | 357 | raise HTTPFound(location=url('repo_edit_fields')) |
|
358 | return render('admin/repos/repo_edit.html') | |
|
358 | return base.render('admin/repos/repo_edit.html') | |
|
359 | 359 | |
|
360 | 360 | @HasRepoPermissionLevelDecorator('admin') |
|
361 | 361 | def create_repo_field(self, repo_name): |
|
362 | 362 | try: |
|
363 | 363 | form_result = RepoFieldForm()().to_python(dict(request.POST)) |
|
364 | 364 | new_field = db.RepositoryField() |
|
365 | 365 | new_field.repository = db.Repository.get_by_repo_name(repo_name) |
|
366 | 366 | new_field.field_key = form_result['new_field_key'] |
|
367 | 367 | new_field.field_type = form_result['new_field_type'] # python type |
|
368 | 368 | new_field.field_value = form_result['new_field_value'] # set initial blank value |
|
369 | 369 | new_field.field_desc = form_result['new_field_desc'] |
|
370 | 370 | new_field.field_label = form_result['new_field_label'] |
|
371 | 371 | meta.Session().add(new_field) |
|
372 | 372 | meta.Session().commit() |
|
373 | 373 | except formencode.Invalid as e: |
|
374 | 374 | webutils.flash(_('Field validation error: %s') % e.msg, category='error') |
|
375 | 375 | except Exception as e: |
|
376 | 376 | log.error(traceback.format_exc()) |
|
377 | 377 | webutils.flash(_('An error occurred during creation of field: %r') % e, category='error') |
|
378 | 378 | raise HTTPFound(location=url('edit_repo_fields', repo_name=repo_name)) |
|
379 | 379 | |
|
380 | 380 | @HasRepoPermissionLevelDecorator('admin') |
|
381 | 381 | def delete_repo_field(self, repo_name, field_id): |
|
382 | 382 | field = db.RepositoryField.get_or_404(field_id) |
|
383 | 383 | try: |
|
384 | 384 | meta.Session().delete(field) |
|
385 | 385 | meta.Session().commit() |
|
386 | 386 | except Exception as e: |
|
387 | 387 | log.error(traceback.format_exc()) |
|
388 | 388 | msg = _('An error occurred during removal of field') |
|
389 | 389 | webutils.flash(msg, category='error') |
|
390 | 390 | raise HTTPFound(location=url('edit_repo_fields', repo_name=repo_name)) |
|
391 | 391 | |
|
392 | 392 | @HasRepoPermissionLevelDecorator('admin') |
|
393 | 393 | def edit_advanced(self, repo_name): |
|
394 | 394 | c.repo_info = self._load_repo() |
|
395 | 395 | c.default_user_id = kallithea.DEFAULT_USER_ID |
|
396 | 396 | c.in_public_journal = db.UserFollowing.query() \ |
|
397 | 397 | .filter(db.UserFollowing.user_id == c.default_user_id) \ |
|
398 | 398 | .filter(db.UserFollowing.follows_repository == c.repo_info).scalar() |
|
399 | 399 | |
|
400 | 400 | _repos = db.Repository.query(sorted=True).all() |
|
401 | 401 | read_access_repos = RepoList(_repos, perm_level='read') |
|
402 | 402 | c.repos_list = [(None, _('-- Not a fork --'))] |
|
403 | 403 | c.repos_list += [(x.repo_id, x.repo_name) |
|
404 | 404 | for x in read_access_repos |
|
405 | 405 | if x.repo_id != c.repo_info.repo_id |
|
406 | 406 | and x.repo_type == c.repo_info.repo_type] |
|
407 | 407 | |
|
408 | 408 | defaults = { |
|
409 | 409 | 'id_fork_of': c.repo_info.fork_id if c.repo_info.fork_id else '' |
|
410 | 410 | } |
|
411 | 411 | |
|
412 | 412 | c.active = 'advanced' |
|
413 | 413 | if request.POST: |
|
414 | 414 | raise HTTPFound(location=url('repo_edit_advanced')) |
|
415 | 415 | return htmlfill.render( |
|
416 | render('admin/repos/repo_edit.html'), | |
|
416 | base.render('admin/repos/repo_edit.html'), | |
|
417 | 417 | defaults=defaults, |
|
418 | 418 | encoding="UTF-8", |
|
419 | 419 | force_defaults=False) |
|
420 | 420 | |
|
421 | 421 | @HasRepoPermissionLevelDecorator('admin') |
|
422 | 422 | def edit_advanced_journal(self, repo_name): |
|
423 | 423 | """ |
|
424 | 424 | Sets this repository to be visible in public journal, |
|
425 | 425 | in other words asking default user to follow this repo |
|
426 | 426 | |
|
427 | 427 | :param repo_name: |
|
428 | 428 | """ |
|
429 | 429 | |
|
430 | 430 | try: |
|
431 | 431 | repo_id = db.Repository.get_by_repo_name(repo_name).repo_id |
|
432 | 432 | user_id = kallithea.DEFAULT_USER_ID |
|
433 | 433 | self.scm_model.toggle_following_repo(repo_id, user_id) |
|
434 | 434 | webutils.flash(_('Updated repository visibility in public journal'), |
|
435 | 435 | category='success') |
|
436 | 436 | meta.Session().commit() |
|
437 | 437 | except Exception: |
|
438 | 438 | webutils.flash(_('An error occurred during setting this' |
|
439 | 439 | ' repository in public journal'), |
|
440 | 440 | category='error') |
|
441 | 441 | raise HTTPFound(location=url('edit_repo_advanced', repo_name=repo_name)) |
|
442 | 442 | |
|
443 | 443 | @HasRepoPermissionLevelDecorator('admin') |
|
444 | 444 | def edit_advanced_fork(self, repo_name): |
|
445 | 445 | """ |
|
446 | 446 | Mark given repository as a fork of another |
|
447 | 447 | |
|
448 | 448 | :param repo_name: |
|
449 | 449 | """ |
|
450 | 450 | try: |
|
451 | 451 | fork_id = request.POST.get('id_fork_of') |
|
452 | 452 | repo = ScmModel().mark_as_fork(repo_name, fork_id, |
|
453 | 453 | request.authuser.username) |
|
454 | 454 | fork = repo.fork.repo_name if repo.fork else _('Nothing') |
|
455 | 455 | meta.Session().commit() |
|
456 | 456 | webutils.flash(_('Marked repository %s as fork of %s') % (repo_name, fork), |
|
457 | 457 | category='success') |
|
458 | 458 | except RepositoryError as e: |
|
459 | 459 | log.error(traceback.format_exc()) |
|
460 | 460 | webutils.flash(e, category='error') |
|
461 | 461 | except Exception as e: |
|
462 | 462 | log.error(traceback.format_exc()) |
|
463 | 463 | webutils.flash(_('An error occurred during this operation'), |
|
464 | 464 | category='error') |
|
465 | 465 | |
|
466 | 466 | raise HTTPFound(location=url('edit_repo_advanced', repo_name=repo_name)) |
|
467 | 467 | |
|
468 | 468 | @HasRepoPermissionLevelDecorator('admin') |
|
469 | 469 | def edit_remote(self, repo_name): |
|
470 | 470 | c.repo_info = self._load_repo() |
|
471 | 471 | c.active = 'remote' |
|
472 | 472 | if request.POST: |
|
473 | 473 | try: |
|
474 | 474 | ScmModel().pull_changes(repo_name, request.authuser.username, request.ip_addr) |
|
475 | 475 | webutils.flash(_('Pulled from remote location'), category='success') |
|
476 | 476 | except Exception as e: |
|
477 | 477 | log.error(traceback.format_exc()) |
|
478 | 478 | webutils.flash(_('An error occurred during pull from remote location'), |
|
479 | 479 | category='error') |
|
480 | 480 | raise HTTPFound(location=url('edit_repo_remote', repo_name=c.repo_name)) |
|
481 | return render('admin/repos/repo_edit.html') | |
|
481 | return base.render('admin/repos/repo_edit.html') | |
|
482 | 482 | |
|
483 | 483 | @HasRepoPermissionLevelDecorator('admin') |
|
484 | 484 | def edit_statistics(self, repo_name): |
|
485 | 485 | c.repo_info = self._load_repo() |
|
486 | 486 | repo = c.repo_info.scm_instance |
|
487 | 487 | |
|
488 | 488 | if c.repo_info.stats: |
|
489 | 489 | # this is on what revision we ended up so we add +1 for count |
|
490 | 490 | last_rev = c.repo_info.stats.stat_on_revision + 1 |
|
491 | 491 | else: |
|
492 | 492 | last_rev = 0 |
|
493 | 493 | c.stats_revision = last_rev |
|
494 | 494 | |
|
495 | 495 | c.repo_last_rev = repo.count() if repo.revisions else 0 |
|
496 | 496 | |
|
497 | 497 | if last_rev == 0 or c.repo_last_rev == 0: |
|
498 | 498 | c.stats_percentage = 0 |
|
499 | 499 | else: |
|
500 | 500 | c.stats_percentage = '%.2f' % ((float((last_rev)) / c.repo_last_rev) * 100) |
|
501 | 501 | |
|
502 | 502 | c.active = 'statistics' |
|
503 | 503 | if request.POST: |
|
504 | 504 | try: |
|
505 | 505 | RepoModel().delete_stats(repo_name) |
|
506 | 506 | meta.Session().commit() |
|
507 | 507 | except Exception as e: |
|
508 | 508 | log.error(traceback.format_exc()) |
|
509 | 509 | webutils.flash(_('An error occurred during deletion of repository stats'), |
|
510 | 510 | category='error') |
|
511 | 511 | raise HTTPFound(location=url('edit_repo_statistics', repo_name=c.repo_name)) |
|
512 | 512 | |
|
513 | return render('admin/repos/repo_edit.html') | |
|
513 | return base.render('admin/repos/repo_edit.html') |
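
The repositories-controller hunk above follows one mechanical pattern: the names previously imported out of ``kallithea.lib.base`` (``BaseRepoController``, ``jsonify``, ``render``) are now reached through a single ``from kallithea.controllers import base`` import and qualified at every call site. A minimal sketch of the before/after, using a hypothetical controller rather than quoting the change itself::

    # Before (removed by this hunk):
    #     from kallithea.lib.base import BaseRepoController, jsonify, render
    # After: import the module once and qualify each use.
    from kallithea.controllers import base


    class ExampleController(base.BaseRepoController):  # hypothetical name

        @base.jsonify
        def check(self):
            # JSON endpoints are unchanged; only the decorator is module-qualified
            return {'result': True}

        def edit(self):
            # templates are rendered through the same helper, now module-qualified
            return base.render('admin/repos/repo_edit.html')

The same substitution is applied in the settings and user-group controller hunks that follow.
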
@@ -1,410 +1,410 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.settings |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | settings controller for Kallithea admin |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Jul 14, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from tg import config, request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPFound |
|
37 | 37 | |
|
38 | 38 | import kallithea |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib import webutils |
|
40 | 41 | from kallithea.lib.auth import HasPermissionAnyDecorator, LoginRequired |
|
41 | from kallithea.lib.base import BaseController, render | |
|
42 | 42 | from kallithea.lib.utils import repo2db_mapper, set_app_settings |
|
43 | 43 | from kallithea.lib.utils2 import safe_str |
|
44 | 44 | from kallithea.lib.vcs import VCSError |
|
45 | 45 | from kallithea.lib.webutils import url |
|
46 | 46 | from kallithea.model import async_tasks, db, meta |
|
47 | 47 | from kallithea.model.forms import ApplicationSettingsForm, ApplicationUiSettingsForm, ApplicationVisualisationForm |
|
48 | 48 | from kallithea.model.notification import EmailNotificationModel |
|
49 | 49 | from kallithea.model.scm import ScmModel |
|
50 | 50 | |
|
51 | 51 | |
|
52 | 52 | log = logging.getLogger(__name__) |
|
53 | 53 | |
|
54 | 54 | |
|
55 | class SettingsController(BaseController): | |
|
55 | class SettingsController(base.BaseController): | |
|
56 | 56 | |
|
57 | 57 | @LoginRequired(allow_default_user=True) |
|
58 | 58 | def _before(self, *args, **kwargs): |
|
59 | 59 | super(SettingsController, self)._before(*args, **kwargs) |
|
60 | 60 | |
|
61 | 61 | def _get_hg_ui_settings(self): |
|
62 | 62 | ret = db.Ui.query().all() |
|
63 | 63 | |
|
64 | 64 | settings = {} |
|
65 | 65 | for each in ret: |
|
66 | 66 | k = each.ui_section + '_' + each.ui_key |
|
67 | 67 | v = each.ui_value |
|
68 | 68 | if k == 'paths_/': |
|
69 | 69 | k = 'paths_root_path' |
|
70 | 70 | |
|
71 | 71 | k = k.replace('.', '_') |
|
72 | 72 | |
|
73 | 73 | if each.ui_section in ['hooks', 'extensions']: |
|
74 | 74 | v = each.ui_active |
|
75 | 75 | |
|
76 | 76 | settings[k] = v |
|
77 | 77 | return settings |
|
78 | 78 | |
|
79 | 79 | @HasPermissionAnyDecorator('hg.admin') |
|
80 | 80 | def settings_vcs(self): |
|
81 | 81 | c.active = 'vcs' |
|
82 | 82 | if request.POST: |
|
83 | 83 | application_form = ApplicationUiSettingsForm()() |
|
84 | 84 | try: |
|
85 | 85 | form_result = application_form.to_python(dict(request.POST)) |
|
86 | 86 | except formencode.Invalid as errors: |
|
87 | 87 | return htmlfill.render( |
|
88 | render('admin/settings/settings.html'), | |
|
88 | base.render('admin/settings/settings.html'), | |
|
89 | 89 | defaults=errors.value, |
|
90 | 90 | errors=errors.error_dict or {}, |
|
91 | 91 | prefix_error=False, |
|
92 | 92 | encoding="UTF-8", |
|
93 | 93 | force_defaults=False) |
|
94 | 94 | |
|
95 | 95 | try: |
|
96 | 96 | if c.visual.allow_repo_location_change: |
|
97 | 97 | sett = db.Ui.get_by_key('paths', '/') |
|
98 | 98 | sett.ui_value = form_result['paths_root_path'] |
|
99 | 99 | |
|
100 | 100 | # HOOKS |
|
101 | 101 | sett = db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE) |
|
102 | 102 | sett.ui_active = form_result['hooks_changegroup_update'] |
|
103 | 103 | |
|
104 | 104 | sett = db.Ui.get_by_key('hooks', db.Ui.HOOK_REPO_SIZE) |
|
105 | 105 | sett.ui_active = form_result['hooks_changegroup_repo_size'] |
|
106 | 106 | |
|
107 | 107 | ## EXTENSIONS |
|
108 | 108 | sett = db.Ui.get_or_create('extensions', 'largefiles') |
|
109 | 109 | sett.ui_active = form_result['extensions_largefiles'] |
|
110 | 110 | |
|
111 | 111 | # sett = db.Ui.get_or_create('extensions', 'hggit') |
|
112 | 112 | # sett.ui_active = form_result['extensions_hggit'] |
|
113 | 113 | |
|
114 | 114 | meta.Session().commit() |
|
115 | 115 | |
|
116 | 116 | webutils.flash(_('Updated VCS settings'), category='success') |
|
117 | 117 | |
|
118 | 118 | except Exception: |
|
119 | 119 | log.error(traceback.format_exc()) |
|
120 | 120 | webutils.flash(_('Error occurred while updating ' |
|
121 | 121 | 'application settings'), category='error') |
|
122 | 122 | |
|
123 | 123 | defaults = db.Setting.get_app_settings() |
|
124 | 124 | defaults.update(self._get_hg_ui_settings()) |
|
125 | 125 | |
|
126 | 126 | return htmlfill.render( |
|
127 | render('admin/settings/settings.html'), | |
|
127 | base.render('admin/settings/settings.html'), | |
|
128 | 128 | defaults=defaults, |
|
129 | 129 | encoding="UTF-8", |
|
130 | 130 | force_defaults=False) |
|
131 | 131 | |
|
132 | 132 | @HasPermissionAnyDecorator('hg.admin') |
|
133 | 133 | def settings_mapping(self): |
|
134 | 134 | c.active = 'mapping' |
|
135 | 135 | if request.POST: |
|
136 | 136 | rm_obsolete = request.POST.get('destroy', False) |
|
137 | 137 | install_git_hooks = request.POST.get('hooks', False) |
|
138 | 138 | overwrite_git_hooks = request.POST.get('hooks_overwrite', False) |
|
139 | 139 | invalidate_cache = request.POST.get('invalidate', False) |
|
140 | 140 | log.debug('rescanning repo location with destroy obsolete=%s, ' |
|
141 | 141 | 'install git hooks=%s and ' |
|
142 | 142 | 'overwrite git hooks=%s' % (rm_obsolete, install_git_hooks, overwrite_git_hooks)) |
|
143 | 143 | |
|
144 | 144 | filesystem_repos = ScmModel().repo_scan() |
|
145 | 145 | added, removed = repo2db_mapper(filesystem_repos, rm_obsolete, |
|
146 | 146 | install_git_hooks=install_git_hooks, |
|
147 | 147 | user=request.authuser.username, |
|
148 | 148 | overwrite_git_hooks=overwrite_git_hooks) |
|
149 | 149 | added_msg = webutils.HTML(', ').join( |
|
150 | 150 | webutils.link_to(safe_str(repo_name), webutils.url('summary_home', repo_name=repo_name)) for repo_name in added |
|
151 | 151 | ) or '-' |
|
152 | 152 | removed_msg = webutils.HTML(', ').join( |
|
153 | 153 | safe_str(repo_name) for repo_name in removed |
|
154 | 154 | ) or '-' |
|
155 | 155 | webutils.flash(webutils.HTML(_('Repositories successfully rescanned. Added: %s. Removed: %s.')) % |
|
156 | 156 | (added_msg, removed_msg), category='success') |
|
157 | 157 | |
|
158 | 158 | if invalidate_cache: |
|
159 | 159 | log.debug('invalidating all repositories cache') |
|
160 | 160 | i = 0 |
|
161 | 161 | for repo in db.Repository.query(): |
|
162 | 162 | try: |
|
163 | 163 | ScmModel().mark_for_invalidation(repo.repo_name) |
|
164 | 164 | i += 1 |
|
165 | 165 | except VCSError as e: |
|
166 | 166 | log.warning('VCS error invalidating %s: %s', repo.repo_name, e) |
|
167 | 167 | webutils.flash(_('Invalidated %s repositories') % i, category='success') |
|
168 | 168 | |
|
169 | 169 | raise HTTPFound(location=url('admin_settings_mapping')) |
|
170 | 170 | |
|
171 | 171 | defaults = db.Setting.get_app_settings() |
|
172 | 172 | defaults.update(self._get_hg_ui_settings()) |
|
173 | 173 | |
|
174 | 174 | return htmlfill.render( |
|
175 | render('admin/settings/settings.html'), | |
|
175 | base.render('admin/settings/settings.html'), | |
|
176 | 176 | defaults=defaults, |
|
177 | 177 | encoding="UTF-8", |
|
178 | 178 | force_defaults=False) |
|
179 | 179 | |
|
180 | 180 | @HasPermissionAnyDecorator('hg.admin') |
|
181 | 181 | def settings_global(self): |
|
182 | 182 | c.active = 'global' |
|
183 | 183 | if request.POST: |
|
184 | 184 | application_form = ApplicationSettingsForm()() |
|
185 | 185 | try: |
|
186 | 186 | form_result = application_form.to_python(dict(request.POST)) |
|
187 | 187 | except formencode.Invalid as errors: |
|
188 | 188 | return htmlfill.render( |
|
189 | render('admin/settings/settings.html'), | |
|
189 | base.render('admin/settings/settings.html'), | |
|
190 | 190 | defaults=errors.value, |
|
191 | 191 | errors=errors.error_dict or {}, |
|
192 | 192 | prefix_error=False, |
|
193 | 193 | encoding="UTF-8", |
|
194 | 194 | force_defaults=False) |
|
195 | 195 | |
|
196 | 196 | try: |
|
197 | 197 | for setting in ( |
|
198 | 198 | 'title', |
|
199 | 199 | 'realm', |
|
200 | 200 | 'ga_code', |
|
201 | 201 | 'captcha_public_key', |
|
202 | 202 | 'captcha_private_key', |
|
203 | 203 | ): |
|
204 | 204 | db.Setting.create_or_update(setting, form_result[setting]) |
|
205 | 205 | |
|
206 | 206 | meta.Session().commit() |
|
207 | 207 | set_app_settings(config) |
|
208 | 208 | webutils.flash(_('Updated application settings'), category='success') |
|
209 | 209 | |
|
210 | 210 | except Exception: |
|
211 | 211 | log.error(traceback.format_exc()) |
|
212 | 212 | webutils.flash(_('Error occurred while updating ' |
|
213 | 213 | 'application settings'), |
|
214 | 214 | category='error') |
|
215 | 215 | |
|
216 | 216 | raise HTTPFound(location=url('admin_settings_global')) |
|
217 | 217 | |
|
218 | 218 | defaults = db.Setting.get_app_settings() |
|
219 | 219 | defaults.update(self._get_hg_ui_settings()) |
|
220 | 220 | |
|
221 | 221 | return htmlfill.render( |
|
222 | render('admin/settings/settings.html'), | |
|
222 | base.render('admin/settings/settings.html'), | |
|
223 | 223 | defaults=defaults, |
|
224 | 224 | encoding="UTF-8", |
|
225 | 225 | force_defaults=False) |
|
226 | 226 | |
|
227 | 227 | @HasPermissionAnyDecorator('hg.admin') |
|
228 | 228 | def settings_visual(self): |
|
229 | 229 | c.active = 'visual' |
|
230 | 230 | if request.POST: |
|
231 | 231 | application_form = ApplicationVisualisationForm()() |
|
232 | 232 | try: |
|
233 | 233 | form_result = application_form.to_python(dict(request.POST)) |
|
234 | 234 | except formencode.Invalid as errors: |
|
235 | 235 | return htmlfill.render( |
|
236 | render('admin/settings/settings.html'), | |
|
236 | base.render('admin/settings/settings.html'), | |
|
237 | 237 | defaults=errors.value, |
|
238 | 238 | errors=errors.error_dict or {}, |
|
239 | 239 | prefix_error=False, |
|
240 | 240 | encoding="UTF-8", |
|
241 | 241 | force_defaults=False) |
|
242 | 242 | |
|
243 | 243 | try: |
|
244 | 244 | settings = [ |
|
245 | 245 | ('show_public_icon', 'show_public_icon', 'bool'), |
|
246 | 246 | ('show_private_icon', 'show_private_icon', 'bool'), |
|
247 | 247 | ('stylify_metalabels', 'stylify_metalabels', 'bool'), |
|
248 | 248 | ('repository_fields', 'repository_fields', 'bool'), |
|
249 | 249 | ('dashboard_items', 'dashboard_items', 'int'), |
|
250 | 250 | ('admin_grid_items', 'admin_grid_items', 'int'), |
|
251 | 251 | ('show_version', 'show_version', 'bool'), |
|
252 | 252 | ('use_gravatar', 'use_gravatar', 'bool'), |
|
253 | 253 | ('gravatar_url', 'gravatar_url', 'unicode'), |
|
254 | 254 | ('clone_uri_tmpl', 'clone_uri_tmpl', 'unicode'), |
|
255 | 255 | ('clone_ssh_tmpl', 'clone_ssh_tmpl', 'unicode'), |
|
256 | 256 | ] |
|
257 | 257 | for setting, form_key, type_ in settings: |
|
258 | 258 | db.Setting.create_or_update(setting, form_result[form_key], type_) |
|
259 | 259 | |
|
260 | 260 | meta.Session().commit() |
|
261 | 261 | set_app_settings(config) |
|
262 | 262 | webutils.flash(_('Updated visualisation settings'), |
|
263 | 263 | category='success') |
|
264 | 264 | |
|
265 | 265 | except Exception: |
|
266 | 266 | log.error(traceback.format_exc()) |
|
267 | 267 | webutils.flash(_('Error occurred during updating ' |
|
268 | 268 | 'visualisation settings'), |
|
269 | 269 | category='error') |
|
270 | 270 | |
|
271 | 271 | raise HTTPFound(location=url('admin_settings_visual')) |
|
272 | 272 | |
|
273 | 273 | defaults = db.Setting.get_app_settings() |
|
274 | 274 | defaults.update(self._get_hg_ui_settings()) |
|
275 | 275 | |
|
276 | 276 | return htmlfill.render( |
|
277 | render('admin/settings/settings.html'), | |
|
277 | base.render('admin/settings/settings.html'), | |
|
278 | 278 | defaults=defaults, |
|
279 | 279 | encoding="UTF-8", |
|
280 | 280 | force_defaults=False) |
|
281 | 281 | |
|
282 | 282 | @HasPermissionAnyDecorator('hg.admin') |
|
283 | 283 | def settings_email(self): |
|
284 | 284 | c.active = 'email' |
|
285 | 285 | if request.POST: |
|
286 | 286 | test_email = request.POST.get('test_email') |
|
287 | 287 | test_email_subj = 'Kallithea test email' |
|
288 | 288 | test_body = ('Kallithea Email test, ' |
|
289 | 289 | 'Kallithea version: %s' % c.kallithea_version) |
|
290 | 290 | if not test_email: |
|
291 | 291 | webutils.flash(_('Please enter email address'), category='error') |
|
292 | 292 | raise HTTPFound(location=url('admin_settings_email')) |
|
293 | 293 | |
|
294 | 294 | test_email_txt_body = EmailNotificationModel() \ |
|
295 | 295 | .get_email_tmpl(EmailNotificationModel.TYPE_DEFAULT, |
|
296 | 296 | 'txt', body=test_body) |
|
297 | 297 | test_email_html_body = EmailNotificationModel() \ |
|
298 | 298 | .get_email_tmpl(EmailNotificationModel.TYPE_DEFAULT, |
|
299 | 299 | 'html', body=test_body) |
|
300 | 300 | |
|
301 | 301 | recipients = [test_email] if test_email else None |
|
302 | 302 | |
|
303 | 303 | async_tasks.send_email(recipients, test_email_subj, |
|
304 | 304 | test_email_txt_body, test_email_html_body) |
|
305 | 305 | |
|
306 | 306 | webutils.flash(_('Send email task created'), category='success') |
|
307 | 307 | raise HTTPFound(location=url('admin_settings_email')) |
|
308 | 308 | |
|
309 | 309 | defaults = db.Setting.get_app_settings() |
|
310 | 310 | defaults.update(self._get_hg_ui_settings()) |
|
311 | 311 | |
|
312 | 312 | c.ini = kallithea.CONFIG |
|
313 | 313 | |
|
314 | 314 | return htmlfill.render( |
|
315 | render('admin/settings/settings.html'), | |
|
315 | base.render('admin/settings/settings.html'), | |
|
316 | 316 | defaults=defaults, |
|
317 | 317 | encoding="UTF-8", |
|
318 | 318 | force_defaults=False) |
|
319 | 319 | |
|
320 | 320 | @HasPermissionAnyDecorator('hg.admin') |
|
321 | 321 | def settings_hooks(self): |
|
322 | 322 | c.active = 'hooks' |
|
323 | 323 | if request.POST: |
|
324 | 324 | if c.visual.allow_custom_hooks_settings: |
|
325 | 325 | ui_key = request.POST.get('new_hook_ui_key') |
|
326 | 326 | ui_value = request.POST.get('new_hook_ui_value') |
|
327 | 327 | |
|
328 | 328 | hook_id = request.POST.get('hook_id') |
|
329 | 329 | |
|
330 | 330 | try: |
|
331 | 331 | ui_key = ui_key and ui_key.strip() |
|
332 | 332 | if ui_key in (x.ui_key for x in db.Ui.get_custom_hooks()): |
|
333 | 333 | webutils.flash(_('Hook already exists'), category='error') |
|
334 | 334 | elif ui_key in (x.ui_key for x in db.Ui.get_builtin_hooks()): |
|
335 | 335 | webutils.flash(_('Builtin hooks are read-only. Please use another hook name.'), category='error') |
|
336 | 336 | elif ui_value and ui_key: |
|
337 | 337 | db.Ui.create_or_update_hook(ui_key, ui_value) |
|
338 | 338 | webutils.flash(_('Added new hook'), category='success') |
|
339 | 339 | elif hook_id: |
|
340 | 340 | db.Ui.delete(hook_id) |
|
341 | 341 | meta.Session().commit() |
|
342 | 342 | |
|
343 | 343 | # check for edits |
|
344 | 344 | update = False |
|
345 | 345 | _d = request.POST.dict_of_lists() |
|
346 | 346 | for k, v, ov in zip(_d.get('hook_ui_key', []), |
|
347 | 347 | _d.get('hook_ui_value_new', []), |
|
348 | 348 | _d.get('hook_ui_value', [])): |
|
349 | 349 | if v != ov: |
|
350 | 350 | db.Ui.create_or_update_hook(k, v) |
|
351 | 351 | update = True |
|
352 | 352 | |
|
353 | 353 | if update: |
|
354 | 354 | webutils.flash(_('Updated hooks'), category='success') |
|
355 | 355 | meta.Session().commit() |
|
356 | 356 | except Exception: |
|
357 | 357 | log.error(traceback.format_exc()) |
|
358 | 358 | webutils.flash(_('Error occurred during hook creation'), |
|
359 | 359 | category='error') |
|
360 | 360 | |
|
361 | 361 | raise HTTPFound(location=url('admin_settings_hooks')) |
|
362 | 362 | |
|
363 | 363 | defaults = db.Setting.get_app_settings() |
|
364 | 364 | defaults.update(self._get_hg_ui_settings()) |
|
365 | 365 | |
|
366 | 366 | c.hooks = db.Ui.get_builtin_hooks() |
|
367 | 367 | c.custom_hooks = db.Ui.get_custom_hooks() |
|
368 | 368 | |
|
369 | 369 | return htmlfill.render( |
|
370 | render('admin/settings/settings.html'), | |
|
370 | base.render('admin/settings/settings.html'), | |
|
371 | 371 | defaults=defaults, |
|
372 | 372 | encoding="UTF-8", |
|
373 | 373 | force_defaults=False) |
|
374 | 374 | |
|
375 | 375 | @HasPermissionAnyDecorator('hg.admin') |
|
376 | 376 | def settings_search(self): |
|
377 | 377 | c.active = 'search' |
|
378 | 378 | if request.POST: |
|
379 | 379 | repo_location = self._get_hg_ui_settings()['paths_root_path'] |
|
380 | 380 | full_index = request.POST.get('full_index', False) |
|
381 | 381 | async_tasks.whoosh_index(repo_location, full_index) |
|
382 | 382 | webutils.flash(_('Whoosh reindex task scheduled'), category='success') |
|
383 | 383 | raise HTTPFound(location=url('admin_settings_search')) |
|
384 | 384 | |
|
385 | 385 | defaults = db.Setting.get_app_settings() |
|
386 | 386 | defaults.update(self._get_hg_ui_settings()) |
|
387 | 387 | |
|
388 | 388 | return htmlfill.render( |
|
389 | render('admin/settings/settings.html'), | |
|
389 | base.render('admin/settings/settings.html'), | |
|
390 | 390 | defaults=defaults, |
|
391 | 391 | encoding="UTF-8", |
|
392 | 392 | force_defaults=False) |
|
393 | 393 | |
|
394 | 394 | @HasPermissionAnyDecorator('hg.admin') |
|
395 | 395 | def settings_system(self): |
|
396 | 396 | c.active = 'system' |
|
397 | 397 | |
|
398 | 398 | defaults = db.Setting.get_app_settings() |
|
399 | 399 | defaults.update(self._get_hg_ui_settings()) |
|
400 | 400 | |
|
401 | 401 | c.ini = kallithea.CONFIG |
|
402 | 402 | server_info = db.Setting.get_server_info() |
|
403 | 403 | for key, val in server_info.items(): |
|
404 | 404 | setattr(c, key, val) |
|
405 | 405 | |
|
406 | 406 | return htmlfill.render( |
|
407 | render('admin/settings/settings.html'), | |
|
407 | base.render('admin/settings/settings.html'), | |
|
408 | 408 | defaults=defaults, |
|
409 | 409 | encoding="UTF-8", |
|
410 | 410 | force_defaults=False) |
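
Beyond the import change, the settings hunk shows the shape every ``settings_*`` view shares: validate the POSTed form, persist, then re-render ``admin/settings/settings.html`` through formencode's ``htmlfill`` with the current values as defaults. A condensed sketch of that repeated tail (the helper function is hypothetical; the template name and keyword arguments are the ones visible above)::

    from formencode import htmlfill

    from kallithea.controllers import base


    def render_settings_page(defaults, errors=None):
        # Hypothetical helper: htmlfill re-injects ``defaults`` and ``errors``
        # into the form fields of the rendered settings template.
        return htmlfill.render(
            base.render('admin/settings/settings.html'),
            defaults=defaults,
            errors=errors or {},
            prefix_error=False,
            encoding="UTF-8",
            force_defaults=False)
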
@@ -1,407 +1,407 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.user_groups |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | User Groups crud controller |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Jan 25, 2011 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from sqlalchemy.orm import joinedload |
|
34 | 34 | from sqlalchemy.sql.expression import func |
|
35 | 35 | from tg import app_globals, request |
|
36 | 36 | from tg import tmpl_context as c |
|
37 | 37 | from tg.i18n import ugettext as _ |
|
38 | 38 | from webob.exc import HTTPFound, HTTPInternalServerError |
|
39 | 39 | |
|
40 | 40 | import kallithea.lib.helpers as h |
|
41 | from kallithea.controllers import base | |
|
41 | 42 | from kallithea.lib import webutils |
|
42 | 43 | from kallithea.lib.auth import HasPermissionAnyDecorator, HasUserGroupPermissionLevelDecorator, LoginRequired |
|
43 | from kallithea.lib.base import BaseController, render | |
|
44 | 44 | from kallithea.lib.exceptions import RepoGroupAssignmentError, UserGroupsAssignedException |
|
45 | 45 | from kallithea.lib.utils2 import safe_int, safe_str |
|
46 | 46 | from kallithea.lib.webutils import url |
|
47 | 47 | from kallithea.model import db, meta, userlog |
|
48 | 48 | from kallithea.model.forms import CustomDefaultPermissionsForm, UserGroupForm, UserGroupPermsForm |
|
49 | 49 | from kallithea.model.scm import UserGroupList |
|
50 | 50 | from kallithea.model.user_group import UserGroupModel |
|
51 | 51 | |
|
52 | 52 | |
|
53 | 53 | log = logging.getLogger(__name__) |
|
54 | 54 | |
|
55 | 55 | |
|
56 | class UserGroupsController(BaseController): | |
|
56 | class UserGroupsController(base.BaseController): | |
|
57 | 57 | |
|
58 | 58 | @LoginRequired(allow_default_user=True) |
|
59 | 59 | def _before(self, *args, **kwargs): |
|
60 | 60 | super(UserGroupsController, self)._before(*args, **kwargs) |
|
61 | 61 | |
|
62 | 62 | def __load_data(self, user_group_id): |
|
63 | 63 | c.group_members_obj = sorted((x.user for x in c.user_group.members), |
|
64 | 64 | key=lambda u: u.username.lower()) |
|
65 | 65 | |
|
66 | 66 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
67 | 67 | c.available_members = sorted(((x.user_id, x.username) for x in |
|
68 | 68 | db.User.query().all()), |
|
69 | 69 | key=lambda u: u[1].lower()) |
|
70 | 70 | |
|
71 | 71 | def __load_defaults(self, user_group_id): |
|
72 | 72 | """ |
|
73 | 73 | Load defaults settings for edit, and update |
|
74 | 74 | |
|
75 | 75 | :param user_group_id: |
|
76 | 76 | """ |
|
77 | 77 | user_group = db.UserGroup.get_or_404(user_group_id) |
|
78 | 78 | data = user_group.get_dict() |
|
79 | 79 | return data |
|
80 | 80 | |
|
81 | 81 | def index(self, format='html'): |
|
82 | 82 | _list = db.UserGroup.query() \ |
|
83 | 83 | .order_by(func.lower(db.UserGroup.users_group_name)) \ |
|
84 | 84 | .all() |
|
85 | 85 | group_iter = UserGroupList(_list, perm_level='admin') |
|
86 | 86 | user_groups_data = [] |
|
87 | 87 | _tmpl_lookup = app_globals.mako_lookup |
|
88 | 88 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') |
|
89 | 89 | |
|
90 | 90 | def user_group_name(user_group_id, user_group_name): |
|
91 | 91 | return template.get_def("user_group_name") \ |
|
92 | 92 | .render_unicode(user_group_id, user_group_name, _=_, h=h, c=c) |
|
93 | 93 | |
|
94 | 94 | def user_group_actions(user_group_id, user_group_name): |
|
95 | 95 | return template.get_def("user_group_actions") \ |
|
96 | 96 | .render_unicode(user_group_id, user_group_name, _=_, h=h, c=c) |
|
97 | 97 | |
|
98 | 98 | for user_gr in group_iter: |
|
99 | 99 | user_groups_data.append({ |
|
100 | 100 | "raw_name": user_gr.users_group_name, |
|
101 | 101 | "group_name": user_group_name(user_gr.users_group_id, |
|
102 | 102 | user_gr.users_group_name), |
|
103 | 103 | "desc": webutils.escape(user_gr.user_group_description), |
|
104 | 104 | "members": len(user_gr.members), |
|
105 | 105 | "active": h.boolicon(user_gr.users_group_active), |
|
106 | 106 | "owner": user_gr.owner.username, |
|
107 | 107 | "action": user_group_actions(user_gr.users_group_id, user_gr.users_group_name) |
|
108 | 108 | }) |
|
109 | 109 | |
|
110 | 110 | c.data = { |
|
111 | 111 | "sort": None, |
|
112 | 112 | "dir": "asc", |
|
113 | 113 | "records": user_groups_data |
|
114 | 114 | } |
|
115 | 115 | |
|
116 | return render('admin/user_groups/user_groups.html') | |
|
116 | return base.render('admin/user_groups/user_groups.html') | |
|
117 | 117 | |
|
118 | 118 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') |
|
119 | 119 | def create(self): |
|
120 | 120 | users_group_form = UserGroupForm()() |
|
121 | 121 | try: |
|
122 | 122 | form_result = users_group_form.to_python(dict(request.POST)) |
|
123 | 123 | ug = UserGroupModel().create(name=form_result['users_group_name'], |
|
124 | 124 | description=form_result['user_group_description'], |
|
125 | 125 | owner=request.authuser.user_id, |
|
126 | 126 | active=form_result['users_group_active']) |
|
127 | 127 | |
|
128 | 128 | gr = form_result['users_group_name'] |
|
129 | 129 | userlog.action_logger(request.authuser, |
|
130 | 130 | 'admin_created_users_group:%s' % gr, |
|
131 | 131 | None, request.ip_addr) |
|
132 | 132 | webutils.flash(webutils.HTML(_('Created user group %s')) % webutils.link_to(gr, url('edit_users_group', id=ug.users_group_id)), |
|
133 | 133 | category='success') |
|
134 | 134 | meta.Session().commit() |
|
135 | 135 | except formencode.Invalid as errors: |
|
136 | 136 | return htmlfill.render( |
|
137 | render('admin/user_groups/user_group_add.html'), | |
|
137 | base.render('admin/user_groups/user_group_add.html'), | |
|
138 | 138 | defaults=errors.value, |
|
139 | 139 | errors=errors.error_dict or {}, |
|
140 | 140 | prefix_error=False, |
|
141 | 141 | encoding="UTF-8", |
|
142 | 142 | force_defaults=False) |
|
143 | 143 | except Exception: |
|
144 | 144 | log.error(traceback.format_exc()) |
|
145 | 145 | webutils.flash(_('Error occurred during creation of user group %s') |
|
146 | 146 | % request.POST.get('users_group_name'), category='error') |
|
147 | 147 | |
|
148 | 148 | raise HTTPFound(location=url('users_groups')) |
|
149 | 149 | |
|
150 | 150 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') |
|
151 | 151 | def new(self, format='html'): |
|
152 | return render('admin/user_groups/user_group_add.html') | |
|
152 | return base.render('admin/user_groups/user_group_add.html') | |
|
153 | 153 | |
|
154 | 154 | @HasUserGroupPermissionLevelDecorator('admin') |
|
155 | 155 | def update(self, id): |
|
156 | 156 | c.user_group = db.UserGroup.get_or_404(id) |
|
157 | 157 | c.active = 'settings' |
|
158 | 158 | self.__load_data(id) |
|
159 | 159 | |
|
160 | 160 | available_members = [safe_str(x[0]) for x in c.available_members] |
|
161 | 161 | |
|
162 | 162 | users_group_form = UserGroupForm(edit=True, |
|
163 | 163 | old_data=c.user_group.get_dict(), |
|
164 | 164 | available_members=available_members)() |
|
165 | 165 | |
|
166 | 166 | try: |
|
167 | 167 | form_result = users_group_form.to_python(request.POST) |
|
168 | 168 | UserGroupModel().update(c.user_group, form_result) |
|
169 | 169 | gr = form_result['users_group_name'] |
|
170 | 170 | userlog.action_logger(request.authuser, |
|
171 | 171 | 'admin_updated_users_group:%s' % gr, |
|
172 | 172 | None, request.ip_addr) |
|
173 | 173 | webutils.flash(_('Updated user group %s') % gr, category='success') |
|
174 | 174 | meta.Session().commit() |
|
175 | 175 | except formencode.Invalid as errors: |
|
176 | 176 | ug_model = UserGroupModel() |
|
177 | 177 | defaults = errors.value |
|
178 | 178 | e = errors.error_dict or {} |
|
179 | 179 | defaults.update({ |
|
180 | 180 | 'create_repo_perm': ug_model.has_perm(id, |
|
181 | 181 | 'hg.create.repository'), |
|
182 | 182 | 'fork_repo_perm': ug_model.has_perm(id, |
|
183 | 183 | 'hg.fork.repository'), |
|
184 | 184 | }) |
|
185 | 185 | |
|
186 | 186 | return htmlfill.render( |
|
187 | render('admin/user_groups/user_group_edit.html'), | |
|
187 | base.render('admin/user_groups/user_group_edit.html'), | |
|
188 | 188 | defaults=defaults, |
|
189 | 189 | errors=e, |
|
190 | 190 | prefix_error=False, |
|
191 | 191 | encoding="UTF-8", |
|
192 | 192 | force_defaults=False) |
|
193 | 193 | except Exception: |
|
194 | 194 | log.error(traceback.format_exc()) |
|
195 | 195 | webutils.flash(_('Error occurred during update of user group %s') |
|
196 | 196 | % request.POST.get('users_group_name'), category='error') |
|
197 | 197 | |
|
198 | 198 | raise HTTPFound(location=url('edit_users_group', id=id)) |
|
199 | 199 | |
|
200 | 200 | @HasUserGroupPermissionLevelDecorator('admin') |
|
201 | 201 | def delete(self, id): |
|
202 | 202 | usr_gr = db.UserGroup.get_or_404(id) |
|
203 | 203 | try: |
|
204 | 204 | UserGroupModel().delete(usr_gr) |
|
205 | 205 | meta.Session().commit() |
|
206 | 206 | webutils.flash(_('Successfully deleted user group'), category='success') |
|
207 | 207 | except UserGroupsAssignedException as e: |
|
208 | 208 | webutils.flash(e, category='error') |
|
209 | 209 | except Exception: |
|
210 | 210 | log.error(traceback.format_exc()) |
|
211 | 211 | webutils.flash(_('An error occurred during deletion of user group'), |
|
212 | 212 | category='error') |
|
213 | 213 | raise HTTPFound(location=url('users_groups')) |
|
214 | 214 | |
|
215 | 215 | @HasUserGroupPermissionLevelDecorator('admin') |
|
216 | 216 | def edit(self, id, format='html'): |
|
217 | 217 | c.user_group = db.UserGroup.get_or_404(id) |
|
218 | 218 | c.active = 'settings' |
|
219 | 219 | self.__load_data(id) |
|
220 | 220 | |
|
221 | 221 | defaults = self.__load_defaults(id) |
|
222 | 222 | |
|
223 | 223 | return htmlfill.render( |
|
224 | render('admin/user_groups/user_group_edit.html'), | |
|
224 | base.render('admin/user_groups/user_group_edit.html'), | |
|
225 | 225 | defaults=defaults, |
|
226 | 226 | encoding="UTF-8", |
|
227 | 227 | force_defaults=False |
|
228 | 228 | ) |
|
229 | 229 | |
|
230 | 230 | @HasUserGroupPermissionLevelDecorator('admin') |
|
231 | 231 | def edit_perms(self, id): |
|
232 | 232 | c.user_group = db.UserGroup.get_or_404(id) |
|
233 | 233 | c.active = 'perms' |
|
234 | 234 | |
|
235 | 235 | defaults = {} |
|
236 | 236 | # fill user group users |
|
237 | 237 | for p in c.user_group.user_user_group_to_perm: |
|
238 | 238 | defaults.update({'u_perm_%s' % p.user.username: |
|
239 | 239 | p.permission.permission_name}) |
|
240 | 240 | |
|
241 | 241 | for p in c.user_group.user_group_user_group_to_perm: |
|
242 | 242 | defaults.update({'g_perm_%s' % p.user_group.users_group_name: |
|
243 | 243 | p.permission.permission_name}) |
|
244 | 244 | |
|
245 | 245 | return htmlfill.render( |
|
246 | render('admin/user_groups/user_group_edit.html'), | |
|
246 | base.render('admin/user_groups/user_group_edit.html'), | |
|
247 | 247 | defaults=defaults, |
|
248 | 248 | encoding="UTF-8", |
|
249 | 249 | force_defaults=False |
|
250 | 250 | ) |
|
251 | 251 | |
|
252 | 252 | @HasUserGroupPermissionLevelDecorator('admin') |
|
253 | 253 | def update_perms(self, id): |
|
254 | 254 | """ |
|
255 | 255 | grant permission for given usergroup |
|
256 | 256 | |
|
257 | 257 | :param id: |
|
258 | 258 | """ |
|
259 | 259 | user_group = db.UserGroup.get_or_404(id) |
|
260 | 260 | form = UserGroupPermsForm()().to_python(request.POST) |
|
261 | 261 | |
|
262 | 262 | # set the permissions ! |
|
263 | 263 | try: |
|
264 | 264 | UserGroupModel()._update_permissions(user_group, form['perms_new'], |
|
265 | 265 | form['perms_updates']) |
|
266 | 266 | except RepoGroupAssignmentError: |
|
267 | 267 | webutils.flash(_('Target group cannot be the same'), category='error') |
|
268 | 268 | raise HTTPFound(location=url('edit_user_group_perms', id=id)) |
|
269 | 269 | # TODO: implement this |
|
270 | 270 | #action_logger(request.authuser, 'admin_changed_repo_permissions', |
|
271 | 271 | # repo_name, request.ip_addr) |
|
272 | 272 | meta.Session().commit() |
|
273 | 273 | webutils.flash(_('User group permissions updated'), category='success') |
|
274 | 274 | raise HTTPFound(location=url('edit_user_group_perms', id=id)) |
|
275 | 275 | |
|
276 | 276 | @HasUserGroupPermissionLevelDecorator('admin') |
|
277 | 277 | def delete_perms(self, id): |
|
278 | 278 | try: |
|
279 | 279 | obj_type = request.POST.get('obj_type') |
|
280 | 280 | obj_id = None |
|
281 | 281 | if obj_type == 'user': |
|
282 | 282 | obj_id = safe_int(request.POST.get('user_id')) |
|
283 | 283 | elif obj_type == 'user_group': |
|
284 | 284 | obj_id = safe_int(request.POST.get('user_group_id')) |
|
285 | 285 | |
|
286 | 286 | if not request.authuser.is_admin: |
|
287 | 287 | if obj_type == 'user' and request.authuser.user_id == obj_id: |
|
288 | 288 | msg = _('Cannot revoke permission for yourself as admin') |
|
289 | 289 | webutils.flash(msg, category='warning') |
|
290 | 290 | raise Exception('revoke admin permission on self') |
|
291 | 291 | if obj_type == 'user': |
|
292 | 292 | UserGroupModel().revoke_user_permission(user_group=id, |
|
293 | 293 | user=obj_id) |
|
294 | 294 | elif obj_type == 'user_group': |
|
295 | 295 | UserGroupModel().revoke_user_group_permission(target_user_group=id, |
|
296 | 296 | user_group=obj_id) |
|
297 | 297 | meta.Session().commit() |
|
298 | 298 | except Exception: |
|
299 | 299 | log.error(traceback.format_exc()) |
|
300 | 300 | webutils.flash(_('An error occurred during revoking of permission'), |
|
301 | 301 | category='error') |
|
302 | 302 | raise HTTPInternalServerError() |
|
303 | 303 | |
|
304 | 304 | @HasUserGroupPermissionLevelDecorator('admin') |
|
305 | 305 | def edit_default_perms(self, id): |
|
306 | 306 | c.user_group = db.UserGroup.get_or_404(id) |
|
307 | 307 | c.active = 'default_perms' |
|
308 | 308 | |
|
309 | 309 | permissions = { |
|
310 | 310 | 'repositories': {}, |
|
311 | 311 | 'repositories_groups': {} |
|
312 | 312 | } |
|
313 | 313 | ugroup_repo_perms = db.UserGroupRepoToPerm.query() \ |
|
314 | 314 | .options(joinedload(db.UserGroupRepoToPerm.permission)) \ |
|
315 | 315 | .options(joinedload(db.UserGroupRepoToPerm.repository)) \ |
|
316 | 316 | .filter(db.UserGroupRepoToPerm.users_group_id == id) \ |
|
317 | 317 | .all() |
|
318 | 318 | |
|
319 | 319 | for gr in ugroup_repo_perms: |
|
320 | 320 | permissions['repositories'][gr.repository.repo_name] \ |
|
321 | 321 | = gr.permission.permission_name |
|
322 | 322 | |
|
323 | 323 | ugroup_group_perms = db.UserGroupRepoGroupToPerm.query() \ |
|
324 | 324 | .options(joinedload(db.UserGroupRepoGroupToPerm.permission)) \ |
|
325 | 325 | .options(joinedload(db.UserGroupRepoGroupToPerm.group)) \ |
|
326 | 326 | .filter(db.UserGroupRepoGroupToPerm.users_group_id == id) \ |
|
327 | 327 | .all() |
|
328 | 328 | |
|
329 | 329 | for gr in ugroup_group_perms: |
|
330 | 330 | permissions['repositories_groups'][gr.group.group_name] \ |
|
331 | 331 | = gr.permission.permission_name |
|
332 | 332 | c.permissions = permissions |
|
333 | 333 | |
|
334 | 334 | ug_model = UserGroupModel() |
|
335 | 335 | |
|
336 | 336 | defaults = c.user_group.get_dict() |
|
337 | 337 | defaults.update({ |
|
338 | 338 | 'create_repo_perm': ug_model.has_perm(c.user_group, |
|
339 | 339 | 'hg.create.repository'), |
|
340 | 340 | 'create_user_group_perm': ug_model.has_perm(c.user_group, |
|
341 | 341 | 'hg.usergroup.create.true'), |
|
342 | 342 | 'fork_repo_perm': ug_model.has_perm(c.user_group, |
|
343 | 343 | 'hg.fork.repository'), |
|
344 | 344 | }) |
|
345 | 345 | |
|
346 | 346 | return htmlfill.render( |
|
347 | render('admin/user_groups/user_group_edit.html'), | |
|
347 | base.render('admin/user_groups/user_group_edit.html'), | |
|
348 | 348 | defaults=defaults, |
|
349 | 349 | encoding="UTF-8", |
|
350 | 350 | force_defaults=False |
|
351 | 351 | ) |
|
352 | 352 | |
|
353 | 353 | @HasUserGroupPermissionLevelDecorator('admin') |
|
354 | 354 | def update_default_perms(self, id): |
|
355 | 355 | user_group = db.UserGroup.get_or_404(id) |
|
356 | 356 | |
|
357 | 357 | try: |
|
358 | 358 | form = CustomDefaultPermissionsForm()() |
|
359 | 359 | form_result = form.to_python(request.POST) |
|
360 | 360 | |
|
361 | 361 | usergroup_model = UserGroupModel() |
|
362 | 362 | |
|
363 | 363 | defs = db.UserGroupToPerm.query() \ |
|
364 | 364 | .filter(db.UserGroupToPerm.users_group == user_group) \ |
|
365 | 365 | .all() |
|
366 | 366 | for ug in defs: |
|
367 | 367 | meta.Session().delete(ug) |
|
368 | 368 | |
|
369 | 369 | if form_result['create_repo_perm']: |
|
370 | 370 | usergroup_model.grant_perm(id, 'hg.create.repository') |
|
371 | 371 | else: |
|
372 | 372 | usergroup_model.grant_perm(id, 'hg.create.none') |
|
373 | 373 | if form_result['create_user_group_perm']: |
|
374 | 374 | usergroup_model.grant_perm(id, 'hg.usergroup.create.true') |
|
375 | 375 | else: |
|
376 | 376 | usergroup_model.grant_perm(id, 'hg.usergroup.create.false') |
|
377 | 377 | if form_result['fork_repo_perm']: |
|
378 | 378 | usergroup_model.grant_perm(id, 'hg.fork.repository') |
|
379 | 379 | else: |
|
380 | 380 | usergroup_model.grant_perm(id, 'hg.fork.none') |
|
381 | 381 | |
|
382 | 382 | webutils.flash(_("Updated permissions"), category='success') |
|
383 | 383 | meta.Session().commit() |
|
384 | 384 | except Exception: |
|
385 | 385 | log.error(traceback.format_exc()) |
|
386 | 386 | webutils.flash(_('An error occurred during permissions saving'), |
|
387 | 387 | category='error') |
|
388 | 388 | |
|
389 | 389 | raise HTTPFound(location=url('edit_user_group_default_perms', id=id)) |
|
390 | 390 | |
|
391 | 391 | @HasUserGroupPermissionLevelDecorator('admin') |
|
392 | 392 | def edit_advanced(self, id): |
|
393 | 393 | c.user_group = db.UserGroup.get_or_404(id) |
|
394 | 394 | c.active = 'advanced' |
|
395 | 395 | c.group_members_obj = sorted((x.user for x in c.user_group.members), |
|
396 | 396 | key=lambda u: u.username.lower()) |
|
397 | return render('admin/user_groups/user_group_edit.html') | |
|
397 | return base.render('admin/user_groups/user_group_edit.html') | |
|
398 | 398 | |
|
399 | 399 | @HasUserGroupPermissionLevelDecorator('admin') |
|
400 | 400 | def edit_members(self, id): |
|
401 | 401 | c.user_group = db.UserGroup.get_or_404(id) |
|
402 | 402 | c.active = 'members' |
|
403 | 403 | c.group_members_obj = sorted((x.user for x in c.user_group.members), |
|
404 | 404 | key=lambda u: u.username.lower()) |
|
405 | 405 | |
|
406 | 406 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
407 | return render('admin/user_groups/user_group_edit.html') | |
|
407 | return base.render('admin/user_groups/user_group_edit.html') |
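Note on the hunks above and below: the change is mechanical. Instead of importing render, BaseController and IfSshEnabled individually from kallithea.lib.base, each controller now imports the kallithea.controllers.base module once and reaches those helpers through its namespace. A minimal sketch of the resulting shape, using a hypothetical ExampleController and template path purely for illustration::

    # Old layout (before this changeset): individual names imported from kallithea.lib.base
    #     from kallithea.lib.base import BaseController, IfSshEnabled, render
    #     class ExampleController(BaseController): ...
    #
    # New layout: import the module once and access the helpers through it
    from kallithea.controllers import base

    class ExampleController(base.BaseController):       # hypothetical controller, not part of the changeset
        def index(self):
            return base.render('example/index.html')    # hypothetical template path
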
@@ -1,472 +1,472 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.admin.users |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Users crud controller |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 4, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from sqlalchemy.sql.expression import func |
|
34 | 34 | from tg import app_globals, request |
|
35 | 35 | from tg import tmpl_context as c |
|
36 | 36 | from tg.i18n import ugettext as _ |
|
37 | 37 | from webob.exc import HTTPFound, HTTPNotFound |
|
38 | 38 | |
|
39 | 39 | import kallithea |
|
40 | 40 | import kallithea.lib.helpers as h |
|
41 | from kallithea.controllers import base | |
|
41 | 42 | from kallithea.lib import auth_modules, webutils |
|
42 | 43 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator, LoginRequired |
|
43 | from kallithea.lib.base import BaseController, IfSshEnabled, render | |
|
44 | 44 | from kallithea.lib.exceptions import DefaultUserException, UserCreationError, UserOwnsReposException |
|
45 | 45 | from kallithea.lib.utils2 import datetime_to_time, fmt_date, generate_api_key, safe_int |
|
46 | 46 | from kallithea.lib.webutils import url |
|
47 | 47 | from kallithea.model import db, meta, userlog |
|
48 | 48 | from kallithea.model.api_key import ApiKeyModel |
|
49 | 49 | from kallithea.model.forms import CustomDefaultPermissionsForm, UserForm |
|
50 | 50 | from kallithea.model.ssh_key import SshKeyModel, SshKeyModelException |
|
51 | 51 | from kallithea.model.user import UserModel |
|
52 | 52 | |
|
53 | 53 | |
|
54 | 54 | log = logging.getLogger(__name__) |
|
55 | 55 | |
|
56 | 56 | |
|
57 | class UsersController(BaseController): | |
|
57 | class UsersController(base.BaseController): | |
|
58 | 58 | |
|
59 | 59 | @LoginRequired() |
|
60 | 60 | @HasPermissionAnyDecorator('hg.admin') |
|
61 | 61 | def _before(self, *args, **kwargs): |
|
62 | 62 | super(UsersController, self)._before(*args, **kwargs) |
|
63 | 63 | |
|
64 | 64 | def index(self, format='html'): |
|
65 | 65 | c.users_list = db.User.query().order_by(db.User.username) \ |
|
66 | 66 | .filter_by(is_default_user=False) \ |
|
67 | 67 | .order_by(func.lower(db.User.username)) \ |
|
68 | 68 | .all() |
|
69 | 69 | |
|
70 | 70 | users_data = [] |
|
71 | 71 | _tmpl_lookup = app_globals.mako_lookup |
|
72 | 72 | template = _tmpl_lookup.get_template('data_table/_dt_elements.html') |
|
73 | 73 | |
|
74 | 74 | grav_tmpl = '<div class="gravatar">%s</div>' |
|
75 | 75 | |
|
76 | 76 | def username(user_id, username): |
|
77 | 77 | return template.get_def("user_name") \ |
|
78 | 78 | .render_unicode(user_id, username, _=_, h=h, c=c) |
|
79 | 79 | |
|
80 | 80 | def user_actions(user_id, username): |
|
81 | 81 | return template.get_def("user_actions") \ |
|
82 | 82 | .render_unicode(user_id, username, _=_, h=h, c=c) |
|
83 | 83 | |
|
84 | 84 | for user in c.users_list: |
|
85 | 85 | users_data.append({ |
|
86 | 86 | "gravatar": grav_tmpl % h.gravatar(user.email, size=20), |
|
87 | 87 | "raw_name": user.username, |
|
88 | 88 | "username": username(user.user_id, user.username), |
|
89 | 89 | "firstname": webutils.escape(user.name), |
|
90 | 90 | "lastname": webutils.escape(user.lastname), |
|
91 | 91 | "last_login": fmt_date(user.last_login), |
|
92 | 92 | "last_login_raw": datetime_to_time(user.last_login), |
|
93 | 93 | "active": h.boolicon(user.active), |
|
94 | 94 | "admin": h.boolicon(user.admin), |
|
95 | 95 | "extern_type": user.extern_type, |
|
96 | 96 | "extern_name": user.extern_name, |
|
97 | 97 | "action": user_actions(user.user_id, user.username), |
|
98 | 98 | }) |
|
99 | 99 | |
|
100 | 100 | c.data = { |
|
101 | 101 | "sort": None, |
|
102 | 102 | "dir": "asc", |
|
103 | 103 | "records": users_data |
|
104 | 104 | } |
|
105 | 105 | |
|
106 | return render('admin/users/users.html') | |
|
106 | return base.render('admin/users/users.html') | |
|
107 | 107 | |
|
108 | 108 | def create(self): |
|
109 | 109 | c.default_extern_type = db.User.DEFAULT_AUTH_TYPE |
|
110 | 110 | c.default_extern_name = '' |
|
111 | 111 | user_model = UserModel() |
|
112 | 112 | user_form = UserForm()() |
|
113 | 113 | try: |
|
114 | 114 | form_result = user_form.to_python(dict(request.POST)) |
|
115 | 115 | user = user_model.create(form_result) |
|
116 | 116 | userlog.action_logger(request.authuser, 'admin_created_user:%s' % user.username, |
|
117 | 117 | None, request.ip_addr) |
|
118 | 118 | webutils.flash(_('Created user %s') % user.username, |
|
119 | 119 | category='success') |
|
120 | 120 | meta.Session().commit() |
|
121 | 121 | except formencode.Invalid as errors: |
|
122 | 122 | return htmlfill.render( |
|
123 | render('admin/users/user_add.html'), | |
|
123 | base.render('admin/users/user_add.html'), | |
|
124 | 124 | defaults=errors.value, |
|
125 | 125 | errors=errors.error_dict or {}, |
|
126 | 126 | prefix_error=False, |
|
127 | 127 | encoding="UTF-8", |
|
128 | 128 | force_defaults=False) |
|
129 | 129 | except UserCreationError as e: |
|
130 | 130 | webutils.flash(e, 'error') |
|
131 | 131 | except Exception: |
|
132 | 132 | log.error(traceback.format_exc()) |
|
133 | 133 | webutils.flash(_('Error occurred during creation of user %s') |
|
134 | 134 | % request.POST.get('username'), category='error') |
|
135 | 135 | raise HTTPFound(location=url('edit_user', id=user.user_id)) |
|
136 | 136 | |
|
137 | 137 | def new(self, format='html'): |
|
138 | 138 | c.default_extern_type = db.User.DEFAULT_AUTH_TYPE |
|
139 | 139 | c.default_extern_name = '' |
|
140 | return render('admin/users/user_add.html') | |
|
140 | return base.render('admin/users/user_add.html') | |
|
141 | 141 | |
|
142 | 142 | def update(self, id): |
|
143 | 143 | user_model = UserModel() |
|
144 | 144 | user = user_model.get(id) |
|
145 | 145 | _form = UserForm(edit=True, old_data={'user_id': id, |
|
146 | 146 | 'email': user.email})() |
|
147 | 147 | form_result = {} |
|
148 | 148 | try: |
|
149 | 149 | form_result = _form.to_python(dict(request.POST)) |
|
150 | 150 | skip_attrs = ['extern_type', 'extern_name', |
|
151 | 151 | ] + auth_modules.get_managed_fields(user) |
|
152 | 152 | |
|
153 | 153 | user_model.update(id, form_result, skip_attrs=skip_attrs) |
|
154 | 154 | usr = form_result['username'] |
|
155 | 155 | userlog.action_logger(request.authuser, 'admin_updated_user:%s' % usr, |
|
156 | 156 | None, request.ip_addr) |
|
157 | 157 | webutils.flash(_('User updated successfully'), category='success') |
|
158 | 158 | meta.Session().commit() |
|
159 | 159 | except formencode.Invalid as errors: |
|
160 | 160 | defaults = errors.value |
|
161 | 161 | e = errors.error_dict or {} |
|
162 | 162 | defaults.update({ |
|
163 | 163 | 'create_repo_perm': user_model.has_perm(id, |
|
164 | 164 | 'hg.create.repository'), |
|
165 | 165 | 'fork_repo_perm': user_model.has_perm(id, 'hg.fork.repository'), |
|
166 | 166 | }) |
|
167 | 167 | return htmlfill.render( |
|
168 | 168 | self._render_edit_profile(user), |
|
169 | 169 | defaults=defaults, |
|
170 | 170 | errors=e, |
|
171 | 171 | prefix_error=False, |
|
172 | 172 | encoding="UTF-8", |
|
173 | 173 | force_defaults=False) |
|
174 | 174 | except Exception: |
|
175 | 175 | log.error(traceback.format_exc()) |
|
176 | 176 | webutils.flash(_('Error occurred during update of user %s') |
|
177 | 177 | % form_result.get('username'), category='error') |
|
178 | 178 | raise HTTPFound(location=url('edit_user', id=id)) |
|
179 | 179 | |
|
180 | 180 | def delete(self, id): |
|
181 | 181 | usr = db.User.get_or_404(id) |
|
182 | 182 | has_ssh_keys = bool(usr.ssh_keys) |
|
183 | 183 | try: |
|
184 | 184 | UserModel().delete(usr) |
|
185 | 185 | meta.Session().commit() |
|
186 | 186 | webutils.flash(_('Successfully deleted user'), category='success') |
|
187 | 187 | except (UserOwnsReposException, DefaultUserException) as e: |
|
188 | 188 | webutils.flash(e, category='warning') |
|
189 | 189 | except Exception: |
|
190 | 190 | log.error(traceback.format_exc()) |
|
191 | 191 | webutils.flash(_('An error occurred during deletion of user'), |
|
192 | 192 | category='error') |
|
193 | 193 | else: |
|
194 | 194 | if has_ssh_keys: |
|
195 | 195 | SshKeyModel().write_authorized_keys() |
|
196 | 196 | raise HTTPFound(location=url('users')) |
|
197 | 197 | |
|
198 | 198 | def _get_user_or_raise_if_default(self, id): |
|
199 | 199 | try: |
|
200 | 200 | return db.User.get_or_404(id, allow_default=False) |
|
201 | 201 | except DefaultUserException: |
|
202 | 202 | webutils.flash(_("The default user cannot be edited"), category='warning') |
|
203 | 203 | raise HTTPNotFound |
|
204 | 204 | |
|
205 | 205 | def _render_edit_profile(self, user): |
|
206 | 206 | c.user = user |
|
207 | 207 | c.active = 'profile' |
|
208 | 208 | c.perm_user = AuthUser(dbuser=user) |
|
209 | 209 | managed_fields = auth_modules.get_managed_fields(user) |
|
210 | 210 | c.readonly = lambda n: 'readonly' if n in managed_fields else None |
|
211 | return render('admin/users/user_edit.html') | |
|
211 | return base.render('admin/users/user_edit.html') | |
|
212 | 212 | |
|
213 | 213 | def edit(self, id, format='html'): |
|
214 | 214 | user = self._get_user_or_raise_if_default(id) |
|
215 | 215 | defaults = user.get_dict() |
|
216 | 216 | |
|
217 | 217 | return htmlfill.render( |
|
218 | 218 | self._render_edit_profile(user), |
|
219 | 219 | defaults=defaults, |
|
220 | 220 | encoding="UTF-8", |
|
221 | 221 | force_defaults=False) |
|
222 | 222 | |
|
223 | 223 | def edit_advanced(self, id): |
|
224 | 224 | c.user = self._get_user_or_raise_if_default(id) |
|
225 | 225 | c.active = 'advanced' |
|
226 | 226 | c.perm_user = AuthUser(dbuser=c.user) |
|
227 | 227 | |
|
228 | 228 | umodel = UserModel() |
|
229 | 229 | defaults = c.user.get_dict() |
|
230 | 230 | defaults.update({ |
|
231 | 231 | 'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'), |
|
232 | 232 | 'create_user_group_perm': umodel.has_perm(c.user, |
|
233 | 233 | 'hg.usergroup.create.true'), |
|
234 | 234 | 'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'), |
|
235 | 235 | }) |
|
236 | 236 | return htmlfill.render( |
|
237 | render('admin/users/user_edit.html'), | |
|
237 | base.render('admin/users/user_edit.html'), | |
|
238 | 238 | defaults=defaults, |
|
239 | 239 | encoding="UTF-8", |
|
240 | 240 | force_defaults=False) |
|
241 | 241 | |
|
242 | 242 | def edit_api_keys(self, id): |
|
243 | 243 | c.user = self._get_user_or_raise_if_default(id) |
|
244 | 244 | c.active = 'api_keys' |
|
245 | 245 | show_expired = True |
|
246 | 246 | c.lifetime_values = [ |
|
247 | 247 | (str(-1), _('Forever')), |
|
248 | 248 | (str(5), _('5 minutes')), |
|
249 | 249 | (str(60), _('1 hour')), |
|
250 | 250 | (str(60 * 24), _('1 day')), |
|
251 | 251 | (str(60 * 24 * 30), _('1 month')), |
|
252 | 252 | ] |
|
253 | 253 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] |
|
254 | 254 | c.user_api_keys = ApiKeyModel().get_api_keys(c.user.user_id, |
|
255 | 255 | show_expired=show_expired) |
|
256 | 256 | defaults = c.user.get_dict() |
|
257 | 257 | return htmlfill.render( |
|
258 | render('admin/users/user_edit.html'), | |
|
258 | base.render('admin/users/user_edit.html'), | |
|
259 | 259 | defaults=defaults, |
|
260 | 260 | encoding="UTF-8", |
|
261 | 261 | force_defaults=False) |
|
262 | 262 | |
|
263 | 263 | def add_api_key(self, id): |
|
264 | 264 | c.user = self._get_user_or_raise_if_default(id) |
|
265 | 265 | |
|
266 | 266 | lifetime = safe_int(request.POST.get('lifetime'), -1) |
|
267 | 267 | description = request.POST.get('description') |
|
268 | 268 | ApiKeyModel().create(c.user.user_id, description, lifetime) |
|
269 | 269 | meta.Session().commit() |
|
270 | 270 | webutils.flash(_("API key successfully created"), category='success') |
|
271 | 271 | raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id)) |
|
272 | 272 | |
|
273 | 273 | def delete_api_key(self, id): |
|
274 | 274 | c.user = self._get_user_or_raise_if_default(id) |
|
275 | 275 | |
|
276 | 276 | api_key = request.POST.get('del_api_key') |
|
277 | 277 | if request.POST.get('del_api_key_builtin'): |
|
278 | 278 | c.user.api_key = generate_api_key() |
|
279 | 279 | meta.Session().commit() |
|
280 | 280 | webutils.flash(_("API key successfully reset"), category='success') |
|
281 | 281 | elif api_key: |
|
282 | 282 | ApiKeyModel().delete(api_key, c.user.user_id) |
|
283 | 283 | meta.Session().commit() |
|
284 | 284 | webutils.flash(_("API key successfully deleted"), category='success') |
|
285 | 285 | |
|
286 | 286 | raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id)) |
|
287 | 287 | |
|
288 | 288 | def update_account(self, id): |
|
289 | 289 | pass |
|
290 | 290 | |
|
291 | 291 | def edit_perms(self, id): |
|
292 | 292 | c.user = self._get_user_or_raise_if_default(id) |
|
293 | 293 | c.active = 'perms' |
|
294 | 294 | c.perm_user = AuthUser(dbuser=c.user) |
|
295 | 295 | |
|
296 | 296 | umodel = UserModel() |
|
297 | 297 | defaults = c.user.get_dict() |
|
298 | 298 | defaults.update({ |
|
299 | 299 | 'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'), |
|
300 | 300 | 'create_user_group_perm': umodel.has_perm(c.user, |
|
301 | 301 | 'hg.usergroup.create.true'), |
|
302 | 302 | 'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'), |
|
303 | 303 | }) |
|
304 | 304 | return htmlfill.render( |
|
305 | render('admin/users/user_edit.html'), | |
|
305 | base.render('admin/users/user_edit.html'), | |
|
306 | 306 | defaults=defaults, |
|
307 | 307 | encoding="UTF-8", |
|
308 | 308 | force_defaults=False) |
|
309 | 309 | |
|
310 | 310 | def update_perms(self, id): |
|
311 | 311 | user = self._get_user_or_raise_if_default(id) |
|
312 | 312 | |
|
313 | 313 | try: |
|
314 | 314 | form = CustomDefaultPermissionsForm()() |
|
315 | 315 | form_result = form.to_python(request.POST) |
|
316 | 316 | |
|
317 | 317 | user_model = UserModel() |
|
318 | 318 | |
|
319 | 319 | defs = db.UserToPerm.query() \ |
|
320 | 320 | .filter(db.UserToPerm.user == user) \ |
|
321 | 321 | .all() |
|
322 | 322 | for ug in defs: |
|
323 | 323 | meta.Session().delete(ug) |
|
324 | 324 | |
|
325 | 325 | if form_result['create_repo_perm']: |
|
326 | 326 | user_model.grant_perm(id, 'hg.create.repository') |
|
327 | 327 | else: |
|
328 | 328 | user_model.grant_perm(id, 'hg.create.none') |
|
329 | 329 | if form_result['create_user_group_perm']: |
|
330 | 330 | user_model.grant_perm(id, 'hg.usergroup.create.true') |
|
331 | 331 | else: |
|
332 | 332 | user_model.grant_perm(id, 'hg.usergroup.create.false') |
|
333 | 333 | if form_result['fork_repo_perm']: |
|
334 | 334 | user_model.grant_perm(id, 'hg.fork.repository') |
|
335 | 335 | else: |
|
336 | 336 | user_model.grant_perm(id, 'hg.fork.none') |
|
337 | 337 | webutils.flash(_("Updated permissions"), category='success') |
|
338 | 338 | meta.Session().commit() |
|
339 | 339 | except Exception: |
|
340 | 340 | log.error(traceback.format_exc()) |
|
341 | 341 | webutils.flash(_('An error occurred during permissions saving'), |
|
342 | 342 | category='error') |
|
343 | 343 | raise HTTPFound(location=url('edit_user_perms', id=id)) |
|
344 | 344 | |
|
345 | 345 | def edit_emails(self, id): |
|
346 | 346 | c.user = self._get_user_or_raise_if_default(id) |
|
347 | 347 | c.active = 'emails' |
|
348 | 348 | c.user_email_map = db.UserEmailMap.query() \ |
|
349 | 349 | .filter(db.UserEmailMap.user == c.user).all() |
|
350 | 350 | |
|
351 | 351 | defaults = c.user.get_dict() |
|
352 | 352 | return htmlfill.render( |
|
353 | render('admin/users/user_edit.html'), | |
|
353 | base.render('admin/users/user_edit.html'), | |
|
354 | 354 | defaults=defaults, |
|
355 | 355 | encoding="UTF-8", |
|
356 | 356 | force_defaults=False) |
|
357 | 357 | |
|
358 | 358 | def add_email(self, id): |
|
359 | 359 | user = self._get_user_or_raise_if_default(id) |
|
360 | 360 | email = request.POST.get('new_email') |
|
361 | 361 | user_model = UserModel() |
|
362 | 362 | |
|
363 | 363 | try: |
|
364 | 364 | user_model.add_extra_email(id, email) |
|
365 | 365 | meta.Session().commit() |
|
366 | 366 | webutils.flash(_("Added email %s to user") % email, category='success') |
|
367 | 367 | except formencode.Invalid as error: |
|
368 | 368 | msg = error.error_dict['email'] |
|
369 | 369 | webutils.flash(msg, category='error') |
|
370 | 370 | except Exception: |
|
371 | 371 | log.error(traceback.format_exc()) |
|
372 | 372 | webutils.flash(_('An error occurred during email saving'), |
|
373 | 373 | category='error') |
|
374 | 374 | raise HTTPFound(location=url('edit_user_emails', id=id)) |
|
375 | 375 | |
|
376 | 376 | def delete_email(self, id): |
|
377 | 377 | user = self._get_user_or_raise_if_default(id) |
|
378 | 378 | email_id = request.POST.get('del_email_id') |
|
379 | 379 | user_model = UserModel() |
|
380 | 380 | user_model.delete_extra_email(id, email_id) |
|
381 | 381 | meta.Session().commit() |
|
382 | 382 | webutils.flash(_("Removed email from user"), category='success') |
|
383 | 383 | raise HTTPFound(location=url('edit_user_emails', id=id)) |
|
384 | 384 | |
|
385 | 385 | def edit_ips(self, id): |
|
386 | 386 | c.user = self._get_user_or_raise_if_default(id) |
|
387 | 387 | c.active = 'ips' |
|
388 | 388 | c.user_ip_map = db.UserIpMap.query() \ |
|
389 | 389 | .filter(db.UserIpMap.user == c.user).all() |
|
390 | 390 | |
|
391 | 391 | c.default_user_ip_map = db.UserIpMap.query() \ |
|
392 | 392 | .filter(db.UserIpMap.user_id == kallithea.DEFAULT_USER_ID).all() |
|
393 | 393 | |
|
394 | 394 | defaults = c.user.get_dict() |
|
395 | 395 | return htmlfill.render( |
|
396 | render('admin/users/user_edit.html'), | |
|
396 | base.render('admin/users/user_edit.html'), | |
|
397 | 397 | defaults=defaults, |
|
398 | 398 | encoding="UTF-8", |
|
399 | 399 | force_defaults=False) |
|
400 | 400 | |
|
401 | 401 | def add_ip(self, id): |
|
402 | 402 | ip = request.POST.get('new_ip') |
|
403 | 403 | user_model = UserModel() |
|
404 | 404 | |
|
405 | 405 | try: |
|
406 | 406 | user_model.add_extra_ip(id, ip) |
|
407 | 407 | meta.Session().commit() |
|
408 | 408 | webutils.flash(_("Added IP address %s to user whitelist") % ip, category='success') |
|
409 | 409 | except formencode.Invalid as error: |
|
410 | 410 | msg = error.error_dict['ip'] |
|
411 | 411 | webutils.flash(msg, category='error') |
|
412 | 412 | except Exception: |
|
413 | 413 | log.error(traceback.format_exc()) |
|
414 | 414 | webutils.flash(_('An error occurred while adding IP address'), |
|
415 | 415 | category='error') |
|
416 | 416 | |
|
417 | 417 | if 'default_user' in request.POST: |
|
418 | 418 | raise HTTPFound(location=url('admin_permissions_ips')) |
|
419 | 419 | raise HTTPFound(location=url('edit_user_ips', id=id)) |
|
420 | 420 | |
|
421 | 421 | def delete_ip(self, id): |
|
422 | 422 | ip_id = request.POST.get('del_ip_id') |
|
423 | 423 | user_model = UserModel() |
|
424 | 424 | user_model.delete_extra_ip(id, ip_id) |
|
425 | 425 | meta.Session().commit() |
|
426 | 426 | webutils.flash(_("Removed IP address from user whitelist"), category='success') |
|
427 | 427 | |
|
428 | 428 | if 'default_user' in request.POST: |
|
429 | 429 | raise HTTPFound(location=url('admin_permissions_ips')) |
|
430 | 430 | raise HTTPFound(location=url('edit_user_ips', id=id)) |
|
431 | 431 | |
|
432 | @IfSshEnabled | |
|
432 | @base.IfSshEnabled | |
|
433 | 433 | def edit_ssh_keys(self, id): |
|
434 | 434 | c.user = self._get_user_or_raise_if_default(id) |
|
435 | 435 | c.active = 'ssh_keys' |
|
436 | 436 | c.user_ssh_keys = SshKeyModel().get_ssh_keys(c.user.user_id) |
|
437 | 437 | defaults = c.user.get_dict() |
|
438 | 438 | return htmlfill.render( |
|
439 | render('admin/users/user_edit.html'), | |
|
439 | base.render('admin/users/user_edit.html'), | |
|
440 | 440 | defaults=defaults, |
|
441 | 441 | encoding="UTF-8", |
|
442 | 442 | force_defaults=False) |
|
443 | 443 | |
|
444 | @IfSshEnabled | |
|
444 | @base.IfSshEnabled | |
|
445 | 445 | def ssh_keys_add(self, id): |
|
446 | 446 | c.user = self._get_user_or_raise_if_default(id) |
|
447 | 447 | |
|
448 | 448 | description = request.POST.get('description') |
|
449 | 449 | public_key = request.POST.get('public_key') |
|
450 | 450 | try: |
|
451 | 451 | new_ssh_key = SshKeyModel().create(c.user.user_id, |
|
452 | 452 | description, public_key) |
|
453 | 453 | meta.Session().commit() |
|
454 | 454 | SshKeyModel().write_authorized_keys() |
|
455 | 455 | webutils.flash(_("SSH key %s successfully added") % new_ssh_key.fingerprint, category='success') |
|
456 | 456 | except SshKeyModelException as e: |
|
457 | 457 | webutils.flash(e.args[0], category='error') |
|
458 | 458 | raise HTTPFound(location=url('edit_user_ssh_keys', id=c.user.user_id)) |
|
459 | 459 | |
|
460 | @IfSshEnabled | |
|
460 | @base.IfSshEnabled | |
|
461 | 461 | def ssh_keys_delete(self, id): |
|
462 | 462 | c.user = self._get_user_or_raise_if_default(id) |
|
463 | 463 | |
|
464 | 464 | fingerprint = request.POST.get('del_public_key_fingerprint') |
|
465 | 465 | try: |
|
466 | 466 | SshKeyModel().delete(fingerprint, c.user.user_id) |
|
467 | 467 | meta.Session().commit() |
|
468 | 468 | SshKeyModel().write_authorized_keys() |
|
469 | 469 | webutils.flash(_("SSH key successfully deleted"), category='success') |
|
470 | 470 | except SshKeyModelException as e: |
|
471 | 471 | webutils.flash(e.args[0], category='error') |
|
472 | 472 | raise HTTPFound(location=url('edit_user_ssh_keys', id=c.user.user_id)) |
@@ -1,265 +1,265 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.api |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | JSON RPC controller |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Aug 20, 2011 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import inspect |
|
29 | 29 | import itertools |
|
30 | 30 | import logging |
|
31 | 31 | import time |
|
32 | 32 | import traceback |
|
33 | 33 | import types |
|
34 | 34 | |
|
35 | 35 | from tg import Response, TGController, request, response |
|
36 | 36 | from webob.exc import HTTPError, HTTPException |
|
37 | 37 | |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib import ext_json |
|
39 | 40 | from kallithea.lib.auth import AuthUser |
|
40 | from kallithea.lib.base import get_ip_addr, get_path_info | |
|
41 | 41 | from kallithea.lib.utils2 import ascii_bytes |
|
42 | 42 | from kallithea.model import db |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | log = logging.getLogger('JSONRPC') |
|
46 | 46 | |
|
47 | 47 | |
|
48 | 48 | class JSONRPCError(BaseException): |
|
49 | 49 | |
|
50 | 50 | def __init__(self, message): |
|
51 | 51 | self.message = message |
|
52 | 52 | super(JSONRPCError, self).__init__() |
|
53 | 53 | |
|
54 | 54 | def __str__(self): |
|
55 | 55 | return self.message |
|
56 | 56 | |
|
57 | 57 | |
|
58 | 58 | class JSONRPCErrorResponse(Response, HTTPException): |
|
59 | 59 | """ |
|
60 | 60 | Generate a Response object with a JSON-RPC error body |
|
61 | 61 | """ |
|
62 | 62 | |
|
63 | 63 | def __init__(self, message=None, retid=None, code=None): |
|
64 | 64 | HTTPException.__init__(self, message, self) |
|
65 | 65 | Response.__init__(self, |
|
66 | 66 | json_body=dict(id=retid, result=None, error=message), |
|
67 | 67 | status=code, |
|
68 | 68 | content_type='application/json') |
|
69 | 69 | |
|
70 | 70 | |
|
71 | 71 | class JSONRPCController(TGController): |
|
72 | 72 | """ |
|
73 | 73 | A WSGI-speaking JSON-RPC controller class |
|
74 | 74 | |
|
75 | 75 | See the specification: |
|
76 | 76 | <http://json-rpc.org/wiki/specification>`. |
|
77 | 77 | |
|
78 | 78 | Valid controller return values should be json-serializable objects. |
|
79 | 79 | |
|
80 | 80 | Sub-classes should catch their exceptions and raise JSONRPCError |
|
81 | 81 | if they want to pass meaningful errors to the client. |
|
82 | 82 | |
|
83 | 83 | """ |
|
84 | 84 | |
|
85 | 85 | def _get_method_args(self): |
|
86 | 86 | """ |
|
87 | 87 | Return `self._rpc_args` to dispatched controller method |
|
88 | 88 | chosen by __call__ |
|
89 | 89 | """ |
|
90 | 90 | return self._rpc_args |
|
91 | 91 | |
|
92 | 92 | def _dispatch(self, state, remainder=None): |
|
93 | 93 | """ |
|
94 | 94 | Parse the request body as JSON, look up the method on the |
|
95 | 95 | controller and if it exists, dispatch to it. |
|
96 | 96 | """ |
|
97 | 97 | # Since we are here we should respond as JSON |
|
98 | 98 | response.content_type = 'application/json' |
|
99 | 99 | |
|
100 | 100 | environ = state.request.environ |
|
101 | 101 | start = time.time() |
|
102 | ip_addr = get_ip_addr(environ) | |
|
102 | ip_addr = base.get_ip_addr(environ) | |
|
103 | 103 | self._req_id = None |
|
104 | 104 | if 'CONTENT_LENGTH' not in environ: |
|
105 | 105 | log.debug("No Content-Length") |
|
106 | 106 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
107 | 107 | message="No Content-Length in request") |
|
108 | 108 | else: |
|
109 | 109 | length = environ['CONTENT_LENGTH'] or 0 |
|
110 | 110 | length = int(environ['CONTENT_LENGTH']) |
|
111 | 111 | log.debug('Content-Length: %s', length) |
|
112 | 112 | |
|
113 | 113 | if length == 0: |
|
114 | 114 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
115 | 115 | message="Content-Length is 0") |
|
116 | 116 | |
|
117 | 117 | raw_body = environ['wsgi.input'].read(length) |
|
118 | 118 | |
|
119 | 119 | try: |
|
120 | 120 | json_body = ext_json.loads(raw_body) |
|
121 | 121 | except ValueError as e: |
|
122 | 122 | # catch JSON errors Here |
|
123 | 123 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
124 | 124 | message="JSON parse error ERR:%s RAW:%r" |
|
125 | 125 | % (e, raw_body)) |
|
126 | 126 | |
|
127 | 127 | # check AUTH based on API key |
|
128 | 128 | try: |
|
129 | 129 | self._req_api_key = json_body['api_key'] |
|
130 | 130 | self._req_id = json_body['id'] |
|
131 | 131 | self._req_method = json_body['method'] |
|
132 | 132 | self._request_params = json_body['args'] |
|
133 | 133 | if not isinstance(self._request_params, dict): |
|
134 | 134 | self._request_params = {} |
|
135 | 135 | |
|
136 | 136 | log.debug('method: %s, params: %s', |
|
137 | 137 | self._req_method, self._request_params) |
|
138 | 138 | except KeyError as e: |
|
139 | 139 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
140 | 140 | message='Incorrect JSON query missing %s' % e) |
|
141 | 141 | |
|
142 | 142 | # check if we can find this session using api_key |
|
143 | 143 | try: |
|
144 | 144 | u = db.User.get_by_api_key(self._req_api_key) |
|
145 | 145 | auth_user = AuthUser.make(dbuser=u, ip_addr=ip_addr) |
|
146 | 146 | if auth_user is None: |
|
147 | 147 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
148 | 148 | message='Invalid API key') |
|
149 | 149 | except Exception as e: |
|
150 | 150 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
151 | 151 | message='Invalid API key') |
|
152 | 152 | |
|
153 | 153 | request.authuser = auth_user |
|
154 | 154 | request.ip_addr = ip_addr |
|
155 | 155 | |
|
156 | 156 | self._error = None |
|
157 | 157 | try: |
|
158 | 158 | self._func = self._find_method() |
|
159 | 159 | except AttributeError as e: |
|
160 | 160 | raise JSONRPCErrorResponse(retid=self._req_id, |
|
161 | 161 | message=str(e)) |
|
162 | 162 | |
|
163 | 163 | # now that we have a method, add self._req_params to |
|
164 | 164 | # self.kargs and dispatch control to WGIController |
|
165 | 165 | argspec = inspect.getfullargspec(self._func) |
|
166 | 166 | arglist = argspec.args[1:] |
|
167 | 167 | argtypes = [type(arg) for arg in argspec.defaults or []] |
|
168 | 168 | default_empty = type(NotImplemented) |
|
169 | 169 | |
|
170 | 170 | # kw arguments required by this method |
|
171 | 171 | func_kwargs = dict(itertools.zip_longest(reversed(arglist), reversed(argtypes), |
|
172 | 172 | fillvalue=default_empty)) |
|
173 | 173 | |
|
174 | 174 | # This attribute will need to be first param of a method that uses |
|
175 | 175 | # api_key, which is translated to instance of user at that name |
|
176 | 176 | USER_SESSION_ATTR = 'apiuser' |
|
177 | 177 | |
|
178 | 178 | # get our arglist and check if we provided them as args |
|
179 | 179 | for arg, default in func_kwargs.items(): |
|
180 | 180 | if arg == USER_SESSION_ATTR: |
|
181 | 181 | # USER_SESSION_ATTR is something translated from API key and |
|
182 | 182 | # this is checked before so we don't need validate it |
|
183 | 183 | continue |
|
184 | 184 | |
|
185 | 185 | # skip the required param check if it's default value is |
|
186 | 186 | # NotImplementedType (default_empty) |
|
187 | 187 | if default == default_empty and arg not in self._request_params: |
|
188 | 188 | raise JSONRPCErrorResponse( |
|
189 | 189 | retid=self._req_id, |
|
190 | 190 | message='Missing non optional `%s` arg in JSON DATA' % arg, |
|
191 | 191 | ) |
|
192 | 192 | |
|
193 | 193 | extra = set(self._request_params).difference(func_kwargs) |
|
194 | 194 | if extra: |
|
195 | 195 | raise JSONRPCErrorResponse( |
|
196 | 196 | retid=self._req_id, |
|
197 | 197 | message='Unknown %s arg in JSON DATA' % |
|
198 | 198 | ', '.join('`%s`' % arg for arg in extra), |
|
199 | 199 | ) |
|
200 | 200 | |
|
201 | 201 | self._rpc_args = {} |
|
202 | 202 | self._rpc_args.update(self._request_params) |
|
203 | 203 | self._rpc_args['action'] = self._req_method |
|
204 | 204 | self._rpc_args['environ'] = environ |
|
205 | 205 | |
|
206 | 206 | log.info('IP: %s Request to %s time: %.3fs' % ( |
|
207 | get_ip_addr(environ), | |
|
208 | get_path_info(environ), time.time() - start) | |
|
207 | base.get_ip_addr(environ), | |
|
208 | base.get_path_info(environ), time.time() - start) | |
|
209 | 209 | ) |
|
210 | 210 | |
|
211 | 211 | state.set_action(self._rpc_call, []) |
|
212 | 212 | state.set_params(self._rpc_args) |
|
213 | 213 | return state |
|
214 | 214 | |
|
215 | 215 | def _rpc_call(self, action, environ, **rpc_args): |
|
216 | 216 | """ |
|
217 | 217 | Call the specified RPC Method |
|
218 | 218 | """ |
|
219 | 219 | raw_response = '' |
|
220 | 220 | try: |
|
221 | 221 | raw_response = getattr(self, action)(**rpc_args) |
|
222 | 222 | if isinstance(raw_response, HTTPError): |
|
223 | 223 | self._error = str(raw_response) |
|
224 | 224 | except JSONRPCError as e: |
|
225 | 225 | self._error = str(e) |
|
226 | 226 | except Exception as e: |
|
227 | 227 | log.error('Encountered unhandled exception: %s', |
|
228 | 228 | traceback.format_exc(),) |
|
229 | 229 | json_exc = JSONRPCError('Internal server error') |
|
230 | 230 | self._error = str(json_exc) |
|
231 | 231 | |
|
232 | 232 | if self._error is not None: |
|
233 | 233 | raw_response = None |
|
234 | 234 | |
|
235 | 235 | response = dict(id=self._req_id, result=raw_response, error=self._error) |
|
236 | 236 | try: |
|
237 | 237 | return ascii_bytes(ext_json.dumps(response)) |
|
238 | 238 | except TypeError as e: |
|
239 | 239 | log.error('API FAILED. Error encoding response for %s %s: %s\n%s', action, rpc_args, e, traceback.format_exc()) |
|
240 | 240 | return ascii_bytes(ext_json.dumps( |
|
241 | 241 | dict( |
|
242 | 242 | id=self._req_id, |
|
243 | 243 | result=None, |
|
244 | 244 | error="Error encoding response", |
|
245 | 245 | ) |
|
246 | 246 | )) |
|
247 | 247 | |
|
248 | 248 | def _find_method(self): |
|
249 | 249 | """ |
|
250 | 250 | Return method named by `self._req_method` in controller if able |
|
251 | 251 | """ |
|
252 | 252 | log.debug('Trying to find JSON-RPC method: %s', self._req_method) |
|
253 | 253 | if self._req_method.startswith('_'): |
|
254 | 254 | raise AttributeError("Method not allowed") |
|
255 | 255 | |
|
256 | 256 | try: |
|
257 | 257 | func = getattr(self, self._req_method, None) |
|
258 | 258 | except UnicodeEncodeError: |
|
259 | 259 | raise AttributeError("Problem decoding unicode in requested " |
|
260 | 260 | "method name.") |
|
261 | 261 | |
|
262 | 262 | if isinstance(func, types.MethodType): |
|
263 | 263 | return func |
|
264 | 264 | else: |
|
265 | 265 | raise AttributeError("No such method: %s" % (self._req_method,)) |
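For reference, the _dispatch method above expects a JSON body carrying api_key, id, method and args (args should be a dict whose keys match the target method's parameters), and replies with a JSON object holding id, result and error. A minimal client sketch, assuming a local instance on 127.0.0.1:5000, the endpoint mounted at /_admin/api, and an example method name and arguments that are placeholders only::

    import json
    import urllib.request

    # Illustrative values: the API key, method name and arguments are placeholders.
    payload = {
        'id': 1,                          # echoed back in the response
        'api_key': 'MY_API_KEY',          # resolved via db.User.get_by_api_key()
        'method': 'get_user',             # example method; names starting with '_' are rejected
        'args': {'userid': 'john'},       # should be a dict matching the method's parameters
    }
    data = json.dumps(payload).encode('utf-8')   # body must have a non-zero Content-Length

    req = urllib.request.Request('http://127.0.0.1:5000/_admin/api', data=data,
                                 headers={'Content-Type': 'application/json'})
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)                  # {'id': ..., 'result': ..., 'error': ...}
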
@@ -1,639 +1,639 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | |
|
15 | 15 | """ |
|
16 | kallithea.lib.base | |

17 | ~~~~~~~~~~~~~~~~~~ | |
|
16 | kallithea.controllers.base | |
|
17 | ~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
|
18 | 18 | |
|
19 | 19 | The base Controller API |
|
20 | 20 | Provides the BaseController class for subclassing. And usage in different |
|
21 | 21 | controllers |
|
22 | 22 | |
|
23 | 23 | This file was forked by the Kallithea project in July 2014. |
|
24 | 24 | Original author and date, and relevant copyright and licensing information is below: |
|
25 | 25 | :created_on: Oct 06, 2010 |
|
26 | 26 | :author: marcink |
|
27 | 27 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
28 | 28 | :license: GPLv3, see LICENSE.md for more details. |
|
29 | 29 | """ |
|
30 | 30 | |
|
31 | 31 | import base64 |
|
32 | 32 | import datetime |
|
33 | 33 | import logging |
|
34 | 34 | import traceback |
|
35 | 35 | import warnings |
|
36 | 36 | |
|
37 | 37 | import decorator |
|
38 | 38 | import paste.auth.basic |
|
39 | 39 | import paste.httpexceptions |
|
40 | 40 | import paste.httpheaders |
|
41 | 41 | import webob.exc |
|
42 | 42 | from tg import TGController, config, render_template, request, response, session |
|
43 | 43 | from tg import tmpl_context as c |
|
44 | 44 | from tg.i18n import ugettext as _ |
|
45 | 45 | |
|
46 | 46 | import kallithea |
|
47 | 47 | from kallithea.lib import auth_modules, ext_json, webutils |
|
48 | 48 | from kallithea.lib.auth import AuthUser, HasPermissionAnyMiddleware |
|
49 | 49 | from kallithea.lib.exceptions import UserCreationError |
|
50 | 50 | from kallithea.lib.utils import get_repo_slug, is_valid_repo |
|
51 | 51 | from kallithea.lib.utils2 import AttributeDict, asbool, ascii_bytes, safe_int, safe_str, set_hook_environment |
|
52 | 52 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError, RepositoryError |
|
53 | 53 | from kallithea.lib.webutils import url |
|
54 | 54 | from kallithea.model import db, meta |
|
55 | 55 | from kallithea.model.scm import ScmModel |
|
56 | 56 | |
|
57 | 57 | |
|
58 | 58 | log = logging.getLogger(__name__) |
|
59 | 59 | |
|
60 | 60 | |
|
61 | 61 | def render(template_path): |
|
62 | 62 | return render_template({'url': url}, 'mako', template_path) |
|
63 | 63 | |
|
64 | 64 | |
|
65 | 65 | def _filter_proxy(ip): |
|
66 | 66 | """ |
|
67 | 67 | HEADERS can have multiple ips inside the left-most being the original |
|
68 | 68 | client, and each successive proxy that passed the request adding the IP |
|
69 | 69 | address where it received the request from. |
|
70 | 70 | |
|
71 | 71 | :param ip: |
|
72 | 72 | """ |
|
73 | 73 | if ',' in ip: |
|
74 | 74 | _ips = ip.split(',') |
|
75 | 75 | _first_ip = _ips[0].strip() |
|
76 | 76 | log.debug('Got multiple IPs %s, using %s', ','.join(_ips), _first_ip) |
|
77 | 77 | return _first_ip |
|
78 | 78 | return ip |
|
79 | 79 | |
|
80 | 80 | |
|
81 | 81 | def get_ip_addr(environ): |
|
82 | 82 | proxy_key = 'HTTP_X_REAL_IP' |
|
83 | 83 | proxy_key2 = 'HTTP_X_FORWARDED_FOR' |
|
84 | 84 | def_key = 'REMOTE_ADDR' |
|
85 | 85 | |
|
86 | 86 | ip = environ.get(proxy_key) |
|
87 | 87 | if ip: |
|
88 | 88 | return _filter_proxy(ip) |
|
89 | 89 | |
|
90 | 90 | ip = environ.get(proxy_key2) |
|
91 | 91 | if ip: |
|
92 | 92 | return _filter_proxy(ip) |
|
93 | 93 | |
|
94 | 94 | ip = environ.get(def_key, '0.0.0.0') |
|
95 | 95 | return _filter_proxy(ip) |
|
96 | 96 | |
|
97 | 97 | |
|
98 | 98 | def get_path_info(environ): |
|
99 | 99 | """Return PATH_INFO from environ ... using tg.original_request if available. |
|
100 | 100 | |
|
101 | 101 | In Python 3 WSGI, PATH_INFO is a unicode str, but kind of contains encoded |
|
102 | 102 | bytes. The code points are guaranteed to only use the lower 8 bit bits, and |
|
103 | 103 | encoding the string with the 1:1 encoding latin1 will give the |
|
104 | 104 | corresponding byte string ... which then can be decoded to proper unicode. |
|
105 | 105 | """ |
|
106 | 106 | org_req = environ.get('tg.original_request') |
|
107 | 107 | if org_req is not None: |
|
108 | 108 | environ = org_req.environ |
|
109 | 109 | return safe_str(environ['PATH_INFO'].encode('latin1')) |
|
110 | 110 | |
|
111 | 111 | |
|
112 | 112 | def log_in_user(user, remember, is_external_auth, ip_addr): |
|
113 | 113 | """ |
|
114 | 114 | Log a `User` in and update session and cookies. If `remember` is True, |
|
115 | 115 | the session cookie is set to expire in a year; otherwise, it expires at |
|
116 | 116 | the end of the browser session. |
|
117 | 117 | |
|
118 | 118 | Returns populated `AuthUser` object. |
|
119 | 119 | """ |
|
120 | 120 | # It should not be possible to explicitly log in as the default user. |
|
121 | 121 | assert not user.is_default_user, user |
|
122 | 122 | |
|
123 | 123 | auth_user = AuthUser.make(dbuser=user, is_external_auth=is_external_auth, ip_addr=ip_addr) |
|
124 | 124 | if auth_user is None: |
|
125 | 125 | return None |
|
126 | 126 | |
|
127 | 127 | user.update_lastlogin() |
|
128 | 128 | meta.Session().commit() |
|
129 | 129 | |
|
130 | 130 | # Start new session to prevent session fixation attacks. |
|
131 | 131 | session.invalidate() |
|
132 | 132 | session['authuser'] = cookie = auth_user.to_cookie() |
|
133 | 133 | |
|
134 | 134 | # If they want to be remembered, update the cookie. |
|
135 | 135 | # NOTE: Assumes that beaker defaults to browser session cookie. |
|
136 | 136 | if remember: |
|
137 | 137 | t = datetime.datetime.now() + datetime.timedelta(days=365) |
|
138 | 138 | session._set_cookie_expires(t) |
|
139 | 139 | |
|
140 | 140 | session.save() |
|
141 | 141 | |
|
142 | 142 | log.info('user %s is now authenticated and stored in ' |
|
143 | 143 | 'session, session attrs %s', user.username, cookie) |
|
144 | 144 | |
|
145 | 145 | # dumps session attrs back to cookie |
|
146 | 146 | session._update_cookie_out() |
|
147 | 147 | |
|
148 | 148 | return auth_user |
|
149 | 149 | |
|
150 | 150 | |
|
151 | 151 | class BasicAuth(paste.auth.basic.AuthBasicAuthenticator): |
|
152 | 152 | |
|
153 | 153 | def __init__(self, realm, authfunc, auth_http_code=None): |
|
154 | 154 | self.realm = realm |
|
155 | 155 | self.authfunc = authfunc |
|
156 | 156 | self._rc_auth_http_code = auth_http_code |
|
157 | 157 | |
|
158 | 158 | def build_authentication(self, environ): |
|
159 | 159 | head = paste.httpheaders.WWW_AUTHENTICATE.tuples('Basic realm="%s"' % self.realm) |
|
160 | 160 | # Consume the whole body before sending a response |
|
161 | 161 | try: |
|
162 | 162 | request_body_size = int(environ.get('CONTENT_LENGTH', 0)) |
|
163 | 163 | except (ValueError): |
|
164 | 164 | request_body_size = 0 |
|
165 | 165 | environ['wsgi.input'].read(request_body_size) |
|
166 | 166 | if self._rc_auth_http_code and self._rc_auth_http_code == '403': |
|
167 | 167 | # return 403 if alternative http return code is specified in |
|
168 | 168 | # Kallithea config |
|
169 | 169 | return paste.httpexceptions.HTTPForbidden(headers=head) |
|
170 | 170 | return paste.httpexceptions.HTTPUnauthorized(headers=head) |
|
171 | 171 | |
|
172 | 172 | def authenticate(self, environ): |
|
173 | 173 | authorization = paste.httpheaders.AUTHORIZATION(environ) |
|
174 | 174 | if not authorization: |
|
175 | 175 | return self.build_authentication(environ) |
|
176 | 176 | (authmeth, auth) = authorization.split(' ', 1) |
|
177 | 177 | if 'basic' != authmeth.lower(): |
|
178 | 178 | return self.build_authentication(environ) |
|
179 | 179 | auth = safe_str(base64.b64decode(auth.strip())) |
|
180 | 180 | _parts = auth.split(':', 1) |
|
181 | 181 | if len(_parts) == 2: |
|
182 | 182 | username, password = _parts |
|
183 | 183 | if self.authfunc(username, password, environ) is not None: |
|
184 | 184 | return username |
|
185 | 185 | return self.build_authentication(environ) |
|
186 | 186 | |
|
187 | 187 | __call__ = authenticate |
|
188 | 188 | |
|
189 | 189 | |
|
190 | 190 | class BaseVCSController(object): |
|
191 | 191 | """Base controller for handling Mercurial/Git protocol requests |
|
192 | 192 | (coming from a VCS client, and not a browser). |
|
193 | 193 | """ |
|
194 | 194 | |
|
195 | 195 | scm_alias = None # 'hg' / 'git' |
|
196 | 196 | |
|
197 | 197 | def __init__(self, application, config): |
|
198 | 198 | self.application = application |
|
199 | 199 | self.config = config |
|
200 | 200 | # base path of repo locations |
|
201 | 201 | self.basepath = self.config['base_path'] |
|
202 | 202 | # authenticate this VCS request using the authentication modules |
|
203 | 203 | self.authenticate = BasicAuth('', auth_modules.authenticate, |
|
204 | 204 | config.get('auth_ret_code')) |
|
205 | 205 | |
|
206 | 206 | @classmethod |
|
207 | 207 | def parse_request(cls, environ): |
|
208 | 208 | """If request is parsed as a request for this VCS, return a namespace with the parsed request. |
|
209 | 209 | If the request is unknown, return None. |
|
210 | 210 | """ |
|
211 | 211 | raise NotImplementedError() |
|
212 | 212 | |
|
213 | 213 | def _authorize(self, environ, action, repo_name, ip_addr): |
|
214 | 214 | """Authenticate and authorize user. |
|
215 | 215 | |
|
216 | 216 | Since we're dealing with a VCS client and not a browser, we only |
|
217 | 217 | support HTTP basic authentication, either directly via raw header |
|
218 | 218 | inspection, or by using container authentication to delegate the |
|
219 | 219 | authentication to the web server. |
|
220 | 220 | |
|
221 | 221 | Returns (user, None) on successful authentication and authorization. |
|
222 | 222 | Returns (None, wsgi_app) to send the wsgi_app response to the client. |
|
223 | 223 | """ |
|
224 | 224 | # Use anonymous access if allowed for action on repo. |
|
225 | 225 | default_user = db.User.get_default_user() |
|
226 | 226 | default_authuser = AuthUser.make(dbuser=default_user, ip_addr=ip_addr) |
|
227 | 227 | if default_authuser is None: |
|
228 | 228 | log.debug('No anonymous access at all') # move on to proper user auth |
|
229 | 229 | else: |
|
230 | 230 | if self._check_permission(action, default_authuser, repo_name): |
|
231 | 231 | return default_authuser, None |
|
232 | 232 | log.debug('Not authorized to access this repository as anonymous user') |
|
233 | 233 | |
|
234 | 234 | username = None |
|
235 | 235 | #============================================================== |
|
236 | 236 | # DEFAULT PERM FAILED OR ANONYMOUS ACCESS IS DISABLED SO WE |
|
237 | 237 | # NEED TO AUTHENTICATE AND ASK FOR AUTH USER PERMISSIONS |
|
238 | 238 | #============================================================== |
|
239 | 239 | |
|
240 | 240 | # try to auth based on environ, container auth methods |
|
241 | 241 | log.debug('Running PRE-AUTH for container based authentication') |
|
242 | 242 | pre_auth = auth_modules.authenticate('', '', environ) |
|
243 | 243 | if pre_auth is not None and pre_auth.get('username'): |
|
244 | 244 | username = pre_auth['username'] |
|
245 | 245 | log.debug('PRE-AUTH got %s as username', username) |
|
246 | 246 | |
|
247 | 247 | # If not authenticated by the container, running basic auth |
|
248 | 248 | if not username: |
|
249 | 249 | self.authenticate.realm = self.config['realm'] |
|
250 | 250 | result = self.authenticate(environ) |
|
251 | 251 | if isinstance(result, str): |
|
252 | 252 | paste.httpheaders.AUTH_TYPE.update(environ, 'basic') |
|
253 | 253 | paste.httpheaders.REMOTE_USER.update(environ, result) |
|
254 | 254 | username = result |
|
255 | 255 | else: |
|
256 | 256 | return None, result.wsgi_application |
|
257 | 257 | |
|
258 | 258 | #============================================================== |
|
259 | 259 | # CHECK PERMISSIONS FOR THIS REQUEST USING GIVEN USERNAME |
|
260 | 260 | #============================================================== |
|
261 | 261 | try: |
|
262 | 262 | user = db.User.get_by_username_or_email(username) |
|
263 | 263 | except Exception: |
|
264 | 264 | log.error(traceback.format_exc()) |
|
265 | 265 | return None, webob.exc.HTTPInternalServerError() |
|
266 | 266 | |
|
267 | 267 | authuser = AuthUser.make(dbuser=user, ip_addr=ip_addr) |
|
268 | 268 | if authuser is None: |
|
269 | 269 | return None, webob.exc.HTTPForbidden() |
|
270 | 270 | if not self._check_permission(action, authuser, repo_name): |
|
271 | 271 | return None, webob.exc.HTTPForbidden() |
|
272 | 272 | |
|
273 | 273 | return user, None |
|
274 | 274 | |
|
275 | 275 | def _handle_request(self, environ, start_response): |
|
276 | 276 | raise NotImplementedError() |
|
277 | 277 | |
|
278 | 278 | def _check_permission(self, action, authuser, repo_name): |
|
279 | 279 | """ |
|
280 | 280 | :param action: 'push' or 'pull' |
|
281 | 281 | :param user: `AuthUser` instance |
|
282 | 282 | :param repo_name: repository name |
|
283 | 283 | """ |
|
284 | 284 | if action == 'push': |
|
285 | 285 | if not HasPermissionAnyMiddleware('repository.write', |
|
286 | 286 | 'repository.admin')(authuser, |
|
287 | 287 | repo_name): |
|
288 | 288 | return False |
|
289 | 289 | |
|
290 | 290 | elif action == 'pull': |
|
291 | 291 | #any other action need at least read permission |
|
292 | 292 | if not HasPermissionAnyMiddleware('repository.read', |
|
293 | 293 | 'repository.write', |
|
294 | 294 | 'repository.admin')(authuser, |
|
295 | 295 | repo_name): |
|
296 | 296 | return False |
|
297 | 297 | |
|
298 | 298 | else: |
|
299 | 299 | assert False, action |
|
300 | 300 | |
|
301 | 301 | return True |
|
302 | 302 | |
|
303 | 303 | def __call__(self, environ, start_response): |
|
304 | 304 | try: |
|
305 | 305 | # try parsing a request for this VCS - if it fails, call the wrapped app |
|
306 | 306 | parsed_request = self.parse_request(environ) |
|
307 | 307 | if parsed_request is None: |
|
308 | 308 | return self.application(environ, start_response) |
|
309 | 309 | |
|
310 | 310 | # skip passing error to error controller |
|
311 | 311 | environ['pylons.status_code_redirect'] = True |
|
312 | 312 | |
|
313 | 313 | # quick check if repo exists... |
|
314 | 314 | if not is_valid_repo(parsed_request.repo_name, self.basepath, self.scm_alias): |
|
315 | 315 | raise webob.exc.HTTPNotFound() |
|
316 | 316 | |
|
317 | 317 | if parsed_request.action is None: |
|
318 | 318 | # Note: the client doesn't get the helpful error message |
|
319 | 319 | raise webob.exc.HTTPBadRequest('Unable to detect pull/push action for %r! Are you using a nonstandard command or client?' % parsed_request.repo_name) |
|
320 | 320 | |
|
321 | 321 | #====================================================================== |
|
322 | 322 | # CHECK PERMISSIONS |
|
323 | 323 | #====================================================================== |
|
324 | 324 | ip_addr = get_ip_addr(environ) |
|
325 | 325 | user, response_app = self._authorize(environ, parsed_request.action, parsed_request.repo_name, ip_addr) |
|
326 | 326 | if response_app is not None: |
|
327 | 327 | return response_app(environ, start_response) |
|
328 | 328 | |
|
329 | 329 | #====================================================================== |
|
330 | 330 | # REQUEST HANDLING |
|
331 | 331 | #====================================================================== |
|
332 | 332 | set_hook_environment(user.username, ip_addr, |
|
333 | 333 | parsed_request.repo_name, self.scm_alias, parsed_request.action) |
|
334 | 334 | |
|
335 | 335 | try: |
|
336 | 336 | log.info('%s action on %s repo "%s" by "%s" from %s', |
|
337 | 337 | parsed_request.action, self.scm_alias, parsed_request.repo_name, user.username, ip_addr) |
|
338 | 338 | app = self._make_app(parsed_request) |
|
339 | 339 | return app(environ, start_response) |
|
340 | 340 | except Exception: |
|
341 | 341 | log.error(traceback.format_exc()) |
|
342 | 342 | raise webob.exc.HTTPInternalServerError() |
|
343 | 343 | |
|
344 | 344 | except webob.exc.HTTPException as e: |
|
345 | 345 | return e(environ, start_response) |
|
346 | 346 | |
|
347 | 347 | |
|
348 | 348 | class BaseController(TGController): |
|
349 | 349 | |
|
350 | 350 | def _before(self, *args, **kwargs): |
|
351 | 351 | """ |
|
352 | 352 | _before is called before controller methods and after __call__ |
|
353 | 353 | """ |
|
354 | 354 | if request.needs_csrf_check: |
|
355 | 355 | # CSRF protection: Whenever a request has ambient authority (whether |
|
356 | 356 | # through a session cookie or its origin IP address), it must include |
|
357 | 357 | # the correct token, unless the HTTP method is GET or HEAD (and thus |
|
358 | 358 | # guaranteed to be side effect free. In practice, the only situation |
|
359 | 359 | # where we allow side effects without ambient authority is when the |
|
360 | 360 | # authority comes from an API key; and that is handled above. |
|
361 | 361 | token = request.POST.get(webutils.session_csrf_secret_name) |
|
362 | 362 | if not token or token != webutils.session_csrf_secret_token(): |
|
363 | 363 | log.error('CSRF check failed') |
|
364 | 364 | raise webob.exc.HTTPForbidden() |
|
365 | 365 | |
|
366 | 366 | c.kallithea_version = kallithea.__version__ |
|
367 | 367 | settings = db.Setting.get_app_settings() |
|
368 | 368 | |
|
369 | 369 | # Visual options |
|
370 | 370 | c.visual = AttributeDict({}) |
|
371 | 371 | |
|
372 | 372 | ## DB stored |
|
373 | 373 | c.visual.show_public_icon = asbool(settings.get('show_public_icon')) |
|
374 | 374 | c.visual.show_private_icon = asbool(settings.get('show_private_icon')) |
|
375 | 375 | c.visual.stylify_metalabels = asbool(settings.get('stylify_metalabels')) |
|
376 | 376 | c.visual.page_size = safe_int(settings.get('dashboard_items', 100)) |
|
377 | 377 | c.visual.admin_grid_items = safe_int(settings.get('admin_grid_items', 100)) |
|
378 | 378 | c.visual.repository_fields = asbool(settings.get('repository_fields')) |
|
379 | 379 | c.visual.show_version = asbool(settings.get('show_version')) |
|
380 | 380 | c.visual.use_gravatar = asbool(settings.get('use_gravatar')) |
|
381 | 381 | c.visual.gravatar_url = settings.get('gravatar_url') |
|
382 | 382 | |
|
383 | 383 | c.ga_code = settings.get('ga_code') |
|
384 | 384 | # TODO: replace undocumented backwards compatibility hack with db upgrade and rename ga_code |
|
385 | 385 | if c.ga_code and '<' not in c.ga_code: |
|
386 | 386 | c.ga_code = '''<script type="text/javascript"> |
|
387 | 387 | var _gaq = _gaq || []; |
|
388 | 388 | _gaq.push(['_setAccount', '%s']); |
|
389 | 389 | _gaq.push(['_trackPageview']); |
|
390 | 390 | |
|
391 | 391 | (function() { |
|
392 | 392 | var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; |
|
393 | 393 | ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; |
|
394 | 394 | var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); |
|
395 | 395 | })(); |
|
396 | 396 | </script>''' % c.ga_code |
|
397 | 397 | c.site_name = settings.get('title') |
|
398 | 398 | c.clone_uri_tmpl = settings.get('clone_uri_tmpl') or db.Repository.DEFAULT_CLONE_URI |
|
399 | 399 | c.clone_ssh_tmpl = settings.get('clone_ssh_tmpl') or db.Repository.DEFAULT_CLONE_SSH |
|
400 | 400 | |
|
401 | 401 | ## INI stored |
|
402 | 402 | c.visual.allow_repo_location_change = asbool(config.get('allow_repo_location_change', True)) |
|
403 | 403 | c.visual.allow_custom_hooks_settings = asbool(config.get('allow_custom_hooks_settings', True)) |
|
404 | 404 | c.ssh_enabled = asbool(config.get('ssh_enabled', False)) |
|
405 | 405 | |
|
406 | 406 | c.instance_id = config.get('instance_id') |
|
407 | 407 | c.issues_url = config.get('bugtracker', url('issues_url')) |
|
408 | 408 | # END CONFIG VARS |
|
409 | 409 | |
|
410 | 410 | c.repo_name = get_repo_slug(request) # can be empty |
|
411 | 411 | c.backends = list(kallithea.BACKENDS) |
|
412 | 412 | |
|
413 | 413 | self.cut_off_limit = safe_int(config.get('cut_off_limit')) |
|
414 | 414 | |
|
415 | 415 | c.my_pr_count = db.PullRequest.query(reviewer_id=request.authuser.user_id, include_closed=False).count() |
|
416 | 416 | |
|
417 | 417 | self.scm_model = ScmModel() |
|
418 | 418 | |
|
419 | 419 | @staticmethod |
|
420 | 420 | def _determine_auth_user(session_authuser, ip_addr): |
|
421 | 421 | """ |
|
422 | 422 | Create an `AuthUser` object given the API key/bearer token |
|
423 | 423 | (if any) and the value of the authuser session cookie. |
|
424 | 424 | Returns None if no valid user is found (e.g. not active or no access from this IP).
|
425 | 425 | """ |
|
426 | 426 | |
|
427 | 427 | # Authenticate by session cookie |
|
428 | 428 | # In ancient login sessions, 'authuser' may not be a dict. |
|
429 | 429 | # In that case, the user will have to log in again. |
|
430 | 430 | # v0.3 and earlier included an 'is_authenticated' key; if present, |
|
431 | 431 | # this must be True. |
|
432 | 432 | if isinstance(session_authuser, dict) and session_authuser.get('is_authenticated', True): |
|
433 | 433 | return AuthUser.from_cookie(session_authuser, ip_addr=ip_addr) |
|
434 | 434 | |
|
435 | 435 | # Authenticate by auth_container plugin (if enabled) |
|
436 | 436 | if any( |
|
437 | 437 | plugin.is_container_auth |
|
438 | 438 | for plugin in auth_modules.get_auth_plugins() |
|
439 | 439 | ): |
|
440 | 440 | try: |
|
441 | 441 | user_info = auth_modules.authenticate('', '', request.environ) |
|
442 | 442 | except UserCreationError as e: |
|
443 | 443 | webutils.flash(e, 'error', logf=log.error) |
|
444 | 444 | else: |
|
445 | 445 | if user_info is not None: |
|
446 | 446 | username = user_info['username'] |
|
447 | 447 | user = db.User.get_by_username(username, case_insensitive=True) |
|
448 | 448 | return log_in_user(user, remember=False, is_external_auth=True, ip_addr=ip_addr) |
|
449 | 449 | |
|
450 | 450 | # User is default user (if active) or anonymous |
|
451 | 451 | default_user = db.User.get_default_user() |
|
452 | 452 | authuser = AuthUser.make(dbuser=default_user, ip_addr=ip_addr) |
|
453 | 453 | if authuser is None: # fall back to anonymous |
|
454 | 454 | authuser = AuthUser(dbuser=default_user) # TODO: somehow use .make? |
|
455 | 455 | return authuser |
|
456 | 456 | |
|
457 | 457 | @staticmethod |
|
458 | 458 | def _basic_security_checks(): |
|
459 | 459 | """Perform basic security/sanity checks before processing the request.""" |
|
460 | 460 | |
|
461 | 461 | # Only allow the following HTTP request methods. |
|
462 | 462 | if request.method not in ['GET', 'HEAD', 'POST']: |
|
463 | 463 | raise webob.exc.HTTPMethodNotAllowed() |
|
464 | 464 | |
|
465 | 465 | # Also verify the _method override - no longer allowed. |
|
466 | 466 | if request.params.get('_method') is None: |
|
467 | 467 | pass # no override, no problem |
|
468 | 468 | else: |
|
469 | 469 | raise webob.exc.HTTPMethodNotAllowed() |
|
470 | 470 | |
|
471 | 471 | # Make sure CSRF token never appears in the URL. If so, invalidate it. |
|
472 | 472 | if webutils.session_csrf_secret_name in request.GET: |
|
473 | 473 | log.error('CSRF key leak detected') |
|
474 | 474 | session.pop(webutils.session_csrf_secret_name, None) |
|
475 | 475 | session.save() |
|
476 | 476 | webutils.flash(_('CSRF token leak has been detected - all form tokens have been expired'), |
|
477 | 477 | category='error') |
|
478 | 478 | |
|
479 | 479 | # WebOb already ignores request payload parameters for anything other |
|
480 | 480 | # than POST/PUT, but double-check since other Kallithea code relies on |
|
481 | 481 | # this assumption. |
|
482 | 482 | if request.method not in ['POST', 'PUT'] and request.POST: |
|
483 | 483 | log.error('%r request with payload parameters; WebOb should have stopped this', request.method) |
|
484 | 484 | raise webob.exc.HTTPBadRequest() |
|
485 | 485 | |
|
486 | 486 | def __call__(self, environ, context): |
|
487 | 487 | try: |
|
488 | 488 | ip_addr = get_ip_addr(environ) |
|
489 | 489 | self._basic_security_checks() |
|
490 | 490 | |
|
491 | 491 | api_key = request.GET.get('api_key') |
|
492 | 492 | try: |
|
493 | 493 | # Request.authorization may raise ValueError on invalid input |
|
494 | 494 | type, params = request.authorization |
|
495 | 495 | except (ValueError, TypeError): |
|
496 | 496 | pass |
|
497 | 497 | else: |
|
498 | 498 | if type.lower() == 'bearer': |
|
499 | 499 | api_key = params # bearer token is an api key too |
|
500 | 500 | |
|
501 | 501 | if api_key is None: |
|
502 | 502 | authuser = self._determine_auth_user( |
|
503 | 503 | session.get('authuser'), |
|
504 | 504 | ip_addr=ip_addr, |
|
505 | 505 | ) |
|
506 | 506 | needs_csrf_check = request.method not in ['GET', 'HEAD'] |
|
507 | 507 | |
|
508 | 508 | else: |
|
509 | 509 | dbuser = db.User.get_by_api_key(api_key) |
|
510 | 510 | if dbuser is None: |
|
511 | 511 | log.info('No db user found for authentication with API key ****%s from %s', |
|
512 | 512 | api_key[-4:], ip_addr) |
|
513 | 513 | authuser = AuthUser.make(dbuser=dbuser, is_external_auth=True, ip_addr=ip_addr) |
|
514 | 514 | needs_csrf_check = False # API key provides CSRF protection |
|
515 | 515 | |
|
516 | 516 | if authuser is None: |
|
517 | 517 | log.info('No valid user found') |
|
518 | 518 | raise webob.exc.HTTPForbidden() |
|
519 | 519 | |
|
520 | 520 | # set globals for auth user |
|
521 | 521 | request.authuser = authuser |
|
522 | 522 | request.ip_addr = ip_addr |
|
523 | 523 | request.needs_csrf_check = needs_csrf_check |
|
524 | 524 | |
|
525 | 525 | log.info('IP: %s User: %s Request: %s', |
|
526 | 526 | request.ip_addr, request.authuser, |
|
527 | 527 | get_path_info(environ), |
|
528 | 528 | ) |
|
529 | 529 | return super(BaseController, self).__call__(environ, context) |
|
530 | 530 | except webob.exc.HTTPException as e: |
|
531 | 531 | return e |
|
532 | 532 | |
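As a side note on the authentication branch in ``__call__`` above: a request can carry the API key either as an ``api_key`` query parameter or as a ``Bearer`` token in the ``Authorization`` header, and either form disables the CSRF check. A minimal client sketch, not Kallithea's documented API usage; the URL is hypothetical::

    # Two equivalent ways to authenticate with an API key, mirroring the
    # handling in BaseController.__call__ above (URL is a placeholder).
    import requests

    url = 'https://kallithea.example.com/some/repo/changelog'

    # 1) API key as a query parameter
    requests.get(url, params={'api_key': 'SECRET'})

    # 2) API key as a bearer token; __call__ treats it as an api_key too
    requests.get(url, headers={'Authorization': 'Bearer SECRET'})
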
|
533 | 533 | |
|
534 | 534 | class BaseRepoController(BaseController): |
|
535 | 535 | """ |
|
536 | 536 | Base class for controllers responsible for loading all needed data for |
|
537 | 537 | a repository. Loaded items are:
|
538 | 538 | |
|
539 | 539 | c.db_repo_scm_instance: instance of scm repository |
|
540 | 540 | c.db_repo: database (db.Repository) instance
|
541 | 541 | c.repository_followers: number of followers |
|
542 | 542 | c.repository_forks: number of forks |
|
543 | 543 | c.repository_following: whether the current user is following the current repo
|
544 | 544 | """ |
|
545 | 545 | |
|
546 | 546 | def _before(self, *args, **kwargs): |
|
547 | 547 | super(BaseRepoController, self)._before(*args, **kwargs) |
|
548 | 548 | if c.repo_name: # extracted from request by base-base BaseController._before |
|
549 | 549 | _dbr = db.Repository.get_by_repo_name(c.repo_name) |
|
550 | 550 | if not _dbr: |
|
551 | 551 | return |
|
552 | 552 | |
|
553 | 553 | log.debug('Found repository in database %s with state `%s`', |
|
554 | 554 | _dbr, _dbr.repo_state) |
|
555 | 555 | route = getattr(request.environ.get('routes.route'), 'name', '') |
|
556 | 556 | |
|
557 | 557 | # allow deleting repos that are somehow damaged in the filesystem
|
558 | 558 | if route in ['delete_repo']: |
|
559 | 559 | return |
|
560 | 560 | |
|
561 | 561 | if _dbr.repo_state in [db.Repository.STATE_PENDING]: |
|
562 | 562 | if route in ['repo_creating_home']: |
|
563 | 563 | return |
|
564 | 564 | check_url = url('repo_creating_home', repo_name=c.repo_name) |
|
565 | 565 | raise webob.exc.HTTPFound(location=check_url) |
|
566 | 566 | |
|
567 | 567 | dbr = c.db_repo = _dbr |
|
568 | 568 | c.db_repo_scm_instance = c.db_repo.scm_instance |
|
569 | 569 | if c.db_repo_scm_instance is None: |
|
570 | 570 | log.error('%s this repository is present in database but it ' |
|
571 | 571 | 'cannot be created as an scm instance', c.repo_name) |
|
572 | 572 | webutils.flash(_('Repository not found in the filesystem'), |
|
573 | 573 | category='error') |
|
574 | 574 | raise webob.exc.HTTPNotFound() |
|
575 | 575 | |
|
576 | 576 | # some global counters for the menu
|
577 | 577 | c.repository_followers = self.scm_model.get_followers(dbr) |
|
578 | 578 | c.repository_forks = self.scm_model.get_forks(dbr) |
|
579 | 579 | c.repository_pull_requests = self.scm_model.get_pull_requests(dbr) |
|
580 | 580 | c.repository_following = self.scm_model.is_following_repo( |
|
581 | 581 | c.repo_name, request.authuser.user_id) |
|
582 | 582 | |
|
583 | 583 | @staticmethod |
|
584 | 584 | def _get_ref_rev(repo, ref_type, ref_name, returnempty=False): |
|
585 | 585 | """ |
|
586 | 586 | Safe way to get a changeset. If an error occurs, show an error message.
|
587 | 587 | """ |
|
588 | 588 | try: |
|
589 | 589 | return repo.scm_instance.get_ref_revision(ref_type, ref_name) |
|
590 | 590 | except EmptyRepositoryError as e: |
|
591 | 591 | if returnempty: |
|
592 | 592 | return repo.scm_instance.EMPTY_CHANGESET |
|
593 | 593 | webutils.flash(_('There are no changesets yet'), category='error') |
|
594 | 594 | raise webob.exc.HTTPNotFound() |
|
595 | 595 | except ChangesetDoesNotExistError as e: |
|
596 | 596 | webutils.flash(_('Changeset for %s %s not found in %s') % |
|
597 | 597 | (ref_type, ref_name, repo.repo_name), |
|
598 | 598 | category='error') |
|
599 | 599 | raise webob.exc.HTTPNotFound() |
|
600 | 600 | except RepositoryError as e: |
|
601 | 601 | log.error(traceback.format_exc()) |
|
602 | 602 | webutils.flash(e, category='error') |
|
603 | 603 | raise webob.exc.HTTPBadRequest() |
|
604 | 604 | |
|
605 | 605 | |
|
606 | 606 | @decorator.decorator |
|
607 | 607 | def jsonify(func, *args, **kwargs): |
|
608 | 608 | """Action decorator that formats output for JSON |
|
609 | 609 | |
|
610 | 610 | Given a function that will return content, this decorator will turn |
|
611 | 611 | the result into JSON, with a content-type of 'application/json' and |
|
612 | 612 | output it. |
|
613 | 613 | """ |
|
614 | 614 | response.headers['Content-Type'] = 'application/json; charset=utf-8' |
|
615 | 615 | data = func(*args, **kwargs) |
|
616 | 616 | if isinstance(data, (list, tuple)): |
|
617 | 617 | # A JSON list response is syntactically valid JavaScript and can be |
|
618 | 618 | # loaded and executed as JavaScript by a malicious third-party site |
|
619 | 619 | # using <script>, which can lead to cross-site data leaks. |
|
620 | 620 | # JSON responses should therefore be scalars or objects (i.e. Python |
|
621 | 621 | # dicts), because a JSON object is a syntax error if interpreted as JS.
|
622 | 622 | msg = "JSON responses with Array envelopes are susceptible to " \ |
|
623 | 623 | "cross-site data leak attacks, see " \ |
|
624 | 624 | "https://web.archive.org/web/20120519231904/http://wiki.pylonshq.com/display/pylonsfaq/Warnings" |
|
625 | 625 | warnings.warn(msg, Warning, 2) |
|
626 | 626 | log.warning(msg) |
|
627 | 627 | log.debug("Returning JSON wrapped action output") |
|
628 | 628 | return ascii_bytes(ext_json.dumps(data)) |
|
629 | 629 | |
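The warning in ``jsonify`` above is the reason controller actions should return a dict envelope rather than a bare list. A hedged usage sketch, modeled on the ``changeset_children`` action that appears further below::

    # Controller action returning a dict envelope, as base.jsonify expects;
    # returning a bare list would trigger the cross-site data leak warning.
    @base.jsonify
    def changeset_children(self, repo_name, revision):
        children = ['abc123', 'def456']        # placeholder data
        return {"results": children}           # object envelope, not a bare array
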
|
630 | 630 | @decorator.decorator |
|
631 | 631 | def IfSshEnabled(func, *args, **kwargs): |
|
632 | 632 | """Decorator for functions that can only be called if SSH access is enabled. |
|
633 | 633 | |
|
634 | 634 | If SSH access is disabled in the configuration file, HTTPNotFound is raised. |
|
635 | 635 | """ |
|
636 | 636 | if not c.ssh_enabled: |
|
637 | 637 | webutils.flash(_("SSH access is disabled."), category='warning') |
|
638 | 638 | raise webob.exc.HTTPNotFound() |
|
639 | 639 | return func(*args, **kwargs) |
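``IfSshEnabled`` guards whole controller actions. A minimal sketch of how such a guard is applied; the action name and template path are placeholders, not existing Kallithea code::

    # Hypothetical SSH-related action guarded by the decorator above; when
    # ssh_enabled is False in the .ini, the user gets a flash message and a 404.
    @base.IfSshEnabled
    def my_ssh_keys(self):
        return base.render('my_ssh_keys.html')  # placeholder template
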
@@ -1,157 +1,157 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.changelog |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | changelog controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 21, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | from tg import request, session |
|
32 | 32 | from tg import tmpl_context as c |
|
33 | 33 | from tg.i18n import ugettext as _ |
|
34 | 34 | from webob.exc import HTTPBadRequest, HTTPFound, HTTPNotFound |
|
35 | 35 | |
|
36 | from kallithea.controllers import base | |
|
36 | 37 | from kallithea.lib import webutils |
|
37 | 38 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
38 | from kallithea.lib.base import BaseRepoController, render | |
|
39 | 39 | from kallithea.lib.graphmod import graph_data |
|
40 | 40 | from kallithea.lib.page import Page |
|
41 | 41 | from kallithea.lib.utils2 import safe_int |
|
42 | 42 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, ChangesetError, EmptyRepositoryError, NodeDoesNotExistError, RepositoryError |
|
43 | 43 | from kallithea.lib.webutils import url |
|
44 | 44 | |
|
45 | 45 | |
|
46 | 46 | log = logging.getLogger(__name__) |
|
47 | 47 | |
|
48 | 48 | |
|
49 | class ChangelogController(BaseRepoController): | |
|
49 | class ChangelogController(base.BaseRepoController): | |
|
50 | 50 | |
|
51 | 51 | def _before(self, *args, **kwargs): |
|
52 | 52 | super(ChangelogController, self)._before(*args, **kwargs) |
|
53 | 53 | c.affected_files_cut_off = 60 |
|
54 | 54 | |
|
55 | 55 | @staticmethod |
|
56 | 56 | def __get_cs(rev, repo): |
|
57 | 57 | """ |
|
58 | 58 | Safe way to get changeset. If error occur fail with error message. |
|
59 | 59 | |
|
60 | 60 | :param rev: revision to fetch |
|
61 | 61 | :param repo: repo instance |
|
62 | 62 | """ |
|
63 | 63 | |
|
64 | 64 | try: |
|
65 | 65 | return c.db_repo_scm_instance.get_changeset(rev) |
|
66 | 66 | except EmptyRepositoryError as e: |
|
67 | 67 | webutils.flash(_('There are no changesets yet'), category='error') |
|
68 | 68 | except RepositoryError as e: |
|
69 | 69 | log.error(traceback.format_exc()) |
|
70 | 70 | webutils.flash(e, category='error') |
|
71 | 71 | raise HTTPBadRequest() |
|
72 | 72 | |
|
73 | 73 | @LoginRequired(allow_default_user=True) |
|
74 | 74 | @HasRepoPermissionLevelDecorator('read') |
|
75 | 75 | def index(self, repo_name, revision=None, f_path=None): |
|
76 | 76 | limit = 2000 |
|
77 | 77 | default = 100 |
|
78 | 78 | if request.GET.get('size'): |
|
79 | 79 | c.size = max(min(safe_int(request.GET.get('size')), limit), 1) |
|
80 | 80 | session['changelog_size'] = c.size |
|
81 | 81 | session.save() |
|
82 | 82 | else: |
|
83 | 83 | c.size = int(session.get('changelog_size', default)) |
|
84 | 84 | # min size must be 1 |
|
85 | 85 | c.size = max(c.size, 1) |
|
86 | 86 | p = safe_int(request.GET.get('page'), 1) |
|
87 | 87 | branch_name = request.GET.get('branch', None) |
|
88 | 88 | if (branch_name and |
|
89 | 89 | branch_name not in c.db_repo_scm_instance.branches and |
|
90 | 90 | branch_name not in c.db_repo_scm_instance.closed_branches and |
|
91 | 91 | not revision |
|
92 | 92 | ): |
|
93 | 93 | raise HTTPFound(location=url('changelog_file_home', repo_name=c.repo_name, |
|
94 | 94 | revision=branch_name, f_path=f_path or '')) |
|
95 | 95 | |
|
96 | 96 | if revision == 'tip': |
|
97 | 97 | revision = None |
|
98 | 98 | |
|
99 | 99 | c.changelog_for_path = f_path |
|
100 | 100 | try: |
|
101 | 101 | |
|
102 | 102 | if f_path: |
|
103 | 103 | log.debug('generating changelog for path %s', f_path) |
|
104 | 104 | # get the history for the file ! |
|
105 | 105 | tip_cs = c.db_repo_scm_instance.get_changeset() |
|
106 | 106 | try: |
|
107 | 107 | collection = tip_cs.get_file_history(f_path) |
|
108 | 108 | except (NodeDoesNotExistError, ChangesetError): |
|
109 | 109 | # this node is not present at tip ! |
|
110 | 110 | try: |
|
111 | 111 | cs = self.__get_cs(revision, repo_name) |
|
112 | 112 | collection = cs.get_file_history(f_path) |
|
113 | 113 | except RepositoryError as e: |
|
114 | 114 | webutils.flash(e, category='warning') |
|
115 | 115 | raise HTTPFound(location=webutils.url('changelog_home', repo_name=repo_name)) |
|
116 | 116 | else: |
|
117 | 117 | collection = c.db_repo_scm_instance.get_changesets(start=0, end=revision, |
|
118 | 118 | branch_name=branch_name, reverse=True) |
|
119 | 119 | c.total_cs = len(collection) |
|
120 | 120 | |
|
121 | 121 | c.cs_pagination = Page(collection, page=p, item_count=c.total_cs, items_per_page=c.size, |
|
122 | 122 | branch=branch_name) |
|
123 | 123 | |
|
124 | 124 | page_revisions = [x.raw_id for x in c.cs_pagination] |
|
125 | 125 | c.cs_comments = c.db_repo.get_comments(page_revisions) |
|
126 | 126 | c.cs_statuses = c.db_repo.statuses(page_revisions) |
|
127 | 127 | except EmptyRepositoryError as e: |
|
128 | 128 | webutils.flash(e, category='warning') |
|
129 | 129 | raise HTTPFound(location=url('summary_home', repo_name=c.repo_name)) |
|
130 | 130 | except (RepositoryError, ChangesetDoesNotExistError, Exception) as e: |
|
131 | 131 | log.error(traceback.format_exc()) |
|
132 | 132 | webutils.flash(e, category='error') |
|
133 | 133 | raise HTTPFound(location=url('changelog_home', repo_name=c.repo_name)) |
|
134 | 134 | |
|
135 | 135 | c.branch_name = branch_name |
|
136 | 136 | c.branch_filters = [('', _('None'))] + \ |
|
137 | 137 | [(k, k) for k in c.db_repo_scm_instance.branches] |
|
138 | 138 | if c.db_repo_scm_instance.closed_branches: |
|
139 | 139 | prefix = _('(closed)') + ' ' |
|
140 | 140 | c.branch_filters += [('-', '-')] + \ |
|
141 | 141 | [(k, prefix + k) for k in c.db_repo_scm_instance.closed_branches] |
|
142 | 142 | revs = [] |
|
143 | 143 | if not f_path: |
|
144 | 144 | revs = [x.revision for x in c.cs_pagination] |
|
145 | 145 | c.jsdata = graph_data(c.db_repo_scm_instance, revs) |
|
146 | 146 | |
|
147 | 147 | c.revision = revision # requested revision ref |
|
148 | 148 | c.first_revision = c.cs_pagination[0] # pagination is never empty here! |
|
149 | return render('changelog/changelog.html') | |
|
149 | return base.render('changelog/changelog.html') | |
|
150 | 150 | |
|
151 | 151 | @LoginRequired(allow_default_user=True) |
|
152 | 152 | @HasRepoPermissionLevelDecorator('read') |
|
153 | 153 | def changelog_details(self, cs): |
|
154 | 154 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
155 | 155 | c.cs = c.db_repo_scm_instance.get_changeset(cs) |
|
156 | return render('changelog/changelog_details.html') | |
|
156 | return base.render('changelog/changelog_details.html') | |
|
157 | 157 | raise HTTPNotFound() |
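For reference, the paging logic in ``index`` above clamps the requested page size to the 1..2000 range and remembers it in the session. A small standalone sketch of that clamping, assuming the same fallback behaviour as ``safe_int``::

    # Sketch of the size clamping used in ChangelogController.index above:
    # values outside 1..2000 are pulled back into range, and a missing size
    # falls back to the remembered session value or the default.
    limit, default = 2000, 100

    def clamp_size(raw, session_value=None):
        if raw is not None:
            return max(min(int(raw), limit), 1)            # user-supplied size
        return max(int(session_value or default), 1)       # remembered or default

    assert clamp_size('5000') == 2000
    assert clamp_size(None, session_value=7) == 7
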
@@ -1,370 +1,370 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.changeset |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | changeset controller showing changes between revisions |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 25, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import binascii |
|
29 | 29 | import logging |
|
30 | 30 | import traceback |
|
31 | 31 | from collections import OrderedDict |
|
32 | 32 | |
|
33 | 33 | from tg import request, response |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPNotFound |
|
37 | 37 | |
|
38 | 38 | import kallithea.lib.helpers as h |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib import auth, diffs, webutils |
|
40 | 41 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
41 | from kallithea.lib.base import BaseRepoController, jsonify, render | |
|
42 | 42 | from kallithea.lib.graphmod import graph_data |
|
43 | 43 | from kallithea.lib.utils2 import ascii_str, safe_str |
|
44 | 44 | from kallithea.lib.vcs.backends.base import EmptyChangeset |
|
45 | 45 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError, RepositoryError |
|
46 | 46 | from kallithea.model import db, meta, userlog |
|
47 | 47 | from kallithea.model.changeset_status import ChangesetStatusModel |
|
48 | 48 | from kallithea.model.comment import ChangesetCommentsModel |
|
49 | 49 | from kallithea.model.pull_request import PullRequestModel |
|
50 | 50 | |
|
51 | 51 | |
|
52 | 52 | log = logging.getLogger(__name__) |
|
53 | 53 | |
|
54 | 54 | |
|
55 | 55 | def create_cs_pr_comment(repo_name, revision=None, pull_request=None, allowed_to_change_status=True): |
|
56 | 56 | """ |
|
57 | 57 | Add a comment to the specified changeset or pull request, using POST values |
|
58 | 58 | from the request. |
|
59 | 59 | |
|
60 | 60 | Comments can be inline (when a file path and line number is specified in |
|
61 | 61 | POST) or general comments. |
|
62 | 62 | A comment can be accompanied by a review status change (accepted, rejected, |
|
63 | 63 | etc.). Pull requests can be closed or deleted. |
|
64 | 64 | |
|
65 | 65 | Parameter 'allowed_to_change_status' is used for both status changes and |
|
66 | 66 | closing of pull requests. For deleting pull requests, more specific
|
67 | 67 | checks are done. |
|
68 | 68 | """ |
|
69 | 69 | |
|
70 | 70 | assert request.environ.get('HTTP_X_PARTIAL_XHR') |
|
71 | 71 | if pull_request: |
|
72 | 72 | pull_request_id = pull_request.pull_request_id |
|
73 | 73 | else: |
|
74 | 74 | pull_request_id = None |
|
75 | 75 | |
|
76 | 76 | status = request.POST.get('changeset_status') |
|
77 | 77 | close_pr = request.POST.get('save_close') |
|
78 | 78 | delete = request.POST.get('save_delete') |
|
79 | 79 | f_path = request.POST.get('f_path') |
|
80 | 80 | line_no = request.POST.get('line') |
|
81 | 81 | |
|
82 | 82 | if (status or close_pr or delete) and (f_path or line_no): |
|
83 | 83 | # status votes and closing are only possible in general comments
|
84 | 84 | raise HTTPBadRequest() |
|
85 | 85 | |
|
86 | 86 | if not allowed_to_change_status: |
|
87 | 87 | if status or close_pr: |
|
88 | 88 | webutils.flash(_('No permission to change status'), 'error') |
|
89 | 89 | raise HTTPForbidden() |
|
90 | 90 | |
|
91 | 91 | if pull_request and delete == "delete": |
|
92 | 92 | if (pull_request.owner_id == request.authuser.user_id or |
|
93 | 93 | auth.HasPermissionAny('hg.admin')() or |
|
94 | 94 | auth.HasRepoPermissionLevel('admin')(pull_request.org_repo.repo_name) or |
|
95 | 95 | auth.HasRepoPermissionLevel('admin')(pull_request.other_repo.repo_name) |
|
96 | 96 | ) and not pull_request.is_closed(): |
|
97 | 97 | PullRequestModel().delete(pull_request) |
|
98 | 98 | meta.Session().commit() |
|
99 | 99 | webutils.flash(_('Successfully deleted pull request %s') % pull_request_id, |
|
100 | 100 | category='success') |
|
101 | 101 | return { |
|
102 | 102 | 'location': webutils.url('my_pullrequests'), # or repo pr list? |
|
103 | 103 | } |
|
104 | 104 | raise HTTPForbidden() |
|
105 | 105 | |
|
106 | 106 | text = request.POST.get('text', '').strip() |
|
107 | 107 | |
|
108 | 108 | comment = ChangesetCommentsModel().create( |
|
109 | 109 | text=text, |
|
110 | 110 | repo=c.db_repo.repo_id, |
|
111 | 111 | author=request.authuser.user_id, |
|
112 | 112 | revision=revision, |
|
113 | 113 | pull_request=pull_request_id, |
|
114 | 114 | f_path=f_path or None, |
|
115 | 115 | line_no=line_no or None, |
|
116 | 116 | status_change=db.ChangesetStatus.get_status_lbl(status) if status else None, |
|
117 | 117 | closing_pr=close_pr, |
|
118 | 118 | ) |
|
119 | 119 | |
|
120 | 120 | if status: |
|
121 | 121 | ChangesetStatusModel().set_status( |
|
122 | 122 | c.db_repo.repo_id, |
|
123 | 123 | status, |
|
124 | 124 | request.authuser.user_id, |
|
125 | 125 | comment, |
|
126 | 126 | revision=revision, |
|
127 | 127 | pull_request=pull_request_id, |
|
128 | 128 | ) |
|
129 | 129 | |
|
130 | 130 | if pull_request: |
|
131 | 131 | action = 'user_commented_pull_request:%s' % pull_request_id |
|
132 | 132 | else: |
|
133 | 133 | action = 'user_commented_revision:%s' % revision |
|
134 | 134 | userlog.action_logger(request.authuser, action, c.db_repo, request.ip_addr) |
|
135 | 135 | |
|
136 | 136 | if pull_request and close_pr: |
|
137 | 137 | PullRequestModel().close_pull_request(pull_request_id) |
|
138 | 138 | userlog.action_logger(request.authuser, |
|
139 | 139 | 'user_closed_pull_request:%s' % pull_request_id, |
|
140 | 140 | c.db_repo, request.ip_addr) |
|
141 | 141 | |
|
142 | 142 | meta.Session().commit() |
|
143 | 143 | |
|
144 | 144 | data = { |
|
145 | 145 | 'target_id': webutils.safeid(request.POST.get('f_path')), |
|
146 | 146 | } |
|
147 | 147 | if comment is not None: |
|
148 | 148 | c.comment = comment |
|
149 | 149 | data.update(comment.get_dict()) |
|
150 | 150 | data.update({'rendered_text': |
|
151 | render('changeset/changeset_comment_block.html')}) | |
|
151 | base.render('changeset/changeset_comment_block.html')}) | |
|
152 | 152 | |
|
153 | 153 | return data |
|
154 | 154 | |
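Putting the POST fields documented above together: a comment request is a partial-XHR POST whose form fields select between a general comment, an inline comment (``f_path`` plus ``line``) and a status change or close/delete. A hedged client sketch: the URL is hypothetical, while the field names come from ``create_cs_pr_comment`` above::

    import requests

    # Hypothetical comment endpoint; the form field names match the ones read
    # by create_cs_pr_comment above.
    url = 'https://kallithea.example.com/some/repo/changeset-comment/abc123'
    requests.post(
        url,
        headers={'X-Partial-Xhr': '1'},   # becomes HTTP_X_PARTIAL_XHR in the WSGI environ
        data={
            'text': 'Looks good to me',
            'changeset_status': 'approved',  # status value is illustrative
            # For an inline comment, send 'f_path' and 'line' instead
            # (not allowed together with a status change or close/delete).
        },
    )
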
|
155 | 155 | def delete_cs_pr_comment(repo_name, comment_id): |
|
156 | 156 | """Delete a comment from a changeset or pull request""" |
|
157 | 157 | co = db.ChangesetComment.get_or_404(comment_id) |
|
158 | 158 | if co.repo.repo_name != repo_name: |
|
159 | 159 | raise HTTPNotFound() |
|
160 | 160 | if co.pull_request and co.pull_request.is_closed(): |
|
161 | 161 | # don't allow deleting comments on closed pull request |
|
162 | 162 | raise HTTPForbidden() |
|
163 | 163 | |
|
164 | 164 | owner = co.author_id == request.authuser.user_id |
|
165 | 165 | repo_admin = auth.HasRepoPermissionLevel('admin')(repo_name) |
|
166 | 166 | if auth.HasPermissionAny('hg.admin')() or repo_admin or owner: |
|
167 | 167 | ChangesetCommentsModel().delete(comment=co) |
|
168 | 168 | meta.Session().commit() |
|
169 | 169 | return True |
|
170 | 170 | else: |
|
171 | 171 | raise HTTPForbidden() |
|
172 | 172 | |
|
173 | class ChangesetController(BaseRepoController): | |
|
173 | class ChangesetController(base.BaseRepoController): | |
|
174 | 174 | |
|
175 | 175 | def _before(self, *args, **kwargs): |
|
176 | 176 | super(ChangesetController, self)._before(*args, **kwargs) |
|
177 | 177 | c.affected_files_cut_off = 60 |
|
178 | 178 | |
|
179 | 179 | def _index(self, revision, method): |
|
180 | 180 | c.pull_request = None |
|
181 | 181 | c.fulldiff = request.GET.get('fulldiff') # for reporting number of changed files |
|
182 | 182 | # get ranges of revisions if present
|
183 | 183 | rev_range = revision.split('...')[:2] |
|
184 | 184 | c.cs_repo = c.db_repo |
|
185 | 185 | try: |
|
186 | 186 | if len(rev_range) == 2: |
|
187 | 187 | rev_start = rev_range[0] |
|
188 | 188 | rev_end = rev_range[1] |
|
189 | 189 | rev_ranges = c.db_repo_scm_instance.get_changesets(start=rev_start, |
|
190 | 190 | end=rev_end) |
|
191 | 191 | else: |
|
192 | 192 | rev_ranges = [c.db_repo_scm_instance.get_changeset(revision)] |
|
193 | 193 | |
|
194 | 194 | c.cs_ranges = list(rev_ranges) |
|
195 | 195 | if not c.cs_ranges: |
|
196 | 196 | raise RepositoryError('Changeset range returned empty result') |
|
197 | 197 | |
|
198 | 198 | except (ChangesetDoesNotExistError, EmptyRepositoryError): |
|
199 | 199 | log.debug(traceback.format_exc()) |
|
200 | 200 | msg = _('Such revision does not exist for this repository') |
|
201 | 201 | webutils.flash(msg, category='error') |
|
202 | 202 | raise HTTPNotFound() |
|
203 | 203 | |
|
204 | 204 | c.changes = OrderedDict() |
|
205 | 205 | |
|
206 | 206 | c.lines_added = 0 # count of lines added |
|
207 | 207 | c.lines_deleted = 0 # count of lines removed
|
208 | 208 | |
|
209 | 209 | c.changeset_statuses = db.ChangesetStatus.STATUSES |
|
210 | 210 | comments = dict() |
|
211 | 211 | c.statuses = [] |
|
212 | 212 | c.inline_comments = [] |
|
213 | 213 | c.inline_cnt = 0 |
|
214 | 214 | |
|
215 | 215 | # Iterate over ranges (default changeset view is always one changeset) |
|
216 | 216 | for changeset in c.cs_ranges: |
|
217 | 217 | if method == 'show': |
|
218 | 218 | c.statuses.extend([ChangesetStatusModel().get_status( |
|
219 | 219 | c.db_repo.repo_id, changeset.raw_id)]) |
|
220 | 220 | |
|
221 | 221 | # Changeset comments |
|
222 | 222 | comments.update((com.comment_id, com) |
|
223 | 223 | for com in ChangesetCommentsModel() |
|
224 | 224 | .get_comments(c.db_repo.repo_id, |
|
225 | 225 | revision=changeset.raw_id)) |
|
226 | 226 | |
|
227 | 227 | # Status change comments - mostly from pull requests |
|
228 | 228 | comments.update((st.comment_id, st.comment) |
|
229 | 229 | for st in ChangesetStatusModel() |
|
230 | 230 | .get_statuses(c.db_repo.repo_id, |
|
231 | 231 | changeset.raw_id, with_revisions=True) |
|
232 | 232 | if st.comment_id is not None) |
|
233 | 233 | |
|
234 | 234 | inlines = ChangesetCommentsModel() \ |
|
235 | 235 | .get_inline_comments(c.db_repo.repo_id, |
|
236 | 236 | revision=changeset.raw_id) |
|
237 | 237 | c.inline_comments.extend(inlines) |
|
238 | 238 | |
|
239 | 239 | cs2 = changeset.raw_id |
|
240 | 240 | cs1 = changeset.parents[0].raw_id if changeset.parents else EmptyChangeset().raw_id |
|
241 | 241 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
242 | 242 | diff_context_size = h.get_diff_context_size(request.GET) |
|
243 | 243 | raw_diff = diffs.get_diff(c.db_repo_scm_instance, cs1, cs2, |
|
244 | 244 | ignore_whitespace=ignore_whitespace_diff, context=diff_context_size) |
|
245 | 245 | diff_limit = None if c.fulldiff else self.cut_off_limit |
|
246 | 246 | file_diff_data = [] |
|
247 | 247 | if method == 'show': |
|
248 | 248 | diff_processor = diffs.DiffProcessor(raw_diff, |
|
249 | 249 | vcs=c.db_repo_scm_instance.alias, |
|
250 | 250 | diff_limit=diff_limit) |
|
251 | 251 | c.limited_diff = diff_processor.limited_diff |
|
252 | 252 | for f in diff_processor.parsed: |
|
253 | 253 | st = f['stats'] |
|
254 | 254 | c.lines_added += st['added'] |
|
255 | 255 | c.lines_deleted += st['deleted'] |
|
256 | 256 | filename = f['filename'] |
|
257 | 257 | fid = h.FID(changeset.raw_id, filename) |
|
258 | 258 | url_fid = h.FID('', filename) |
|
259 | 259 | html_diff = diffs.as_html(parsed_lines=[f]) |
|
260 | 260 | file_diff_data.append((fid, url_fid, f['operation'], f['old_filename'], filename, html_diff, st)) |
|
261 | 261 | else: |
|
262 | 262 | # downloads/raw we only need RAW diff nothing else |
|
263 | 263 | file_diff_data.append(('', None, None, None, raw_diff, None)) |
|
264 | 264 | c.changes[changeset.raw_id] = (cs1, cs2, file_diff_data) |
|
265 | 265 | |
|
266 | 266 | # sort comments in creation order |
|
267 | 267 | c.comments = [com for com_id, com in sorted(comments.items())] |
|
268 | 268 | |
|
269 | 269 | # count inline comments |
|
270 | 270 | for __, lines in c.inline_comments: |
|
271 | 271 | for comments in lines.values(): |
|
272 | 272 | c.inline_cnt += len(comments) |
|
273 | 273 | |
|
274 | 274 | if len(c.cs_ranges) == 1: |
|
275 | 275 | c.changeset = c.cs_ranges[0] |
|
276 | 276 | c.parent_tmpl = ''.join(['# Parent %s\n' % x.raw_id |
|
277 | 277 | for x in c.changeset.parents]) |
|
278 | 278 | c.changeset_graft_source_hash = ascii_str(c.changeset.extra.get(b'source', b'')) |
|
279 | 279 | c.changeset_transplant_source_hash = ascii_str(binascii.hexlify(c.changeset.extra.get(b'transplant_source', b''))) |
|
280 | 280 | if method == 'download': |
|
281 | 281 | response.content_type = 'text/plain' |
|
282 | 282 | response.content_disposition = 'attachment; filename=%s.diff' \ |
|
283 | 283 | % revision[:12] |
|
284 | 284 | return raw_diff |
|
285 | 285 | elif method == 'patch': |
|
286 | 286 | response.content_type = 'text/plain' |
|
287 | 287 | c.diff = safe_str(raw_diff) |
|
288 | return render('changeset/patch_changeset.html') | |
|
288 | return base.render('changeset/patch_changeset.html') | |
|
289 | 289 | elif method == 'raw': |
|
290 | 290 | response.content_type = 'text/plain' |
|
291 | 291 | return raw_diff |
|
292 | 292 | elif method == 'show': |
|
293 | 293 | if len(c.cs_ranges) == 1: |
|
294 | return render('changeset/changeset.html') | |
|
294 | return base.render('changeset/changeset.html') | |
|
295 | 295 | else: |
|
296 | 296 | c.cs_ranges_org = None |
|
297 | 297 | c.cs_comments = {} |
|
298 | 298 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] |
|
299 | 299 | c.jsdata = graph_data(c.db_repo_scm_instance, revs) |
|
300 | return render('changeset/changeset_range.html') | |
|
300 | return base.render('changeset/changeset_range.html') | |
|
301 | 301 | |
|
302 | 302 | @LoginRequired(allow_default_user=True) |
|
303 | 303 | @HasRepoPermissionLevelDecorator('read') |
|
304 | 304 | def index(self, revision, method='show'): |
|
305 | 305 | return self._index(revision, method=method) |
|
306 | 306 | |
|
307 | 307 | @LoginRequired(allow_default_user=True) |
|
308 | 308 | @HasRepoPermissionLevelDecorator('read') |
|
309 | 309 | def changeset_raw(self, revision): |
|
310 | 310 | return self._index(revision, method='raw') |
|
311 | 311 | |
|
312 | 312 | @LoginRequired(allow_default_user=True) |
|
313 | 313 | @HasRepoPermissionLevelDecorator('read') |
|
314 | 314 | def changeset_patch(self, revision): |
|
315 | 315 | return self._index(revision, method='patch') |
|
316 | 316 | |
|
317 | 317 | @LoginRequired(allow_default_user=True) |
|
318 | 318 | @HasRepoPermissionLevelDecorator('read') |
|
319 | 319 | def changeset_download(self, revision): |
|
320 | 320 | return self._index(revision, method='download') |
|
321 | 321 | |
|
322 | 322 | @LoginRequired() |
|
323 | 323 | @HasRepoPermissionLevelDecorator('read') |
|
324 | @jsonify | |
|
324 | @base.jsonify | |
|
325 | 325 | def comment(self, repo_name, revision): |
|
326 | 326 | return create_cs_pr_comment(repo_name, revision=revision) |
|
327 | 327 | |
|
328 | 328 | @LoginRequired() |
|
329 | 329 | @HasRepoPermissionLevelDecorator('read') |
|
330 | @jsonify | |
|
330 | @base.jsonify | |
|
331 | 331 | def delete_comment(self, repo_name, comment_id): |
|
332 | 332 | return delete_cs_pr_comment(repo_name, comment_id) |
|
333 | 333 | |
|
334 | 334 | @LoginRequired(allow_default_user=True) |
|
335 | 335 | @HasRepoPermissionLevelDecorator('read') |
|
336 | @jsonify | |
|
336 | @base.jsonify | |
|
337 | 337 | def changeset_info(self, repo_name, revision): |
|
338 | 338 | if request.is_xhr: |
|
339 | 339 | try: |
|
340 | 340 | return c.db_repo_scm_instance.get_changeset(revision) |
|
341 | 341 | except ChangesetDoesNotExistError as e: |
|
342 | 342 | return EmptyChangeset(message=str(e)) |
|
343 | 343 | else: |
|
344 | 344 | raise HTTPBadRequest() |
|
345 | 345 | |
|
346 | 346 | @LoginRequired(allow_default_user=True) |
|
347 | 347 | @HasRepoPermissionLevelDecorator('read') |
|
348 | @jsonify | |
|
348 | @base.jsonify | |
|
349 | 349 | def changeset_children(self, repo_name, revision): |
|
350 | 350 | if request.is_xhr: |
|
351 | 351 | changeset = c.db_repo_scm_instance.get_changeset(revision) |
|
352 | 352 | result = {"results": []} |
|
353 | 353 | if changeset.children: |
|
354 | 354 | result = {"results": changeset.children} |
|
355 | 355 | return result |
|
356 | 356 | else: |
|
357 | 357 | raise HTTPBadRequest() |
|
358 | 358 | |
|
359 | 359 | @LoginRequired(allow_default_user=True) |
|
360 | 360 | @HasRepoPermissionLevelDecorator('read') |
|
361 | @jsonify | |
|
361 | @base.jsonify | |
|
362 | 362 | def changeset_parents(self, repo_name, revision): |
|
363 | 363 | if request.is_xhr: |
|
364 | 364 | changeset = c.db_repo_scm_instance.get_changeset(revision) |
|
365 | 365 | result = {"results": []} |
|
366 | 366 | if changeset.parents: |
|
367 | 367 | result = {"results": changeset.parents} |
|
368 | 368 | return result |
|
369 | 369 | else: |
|
370 | 370 | raise HTTPBadRequest() |
@@ -1,188 +1,188 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.compare |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | compare controller showing differences between two |
|
19 | 19 | repos, branches, bookmarks or tips |
|
20 | 20 | |
|
21 | 21 | This file was forked by the Kallithea project in July 2014. |
|
22 | 22 | Original author and date, and relevant copyright and licensing information is below: |
|
23 | 23 | :created_on: May 6, 2012 |
|
24 | 24 | :author: marcink |
|
25 | 25 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
26 | 26 | :license: GPLv3, see LICENSE.md for more details. |
|
27 | 27 | """ |
|
28 | 28 | |
|
29 | 29 | |
|
30 | 30 | import logging |
|
31 | 31 | |
|
32 | 32 | from tg import request |
|
33 | 33 | from tg import tmpl_context as c |
|
34 | 34 | from tg.i18n import ugettext as _ |
|
35 | 35 | from webob.exc import HTTPBadRequest, HTTPFound, HTTPNotFound |
|
36 | 36 | |
|
37 | 37 | import kallithea.lib.helpers as h |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib import diffs, webutils |
|
39 | 40 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
40 | from kallithea.lib.base import BaseRepoController, render | |
|
41 | 41 | from kallithea.lib.graphmod import graph_data |
|
42 | 42 | from kallithea.lib.webutils import url |
|
43 | 43 | from kallithea.model import db |
|
44 | 44 | |
|
45 | 45 | |
|
46 | 46 | log = logging.getLogger(__name__) |
|
47 | 47 | |
|
48 | 48 | |
|
49 | class CompareController(BaseRepoController): | |
|
49 | class CompareController(base.BaseRepoController): | |
|
50 | 50 | |
|
51 | 51 | def _before(self, *args, **kwargs): |
|
52 | 52 | super(CompareController, self)._before(*args, **kwargs) |
|
53 | 53 | |
|
54 | 54 | # The base repository has already been retrieved. |
|
55 | 55 | c.a_repo = c.db_repo |
|
56 | 56 | |
|
57 | 57 | # Retrieve the "changeset" repository (default: same as base). |
|
58 | 58 | other_repo = request.GET.get('other_repo', None) |
|
59 | 59 | if other_repo is None: |
|
60 | 60 | c.cs_repo = c.a_repo |
|
61 | 61 | else: |
|
62 | 62 | c.cs_repo = db.Repository.get_by_repo_name(other_repo) |
|
63 | 63 | if c.cs_repo is None: |
|
64 | 64 | msg = _('Could not find other repository %s') % other_repo |
|
65 | 65 | webutils.flash(msg, category='error') |
|
66 | 66 | raise HTTPFound(location=url('compare_home', repo_name=c.a_repo.repo_name)) |
|
67 | 67 | |
|
68 | 68 | # Verify that it's even possible to compare these two repositories. |
|
69 | 69 | if c.a_repo.scm_instance.alias != c.cs_repo.scm_instance.alias: |
|
70 | 70 | msg = _('Cannot compare repositories of different types') |
|
71 | 71 | webutils.flash(msg, category='error') |
|
72 | 72 | raise HTTPFound(location=url('compare_home', repo_name=c.a_repo.repo_name)) |
|
73 | 73 | |
|
74 | 74 | @LoginRequired(allow_default_user=True) |
|
75 | 75 | @HasRepoPermissionLevelDecorator('read') |
|
76 | 76 | def index(self, repo_name): |
|
77 | 77 | c.compare_home = True |
|
78 | 78 | c.a_ref_name = c.cs_ref_name = None |
|
79 | return render('compare/compare_diff.html') | |
|
79 | return base.render('compare/compare_diff.html') | |
|
80 | 80 | |
|
81 | 81 | @LoginRequired(allow_default_user=True) |
|
82 | 82 | @HasRepoPermissionLevelDecorator('read') |
|
83 | 83 | def compare(self, repo_name, org_ref_type, org_ref_name, other_ref_type, other_ref_name): |
|
84 | 84 | org_ref_name = org_ref_name.strip() |
|
85 | 85 | other_ref_name = other_ref_name.strip() |
|
86 | 86 | |
|
87 | 87 | # If merge is True: |
|
88 | 88 | # Show what org would get if merged with other: |
|
89 | 89 | # List changesets that are ancestors of other but not of org. |
|
90 | 90 | # New changesets in org is thus ignored. |
|
91 | 91 | # Diff will be from common ancestor, and merges of org to other will thus be ignored. |
|
92 | 92 | # If merge is False: |
|
93 | 93 | # Make a raw diff from org to other, no matter if related or not. |
|
94 | 94 | # Changesets in one and not in the other will be ignored |
|
95 | 95 | merge = bool(request.GET.get('merge')) |
|
96 | 96 | # fulldiff disables cut_off_limit |
|
97 | 97 | fulldiff = request.GET.get('fulldiff') |
|
98 | 98 | # partial uses compare_cs.html template directly |
|
99 | 99 | partial = request.environ.get('HTTP_X_PARTIAL_XHR') |
|
100 | 100 | # is_ajax_preview puts hidden input field with changeset revisions |
|
101 | 101 | c.is_ajax_preview = partial and request.GET.get('is_ajax_preview') |
|
102 | 102 | # swap url for compare_diff page - never partial and never is_ajax_preview |
|
103 | 103 | c.swap_url = webutils.url('compare_url', |
|
104 | 104 | repo_name=c.cs_repo.repo_name, |
|
105 | 105 | org_ref_type=other_ref_type, org_ref_name=other_ref_name, |
|
106 | 106 | other_repo=c.a_repo.repo_name, |
|
107 | 107 | other_ref_type=org_ref_type, other_ref_name=org_ref_name, |
|
108 | 108 | merge=merge or '') |
|
109 | 109 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
110 | 110 | diff_context_size = h.get_diff_context_size(request.GET) |
|
111 | 111 | |
|
112 | 112 | c.a_rev = self._get_ref_rev(c.a_repo, org_ref_type, org_ref_name, |
|
113 | 113 | returnempty=True) |
|
114 | 114 | c.cs_rev = self._get_ref_rev(c.cs_repo, other_ref_type, other_ref_name) |
|
115 | 115 | |
|
116 | 116 | c.compare_home = False |
|
117 | 117 | c.a_ref_name = org_ref_name |
|
118 | 118 | c.a_ref_type = org_ref_type |
|
119 | 119 | c.cs_ref_name = other_ref_name |
|
120 | 120 | c.cs_ref_type = other_ref_type |
|
121 | 121 | |
|
122 | 122 | c.cs_ranges, c.cs_ranges_org, c.ancestors = c.a_repo.scm_instance.get_diff_changesets( |
|
123 | 123 | c.a_rev, c.cs_repo.scm_instance, c.cs_rev) |
|
124 | 124 | raw_ids = [x.raw_id for x in c.cs_ranges] |
|
125 | 125 | c.cs_comments = c.cs_repo.get_comments(raw_ids) |
|
126 | 126 | c.cs_statuses = c.cs_repo.statuses(raw_ids) |
|
127 | 127 | |
|
128 | 128 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] |
|
129 | 129 | c.jsdata = graph_data(c.cs_repo.scm_instance, revs) |
|
130 | 130 | |
|
131 | 131 | if partial: |
|
132 | return render('compare/compare_cs.html') | |
|
132 | return base.render('compare/compare_cs.html') | |
|
133 | 133 | |
|
134 | 134 | org_repo = c.a_repo |
|
135 | 135 | other_repo = c.cs_repo |
|
136 | 136 | |
|
137 | 137 | if merge: |
|
138 | 138 | rev1 = msg = None |
|
139 | 139 | if not c.cs_ranges: |
|
140 | 140 | msg = _('Cannot show empty diff') |
|
141 | 141 | elif not c.ancestors: |
|
142 | 142 | msg = _('No ancestor found for merge diff') |
|
143 | 143 | elif len(c.ancestors) == 1: |
|
144 | 144 | rev1 = c.ancestors[0] |
|
145 | 145 | else: |
|
146 | 146 | msg = _('Multiple merge ancestors found for merge compare') |
|
147 | 147 | if rev1 is None: |
|
148 | 148 | webutils.flash(msg, category='error') |
|
149 | 149 | log.error(msg) |
|
150 | 150 | raise HTTPNotFound |
|
151 | 151 | |
|
152 | 152 | # case we want a simple diff without incoming changesets, |
|
153 | 153 | # previewing what will be merged. |
|
154 | 154 | # Make the diff on the other repo (which is known to have other_rev) |
|
155 | 155 | log.debug('Using ancestor %s as rev1 instead of %s', |
|
156 | 156 | rev1, c.a_rev) |
|
157 | 157 | org_repo = other_repo |
|
158 | 158 | else: # comparing tips, not necessarily linearly related |
|
159 | 159 | if org_repo != other_repo: |
|
160 | 160 | # TODO: we could do this by using hg unionrepo |
|
161 | 161 | log.error('cannot compare across repos %s and %s', org_repo, other_repo) |
|
162 | 162 | webutils.flash(_('Cannot compare repositories without using common ancestor'), category='error') |
|
163 | 163 | raise HTTPBadRequest |
|
164 | 164 | rev1 = c.a_rev |
|
165 | 165 | |
|
166 | 166 | diff_limit = None if fulldiff else self.cut_off_limit |
|
167 | 167 | |
|
168 | 168 | log.debug('running diff between %s and %s in %s', |
|
169 | 169 | rev1, c.cs_rev, org_repo.scm_instance.path) |
|
170 | 170 | raw_diff = diffs.get_diff(org_repo.scm_instance, rev1=rev1, rev2=c.cs_rev, |
|
171 | 171 | ignore_whitespace=ignore_whitespace_diff, |
|
172 | 172 | context=diff_context_size) |
|
173 | 173 | |
|
174 | 174 | diff_processor = diffs.DiffProcessor(raw_diff, diff_limit=diff_limit) |
|
175 | 175 | c.limited_diff = diff_processor.limited_diff |
|
176 | 176 | c.file_diff_data = [] |
|
177 | 177 | c.lines_added = 0 |
|
178 | 178 | c.lines_deleted = 0 |
|
179 | 179 | for f in diff_processor.parsed: |
|
180 | 180 | st = f['stats'] |
|
181 | 181 | c.lines_added += st['added'] |
|
182 | 182 | c.lines_deleted += st['deleted'] |
|
183 | 183 | filename = f['filename'] |
|
184 | 184 | fid = h.FID('', filename) |
|
185 | 185 | html_diff = diffs.as_html(parsed_lines=[f]) |
|
186 | 186 | c.file_diff_data.append((fid, None, f['operation'], f['old_filename'], filename, html_diff, st)) |
|
187 | 187 | |
|
188 | return render('compare/compare_diff.html') | |
|
188 | return base.render('compare/compare_diff.html') |
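The ``merge`` flag above switches between two very different comparisons. A minimal set-based sketch of the semantics described in the comments of ``compare``; toy data, not Kallithea code::

    # merge=True: show what org would get from other, i.e. changesets that are
    # ancestors of other but not of org, diffed from the common ancestor.
    ancestors_org = {'a', 'b', 'c'}
    ancestors_other = {'a', 'b', 'd', 'e'}

    incoming = ancestors_other - ancestors_org   # {'d', 'e'}: shown as incoming
    common = ancestors_org & ancestors_other     # the diff base is picked from here

    # merge=False: a raw diff straight from org's revision to other's revision,
    # regardless of how (or whether) the two histories are related.
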
@@ -1,91 +1,91 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.error |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Kallithea error controller |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Dec 8, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import html |
|
29 | 29 | import logging |
|
30 | 30 | |
|
31 | 31 | from tg import config, expose, request |
|
32 | 32 | from tg import tmpl_context as c |
|
33 | 33 | from tg.i18n import ugettext as _ |
|
34 | 34 | |
|
35 | from kallithea.lib.base import BaseController | |
|
35 | from kallithea.controllers import base | |
|
36 | 36 | |
|
37 | 37 | |
|
38 | 38 | log = logging.getLogger(__name__) |
|
39 | 39 | |
|
40 | 40 | |
|
41 | class ErrorController(BaseController): | |
|
41 | class ErrorController(base.BaseController): | |
|
42 | 42 | """Generates error documents as and when they are required. |
|
43 | 43 | |
|
44 | 44 | The errorpage middleware renders /error/document when error |
|
45 | 45 | related status codes are returned from the application. |
|
46 | 46 | """ |
|
47 | 47 | |
|
48 | 48 | def _before(self, *args, **kwargs): |
|
49 | 49 | # disable all base actions since we don't need them here |
|
50 | 50 | pass |
|
51 | 51 | |
|
52 | 52 | @expose('/errors/error_document.html') |
|
53 | 53 | def document(self, *args, **kwargs): |
|
54 | 54 | resp = request.environ.get('tg.original_response') |
|
55 | 55 | c.site_name = config.get('title') |
|
56 | 56 | |
|
57 | 57 | log.debug('### %s ###', resp and resp.status or 'no response') |
|
58 | 58 | |
|
59 | 59 | e = request.environ |
|
60 | 60 | c.serv_p = r'%(protocol)s://%(host)s/' % { |
|
61 | 61 | 'protocol': e.get('wsgi.url_scheme'), |
|
62 | 62 | 'host': e.get('HTTP_HOST'), } |
|
63 | 63 | if resp: |
|
64 | 64 | c.error_message = html.escape(request.GET.get('code', str(resp.status))) |
|
65 | 65 | c.error_explanation = self.get_error_explanation(resp.status_int) |
|
66 | 66 | else: |
|
67 | 67 | c.error_message = _('No response') |
|
68 | 68 | c.error_explanation = _('Unknown error') |
|
69 | 69 | |
|
70 | 70 | return dict() |
|
71 | 71 | |
|
72 | 72 | def get_error_explanation(self, code): |
|
73 | 73 | """ get the error explanations of int codes |
|
74 | 74 | [400, 401, 403, 404, 500]""" |
|
75 | 75 | try: |
|
76 | 76 | code = int(code) |
|
77 | 77 | except ValueError: |
|
78 | 78 | code = 500 |
|
79 | 79 | |
|
80 | 80 | if code == 400: |
|
81 | 81 | return _('The request could not be understood by the server' |
|
82 | 82 | ' due to malformed syntax.') |
|
83 | 83 | if code == 401: |
|
84 | 84 | return _('Unauthorized access to resource') |
|
85 | 85 | if code == 403: |
|
86 | 86 | return _("You don't have permission to view this page") |
|
87 | 87 | if code == 404: |
|
88 | 88 | return _('The resource could not be found') |
|
89 | 89 | if code == 500: |
|
90 | 90 | return _('The server encountered an unexpected condition' |
|
91 | 91 | ' which prevented it from fulfilling the request.') |
@@ -1,134 +1,134 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.feed |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Feed controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 23, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | |
|
29 | 29 | import logging |
|
30 | 30 | |
|
31 | 31 | from beaker.cache import cache_region |
|
32 | 32 | from tg import response |
|
33 | 33 | from tg import tmpl_context as c |
|
34 | 34 | from tg.i18n import ugettext as _ |
|
35 | 35 | |
|
36 | 36 | import kallithea |
|
37 | 37 | import kallithea.lib.helpers as h |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib import feeds, webutils |
|
39 | 40 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
40 | from kallithea.lib.base import BaseRepoController | |
|
41 | 41 | from kallithea.lib.diffs import DiffProcessor |
|
42 | 42 | from kallithea.lib.utils2 import asbool, fmt_date, safe_int, safe_str, shorter |
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | log = logging.getLogger(__name__) |
|
46 | 46 | |
|
47 | 47 | |
|
48 | class FeedController(BaseRepoController): | |
|
48 | class FeedController(base.BaseRepoController): | |
|
49 | 49 | |
|
50 | 50 | @LoginRequired(allow_default_user=True) |
|
51 | 51 | @HasRepoPermissionLevelDecorator('read') |
|
52 | 52 | def _before(self, *args, **kwargs): |
|
53 | 53 | super(FeedController, self)._before(*args, **kwargs) |
|
54 | 54 | |
|
55 | 55 | def _get_title(self, cs): |
|
56 | 56 | return shorter(cs.message, 160) |
|
57 | 57 | |
|
58 | 58 | def __get_desc(self, cs): |
|
59 | 59 | desc_msg = [(_('%s committed on %s') |
|
60 | 60 | % (h.person(cs.author), fmt_date(cs.date))) + '<br/>'] |
|
61 | 61 | # branches, tags, bookmarks |
|
62 | 62 | for branch in cs.branches: |
|
63 | 63 | desc_msg.append('branch: %s<br/>' % branch) |
|
64 | 64 | for book in cs.bookmarks: |
|
65 | 65 | desc_msg.append('bookmark: %s<br/>' % book) |
|
66 | 66 | for tag in cs.tags: |
|
67 | 67 | desc_msg.append('tag: %s<br/>' % tag) |
|
68 | 68 | |
|
69 | 69 | changes = [] |
|
70 | 70 | diff_limit = safe_int(kallithea.CONFIG.get('rss_cut_off_limit', 32 * 1024)) |
|
71 | 71 | raw_diff = cs.diff() |
|
72 | 72 | diff_processor = DiffProcessor(raw_diff, |
|
73 | 73 | diff_limit=diff_limit, |
|
74 | 74 | inline_diff=False) |
|
75 | 75 | |
|
76 | 76 | for st in diff_processor.parsed: |
|
77 | 77 | st.update({'added': st['stats']['added'], |
|
78 | 78 | 'removed': st['stats']['deleted']}) |
|
79 | 79 | changes.append('\n %(operation)s %(filename)s ' |
|
80 | 80 | '(%(added)s lines added, %(removed)s lines removed)' |
|
81 | 81 | % st) |
|
82 | 82 | if diff_processor.limited_diff: |
|
83 | 83 | changes = changes + ['\n ' + |
|
84 | 84 | _('Changeset was too big and was cut off...')] |
|
85 | 85 | |
|
86 | 86 | # rev link |
|
87 | 87 | _url = webutils.canonical_url('changeset_home', repo_name=c.db_repo.repo_name, |
|
88 | 88 | revision=cs.raw_id) |
|
89 | 89 | desc_msg.append('changeset: <a href="%s">%s</a>' % (_url, cs.raw_id[:8])) |
|
90 | 90 | |
|
91 | 91 | desc_msg.append('<pre>') |
|
92 | 92 | desc_msg.append(webutils.urlify_text(cs.message)) |
|
93 | 93 | desc_msg.append('\n') |
|
94 | 94 | desc_msg.extend(changes) |
|
95 | 95 | if asbool(kallithea.CONFIG.get('rss_include_diff', False)): |
|
96 | 96 | desc_msg.append('\n\n') |
|
97 | 97 | desc_msg.append(safe_str(raw_diff)) |
|
98 | 98 | desc_msg.append('</pre>') |
|
99 | 99 | return desc_msg |
|
100 | 100 | |
|
101 | 101 | def _feed(self, repo_name, feeder): |
|
102 | 102 | """Produce a simple feed""" |
|
103 | 103 | |
|
104 | 104 | @cache_region('long_term_file', '_get_feed_from_cache') |
|
105 | 105 | def _get_feed_from_cache(*_cache_keys): # parameters are not really used - only as caching key |
|
106 | 106 | header = dict( |
|
107 | 107 | title=_('%s %s feed') % (c.site_name, repo_name), |
|
108 | 108 | link=webutils.canonical_url('summary_home', repo_name=repo_name), |
|
109 | 109 | description=_('Changes on %s repository') % repo_name, |
|
110 | 110 | ) |
|
111 | 111 | |
|
112 | 112 | rss_items_per_page = safe_int(kallithea.CONFIG.get('rss_items_per_page', 20)) |
|
113 | 113 | entries=[] |
|
114 | 114 | for cs in reversed(list(c.db_repo_scm_instance[-rss_items_per_page:])): |
|
115 | 115 | entries.append(dict( |
|
116 | 116 | title=self._get_title(cs), |
|
117 | 117 | link=webutils.canonical_url('changeset_home', repo_name=repo_name, revision=cs.raw_id), |
|
118 | 118 | author_email=cs.author_email, |
|
119 | 119 | author_name=cs.author_name, |
|
120 | 120 | description=''.join(self.__get_desc(cs)), |
|
121 | 121 | pubdate=cs.date, |
|
122 | 122 | )) |
|
123 | 123 | return feeder.render(header, entries) |
|
124 | 124 | |
|
125 | 125 | response.content_type = feeder.content_type |
|
126 | 126 | return _get_feed_from_cache(repo_name, feeder.__name__) |
|
127 | 127 | |
|
128 | 128 | def atom(self, repo_name): |
|
129 | 129 | """Produce a simple atom-1.0 feed""" |
|
130 | 130 | return self._feed(repo_name, feeds.AtomFeed) |
|
131 | 131 | |
|
132 | 132 | def rss(self, repo_name): |
|
133 | 133 | """Produce a simple rss2 feed""" |
|
134 | 134 | return self._feed(repo_name, feeds.RssFeed) |
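
The ``_feed`` method above wraps the expensive feed rendering in Beaker's ``cache_region`` decorator, using the function arguments only as the cache key. A minimal sketch of that caching pattern, assuming an in-memory region configured directly in code (Kallithea configures its ``long_term_file`` region from the ``.ini`` file instead)::

    from beaker.cache import cache_region, cache_regions

    # register a cache region; in Kallithea this comes from the .ini settings
    cache_regions.update({
        'long_term': {'type': 'memory', 'expire': 3600},
    })

    @cache_region('long_term', 'render_feed')
    def render_feed(repo_name, feed_kind):
        # the arguments form the cache key; the body runs only on a cache miss
        return 'feed for %s as %s' % (repo_name, feed_kind)

    render_feed('kallithea', 'AtomFeed')  # computed and cached
    render_feed('kallithea', 'AtomFeed')  # served from the cache
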
@@ -1,745 +1,745 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.files |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Files controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 21, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import os |
|
30 | 30 | import posixpath |
|
31 | 31 | import shutil |
|
32 | 32 | import tempfile |
|
33 | 33 | import traceback |
|
34 | 34 | from collections import OrderedDict |
|
35 | 35 | |
|
36 | 36 | from tg import request, response |
|
37 | 37 | from tg import tmpl_context as c |
|
38 | 38 | from tg.i18n import ugettext as _ |
|
39 | 39 | from webob.exc import HTTPFound, HTTPNotFound |
|
40 | 40 | |
|
41 | 41 | import kallithea |
|
42 | 42 | import kallithea.lib.helpers as h |
|
43 | from kallithea.controllers import base | |
|
43 | 44 | from kallithea.lib import diffs, webutils |
|
44 | 45 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
45 | from kallithea.lib.base import BaseRepoController, jsonify, render | |
|
46 | 46 | from kallithea.lib.exceptions import NonRelativePathError |
|
47 | 47 | from kallithea.lib.utils2 import asbool, convert_line_endings, detect_mode, safe_str |
|
48 | 48 | from kallithea.lib.vcs.backends.base import EmptyChangeset |
|
49 | 49 | from kallithea.lib.vcs.conf import settings |
|
50 | 50 | from kallithea.lib.vcs.exceptions import (ChangesetDoesNotExistError, ChangesetError, EmptyRepositoryError, ImproperArchiveTypeError, NodeAlreadyExistsError, |
|
51 | 51 | NodeDoesNotExistError, NodeError, RepositoryError, VCSError) |
|
52 | 52 | from kallithea.lib.vcs.nodes import FileNode |
|
53 | 53 | from kallithea.lib.vcs.utils import author_email |
|
54 | 54 | from kallithea.lib.webutils import url |
|
55 | 55 | from kallithea.model import userlog |
|
56 | 56 | from kallithea.model.repo import RepoModel |
|
57 | 57 | from kallithea.model.scm import ScmModel |
|
58 | 58 | |
|
59 | 59 | |
|
60 | 60 | log = logging.getLogger(__name__) |
|
61 | 61 | |
|
62 | 62 | |
|
63 | class FilesController(BaseRepoController): | |
|
63 | class FilesController(base.BaseRepoController): | |
|
64 | 64 | |
|
65 | 65 | def _before(self, *args, **kwargs): |
|
66 | 66 | super(FilesController, self)._before(*args, **kwargs) |
|
67 | 67 | |
|
68 | 68 | def __get_cs(self, rev, silent_empty=False): |
|
69 | 69 | """ |
|
70 | 70 | Safe way to get changeset if error occur it redirects to tip with |
|
71 | 71 | proper message |
|
72 | 72 | |
|
73 | 73 | :param rev: revision to fetch |
|
74 | 74 | :param silent_empty: return None if the repository is empty
|
75 | 75 | """ |
|
76 | 76 | |
|
77 | 77 | try: |
|
78 | 78 | return c.db_repo_scm_instance.get_changeset(rev) |
|
79 | 79 | except EmptyRepositoryError as e: |
|
80 | 80 | if silent_empty: |
|
81 | 81 | return None |
|
82 | 82 | url_ = url('files_add_home', |
|
83 | 83 | repo_name=c.repo_name, |
|
84 | 84 | revision=0, f_path='', anchor='edit') |
|
85 | 85 | add_new = webutils.link_to(_('Click here to add new file'), url_, class_="alert-link") |
|
86 | 86 | webutils.flash(_('There are no files yet.') + ' ' + add_new, category='warning') |
|
87 | 87 | raise HTTPNotFound() |
|
88 | 88 | except (ChangesetDoesNotExistError, LookupError): |
|
89 | 89 | msg = _('Such revision does not exist for this repository') |
|
90 | 90 | webutils.flash(msg, category='error') |
|
91 | 91 | raise HTTPNotFound() |
|
92 | 92 | except RepositoryError as e: |
|
93 | 93 | webutils.flash(e, category='error') |
|
94 | 94 | raise HTTPNotFound() |
|
95 | 95 | |
|
96 | 96 | def __get_filenode(self, cs, path): |
|
97 | 97 | """ |
|
98 | 98 | Returns file_node or raise HTTP error. |
|
99 | 99 | |
|
100 | 100 | :param cs: given changeset |
|
101 | 101 | :param path: path to lookup |
|
102 | 102 | """ |
|
103 | 103 | |
|
104 | 104 | try: |
|
105 | 105 | file_node = cs.get_node(path) |
|
106 | 106 | if file_node.is_dir(): |
|
107 | 107 | raise RepositoryError('given path is a directory') |
|
108 | 108 | except ChangesetDoesNotExistError: |
|
109 | 109 | msg = _('Such revision does not exist for this repository') |
|
110 | 110 | webutils.flash(msg, category='error') |
|
111 | 111 | raise HTTPNotFound() |
|
112 | 112 | except RepositoryError as e: |
|
113 | 113 | webutils.flash(e, category='error') |
|
114 | 114 | raise HTTPNotFound() |
|
115 | 115 | |
|
116 | 116 | return file_node |
|
117 | 117 | |
|
118 | 118 | @LoginRequired(allow_default_user=True) |
|
119 | 119 | @HasRepoPermissionLevelDecorator('read') |
|
120 | 120 | def index(self, repo_name, revision, f_path, annotate=False): |
|
121 | 121 | # redirect to given revision from form if given |
|
122 | 122 | post_revision = request.POST.get('at_rev', None) |
|
123 | 123 | if post_revision: |
|
124 | 124 | cs = self.__get_cs(post_revision) # FIXME - unused! |
|
125 | 125 | |
|
126 | 126 | c.revision = revision |
|
127 | 127 | c.changeset = self.__get_cs(revision) |
|
128 | 128 | c.branch = request.GET.get('branch', None) |
|
129 | 129 | c.f_path = f_path |
|
130 | 130 | c.annotate = annotate |
|
131 | 131 | cur_rev = c.changeset.revision |
|
132 | 132 | # used in files_source.html: |
|
133 | 133 | c.cut_off_limit = self.cut_off_limit |
|
134 | 134 | c.fulldiff = request.GET.get('fulldiff') |
|
135 | 135 | |
|
136 | 136 | # prev link |
|
137 | 137 | try: |
|
138 | 138 | prev_rev = c.db_repo_scm_instance.get_changeset(cur_rev).prev(c.branch) |
|
139 | 139 | c.url_prev = url('files_home', repo_name=c.repo_name, |
|
140 | 140 | revision=prev_rev.raw_id, f_path=f_path) |
|
141 | 141 | if c.branch: |
|
142 | 142 | c.url_prev += '?branch=%s' % c.branch |
|
143 | 143 | except (ChangesetDoesNotExistError, VCSError): |
|
144 | 144 | c.url_prev = '#' |
|
145 | 145 | |
|
146 | 146 | # next link |
|
147 | 147 | try: |
|
148 | 148 | next_rev = c.db_repo_scm_instance.get_changeset(cur_rev).next(c.branch) |
|
149 | 149 | c.url_next = url('files_home', repo_name=c.repo_name, |
|
150 | 150 | revision=next_rev.raw_id, f_path=f_path) |
|
151 | 151 | if c.branch: |
|
152 | 152 | c.url_next += '?branch=%s' % c.branch |
|
153 | 153 | except (ChangesetDoesNotExistError, VCSError): |
|
154 | 154 | c.url_next = '#' |
|
155 | 155 | |
|
156 | 156 | # files or dirs |
|
157 | 157 | try: |
|
158 | 158 | c.file = c.changeset.get_node(f_path) |
|
159 | 159 | |
|
160 | 160 | if c.file.is_submodule(): |
|
161 | 161 | raise HTTPFound(location=c.file.url) |
|
162 | 162 | elif c.file.is_file(): |
|
163 | 163 | c.load_full_history = False |
|
164 | 164 | # determine if we're on branch head |
|
165 | 165 | _branches = c.db_repo_scm_instance.branches |
|
166 | 166 | c.on_branch_head = revision in _branches or revision in _branches.values() |
|
167 | 167 | _hist = [] |
|
168 | 168 | c.file_history = [] |
|
169 | 169 | if c.load_full_history: |
|
170 | 170 | c.file_history, _hist = self._get_node_history(c.changeset, f_path) |
|
171 | 171 | |
|
172 | 172 | c.authors = [] |
|
173 | 173 | for a in set([x.author for x in _hist]): |
|
174 | 174 | c.authors.append((author_email(a), h.person(a))) |
|
175 | 175 | else: |
|
176 | 176 | c.authors = c.file_history = [] |
|
177 | 177 | except RepositoryError as e: |
|
178 | 178 | webutils.flash(e, category='error') |
|
179 | 179 | raise HTTPNotFound() |
|
180 | 180 | |
|
181 | 181 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
182 | return render('files/files_ypjax.html') | |
|
182 | return base.render('files/files_ypjax.html') | |
|
183 | 183 | |
|
184 | 184 | # TODO: tags and bookmarks? |
|
185 | 185 | c.revision_options = [(c.changeset.raw_id, |
|
186 | 186 | _('%s at %s') % (b, c.changeset.short_id)) for b in c.changeset.branches] + \ |
|
187 | 187 | [(n, b) for b, n in c.db_repo_scm_instance.branches.items()] |
|
188 | 188 | if c.db_repo_scm_instance.closed_branches: |
|
189 | 189 | prefix = _('(closed)') + ' ' |
|
190 | 190 | c.revision_options += [('-', '-')] + \ |
|
191 | 191 | [(n, prefix + b) for b, n in c.db_repo_scm_instance.closed_branches.items()] |
|
192 | 192 | |
|
193 | return render('files/files.html') | |
|
193 | return base.render('files/files.html') | |
|
194 | 194 | |
|
195 | 195 | @LoginRequired(allow_default_user=True) |
|
196 | 196 | @HasRepoPermissionLevelDecorator('read') |
|
197 | @jsonify | |
|
197 | @base.jsonify | |
|
198 | 198 | def history(self, repo_name, revision, f_path): |
|
199 | 199 | changeset = self.__get_cs(revision) |
|
200 | 200 | _file = changeset.get_node(f_path) |
|
201 | 201 | if _file.is_file(): |
|
202 | 202 | file_history, _hist = self._get_node_history(changeset, f_path) |
|
203 | 203 | |
|
204 | 204 | res = [] |
|
205 | 205 | for obj in file_history: |
|
206 | 206 | res.append({ |
|
207 | 207 | 'text': obj[1], |
|
208 | 208 | 'children': [{'id': o[0], 'text': o[1]} for o in obj[0]] |
|
209 | 209 | }) |
|
210 | 210 | |
|
211 | 211 | data = { |
|
212 | 212 | 'more': False, |
|
213 | 213 | 'results': res |
|
214 | 214 | } |
|
215 | 215 | return data |
|
216 | 216 | |
|
217 | 217 | @LoginRequired(allow_default_user=True) |
|
218 | 218 | @HasRepoPermissionLevelDecorator('read') |
|
219 | 219 | def authors(self, repo_name, revision, f_path): |
|
220 | 220 | changeset = self.__get_cs(revision) |
|
221 | 221 | _file = changeset.get_node(f_path) |
|
222 | 222 | if _file.is_file(): |
|
223 | 223 | file_history, _hist = self._get_node_history(changeset, f_path) |
|
224 | 224 | c.authors = [] |
|
225 | 225 | for a in set([x.author for x in _hist]): |
|
226 | 226 | c.authors.append((author_email(a), h.person(a))) |
|
227 | return render('files/files_history_box.html') | |
|
227 | return base.render('files/files_history_box.html') | |
|
228 | 228 | |
|
229 | 229 | @LoginRequired(allow_default_user=True) |
|
230 | 230 | @HasRepoPermissionLevelDecorator('read') |
|
231 | 231 | def rawfile(self, repo_name, revision, f_path): |
|
232 | 232 | cs = self.__get_cs(revision) |
|
233 | 233 | file_node = self.__get_filenode(cs, f_path) |
|
234 | 234 | |
|
235 | 235 | response.content_disposition = \ |
|
236 | 236 | 'attachment; filename=%s' % f_path.split(kallithea.URL_SEP)[-1] |
|
237 | 237 | |
|
238 | 238 | response.content_type = file_node.mimetype |
|
239 | 239 | return file_node.content |
|
240 | 240 | |
|
241 | 241 | @LoginRequired(allow_default_user=True) |
|
242 | 242 | @HasRepoPermissionLevelDecorator('read') |
|
243 | 243 | def raw(self, repo_name, revision, f_path): |
|
244 | 244 | cs = self.__get_cs(revision) |
|
245 | 245 | file_node = self.__get_filenode(cs, f_path) |
|
246 | 246 | |
|
247 | 247 | raw_mimetype_mapping = { |
|
248 | 248 | # map original mimetype to a mimetype used for "show as raw" |
|
249 | 249 | # you can also provide a content-disposition to override the |
|
250 | 250 | # default "attachment" disposition. |
|
251 | 251 | # orig_type: (new_type, new_dispo) |
|
252 | 252 | |
|
253 | 253 | # show images inline: |
|
254 | 254 | 'image/x-icon': ('image/x-icon', 'inline'), |
|
255 | 255 | 'image/png': ('image/png', 'inline'), |
|
256 | 256 | 'image/gif': ('image/gif', 'inline'), |
|
257 | 257 | 'image/jpeg': ('image/jpeg', 'inline'), |
|
258 | 258 | 'image/svg+xml': ('image/svg+xml', 'inline'), |
|
259 | 259 | } |
|
260 | 260 | |
|
261 | 261 | mimetype = file_node.mimetype |
|
262 | 262 | try: |
|
263 | 263 | mimetype, dispo = raw_mimetype_mapping[mimetype] |
|
264 | 264 | except KeyError: |
|
265 | 265 | # we don't know anything special about this, handle it safely |
|
266 | 266 | if file_node.is_binary: |
|
267 | 267 | # do same as download raw for binary files |
|
268 | 268 | mimetype, dispo = 'application/octet-stream', 'attachment' |
|
269 | 269 | else: |
|
270 | 270 | # do not just use the original mimetype, but force text/plain, |
|
271 | 271 | # otherwise it would serve text/html and that might be unsafe. |
|
272 | 272 | # Note: underlying vcs library fakes text/plain mimetype if the |
|
273 | 273 | # mimetype can not be determined and it thinks it is not |
|
274 | 274 | # binary. This might lead to erroneous text display in some
|
275 | 275 | # cases, but helps in other cases, like with text files |
|
276 | 276 | # without extension. |
|
277 | 277 | mimetype, dispo = 'text/plain', 'inline' |
|
278 | 278 | |
|
279 | 279 | if dispo == 'attachment': |
|
280 | 280 | dispo = 'attachment; filename=%s' % f_path.split(os.sep)[-1] |
|
281 | 281 | |
|
282 | 282 | response.content_disposition = dispo |
|
283 | 283 | response.content_type = mimetype |
|
284 | 284 | return file_node.content |
|
285 | 285 | |
|
286 | 286 | @LoginRequired() |
|
287 | 287 | @HasRepoPermissionLevelDecorator('write') |
|
288 | 288 | def delete(self, repo_name, revision, f_path): |
|
289 | 289 | repo = c.db_repo |
|
290 | 290 | # check if revision is a branch identifier; basically we cannot
|
291 | 291 | # create multiple heads via file editing |
|
292 | 292 | _branches = repo.scm_instance.branches |
|
293 | 293 | # check if revision is a branch name or branch hash |
|
294 | 294 | if revision not in _branches and revision not in _branches.values(): |
|
295 | 295 | webutils.flash(_('You can only delete files with revision ' |
|
296 | 296 | 'being a valid branch'), category='warning') |
|
297 | 297 | raise HTTPFound(location=webutils.url('files_home', |
|
298 | 298 | repo_name=repo_name, revision='tip', |
|
299 | 299 | f_path=f_path)) |
|
300 | 300 | |
|
301 | 301 | r_post = request.POST |
|
302 | 302 | |
|
303 | 303 | c.cs = self.__get_cs(revision) |
|
304 | 304 | c.file = self.__get_filenode(c.cs, f_path) |
|
305 | 305 | |
|
306 | 306 | c.default_message = _('Deleted file %s via Kallithea') % (f_path) |
|
307 | 307 | c.f_path = f_path |
|
308 | 308 | node_path = f_path |
|
309 | 309 | author = request.authuser.full_contact |
|
310 | 310 | |
|
311 | 311 | if r_post: |
|
312 | 312 | message = r_post.get('message') or c.default_message |
|
313 | 313 | |
|
314 | 314 | try: |
|
315 | 315 | nodes = { |
|
316 | 316 | node_path: { |
|
317 | 317 | 'content': '' |
|
318 | 318 | } |
|
319 | 319 | } |
|
320 | 320 | self.scm_model.delete_nodes( |
|
321 | 321 | user=request.authuser.user_id, |
|
322 | 322 | ip_addr=request.ip_addr, |
|
323 | 323 | repo=c.db_repo, |
|
324 | 324 | message=message, |
|
325 | 325 | nodes=nodes, |
|
326 | 326 | parent_cs=c.cs, |
|
327 | 327 | author=author, |
|
328 | 328 | ) |
|
329 | 329 | |
|
330 | 330 | webutils.flash(_('Successfully deleted file %s') % f_path, |
|
331 | 331 | category='success') |
|
332 | 332 | except Exception: |
|
333 | 333 | log.error(traceback.format_exc()) |
|
334 | 334 | webutils.flash(_('Error occurred during commit'), category='error') |
|
335 | 335 | raise HTTPFound(location=url('changeset_home', |
|
336 | 336 | repo_name=c.repo_name, revision='tip')) |
|
337 | 337 | |
|
338 | return render('files/files_delete.html') | |
|
338 | return base.render('files/files_delete.html') | |
|
339 | 339 | |
|
340 | 340 | @LoginRequired() |
|
341 | 341 | @HasRepoPermissionLevelDecorator('write') |
|
342 | 342 | def edit(self, repo_name, revision, f_path): |
|
343 | 343 | repo = c.db_repo |
|
344 | 344 | # check if revision is a branch identifier; basically we cannot
|
345 | 345 | # create multiple heads via file editing |
|
346 | 346 | _branches = repo.scm_instance.branches |
|
347 | 347 | # check if revision is a branch name or branch hash |
|
348 | 348 | if revision not in _branches and revision not in _branches.values(): |
|
349 | 349 | webutils.flash(_('You can only edit files with revision ' |
|
350 | 350 | 'being a valid branch'), category='warning') |
|
351 | 351 | raise HTTPFound(location=webutils.url('files_home', |
|
352 | 352 | repo_name=repo_name, revision='tip', |
|
353 | 353 | f_path=f_path)) |
|
354 | 354 | |
|
355 | 355 | r_post = request.POST |
|
356 | 356 | |
|
357 | 357 | c.cs = self.__get_cs(revision) |
|
358 | 358 | c.file = self.__get_filenode(c.cs, f_path) |
|
359 | 359 | |
|
360 | 360 | if c.file.is_binary: |
|
361 | 361 | raise HTTPFound(location=url('files_home', repo_name=c.repo_name, |
|
362 | 362 | revision=c.cs.raw_id, f_path=f_path)) |
|
363 | 363 | c.default_message = _('Edited file %s via Kallithea') % (f_path) |
|
364 | 364 | c.f_path = f_path |
|
365 | 365 | |
|
366 | 366 | if r_post: |
|
367 | 367 | old_content = safe_str(c.file.content) |
|
368 | 368 | sl = old_content.splitlines(1) |
|
369 | 369 | first_line = sl[0] if sl else '' |
|
370 | 370 | # modes: 0 - Unix, 1 - Mac, 2 - DOS |
|
371 | 371 | mode = detect_mode(first_line, 0) |
|
372 | 372 | content = convert_line_endings(r_post.get('content', ''), mode) |
|
373 | 373 | |
|
374 | 374 | message = r_post.get('message') or c.default_message |
|
375 | 375 | author = request.authuser.full_contact |
|
376 | 376 | |
|
377 | 377 | if content == old_content: |
|
378 | 378 | webutils.flash(_('No changes'), category='warning') |
|
379 | 379 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
380 | 380 | revision='tip')) |
|
381 | 381 | try: |
|
382 | 382 | self.scm_model.commit_change(repo=c.db_repo_scm_instance, |
|
383 | 383 | repo_name=repo_name, cs=c.cs, |
|
384 | 384 | user=request.authuser.user_id, |
|
385 | 385 | ip_addr=request.ip_addr, |
|
386 | 386 | author=author, message=message, |
|
387 | 387 | content=content, f_path=f_path) |
|
388 | 388 | webutils.flash(_('Successfully committed to %s') % f_path, |
|
389 | 389 | category='success') |
|
390 | 390 | except Exception: |
|
391 | 391 | log.error(traceback.format_exc()) |
|
392 | 392 | webutils.flash(_('Error occurred during commit'), category='error') |
|
393 | 393 | raise HTTPFound(location=url('changeset_home', |
|
394 | 394 | repo_name=c.repo_name, revision='tip')) |
|
395 | 395 | |
|
396 | return render('files/files_edit.html') | |
|
396 | return base.render('files/files_edit.html') | |
|
397 | 397 | |
|
398 | 398 | @LoginRequired() |
|
399 | 399 | @HasRepoPermissionLevelDecorator('write') |
|
400 | 400 | def add(self, repo_name, revision, f_path): |
|
401 | 401 | |
|
402 | 402 | repo = c.db_repo |
|
403 | 403 | r_post = request.POST |
|
404 | 404 | c.cs = self.__get_cs(revision, silent_empty=True) |
|
405 | 405 | if c.cs is None: |
|
406 | 406 | c.cs = EmptyChangeset(alias=c.db_repo_scm_instance.alias) |
|
407 | 407 | c.default_message = (_('Added file via Kallithea')) |
|
408 | 408 | c.f_path = f_path |
|
409 | 409 | |
|
410 | 410 | if r_post: |
|
411 | 411 | unix_mode = 0 |
|
412 | 412 | content = convert_line_endings(r_post.get('content', ''), unix_mode) |
|
413 | 413 | |
|
414 | 414 | message = r_post.get('message') or c.default_message |
|
415 | 415 | filename = r_post.get('filename') |
|
416 | 416 | location = r_post.get('location', '') |
|
417 | 417 | file_obj = r_post.get('upload_file', None) |
|
418 | 418 | |
|
419 | 419 | if file_obj is not None and hasattr(file_obj, 'filename'): |
|
420 | 420 | filename = file_obj.filename |
|
421 | 421 | content = file_obj.file |
|
422 | 422 | |
|
423 | 423 | if hasattr(content, 'file'): |
|
424 | 424 | # non posix systems store real file under file attr |
|
425 | 425 | content = content.file |
|
426 | 426 | |
|
427 | 427 | if not content: |
|
428 | 428 | webutils.flash(_('No content'), category='warning') |
|
429 | 429 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
430 | 430 | revision='tip')) |
|
431 | 431 | if not filename: |
|
432 | 432 | webutils.flash(_('No filename'), category='warning') |
|
433 | 433 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
434 | 434 | revision='tip')) |
|
435 | 435 | # strip all crap out of file, just leave the basename |
|
436 | 436 | filename = os.path.basename(filename) |
|
437 | 437 | node_path = posixpath.join(location, filename) |
|
438 | 438 | author = request.authuser.full_contact |
|
439 | 439 | |
|
440 | 440 | try: |
|
441 | 441 | nodes = { |
|
442 | 442 | node_path: { |
|
443 | 443 | 'content': content |
|
444 | 444 | } |
|
445 | 445 | } |
|
446 | 446 | self.scm_model.create_nodes( |
|
447 | 447 | user=request.authuser.user_id, |
|
448 | 448 | ip_addr=request.ip_addr, |
|
449 | 449 | repo=c.db_repo, |
|
450 | 450 | message=message, |
|
451 | 451 | nodes=nodes, |
|
452 | 452 | parent_cs=c.cs, |
|
453 | 453 | author=author, |
|
454 | 454 | ) |
|
455 | 455 | |
|
456 | 456 | webutils.flash(_('Successfully committed to %s') % node_path, |
|
457 | 457 | category='success') |
|
458 | 458 | except NonRelativePathError as e: |
|
459 | 459 | webutils.flash(_('Location must be relative path and must not ' |
|
460 | 460 | 'contain .. in path'), category='warning') |
|
461 | 461 | raise HTTPFound(location=url('changeset_home', repo_name=c.repo_name, |
|
462 | 462 | revision='tip')) |
|
463 | 463 | except (NodeError, NodeAlreadyExistsError) as e: |
|
464 | 464 | webutils.flash(_(e), category='error') |
|
465 | 465 | except Exception: |
|
466 | 466 | log.error(traceback.format_exc()) |
|
467 | 467 | webutils.flash(_('Error occurred during commit'), category='error') |
|
468 | 468 | raise HTTPFound(location=url('changeset_home', |
|
469 | 469 | repo_name=c.repo_name, revision='tip')) |
|
470 | 470 | |
|
471 | return render('files/files_add.html') | |
|
471 | return base.render('files/files_add.html') | |
|
472 | 472 | |
|
473 | 473 | @LoginRequired(allow_default_user=True) |
|
474 | 474 | @HasRepoPermissionLevelDecorator('read') |
|
475 | 475 | def archivefile(self, repo_name, fname): |
|
476 | 476 | fileformat = None |
|
477 | 477 | revision = None |
|
478 | 478 | ext = None |
|
479 | 479 | subrepos = request.GET.get('subrepos') == 'true' |
|
480 | 480 | |
|
481 | 481 | for a_type, ext_data in settings.ARCHIVE_SPECS.items(): |
|
482 | 482 | archive_spec = fname.split(ext_data[1]) |
|
483 | 483 | if len(archive_spec) == 2 and archive_spec[1] == '': |
|
484 | 484 | fileformat = a_type or ext_data[1] |
|
485 | 485 | revision = archive_spec[0] |
|
486 | 486 | ext = ext_data[1] |
|
487 | 487 | |
|
488 | 488 | try: |
|
489 | 489 | dbrepo = RepoModel().get_by_repo_name(repo_name) |
|
490 | 490 | if not dbrepo.enable_downloads: |
|
491 | 491 | return _('Downloads disabled') # TODO: do something else? |
|
492 | 492 | |
|
493 | 493 | if c.db_repo_scm_instance.alias == 'hg': |
|
494 | 494 | # patch and reset hooks section of UI config to not run any |
|
495 | 495 | # hooks on fetching archives with subrepos |
|
496 | 496 | for k, v in c.db_repo_scm_instance._repo.ui.configitems('hooks'): |
|
497 | 497 | c.db_repo_scm_instance._repo.ui.setconfig('hooks', k, None) |
|
498 | 498 | |
|
499 | 499 | cs = c.db_repo_scm_instance.get_changeset(revision) |
|
500 | 500 | content_type = settings.ARCHIVE_SPECS[fileformat][0] |
|
501 | 501 | except ChangesetDoesNotExistError: |
|
502 | 502 | return _('Unknown revision %s') % revision |
|
503 | 503 | except EmptyRepositoryError: |
|
504 | 504 | return _('Empty repository') |
|
505 | 505 | except (ImproperArchiveTypeError, KeyError): |
|
506 | 506 | return _('Unknown archive type') |
|
507 | 507 | |
|
508 | 508 | rev_name = cs.raw_id[:12] |
|
509 | 509 | archive_name = '%s-%s%s' % (repo_name.replace('/', '_'), rev_name, ext) |
|
510 | 510 | |
|
511 | 511 | archive_path = None |
|
512 | 512 | cached_archive_path = None |
|
513 | 513 | archive_cache_dir = kallithea.CONFIG.get('archive_cache_dir') |
|
514 | 514 | if archive_cache_dir and not subrepos: # TODO: subrepo caching? |
|
515 | 515 | if not os.path.isdir(archive_cache_dir): |
|
516 | 516 | os.makedirs(archive_cache_dir) |
|
517 | 517 | cached_archive_path = os.path.join(archive_cache_dir, archive_name) |
|
518 | 518 | if os.path.isfile(cached_archive_path): |
|
519 | 519 | log.debug('Found cached archive in %s', cached_archive_path) |
|
520 | 520 | archive_path = cached_archive_path |
|
521 | 521 | else: |
|
522 | 522 | log.debug('Archive %s is not yet cached', archive_name) |
|
523 | 523 | |
|
524 | 524 | if archive_path is None: |
|
525 | 525 | # generate new archive |
|
526 | 526 | fd, archive_path = tempfile.mkstemp() |
|
527 | 527 | log.debug('Creating new temp archive in %s', archive_path) |
|
528 | 528 | with os.fdopen(fd, 'wb') as stream: |
|
529 | 529 | cs.fill_archive(stream=stream, kind=fileformat, subrepos=subrepos) |
|
530 | 530 | # stream (and thus fd) has been closed by cs.fill_archive |
|
531 | 531 | if cached_archive_path is not None: |
|
532 | 532 | # we generated the archive - move it to cache |
|
533 | 533 | log.debug('Storing new archive in %s', cached_archive_path) |
|
534 | 534 | shutil.move(archive_path, cached_archive_path) |
|
535 | 535 | archive_path = cached_archive_path |
|
536 | 536 | |
|
537 | 537 | def get_chunked_archive(archive_path): |
|
538 | 538 | stream = open(archive_path, 'rb') |
|
539 | 539 | while True: |
|
540 | 540 | data = stream.read(16 * 1024) |
|
541 | 541 | if not data: |
|
542 | 542 | break |
|
543 | 543 | yield data |
|
544 | 544 | stream.close() |
|
545 | 545 | if archive_path != cached_archive_path: |
|
546 | 546 | log.debug('Destroying temp archive %s', archive_path) |
|
547 | 547 | os.remove(archive_path) |
|
548 | 548 | |
|
549 | 549 | userlog.action_logger(user=request.authuser, |
|
550 | 550 | action='user_downloaded_archive:%s' % (archive_name), |
|
551 | 551 | repo=repo_name, ipaddr=request.ip_addr, commit=True) |
|
552 | 552 | |
|
553 | 553 | response.content_disposition = str('attachment; filename=%s' % (archive_name)) |
|
554 | 554 | response.content_type = str(content_type) |
|
555 | 555 | return get_chunked_archive(archive_path) |
|
556 | 556 | |
|
557 | 557 | @LoginRequired(allow_default_user=True) |
|
558 | 558 | @HasRepoPermissionLevelDecorator('read') |
|
559 | 559 | def diff(self, repo_name, f_path): |
|
560 | 560 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
561 | 561 | diff_context_size = h.get_diff_context_size(request.GET) |
|
562 | 562 | diff2 = request.GET.get('diff2', '') |
|
563 | 563 | diff1 = request.GET.get('diff1', '') or diff2 |
|
564 | 564 | c.action = request.GET.get('diff') |
|
565 | 565 | c.no_changes = diff1 == diff2 |
|
566 | 566 | c.f_path = f_path |
|
567 | 567 | c.big_diff = False |
|
568 | 568 | fulldiff = request.GET.get('fulldiff') |
|
569 | 569 | c.changes = OrderedDict() |
|
570 | 570 | c.changes[diff2] = [] |
|
571 | 571 | |
|
572 | 572 | # special case if we want to show a rev only; it's implemented here

573 | 573 | # to reduce JS and callbacks
|
574 | 574 | |
|
575 | 575 | if request.GET.get('show_rev'): |
|
576 | 576 | if asbool(request.GET.get('annotate', 'False')): |
|
577 | 577 | _url = url('files_annotate_home', repo_name=c.repo_name, |
|
578 | 578 | revision=diff1, f_path=c.f_path) |
|
579 | 579 | else: |
|
580 | 580 | _url = url('files_home', repo_name=c.repo_name, |
|
581 | 581 | revision=diff1, f_path=c.f_path) |
|
582 | 582 | |
|
583 | 583 | raise HTTPFound(location=_url) |
|
584 | 584 | try: |
|
585 | 585 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
586 | 586 | c.changeset_1 = c.db_repo_scm_instance.get_changeset(diff1) |
|
587 | 587 | try: |
|
588 | 588 | node1 = c.changeset_1.get_node(f_path) |
|
589 | 589 | if node1.is_dir(): |
|
590 | 590 | raise NodeError('%s path is a %s not a file' |
|
591 | 591 | % (node1, type(node1))) |
|
592 | 592 | except NodeDoesNotExistError: |
|
593 | 593 | c.changeset_1 = EmptyChangeset(cs=diff1, |
|
594 | 594 | revision=c.changeset_1.revision, |
|
595 | 595 | repo=c.db_repo_scm_instance) |
|
596 | 596 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
597 | 597 | else: |
|
598 | 598 | c.changeset_1 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
599 | 599 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
600 | 600 | |
|
601 | 601 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
602 | 602 | c.changeset_2 = c.db_repo_scm_instance.get_changeset(diff2) |
|
603 | 603 | try: |
|
604 | 604 | node2 = c.changeset_2.get_node(f_path) |
|
605 | 605 | if node2.is_dir(): |
|
606 | 606 | raise NodeError('%s path is a %s not a file' |
|
607 | 607 | % (node2, type(node2))) |
|
608 | 608 | except NodeDoesNotExistError: |
|
609 | 609 | c.changeset_2 = EmptyChangeset(cs=diff2, |
|
610 | 610 | revision=c.changeset_2.revision, |
|
611 | 611 | repo=c.db_repo_scm_instance) |
|
612 | 612 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
613 | 613 | else: |
|
614 | 614 | c.changeset_2 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
615 | 615 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
616 | 616 | except (RepositoryError, NodeError): |
|
617 | 617 | log.error(traceback.format_exc()) |
|
618 | 618 | raise HTTPFound(location=url('files_home', repo_name=c.repo_name, |
|
619 | 619 | f_path=f_path)) |
|
620 | 620 | |
|
621 | 621 | if c.action == 'download': |
|
622 | 622 | raw_diff = diffs.get_gitdiff(node1, node2, |
|
623 | 623 | ignore_whitespace=ignore_whitespace_diff, |
|
624 | 624 | context=diff_context_size) |
|
625 | 625 | diff_name = '%s_vs_%s.diff' % (diff1, diff2) |
|
626 | 626 | response.content_type = 'text/plain' |
|
627 | 627 | response.content_disposition = ( |
|
628 | 628 | 'attachment; filename=%s' % diff_name |
|
629 | 629 | ) |
|
630 | 630 | return raw_diff |
|
631 | 631 | |
|
632 | 632 | elif c.action == 'raw': |
|
633 | 633 | raw_diff = diffs.get_gitdiff(node1, node2, |
|
634 | 634 | ignore_whitespace=ignore_whitespace_diff, |
|
635 | 635 | context=diff_context_size) |
|
636 | 636 | response.content_type = 'text/plain' |
|
637 | 637 | return raw_diff |
|
638 | 638 | |
|
639 | 639 | else: |
|
640 | 640 | fid = h.FID(diff2, node2.path) |
|
641 | 641 | diff_limit = None if fulldiff else self.cut_off_limit |
|
642 | 642 | c.a_rev, c.cs_rev, a_path, diff, st, op = diffs.wrapped_diff(filenode_old=node1, |
|
643 | 643 | filenode_new=node2, |
|
644 | 644 | diff_limit=diff_limit, |
|
645 | 645 | ignore_whitespace=ignore_whitespace_diff, |
|
646 | 646 | line_context=diff_context_size) |
|
647 | 647 | c.file_diff_data = [(fid, fid, op, a_path, node2.path, diff, st)] |
|
648 | return render('files/file_diff.html') | |
|
648 | return base.render('files/file_diff.html') | |
|
649 | 649 | |
|
650 | 650 | @LoginRequired(allow_default_user=True) |
|
651 | 651 | @HasRepoPermissionLevelDecorator('read') |
|
652 | 652 | def diff_2way(self, repo_name, f_path): |
|
653 | 653 | diff1 = request.GET.get('diff1', '') |
|
654 | 654 | diff2 = request.GET.get('diff2', '') |
|
655 | 655 | try: |
|
656 | 656 | if diff1 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
657 | 657 | c.changeset_1 = c.db_repo_scm_instance.get_changeset(diff1) |
|
658 | 658 | try: |
|
659 | 659 | node1 = c.changeset_1.get_node(f_path) |
|
660 | 660 | if node1.is_dir(): |
|
661 | 661 | raise NodeError('%s path is a %s not a file' |
|
662 | 662 | % (node1, type(node1))) |
|
663 | 663 | except NodeDoesNotExistError: |
|
664 | 664 | c.changeset_1 = EmptyChangeset(cs=diff1, |
|
665 | 665 | revision=c.changeset_1.revision, |
|
666 | 666 | repo=c.db_repo_scm_instance) |
|
667 | 667 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
668 | 668 | else: |
|
669 | 669 | c.changeset_1 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
670 | 670 | node1 = FileNode(f_path, '', changeset=c.changeset_1) |
|
671 | 671 | |
|
672 | 672 | if diff2 not in ['', None, 'None', '0' * 12, '0' * 40]: |
|
673 | 673 | c.changeset_2 = c.db_repo_scm_instance.get_changeset(diff2) |
|
674 | 674 | try: |
|
675 | 675 | node2 = c.changeset_2.get_node(f_path) |
|
676 | 676 | if node2.is_dir(): |
|
677 | 677 | raise NodeError('%s path is a %s not a file' |
|
678 | 678 | % (node2, type(node2))) |
|
679 | 679 | except NodeDoesNotExistError: |
|
680 | 680 | c.changeset_2 = EmptyChangeset(cs=diff2, |
|
681 | 681 | revision=c.changeset_2.revision, |
|
682 | 682 | repo=c.db_repo_scm_instance) |
|
683 | 683 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
684 | 684 | else: |
|
685 | 685 | c.changeset_2 = EmptyChangeset(repo=c.db_repo_scm_instance) |
|
686 | 686 | node2 = FileNode(f_path, '', changeset=c.changeset_2) |
|
687 | 687 | except ChangesetDoesNotExistError as e: |
|
688 | 688 | msg = _('Such revision does not exist for this repository') |
|
689 | 689 | webutils.flash(msg, category='error') |
|
690 | 690 | raise HTTPNotFound() |
|
691 | 691 | c.node1 = node1 |
|
692 | 692 | c.node2 = node2 |
|
693 | 693 | c.cs1 = c.changeset_1 |
|
694 | 694 | c.cs2 = c.changeset_2 |
|
695 | 695 | |
|
696 | return render('files/diff_2way.html') | |
|
696 | return base.render('files/diff_2way.html') | |
|
697 | 697 | |
|
698 | 698 | def _get_node_history(self, cs, f_path, changesets=None): |
|
699 | 699 | """ |
|
700 | 700 | Get changeset history for the given node
|
701 | 701 | |
|
702 | 702 | :param cs: changeset to calculate history |
|
703 | 703 | :param f_path: path for node to calculate history for |
|
704 | 704 | :param changesets: if passed don't calculate history and take |
|
705 | 705 | changesets defined in this list |
|
706 | 706 | """ |
|
707 | 707 | # calculate history based on tip |
|
708 | 708 | tip_cs = c.db_repo_scm_instance.get_changeset() |
|
709 | 709 | if changesets is None: |
|
710 | 710 | try: |
|
711 | 711 | changesets = tip_cs.get_file_history(f_path) |
|
712 | 712 | except (NodeDoesNotExistError, ChangesetError): |
|
713 | 713 | # this node is not present at tip!
|
714 | 714 | changesets = cs.get_file_history(f_path) |
|
715 | 715 | hist_l = [] |
|
716 | 716 | |
|
717 | 717 | changesets_group = ([], _("Changesets")) |
|
718 | 718 | branches_group = ([], _("Branches")) |
|
719 | 719 | tags_group = ([], _("Tags")) |
|
720 | 720 | for chs in changesets: |
|
721 | 721 | # TODO: loop over chs.branches ... but that will not give all the bogus None branches for Git ... |
|
722 | 722 | _branch = chs.branch |
|
723 | 723 | n_desc = '%s (%s)' % (h.show_id(chs), _branch) |
|
724 | 724 | changesets_group[0].append((chs.raw_id, n_desc,)) |
|
725 | 725 | hist_l.append(changesets_group) |
|
726 | 726 | |
|
727 | 727 | for name, chs in c.db_repo_scm_instance.branches.items(): |
|
728 | 728 | branches_group[0].append((chs, name),) |
|
729 | 729 | hist_l.append(branches_group) |
|
730 | 730 | |
|
731 | 731 | for name, chs in c.db_repo_scm_instance.tags.items(): |
|
732 | 732 | tags_group[0].append((chs, name),) |
|
733 | 733 | hist_l.append(tags_group) |
|
734 | 734 | |
|
735 | 735 | return hist_l, changesets |
|
736 | 736 | |
|
737 | 737 | @LoginRequired(allow_default_user=True) |
|
738 | 738 | @HasRepoPermissionLevelDecorator('read') |
|
739 | @jsonify | |
|
739 | @base.jsonify | |
|
740 | 740 | def nodelist(self, repo_name, revision, f_path): |
|
741 | 741 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
742 | 742 | cs = self.__get_cs(revision) |
|
743 | 743 | _d, _f = ScmModel().get_nodes(repo_name, cs.raw_id, f_path, |
|
744 | 744 | flat=False) |
|
745 | 745 | return {'nodes': _d + _f} |
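
``archivefile`` above streams the generated archive back to the client through a small generator instead of loading it into memory. A minimal sketch of that chunked-streaming pattern on its own, with an illustrative function name::

    import os

    def stream_file_chunks(path, chunk_size=16 * 1024, remove_after=False):
        """Yield a file's contents in fixed-size chunks, optionally removing the file afterwards."""
        with open(path, 'rb') as stream:
            while True:
                data = stream.read(chunk_size)
                if not data:
                    break
                yield data
        if remove_after:
            # only safe once the consumer has exhausted the generator
            os.remove(path)
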
@@ -1,57 +1,57 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.followers |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Followers controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 23, 2011 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | |
|
30 | 30 | from tg import request |
|
31 | 31 | from tg import tmpl_context as c |
|
32 | 32 | |
|
33 | from kallithea.controllers import base | |
|
33 | 34 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
34 | from kallithea.lib.base import BaseRepoController, render | |
|
35 | 35 | from kallithea.lib.page import Page |
|
36 | 36 | from kallithea.lib.utils2 import safe_int |
|
37 | 37 | from kallithea.model import db |
|
38 | 38 | |
|
39 | 39 | |
|
40 | 40 | log = logging.getLogger(__name__) |
|
41 | 41 | |
|
42 | 42 | |
|
43 | class FollowersController(BaseRepoController): | |
|
43 | class FollowersController(base.BaseRepoController): | |
|
44 | 44 | |
|
45 | 45 | @LoginRequired(allow_default_user=True) |
|
46 | 46 | @HasRepoPermissionLevelDecorator('read') |
|
47 | 47 | def followers(self, repo_name): |
|
48 | 48 | p = safe_int(request.GET.get('page'), 1) |
|
49 | 49 | repo_id = c.db_repo.repo_id |
|
50 | 50 | d = db.UserFollowing.get_repo_followers(repo_id) \ |
|
51 | 51 | .order_by(db.UserFollowing.follows_from) |
|
52 | 52 | c.followers_pager = Page(d, page=p, items_per_page=20) |
|
53 | 53 | |
|
54 | 54 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
55 | return render('/followers/followers_data.html') | |
|
55 | return base.render('/followers/followers_data.html') | |
|
56 | 56 | |
|
57 | return render('/followers/followers.html') | |
|
57 | return base.render('/followers/followers.html') |
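
The followers controller, like several others in this series, renders only a data fragment when the request carries the ``HTTP_X_PARTIAL_XHR`` marker and the full page otherwise. Reduced to a helper (hypothetical; Kallithea inlines this check in each action), the dispatch looks like::

    def render_partial_or_full(environ, render, partial_template, full_template):
        """Render just the data fragment for partial XHR requests, the full page otherwise."""
        if environ.get('HTTP_X_PARTIAL_XHR'):
            return render(partial_template)
        return render(full_template)

    # e.g. render_partial_or_full(request.environ, base.render,
    #                             '/followers/followers_data.html',
    #                             '/followers/followers.html')
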
@@ -1,173 +1,173 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.forks |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | forks controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 23, 2011 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | from tg import request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPFound, HTTPNotFound |
|
37 | 37 | |
|
38 | 38 | import kallithea |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib import webutils |
|
40 | 41 | from kallithea.lib.auth import HasPermissionAnyDecorator, HasRepoPermissionLevel, HasRepoPermissionLevelDecorator, LoginRequired |
|
41 | from kallithea.lib.base import BaseRepoController, render | |
|
42 | 42 | from kallithea.lib.page import Page |
|
43 | 43 | from kallithea.lib.utils2 import safe_int |
|
44 | 44 | from kallithea.model import db |
|
45 | 45 | from kallithea.model.forms import RepoForkForm |
|
46 | 46 | from kallithea.model.repo import RepoModel |
|
47 | 47 | from kallithea.model.scm import AvailableRepoGroupChoices, ScmModel |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | log = logging.getLogger(__name__) |
|
51 | 51 | |
|
52 | 52 | |
|
53 | class ForksController(BaseRepoController): | |
|
53 | class ForksController(base.BaseRepoController): | |
|
54 | 54 | |
|
55 | 55 | def __load_defaults(self): |
|
56 | 56 | c.repo_groups = AvailableRepoGroupChoices('write') |
|
57 | 57 | |
|
58 | 58 | c.landing_revs_choices, c.landing_revs = ScmModel().get_repo_landing_revs() |
|
59 | 59 | |
|
60 | 60 | c.can_update = db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE).ui_active |
|
61 | 61 | |
|
62 | 62 | def __load_data(self): |
|
63 | 63 | """ |
|
64 | 64 | Load default settings for edit and update
|
65 | 65 | """ |
|
66 | 66 | self.__load_defaults() |
|
67 | 67 | |
|
68 | 68 | c.repo_info = c.db_repo |
|
69 | 69 | repo = c.db_repo.scm_instance |
|
70 | 70 | |
|
71 | 71 | if c.repo_info is None: |
|
72 | 72 | raise HTTPNotFound() |
|
73 | 73 | |
|
74 | 74 | c.default_user_id = kallithea.DEFAULT_USER_ID |
|
75 | 75 | c.in_public_journal = db.UserFollowing.query() \ |
|
76 | 76 | .filter(db.UserFollowing.user_id == c.default_user_id) \ |
|
77 | 77 | .filter(db.UserFollowing.follows_repository == c.repo_info).scalar() |
|
78 | 78 | |
|
79 | 79 | if c.repo_info.stats: |
|
80 | 80 | last_rev = c.repo_info.stats.stat_on_revision + 1 |
|
81 | 81 | else: |
|
82 | 82 | last_rev = 0 |
|
83 | 83 | c.stats_revision = last_rev |
|
84 | 84 | |
|
85 | 85 | c.repo_last_rev = repo.count() if repo.revisions else 0 |
|
86 | 86 | |
|
87 | 87 | if last_rev == 0 or c.repo_last_rev == 0: |
|
88 | 88 | c.stats_percentage = 0 |
|
89 | 89 | else: |
|
90 | 90 | c.stats_percentage = '%.2f' % ((float((last_rev)) / |
|
91 | 91 | c.repo_last_rev) * 100) |
|
92 | 92 | |
|
93 | 93 | defaults = RepoModel()._get_defaults(c.repo_name) |
|
94 | 94 | # alter the description to indicate a fork |
|
95 | 95 | defaults['description'] = ('fork of repository: %s \n%s' |
|
96 | 96 | % (defaults['repo_name'], |
|
97 | 97 | defaults['description'])) |
|
98 | 98 | # add suffix to fork |
|
99 | 99 | defaults['repo_name'] = '%s-fork' % defaults['repo_name'] |
|
100 | 100 | |
|
101 | 101 | return defaults |
|
102 | 102 | |
|
103 | 103 | @LoginRequired(allow_default_user=True) |
|
104 | 104 | @HasRepoPermissionLevelDecorator('read') |
|
105 | 105 | def forks(self, repo_name): |
|
106 | 106 | p = safe_int(request.GET.get('page'), 1) |
|
107 | 107 | repo_id = c.db_repo.repo_id |
|
108 | 108 | d = [] |
|
109 | 109 | for r in db.Repository.get_repo_forks(repo_id): |
|
110 | 110 | if not HasRepoPermissionLevel('read')(r.repo_name, 'get forks check'): |
|
111 | 111 | continue |
|
112 | 112 | d.append(r) |
|
113 | 113 | c.forks_pager = Page(d, page=p, items_per_page=20) |
|
114 | 114 | |
|
115 | 115 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
116 | return render('/forks/forks_data.html') | |
|
116 | return base.render('/forks/forks_data.html') | |
|
117 | 117 | |
|
118 | return render('/forks/forks.html') | |
|
118 | return base.render('/forks/forks.html') | |
|
119 | 119 | |
|
120 | 120 | @LoginRequired() |
|
121 | 121 | @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository') |
|
122 | 122 | @HasRepoPermissionLevelDecorator('read') |
|
123 | 123 | def fork(self, repo_name): |
|
124 | 124 | c.repo_info = db.Repository.get_by_repo_name(repo_name) |
|
125 | 125 | if not c.repo_info: |
|
126 | 126 | raise HTTPNotFound() |
|
127 | 127 | |
|
128 | 128 | defaults = self.__load_data() |
|
129 | 129 | |
|
130 | 130 | return htmlfill.render( |
|
131 | render('forks/fork.html'), | |
|
131 | base.render('forks/fork.html'), | |
|
132 | 132 | defaults=defaults, |
|
133 | 133 | encoding="UTF-8", |
|
134 | 134 | force_defaults=False) |
|
135 | 135 | |
|
136 | 136 | @LoginRequired() |
|
137 | 137 | @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository') |
|
138 | 138 | @HasRepoPermissionLevelDecorator('read') |
|
139 | 139 | def fork_create(self, repo_name): |
|
140 | 140 | self.__load_defaults() |
|
141 | 141 | c.repo_info = db.Repository.get_by_repo_name(repo_name) |
|
142 | 142 | _form = RepoForkForm(old_data={'repo_type': c.repo_info.repo_type}, |
|
143 | 143 | repo_groups=c.repo_groups, |
|
144 | 144 | landing_revs=c.landing_revs_choices)() |
|
145 | 145 | form_result = {} |
|
146 | 146 | task_id = None |
|
147 | 147 | try: |
|
148 | 148 | form_result = _form.to_python(dict(request.POST)) |
|
149 | 149 | |
|
150 | 150 | # an approximation that is better than nothing |
|
151 | 151 | if not db.Ui.get_by_key('hooks', db.Ui.HOOK_UPDATE).ui_active: |
|
152 | 152 | form_result['update_after_clone'] = False |
|
153 | 153 | |
|
154 | 154 | # create fork is done sometimes async on celery, db transaction |
|
155 | 155 | # management is handled there. |
|
156 | 156 | task = RepoModel().create_fork(form_result, request.authuser.user_id) |
|
157 | 157 | task_id = task.task_id |
|
158 | 158 | except formencode.Invalid as errors: |
|
159 | 159 | return htmlfill.render( |
|
160 | render('forks/fork.html'), | |
|
160 | base.render('forks/fork.html'), | |
|
161 | 161 | defaults=errors.value, |
|
162 | 162 | errors=errors.error_dict or {}, |
|
163 | 163 | prefix_error=False, |
|
164 | 164 | encoding="UTF-8", |
|
165 | 165 | force_defaults=False) |
|
166 | 166 | except Exception: |
|
167 | 167 | log.error(traceback.format_exc()) |
|
168 | 168 | webutils.flash(_('An error occurred during repository forking %s') % |
|
169 | 169 | repo_name, category='error') |
|
170 | 170 | |
|
171 | 171 | raise HTTPFound(location=webutils.url('repo_creating_home', |
|
172 | 172 | repo_name=form_result['repo_name_full'], |
|
173 | 173 | task_id=task_id)) |
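
``fork_create`` above follows the usual formencode flow: validate the POST data against a schema, and on ``formencode.Invalid`` re-render the form with ``htmlfill`` carrying the errors. A minimal sketch of that flow with a made-up schema (Kallithea's real ``RepoForkForm`` additionally takes repo groups and landing revisions as arguments)::

    import formencode
    from formencode import validators

    class ForkForm(formencode.Schema):
        # illustrative schema, not Kallithea's RepoForkForm
        allow_extra_fields = True
        filter_extra_fields = True
        repo_name = validators.UnicodeString(not_empty=True)
        description = validators.UnicodeString(if_missing='')

    def validate_fork(post_data):
        """Return (form_result, errors); exactly one of the two is None."""
        try:
            return ForkForm().to_python(dict(post_data)), None
        except formencode.Invalid as errors:
            return None, errors.error_dict or {}
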
@@ -1,211 +1,211 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.home |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Home controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Feb 18, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | |
|
27 | 27 | """ |
|
28 | 28 | |
|
29 | 29 | import logging |
|
30 | 30 | |
|
31 | 31 | from sqlalchemy import or_ |
|
32 | 32 | from tg import request |
|
33 | 33 | from tg import tmpl_context as c |
|
34 | 34 | from tg.i18n import ugettext as _ |
|
35 | 35 | from webob.exc import HTTPBadRequest |
|
36 | 36 | |
|
37 | 37 | import kallithea.lib.helpers as h |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
39 | from kallithea.lib.base import BaseController, jsonify, render | |
|
40 | 40 | from kallithea.lib.utils2 import safe_str |
|
41 | 41 | from kallithea.model import db |
|
42 | 42 | from kallithea.model.repo import RepoModel |
|
43 | 43 | from kallithea.model.scm import UserGroupList |
|
44 | 44 | |
|
45 | 45 | |
|
46 | 46 | log = logging.getLogger(__name__) |
|
47 | 47 | |
|
48 | 48 | |
|
49 | class HomeController(BaseController): | |
|
49 | class HomeController(base.BaseController): | |
|
50 | 50 | |
|
51 | 51 | def about(self): |
|
52 | return render('/about.html') | |
|
52 | return base.render('/about.html') | |
|
53 | 53 | |
|
54 | 54 | @LoginRequired(allow_default_user=True) |
|
55 | 55 | def index(self): |
|
56 | 56 | c.group = None |
|
57 | 57 | |
|
58 | 58 | repo_groups_list = self.scm_model.get_repo_groups() |
|
59 | 59 | repos_list = db.Repository.query(sorted=True).filter_by(group=None).all() |
|
60 | 60 | |
|
61 | 61 | c.data = RepoModel().get_repos_as_dict(repos_list, |
|
62 | 62 | repo_groups_list=repo_groups_list, |
|
63 | 63 | short_name=True) |
|
64 | 64 | |
|
65 | return render('/index.html') | |
|
65 | return base.render('/index.html') | |
|
66 | 66 | |
|
67 | 67 | @LoginRequired(allow_default_user=True) |
|
68 | @jsonify | |
|
68 | @base.jsonify | |
|
69 | 69 | def repo_switcher_data(self): |
|
70 | 70 | if request.is_xhr: |
|
71 | 71 | all_repos = db.Repository.query(sorted=True).all() |
|
72 | 72 | repo_iter = self.scm_model.get_repos(all_repos) |
|
73 | 73 | all_groups = db.RepoGroup.query(sorted=True).all() |
|
74 | 74 | repo_groups_iter = self.scm_model.get_repo_groups(all_groups) |
|
75 | 75 | |
|
76 | 76 | res = [{ |
|
77 | 77 | 'text': _('Groups'), |
|
78 | 78 | 'children': [ |
|
79 | 79 | {'id': obj.group_name, |
|
80 | 80 | 'text': obj.group_name, |
|
81 | 81 | 'type': 'group', |
|
82 | 82 | 'obj': {}} |
|
83 | 83 | for obj in repo_groups_iter |
|
84 | 84 | ], |
|
85 | 85 | }, |
|
86 | 86 | { |
|
87 | 87 | 'text': _('Repositories'), |
|
88 | 88 | 'children': [ |
|
89 | 89 | {'id': obj.repo_name, |
|
90 | 90 | 'text': obj.repo_name, |
|
91 | 91 | 'type': 'repo', |
|
92 | 92 | 'obj': obj.get_dict()} |
|
93 | 93 | for obj in repo_iter |
|
94 | 94 | ], |
|
95 | 95 | }] |
|
96 | 96 | |
|
97 | 97 | for res_dict in res: |
|
98 | 98 | for child in (res_dict['children']): |
|
99 | 99 | child['obj'].pop('_changeset_cache', None) # bytes cannot be encoded in json ... but this value isn't relevant on client side at all ... |
|
100 | 100 | |
|
101 | 101 | data = { |
|
102 | 102 | 'more': False, |
|
103 | 103 | 'results': res, |
|
104 | 104 | } |
|
105 | 105 | return data |
|
106 | 106 | |
|
107 | 107 | else: |
|
108 | 108 | raise HTTPBadRequest() |
|
109 | 109 | |
|
110 | 110 | @LoginRequired(allow_default_user=True) |
|
111 | 111 | @HasRepoPermissionLevelDecorator('read') |
|
112 | @jsonify | |
|
112 | @base.jsonify | |
|
113 | 113 | def repo_refs_data(self, repo_name): |
|
114 | 114 | repo = db.Repository.get_by_repo_name(repo_name).scm_instance |
|
115 | 115 | res = [] |
|
116 | 116 | _branches = repo.branches.items() |
|
117 | 117 | if _branches: |
|
118 | 118 | res.append({ |
|
119 | 119 | 'text': _('Branch'), |
|
120 | 120 | 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'branch'} for name, rev in _branches] |
|
121 | 121 | }) |
|
122 | 122 | _closed_branches = repo.closed_branches.items() |
|
123 | 123 | if _closed_branches: |
|
124 | 124 | res.append({ |
|
125 | 125 | 'text': _('Closed Branches'), |
|
126 | 126 | 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'closed-branch'} for name, rev in _closed_branches] |
|
127 | 127 | }) |
|
128 | 128 | _tags = repo.tags.items() |
|
129 | 129 | if _tags: |
|
130 | 130 | res.append({ |
|
131 | 131 | 'text': _('Tag'), |
|
132 | 132 | 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'tag'} for name, rev in _tags] |
|
133 | 133 | }) |
|
134 | 134 | _bookmarks = repo.bookmarks.items() |
|
135 | 135 | if _bookmarks: |
|
136 | 136 | res.append({ |
|
137 | 137 | 'text': _('Bookmark'), |
|
138 | 138 | 'children': [{'id': safe_str(rev), 'text': safe_str(name), 'type': 'book'} for name, rev in _bookmarks] |
|
139 | 139 | }) |
|
140 | 140 | data = { |
|
141 | 141 | 'more': False, |
|
142 | 142 | 'results': res |
|
143 | 143 | } |
|
144 | 144 | return data |
|
145 | 145 | |
|
146 | 146 | @LoginRequired() |
|
147 | @jsonify | |
|
147 | @base.jsonify | |
|
148 | 148 | def users_and_groups_data(self): |
|
149 | 149 | """ |
|
150 | 150 | Returns 'results' with a list of users and user groups. |
|
151 | 151 | |
|
152 | 152 | You can either use the 'key' GET parameter to get a user by providing |
|
153 | 153 | the exact user key or you can use the 'query' parameter to |
|
154 | 154 | search for users by user key, first name and last name. |
|
155 | 155 | 'types' defaults to just 'users' but can be set to 'users,groups' to |
|
156 | 156 | get both users and groups. |
|
157 | 157 | No more than 500 results (of each kind) will be returned. |
|
158 | 158 | """ |
|
159 | 159 | types = request.GET.get('types', 'users').split(',') |
|
160 | 160 | key = request.GET.get('key', '') |
|
161 | 161 | query = request.GET.get('query', '') |
|
162 | 162 | results = [] |
|
163 | 163 | if 'users' in types: |
|
164 | 164 | user_list = [] |
|
165 | 165 | if key: |
|
166 | 166 | u = db.User.get_by_username(key) |
|
167 | 167 | if u: |
|
168 | 168 | user_list = [u] |
|
169 | 169 | elif query: |
|
170 | 170 | user_list = db.User.query() \ |
|
171 | 171 | .filter(db.User.is_default_user == False) \ |
|
172 | 172 | .filter(db.User.active == True) \ |
|
173 | 173 | .filter(or_( |
|
174 | 174 | db.User.username.ilike("%%" + query + "%%"), |
|
175 | 175 | db.User.name.concat(' ').concat(db.User.lastname).ilike("%%" + query + "%%"), |
|
176 | 176 | db.User.lastname.concat(' ').concat(db.User.name).ilike("%%" + query + "%%"), |
|
177 | 177 | db.User.email.ilike("%%" + query + "%%"), |
|
178 | 178 | )) \ |
|
179 | 179 | .order_by(db.User.username) \ |
|
180 | 180 | .limit(500) \ |
|
181 | 181 | .all() |
|
182 | 182 | for u in user_list: |
|
183 | 183 | results.append({ |
|
184 | 184 | 'type': 'user', |
|
185 | 185 | 'id': u.user_id, |
|
186 | 186 | 'nname': u.username, |
|
187 | 187 | 'fname': u.name, |
|
188 | 188 | 'lname': u.lastname, |
|
189 | 189 | 'gravatar_lnk': h.gravatar_url(u.email, size=28, default='default'), |
|
190 | 190 | 'gravatar_size': 14, |
|
191 | 191 | }) |
|
192 | 192 | if 'groups' in types: |
|
193 | 193 | grp_list = [] |
|
194 | 194 | if key: |
|
195 | 195 | grp = db.UserGroup.get_by_group_name(key) |
|
196 | 196 | if grp: |
|
197 | 197 | grp_list = [grp] |
|
198 | 198 | elif query: |
|
199 | 199 | grp_list = db.UserGroup.query() \ |
|
200 | 200 | .filter(db.UserGroup.users_group_name.ilike("%%" + query + "%%")) \ |
|
201 | 201 | .filter(db.UserGroup.users_group_active == True) \ |
|
202 | 202 | .order_by(db.UserGroup.users_group_name) \ |
|
203 | 203 | .limit(500) \ |
|
204 | 204 | .all() |
|
205 | 205 | for g in UserGroupList(grp_list, perm_level='read'): |
|
206 | 206 | results.append({ |
|
207 | 207 | 'type': 'group', |
|
208 | 208 | 'id': g.users_group_id, |
|
209 | 209 | 'grname': g.users_group_name, |
|
210 | 210 | }) |
|
211 | 211 | return dict(results=results) |
@@ -1,275 +1,275 @@ kallithea/controllers/journal.py
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.journal |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Journal controller |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Nov 21, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | from itertools import groupby |
|
31 | 31 | |
|
32 | 32 | from sqlalchemy import or_ |
|
33 | 33 | from sqlalchemy.orm import joinedload |
|
34 | 34 | from tg import request, response |
|
35 | 35 | from tg import tmpl_context as c |
|
36 | 36 | from tg.i18n import ugettext as _ |
|
37 | 37 | from webob.exc import HTTPBadRequest |
|
38 | 38 | |
|
39 | 39 | import kallithea.lib.helpers as h |
|
40 | from kallithea.controllers import base | |
|
40 | 41 | from kallithea.controllers.admin.admin import _journal_filter |
|
41 | 42 | from kallithea.lib import feeds, webutils |
|
42 | 43 | from kallithea.lib.auth import LoginRequired |
|
43 | from kallithea.lib.base import BaseController, render | |
|
44 | 44 | from kallithea.lib.page import Page |
|
45 | 45 | from kallithea.lib.utils2 import AttributeDict, safe_int |
|
46 | 46 | from kallithea.model import db, meta |
|
47 | 47 | from kallithea.model.repo import RepoModel |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | log = logging.getLogger(__name__) |
|
51 | 51 | |
|
52 | 52 | |
|
53 | 53 | language = 'en-us' |
|
54 | 54 | ttl = "5" |
|
55 | 55 | feed_nr = 20 |
|
56 | 56 | |
|
57 | 57 | |
|
58 | class JournalController(BaseController): | |
|
58 | class JournalController(base.BaseController): | |
|
59 | 59 | |
|
60 | 60 | def _before(self, *args, **kwargs): |
|
61 | 61 | super(JournalController, self)._before(*args, **kwargs) |
|
62 | 62 | c.search_term = request.GET.get('filter') |
|
63 | 63 | |
|
64 | 64 | def _get_daily_aggregate(self, journal): |
|
65 | 65 | groups = [] |
|
66 | 66 | for k, g in groupby(journal, lambda x: x.action_as_day): |
|
67 | 67 | user_group = [] |
|
68 | 68 | # groupby username if it's a present value, else fallback to journal username |
|
69 | 69 | for _unused, g2 in groupby(list(g), lambda x: x.user.username if x.user else x.username): |
|
70 | 70 | l = list(g2) |
|
71 | 71 | user_group.append((l[0].user, l)) |
|
72 | 72 | |
|
73 | 73 | groups.append((k, user_group,)) |
|
74 | 74 | |
|
75 | 75 | return groups |
|
76 | 76 | |
|
77 | 77 | def _get_journal_data(self, following_repos): |
|
78 | 78 | repo_ids = [x.follows_repository_id for x in following_repos |
|
79 | 79 | if x.follows_repository_id is not None] |
|
80 | 80 | user_ids = [x.follows_user_id for x in following_repos |
|
81 | 81 | if x.follows_user_id is not None] |
|
82 | 82 | |
|
83 | 83 | filtering_criterion = None |
|
84 | 84 | |
|
85 | 85 | if repo_ids and user_ids: |
|
86 | 86 | filtering_criterion = or_(db.UserLog.repository_id.in_(repo_ids), |
|
87 | 87 | db.UserLog.user_id.in_(user_ids)) |
|
88 | 88 | if repo_ids and not user_ids: |
|
89 | 89 | filtering_criterion = db.UserLog.repository_id.in_(repo_ids) |
|
90 | 90 | if not repo_ids and user_ids: |
|
91 | 91 | filtering_criterion = db.UserLog.user_id.in_(user_ids) |
|
92 | 92 | if filtering_criterion is not None: |
|
93 | 93 | journal = db.UserLog.query() \ |
|
94 | 94 | .options(joinedload(db.UserLog.user)) \ |
|
95 | 95 | .options(joinedload(db.UserLog.repository)) |
|
96 | 96 | # filter |
|
97 | 97 | journal = _journal_filter(journal, c.search_term) |
|
98 | 98 | journal = journal.filter(filtering_criterion) \ |
|
99 | 99 | .order_by(db.UserLog.action_date.desc()) |
|
100 | 100 | else: |
|
101 | 101 | journal = [] |
|
102 | 102 | |
|
103 | 103 | return journal |
|
104 | 104 | |
|
105 | 105 | def _feed(self, repos, feeder, link, desc): |
|
106 | 106 | response.content_type = feeder.content_type |
|
107 | 107 | journal = self._get_journal_data(repos) |
|
108 | 108 | |
|
109 | 109 | header = dict( |
|
110 | 110 | title=desc, |
|
111 | 111 | link=link, |
|
112 | 112 | description=desc, |
|
113 | 113 | ) |
|
114 | 114 | |
|
115 | 115 | entries=[] |
|
116 | 116 | for entry in journal[:feed_nr]: |
|
117 | 117 | user = entry.user |
|
118 | 118 | if user is None: |
|
119 | 119 | # fix deleted users |
|
120 | 120 | user = AttributeDict({'short_contact': entry.username, |
|
121 | 121 | 'email': '', |
|
122 | 122 | 'full_contact': ''}) |
|
123 | 123 | action, action_extra, ico = h.action_parser(entry, feed=True) |
|
124 | 124 | title = "%s - %s %s" % (user.short_contact, action(), |
|
125 | 125 | entry.repository.repo_name) |
|
126 | 126 | _url = None |
|
127 | 127 | if entry.repository is not None: |
|
128 | 128 | _url = webutils.canonical_url('changelog_home', |
|
129 | 129 | repo_name=entry.repository.repo_name) |
|
130 | 130 | |
|
131 | 131 | entries.append(dict( |
|
132 | 132 | title=title, |
|
133 | 133 | pubdate=entry.action_date, |
|
134 | 134 | link=_url or webutils.canonical_url(''), |
|
135 | 135 | author_email=user.email, |
|
136 | 136 | author_name=user.full_name_or_username, |
|
137 | 137 | description=action_extra(), |
|
138 | 138 | )) |
|
139 | 139 | |
|
140 | 140 | return feeder.render(header, entries) |
|
141 | 141 | |
|
142 | 142 | def _atom_feed(self, repos, public=True): |
|
143 | 143 | if public: |
|
144 | 144 | link = webutils.canonical_url('public_journal_atom') |
|
145 | 145 | desc = '%s %s %s' % (c.site_name, _('Public Journal'), |
|
146 | 146 | 'atom feed') |
|
147 | 147 | else: |
|
148 | 148 | link = webutils.canonical_url('journal_atom') |
|
149 | 149 | desc = '%s %s %s' % (c.site_name, _('Journal'), 'atom feed') |
|
150 | 150 | |
|
151 | 151 | return self._feed(repos, feeds.AtomFeed, link, desc) |
|
152 | 152 | |
|
153 | 153 | def _rss_feed(self, repos, public=True): |
|
154 | 154 | if public: |
|
155 | 155 | link = webutils.canonical_url('public_journal_atom') |
|
156 | 156 | desc = '%s %s %s' % (c.site_name, _('Public Journal'), |
|
157 | 157 | 'rss feed') |
|
158 | 158 | else: |
|
159 | 159 | link = webutils.canonical_url('journal_atom') |
|
160 | 160 | desc = '%s %s %s' % (c.site_name, _('Journal'), 'rss feed') |
|
161 | 161 | |
|
162 | 162 | return self._feed(repos, feeds.RssFeed, link, desc) |
|
163 | 163 | |
|
164 | 164 | @LoginRequired() |
|
165 | 165 | def index(self): |
|
166 | 166 | # Return a rendered template |
|
167 | 167 | p = safe_int(request.GET.get('page'), 1) |
|
168 | 168 | c.user = db.User.get(request.authuser.user_id) |
|
169 | 169 | c.following = db.UserFollowing.query() \ |
|
170 | 170 | .filter(db.UserFollowing.user_id == request.authuser.user_id) \ |
|
171 | 171 | .options(joinedload(db.UserFollowing.follows_repository)) \ |
|
172 | 172 | .all() |
|
173 | 173 | |
|
174 | 174 | journal = self._get_journal_data(c.following) |
|
175 | 175 | |
|
176 | 176 | c.journal_pager = Page(journal, page=p, items_per_page=20, |
|
177 | 177 | filter=c.search_term) |
|
178 | 178 | c.journal_day_aggregate = self._get_daily_aggregate(c.journal_pager) |
|
179 | 179 | |
|
180 | 180 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
181 | return render('journal/journal_data.html') | |
|
181 | return base.render('journal/journal_data.html') | |
|
182 | 182 | |
|
183 | 183 | repos_list = db.Repository.query(sorted=True) \ |
|
184 | 184 | .filter_by(owner_id=request.authuser.user_id).all() |
|
185 | 185 | |
|
186 | 186 | repos_data = RepoModel().get_repos_as_dict(repos_list, admin=True) |
|
187 | 187 | # data used to render the grid |
|
188 | 188 | c.data = repos_data |
|
189 | 189 | |
|
190 | return render('journal/journal.html') | |
|
190 | return base.render('journal/journal.html') | |
|
191 | 191 | |
|
192 | 192 | @LoginRequired() |
|
193 | 193 | def journal_atom(self): |
|
194 | 194 | """Produce a simple atom-1.0 feed""" |
|
195 | 195 | following = db.UserFollowing.query() \ |
|
196 | 196 | .filter(db.UserFollowing.user_id == request.authuser.user_id) \ |
|
197 | 197 | .options(joinedload(db.UserFollowing.follows_repository)) \ |
|
198 | 198 | .all() |
|
199 | 199 | return self._atom_feed(following, public=False) |
|
200 | 200 | |
|
201 | 201 | @LoginRequired() |
|
202 | 202 | def journal_rss(self): |
|
203 | 203 | """Produce a simple rss2 feed""" |
|
204 | 204 | following = db.UserFollowing.query() \ |
|
205 | 205 | .filter(db.UserFollowing.user_id == request.authuser.user_id) \ |
|
206 | 206 | .options(joinedload(db.UserFollowing.follows_repository)) \ |
|
207 | 207 | .all() |
|
208 | 208 | return self._rss_feed(following, public=False) |
|
209 | 209 | |
|
210 | 210 | @LoginRequired() |
|
211 | 211 | def toggle_following(self): |
|
212 | 212 | user_id = request.POST.get('follows_user_id') |
|
213 | 213 | if user_id: |
|
214 | 214 | try: |
|
215 | 215 | self.scm_model.toggle_following_user(user_id, |
|
216 | 216 | request.authuser.user_id) |
|
217 | 217 | meta.Session().commit() |
|
218 | 218 | return 'ok' |
|
219 | 219 | except Exception: |
|
220 | 220 | log.error(traceback.format_exc()) |
|
221 | 221 | raise HTTPBadRequest() |
|
222 | 222 | |
|
223 | 223 | repo_id = request.POST.get('follows_repository_id') |
|
224 | 224 | if repo_id: |
|
225 | 225 | try: |
|
226 | 226 | self.scm_model.toggle_following_repo(repo_id, |
|
227 | 227 | request.authuser.user_id) |
|
228 | 228 | meta.Session().commit() |
|
229 | 229 | return 'ok' |
|
230 | 230 | except Exception: |
|
231 | 231 | log.error(traceback.format_exc()) |
|
232 | 232 | raise HTTPBadRequest() |
|
233 | 233 | |
|
234 | 234 | raise HTTPBadRequest() |
|
235 | 235 | |
|
236 | 236 | @LoginRequired(allow_default_user=True) |
|
237 | 237 | def public_journal(self): |
|
238 | 238 | # Return a rendered template |
|
239 | 239 | p = safe_int(request.GET.get('page'), 1) |
|
240 | 240 | |
|
241 | 241 | c.following = db.UserFollowing.query() \ |
|
242 | 242 | .filter(db.UserFollowing.user_id == request.authuser.user_id) \ |
|
243 | 243 | .options(joinedload(db.UserFollowing.follows_repository)) \ |
|
244 | 244 | .all() |
|
245 | 245 | |
|
246 | 246 | journal = self._get_journal_data(c.following) |
|
247 | 247 | |
|
248 | 248 | c.journal_pager = Page(journal, page=p, items_per_page=20) |
|
249 | 249 | |
|
250 | 250 | c.journal_day_aggregate = self._get_daily_aggregate(c.journal_pager) |
|
251 | 251 | |
|
252 | 252 | if request.environ.get('HTTP_X_PARTIAL_XHR'): |
|
253 | return render('journal/journal_data.html') | |
|
253 | return base.render('journal/journal_data.html') | |
|
254 | 254 | |
|
255 | return render('journal/public_journal.html') | |
|
255 | return base.render('journal/public_journal.html') | |
|
256 | 256 | |
|
257 | 257 | @LoginRequired(allow_default_user=True) |
|
258 | 258 | def public_journal_atom(self): |
|
259 | 259 | """Produce a simple atom-1.0 feed""" |
|
260 | 260 | c.following = db.UserFollowing.query() \ |
|
261 | 261 | .filter(db.UserFollowing.user_id == request.authuser.user_id) \ |
|
262 | 262 | .options(joinedload(db.UserFollowing.follows_repository)) \ |
|
263 | 263 | .all() |
|
264 | 264 | |
|
265 | 265 | return self._atom_feed(c.following) |
|
266 | 266 | |
|
267 | 267 | @LoginRequired(allow_default_user=True) |
|
268 | 268 | def public_journal_rss(self): |
|
269 | 269 | """Produce a simple rss2 feed""" |
|
270 | 270 | c.following = db.UserFollowing.query() \ |
|
271 | 271 | .filter(db.UserFollowing.user_id == request.authuser.user_id) \ |
|
272 | 272 | .options(joinedload(db.UserFollowing.follows_repository)) \ |
|
273 | 273 | .all() |
|
274 | 274 | |
|
275 | 275 | return self._rss_feed(c.following) |
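
Every hunk in these controllers follows the same import change: names previously imported out of ``kallithea.lib.base`` are now reached through a single module import of ``kallithea.controllers.base``. A minimal before/after sketch of the pattern, with the controller and template names invented::

    # Before: individual names imported out of the module.
    #     from kallithea.lib.base import BaseController, render
    #
    #     class ExampleController(BaseController):
    #         def index(self):
    #             return render('/example.html')

    # After: import the module once and qualify each use, so it stays obvious
    # where BaseController, render, jsonify etc. come from.
    from kallithea.controllers import base


    class ExampleController(base.BaseController):
        def index(self):
            return base.render('/example.html')
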
@@ -1,256 +1,256 @@ kallithea/controllers/login.py
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.login |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Login controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 22, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | |
|
29 | 29 | import logging |
|
30 | 30 | import re |
|
31 | 31 | |
|
32 | 32 | import formencode |
|
33 | 33 | from formencode import htmlfill |
|
34 | 34 | from tg import request, session |
|
35 | 35 | from tg import tmpl_context as c |
|
36 | 36 | from tg.i18n import ugettext as _ |
|
37 | 37 | from webob.exc import HTTPBadRequest, HTTPFound |
|
38 | 38 | |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.lib import webutils |
|
40 | 41 | from kallithea.lib.auth import AuthUser, HasPermissionAnyDecorator |
|
41 | from kallithea.lib.base import BaseController, log_in_user, render | |
|
42 | 42 | from kallithea.lib.exceptions import UserCreationError |
|
43 | 43 | from kallithea.lib.recaptcha import submit |
|
44 | 44 | from kallithea.lib.webutils import url |
|
45 | 45 | from kallithea.model import db, meta |
|
46 | 46 | from kallithea.model.forms import LoginForm, PasswordResetConfirmationForm, PasswordResetRequestForm, RegisterForm |
|
47 | 47 | from kallithea.model.user import UserModel |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | log = logging.getLogger(__name__) |
|
51 | 51 | |
|
52 | 52 | |
|
53 | class LoginController(BaseController): | |
|
53 | class LoginController(base.BaseController): | |
|
54 | 54 | |
|
55 | 55 | def _validate_came_from(self, came_from, |
|
56 | 56 | _re=re.compile(r"/(?!/)[-!#$%&'()*+,./:;=?@_~0-9A-Za-z]*$")): |
|
57 | 57 | """Return True if came_from is valid and can and should be used. |
|
58 | 58 | |
|
59 | 59 | Determines if a URI reference is valid and relative to the origin; |
|
60 | 60 | or in RFC 3986 terms, whether it matches this production: |
|
61 | 61 | |
|
62 | 62 | origin-relative-ref = path-absolute [ "?" query ] [ "#" fragment ] |
|
63 | 63 | |
|
64 | 64 | with the exception that '%' escapes are not validated and '#' is |
|
65 | 65 | allowed inside the fragment part. |
|
66 | 66 | """ |
|
67 | 67 | return _re.match(came_from) is not None |
|
68 | 68 | |
|
69 | 69 | def index(self): |
|
70 | 70 | c.came_from = request.GET.get('came_from', '') |
|
71 | 71 | if c.came_from: |
|
72 | 72 | if not self._validate_came_from(c.came_from): |
|
73 | 73 | log.error('Invalid came_from (not server-relative): %r', c.came_from) |
|
74 | 74 | raise HTTPBadRequest() |
|
75 | 75 | else: |
|
76 | 76 | c.came_from = url('home') |
|
77 | 77 | |
|
78 | 78 | if request.POST: |
|
79 | 79 | # import Login Form validator class |
|
80 | 80 | login_form = LoginForm()() |
|
81 | 81 | try: |
|
82 | 82 | # login_form will check username/password using ValidAuth and report failure to the user |
|
83 | 83 | c.form_result = login_form.to_python(dict(request.POST)) |
|
84 | 84 | username = c.form_result['username'] |
|
85 | 85 | user = db.User.get_by_username_or_email(username) |
|
86 | 86 | assert user is not None # the same user just passed the form validation |
|
87 | 87 | except formencode.Invalid as errors: |
|
88 | 88 | defaults = errors.value |
|
89 | 89 | # remove password from filling in form again |
|
90 | 90 | defaults.pop('password', None) |
|
91 | 91 | return htmlfill.render( |
|
92 | render('/login.html'), | |
|
92 | base.render('/login.html'), | |
|
93 | 93 | defaults=errors.value, |
|
94 | 94 | errors=errors.error_dict or {}, |
|
95 | 95 | prefix_error=False, |
|
96 | 96 | encoding="UTF-8", |
|
97 | 97 | force_defaults=False) |
|
98 | 98 | except UserCreationError as e: |
|
99 | 99 | # container auth or other auth functions that create users on |
|
100 | 100 | # the fly can throw this exception signaling that there's issue |
|
101 | 101 | # with user creation, explanation should be provided in |
|
102 | 102 | # Exception itself |
|
103 | 103 | webutils.flash(e, 'error') |
|
104 | 104 | else: |
|
105 | 105 | # login_form already validated the password - now set the session cookie accordingly |
|
106 | auth_user = log_in_user(user, c.form_result['remember'], is_external_auth=False, ip_addr=request.ip_addr) | |
|
106 | auth_user = base.log_in_user(user, c.form_result['remember'], is_external_auth=False, ip_addr=request.ip_addr) | |
|
107 | 107 | if auth_user: |
|
108 | 108 | raise HTTPFound(location=c.came_from) |
|
109 | 109 | webutils.flash(_('Authentication failed.'), 'error') |
|
110 | 110 | else: |
|
111 | 111 | # redirect if already logged in |
|
112 | 112 | if not request.authuser.is_anonymous: |
|
113 | 113 | raise HTTPFound(location=c.came_from) |
|
114 | 114 | # continue to show login to default user |
|
115 | 115 | |
|
116 | return render('/login.html') | |
|
116 | return base.render('/login.html') | |
|
117 | 117 | |
|
118 | 118 | @HasPermissionAnyDecorator('hg.admin', 'hg.register.auto_activate', |
|
119 | 119 | 'hg.register.manual_activate') |
|
120 | 120 | def register(self): |
|
121 | 121 | def_user_perms = AuthUser(dbuser=db.User.get_default_user()).global_permissions |
|
122 | 122 | c.auto_active = 'hg.register.auto_activate' in def_user_perms |
|
123 | 123 | |
|
124 | 124 | settings = db.Setting.get_app_settings() |
|
125 | 125 | captcha_private_key = settings.get('captcha_private_key') |
|
126 | 126 | c.captcha_active = bool(captcha_private_key) |
|
127 | 127 | c.captcha_public_key = settings.get('captcha_public_key') |
|
128 | 128 | |
|
129 | 129 | if request.POST: |
|
130 | 130 | register_form = RegisterForm()() |
|
131 | 131 | try: |
|
132 | 132 | form_result = register_form.to_python(dict(request.POST)) |
|
133 | 133 | form_result['active'] = c.auto_active |
|
134 | 134 | |
|
135 | 135 | if c.captcha_active: |
|
136 | 136 | response = submit(request.POST.get('g-recaptcha-response'), |
|
137 | 137 | private_key=captcha_private_key, |
|
138 | 138 | remoteip=request.ip_addr) |
|
139 | 139 | if not response.is_valid: |
|
140 | 140 | _value = form_result |
|
141 | 141 | _msg = _('Bad captcha') |
|
142 | 142 | error_dict = {'recaptcha_field': _msg} |
|
143 | 143 | raise formencode.Invalid(_msg, _value, None, |
|
144 | 144 | error_dict=error_dict) |
|
145 | 145 | |
|
146 | 146 | UserModel().create_registration(form_result) |
|
147 | 147 | webutils.flash(_('You have successfully registered with %s') % (c.site_name or 'Kallithea'), |
|
148 | 148 | category='success') |
|
149 | 149 | meta.Session().commit() |
|
150 | 150 | raise HTTPFound(location=url('login_home')) |
|
151 | 151 | |
|
152 | 152 | except formencode.Invalid as errors: |
|
153 | 153 | return htmlfill.render( |
|
154 | render('/register.html'), | |
|
154 | base.render('/register.html'), | |
|
155 | 155 | defaults=errors.value, |
|
156 | 156 | errors=errors.error_dict or {}, |
|
157 | 157 | prefix_error=False, |
|
158 | 158 | encoding="UTF-8", |
|
159 | 159 | force_defaults=False) |
|
160 | 160 | except UserCreationError as e: |
|
161 | 161 | # container auth or other auth functions that create users on |
|
162 | 162 | # the fly can throw this exception signaling that there's issue |
|
163 | 163 | # with user creation, explanation should be provided in |
|
164 | 164 | # Exception itself |
|
165 | 165 | webutils.flash(e, 'error') |
|
166 | 166 | |
|
167 | return render('/register.html') | |
|
167 | return base.render('/register.html') | |
|
168 | 168 | |
|
169 | 169 | def password_reset(self): |
|
170 | 170 | settings = db.Setting.get_app_settings() |
|
171 | 171 | captcha_private_key = settings.get('captcha_private_key') |
|
172 | 172 | c.captcha_active = bool(captcha_private_key) |
|
173 | 173 | c.captcha_public_key = settings.get('captcha_public_key') |
|
174 | 174 | |
|
175 | 175 | if request.POST: |
|
176 | 176 | password_reset_form = PasswordResetRequestForm()() |
|
177 | 177 | try: |
|
178 | 178 | form_result = password_reset_form.to_python(dict(request.POST)) |
|
179 | 179 | if c.captcha_active: |
|
180 | 180 | response = submit(request.POST.get('g-recaptcha-response'), |
|
181 | 181 | private_key=captcha_private_key, |
|
182 | 182 | remoteip=request.ip_addr) |
|
183 | 183 | if not response.is_valid: |
|
184 | 184 | _value = form_result |
|
185 | 185 | _msg = _('Bad captcha') |
|
186 | 186 | error_dict = {'recaptcha_field': _msg} |
|
187 | 187 | raise formencode.Invalid(_msg, _value, None, |
|
188 | 188 | error_dict=error_dict) |
|
189 | 189 | redirect_link = UserModel().send_reset_password_email(form_result) |
|
190 | 190 | webutils.flash(_('A password reset confirmation code has been sent'), |
|
191 | 191 | category='success') |
|
192 | 192 | raise HTTPFound(location=redirect_link) |
|
193 | 193 | |
|
194 | 194 | except formencode.Invalid as errors: |
|
195 | 195 | return htmlfill.render( |
|
196 | render('/password_reset.html'), | |
|
196 | base.render('/password_reset.html'), | |
|
197 | 197 | defaults=errors.value, |
|
198 | 198 | errors=errors.error_dict or {}, |
|
199 | 199 | prefix_error=False, |
|
200 | 200 | encoding="UTF-8", |
|
201 | 201 | force_defaults=False) |
|
202 | 202 | |
|
203 | return render('/password_reset.html') | |
|
203 | return base.render('/password_reset.html') | |
|
204 | 204 | |
|
205 | 205 | def password_reset_confirmation(self): |
|
206 | 206 | # This controller handles both GET and POST requests, though we |
|
207 | 207 | # only ever perform the actual password change on POST (since |
|
208 | 208 | # GET requests are not allowed to have side effects, and do not |
|
209 | 209 | # receive automatic CSRF protection). |
|
210 | 210 | |
|
211 | 211 | # The template needs the email address outside of the form. |
|
212 | 212 | c.email = request.params.get('email') |
|
213 | 213 | c.timestamp = request.params.get('timestamp') or '' |
|
214 | 214 | c.token = request.params.get('token') or '' |
|
215 | 215 | if not request.POST: |
|
216 | return render('/password_reset_confirmation.html') | |
|
216 | return base.render('/password_reset_confirmation.html') | |
|
217 | 217 | |
|
218 | 218 | form = PasswordResetConfirmationForm()() |
|
219 | 219 | try: |
|
220 | 220 | form_result = form.to_python(dict(request.POST)) |
|
221 | 221 | except formencode.Invalid as errors: |
|
222 | 222 | return htmlfill.render( |
|
223 | render('/password_reset_confirmation.html'), | |
|
223 | base.render('/password_reset_confirmation.html'), | |
|
224 | 224 | defaults=errors.value, |
|
225 | 225 | errors=errors.error_dict or {}, |
|
226 | 226 | prefix_error=False, |
|
227 | 227 | encoding='UTF-8') |
|
228 | 228 | |
|
229 | 229 | if not UserModel().verify_reset_password_token( |
|
230 | 230 | form_result['email'], |
|
231 | 231 | form_result['timestamp'], |
|
232 | 232 | form_result['token'], |
|
233 | 233 | ): |
|
234 | 234 | return htmlfill.render( |
|
235 | render('/password_reset_confirmation.html'), | |
|
235 | base.render('/password_reset_confirmation.html'), | |
|
236 | 236 | defaults=form_result, |
|
237 | 237 | errors={'token': _('Invalid password reset token')}, |
|
238 | 238 | prefix_error=False, |
|
239 | 239 | encoding='UTF-8') |
|
240 | 240 | |
|
241 | 241 | UserModel().reset_password(form_result['email'], form_result['password']) |
|
242 | 242 | webutils.flash(_('Successfully updated password'), category='success') |
|
243 | 243 | raise HTTPFound(location=url('login_home')) |
|
244 | 244 | |
|
245 | 245 | def logout(self): |
|
246 | 246 | session.delete() |
|
247 | 247 | log.info('Logging out and deleting session for user') |
|
248 | 248 | raise HTTPFound(location=url('home')) |
|
249 | 249 | |
|
250 | 250 | def session_csrf_secret_token(self): |
|
251 | 251 | """Return the CSRF protection token for the session - just like it |
|
252 | 252 | could have been screen scraped from a page with a form. |
|
253 | 253 | Only intended for testing but might also be useful for other kinds |
|
254 | 254 | of automation. |
|
255 | 255 | """ |
|
256 | 256 | return webutils.session_csrf_secret_token() |
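
``_validate_came_from`` above only accepts server-relative references, which is what stops the login form from redirecting to another site. A small sketch of how that same regular expression behaves on a few invented inputs::

    import re

    # Same pattern as LoginController._validate_came_from.
    _re = re.compile(r"/(?!/)[-!#$%&'()*+,./:;=?@_~0-9A-Za-z]*$")

    # Accepted: absolute paths on this server (example paths invented).
    assert _re.match('/') is not None
    assert _re.match('/repo/changelog?page=2') is not None

    # Rejected: scheme-relative and absolute URLs that could leave the site.
    assert _re.match('//evil.example.com/') is None
    assert _re.match('https://evil.example.com/') is None
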
@@ -1,638 +1,638 @@ kallithea/controllers/pullrequests.py
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.pullrequests |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | pull requests controller for Kallithea for initializing pull requests |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: May 7, 2012 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | import formencode |
|
32 | 32 | import mercurial.unionrepo |
|
33 | 33 | from tg import request |
|
34 | 34 | from tg import tmpl_context as c |
|
35 | 35 | from tg.i18n import ugettext as _ |
|
36 | 36 | from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPFound, HTTPNotFound |
|
37 | 37 | |
|
38 | 38 | import kallithea.lib.helpers as h |
|
39 | from kallithea.controllers import base | |
|
39 | 40 | from kallithea.controllers.changeset import create_cs_pr_comment, delete_cs_pr_comment |
|
40 | 41 | from kallithea.lib import auth, diffs, webutils |
|
41 | 42 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
42 | from kallithea.lib.base import BaseRepoController, jsonify, render | |
|
43 | 43 | from kallithea.lib.graphmod import graph_data |
|
44 | 44 | from kallithea.lib.page import Page |
|
45 | 45 | from kallithea.lib.utils2 import ascii_bytes, safe_bytes, safe_int |
|
46 | 46 | from kallithea.lib.vcs.exceptions import ChangesetDoesNotExistError, EmptyRepositoryError |
|
47 | 47 | from kallithea.lib.webutils import url |
|
48 | 48 | from kallithea.model import db, meta |
|
49 | 49 | from kallithea.model.changeset_status import ChangesetStatusModel |
|
50 | 50 | from kallithea.model.comment import ChangesetCommentsModel |
|
51 | 51 | from kallithea.model.forms import PullRequestForm, PullRequestPostForm |
|
52 | 52 | from kallithea.model.pull_request import CreatePullRequestAction, CreatePullRequestIterationAction, PullRequestModel |
|
53 | 53 | |
|
54 | 54 | |
|
55 | 55 | log = logging.getLogger(__name__) |
|
56 | 56 | |
|
57 | 57 | |
|
58 | 58 | def _get_reviewer(user_id): |
|
59 | 59 | """Look up user by ID and validate it as a potential reviewer.""" |
|
60 | 60 | try: |
|
61 | 61 | user = db.User.get(int(user_id)) |
|
62 | 62 | except ValueError: |
|
63 | 63 | user = None |
|
64 | 64 | |
|
65 | 65 | if user is None or user.is_default_user: |
|
66 | 66 | webutils.flash(_('Invalid reviewer "%s" specified') % user_id, category='error') |
|
67 | 67 | raise HTTPBadRequest() |
|
68 | 68 | |
|
69 | 69 | return user |
|
70 | 70 | |
|
71 | 71 | |
|
72 | class PullrequestsController(BaseRepoController): | |
|
72 | class PullrequestsController(base.BaseRepoController): | |
|
73 | 73 | |
|
74 | 74 | def _get_repo_refs(self, repo, rev=None, branch=None, branch_rev=None): |
|
75 | 75 | """return a structure with scm repo's interesting changesets, suitable for |
|
76 | 76 | the selectors in pullrequest.html |
|
77 | 77 | |
|
78 | 78 | rev: a revision that must be in the list somehow and selected by default |
|
79 | 79 | branch: a branch that must be in the list and selected by default - even if closed |
|
80 | 80 | branch_rev: a revision of which peers should be preferred and available.""" |
|
81 | 81 | # list named branches that has been merged to this named branch - it should probably merge back |
|
82 | 82 | peers = [] |
|
83 | 83 | |
|
84 | 84 | if branch_rev: |
|
85 | 85 | # a revset not restricting to merge() would be better |
|
86 | 86 | # (especially because it would get the branch point) |
|
87 | 87 | # ... but is currently too expensive |
|
88 | 88 | # including branches of children could be nice too |
|
89 | 89 | peerbranches = set() |
|
90 | 90 | for i in repo._repo.revs( |
|
91 | 91 | b"sort(parents(branch(id(%s)) and merge()) - branch(id(%s)), -rev)", |
|
92 | 92 | ascii_bytes(branch_rev), ascii_bytes(branch_rev), |
|
93 | 93 | ): |
|
94 | 94 | for abranch in repo.get_changeset(i).branches: |
|
95 | 95 | if abranch not in peerbranches: |
|
96 | 96 | n = 'branch:%s:%s' % (abranch, repo.get_changeset(abranch).raw_id) |
|
97 | 97 | peers.append((n, abranch)) |
|
98 | 98 | peerbranches.add(abranch) |
|
99 | 99 | |
|
100 | 100 | selected = None |
|
101 | 101 | tiprev = repo.tags.get('tip') |
|
102 | 102 | tipbranch = None |
|
103 | 103 | |
|
104 | 104 | branches = [] |
|
105 | 105 | for abranch, branchrev in repo.branches.items(): |
|
106 | 106 | n = 'branch:%s:%s' % (abranch, branchrev) |
|
107 | 107 | desc = abranch |
|
108 | 108 | if branchrev == tiprev: |
|
109 | 109 | tipbranch = abranch |
|
110 | 110 | desc = '%s (current tip)' % desc |
|
111 | 111 | branches.append((n, desc)) |
|
112 | 112 | if rev == branchrev: |
|
113 | 113 | selected = n |
|
114 | 114 | if branch == abranch: |
|
115 | 115 | if not rev: |
|
116 | 116 | selected = n |
|
117 | 117 | branch = None |
|
118 | 118 | if branch: # branch not in list - it is probably closed |
|
119 | 119 | branchrev = repo.closed_branches.get(branch) |
|
120 | 120 | if branchrev: |
|
121 | 121 | n = 'branch:%s:%s' % (branch, branchrev) |
|
122 | 122 | branches.append((n, _('%s (closed)') % branch)) |
|
123 | 123 | selected = n |
|
124 | 124 | branch = None |
|
125 | 125 | if branch: |
|
126 | 126 | log.debug('branch %r not found in %s', branch, repo) |
|
127 | 127 | |
|
128 | 128 | bookmarks = [] |
|
129 | 129 | for bookmark, bookmarkrev in repo.bookmarks.items(): |
|
130 | 130 | n = 'book:%s:%s' % (bookmark, bookmarkrev) |
|
131 | 131 | bookmarks.append((n, bookmark)) |
|
132 | 132 | if rev == bookmarkrev: |
|
133 | 133 | selected = n |
|
134 | 134 | |
|
135 | 135 | tags = [] |
|
136 | 136 | for tag, tagrev in repo.tags.items(): |
|
137 | 137 | if tag == 'tip': |
|
138 | 138 | continue |
|
139 | 139 | n = 'tag:%s:%s' % (tag, tagrev) |
|
140 | 140 | tags.append((n, tag)) |
|
141 | 141 | # note: even if rev == tagrev, don't select the static tag - it must be chosen explicitly |
|
142 | 142 | |
|
143 | 143 | # prio 1: rev was selected as existing entry above |
|
144 | 144 | |
|
145 | 145 | # prio 2: create special entry for rev; rev _must_ be used |
|
146 | 146 | specials = [] |
|
147 | 147 | if rev and selected is None: |
|
148 | 148 | selected = 'rev:%s:%s' % (rev, rev) |
|
149 | 149 | specials = [(selected, '%s: %s' % (_("Changeset"), rev[:12]))] |
|
150 | 150 | |
|
151 | 151 | # prio 3: most recent peer branch |
|
152 | 152 | if peers and not selected: |
|
153 | 153 | selected = peers[0][0] |
|
154 | 154 | |
|
155 | 155 | # prio 4: tip revision |
|
156 | 156 | if not selected: |
|
157 | 157 | if repo.alias == 'hg': |
|
158 | 158 | if tipbranch: |
|
159 | 159 | selected = 'branch:%s:%s' % (tipbranch, tiprev) |
|
160 | 160 | else: |
|
161 | 161 | selected = 'tag:null:' + repo.EMPTY_CHANGESET |
|
162 | 162 | tags.append((selected, 'null')) |
|
163 | 163 | else: # Git |
|
164 | 164 | assert repo.alias == 'git' |
|
165 | 165 | if not repo.branches: |
|
166 | 166 | selected = '' # doesn't make sense, but better than nothing |
|
167 | 167 | elif 'master' in repo.branches: |
|
168 | 168 | selected = 'branch:master:%s' % repo.branches['master'] |
|
169 | 169 | else: |
|
170 | 170 | k, v = list(repo.branches.items())[0] |
|
171 | 171 | selected = 'branch:%s:%s' % (k, v) |
|
172 | 172 | |
|
173 | 173 | groups = [(specials, _("Special")), |
|
174 | 174 | (peers, _("Peer branches")), |
|
175 | 175 | (bookmarks, _("Bookmarks")), |
|
176 | 176 | (branches, _("Branches")), |
|
177 | 177 | (tags, _("Tags")), |
|
178 | 178 | ] |
|
179 | 179 | return [g for g in groups if g[0]], selected |
|
180 | 180 | |
|
181 | 181 | def _is_allowed_to_change_status(self, pull_request): |
|
182 | 182 | if pull_request.is_closed(): |
|
183 | 183 | return False |
|
184 | 184 | |
|
185 | 185 | owner = request.authuser.user_id == pull_request.owner_id |
|
186 | 186 | reviewer = db.PullRequestReviewer.query() \ |
|
187 | 187 | .filter(db.PullRequestReviewer.pull_request == pull_request) \ |
|
188 | 188 | .filter(db.PullRequestReviewer.user_id == request.authuser.user_id) \ |
|
189 | 189 | .count() != 0 |
|
190 | 190 | |
|
191 | 191 | return request.authuser.admin or owner or reviewer |
|
192 | 192 | |
|
193 | 193 | @LoginRequired(allow_default_user=True) |
|
194 | 194 | @HasRepoPermissionLevelDecorator('read') |
|
195 | 195 | def show_all(self, repo_name): |
|
196 | 196 | c.from_ = request.GET.get('from_') or '' |
|
197 | 197 | c.closed = request.GET.get('closed') or '' |
|
198 | 198 | url_params = {} |
|
199 | 199 | if c.from_: |
|
200 | 200 | url_params['from_'] = 1 |
|
201 | 201 | if c.closed: |
|
202 | 202 | url_params['closed'] = 1 |
|
203 | 203 | p = safe_int(request.GET.get('page'), 1) |
|
204 | 204 | |
|
205 | 205 | q = db.PullRequest.query(include_closed=c.closed, sorted=True) |
|
206 | 206 | if c.from_: |
|
207 | 207 | q = q.filter_by(org_repo=c.db_repo) |
|
208 | 208 | else: |
|
209 | 209 | q = q.filter_by(other_repo=c.db_repo) |
|
210 | 210 | c.pull_requests = q.all() |
|
211 | 211 | |
|
212 | 212 | c.pullrequests_pager = Page(c.pull_requests, page=p, items_per_page=100, **url_params) |
|
213 | 213 | |
|
214 | return render('/pullrequests/pullrequest_show_all.html') | |
|
214 | return base.render('/pullrequests/pullrequest_show_all.html') | |
|
215 | 215 | |
|
216 | 216 | @LoginRequired() |
|
217 | 217 | def show_my(self): |
|
218 | 218 | c.closed = request.GET.get('closed') or '' |
|
219 | 219 | |
|
220 | 220 | c.my_pull_requests = db.PullRequest.query( |
|
221 | 221 | include_closed=c.closed, |
|
222 | 222 | sorted=True, |
|
223 | 223 | ).filter_by(owner_id=request.authuser.user_id).all() |
|
224 | 224 | |
|
225 | 225 | c.participate_in_pull_requests = [] |
|
226 | 226 | c.participate_in_pull_requests_todo = [] |
|
227 | 227 | done_status = set([db.ChangesetStatus.STATUS_APPROVED, db.ChangesetStatus.STATUS_REJECTED]) |
|
228 | 228 | for pr in db.PullRequest.query( |
|
229 | 229 | include_closed=c.closed, |
|
230 | 230 | reviewer_id=request.authuser.user_id, |
|
231 | 231 | sorted=True, |
|
232 | 232 | ): |
|
233 | 233 | status = pr.user_review_status(request.authuser.user_id) # very inefficient!!! |
|
234 | 234 | if status in done_status: |
|
235 | 235 | c.participate_in_pull_requests.append(pr) |
|
236 | 236 | else: |
|
237 | 237 | c.participate_in_pull_requests_todo.append(pr) |
|
238 | 238 | |
|
239 | return render('/pullrequests/pullrequest_show_my.html') | |
|
239 | return base.render('/pullrequests/pullrequest_show_my.html') | |
|
240 | 240 | |
|
241 | 241 | @LoginRequired() |
|
242 | 242 | @HasRepoPermissionLevelDecorator('read') |
|
243 | 243 | def index(self): |
|
244 | 244 | org_repo = c.db_repo |
|
245 | 245 | org_scm_instance = org_repo.scm_instance |
|
246 | 246 | try: |
|
247 | 247 | org_scm_instance.get_changeset() |
|
248 | 248 | except EmptyRepositoryError as e: |
|
249 | 249 | webutils.flash(_('There are no changesets yet'), |
|
250 | 250 | category='warning') |
|
251 | 251 | raise HTTPFound(location=url('summary_home', repo_name=org_repo.repo_name)) |
|
252 | 252 | |
|
253 | 253 | org_rev = request.GET.get('rev_end') |
|
254 | 254 | # rev_start is not directly useful - its parent could however be used |
|
255 | 255 | # as default for other and thus give a simple compare view |
|
256 | 256 | rev_start = request.GET.get('rev_start') |
|
257 | 257 | other_rev = None |
|
258 | 258 | if rev_start: |
|
259 | 259 | starters = org_repo.get_changeset(rev_start).parents |
|
260 | 260 | if starters: |
|
261 | 261 | other_rev = starters[0].raw_id |
|
262 | 262 | else: |
|
263 | 263 | other_rev = org_repo.scm_instance.EMPTY_CHANGESET |
|
264 | 264 | branch = request.GET.get('branch') |
|
265 | 265 | |
|
266 | 266 | c.cs_repos = [(org_repo.repo_name, org_repo.repo_name)] |
|
267 | 267 | c.default_cs_repo = org_repo.repo_name |
|
268 | 268 | c.cs_refs, c.default_cs_ref = self._get_repo_refs(org_scm_instance, rev=org_rev, branch=branch) |
|
269 | 269 | |
|
270 | 270 | default_cs_ref_type, default_cs_branch, default_cs_rev = c.default_cs_ref.split(':') |
|
271 | 271 | if default_cs_ref_type != 'branch': |
|
272 | 272 | default_cs_branch = org_repo.get_changeset(default_cs_rev).branch |
|
273 | 273 | |
|
274 | 274 | # add org repo to other so we can open pull request against peer branches on itself |
|
275 | 275 | c.a_repos = [(org_repo.repo_name, '%s (self)' % org_repo.repo_name)] |
|
276 | 276 | |
|
277 | 277 | if org_repo.parent: |
|
278 | 278 | # add parent of this fork also and select it. |
|
279 | 279 | # use the same branch on destination as on source, if available. |
|
280 | 280 | c.a_repos.append((org_repo.parent.repo_name, '%s (parent)' % org_repo.parent.repo_name)) |
|
281 | 281 | c.a_repo = org_repo.parent |
|
282 | 282 | c.a_refs, c.default_a_ref = self._get_repo_refs( |
|
283 | 283 | org_repo.parent.scm_instance, branch=default_cs_branch, rev=other_rev) |
|
284 | 284 | |
|
285 | 285 | else: |
|
286 | 286 | c.a_repo = org_repo |
|
287 | 287 | c.a_refs, c.default_a_ref = self._get_repo_refs(org_scm_instance, rev=other_rev) |
|
288 | 288 | |
|
289 | 289 | # gather forks and add to this list ... even though it is rare to |
|
290 | 290 | # request forks to pull from their parent |
|
291 | 291 | for fork in org_repo.forks: |
|
292 | 292 | c.a_repos.append((fork.repo_name, fork.repo_name)) |
|
293 | 293 | |
|
294 | return render('/pullrequests/pullrequest.html') | |
|
294 | return base.render('/pullrequests/pullrequest.html') | |
|
295 | 295 | |
|
296 | 296 | @LoginRequired() |
|
297 | 297 | @HasRepoPermissionLevelDecorator('read') |
|
298 | @jsonify | |
|
298 | @base.jsonify | |
|
299 | 299 | def repo_info(self, repo_name): |
|
300 | 300 | repo = c.db_repo |
|
301 | 301 | refs, selected_ref = self._get_repo_refs(repo.scm_instance) |
|
302 | 302 | return { |
|
303 | 303 | 'description': repo.description.split('\n', 1)[0], |
|
304 | 304 | 'selected_ref': selected_ref, |
|
305 | 305 | 'refs': refs, |
|
306 | 306 | } |
|
307 | 307 | |
|
308 | 308 | @LoginRequired() |
|
309 | 309 | @HasRepoPermissionLevelDecorator('read') |
|
310 | 310 | def create(self, repo_name): |
|
311 | 311 | repo = c.db_repo |
|
312 | 312 | try: |
|
313 | 313 | _form = PullRequestForm(repo.repo_id)().to_python(request.POST) |
|
314 | 314 | except formencode.Invalid as errors: |
|
315 | 315 | log.error(traceback.format_exc()) |
|
316 | 316 | log.error(str(errors)) |
|
317 | 317 | msg = _('Error creating pull request: %s') % errors.msg |
|
318 | 318 | webutils.flash(msg, 'error') |
|
319 | 319 | raise HTTPBadRequest |
|
320 | 320 | |
|
321 | 321 | # heads up: org and other might seem backward here ... |
|
322 | 322 | org_ref = _form['org_ref'] # will have merge_rev as rev but symbolic name |
|
323 | 323 | org_repo = db.Repository.guess_instance(_form['org_repo']) |
|
324 | 324 | |
|
325 | 325 | other_ref = _form['other_ref'] # will have symbolic name and head revision |
|
326 | 326 | other_repo = db.Repository.guess_instance(_form['other_repo']) |
|
327 | 327 | |
|
328 | 328 | reviewers = [] |
|
329 | 329 | |
|
330 | 330 | title = _form['pullrequest_title'] |
|
331 | 331 | description = _form['pullrequest_desc'].strip() |
|
332 | 332 | owner = db.User.get(request.authuser.user_id) |
|
333 | 333 | |
|
334 | 334 | try: |
|
335 | 335 | cmd = CreatePullRequestAction(org_repo, other_repo, org_ref, other_ref, title, description, owner, reviewers) |
|
336 | 336 | except CreatePullRequestAction.ValidationError as e: |
|
337 | 337 | webutils.flash(e, category='error', logf=log.error) |
|
338 | 338 | raise HTTPNotFound |
|
339 | 339 | |
|
340 | 340 | try: |
|
341 | 341 | pull_request = cmd.execute() |
|
342 | 342 | meta.Session().commit() |
|
343 | 343 | except Exception: |
|
344 | 344 | webutils.flash(_('Error occurred while creating pull request'), |
|
345 | 345 | category='error') |
|
346 | 346 | log.error(traceback.format_exc()) |
|
347 | 347 | raise HTTPFound(location=url('pullrequest_home', repo_name=repo_name)) |
|
348 | 348 | |
|
349 | 349 | webutils.flash(_('Successfully opened new pull request'), |
|
350 | 350 | category='success') |
|
351 | 351 | raise HTTPFound(location=pull_request.url()) |
|
352 | 352 | |
|
353 | 353 | def create_new_iteration(self, old_pull_request, new_rev, title, description, reviewers): |
|
354 | 354 | owner = db.User.get(request.authuser.user_id) |
|
355 | 355 | new_org_rev = self._get_ref_rev(old_pull_request.org_repo, 'rev', new_rev) |
|
356 | 356 | new_other_rev = self._get_ref_rev(old_pull_request.other_repo, old_pull_request.other_ref_parts[0], old_pull_request.other_ref_parts[1]) |
|
357 | 357 | try: |
|
358 | 358 | cmd = CreatePullRequestIterationAction(old_pull_request, new_org_rev, new_other_rev, title, description, owner, reviewers) |
|
359 | 359 | except CreatePullRequestAction.ValidationError as e: |
|
360 | 360 | webutils.flash(e, category='error', logf=log.error) |
|
361 | 361 | raise HTTPNotFound |
|
362 | 362 | |
|
363 | 363 | try: |
|
364 | 364 | pull_request = cmd.execute() |
|
365 | 365 | meta.Session().commit() |
|
366 | 366 | except Exception: |
|
367 | 367 | webutils.flash(_('Error occurred while creating pull request'), |
|
368 | 368 | category='error') |
|
369 | 369 | log.error(traceback.format_exc()) |
|
370 | 370 | raise HTTPFound(location=old_pull_request.url()) |
|
371 | 371 | |
|
372 | 372 | webutils.flash(_('New pull request iteration created'), |
|
373 | 373 | category='success') |
|
374 | 374 | raise HTTPFound(location=pull_request.url()) |
|
375 | 375 | |
|
376 | 376 | # pullrequest_post for PR editing |
|
377 | 377 | @LoginRequired() |
|
378 | 378 | @HasRepoPermissionLevelDecorator('read') |
|
379 | 379 | def post(self, repo_name, pull_request_id): |
|
380 | 380 | pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
381 | 381 | if pull_request.is_closed(): |
|
382 | 382 | raise HTTPForbidden() |
|
383 | 383 | assert pull_request.other_repo.repo_name == repo_name |
|
384 | 384 | # only owner or admin can update it |
|
385 | 385 | owner = pull_request.owner_id == request.authuser.user_id |
|
386 | 386 | repo_admin = auth.HasRepoPermissionLevel('admin')(c.repo_name) |
|
387 | 387 | if not (auth.HasPermissionAny('hg.admin')() or repo_admin or owner): |
|
388 | 388 | raise HTTPForbidden() |
|
389 | 389 | |
|
390 | 390 | _form = PullRequestPostForm()().to_python(request.POST) |
|
391 | 391 | |
|
392 | 392 | cur_reviewers = set(pull_request.get_reviewer_users()) |
|
393 | 393 | new_reviewers = set(_get_reviewer(s) for s in _form['review_members']) |
|
394 | 394 | old_reviewers = set(_get_reviewer(s) for s in _form['org_review_members']) |
|
395 | 395 | |
|
396 | 396 | other_added = cur_reviewers - old_reviewers |
|
397 | 397 | other_removed = old_reviewers - cur_reviewers |
|
398 | 398 | |
|
399 | 399 | if other_added: |
|
400 | 400 | webutils.flash(_('Meanwhile, the following reviewers have been added: %s') % |
|
401 | 401 | (', '.join(u.username for u in other_added)), |
|
402 | 402 | category='warning') |
|
403 | 403 | if other_removed: |
|
404 | 404 | webutils.flash(_('Meanwhile, the following reviewers have been removed: %s') % |
|
405 | 405 | (', '.join(u.username for u in other_removed)), |
|
406 | 406 | category='warning') |
|
407 | 407 | |
|
408 | 408 | if _form['updaterev']: |
|
409 | 409 | return self.create_new_iteration(pull_request, |
|
410 | 410 | _form['updaterev'], |
|
411 | 411 | _form['pullrequest_title'], |
|
412 | 412 | _form['pullrequest_desc'], |
|
413 | 413 | new_reviewers) |
|
414 | 414 | |
|
415 | 415 | added_reviewers = new_reviewers - old_reviewers - cur_reviewers |
|
416 | 416 | removed_reviewers = (old_reviewers - new_reviewers) & cur_reviewers |
|
417 | 417 | |
|
418 | 418 | old_description = pull_request.description |
|
419 | 419 | pull_request.title = _form['pullrequest_title'] |
|
420 | 420 | pull_request.description = _form['pullrequest_desc'].strip() or _('No description') |
|
421 | 421 | pull_request.owner = db.User.get_by_username(_form['owner']) |
|
422 | 422 | user = db.User.get(request.authuser.user_id) |
|
423 | 423 | |
|
424 | 424 | PullRequestModel().mention_from_description(user, pull_request, old_description) |
|
425 | 425 | PullRequestModel().add_reviewers(user, pull_request, added_reviewers) |
|
426 | 426 | PullRequestModel().remove_reviewers(user, pull_request, removed_reviewers) |
|
427 | 427 | |
|
428 | 428 | meta.Session().commit() |
|
429 | 429 | webutils.flash(_('Pull request updated'), category='success') |
|
430 | 430 | |
|
431 | 431 | raise HTTPFound(location=pull_request.url()) |
|
432 | 432 | |
|
433 | 433 | @LoginRequired() |
|
434 | 434 | @HasRepoPermissionLevelDecorator('read') |
|
435 | @jsonify | |
|
435 | @base.jsonify | |
|
436 | 436 | def delete(self, repo_name, pull_request_id): |
|
437 | 437 | pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
438 | 438 | # only owner can delete it ! |
|
439 | 439 | if pull_request.owner_id == request.authuser.user_id: |
|
440 | 440 | PullRequestModel().delete(pull_request) |
|
441 | 441 | meta.Session().commit() |
|
442 | 442 | webutils.flash(_('Successfully deleted pull request'), |
|
443 | 443 | category='success') |
|
444 | 444 | raise HTTPFound(location=url('my_pullrequests')) |
|
445 | 445 | raise HTTPForbidden() |
|
446 | 446 | |
|
447 | 447 | @LoginRequired(allow_default_user=True) |
|
448 | 448 | @HasRepoPermissionLevelDecorator('read') |
|
449 | 449 | def show(self, repo_name, pull_request_id, extra=None): |
|
450 | 450 | c.pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
451 | 451 | c.allowed_to_change_status = self._is_allowed_to_change_status(c.pull_request) |
|
452 | 452 | cc_model = ChangesetCommentsModel() |
|
453 | 453 | cs_model = ChangesetStatusModel() |
|
454 | 454 | |
|
455 | 455 | # pull_requests repo_name we opened it against |
|
456 | 456 | # ie. other_repo must match |
|
457 | 457 | if repo_name != c.pull_request.other_repo.repo_name: |
|
458 | 458 | raise HTTPNotFound |
|
459 | 459 | |
|
460 | 460 | # load compare data into template context |
|
461 | 461 | c.cs_repo = c.pull_request.org_repo |
|
462 | 462 | (c.cs_ref_type, |
|
463 | 463 | c.cs_ref_name, |
|
464 | 464 | c.cs_rev) = c.pull_request.org_ref.split(':') |
|
465 | 465 | |
|
466 | 466 | c.a_repo = c.pull_request.other_repo |
|
467 | 467 | (c.a_ref_type, |
|
468 | 468 | c.a_ref_name, |
|
469 | 469 | c.a_rev) = c.pull_request.other_ref.split(':') # a_rev is ancestor |
|
470 | 470 | |
|
471 | 471 | org_scm_instance = c.cs_repo.scm_instance # property with expensive cache invalidation check!!! |
|
472 | 472 | c.cs_ranges = [] |
|
473 | 473 | for x in c.pull_request.revisions: |
|
474 | 474 | try: |
|
475 | 475 | c.cs_ranges.append(org_scm_instance.get_changeset(x)) |
|
476 | 476 | except ChangesetDoesNotExistError: |
|
477 | 477 | c.cs_ranges = [] |
|
478 | 478 | webutils.flash(_('Revision %s not found in %s') % (x, c.cs_repo.repo_name), |
|
479 | 479 | 'error') |
|
480 | 480 | break |
|
481 | 481 | c.cs_ranges_org = None # not stored and not important and moving target - could be calculated ... |
|
482 | 482 | revs = [ctx.revision for ctx in reversed(c.cs_ranges)] |
|
483 | 483 | c.jsdata = graph_data(org_scm_instance, revs) |
|
484 | 484 | |
|
485 | 485 | c.is_range = False |
|
486 | 486 | try: |
|
487 | 487 | if c.a_ref_type == 'rev': # this looks like a free range where target is ancestor |
|
488 | 488 | cs_a = org_scm_instance.get_changeset(c.a_rev) |
|
489 | 489 | root_parents = c.cs_ranges[0].parents |
|
490 | 490 | c.is_range = cs_a in root_parents |
|
491 | 491 | #c.merge_root = len(root_parents) > 1 # a range starting with a merge might deserve a warning |
|
492 | 492 | except ChangesetDoesNotExistError: # probably because c.a_rev not found |
|
493 | 493 | pass |
|
494 | 494 | except IndexError: # probably because c.cs_ranges is empty, probably because revisions are missing |
|
495 | 495 | pass |
|
496 | 496 | |
|
497 | 497 | avail_revs = set() |
|
498 | 498 | avail_show = [] |
|
499 | 499 | c.cs_branch_name = c.cs_ref_name |
|
500 | 500 | c.a_branch_name = None |
|
501 | 501 | other_scm_instance = c.a_repo.scm_instance |
|
502 | 502 | c.update_msg = "" |
|
503 | 503 | c.update_msg_other = "" |
|
504 | 504 | try: |
|
505 | 505 | if not c.cs_ranges: |
|
506 | 506 | c.update_msg = _('Error: changesets not found when displaying pull request from %s.') % c.cs_rev |
|
507 | 507 | elif org_scm_instance.alias == 'hg' and c.a_ref_name != 'ancestor': |
|
508 | 508 | if c.cs_ref_type != 'branch': |
|
509 | 509 | c.cs_branch_name = org_scm_instance.get_changeset(c.cs_ref_name).branch # use ref_type ? |
|
510 | 510 | c.a_branch_name = c.a_ref_name |
|
511 | 511 | if c.a_ref_type != 'branch': |
|
512 | 512 | try: |
|
513 | 513 | c.a_branch_name = other_scm_instance.get_changeset(c.a_ref_name).branch # use ref_type ? |
|
514 | 514 | except EmptyRepositoryError: |
|
515 | 515 | c.a_branch_name = 'null' # not a branch name ... but close enough |
|
516 | 516 | # candidates: descendants of old head that are on the right branch |
|
517 | 517 | # and not are the old head itself ... |
|
518 | 518 | # and nothing at all if old head is a descendant of target ref name |
|
519 | 519 | if not c.is_range and other_scm_instance._repo.revs('present(%s)::&%s', c.cs_ranges[-1].raw_id, c.a_branch_name): |
|
520 | 520 | c.update_msg = _('This pull request has already been merged to %s.') % c.a_branch_name |
|
521 | 521 | elif c.pull_request.is_closed(): |
|
522 | 522 | c.update_msg = _('This pull request has been closed and can not be updated.') |
|
523 | 523 | else: # look for descendants of PR head on source branch in org repo |
|
524 | 524 | avail_revs = org_scm_instance._repo.revs('%s:: & branch(%s)', |
|
525 | 525 | revs[0], c.cs_branch_name) |
|
526 | 526 | if len(avail_revs) > 1: # more than just revs[0] |
|
527 | 527 | # also show changesets that not are descendants but would be merged in |
|
528 | 528 | targethead = other_scm_instance.get_changeset(c.a_branch_name).raw_id |
|
529 | 529 | if org_scm_instance.path != other_scm_instance.path: |
|
530 | 530 | # Note: org_scm_instance.path must come first so all |
|
531 | 531 | # valid revision numbers are 100% org_scm compatible |
|
532 | 532 | # - both for avail_revs and for revset results |
|
533 | 533 | hgrepo = mercurial.unionrepo.makeunionrepository(org_scm_instance.baseui, |
|
534 | 534 | safe_bytes(org_scm_instance.path), |
|
535 | 535 | safe_bytes(other_scm_instance.path)) |
|
536 | 536 | else: |
|
537 | 537 | hgrepo = org_scm_instance._repo |
|
538 | 538 | show = set(hgrepo.revs('::%ld & !::parents(%s) & !::%s', |
|
539 | 539 | avail_revs, revs[0], targethead)) |
|
540 | 540 | if show: |
|
541 | 541 | c.update_msg = _('The following additional changes are available on %s:') % c.cs_branch_name |
|
542 | 542 | else: |
|
543 | 543 | c.update_msg = _('No additional changesets found for iterating on this pull request.') |
|
544 | 544 | else: |
|
545 | 545 | show = set() |
|
546 | 546 | avail_revs = set() # drop revs[0] |
|
547 | 547 | c.update_msg = _('No additional changesets found for iterating on this pull request.') |
|
548 | 548 | |
|
549 | 549 | # TODO: handle branch heads that not are tip-most |
|
550 | 550 | brevs = org_scm_instance._repo.revs('%s - %ld - %s', c.cs_branch_name, avail_revs, revs[0]) |
|
551 | 551 | if brevs: |
|
552 | 552 | # also show changesets that are on branch but neither ancestors nor descendants |
|
553 | 553 | show.update(org_scm_instance._repo.revs('::%ld - ::%ld - ::%s', brevs, avail_revs, c.a_branch_name)) |
|
554 | 554 | show.add(revs[0]) # make sure graph shows this so we can see how they relate |
|
555 | 555 | c.update_msg_other = _('Note: Branch %s has another head: %s.') % (c.cs_branch_name, |
|
556 | 556 | org_scm_instance.get_changeset(max(brevs)).short_id) |
|
557 | 557 | |
|
558 | 558 | avail_show = sorted(show, reverse=True) |
|
559 | 559 | |
|
560 | 560 | elif org_scm_instance.alias == 'git': |
|
561 | 561 | c.cs_repo.scm_instance.get_changeset(c.cs_rev) # check it exists - raise ChangesetDoesNotExistError if not |
|
562 | 562 | c.update_msg = _("Git pull requests don't support iterating yet.") |
|
563 | 563 | except ChangesetDoesNotExistError: |
|
564 | 564 | c.update_msg = _('Error: some changesets not found when displaying pull request from %s.') % c.cs_rev |
|
565 | 565 | |
|
566 | 566 | c.avail_revs = avail_revs |
|
567 | 567 | c.avail_cs = [org_scm_instance.get_changeset(r) for r in avail_show] |
|
568 | 568 | c.avail_jsdata = graph_data(org_scm_instance, avail_show) |
|
569 | 569 | |
|
570 | 570 | raw_ids = [x.raw_id for x in c.cs_ranges] |
|
571 | 571 | c.cs_comments = c.cs_repo.get_comments(raw_ids) |
|
572 | 572 | c.cs_statuses = c.cs_repo.statuses(raw_ids) |
|
573 | 573 | |
|
574 | 574 | ignore_whitespace_diff = h.get_ignore_whitespace_diff(request.GET) |
|
575 | 575 | diff_context_size = h.get_diff_context_size(request.GET) |
|
576 | 576 | fulldiff = request.GET.get('fulldiff') |
|
577 | 577 | diff_limit = None if fulldiff else self.cut_off_limit |
|
578 | 578 | |
|
579 | 579 | # we swap org/other ref since we run a simple diff on one repo |
|
580 | 580 | log.debug('running diff between %s and %s in %s', |
|
581 | 581 | c.a_rev, c.cs_rev, org_scm_instance.path) |
|
582 | 582 | try: |
|
583 | 583 | raw_diff = diffs.get_diff(org_scm_instance, rev1=c.a_rev, rev2=c.cs_rev, |
|
584 | 584 | ignore_whitespace=ignore_whitespace_diff, context=diff_context_size) |
|
585 | 585 | except ChangesetDoesNotExistError: |
|
586 | 586 | raw_diff = safe_bytes(_("The diff can't be shown - the PR revisions could not be found.")) |
|
587 | 587 | diff_processor = diffs.DiffProcessor(raw_diff, diff_limit=diff_limit) |
|
588 | 588 | c.limited_diff = diff_processor.limited_diff |
|
589 | 589 | c.file_diff_data = [] |
|
590 | 590 | c.lines_added = 0 |
|
591 | 591 | c.lines_deleted = 0 |
|
592 | 592 | |
|
593 | 593 | for f in diff_processor.parsed: |
|
594 | 594 | st = f['stats'] |
|
595 | 595 | c.lines_added += st['added'] |
|
596 | 596 | c.lines_deleted += st['deleted'] |
|
597 | 597 | filename = f['filename'] |
|
598 | 598 | fid = h.FID('', filename) |
|
599 | 599 | html_diff = diffs.as_html(parsed_lines=[f]) |
|
600 | 600 | c.file_diff_data.append((fid, None, f['operation'], f['old_filename'], filename, html_diff, st)) |
|
601 | 601 | |
|
602 | 602 | # inline comments |
|
603 | 603 | c.inline_cnt = 0 |
|
604 | 604 | c.inline_comments = cc_model.get_inline_comments( |
|
605 | 605 | c.db_repo.repo_id, |
|
606 | 606 | pull_request=pull_request_id) |
|
607 | 607 | # count inline comments |
|
608 | 608 | for __, lines in c.inline_comments: |
|
609 | 609 | for comments in lines.values(): |
|
610 | 610 | c.inline_cnt += len(comments) |
|
611 | 611 | # comments |
|
612 | 612 | c.comments = cc_model.get_comments(c.db_repo.repo_id, pull_request=pull_request_id) |
|
613 | 613 | |
|
614 | 614 | # (badly named) pull-request status calculation based on reviewer votes |
|
615 | 615 | (c.pull_request_reviewers, |
|
616 | 616 | c.pull_request_pending_reviewers, |
|
617 | 617 | c.current_voting_result, |
|
618 | 618 | ) = cs_model.calculate_pull_request_result(c.pull_request) |
|
619 | 619 | c.changeset_statuses = db.ChangesetStatus.STATUSES |
|
620 | 620 | |
|
621 | 621 | c.is_ajax_preview = False |
|
622 | 622 | c.ancestors = None # [c.a_rev] ... but that is shown in another way |
|
623 | return render('/pullrequests/pullrequest_show.html') | |
|
623 | return base.render('/pullrequests/pullrequest_show.html') | |
|
624 | 624 | |
|
625 | 625 | @LoginRequired() |
|
626 | 626 | @HasRepoPermissionLevelDecorator('read') |
|
627 | @jsonify | |
|
627 | @base.jsonify | |
|
628 | 628 | def comment(self, repo_name, pull_request_id): |
|
629 | 629 | pull_request = db.PullRequest.get_or_404(pull_request_id) |
|
630 | 630 | allowed_to_change_status = self._is_allowed_to_change_status(pull_request) |
|
631 | 631 | return create_cs_pr_comment(repo_name, pull_request=pull_request, |
|
632 | 632 | allowed_to_change_status=allowed_to_change_status) |
|
633 | 633 | |
|
634 | 634 | @LoginRequired() |
|
635 | 635 | @HasRepoPermissionLevelDecorator('read') |
|
636 | @jsonify | |
|
636 | @base.jsonify | |
|
637 | 637 | def delete_comment(self, repo_name, comment_id): |
|
638 | 638 | return delete_cs_pr_comment(repo_name, comment_id) |
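
The pattern above recurs across the controllers in this changeset: instead of importing ``render``, ``jsonify`` and the controller base classes by name from ``kallithea.lib.base``, each controller imports the whole ``base`` module from ``kallithea.controllers`` and qualifies every helper. A minimal sketch of the resulting call sites (the controller name, action names and template path are made up for illustration)::

    from kallithea.controllers import base
    from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired


    class DemoController(base.BaseRepoController):

        @LoginRequired(allow_default_user=True)
        @HasRepoPermissionLevelDecorator('read')
        def index(self, repo_name):
            # helpers are referenced through the module ...
            return base.render('/demo/demo.html')

        @LoginRequired()
        @HasRepoPermissionLevelDecorator('read')
        @base.jsonify
        def demo_data(self, repo_name):
            # ... including the jsonify decorator
            return {'repo_name': repo_name}
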
@@ -1,35 +1,35 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | from tg import config |
|
15 | 15 | from tgext.routes import RoutedController |
|
16 | 16 | |
|
17 | from kallithea.controllers import base | |
|
17 | 18 | from kallithea.controllers.error import ErrorController |
|
18 | 19 | from kallithea.controllers.routing import make_map |
|
19 | from kallithea.lib.base import BaseController | |
|
20 | 20 | |
|
21 | 21 | |
|
22 | 22 | # This is the main Kallithea entry point; TurboGears will forward all requests |
|
23 | 23 | # to an instance of 'controller.root.RootController' in the configured |
|
24 | 24 | # 'application' module (set by app_cfg.py). Requests are forwarded to |
|
25 | 25 | # controllers based on the routing mapper that lives in this root instance. |
|
26 | 26 | # The mapper is configured using routes defined in routing.py. This use of the |
|
27 | 27 | # 'mapper' attribute is a feature of tgext.routes, which is activated by |
|
28 | 28 | # inheriting from its RoutedController class. |
|
29 | class RootController(RoutedController, BaseController): | |
|
29 | class RootController(RoutedController, base.BaseController): | |
|
30 | 30 | |
|
31 | 31 | def __init__(self): |
|
32 | 32 | self.mapper = make_map(config) |
|
33 | 33 | |
|
34 | 34 | # The URL '/error/document' (the default TG errorpage.path) should be handled by ErrorController.document |
|
35 | 35 | self.error = ErrorController() |
@@ -1,142 +1,142 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.search |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Search controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Aug 7, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import logging |
|
29 | 29 | import traceback |
|
30 | 30 | |
|
31 | 31 | from tg import config, request |
|
32 | 32 | from tg import tmpl_context as c |
|
33 | 33 | from tg.i18n import ugettext as _ |
|
34 | 34 | from whoosh.index import EmptyIndexError, exists_in, open_dir |
|
35 | 35 | from whoosh.qparser import QueryParser, QueryParserError |
|
36 | 36 | from whoosh.query import Phrase, Prefix |
|
37 | 37 | |
|
38 | from kallithea.controllers import base | |
|
38 | 39 | from kallithea.lib.auth import LoginRequired |
|
39 | from kallithea.lib.base import BaseRepoController, render | |
|
40 | 40 | from kallithea.lib.indexers import CHGSET_IDX_NAME, CHGSETS_SCHEMA, IDX_NAME, SCHEMA, WhooshResultWrapper |
|
41 | 41 | from kallithea.lib.page import Page |
|
42 | 42 | from kallithea.lib.utils2 import safe_int |
|
43 | 43 | from kallithea.model.repo import RepoModel |
|
44 | 44 | |
|
45 | 45 | |
|
46 | 46 | log = logging.getLogger(__name__) |
|
47 | 47 | |
|
48 | 48 | |
|
49 | class SearchController(BaseRepoController): | |
|
49 | class SearchController(base.BaseRepoController): | |
|
50 | 50 | |
|
51 | 51 | @LoginRequired(allow_default_user=True) |
|
52 | 52 | def index(self, repo_name=None): |
|
53 | 53 | c.repo_name = repo_name |
|
54 | 54 | c.formated_results = [] |
|
55 | 55 | c.runtime = '' |
|
56 | 56 | c.cur_query = request.GET.get('q', None) |
|
57 | 57 | c.cur_type = request.GET.get('type', 'content') |
|
58 | 58 | c.cur_search = search_type = {'content': 'content', |
|
59 | 59 | 'commit': 'message', |
|
60 | 60 | 'path': 'path', |
|
61 | 61 | 'repository': 'repository' |
|
62 | 62 | }.get(c.cur_type, 'content') |
|
63 | 63 | |
|
64 | 64 | index_name = { |
|
65 | 65 | 'content': IDX_NAME, |
|
66 | 66 | 'commit': CHGSET_IDX_NAME, |
|
67 | 67 | 'path': IDX_NAME |
|
68 | 68 | }.get(c.cur_type, IDX_NAME) |
|
69 | 69 | |
|
70 | 70 | schema_defn = { |
|
71 | 71 | 'content': SCHEMA, |
|
72 | 72 | 'commit': CHGSETS_SCHEMA, |
|
73 | 73 | 'path': SCHEMA |
|
74 | 74 | }.get(c.cur_type, SCHEMA) |
|
75 | 75 | |
|
76 | 76 | log.debug('IDX: %s', index_name) |
|
77 | 77 | log.debug('SCHEMA: %s', schema_defn) |
|
78 | 78 | |
|
79 | 79 | if c.cur_query: |
|
80 | 80 | cur_query = c.cur_query.lower() |
|
81 | 81 | log.debug(cur_query) |
|
82 | 82 | |
|
83 | 83 | if c.cur_query: |
|
84 | 84 | p = safe_int(request.GET.get('page'), 1) |
|
85 | 85 | highlight_items = set() |
|
86 | 86 | index_dir = config['index_dir'] |
|
87 | 87 | try: |
|
88 | 88 | if not exists_in(index_dir, index_name): |
|
89 | 89 | raise EmptyIndexError |
|
90 | 90 | idx = open_dir(index_dir, indexname=index_name) |
|
91 | 91 | searcher = idx.searcher() |
|
92 | 92 | |
|
93 | 93 | qp = QueryParser(search_type, schema=schema_defn) |
|
94 | 94 | if c.repo_name: |
|
95 | 95 | # use "repository_rawname:" instead of "repository:" |
|
96 | 96 | # for case-sensitive matching |
|
97 | 97 | cur_query = 'repository_rawname:%s %s' % (c.repo_name, cur_query) |
|
98 | 98 | try: |
|
99 | 99 | query = qp.parse(cur_query) |
|
100 | 100 | # extract words for highlight |
|
101 | 101 | if isinstance(query, Phrase): |
|
102 | 102 | highlight_items.update(query.words) |
|
103 | 103 | elif isinstance(query, Prefix): |
|
104 | 104 | highlight_items.add(query.text) |
|
105 | 105 | else: |
|
106 | 106 | for i in query.all_terms(): |
|
107 | 107 | if i[0] in ['content', 'message']: |
|
108 | 108 | highlight_items.add(i[1]) |
|
109 | 109 | |
|
110 | 110 | matcher = query.matcher(searcher) |
|
111 | 111 | |
|
112 | 112 | log.debug('query: %s', query) |
|
113 | 113 | log.debug('hl terms: %s', highlight_items) |
|
114 | 114 | results = searcher.search(query) |
|
115 | 115 | res_ln = len(results) |
|
116 | 116 | c.runtime = '%s results (%.3f seconds)' % ( |
|
117 | 117 | res_ln, results.runtime |
|
118 | 118 | ) |
|
119 | 119 | |
|
120 | 120 | repo_location = RepoModel().repos_path |
|
121 | 121 | c.formated_results = Page( |
|
122 | 122 | WhooshResultWrapper(search_type, searcher, matcher, |
|
123 | 123 | highlight_items, repo_location), |
|
124 | 124 | page=p, |
|
125 | 125 | item_count=res_ln, |
|
126 | 126 | items_per_page=10, |
|
127 | 127 | type=c.cur_type, |
|
128 | 128 | q=c.cur_query, |
|
129 | 129 | ) |
|
130 | 130 | |
|
131 | 131 | except QueryParserError: |
|
132 | 132 | c.runtime = _('Invalid search query. Try quoting it.') |
|
133 | 133 | searcher.close() |
|
134 | 134 | except EmptyIndexError: |
|
135 | 135 | log.error("Empty search index - run 'kallithea-cli index-create' regularly") |
|
136 | 136 | c.runtime = _('The server has no search index.') |
|
137 | 137 | except Exception: |
|
138 | 138 | log.error(traceback.format_exc()) |
|
139 | 139 | c.runtime = _('An error occurred during search operation.') |
|
140 | 140 | |
|
141 | 141 | # Return a rendered template |
|
142 | return render('/search/search.html') | |
|
142 | return base.render('/search/search.html') |
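
The search controller's Whoosh flow is: verify the index exists with ``exists_in``, open it with ``open_dir``, parse the query string with a ``QueryParser`` bound to the schema, and run ``searcher.search``. A self-contained sketch of the same flow, assuming a hypothetical index directory and index name (Kallithea itself takes these from ``index_dir`` in the configuration and from its own index-name constants)::

    from whoosh.index import exists_in, open_dir
    from whoosh.qparser import QueryParser

    index_dir = 'data/index'                 # assumed index location
    if exists_in(index_dir, 'fileindex'):    # 'fileindex' is illustrative
        idx = open_dir(index_dir, indexname='fileindex')
        with idx.searcher() as searcher:
            query = QueryParser('content', schema=idx.schema).parse('import')
            results = searcher.search(query)
            print('%s results (%.3f seconds)' % (len(results), results.runtime))
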
@@ -1,212 +1,212 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | # This program is free software: you can redistribute it and/or modify |
|
3 | 3 | # it under the terms of the GNU General Public License as published by |
|
4 | 4 | # the Free Software Foundation, either version 3 of the License, or |
|
5 | 5 | # (at your option) any later version. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | """ |
|
15 | 15 | kallithea.controllers.summary |
|
16 | 16 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ |
|
17 | 17 | |
|
18 | 18 | Summary controller for Kallithea |
|
19 | 19 | |
|
20 | 20 | This file was forked by the Kallithea project in July 2014. |
|
21 | 21 | Original author and date, and relevant copyright and licensing information is below: |
|
22 | 22 | :created_on: Apr 18, 2010 |
|
23 | 23 | :author: marcink |
|
24 | 24 | :copyright: (c) 2013 RhodeCode GmbH, and others. |
|
25 | 25 | :license: GPLv3, see LICENSE.md for more details. |
|
26 | 26 | """ |
|
27 | 27 | |
|
28 | 28 | import calendar |
|
29 | 29 | import itertools |
|
30 | 30 | import logging |
|
31 | 31 | import traceback |
|
32 | 32 | from datetime import date, timedelta |
|
33 | 33 | from time import mktime |
|
34 | 34 | |
|
35 | 35 | from beaker.cache import cache_region |
|
36 | 36 | from tg import request |
|
37 | 37 | from tg import tmpl_context as c |
|
38 | 38 | from tg.i18n import ugettext as _ |
|
39 | 39 | from webob.exc import HTTPBadRequest |
|
40 | 40 | |
|
41 | from kallithea.controllers import base | |
|
41 | 42 | from kallithea.lib import ext_json, webutils |
|
42 | 43 | from kallithea.lib.auth import HasRepoPermissionLevelDecorator, LoginRequired |
|
43 | from kallithea.lib.base import BaseRepoController, jsonify, render | |
|
44 | 44 | from kallithea.lib.conf import ALL_EXTS, ALL_READMES, LANGUAGES_EXTENSIONS_MAP |
|
45 | 45 | from kallithea.lib.markup_renderer import MarkupRenderer |
|
46 | 46 | from kallithea.lib.page import Page |
|
47 | 47 | from kallithea.lib.utils2 import safe_int, safe_str |
|
48 | 48 | from kallithea.lib.vcs.backends.base import EmptyChangeset |
|
49 | 49 | from kallithea.lib.vcs.exceptions import ChangesetError, EmptyRepositoryError, NodeDoesNotExistError |
|
50 | 50 | from kallithea.lib.vcs.nodes import FileNode |
|
51 | 51 | from kallithea.model import async_tasks, db |
|
52 | 52 | |
|
53 | 53 | |
|
54 | 54 | log = logging.getLogger(__name__) |
|
55 | 55 | |
|
56 | 56 | README_FILES = [''.join([x[0][0], x[1][0]]) for x in |
|
57 | 57 | sorted(list(itertools.product(ALL_READMES, ALL_EXTS)), |
|
58 | 58 | key=lambda y:y[0][1] + y[1][1])] |
|
59 | 59 | |
|
60 | 60 | |
|
61 | class SummaryController(BaseRepoController): | |
|
61 | class SummaryController(base.BaseRepoController): | |
|
62 | 62 | |
|
63 | 63 | def __get_readme_data(self, db_repo): |
|
64 | 64 | repo_name = db_repo.repo_name |
|
65 | 65 | log.debug('Looking for README file') |
|
66 | 66 | |
|
67 | 67 | @cache_region('long_term_file', '_get_readme_from_cache') |
|
68 | 68 | def _get_readme_from_cache(*_cache_keys): # parameters are not really used - only as caching key |
|
69 | 69 | readme_data = None |
|
70 | 70 | readme_file = None |
|
71 | 71 | try: |
|
72 | 72 | # gets the landing revision! or tip if fails |
|
73 | 73 | cs = db_repo.get_landing_changeset() |
|
74 | 74 | if isinstance(cs, EmptyChangeset): |
|
75 | 75 | raise EmptyRepositoryError() |
|
76 | 76 | renderer = MarkupRenderer() |
|
77 | 77 | for f in README_FILES: |
|
78 | 78 | try: |
|
79 | 79 | readme = cs.get_node(f) |
|
80 | 80 | if not isinstance(readme, FileNode): |
|
81 | 81 | continue |
|
82 | 82 | readme_file = f |
|
83 | 83 | log.debug('Found README file `%s` rendering...', |
|
84 | 84 | readme_file) |
|
85 | 85 | readme_data = renderer.render(safe_str(readme.content), |
|
86 | 86 | filename=f) |
|
87 | 87 | break |
|
88 | 88 | except NodeDoesNotExistError: |
|
89 | 89 | continue |
|
90 | 90 | except ChangesetError: |
|
91 | 91 | log.error(traceback.format_exc()) |
|
92 | 92 | pass |
|
93 | 93 | except EmptyRepositoryError: |
|
94 | 94 | pass |
|
95 | 95 | |
|
96 | 96 | return readme_data, readme_file |
|
97 | 97 | |
|
98 | 98 | kind = 'README' |
|
99 | 99 | return _get_readme_from_cache(repo_name, kind, c.db_repo.changeset_cache.get('raw_id')) |
|
100 | 100 | |
|
101 | 101 | @LoginRequired(allow_default_user=True) |
|
102 | 102 | @HasRepoPermissionLevelDecorator('read') |
|
103 | 103 | def index(self, repo_name): |
|
104 | 104 | p = safe_int(request.GET.get('page'), 1) |
|
105 | 105 | size = safe_int(request.GET.get('size'), 10) |
|
106 | 106 | try: |
|
107 | 107 | collection = c.db_repo_scm_instance.get_changesets(reverse=True) |
|
108 | 108 | except EmptyRepositoryError as e: |
|
109 | 109 | webutils.flash(e, category='warning') |
|
110 | 110 | collection = [] |
|
111 | 111 | c.cs_pagination = Page(collection, page=p, items_per_page=size) |
|
112 | 112 | page_revisions = [x.raw_id for x in list(c.cs_pagination)] |
|
113 | 113 | c.cs_comments = c.db_repo.get_comments(page_revisions) |
|
114 | 114 | c.cs_statuses = c.db_repo.statuses(page_revisions) |
|
115 | 115 | |
|
116 | 116 | c.ssh_repo_url = None |
|
117 | 117 | if request.authuser.is_default_user: |
|
118 | 118 | username = None |
|
119 | 119 | else: |
|
120 | 120 | username = request.authuser.username |
|
121 | 121 | if c.ssh_enabled: |
|
122 | 122 | c.ssh_repo_url = c.db_repo.clone_url(clone_uri_tmpl=c.clone_ssh_tmpl) |
|
123 | 123 | |
|
124 | 124 | c.clone_repo_url = c.db_repo.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=False, username=username) |
|
125 | 125 | c.clone_repo_url_id = c.db_repo.clone_url(clone_uri_tmpl=c.clone_uri_tmpl, with_id=True, username=username) |
|
126 | 126 | |
|
127 | 127 | if c.db_repo.enable_statistics: |
|
128 | 128 | c.show_stats = True |
|
129 | 129 | else: |
|
130 | 130 | c.show_stats = False |
|
131 | 131 | |
|
132 | 132 | stats = db.Statistics.query() \ |
|
133 | 133 | .filter(db.Statistics.repository == c.db_repo) \ |
|
134 | 134 | .scalar() |
|
135 | 135 | |
|
136 | 136 | c.stats_percentage = 0 |
|
137 | 137 | |
|
138 | 138 | if stats and stats.languages: |
|
139 | 139 | lang_stats_d = ext_json.loads(stats.languages) |
|
140 | 140 | lang_stats = [(x, {"count": y, |
|
141 | 141 | "desc": LANGUAGES_EXTENSIONS_MAP.get(x, '?')}) |
|
142 | 142 | for x, y in lang_stats_d.items()] |
|
143 | 143 | lang_stats.sort(key=lambda k: (-k[1]['count'], k[0])) |
|
144 | 144 | c.trending_languages = lang_stats[:10] |
|
145 | 145 | else: |
|
146 | 146 | c.trending_languages = [] |
|
147 | 147 | |
|
148 | 148 | c.enable_downloads = c.db_repo.enable_downloads |
|
149 | 149 | c.readme_data, c.readme_file = \ |
|
150 | 150 | self.__get_readme_data(c.db_repo) |
|
151 | return render('summary/summary.html') | |
|
151 | return base.render('summary/summary.html') | |
|
152 | 152 | |
|
153 | 153 | @LoginRequired() |
|
154 | 154 | @HasRepoPermissionLevelDecorator('read') |
|
155 | @jsonify | |
|
155 | @base.jsonify | |
|
156 | 156 | def repo_size(self, repo_name): |
|
157 | 157 | if request.is_xhr: |
|
158 | 158 | return c.db_repo._repo_size() |
|
159 | 159 | else: |
|
160 | 160 | raise HTTPBadRequest() |
|
161 | 161 | |
|
162 | 162 | @LoginRequired(allow_default_user=True) |
|
163 | 163 | @HasRepoPermissionLevelDecorator('read') |
|
164 | 164 | def statistics(self, repo_name): |
|
165 | 165 | if c.db_repo.enable_statistics: |
|
166 | 166 | c.show_stats = True |
|
167 | 167 | c.no_data_msg = _('No data ready yet') |
|
168 | 168 | else: |
|
169 | 169 | c.show_stats = False |
|
170 | 170 | c.no_data_msg = _('Statistics are disabled for this repository') |
|
171 | 171 | |
|
172 | 172 | td = date.today() + timedelta(days=1) |
|
173 | 173 | td_1m = td - timedelta(days=calendar.monthrange(td.year, td.month)[1]) |
|
174 | 174 | td_1y = td - timedelta(days=365) |
|
175 | 175 | |
|
176 | 176 | ts_min_m = mktime(td_1m.timetuple()) |
|
177 | 177 | ts_min_y = mktime(td_1y.timetuple()) |
|
178 | 178 | ts_max_y = mktime(td.timetuple()) |
|
179 | 179 | c.ts_min = ts_min_m |
|
180 | 180 | c.ts_max = ts_max_y |
|
181 | 181 | |
|
182 | 182 | stats = db.Statistics.query() \ |
|
183 | 183 | .filter(db.Statistics.repository == c.db_repo) \ |
|
184 | 184 | .scalar() |
|
185 | 185 | c.stats_percentage = 0 |
|
186 | 186 | if stats and stats.languages: |
|
187 | 187 | c.commit_data = ext_json.loads(stats.commit_activity) |
|
188 | 188 | c.overview_data = ext_json.loads(stats.commit_activity_combined) |
|
189 | 189 | |
|
190 | 190 | lang_stats_d = ext_json.loads(stats.languages) |
|
191 | 191 | lang_stats = [(x, {"count": y, |
|
192 | 192 | "desc": LANGUAGES_EXTENSIONS_MAP.get(x, '?')}) |
|
193 | 193 | for x, y in lang_stats_d.items()] |
|
194 | 194 | lang_stats.sort(key=lambda k: (-k[1]['count'], k[0])) |
|
195 | 195 | c.trending_languages = lang_stats[:10] |
|
196 | 196 | |
|
197 | 197 | last_rev = stats.stat_on_revision + 1 |
|
198 | 198 | c.repo_last_rev = c.db_repo_scm_instance.count() \ |
|
199 | 199 | if c.db_repo_scm_instance.revisions else 0 |
|
200 | 200 | if last_rev == 0 or c.repo_last_rev == 0: |
|
201 | 201 | pass |
|
202 | 202 | else: |
|
203 | 203 | c.stats_percentage = '%.2f' % ((float((last_rev)) / |
|
204 | 204 | c.repo_last_rev) * 100) |
|
205 | 205 | else: |
|
206 | 206 | c.commit_data = {} |
|
207 | 207 | c.overview_data = ([[ts_min_y, 0], [ts_max_y, 10]]) |
|
208 | 208 | c.trending_languages = [] |
|
209 | 209 | |
|
210 | 210 | recurse_limit = 500 # don't recurse more than 500 times when parsing |
|
211 | 211 | async_tasks.get_commits_stats(c.db_repo.repo_name, ts_min_y, ts_max_y, recurse_limit) |
|
212 | return render('summary/statistics.html') | |
|
212 | return base.render('summary/statistics.html') |
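
``__get_readme_data`` above caches the rendered README through Beaker's ``cache_region`` decorator: the arguments of the decorated function serve only as the cache key, which is why the repository name and the current changeset id are passed in even though the inner function ignores them. A standalone sketch of the same pattern, assuming an in-memory region configured inline (Kallithea itself configures its cache regions in the application configuration)::

    from beaker.cache import cache_region, cache_regions

    # assumed inline region setup, for illustration only
    cache_regions.update({
        'long_term_file': {'type': 'memory', 'expire': 3600},
    })

    @cache_region('long_term_file', '_get_demo_from_cache')
    def _get_demo_from_cache(repo_name, raw_id):
        # arguments only act as the cache key; the body runs on a cache miss
        return 'rendered README for %s@%s' % (repo_name, raw_id)

    print(_get_demo_from_cache('demo-repo', 'abc123'))  # computed
    print(_get_demo_from_cache('demo-repo', 'abc123'))  # returned from cache
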
@@ -1,295 +1,295 b'' | |||
|
1 | 1 | #!/usr/bin/env python3 |
|
2 | 2 | |
|
3 | 3 | |
|
4 | 4 | import re |
|
5 | 5 | import sys |
|
6 | 6 | |
|
7 | 7 | |
|
8 | 8 | ignored_modules = set(''' |
|
9 | 9 | argparse |
|
10 | 10 | base64 |
|
11 | 11 | bcrypt |
|
12 | 12 | binascii |
|
13 | 13 | bleach |
|
14 | 14 | calendar |
|
15 | 15 | celery |
|
16 | 16 | celery |
|
17 | 17 | chardet |
|
18 | 18 | click |
|
19 | 19 | collections |
|
20 | 20 | configparser |
|
21 | 21 | copy |
|
22 | 22 | csv |
|
23 | 23 | ctypes |
|
24 | 24 | datetime |
|
25 | 25 | dateutil |
|
26 | 26 | decimal |
|
27 | 27 | decorator |
|
28 | 28 | difflib |
|
29 | 29 | distutils |
|
30 | 30 | docutils |
|
31 | 31 | |
|
32 | 32 | errno |
|
33 | 33 | fileinput |
|
34 | 34 | functools |
|
35 | 35 | getpass |
|
36 | 36 | grp |
|
37 | 37 | hashlib |
|
38 | 38 | hmac |
|
39 | 39 | html |
|
40 | 40 | http |
|
41 | 41 | imp |
|
42 | 42 | importlib |
|
43 | 43 | inspect |
|
44 | 44 | io |
|
45 | 45 | ipaddr |
|
46 | 46 | IPython |
|
47 | 47 | isapi_wsgi |
|
48 | 48 | itertools |
|
49 | 49 | json |
|
50 | 50 | kajiki |
|
51 | 51 | ldap |
|
52 | 52 | logging |
|
53 | 53 | mako |
|
54 | 54 | markdown |
|
55 | 55 | mimetypes |
|
56 | 56 | mock |
|
57 | 57 | msvcrt |
|
58 | 58 | multiprocessing |
|
59 | 59 | operator |
|
60 | 60 | os |
|
61 | 61 | paginate |
|
62 | 62 | paginate_sqlalchemy |
|
63 | 63 | pam |
|
64 | 64 | paste |
|
65 | 65 | pkg_resources |
|
66 | 66 | platform |
|
67 | 67 | posixpath |
|
68 | 68 | pprint |
|
69 | 69 | pwd |
|
70 | 70 | pyflakes |
|
71 | 71 | pytest |
|
72 | 72 | pytest_localserver |
|
73 | 73 | random |
|
74 | 74 | re |
|
75 | 75 | routes |
|
76 | 76 | setuptools |
|
77 | 77 | shlex |
|
78 | 78 | shutil |
|
79 | 79 | smtplib |
|
80 | 80 | socket |
|
81 | 81 | ssl |
|
82 | 82 | stat |
|
83 | 83 | string |
|
84 | 84 | struct |
|
85 | 85 | subprocess |
|
86 | 86 | sys |
|
87 | 87 | tarfile |
|
88 | 88 | tempfile |
|
89 | 89 | textwrap |
|
90 | 90 | tgext |
|
91 | 91 | threading |
|
92 | 92 | time |
|
93 | 93 | traceback |
|
94 | 94 | traitlets |
|
95 | 95 | types |
|
96 | 96 | urllib |
|
97 | 97 | urlobject |
|
98 | 98 | uuid |
|
99 | 99 | warnings |
|
100 | 100 | webhelpers2 |
|
101 | 101 | webob |
|
102 | 102 | webtest |
|
103 | 103 | whoosh |
|
104 | 104 | win32traceutil |
|
105 | 105 | zipfile |
|
106 | 106 | '''.split()) |
|
107 | 107 | |
|
108 | 108 | top_modules = set(''' |
|
109 | 109 | kallithea.alembic |
|
110 | 110 | kallithea.bin |
|
111 | 111 | kallithea.config |
|
112 | 112 | kallithea.controllers |
|
113 | 113 | kallithea.templates.py |
|
114 | 114 | scripts |
|
115 | 115 | '''.split()) |
|
116 | 116 | |
|
117 | 117 | bottom_external_modules = set(''' |
|
118 | 118 | tg |
|
119 | 119 | mercurial |
|
120 | 120 | sqlalchemy |
|
121 | 121 | alembic |
|
122 | 122 | formencode |
|
123 | 123 | pygments |
|
124 | 124 | dulwich |
|
125 | 125 | beaker |
|
126 | 126 | psycopg2 |
|
127 | 127 | docs |
|
128 | 128 | setup |
|
129 | 129 | conftest |
|
130 | 130 | '''.split()) |
|
131 | 131 | |
|
132 | 132 | normal_modules = set(''' |
|
133 | 133 | kallithea |
|
134 | kallithea.controllers.base | |
|
134 | 135 | kallithea.lib |
|
135 | 136 | kallithea.lib.auth |
|
136 | 137 | kallithea.lib.auth_modules |
|
137 | kallithea.lib.base | |
|
138 | 138 | kallithea.lib.celerylib |
|
139 | 139 | kallithea.lib.db_manage |
|
140 | 140 | kallithea.lib.helpers |
|
141 | 141 | kallithea.lib.hooks |
|
142 | 142 | kallithea.lib.indexers |
|
143 | 143 | kallithea.lib.utils |
|
144 | 144 | kallithea.lib.utils2 |
|
145 | 145 | kallithea.lib.vcs |
|
146 | 146 | kallithea.lib.webutils |
|
147 | 147 | kallithea.model |
|
148 | 148 | kallithea.model.async_tasks |
|
149 | 149 | kallithea.model.scm |
|
150 | 150 | kallithea.templates.py |
|
151 | 151 | '''.split()) |
|
152 | 152 | |
|
153 | 153 | shown_modules = normal_modules | top_modules |
|
154 | 154 | |
|
155 | 155 | # break the chains somehow - this is a cleanup TODO list |
|
156 | 156 | known_violations = [ |
|
157 | 157 | ('kallithea.lib.auth_modules', 'kallithea.lib.auth'), # needs base&facade |
|
158 | 158 | ('kallithea.lib.utils', 'kallithea.model'), # clean up utils |
|
159 | 159 | ('kallithea.lib.utils', 'kallithea.model.db'), |
|
160 | 160 | ('kallithea.lib.utils', 'kallithea.model.scm'), |
|
161 | 161 | ('kallithea.model.async_tasks', 'kallithea.lib.helpers'), |
|
162 | 162 | ('kallithea.model.async_tasks', 'kallithea.lib.hooks'), |
|
163 | 163 | ('kallithea.model.async_tasks', 'kallithea.lib.indexers'), |
|
164 | 164 | ('kallithea.model.async_tasks', 'kallithea.model'), |
|
165 | 165 | ('kallithea.model', 'kallithea.lib.auth'), # auth.HasXXX |
|
166 | 166 | ('kallithea.model', 'kallithea.lib.auth_modules'), # validators |
|
167 | 167 | ('kallithea.model', 'kallithea.lib.helpers'), |
|
168 | 168 | ('kallithea.model', 'kallithea.lib.hooks'), # clean up hooks |
|
169 | 169 | ('kallithea.model', 'kallithea.model.scm'), |
|
170 | 170 | ('kallithea.model.scm', 'kallithea.lib.hooks'), |
|
171 | 171 | ] |
|
172 | 172 | |
|
173 | 173 | extra_edges = [ |
|
174 | 174 | ('kallithea.config', 'kallithea.controllers'), # through TG |
|
175 | 175 | ('kallithea.lib.auth', 'kallithea.lib.auth_modules'), # custom loader |
|
176 | 176 | ] |
|
177 | 177 | |
|
178 | 178 | |
|
179 | 179 | def normalize(s): |
|
180 | 180 | """Given a string with dot path, return the string it should be shown as.""" |
|
181 | 181 | parts = s.replace('.__init__', '').split('.') |
|
182 | 182 | short_2 = '.'.join(parts[:2]) |
|
183 | 183 | short_3 = '.'.join(parts[:3]) |
|
184 | 184 | short_4 = '.'.join(parts[:4]) |
|
185 | 185 | if parts[0] in ['scripts', 'contributor_data', 'i18n_utils']: |
|
186 | 186 | return 'scripts' |
|
187 | 187 | if short_3 == 'kallithea.model.meta': |
|
188 | 188 | return 'kallithea.model.db' |
|
189 | 189 | if parts[:4] == ['kallithea', 'lib', 'vcs', 'ssh']: |
|
190 | 190 | return 'kallithea.lib.vcs.ssh' |
|
191 | 191 | if short_4 in shown_modules: |
|
192 | 192 | return short_4 |
|
193 | 193 | if short_3 in shown_modules: |
|
194 | 194 | return short_3 |
|
195 | 195 | if short_2 in shown_modules: |
|
196 | 196 | return short_2 |
|
197 | 197 | if short_2 == 'kallithea.tests': |
|
198 | 198 | return None |
|
199 | 199 | if parts[0] in ignored_modules: |
|
200 | 200 | return None |
|
201 | 201 | assert parts[0] in bottom_external_modules, parts |
|
202 | 202 | return parts[0] |
|
203 | 203 | |
|
204 | 204 | |
|
205 | 205 | def main(filenames): |
|
206 | 206 | if not filenames or filenames[0].startswith('-'): |
|
207 | 207 | print('''\ |
|
208 | 208 | Usage: |
|
209 | 209 | hg files 'set:!binary()&grep("^#!.*python")' 'set:**.py' | xargs scripts/deps.py |
|
210 | 210 | dot -Tsvg deps.dot > deps.svg |
|
211 | 211 | ''') |
|
212 | 212 | raise SystemExit(1) |
|
213 | 213 | |
|
214 | 214 | files_imports = dict() # map filenames to its imports |
|
215 | 215 | import_deps = set() # set of tuples with module name and its imports |
|
216 | 216 | for fn in filenames: |
|
217 | 217 | with open(fn) as f: |
|
218 | 218 | s = f.read() |
|
219 | 219 | |
|
220 | 220 | dot_name = (fn[:-3] if fn.endswith('.py') else fn).replace('/', '.') |
|
221 | 221 | file_imports = set() |
|
222 | 222 | for m in re.finditer(r'^ *(?:from ([^ ]*) import (?:([a-zA-Z].*)|\(([^)]*)\))|import (.*))$', s, re.MULTILINE): |
|
223 | 223 | m_from, m_from_import, m_from_import2, m_import = m.groups() |
|
224 | 224 | if m_from: |
|
225 | 225 | pre = m_from + '.' |
|
226 | 226 | if pre.startswith('.'): |
|
227 | 227 | pre = dot_name.rsplit('.', 1)[0] + pre |
|
228 | 228 | importlist = m_from_import or m_from_import2 |
|
229 | 229 | else: |
|
230 | 230 | pre = '' |
|
231 | 231 | importlist = m_import |
|
232 | 232 | for imp in importlist.split('#', 1)[0].split(','): |
|
233 | 233 | full_imp = pre + imp.strip().split(' as ', 1)[0] |
|
234 | 234 | file_imports.add(full_imp) |
|
235 | 235 | import_deps.add((dot_name, full_imp)) |
|
236 | 236 | files_imports[fn] = file_imports |
|
237 | 237 | |
|
238 | 238 | # dump out all deps for debugging and analysis |
|
239 | 239 | with open('deps.txt', 'w') as f: |
|
240 | 240 | for fn, file_imports in sorted(files_imports.items()): |
|
241 | 241 | for file_import in sorted(file_imports): |
|
242 | 242 | if file_import.split('.', 1)[0] in ignored_modules: |
|
243 | 243 | continue |
|
244 | 244 | f.write('%s: %s\n' % (fn, file_import)) |
|
245 | 245 | |
|
246 | 246 | # find leaves that haven't been ignored - they are the important external dependencies and are shown in the bottom row |
|
247 | 247 | only_imported = set( |
|
248 | 248 | set(normalize(b) for a, b in import_deps) - |
|
249 | 249 | set(normalize(a) for a, b in import_deps) - |
|
250 | 250 | set([None, 'kallithea']) |
|
251 | 251 | ) |
|
252 | 252 | |
|
253 | 253 | normalized_dep_edges = set() |
|
254 | 254 | for dot_name, full_imp in import_deps: |
|
255 | 255 | a = normalize(dot_name) |
|
256 | 256 | b = normalize(full_imp) |
|
257 | 257 | if a is None or b is None or a == b: |
|
258 | 258 | continue |
|
259 | 259 | normalized_dep_edges.add((a, b)) |
|
260 | 260 | #print((dot_name, full_imp, a, b)) |
|
261 | 261 | normalized_dep_edges.update(extra_edges) |
|
262 | 262 | |
|
263 | 263 | unseen_shown_modules = shown_modules.difference(a for a, b in normalized_dep_edges).difference(b for a, b in normalized_dep_edges) |
|
264 | 264 | assert not unseen_shown_modules, unseen_shown_modules |
|
265 | 265 | |
|
266 | 266 | with open('deps.dot', 'w') as f: |
|
267 | 267 | f.write('digraph {\n') |
|
268 | 268 | f.write('subgraph { rank = same; %s}\n' % ''.join('"%s"; ' % s for s in sorted(top_modules))) |
|
269 | 269 | f.write('subgraph { rank = same; %s}\n' % ''.join('"%s"; ' % s for s in sorted(only_imported))) |
|
270 | 270 | for a, b in sorted(normalized_dep_edges): |
|
271 | 271 | f.write(' "%s" -> "%s"%s\n' % (a, b, ' [color=red]' if (a, b) in known_violations else ' [color=green]' if (a, b) in extra_edges else '')) |
|
272 | 272 | f.write('}\n') |
|
273 | 273 | |
|
274 | 274 | # verify dependencies by untangling dependency chain bottom-up: |
|
275 | 275 | todo = set(normalized_dep_edges) |
|
276 | 276 | for x in known_violations: |
|
277 | 277 | todo.remove(x) |
|
278 | 278 | |
|
279 | 279 | while todo: |
|
280 | 280 | depending = set(a for a, b in todo) |
|
281 | 281 | depended = set(b for a, b in todo) |
|
282 | 282 | drop = depended - depending |
|
283 | 283 | if not drop: |
|
284 | 284 | print('ERROR: cycles:', len(todo)) |
|
285 | 285 | for x in sorted(todo): |
|
286 | 286 | print('%s,' % (x,)) |
|
287 | 287 | raise SystemExit(1) |
|
288 | 288 | #for do_b in sorted(drop): |
|
289 | 289 | # print('Picking', do_b, '- unblocks:', ' '.join(a for a, b in sorted((todo)) if b == do_b)) |
|
290 | 290 | todo = set((a, b) for a, b in todo if b in depending) |
|
291 | 291 | #print() |
|
292 | 292 | |
|
293 | 293 | |
|
294 | 294 | if __name__ == '__main__': |
|
295 | 295 | main(sys.argv[1:]) |
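
The verification loop at the end of the script peels the dependency graph bottom-up: in each round it drops every edge whose target does not itself depend on anything still in ``todo``, and if a round can remove nothing while edges remain, those edges contain (or lead into) a cycle. A toy run of the same loop on made-up module names::

    # made-up edges: app.model and app.auth form a cycle
    todo = {
        ('app.web', 'app.model'),
        ('app.model', 'app.lib'),    # app.lib depends on nothing -> peeled first
        ('app.model', 'app.auth'),
        ('app.auth', 'app.model'),
    }

    while todo:
        depending = set(a for a, b in todo)
        depended = set(b for a, b in todo)
        drop = depended - depending  # targets with nothing left below them
        if not drop:
            print('cycle among:', sorted(todo))
            break
        todo = set((a, b) for a, b in todo if b in depending)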