@@ -0,0 +1,137 @@

.. _checklist-tickets:

=================
Ticket Checklists
=================


Ticket Description
==================

In general, these things really matter in the description:

- Reasoning / rationale: explain WHY the ticket makes sense and is important.

- How to reproduce: easy-to-follow steps are important.

- Observation: the problem, in short.

- Expectation: how it should be, in short.

- Specs: it is fine to draft them as well as you can.

If anything is unclear, please ask for a review or help on this via the
Community Portal or Slack channel.


Checklists for Tickets
======================

BUG
---

Definition: An existing function that does not work as expected for the user.

- Problem description
- Steps needed to reproduce (Gherkin)
- Link to the screen in question and/or description of how to find it via
  navigation
- Explanation of what the expected outcome is
- Any hints about the source of the problem
- Information about platform/browser/db/etc. where applicable
- Examples of other similar cases which have different behaviour

DESIGN
------

Definition: Styling and user interface issues, including cosmetic improvements
or the appearance and behaviour of frontend functionality.

- Screenshot/animation of the existing page/behaviour
- Sketches or wireframes, if available
- Link to the screen in question and/or description of how to find it via
  navigation
- Problem description
- Explanation of what the expected outcome is
- Since this may be examined by a designer, it should be written in a way that
  a non-developer can understand

EPIC
----

Definition: A collection of tickets which together complete a larger overall
project.

- Benefit explanation
- Clear objective: when is this complete?
- Explanations of exceptions/corner cases
- Documentation subtask
- Comprehensive wireframes and/or design subtasks
- Links to subtasks

FEATURE
-------

Definition: A new function in the software which previously did not exist.

- Benefit explanation
- Clear objective
- Explanations of exceptions/corner cases
- Documentation subtask
- Comprehensive wireframes and/or design subtasks

SUPPORT
-------

Definition: An issue related to a customer report.

- Link to the support ticket, if available
- Problem description
- Steps needed to reproduce (Gherkin)
- Link to the screen in question and/or description of how to find it via
  navigation
- Explanation of what the expected outcome is
- Any hints about the source of the problem
- Information about platform/browser/db/etc. where applicable
- Examples of other similar cases which have different behaviour

TASK
----

Definition: An improvement or step towards implementing a feature or fixing
a bug. Includes refactoring and other tech debt.

- Clear objective
- Benefit explanation
- Links to parent/related tickets


External links:

- Avoid linking to external images; they disappear over time. Please attach
  any relevant images to the ticket itself.

- External links in general also disappear over time. Consider copying the
  relevant bit of information into a comment, or write a paragraph summing up
  the general idea.


Hints
=====

Change Description
------------------

It can be tricky to figure out how to change the description of a ticket:
there is a very small pencil icon which has to be clicked once you open the
edit form of a ticket.


.. figure:: images/redmine-description.png
   :alt: Example of the pencil icon used to change the ticket description

   Shows an example of the pencil icon which lets you change the description.
@@ -0,0 +1,153 @@

==================================================
Code style and structure guide for frontend work
==================================================

About: Outline of frontend development practices.


Templates
=========

- Indent with 4 spaces in general.
- Embedded Python code follows the same conventions as in the backend.

A common problem is missing spaces around operators.
18 | ||||
|
19 | ||||
|
20 | ||||
|
21 | ||||
|
22 | Grunt | |||
|
23 | ===== | |||
|
24 | ||||
|
25 | We use Grunt to compile our JavaScript and LESS files. This is done automatically | |||
|
26 | when you start an instance. If you are changing these files, however, it is | |||
|
27 | recommended to amend `--reload` to the `runserver` command, or use `grunt watch` | |||
|
28 | - the Gruntfile is located in the base directory. For more info on Grunt, see | |||
|
29 | http://gruntjs.com/ | |||
|
30 | ||||
|
31 | ||||
|
32 | ||||
|
33 | ||||
|
34 | LESS CSS | |||
|
35 | ======== | |||
|
36 | ||||
|
37 | ||||
|
38 | Style | |||
|
39 | ----- | |||
|
40 | ||||
|
41 | - Use 4 spaces instead of tabs. | |||
|
42 | - Avoid ``!important``; it is very often an indicator for a problem. | |||
|
43 | ||||
|
44 | ||||
|
45 | ||||
|
46 | ||||
|
47 | Structure | |||
|
48 | --------- | |||
|
49 | ||||
|
50 | It is important that we maintain consistency in the LESS files so that things | |||
|
51 | scale properly. CSS is organized using LESS and then compiled into a CSS file | |||
|
52 | to be used in production. Find the class you need to change and change it | |||
|
53 | there. Do not add overriding styles at the end of the file. The LESS file will | |||
|
54 | be minified; use plenty of spacing and comments for readability. | |||
|
55 | ||||
|
56 | These will be kept in auxillary LESS files to be imported (in this order) at the top: | |||
|
57 | ||||
|
58 | - `fonts.less` (font-face declarations) | |||
|
59 | - `mixins` (place all LESS mixins here) | |||
|
60 | - `helpers` (basic classes for hiding mobile elements, centering, etc) | |||
|
61 | - `variables` (theme-specific colors, spacing, and fonts which might change later) | |||
|
62 | ||||
|
63 | ||||
|
64 | Sections of the primary LESS file are as follows. Add comments describing | |||
|
65 | layout and modules. | |||
|
66 | ||||
|
67 | .. code-block:: css | |||
|
68 | ||||
|
69 | //--- BASE ------------------// | |||
|
70 | Very basic, sitewide styles. | |||
|
71 | ||||
|
72 | //--- LAYOUT ------------------// | |||
|
73 | Essential layout, ex. containers and wrappers. | |||
|
74 | Do not put type styles in here. | |||
|
75 | ||||
|
76 | //--- MODULES ------------------// | |||
|
77 | Reusable sections, such as sidebars and menus. | |||
|
78 | ||||
|
79 | //--- THEME ------------------// | |||
|
80 | Theme styles, typography, etc. | |||
|
81 | ||||
|
82 | ||||
|
83 | ||||
|
84 | Formatting rules | |||
|
85 | ~~~~~~~~~~~~~~~~ | |||
|
86 | ||||
|
87 | - Each rule should be indented on a separate line (this is helpful for diff | |||
|
88 | checking). | |||
|
89 | ||||
|
90 | - Use a space after each colon and a semicolon after each last rule. | |||
|
91 | ||||
|
92 | - Put a blank line between each class. | |||
|
93 | ||||
|
94 | - Nested classes should be listed after the parent class' rules, separated with a | |||
|
95 | blank line, and indented. | |||
|
96 | ||||
|
97 | - Using the below as a guide, place each rule in order of its effect on content, | |||
|
98 | layout, sizing, and last listing minor style changes such as font color and | |||
|
99 | backgrounds. Not every possible rule is listed here; when adding new ones, | |||
|
100 | judge where it should go in the list based on that hierarchy. | |||
|
101 | ||||
|
102 | .. code-block:: scss | |||
|
103 | ||||
|
104 | .class { | |||
|
105 | content | |||
|
106 | list-style-type | |||
|
107 | position | |||
|
108 | float | |||
|
109 | top | |||
|
110 | right | |||
|
111 | bottom | |||
|
112 | left | |||
|
113 | height | |||
|
114 | max-height | |||
|
115 | min-height | |||
|
116 | width | |||
|
117 | max-width | |||
|
118 | min-width | |||
|
119 | margin | |||
|
120 | padding | |||
|
121 | indent | |||
|
122 | vertical-align | |||
|
123 | text-align | |||
|
124 | border | |||
|
125 | border-radius | |||
|
126 | font-size | |||
|
127 | line-height | |||
|
128 | font | |||
|
129 | font-style | |||
|
130 | font-variant | |||
|
131 | font-weight | |||
|
132 | color | |||
|
133 | text-shadow | |||
|
134 | background | |||
|
135 | background-color | |||
|
136 | box-shadow | |||
|
137 | background-url | |||
|
138 | background-position | |||
|
139 | background-repeat | |||
|
140 | background-cover | |||
|
141 | transitions | |||
|
142 | cursor | |||
|
143 | pointer-events | |||
|
144 | ||||
|
145 | .nested-class { | |||
|
146 | position | |||
|
147 | background-color | |||
|
148 | ||||
|
149 | &:hover { | |||
|
150 | color | |||
|
151 | } | |||
|
152 | } | |||
|
153 | } |
@@ -0,0 +1,111 @@

=======================
Contributing Overview
=======================


RhodeCode Community Edition is an open source code management platform. We
encourage contributions to our project from the community. This is a basic
overview of the procedures for adding your contribution to RhodeCode.


Check the Issue Tracker
=======================

Make an account at https://issues.rhodecode.com/account/register and browse
the current tickets for bugs to fix and tasks to do. Have a bug or feature
that you can't find in the tracker? Create a new issue for it. When you select
a ticket, make sure to assign it to yourself and mark it "in progress" to
avoid duplicated work.


Sign Up at code.rhodecode.com
=============================

Make an account at https://code.rhodecode.com/ using an email or your existing
GitHub, Bitbucket, Google, or Twitter account. Fork the repo you'd like to
contribute to; we suggest adding your username to the fork name. Clone your
fork to your computer. We use Mercurial for source control management; see
https://www.mercurial-scm.org/guide to get started quickly.


Set Up A Local Instance
=======================

You will need to set up an instance of RhodeCode CE using VCSServer so that
you can see your work locally as you make changes. We recommend using Linux
for this, but it can also be built on OSX.

See :doc:`dev-setup` for instructions.


Code!
=====

You can now make, see, and test your changes locally. We are always working to
keep our code clean and the cost of maintaining it low, and the same applies
to contributions. We run automated checks on our pull requests, and expect
understandable code. We also aim to provide test coverage for as much of our
codebase as possible; any new features should be accompanied by tests.

Keep in mind that when we accept your contribution, we also take
responsibility for it; we must understand it to take on that responsibility.

See :doc:`standards` for more detailed information.


Commit And Push Your Changes
============================

We highly recommend making a new bookmark for each feature, bug, or set of
commits you make so that you can point to it when creating your pull request.
Please also reference the ticket number in your commit messages. Don't forget
to push the bookmark!


Submit a Pull Request
=====================

Go to your fork, and choose "Create Pull Request" from the Options menu. Use
your bookmark as the source, and choose someone to review it. Don't worry
about choosing the right person; we'll assign the best contributor for the
job. You'll get feedback and an assigned status.

Be prepared to make updates to your pull request after some feedback.
Collaboration is part of the process, and improvements can often be made.


Sign the Contributor License Agreement
======================================

If your contribution is approved, you will need to virtually sign the license
agreement in order for it to be merged into the project's codebase. You can
read it on our website here: https://rhodecode.com/static/pdf/RhodeCode-CLA.pdf

To sign, go to code.rhodecode.com and clone the CLA repository. Add your name
and make a pull request to add it to the contributor agreement; this serves as
your virtual signature. Once your signature is merged, add a link to the
relevant commit to your contribution pull request.


That's it! We'll take it from there. Thanks for your contribution!
------------------------------------------------------------------

.. note:: If you have any questions or comments, feel free to contact us
   through either the community portal (community.rhodecode.com), IRC
   (irc.freenode.net), or Slack (rhodecode.com/join).
@@ -0,0 +1,177 @@

======================
Contribution Standards
======================

Standards help to improve the quality of our product and its development.
Herein we define our standards for processes and development to maintain
consistency and function well as a community. This is a work in progress;
modifications to this document should be discussed and agreed upon by the
community.


--------------------------------------------------------------------------------

Code
====

This provides an outline of the standards we use in our codebase to keep our
code easy to read and easy to maintain. Many of our code guidelines are based
on the book `Clean Code <http://www.pearsonhighered.com/educator/product/Clean-Code-A-Handbook-of-Agile-Software-Craftsmanship/9780132350884.page>`_
by Robert Martin.

We maintain a Tech Glossary to provide consistency in the terms and symbolic
names used for items and concepts within the application. This is found in
the CE project in /docs-internal/glossary.rst


Refactoring
-----------
Make it better than you found it!

Our codebase can always use improvement and often benefits from refactoring.
New code should be refactored as it is being written, and old code should be
treated with the same care as if it were new. Before doing any refactoring,
ensure that there is test coverage on the affected code; this will help
minimize issues.


Python
------
For Python, we use `PEP8 <https://www.python.org/dev/peps/pep-0008/>`_.
We keep lines of code under 80 characters and use 4 spaces for indentation.


JavaScript
----------
This currently remains undefined. Suggestions welcome!


HTML
----
Unfortunately, we currently have no strict HTML standards, but there are a few
guidelines we do follow. Templates must work in all modern browsers. HTML
should be clean and easy to read, and additionally should be free of inline
CSS or JavaScript. It is recommended to use data attributes for JS actions
where possible, in order to separate them from styling and prevent
unintentional changes.


LESS/CSS
--------
We use LESS for our CSS; see :doc:`frontend` for structure and formatting
guidelines.


Linters
-------
Currently, we have a linter for pull requests which checks code against PEP8.
We intend to add more in the future as we clarify standards.


--------------------------------------------------------------------------------

Naming Conventions
==================

These still need to be defined for naming everything from Python variables to
HTML classes to files and folders.


--------------------------------------------------------------------------------

Testing
=======

Testing is a very important aspect of our process, especially as we are our
own quality control team. While it is of course unrealistic to hit every
potential combination, our goal is to cover every line of Python code with a
test.

The following is a brief introduction to our test suite. Our tests are
primarily written using `py.test <http://pytest.org/>`_.


Acceptance Tests
----------------
Also known as "ac tests", these test from the user and business perspective to
check whether the requirements of a feature are met. Scenarios are created at
a feature's inception and help to define its value.

py.test is used for ac tests; they are located in a repo separate from the
other tests which follow. Each feature has a .feature file which contains a
brief description and the scenarios to be tested.


Functional Tests
----------------
These test specific functionality in the application through the entire stack.
Typically these are user actions or permission checks which go through the web
browser. They are located in rhodecode/tests.


Unit Tests
----------
These test isolated, individual modules to ensure that they function
correctly. They are located in rhodecode/tests.
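As a sketch of the py.test style used here, a minimal unit test module might
look like the following. The helper and its behaviour are invented for the
example; in the real codebase the function under test would be imported from a
rhodecode module rather than defined inline.

```python
# test_slug.py -- a minimal py.test-style unit test (illustrative only).

def repo_name_slug(value):
    """Hypothetical helper: turn a repository name into a URL slug."""
    return value.strip().lower().replace(' ', '-')


def test_repo_name_slug_replaces_spaces():
    assert repo_name_slug('My Repo') == 'my-repo'


def test_repo_name_slug_strips_whitespace():
    assert repo_name_slug('  Trimmed  ') == 'trimmed'
```

Running ``py.test test_slug.py`` discovers and runs any ``test_*`` functions
in the module.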


Integration Tests
-----------------
These are used for testing the performance of larger systems than the unit
tests cover. They are located in rhodecode/tests.


JavaScript Testing
------------------
Currently, we have not defined how to test our JavaScript code.


--------------------------------------------------------------------------------

Pull Requests
=============

Pull requests should generally contain only one thing: a single feature, one
bug fix, etc. The commit history should be concise and clean, and the pull
request should contain the ticket number (also a good idea for the commits
themselves) to provide context for the reviewer.

See also: :doc:`checklist-pull-request`


Reviewers
---------
Each pull request must be approved by at least one member of the RhodeCode
team. Assignments may be based on expertise or familiarity with a particular
area of code, or simply on availability. Switching up or adding extra
community members for different pull requests helps to share knowledge as
well as provide other perspectives.


Responsibility
--------------
The community is responsible for maintaining features, and this must be taken
into consideration. External contributions must be held to the same standards
as internal contributions.


Feature Switch
--------------
Experimental and work-in-progress features can be hidden behind one of two
switches:

* A setting can be added to the Labs page in the Admin section, which may
  allow customers to access and toggle additional features.
* For work-in-progress or other features where customer access is not desired,
  use a custom setting in the .ini file as a trigger.
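The .ini-trigger approach can be sketched as below. The setting name
``myfeature_enabled`` is hypothetical, and ``str2bool`` is re-implemented
inline so the sketch is self-contained; the codebase itself provides
``rhodecode.lib.utils2.str2bool`` and uses this pattern for the
``labs_settings_active`` setting.

```python
def str2bool(value):
    """Interpret common truthy strings found in .ini files."""
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('true', 'yes', 'on', '1')


def includeme(settings):
    # In a real Pyramid app, `settings` would come from config.get_settings().
    if str2bool(settings.get('myfeature_enabled', 'false')):
        # Wire up the experimental views/routes here.
        return 'feature enabled'
    return 'feature disabled'
```

With ``myfeature_enabled = true`` in the .ini file the feature is wired up;
absent or falsy values leave it hidden.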


--------------------------------------------------------------------------------

Tickets
=======

Redmine tickets are a crucial part of our development process. Any code added
or changed in our codebase should have a corresponding ticket to document it.
With this in mind, it is important that tickets be as clear and concise as
possible, including what the expected outcome is.

See also: :doc:`checklist-tickets`
@@ -0,0 +1,70 @@
|RCE| 4.2.0 |RNS|
-----------------

Release Date
^^^^^^^^^^^^

- 2016-06-30


General
^^^^^^^

- Autocomplete: added GET flag support to show/hide active users in
  autocomplete, and display this information in autocomplete data. Ref #3374.
- Gravatar: added a flag to show the current gravatar + user as a disabled
  (non-active) user.
- Repos, repo groups, user groups: allow disabled users in the owner field.
  This fixes #3374.
- Repos, repo groups, user groups: visually show which user is the owner of a
  resource, and whether that user is disabled in the system.
- Pull requests: reordered navigation on repo pull requests. Fixes #2995.
- Dependencies: bumped dulwich to 0.13.0.

New Features
^^^^^^^^^^^^

- My account: modernized the aggregate pull request view for users; it now
  uses a look consistent with the repo one.
- Emails: expose the profile link in the registration email that super-admins
  receive. Implements #3999.
- Social auth: moved the buttons to the top of the nav so they are easier to
  reach.


Security
^^^^^^^^

- Encryption: allow passing in an alternative key for encrypting values.
  Users can now use `rhodecode.encrypted_values.secret` as an alternative
  key, de-coupled from the `beaker.session` key.
- Encryption: implemented a slightly improved AesCipher encryption.
  This addresses issues from #4036.
- Encryption: encrypted values now use HMAC signatures by default to detect
  whether the data or secret key is incorrect. The strict checks can be
  disabled using the `rhodecode.encrypted_values.strict = false` .ini setting.


Performance
^^^^^^^^^^^

- SQL: use smarter JOINs when fetching repository information.
- Helpers: optimized calls to link_to_user to save some intense queries.
- App settings: use computed caches for repository settings; in some cases
  this brings an almost 4x performance increase for large repos with a lot of
  issue tracker patterns.


Fixes
^^^^^

- Fixed events on user pre/post create actions.
- Authentication: fixed a problem with saving forms with errors on auth
  plugins.
- Svn: avoid chunked transfer for Subversion, which caused checkout issues in
  some cases.
- Users: fixed the generate-new-user-password helper.
- Celery: fixed a problem with workers running actions in sync mode in some
  cases.
- Setup-db: fixed a redundant question about a writable dir. The question
  needs to be asked only when the dir is actually not writable.
- Elasticsearch: fixed issues when searching a single repo using
  Elasticsearch.
- Social auth: fixed issues with non-active users using social authentication
  causing a 500 error.
- Fixed a problem with largefiles extensions in per-repo settings using local
  .hgrc files present inside the repo directory.
@@ -0,0 +1,40 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


from rhodecode.admin.navigation import NavigationRegistry
from rhodecode.config.routing import ADMIN_PREFIX
from rhodecode.lib.utils2 import str2bool


def includeme(config):
    settings = config.get_settings()

    # Create the admin navigation registry and add it to the pyramid registry.
    labs_active = str2bool(settings.get('labs_settings_active', False))
    navigation_registry = NavigationRegistry(labs_active=labs_active)
    config.registry.registerUtility(navigation_registry)

    config.add_route(
        name='admin_settings_open_source',
        pattern=ADMIN_PREFIX + '/settings/open_source')

    # Scan the module for configuration decorators.
    config.scan()
@@ -0,0 +1,29 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

from zope.interface import Interface


class IAdminNavigationRegistry(Interface):
    """
    Interface for the admin navigation registry. Currently this is only
    used to register the registry and retrieve it via Pyramid's registry.
    """
    pass
@@ -0,0 +1,124 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |||
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |||
|
20 | ||||
|
21 | ||||
|
22 | import logging | |||
|
23 | import collections | |||
|
24 | ||||
|
25 | from pylons import url | |||
|
26 | from zope.interface import implementer | |||
|
27 | ||||
|
28 | from rhodecode.admin.interfaces import IAdminNavigationRegistry | |||
|
29 | from rhodecode.lib.utils import get_registry | |||
|
30 | from rhodecode.translation import _ | |||
|
31 | ||||
|
32 | ||||
|
33 | log = logging.getLogger(__name__) | |||
|
34 | ||||
|
35 | NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url']) | |||
|
36 | ||||
|
37 | ||||
|
38 | class NavEntry(object): | |||
|
39 | """ | |||
|
40 | Represents an entry in the admin navigation. | |||
|
41 | ||||
|
42 | :param key: Unique identifier used to store reference in an OrderedDict. | |||
|
43 | :param name: Display name, usually a translation string. | |||
|
44 | :param view_name: Name of the view, used to generate the URL. | |||
|
45 | :param pyramid: Indicator to use pyramid for URL generation. This should | |||
|
46 | be removed as soon as we are fully migrated to pyramid. | |||
|
47 | """ | |||
|
48 | ||||
|
49 | def __init__(self, key, name, view_name, pyramid=False): | |||
|
50 | self.key = key | |||
|
51 | self.name = name | |||
|
52 | self.view_name = view_name | |||
|
53 | self.pyramid = pyramid | |||
|
54 | ||||
|
55 | def generate_url(self, request): | |||
|
56 | if self.pyramid: | |||
|
57 | if hasattr(request, 'route_path'): | |||
|
58 | return request.route_path(self.view_name) | |||
|
59 | else: | |||
|
60 | # TODO: johbo: Remove this after migrating to pyramid. | |||
|
61 | # We need the pyramid request here to generate URLs to pyramid | |||
|
62 | # views from within pylons views. | |||
|
63 | from pyramid.threadlocal import get_current_request | |||
|
64 | pyramid_request = get_current_request() | |||
|
65 | return pyramid_request.route_path(self.view_name) | |||
|
66 | else: | |||
|
67 | return url(self.view_name) | |||
|
68 | ||||
|
69 | ||||
|
70 | @implementer(IAdminNavigationRegistry) | |||
|
71 | class NavigationRegistry(object): | |||
|
72 | ||||
|
73 | _base_entries = [ | |||
|
74 | NavEntry('global', _('Global'), 'admin_settings_global'), | |||
|
75 | NavEntry('vcs', _('VCS'), 'admin_settings_vcs'), | |||
|
76 | NavEntry('visual', _('Visual'), 'admin_settings_visual'), | |||
|
77 | NavEntry('mapping', _('Remap and Rescan'), 'admin_settings_mapping'), | |||
|
78 | NavEntry('issuetracker', _('Issue Tracker'), | |||
|
79 | 'admin_settings_issuetracker'), | |||
|
80 | NavEntry('email', _('Email'), 'admin_settings_email'), | |||
|
81 | NavEntry('hooks', _('Hooks'), 'admin_settings_hooks'), | |||
|
82 | NavEntry('search', _('Full Text Search'), 'admin_settings_search'), | |||
|
83 | NavEntry('system', _('System Info'), 'admin_settings_system'), | |||
|
84 | NavEntry('open_source', _('Open Source Licenses'), | |||
|
85 | 'admin_settings_open_source', pyramid=True), | |||
|
86 | # TODO: marcink: we disable supervisor now until the supervisor stats | |||
|
87 | # page is fixed in the nix configuration | |||
|
88 | # NavEntry('supervisor', _('Supervisor'), 'admin_settings_supervisor'), | |||
|
89 | ] | |||
|
90 | ||||
|
91 | _labs_entry = NavEntry('labs', _('Labs'), | |||
|
92 | 'admin_settings_labs') | |||
|
93 | ||||
|
94 | def __init__(self, labs_active=False): | |||
|
95 | self._registered_entries = collections.OrderedDict([ | |||
|
96 | (item.key, item) for item in self.__class__._base_entries | |||
|
97 | ]) | |||
|
98 | ||||
|
99 | if labs_active: | |||
|
100 | self.add_entry(self._labs_entry) | |||
|
101 | ||||
|
102 | def add_entry(self, entry): | |||
|
103 | self._registered_entries[entry.key] = entry | |||
|
104 | ||||
|
105 | def get_navlist(self, request): | |||
|
106 | navlist = [NavListEntry(i.key, i.name, i.generate_url(request)) | |||
|
107 | for i in self._registered_entries.values()] | |||
|
108 | return navlist | |||
|
109 | ||||
|
110 | ||||
|
111 | def navigation_registry(request): | |||
|
112 | """ | |||
|
113 | Helper that returns the admin navigation registry. | |||
|
114 | """ | |||
|
115 | pyramid_registry = get_registry(request) | |||
|
116 | nav_registry = pyramid_registry.queryUtility(IAdminNavigationRegistry) | |||
|
117 | return nav_registry | |||
|
118 | ||||
|
119 | ||||
|
120 | def navigation_list(request): | |||
|
121 | """ | |||
|
122 | Helper that returns the admin navigation as list of NavListEntry objects. | |||
|
123 | """ | |||
|
124 | return navigation_registry(request).get_navlist(request) |
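The navigation module in this diff reduces to an ordered registry of entries whose URLs are resolved per request. Below is a minimal, self-contained sketch of that pattern; the `route_path` callable and the example URLs are assumptions standing in for the Pyramid/Pylons URL generators, not the RhodeCode implementation:

```python
import collections

NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url'])


class NavEntry(object):
    """One admin navigation item; view_name is resolved to a URL later."""

    def __init__(self, key, name, view_name):
        self.key = key
        self.name = name
        self.view_name = view_name

    def generate_url(self, route_path):
        # route_path stands in for request.route_path / pylons' url()
        return route_path(self.view_name)


class NavigationRegistry(object):
    def __init__(self, entries):
        # OrderedDict keyed by entry key: add_entry() can override an
        # existing entry while keeping the original insertion order
        self._entries = collections.OrderedDict(
            (entry.key, entry) for entry in entries)

    def add_entry(self, entry):
        self._entries[entry.key] = entry

    def get_navlist(self, route_path):
        return [NavListEntry(e.key, e.name, e.generate_url(route_path))
                for e in self._entries.values()]


registry = NavigationRegistry([
    NavEntry('global', 'Global', 'admin_settings_global'),
    NavEntry('vcs', 'VCS', 'admin_settings_vcs'),
])
navlist = registry.get_navlist(lambda view_name: '/_admin/settings/' + view_name)
```

Keying the OrderedDict by `entry.key` is what lets the optional Labs entry be appended (or an existing entry replaced) without disturbing the order of the base entries.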
@@ -0,0 +1,55 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |||
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |||
|
20 | ||||
|
21 | import collections | |||
|
22 | import logging | |||
|
23 | ||||
|
24 | from pylons import tmpl_context as c | |||
|
25 | from pyramid.view import view_config | |||
|
26 | ||||
|
27 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator | |||
|
28 | from rhodecode.lib.utils import read_opensource_licenses | |||
|
29 | ||||
|
30 | from .navigation import navigation_list | |||
|
31 | ||||
|
32 | ||||
|
33 | log = logging.getLogger(__name__) | |||
|
34 | ||||
|
35 | ||||
|
36 | class AdminSettingsView(object): | |||
|
37 | ||||
|
38 | def __init__(self, context, request): | |||
|
39 | self.request = request | |||
|
40 | self.context = context | |||
|
41 | self.session = request.session | |||
|
42 | self._rhodecode_user = request.user | |||
|
43 | ||||
|
44 | @LoginRequired() | |||
|
45 | @HasPermissionAllDecorator('hg.admin') | |||
|
46 | @view_config( | |||
|
47 | route_name='admin_settings_open_source', request_method='GET', | |||
|
48 | renderer='rhodecode:templates/admin/settings/settings.html') | |||
|
49 | def open_source_licenses(self): | |||
|
50 | c.active = 'open_source' | |||
|
51 | c.navlist = navigation_list(self.request) | |||
|
52 | c.opensource_licenses = collections.OrderedDict( | |||
|
53 | sorted(read_opensource_licenses().items(), key=lambda t: t[0])) | |||
|
54 | ||||
|
55 | return {} |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,92 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |||
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |||
|
20 | ||||
|
21 | ||||
|
22 | import pytest | |||
|
23 | ||||
|
24 | ||||
|
25 | class EnabledAuthPlugin(): | |||
|
26 | """ | |||
|
27 | Context manager that updates the 'auth_plugins' setting in DB to enable | |||
|
28 | a plugin. Previous setting is restored on exit. The rhodecode auth plugin | |||
|
29 | is also enabled because it is needed to log in the test users. | |||
|
30 | """ | |||
|
31 | ||||
|
32 | def __init__(self, plugin): | |||
|
33 | self.new_value = set([ | |||
|
34 | 'egg:rhodecode-enterprise-ce#rhodecode', | |||
|
35 | plugin.get_id() | |||
|
36 | ]) | |||
|
37 | ||||
|
38 | def __enter__(self): | |||
|
39 | from rhodecode.model.settings import SettingsModel | |||
|
40 | self._old_value = SettingsModel().get_auth_plugins() | |||
|
41 | SettingsModel().create_or_update_setting( | |||
|
42 | 'auth_plugins', ','.join(self.new_value)) | |||
|
43 | ||||
|
44 | def __exit__(self, type, value, traceback): | |||
|
45 | from rhodecode.model.settings import SettingsModel | |||
|
46 | SettingsModel().create_or_update_setting( | |||
|
47 | 'auth_plugins', ','.join(self._old_value)) | |||
|
48 | ||||
|
49 | ||||
|
50 | class DisabledAuthPlugin(): | |||
|
51 | """ | |||
|
52 | Context manager that updates the 'auth_plugins' setting in DB to disable | |||
|
53 | a plugin. Previous setting is restored on exit. | |||
|
54 | """ | |||
|
55 | ||||
|
56 | def __init__(self, plugin): | |||
|
57 | self.plugin_id = plugin.get_id() | |||
|
58 | ||||
|
59 | def __enter__(self): | |||
|
60 | from rhodecode.model.settings import SettingsModel | |||
|
61 | self._old_value = SettingsModel().get_auth_plugins() | |||
|
62 | new_value = [id_ for id_ in self._old_value if id_ != self.plugin_id] | |||
|
63 | SettingsModel().create_or_update_setting( | |||
|
64 | 'auth_plugins', ','.join(new_value)) | |||
|
65 | ||||
|
66 | def __exit__(self, type, value, traceback): | |||
|
67 | from rhodecode.model.settings import SettingsModel | |||
|
68 | SettingsModel().create_or_update_setting( | |||
|
69 | 'auth_plugins', ','.join(self._old_value)) | |||
|
70 | ||||
|
71 | ||||
|
72 | @pytest.fixture(params=[ | |||
|
73 | ('rhodecode.authentication.plugins.auth_crowd', 'egg:rhodecode-enterprise-ce#crowd'), | |||
|
74 | ('rhodecode.authentication.plugins.auth_headers', 'egg:rhodecode-enterprise-ce#headers'), | |||
|
75 | ('rhodecode.authentication.plugins.auth_jasig_cas', 'egg:rhodecode-enterprise-ce#jasig_cas'), | |||
|
76 | ('rhodecode.authentication.plugins.auth_ldap', 'egg:rhodecode-enterprise-ce#ldap'), | |||
|
77 | ('rhodecode.authentication.plugins.auth_pam', 'egg:rhodecode-enterprise-ce#pam'), | |||
|
78 | ('rhodecode.authentication.plugins.auth_rhodecode', 'egg:rhodecode-enterprise-ce#rhodecode'), | |||
|
79 | ('rhodecode.authentication.plugins.auth_token', 'egg:rhodecode-enterprise-ce#token'), | |||
|
80 | ]) | |||
|
81 | def auth_plugin(request): | |||
|
82 | """ | |||
|
83 | Fixture that provides an instance of each authentication plugin. These | |||
|
84 | instances are NOT the instances which are registered to the authentication | |||
|
85 | registry. | |||
|
86 | """ | |||
|
87 | from importlib import import_module | |||
|
88 | ||||
|
89 | # Create plugin instance. | |||
|
90 | module, plugin_id = request.param | |||
|
91 | plugin_module = import_module(module) | |||
|
92 | return plugin_module.plugin_factory(plugin_id) |
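The `EnabledAuthPlugin`/`DisabledAuthPlugin` context managers above follow a plain save-and-restore pattern. A self-contained sketch of that pattern, with a dict standing in for `SettingsModel` (an assumption; the real code persists the `auth_plugins` setting in the database):

```python
# Stand-in for the database-backed settings store.
SETTINGS = {'auth_plugins': 'egg:rhodecode-enterprise-ce#rhodecode'}


class EnabledPlugin(object):
    """Temporarily enable a plugin; the previous value is restored on exit."""

    def __init__(self, plugin_id):
        # the built-in rhodecode plugin stays enabled so test users can log in
        self.new_value = sorted(
            {'egg:rhodecode-enterprise-ce#rhodecode', plugin_id})

    def __enter__(self):
        self._old_value = SETTINGS['auth_plugins']
        SETTINGS['auth_plugins'] = ','.join(self.new_value)
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # restore the previous setting even if the with-body raised
        SETTINGS['auth_plugins'] = self._old_value


with EnabledPlugin('egg:rhodecode-enterprise-ce#ldap'):
    inside = SETTINGS['auth_plugins']
```

Because `__exit__` always runs, a failing test body cannot leave a foreign auth plugin enabled for the rest of the test session.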
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 |
@@ -0,0 +1,77 b'' | |||||
|
1 | # -*- coding: utf-8 -*- | |||
|
2 | ||||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |||
|
4 | # | |||
|
5 | # This program is free software: you can redistribute it and/or modify | |||
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
7 | # (only), as published by the Free Software Foundation. | |||
|
8 | # | |||
|
9 | # This program is distributed in the hope that it will be useful, | |||
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
12 | # GNU General Public License for more details. | |||
|
13 | # | |||
|
14 | # You should have received a copy of the GNU Affero General Public License | |||
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
16 | # | |||
|
17 | # This program is dual-licensed. If you wish to learn more about the | |||
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |||
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |||
|
20 | ||||
|
21 | ||||
|
22 | import pytest | |||
|
23 | ||||
|
24 | from rhodecode.authentication.tests.conftest import ( | |||
|
25 | EnabledAuthPlugin, DisabledAuthPlugin) | |||
|
26 | from rhodecode.config.routing import ADMIN_PREFIX | |||
|
27 | ||||
|
28 | ||||
|
29 | @pytest.mark.usefixtures('autologin_user', 'app') | |||
|
30 | class TestAuthenticationSettings: | |||
|
31 | ||||
|
32 | def test_auth_settings_global_view_get(self, app): | |||
|
33 | url = '{prefix}/auth/'.format(prefix=ADMIN_PREFIX) | |||
|
34 | response = app.get(url) | |||
|
35 | assert response.status_code == 200 | |||
|
36 | ||||
|
37 | def test_plugin_settings_view_get(self, app, auth_plugin): | |||
|
38 | url = '{prefix}/auth/{name}'.format( | |||
|
39 | prefix=ADMIN_PREFIX, | |||
|
40 | name=auth_plugin.name) | |||
|
41 | with EnabledAuthPlugin(auth_plugin): | |||
|
42 | response = app.get(url) | |||
|
43 | assert response.status_code == 200 | |||
|
44 | ||||
|
45 | def test_plugin_settings_view_post(self, app, auth_plugin, csrf_token): | |||
|
46 | url = '{prefix}/auth/{name}'.format( | |||
|
47 | prefix=ADMIN_PREFIX, | |||
|
48 | name=auth_plugin.name) | |||
|
49 | params = { | |||
|
50 | 'enabled': True, | |||
|
51 | 'cache_ttl': 0, | |||
|
52 | 'csrf_token': csrf_token, | |||
|
53 | } | |||
|
54 | with EnabledAuthPlugin(auth_plugin): | |||
|
55 | response = app.post(url, params=params) | |||
|
56 | assert response.status_code in [200, 302] | |||
|
57 | ||||
|
58 | def test_plugin_settings_view_get_404(self, app, auth_plugin): | |||
|
59 | url = '{prefix}/auth/{name}'.format( | |||
|
60 | prefix=ADMIN_PREFIX, | |||
|
61 | name=auth_plugin.name) | |||
|
62 | with DisabledAuthPlugin(auth_plugin): | |||
|
63 | response = app.get(url, status=404) | |||
|
64 | assert response.status_code == 404 | |||
|
65 | ||||
|
66 | def test_plugin_settings_view_post_404(self, app, auth_plugin, csrf_token): | |||
|
67 | url = '{prefix}/auth/{name}'.format( | |||
|
68 | prefix=ADMIN_PREFIX, | |||
|
69 | name=auth_plugin.name) | |||
|
70 | params = { | |||
|
71 | 'enabled': True, | |||
|
72 | 'cache_ttl': 0, | |||
|
73 | 'csrf_token': csrf_token, | |||
|
74 | } | |||
|
75 | with DisabledAuthPlugin(auth_plugin): | |||
|
76 | response = app.post(url, params=params, status=404) | |||
|
77 | assert response.status_code == 404 |
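The tests above repeatedly build plugin settings URLs as `'{prefix}/auth/{name}'`. A tiny illustration of that helper pattern; the `ADMIN_PREFIX` value used here is an assumption (the real constant comes from `rhodecode.config.routing`):

```python
ADMIN_PREFIX = '/_admin'  # assumed value; imported in the real tests


def plugin_settings_url(plugin_name, prefix=ADMIN_PREFIX):
    """Build the admin settings URL for a named auth plugin."""
    return '{prefix}/auth/{name}'.format(prefix=prefix, name=plugin_name)
```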
@@ -0,0 +1,28 b'' | |||||
|
1 | # Copyright (C) 2016 RhodeCode GmbH | |||
|
2 | # | |||
|
3 | # This program is free software: you can redistribute it and/or modify | |||
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |||
|
5 | # (only), as published by the Free Software Foundation. | |||
|
6 | # | |||
|
7 | # This program is distributed in the hope that it will be useful, | |||
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |||
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |||
|
10 | # GNU General Public License for more details. | |||
|
11 | # | |||
|
12 | # You should have received a copy of the GNU Affero General Public License | |||
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |||
|
14 | # | |||
|
15 | # This program is dual-licensed. If you wish to learn more about the | |||
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |||
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |||
|
18 | ||||
|
19 | """ | |||
|
20 | Checks around the API of the class RhodeCodeAuthPluginBase. | |||
|
21 | """ | |||
|
22 | ||||
|
23 | from rhodecode.authentication.base import RhodeCodeAuthPluginBase | |||
|
24 | ||||
|
25 | ||||
|
26 | def test_str_returns_plugin_id(): | |||
|
27 | plugin = RhodeCodeAuthPluginBase(plugin_id='stub_plugin_id') | |||
|
28 | assert str(plugin) == 'stub_plugin_id' |
1 | NO CONTENT: new file 100644 |
|
NO CONTENT: new file 100644 | ||
The requested commit or file is too big and content was truncated.
@@ -1,6 +1,6 b'' | |||||
1 | [bumpversion] |
|
1 | [bumpversion] | |
2 |
current_version = 4. |
|
2 | current_version = 4.2.0 | |
3 | message = release: Bump version {current_version} to {new_version} |
|
3 | message = release: Bump version {current_version} to {new_version} | |
4 |
|
4 | |||
5 | [bumpversion:file:rhodecode/VERSION] |
|
5 | [bumpversion:file:rhodecode/VERSION] | |
6 |
|
6 |
@@ -1,34 +1,28 b'' | |||||
1 | [DEFAULT] |
|
1 | [DEFAULT] | |
2 | done = false |
|
2 | done = false | |
3 |
|
3 | |||
|
4 | [task:fixes_on_stable] | |||
|
5 | ||||
|
6 | [task:changelog_updated] | |||
|
7 | ||||
4 | [task:bump_version] |
|
8 | [task:bump_version] | |
5 | done = true |
|
9 | done = true | |
6 |
|
10 | |||
7 | [task:rc_tools_pinned] |
|
11 | [task:generate_api_docs] | |
8 | done = true |
|
12 | ||
|
13 | [task:updated_translation] | |||
9 |
|
14 | |||
10 | [task:fixes_on_stable] |
|
15 | [release] | |
11 | done = true |
|
16 | state = in_progress | |
|
17 | version = 4.2.0 | |||
|
18 | ||||
|
19 | [task:rc_tools_pinned] | |||
12 |
|
20 | |||
13 | [task:pip2nix_generated] |
|
21 | [task:pip2nix_generated] | |
14 | done = true |
|
|||
15 |
|
||||
16 | [task:changelog_updated] |
|
|||
17 | done = true |
|
|||
18 |
|
||||
19 | [task:generate_api_docs] |
|
|||
20 | done = true |
|
|||
21 |
|
22 | |||
22 | [task:generate_js_routes] |
|
23 | [task:generate_js_routes] | |
23 | done = true |
|
|||
24 |
|
||||
25 | [release] |
|
|||
26 | state = prepared |
|
|||
27 | version = 4.1.2 |
|
|||
28 |
|
||||
29 | [task:updated_translation] |
|
|||
30 |
|
24 | |||
31 | [task:updated_trial_license] |
|
25 | [task:updated_trial_license] | |
32 |
|
26 | |||
33 | [task:generate_oss_licenses] |
|
27 | [task:generate_oss_licenses] | |
34 |
|
28 |
@@ -1,101 +1,114 b'' | |||||
1 | ========= |
|
1 | ========= | |
2 | RhodeCode |
|
2 | RhodeCode | |
3 | ========= |
|
3 | ========= | |
4 |
|
4 | |||
5 | About |
|
5 | About | |
6 | ----- |
|
6 | ----- | |
7 |
|
7 | |||
8 | ``RhodeCode`` is a fast and powerful management tool for Mercurial_ and GIT_ |
|
8 | ``RhodeCode`` is a fast and powerful management tool for Mercurial_ and GIT_ | |
9 | and Subversion_ with a built-in push/pull server, full text search, |
|
9 | and Subversion_ with a built-in push/pull server, full text search, |
10 | pull requests and a powerful code-review system. It works on http/https and |
|
10 | pull requests and a powerful code-review system. It works on http/https and |
11 | has a few unique features like: |
|
11 | has a few unique features like: | |
12 | - pluggable architecture |
|
12 | - pluggable architecture |
13 | - advanced permission system with IP restrictions |
|
13 | - advanced permission system with IP restrictions | |
14 | - rich set of authentication plugins including LDAP, |
|
14 | - rich set of authentication plugins including LDAP, | |
15 | ActiveDirectory, Atlassian Crowd, Http-Headers, Pam, Token-Auth. |
|
15 | ActiveDirectory, Atlassian Crowd, Http-Headers, Pam, Token-Auth. | |
16 | - live code-review chat |
|
16 | - live code-review chat | |
17 | - full web based file editing |
|
17 | - full web based file editing | |
18 | - unified multi vcs support |
|
18 | - unified multi vcs support | |
19 | - snippets (gist) system |
|
19 | - snippets (gist) system | |
20 | - integration with all 3rd party issue trackers |
|
20 | - integration with all 3rd party issue trackers | |
21 |
|
21 | |||
22 | RhodeCode also provides a rich API and multiple event hooks, so it is easy to |
|
22 | RhodeCode also provides a rich API and multiple event hooks, so it is easy to |
23 | integrate with existing external systems. |
|
23 | integrate with existing external systems. |
24 |
|
24 | |||
25 | RhodeCode is similar in some respects to gitlab_, github_ or bitbucket_, |
|
25 | RhodeCode is similar in some respects to gitlab_, github_ or bitbucket_, | |
26 | however RhodeCode can be run as a standalone hosted application on your own server. |
|
26 | however RhodeCode can be run as a standalone hosted application on your own server. |
27 | RhodeCode can be installed on \*nix or Windows systems. |
|
27 | RhodeCode can be installed on \*nix or Windows systems. | |
28 |
|
28 | |||
29 | RhodeCode uses `PEP386 versioning <http://www.python.org/dev/peps/pep-0386/>`_ |
|
29 | RhodeCode uses `PEP386 versioning <http://www.python.org/dev/peps/pep-0386/>`_ | |
30 |
|
30 | |||
31 | Installation |
|
31 | Installation | |
32 | ------------ |
|
32 | ------------ | |
33 | Please visit https://docs.rhodecode.com/RhodeCode-Control/tasks/install-cli.html |
|
33 | Please visit https://docs.rhodecode.com/RhodeCode-Control/tasks/install-cli.html | |
34 | for more details |
|
34 | for more details | |
35 |
|
35 | |||
36 |
|
36 | |||
37 | Source code |
|
37 | Source code | |
38 | ----------- |
|
38 | ----------- | |
39 |
|
39 | |||
40 | The latest sources can be obtained from official RhodeCode instance |
|
40 | The latest sources can be obtained from official RhodeCode instance | |
41 | https://code.rhodecode.com |
|
41 | https://code.rhodecode.com | |
42 |
|
42 | |||
43 |
|
43 | |||
|
44 | Contributions | |||
|
45 | ------------- | |||
|
46 | ||||
|
47 | RhodeCode is open-source; contributions are welcome! | |||
|
48 | ||||
|
49 | Please see the contribution documentation inside of the docs folder, which is | |||
|
50 | also available at | |||
|
51 | https://docs.rhodecode.com/RhodeCode-Enterprise/contributing/contributing.html | |||
|
52 | ||||
|
53 | For additional information about collaboration tools, our issue tracker, | |||
|
54 | licensing, and contribution credit, visit https://rhodecode.com/open-source | |||
|
55 | ||||
|
56 | ||||
44 | RhodeCode Features |
|
57 | RhodeCode Features | |
45 | ------------------ |
|
58 | ------------------ | |
46 |
|
59 | |||
47 | Check out all features of RhodeCode at https://rhodecode.com/features |
|
60 | Check out all features of RhodeCode at https://rhodecode.com/features | |
48 |
|
61 | |||
49 | License |
|
62 | License | |
50 | ------- |
|
63 | ------- | |
51 |
|
64 | |||
52 | ``RhodeCode`` is dual-licensed with AGPLv3 and commercial license. |
|
65 | ``RhodeCode`` is dual-licensed with AGPLv3 and commercial license. | |
53 | Please see LICENSE.txt file for details. |
|
66 | Please see LICENSE.txt file for details. | |
54 |
|
67 | |||
55 |
|
68 | |||
56 | Getting help |
|
69 | Getting help | |
57 | ------------ |
|
70 | ------------ | |
58 |
|
71 | |||
59 | Listed below are various support resources that should help. |
|
72 | Listed below are various support resources that should help. |
60 |
|
73 | |||
61 | .. note:: |
|
74 | .. note:: | |
62 |
|
75 | |||
63 | Please try to read the documentation before posting any issues, especially |
|
76 | Please try to read the documentation before posting any issues, especially | |
64 | the **troubleshooting section** |
|
77 | the **troubleshooting section** | |
65 |
|
78 | |||
66 | - Official issue tracker `RhodeCode Issue tracker <https://issues.rhodecode.com>`_ |
|
79 | - Official issue tracker `RhodeCode Issue tracker <https://issues.rhodecode.com>`_ | |
67 |
|
80 | |||
68 | - Search our community portal `Community portal <https://community.rhodecode.com>`_ |
|
81 | - Search our community portal `Community portal <https://community.rhodecode.com>`_ | |
69 |
|
82 | |||
70 | - Join #rhodecode on FreeNode (irc.freenode.net) |
|
83 | - Join #rhodecode on FreeNode (irc.freenode.net) | |
71 | or use http://webchat.freenode.net/?channels=rhodecode for web access to irc. |
|
84 | or use http://webchat.freenode.net/?channels=rhodecode for web access to irc. | |
72 |
|
85 | |||
73 | - You can also follow RhodeCode on twitter **@RhodeCode** where we often post |
|
86 | - You can also follow RhodeCode on twitter **@RhodeCode** where we often post | |
74 | news and other interesting stuff about RhodeCode. |
|
87 | news and other interesting stuff about RhodeCode. | |
75 |
|
88 | |||
76 |
|
89 | |||
77 | Online documentation |
|
90 | Online documentation | |
78 | -------------------- |
|
91 | -------------------- | |
79 |
|
92 | |||
80 | Online documentation for the current version of RhodeCode is available at |
|
93 | Online documentation for the current version of RhodeCode is available at | |
81 | - http://rhodecode.com/docs |
|
94 | - http://rhodecode.com/docs | |
82 |
|
95 | |||
83 | You may also build the documentation for yourself - go into ``docs/`` and run:: |
|
96 | You may also build the documentation for yourself - go into ``docs/`` and run:: | |
84 |
|
97 | |||
85 | nix-build default.nix -o result && make clean html |
|
98 | nix-build default.nix -o result && make clean html | |
86 |
|
99 | |||
87 | (You need to have sphinx_ installed to build the documentation. If you don't |
|
100 | (You need to have sphinx_ installed to build the documentation. If you don't | |
88 | have sphinx_ installed you can install it via the command: |
|
101 | have sphinx_ installed you can install it via the command: | |
89 | ``pip install sphinx``) |
|
102 | ``pip install sphinx``) | |
90 |
|
103 | |||
91 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
104 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv | |
92 | .. _python: http://www.python.org/ |
|
105 | .. _python: http://www.python.org/ | |
93 | .. _sphinx: http://sphinx.pocoo.org/ |
|
106 | .. _sphinx: http://sphinx.pocoo.org/ | |
94 | .. _mercurial: http://mercurial.selenic.com/ |
|
107 | .. _mercurial: http://mercurial.selenic.com/ | |
95 | .. _bitbucket: http://bitbucket.org/ |
|
108 | .. _bitbucket: http://bitbucket.org/ | |
96 | .. _github: http://github.com/ |
|
109 | .. _github: http://github.com/ | |
97 | .. _gitlab: http://gitlab.com/ |
|
110 | .. _gitlab: http://gitlab.com/ | |
98 | .. _subversion: http://subversion.tigris.org/ |
|
111 | .. _subversion: http://subversion.tigris.org/ | |
99 | .. _git: http://git-scm.com/ |
|
112 | .. _git: http://git-scm.com/ | |
100 | .. _celery: http://celeryproject.org/ |
|
113 | .. _celery: http://celeryproject.org/ | |
101 | .. _vcs: http://pypi.python.org/pypi/vcs |
|
114 | .. _vcs: http://pypi.python.org/pypi/vcs |
@@ -1,47 +1,47 b'' | |||||
1 | README - Quickstart |
|
1 | README - Quickstart | |
2 | =================== |
|
2 | =================== | |
3 |
|
3 | |||
4 | This folder contains functional tests and |
|
4 | This folder contains the functional tests and automation of specification | |
5 | examples. Details about testing can be found in |
|
5 | examples. Details about testing can be found in | |
6 | `/docs-internal/testing/index.rst`. |
|
6 | `/docs-internal/testing/index.rst`. | |
7 |
|
7 | |||
8 |
|
8 | |||
9 | Setting up your Rhodecode Enterprise instance |
|
9 | Setting up your Rhodecode Enterprise instance | |
10 | --------------------------------------------- |
|
10 | --------------------------------------------- | |
11 |
|
11 | |||
12 | The tests will create users and repositories as needed, so you can start with a |
|
12 | The tests will create users and repositories as needed, so you can start with a | |
13 | new and empty instance. |
|
13 | new and empty instance. | |
14 |
|
14 | |||
15 | Use the following example call for the database setup of Enterprise:: |
|
15 | Use the following example call for the database setup of Enterprise:: | |
16 |
|
16 | |||
17 | paster setup-rhodecode \ |
|
17 | paster setup-rhodecode \ | |
18 | --user=admin \ |
|
18 | --user=admin \ | |
19 | --email=admin@example.com \ |
|
19 | --email=admin@example.com \ | |
20 | --password=secret \ |
|
20 | --password=secret \ | |
21 | --api-key=9999999999999999999999999999999999999999 \ |
|
21 | --api-key=9999999999999999999999999999999999999999 \ | |
22 | your-enterprise-config.ini |
|
22 | your-enterprise-config.ini | |
23 |
|
23 | |||
24 | This way the username, password and auth token of the admin user will match the |
|
24 | This way the username, password, and auth token of the admin user will match the | |
25 | defaults from the test run. |
|
25 | defaults from the test run. | |
26 |
|
26 | |||
27 |
|
27 | |||
28 | Usage |
|
28 | Usage | |
29 | ----- |
|
29 | ----- | |
30 |
|
30 | |||
31 | 1. Make sure your Rhodecode Enterprise instance is running at |
|
31 | 1. Make sure your Rhodecode Enterprise instance is running at | |
32 | http://localhost:5000. |
|
32 | http://localhost:5000. | |
33 |
|
33 | |||
34 | 2. Enter `nix-shell` from the acceptance_tests folder:: |
|
34 | 2. Enter `nix-shell` from the acceptance_tests folder:: | |
35 |
|
35 | |||
36 | cd acceptance_tests |
|
36 | cd acceptance_tests | |
37 | nix-shell |
|
37 | nix-shell | |
38 |
|
38 | |||
39 | Make sure that `rcpkgs` and `rcnixpkgs` are available on the nix path. |
|
39 | Make sure that `rcpkgs` and `rcnixpkgs` are available on the nix path. | |
40 |
|
40 | |||
41 | 3. Run the tests:: |
|
41 | 3. Run the tests:: | |
42 |
|
42 | |||
43 | py.test -c example.ini -vs |
|
43 | py.test -c example.ini -vs | |
44 |
|
44 | |||
45 | The parameter ``-vs`` allows you to see debugging output during the test |
|
45 | The parameter ``-vs`` allows you to see debugging output during the test | |
46 | run. Check ``py.test --help`` and the documentation at http://pytest.org to |
|
46 | run. Check ``py.test --help`` and the documentation at http://pytest.org to | |
47 | learn all details about the test runner. |
|
47 | learn all details about the test runner. |
@@ -1,608 +1,612 @@
 ################################################################################
 ################################################################################
 # RhodeCode Enterprise - configuration file                                    #
 # Built-in functions and variables                                             #
 # The %(here)s variable will be replaced with the parent directory of this file#
 #                                                                              #
 ################################################################################
 
 [DEFAULT]
 debug = true
-pdebug = false
 ################################################################################
 ## Uncomment and replace with the email address which should receive          ##
 ## any error reports after an application crash                               ##
 ## Additionally these settings will be used by the RhodeCode mailing system   ##
 ################################################################################
 #email_to = admin@localhost
 #error_email_from = paste_error@localhost
 #app_email_from = rhodecode-noreply@localhost
 #error_message =
 #email_prefix = [RhodeCode]
 
 #smtp_server = mail.server.com
 #smtp_username =
 #smtp_password =
 #smtp_port =
 #smtp_use_tls = false
 #smtp_use_ssl = true
 ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.)
 #smtp_auth =
 
 [server:main]
 ## COMMON ##
 host = 127.0.0.1
 port = 5000
 
 ##################################
 ## WAITRESS WSGI SERVER         ##
 ## Recommended for Development  ##
 ##################################
 use = egg:waitress#main
 ## number of worker threads
 threads = 5
 ## MAX BODY SIZE 100GB
 max_request_body_size = 107374182400
 ## Use poll instead of select, fixes file descriptors limits problems.
 ## May not work on old windows systems.
 asyncore_use_poll = true
 
 
 ##########################
 ## GUNICORN WSGI SERVER ##
 ##########################
 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
 #use = egg:gunicorn#main
 ## Sets the number of process workers. You must set `instance_id = *`
 ## when this option is set to more than one worker, recommended
 ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers
 ## The `instance_id = *` must be set in the [app:main] section below
 #workers = 2
 ## number of threads for each of the worker, must be set to 1 for gevent
 ## generally recommened to be at 1
 #threads = 1
 ## process name
 #proc_name = rhodecode
 ## type of worker class, one of sync, gevent
 ## recommended for bigger setup is using of of other than sync one
 #worker_class = sync
 ## The maximum number of simultaneous clients. Valid only for Gevent
 #worker_connections = 10
 ## max number of requests that worker will handle before being gracefully
 ## restarted, could prevent memory leaks
 #max_requests = 1000
 #max_requests_jitter = 30
 ## amount of time a worker can spend with handling a request before it
 ## gets killed and restarted. Set to 6hrs
 #timeout = 21600
 
 
 ## prefix middleware for RhodeCode, disables force_https flag.
 ## allows to set RhodeCode under a prefix in server.
 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
 #[filter:proxy-prefix]
 #use = egg:PasteDeploy#prefix
 #prefix = /<your-prefix>
 
 [app:main]
 use = egg:rhodecode-enterprise-ce
 ## enable proxy prefix middleware, defined below
 #filter-with = proxy-prefix
 
 # During development the we want to have the debug toolbar enabled
 pyramid.includes =
     pyramid_debugtoolbar
     rhodecode.utils.debugtoolbar
     rhodecode.lib.middleware.request_wrapper
 
 pyramid.reload_templates = true
 
 debugtoolbar.hosts = 0.0.0.0/0
 debugtoolbar.exclude_prefixes =
     /css
     /fonts
     /images
     /js
 
 ## RHODECODE PLUGINS ##
 rhodecode.includes =
     rhodecode.api
 
 
 # api prefix url
 rhodecode.api.url = /_admin/api
 
 
 ## END RHODECODE PLUGINS ##
 
+## encryption key used to encrypt social plugin tokens,
+## remote_urls with credentials etc, if not set it defaults to
+## `beaker.session.secret`
+#rhodecode.encrypted_values.secret =
+
+## decryption strict mode (enabled by default). It controls if decryption raises
+## `SignatureVerificationError` in case of wrong key, or damaged encryption data.
+#rhodecode.encrypted_values.strict = false
+
 full_stack = true
 
 ## Serve static files via RhodeCode, disable to serve them via HTTP server
 static_files = true
 
+# autogenerate javascript routes file on startup
+generate_js_files = false
+
 ## Optional Languages
 ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh
 lang = en
 
 ## perform a full repository scan on each server start, this should be
 ## set to false after first startup, to allow faster server restarts.
 startup.import_repos = false
 
 ## Uncomment and set this path to use archive download cache.
 ## Once enabled, generated archives will be cached at this location
 ## and served from the cache during subsequent requests for the same archive of
 ## the repository.
 #archive_cache_dir = /tmp/tarballcache
 
 ## change this to unique ID for security
 app_instance_uuid = rc-production
 
 ## cut off limit for large diffs (size in bytes)
 cut_off_limit_diff = 1024000
 cut_off_limit_file = 256000
 
 ## use cache version of scm repo everywhere
 vcs_full_cache = true
 
 ## force https in RhodeCode, fixes https redirects, assumes it's always https
 ## Normally this is controlled by proper http flags sent from http server
 force_https = false
 
 ## use Strict-Transport-Security headers
 use_htsts = false
 
 ## number of commits stats will parse on each iteration
 commit_parse_limit = 25
 
 ## git rev filter option, --all is the default filter, if you need to
 ## hide all refs in changelog switch this to --branches --tags
 git_rev_filter = --branches --tags
 
 # Set to true if your repos are exposed using the dumb protocol
 git_update_server_info = false
 
 ## RSS/ATOM feed options
 rss_cut_off_limit = 256000
 rss_items_per_page = 10
 rss_include_diff = false
 
 ## gist URL alias, used to create nicer urls for gist. This should be an
 ## url that does rewrites to _admin/gists/<gistid>.
 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
 gist_alias_url =
 
 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
 ## used for access.
 ## Adding ?auth_token = <token> to the url authenticates this request as if it
 ## came from the the logged in user who own this authentication token.
 ##
 ## Syntax is <ControllerClass>:<function_pattern>.
 ## To enable access to raw_files put `FilesController:raw`.
 ## To enable access to patches add `ChangesetController:changeset_patch`.
 ## The list should be "," separated and on a single line.
 ##
 ## Recommended controllers to enable:
 #    ChangesetController:changeset_patch,
 #    ChangesetController:changeset_raw,
 #    FilesController:raw,
 #    FilesController:archivefile,
 #    GistsController:*,
 api_access_controllers_whitelist =
 
 ## default encoding used to convert from and to unicode
 ## can be also a comma separated list of encoding in case of mixed encodings
 default_encoding = UTF-8
 
 ## instance-id prefix
 ## a prefix key for this instance used for cache invalidation when running
 ## multiple instances of rhodecode, make sure it's globally unique for
 ## all running rhodecode instances. Leave empty if you don't use it
 instance_id =
 
 ## Fallback authentication plugin. Set this to a plugin ID to force the usage
 ## of an authentication plugin also if it is disabled by it's settings.
 ## This could be useful if you are unable to log in to the system due to broken
 ## authentication settings. Then you can enable e.g. the internal rhodecode auth
 ## module to log in again and fix the settings.
 ##
 ## Available builtin plugin IDs (hash is part of the ID):
 ## egg:rhodecode-enterprise-ce#rhodecode
 ## egg:rhodecode-enterprise-ce#pam
 ## egg:rhodecode-enterprise-ce#ldap
 ## egg:rhodecode-enterprise-ce#jasig_cas
 ## egg:rhodecode-enterprise-ce#headers
 ## egg:rhodecode-enterprise-ce#crowd
 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
 
 ## alternative return HTTP header for failed authentication. Default HTTP
 ## response is 401 HTTPUnauthorized. Currently HG clients have troubles with
 ## handling that causing a series of failed authentication calls.
 ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code
 ## This will be served instead of default 401 on bad authnetication
 auth_ret_code =
 
 ## use special detection method when serving auth_ret_code, instead of serving
 ## ret_code directly, use 401 initially (Which triggers credentials prompt)
 ## and then serve auth_ret_code to clients
 auth_ret_code_detection = false
 
 ## locking return code. When repository is locked return this HTTP code. 2XX
 ## codes don't break the transactions while 4XX codes do
 lock_ret_code = 423
 
 ## allows to change the repository location in settings page
 allow_repo_location_change = true
 
 ## allows to setup custom hooks in settings page
 allow_custom_hooks_settings = true
 
 ## generated license token, goto license page in RhodeCode settings to obtain
 ## new token
 license_token =
 
 ## supervisor connection uri, for managing supervisor and logs.
 supervisor.uri =
 ## supervisord group name/id we only want this RC instance to handle
 supervisor.group_id = dev
 
 ## Display extended labs settings
 labs_settings_active = true
 
 ####################################
 ###        CELERY CONFIG        ####
 ####################################
 use_celery = false
 broker.host = localhost
 broker.vhost = rabbitmqhost
 broker.port = 5672
 broker.user = rabbitmq
 broker.password = qweqwe
 
 celery.imports = rhodecode.lib.celerylib.tasks
 
 celery.result.backend = amqp
 celery.result.dburi = amqp://
 celery.result.serialier = json
 
 #celery.send.task.error.emails = true
 #celery.amqp.task.result.expires = 18000
 
 celeryd.concurrency = 2
 #celeryd.log.file = celeryd.log
 celeryd.log.level = debug
 celeryd.max.tasks.per.child = 1
 
 ## tasks will never be sent to the queue, but executed locally instead.
 celery.always.eager = false
 
 ####################################
 ###         BEAKER CACHE        ####
 ####################################
 # default cache dir for templates. Putting this into a ramdisk
 ## can boost performance, eg. %(here)s/data_ramdisk
 cache_dir = %(here)s/data
 
 ## locking and default file storage for Beaker. Putting this into a ramdisk
 ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data
 beaker.cache.data_dir = %(here)s/data/cache/beaker_data
 beaker.cache.lock_dir = %(here)s/data/cache/beaker_lock
 
 beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long
 
 beaker.cache.super_short_term.type = memory
 beaker.cache.super_short_term.expire = 10
 beaker.cache.super_short_term.key_length = 256
 
 beaker.cache.short_term.type = memory
 beaker.cache.short_term.expire = 60
 beaker.cache.short_term.key_length = 256
 
 beaker.cache.long_term.type = memory
 beaker.cache.long_term.expire = 36000
 beaker.cache.long_term.key_length = 256
 
 beaker.cache.sql_cache_short.type = memory
 beaker.cache.sql_cache_short.expire = 10
 beaker.cache.sql_cache_short.key_length = 256
 
 # default is memory cache, configure only if required
 # using multi-node or multi-worker setup
 #beaker.cache.auth_plugins.type = ext:database
 #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock
 #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode
 #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode
 #beaker.cache.auth_plugins.sa.pool_recycle = 3600
 #beaker.cache.auth_plugins.sa.pool_size = 10
 #beaker.cache.auth_plugins.sa.max_overflow = 0
 
 beaker.cache.repo_cache_long.type = memorylru_base
 beaker.cache.repo_cache_long.max_items = 4096
 beaker.cache.repo_cache_long.expire = 2592000
 
 # default is memorylru_base cache, configure only if required
 # using multi-node or multi-worker setup
 #beaker.cache.repo_cache_long.type = ext:memcached
 #beaker.cache.repo_cache_long.url = localhost:11211
 #beaker.cache.repo_cache_long.expire = 1209600
 #beaker.cache.repo_cache_long.key_length = 256
 
 ####################################
 ###       BEAKER SESSION        ####
 ####################################
 
 ## .session.type is type of storage options for the session, current allowed
 ## types are file, ext:memcached, ext:database, and memory (default).
 beaker.session.type = file
 beaker.session.data_dir = %(here)s/data/sessions/data
 
 ## db based session, fast, and allows easy management over logged in users ##
 #beaker.session.type = ext:database
 #beaker.session.table_name = db_session
 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
 #beaker.session.sa.pool_recycle = 3600
 #beaker.session.sa.echo = false
 
 beaker.session.key = rhodecode
 beaker.session.secret = develop-rc-uytcxaz
 beaker.session.lock_dir = %(here)s/data/sessions/lock
 
 ## Secure encrypted cookie. Requires AES and AES python libraries
 ## you must disable beaker.session.secret to use this
 #beaker.session.encrypt_key = <key_for_encryption>
 #beaker.session.validate_key = <validation_key>
 
 ## sets session as invalid(also logging out user) if it haven not been
 ## accessed for given amount of time in seconds
 beaker.session.timeout = 2592000
 beaker.session.httponly = true
 #beaker.session.cookie_path = /<your-prefix>
 
 ## uncomment for https secure cookie
 beaker.session.secure = false
 
 ## auto save the session to not to use .save()
 beaker.session.auto = false
 
 ## default cookie expiration time in seconds, set to `true` to set expire
 ## at browser close
 #beaker.session.cookie_expires = 3600
 
 ###################################
 ## SEARCH INDEXING CONFIGURATION ##
 ###################################
 ## Full text search indexer is available in rhodecode-tools under
 ## `rhodecode-tools index` command
 
 # WHOOSH Backend, doesn't require additional services to run
 # it works good with few dozen repos
 search.module = rhodecode.lib.index.whoosh
 search.location = %(here)s/data/index
 
-
 ###################################
 ##     APPENLIGHT CONFIG        ##
 ###################################
 
 ## Appenlight is tailored to work with RhodeCode, see
 ## http://appenlight.com for details how to obtain an account
 
 ## appenlight integration enabled
 appenlight = false
 
 appenlight.server_url = https://api.appenlight.com
 appenlight.api_key = YOUR_API_KEY
 #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5
 
 # used for JS client
 appenlight.api_public_key = YOUR_API_PUBLIC_KEY
 
 ## TWEAK AMOUNT OF INFO SENT HERE
 
 ## enables 404 error logging (default False)
 appenlight.report_404 = false
 
 ## time in seconds after request is considered being slow (default 1)
|
416 | ## time in seconds after request is considered being slow (default 1) | |
407 | appenlight.slow_request_time = 1 |
|
417 | appenlight.slow_request_time = 1 | |
408 |
|
418 | |||
409 | ## record slow requests in application |
|
419 | ## record slow requests in application | |
410 | ## (needs to be enabled for slow datastore recording and time tracking) |
|
420 | ## (needs to be enabled for slow datastore recording and time tracking) | |
411 | appenlight.slow_requests = true |
|
421 | appenlight.slow_requests = true | |
412 |
|
422 | |||
413 | ## enable hooking to application loggers |
|
423 | ## enable hooking to application loggers | |
414 | appenlight.logging = true |
|
424 | appenlight.logging = true | |
415 |
|
425 | |||
416 | ## minimum log level for log capture |
|
426 | ## minimum log level for log capture | |
417 | appenlight.logging.level = WARNING |
|
427 | appenlight.logging.level = WARNING | |
418 |
|
428 | |||
419 | ## send logs only from erroneous/slow requests |
|
429 | ## send logs only from erroneous/slow requests | |
420 | ## (saves API quota for intensive logging) |
|
430 | ## (saves API quota for intensive logging) | |
421 | appenlight.logging_on_error = false |
|
431 | appenlight.logging_on_error = false | |
422 |
|
432 | |||
423 | ## list of additonal keywords that should be grabbed from environ object |
|
433 | ## list of additonal keywords that should be grabbed from environ object | |
424 | ## can be string with comma separated list of words in lowercase |
|
434 | ## can be string with comma separated list of words in lowercase | |
425 | ## (by default client will always send following info: |
|
435 | ## (by default client will always send following info: | |
426 | ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that |
|
436 | ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that | |
427 | ## start with HTTP* this list be extended with additional keywords here |
|
437 | ## start with HTTP* this list be extended with additional keywords here | |
428 | appenlight.environ_keys_whitelist = |
|
438 | appenlight.environ_keys_whitelist = | |
429 |
|
439 | |||
430 | ## list of keywords that should be blanked from request object |
|
440 | ## list of keywords that should be blanked from request object | |
431 | ## can be string with comma separated list of words in lowercase |
|
441 | ## can be string with comma separated list of words in lowercase | |
432 | ## (by default client will always blank keys that contain following words |
|
442 | ## (by default client will always blank keys that contain following words | |
433 | ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' |
|
443 | ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' | |
434 | ## this list be extended with additional keywords set here |
|
444 | ## this list be extended with additional keywords set here | |
435 | appenlight.request_keys_blacklist = |
|
445 | appenlight.request_keys_blacklist = | |
436 |
|
446 | |||
437 | ## list of namespaces that should be ignores when gathering log entries |
|
447 | ## list of namespaces that should be ignores when gathering log entries | |
438 | ## can be string with comma separated list of namespaces |
|
448 | ## can be string with comma separated list of namespaces | |
439 | ## (by default the client ignores own entries: appenlight_client.client) |
|
449 | ## (by default the client ignores own entries: appenlight_client.client) | |
440 | appenlight.log_namespace_blacklist = |
|
450 | appenlight.log_namespace_blacklist = | |
441 |
|
451 | |||
442 |
|
452 | |||
443 | ################################################################################ |
|
453 | ################################################################################ | |
444 | ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ## |
|
454 | ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ## | |
445 | ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ## |
|
455 | ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ## | |
446 | ## execute malicious code after an exception is raised. ## |
|
456 | ## execute malicious code after an exception is raised. ## | |
447 | ################################################################################ |
|
457 | ################################################################################ | |
448 | #set debug = false |
|
458 | #set debug = false | |
449 |
|
459 | |||
450 |
|
460 | |||
451 | ############## |
|
461 | ############## | |
452 | ## STYLING ## |
|
462 | ## STYLING ## | |
453 | ############## |
|
463 | ############## | |
454 | debug_style = true |
|
464 | debug_style = true | |
455 |
|
465 | |||
456 | ######################################################### |
|
466 | ######################################################### | |
457 | ### DB CONFIGS - EACH DB WILL HAVE IT'S OWN CONFIG ### |
|
467 | ### DB CONFIGS - EACH DB WILL HAVE IT'S OWN CONFIG ### | |
458 | ######################################################### |
|
468 | ######################################################### | |
459 | sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 |
|
469 | sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
460 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode |
|
470 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
461 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode |
|
471 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode | |
462 |
|
472 | |||
463 | # see sqlalchemy docs for other advanced settings |
|
473 | # see sqlalchemy docs for other advanced settings | |
464 |
|
474 | |||
465 | ## print the sql statements to output |
|
475 | ## print the sql statements to output | |
466 | sqlalchemy.db1.echo = false |
|
476 | sqlalchemy.db1.echo = false | |
467 | ## recycle the connections after this ammount of seconds |
|
477 | ## recycle the connections after this ammount of seconds | |
468 | sqlalchemy.db1.pool_recycle = 3600 |
|
478 | sqlalchemy.db1.pool_recycle = 3600 | |
469 | sqlalchemy.db1.convert_unicode = true |
|
479 | sqlalchemy.db1.convert_unicode = true | |
470 |
|
480 | |||
471 | ## the number of connections to keep open inside the connection pool. |
|
481 | ## the number of connections to keep open inside the connection pool. | |
472 | ## 0 indicates no limit |
|
482 | ## 0 indicates no limit | |
473 | #sqlalchemy.db1.pool_size = 5 |
|
483 | #sqlalchemy.db1.pool_size = 5 | |
474 |
|
484 | |||
475 | ## the number of connections to allow in connection pool "overflow", that is |
|
485 | ## the number of connections to allow in connection pool "overflow", that is | |
476 | ## connections that can be opened above and beyond the pool_size setting, |
|
486 | ## connections that can be opened above and beyond the pool_size setting, | |
477 | ## which defaults to five. |
|
487 | ## which defaults to five. | |
478 | #sqlalchemy.db1.max_overflow = 10 |
|
488 | #sqlalchemy.db1.max_overflow = 10 | |
479 |
|
489 | |||
480 |
|
490 | |||
481 | ################## |
|
491 | ################## | |
482 | ### VCS CONFIG ### |
|
492 | ### VCS CONFIG ### | |
483 | ################## |
|
493 | ################## | |
484 | vcs.server.enable = true |
|
494 | vcs.server.enable = true | |
485 | vcs.server = localhost:9900 |
|
495 | vcs.server = localhost:9900 | |
486 |
|
496 | |||
487 | ## Web server connectivity protocol, responsible for web based VCS operatations |
|
497 | ## Web server connectivity protocol, responsible for web based VCS operatations | |
488 | ## Available protocols are: |
|
498 | ## Available protocols are: | |
489 | ## `pyro4` - using pyro4 server |
|
499 | ## `pyro4` - using pyro4 server | |
490 | ## `http` - using http-rpc backend |
|
500 | ## `http` - using http-rpc backend | |
491 | #vcs.server.protocol = http |
|
501 | #vcs.server.protocol = http | |
492 |
|
502 | |||
493 | ## Push/Pull operations protocol, available options are: |
|
503 | ## Push/Pull operations protocol, available options are: | |
494 | ## `pyro4` - using pyro4 server |
|
504 | ## `pyro4` - using pyro4 server | |
495 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended |
|
505 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended | |
496 | ## `vcsserver.scm_app` - internal app (EE only) |
|
506 | ## `vcsserver.scm_app` - internal app (EE only) | |
497 | #vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http |
|
507 | #vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http | |
498 |
|
508 | |||
499 | ## Push/Pull operations hooks protocol, available options are: |
|
509 | ## Push/Pull operations hooks protocol, available options are: | |
500 | ## `pyro4` - using pyro4 server |
|
510 | ## `pyro4` - using pyro4 server | |
501 | ## `http` - using http-rpc backend |
|
511 | ## `http` - using http-rpc backend | |
502 | #vcs.hooks.protocol = http |
|
512 | #vcs.hooks.protocol = http | |
503 |
|
513 | |||
504 | vcs.server.log_level = debug |
|
514 | vcs.server.log_level = debug | |
505 | ## Start VCSServer with this instance as a subprocess, usefull for development |
|
515 | ## Start VCSServer with this instance as a subprocess, usefull for development | |
506 | vcs.start_server = true |
|
516 | vcs.start_server = true | |
507 | vcs.backends = hg, git, svn |
|
517 | vcs.backends = hg, git, svn | |
508 | vcs.connection_timeout = 3600 |
|
518 | vcs.connection_timeout = 3600 | |
509 | ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out. |
|
519 | ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
510 | ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible |
|
520 | ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible | |
511 | #vcs.svn.compatible_version = pre-1.8-compatible |
|
521 | #vcs.svn.compatible_version = pre-1.8-compatible | |
512 |
|
522 | |||
513 | ################################ |
|
523 | ################################ | |
514 | ### LOGGING CONFIGURATION #### |
|
524 | ### LOGGING CONFIGURATION #### | |
515 | ################################ |
|
525 | ################################ | |
516 | [loggers] |
|
526 | [loggers] | |
517 |
keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |
|
527 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates | |
518 |
|
528 | |||
519 | [handlers] |
|
529 | [handlers] | |
520 | keys = console, console_sql |
|
530 | keys = console, console_sql | |
521 |
|
531 | |||
522 | [formatters] |
|
532 | [formatters] | |
523 | keys = generic, color_formatter, color_formatter_sql |
|
533 | keys = generic, color_formatter, color_formatter_sql | |
524 |
|
534 | |||
525 | ############# |
|
535 | ############# | |
526 | ## LOGGERS ## |
|
536 | ## LOGGERS ## | |
527 | ############# |
|
537 | ############# | |
528 | [logger_root] |
|
538 | [logger_root] | |
529 | level = NOTSET |
|
539 | level = NOTSET | |
530 | handlers = console |
|
540 | handlers = console | |
531 |
|
541 | |||
532 | [logger_routes] |
|
542 | [logger_routes] | |
533 | level = DEBUG |
|
543 | level = DEBUG | |
534 | handlers = |
|
544 | handlers = | |
535 | qualname = routes.middleware |
|
545 | qualname = routes.middleware | |
536 | ## "level = DEBUG" logs the route matched and routing variables. |
|
546 | ## "level = DEBUG" logs the route matched and routing variables. | |
537 | propagate = 1 |
|
547 | propagate = 1 | |
538 |
|
548 | |||
539 | [logger_beaker] |
|
549 | [logger_beaker] | |
540 | level = DEBUG |
|
550 | level = DEBUG | |
541 | handlers = |
|
551 | handlers = | |
542 | qualname = beaker.container |
|
552 | qualname = beaker.container | |
543 | propagate = 1 |
|
553 | propagate = 1 | |
544 |
|
554 | |||
545 | [logger_pyro4] |
|
555 | [logger_pyro4] | |
546 | level = DEBUG |
|
556 | level = DEBUG | |
547 | handlers = |
|
557 | handlers = | |
548 | qualname = Pyro4 |
|
558 | qualname = Pyro4 | |
549 | propagate = 1 |
|
559 | propagate = 1 | |
550 |
|
560 | |||
551 | [logger_templates] |
|
561 | [logger_templates] | |
552 | level = INFO |
|
562 | level = INFO | |
553 | handlers = |
|
563 | handlers = | |
554 | qualname = pylons.templating |
|
564 | qualname = pylons.templating | |
555 | propagate = 1 |
|
565 | propagate = 1 | |
556 |
|
566 | |||
557 | [logger_rhodecode] |
|
567 | [logger_rhodecode] | |
558 | level = DEBUG |
|
568 | level = DEBUG | |
559 | handlers = |
|
569 | handlers = | |
560 | qualname = rhodecode |
|
570 | qualname = rhodecode | |
561 | propagate = 1 |
|
571 | propagate = 1 | |
562 |
|
572 | |||
563 | [logger_sqlalchemy] |
|
573 | [logger_sqlalchemy] | |
564 | level = INFO |
|
574 | level = INFO | |
565 | handlers = console_sql |
|
575 | handlers = console_sql | |
566 | qualname = sqlalchemy.engine |
|
576 | qualname = sqlalchemy.engine | |
567 | propagate = 0 |
|
577 | propagate = 0 | |
568 |
|
578 | |||
569 | [logger_whoosh_indexer] |
|
|||
570 | level = DEBUG |
|
|||
571 | handlers = |
|
|||
572 | qualname = whoosh_indexer |
|
|||
573 | propagate = 1 |
|
|||
574 |
|
||||
575 | ############## |
|
579 | ############## | |
576 | ## HANDLERS ## |
|
580 | ## HANDLERS ## | |
577 | ############## |
|
581 | ############## | |
578 |
|
582 | |||
579 | [handler_console] |
|
583 | [handler_console] | |
580 | class = StreamHandler |
|
584 | class = StreamHandler | |
581 | args = (sys.stderr,) |
|
585 | args = (sys.stderr,) | |
582 | level = DEBUG |
|
586 | level = DEBUG | |
583 | formatter = color_formatter |
|
587 | formatter = color_formatter | |
584 |
|
588 | |||
585 | [handler_console_sql] |
|
589 | [handler_console_sql] | |
586 | class = StreamHandler |
|
590 | class = StreamHandler | |
587 | args = (sys.stderr,) |
|
591 | args = (sys.stderr,) | |
588 | level = DEBUG |
|
592 | level = DEBUG | |
589 | formatter = color_formatter_sql |
|
593 | formatter = color_formatter_sql | |
590 |
|
594 | |||
591 | ################ |
|
595 | ################ | |
592 | ## FORMATTERS ## |
|
596 | ## FORMATTERS ## | |
593 | ################ |
|
597 | ################ | |
594 |
|
598 | |||
595 | [formatter_generic] |
|
599 | [formatter_generic] | |
596 | class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter |
|
600 | class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter | |
597 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
601 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
598 | datefmt = %Y-%m-%d %H:%M:%S |
|
602 | datefmt = %Y-%m-%d %H:%M:%S | |
599 |
|
603 | |||
600 | [formatter_color_formatter] |
|
604 | [formatter_color_formatter] | |
601 | class = rhodecode.lib.logging_formatter.ColorFormatter |
|
605 | class = rhodecode.lib.logging_formatter.ColorFormatter | |
602 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
606 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
603 | datefmt = %Y-%m-%d %H:%M:%S |
|
607 | datefmt = %Y-%m-%d %H:%M:%S | |
604 |
|
608 | |||
605 | [formatter_color_formatter_sql] |
|
609 | [formatter_color_formatter_sql] | |
606 | class = rhodecode.lib.logging_formatter.ColorFormatterSql |
|
610 | class = rhodecode.lib.logging_formatter.ColorFormatterSql | |
607 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
611 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
608 | datefmt = %Y-%m-%d %H:%M:%S |
|
612 | datefmt = %Y-%m-%d %H:%M:%S |
@@ -1,577 +1,581 @@
1 | ################################################################################ |
|
1 | ################################################################################ | |
2 | ################################################################################ |
|
2 | ################################################################################ | |
3 | # RhodeCode Enterprise - configuration file # |
|
3 | # RhodeCode Enterprise - configuration file # | |
4 | # Built-in functions and variables # |
|
4 | # Built-in functions and variables # | |
5 | # The %(here)s variable will be replaced with the parent directory of this file# |
|
5 | # The %(here)s variable will be replaced with the parent directory of this file# | |
6 | # # |
|
6 | # # | |
7 | ################################################################################ |
|
7 | ################################################################################ | |
8 |
|
8 | |||
9 | [DEFAULT] |
|
9 | [DEFAULT] | |
10 | debug = true |
|
10 | debug = true | |
11 | pdebug = false |
|
|||
12 | ################################################################################ |
|
11 | ################################################################################ | |
13 | ## Uncomment and replace with the email address which should receive ## |
|
12 | ## Uncomment and replace with the email address which should receive ## | |
14 | ## any error reports after an application crash ## |
|
13 | ## any error reports after an application crash ## | |
15 | ## Additionally these settings will be used by the RhodeCode mailing system ## |
|
14 | ## Additionally these settings will be used by the RhodeCode mailing system ## | |
16 | ################################################################################ |
|
15 | ################################################################################ | |
17 | #email_to = admin@localhost |
|
16 | #email_to = admin@localhost | |
18 | #error_email_from = paste_error@localhost |
|
17 | #error_email_from = paste_error@localhost | |
19 | #app_email_from = rhodecode-noreply@localhost |
|
18 | #app_email_from = rhodecode-noreply@localhost | |
20 | #error_message = |
|
19 | #error_message = | |
21 | #email_prefix = [RhodeCode] |
|
20 | #email_prefix = [RhodeCode] | |
22 |
|
21 | |||
23 | #smtp_server = mail.server.com |
|
22 | #smtp_server = mail.server.com | |
24 | #smtp_username = |
|
23 | #smtp_username = | |
25 | #smtp_password = |
|
24 | #smtp_password = | |
26 | #smtp_port = |
|
25 | #smtp_port = | |
27 | #smtp_use_tls = false |
|
26 | #smtp_use_tls = false | |
28 | #smtp_use_ssl = true |
|
27 | #smtp_use_ssl = true | |
29 | ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.) |
|
28 | ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.) | |
30 | #smtp_auth = |
|
29 | #smtp_auth = | |
31 |
|
30 | |||
32 | [server:main] |
|
31 | [server:main] | |
33 | ## COMMON ## |
|
32 | ## COMMON ## | |
34 | host = 127.0.0.1 |
|
33 | host = 127.0.0.1 | |
35 | port = 5000 |
|
34 | port = 5000 | |
36 |
|
35 | |||
37 | ################################## |
|
36 | ################################## | |
38 | ## WAITRESS WSGI SERVER ## |
|
37 | ## WAITRESS WSGI SERVER ## | |
39 | ## Recommended for Development ## |
|
38 | ## Recommended for Development ## | |
40 | ################################## |
|
39 | ################################## | |
41 | #use = egg:waitress#main |
|
40 | #use = egg:waitress#main | |
42 | ## number of worker threads |
|
41 | ## number of worker threads | |
43 | #threads = 5 |
|
42 | #threads = 5 | |
44 | ## MAX BODY SIZE 100GB |
|
43 | ## MAX BODY SIZE 100GB | |
45 | #max_request_body_size = 107374182400 |
|
44 | #max_request_body_size = 107374182400 | |
46 | ## Use poll instead of select, fixes file descriptors limits problems. |
|
45 | ## Use poll instead of select, fixes file descriptors limits problems. | |
47 | ## May not work on old windows systems. |
|
46 | ## May not work on old windows systems. | |
48 | #asyncore_use_poll = true |
|
47 | #asyncore_use_poll = true | |
49 |
|
48 | |||
50 |
|
49 | |||
51 | ########################## |
|
50 | ########################## | |
52 | ## GUNICORN WSGI SERVER ## |
|
51 | ## GUNICORN WSGI SERVER ## | |
53 | ########################## |
|
52 | ########################## | |
54 | ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini> |
|
53 | ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini> | |
55 | use = egg:gunicorn#main |
|
54 | use = egg:gunicorn#main | |
56 | ## Sets the number of process workers. You must set `instance_id = *` |
|
55 | ## Sets the number of process workers. You must set `instance_id = *` | |
57 | ## when this option is set to more than one worker, recommended |
|
56 | ## when this option is set to more than one worker, recommended | |
58 | ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers |
|
57 | ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers | |
59 | ## The `instance_id = *` must be set in the [app:main] section below |
|
58 | ## The `instance_id = *` must be set in the [app:main] section below | |
60 | workers = 2 |
|
59 | workers = 2 | |
61 | ## number of threads for each of the worker, must be set to 1 for gevent |
|
60 | ## number of threads for each of the worker, must be set to 1 for gevent | |
62 | ## generally recommened to be at 1 |
|
61 | ## generally recommened to be at 1 | |
63 | #threads = 1 |
|
62 | #threads = 1 | |
64 | ## process name |
|
63 | ## process name | |
65 | proc_name = rhodecode |
|
64 | proc_name = rhodecode | |
66 | ## type of worker class, one of sync, gevent |
|
65 | ## type of worker class, one of sync, gevent | |
67 | ## recommended for bigger setup is using of of other than sync one |
|
66 | ## recommended for bigger setup is using of of other than sync one | |
68 | worker_class = sync |
|
67 | worker_class = sync | |
69 | ## The maximum number of simultaneous clients. Valid only for Gevent |
|
68 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
70 | #worker_connections = 10 |
|
69 | #worker_connections = 10 | |
71 | ## max number of requests that worker will handle before being gracefully |
|
70 | ## max number of requests that worker will handle before being gracefully | |
72 | ## restarted, could prevent memory leaks |
|
71 | ## restarted, could prevent memory leaks | |
73 | max_requests = 1000 |
|
72 | max_requests = 1000 | |
74 | max_requests_jitter = 30 |
|
73 | max_requests_jitter = 30 | |
75 | ## amount of time a worker can spend with handling a request before it |
|
74 | ## amount of time a worker can spend with handling a request before it | |
76 | ## gets killed and restarted. Set to 6hrs |
|
75 | ## gets killed and restarted. Set to 6hrs | |
77 | timeout = 21600 |
|
76 | timeout = 21600 | |
78 |
|
77 | |||
79 |
|
78 | |||
80 | ## prefix middleware for RhodeCode, disables force_https flag. |
|
79 | ## prefix middleware for RhodeCode, disables force_https flag. | |
81 | ## allows to set RhodeCode under a prefix in server. |
|
80 | ## allows to set RhodeCode under a prefix in server. | |
82 | ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well. |
|
81 | ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well. | |
83 | #[filter:proxy-prefix] |
|
82 | #[filter:proxy-prefix] | |
84 | #use = egg:PasteDeploy#prefix |
|
83 | #use = egg:PasteDeploy#prefix | |
85 | #prefix = /<your-prefix> |
|
84 | #prefix = /<your-prefix> | |
86 |
|
85 | |||
87 | [app:main] |
|
86 | [app:main] | |
88 | use = egg:rhodecode-enterprise-ce |
|
87 | use = egg:rhodecode-enterprise-ce | |
89 | ## enable proxy prefix middleware, defined below |
|
88 | ## enable proxy prefix middleware, defined below | |
90 | #filter-with = proxy-prefix |
|
89 | #filter-with = proxy-prefix | |
91 |
|
90 | |||
|
91 | ## encryption key used to encrypt social plugin tokens, | |||
|
92 | ## remote_urls with credentials etc, if not set it defaults to | |||
|
93 | ## `beaker.session.secret` | |||
|
94 | #rhodecode.encrypted_values.secret = | |||
|
95 | ||||
|
96 | ## decryption strict mode (enabled by default). It controls if decryption raises | |||
|
97 | ## `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |||
|
98 | #rhodecode.encrypted_values.strict = false | |||
|
99 | ||||
92 | full_stack = true |
|
100 | full_stack = true | |
93 |
|
101 | |||
94 | ## Serve static files via RhodeCode, disable to serve them via HTTP server |
|
102 | ## Serve static files via RhodeCode, disable to serve them via HTTP server | |
95 | static_files = true |
|
103 | static_files = true | |
96 |
|
104 | |||
|
105 | # autogenerate javascript routes file on startup | |||
|
106 | generate_js_files = false | |||
|
107 | ||||
97 | ## Optional Languages |
|
108 | ## Optional Languages | |
98 | ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh |
|
109 | ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh | |
99 | lang = en |
|
110 | lang = en | |
100 |
|
111 | |||
101 | ## perform a full repository scan on each server start, this should be |
|
112 | ## perform a full repository scan on each server start, this should be | |
102 | ## set to false after first startup, to allow faster server restarts. |
|
113 | ## set to false after first startup, to allow faster server restarts. | |
103 | startup.import_repos = false |
|
114 | startup.import_repos = false | |
104 |
|
115 | |||
105 | ## Uncomment and set this path to use archive download cache. |
|
116 | ## Uncomment and set this path to use archive download cache. | |
106 | ## Once enabled, generated archives will be cached at this location |
|
117 | ## Once enabled, generated archives will be cached at this location | |
107 | ## and served from the cache during subsequent requests for the same archive of |
|
118 | ## and served from the cache during subsequent requests for the same archive of | |
108 | ## the repository. |
|
119 | ## the repository. | |
109 | #archive_cache_dir = /tmp/tarballcache |
|
120 | #archive_cache_dir = /tmp/tarballcache | |
110 |
|
121 | |||
111 | ## change this to unique ID for security |
|
122 | ## change this to unique ID for security | |
112 | app_instance_uuid = rc-production |
|
123 | app_instance_uuid = rc-production | |
113 |
|
124 | |||
114 | ## cut off limit for large diffs (size in bytes) |
|
125 | ## cut off limit for large diffs (size in bytes) | |
115 | cut_off_limit_diff = 1024000 |
|
126 | cut_off_limit_diff = 1024000 | |
116 | cut_off_limit_file = 256000 |
|
127 | cut_off_limit_file = 256000 | |
117 |
|
128 | |||
118 | ## use cache version of scm repo everywhere |
|
129 | ## use cache version of scm repo everywhere | |
119 | vcs_full_cache = true |
|
130 | vcs_full_cache = true | |
120 |
|
131 | |||
121 | ## force https in RhodeCode, fixes https redirects, assumes it's always https |
|
132 | ## force https in RhodeCode, fixes https redirects, assumes it's always https | |
122 | ## Normally this is controlled by proper http flags sent from http server |
|
133 | ## Normally this is controlled by proper http flags sent from http server | |
123 | force_https = false |
|
134 | force_https = false | |
124 |
|
135 | |||
125 | ## use Strict-Transport-Security headers |
|
136 | ## use Strict-Transport-Security headers | |
126 | use_htsts = false |
|
137 | use_htsts = false | |
127 |
|
138 | |||
128 | ## number of commits stats will parse on each iteration |
|
139 | ## number of commits stats will parse on each iteration | |
129 | commit_parse_limit = 25 |
|
140 | commit_parse_limit = 25 | |
130 |
|
141 | |||
131 | ## git rev filter option, --all is the default filter, if you need to |
|
142 | ## git rev filter option, --all is the default filter, if you need to | |
132 | ## hide all refs in changelog switch this to --branches --tags |
|
143 | ## hide all refs in changelog switch this to --branches --tags | |
133 | git_rev_filter = --branches --tags |
|
144 | git_rev_filter = --branches --tags | |
134 |
|
145 | |||
135 | # Set to true if your repos are exposed using the dumb protocol |
|
# Set to true if your repos are exposed using the dumb protocol
git_update_server_info = false

## RSS/ATOM feed options
rss_cut_off_limit = 256000
rss_items_per_page = 10
rss_include_diff = false

## gist URL alias, used to create nicer urls for gist. This should be an
## url that does rewrites to _admin/gists/<gistid>.
## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
gist_alias_url =

## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
## used for access.
## Adding ?auth_token=<token> to the url authenticates this request as if it
## came from the logged in user who owns this authentication token.
##
## Syntax is <ControllerClass>:<function_pattern>.
## To enable access to raw_files put `FilesController:raw`.
## To enable access to patches add `ChangesetController:changeset_patch`.
## The list should be "," separated and on a single line.
##
## Recommended controllers to enable:
#    ChangesetController:changeset_patch,
#    ChangesetController:changeset_raw,
#    FilesController:raw,
#    FilesController:archivefile,
#    GistsController:*,
api_access_controllers_whitelist =
## default encoding used to convert from and to unicode
## can be also a comma separated list of encodings in case of mixed encodings
default_encoding = UTF-8

## instance-id prefix
## a prefix key for this instance used for cache invalidation when running
## multiple instances of rhodecode, make sure it's globally unique for
## all running rhodecode instances. Leave empty if you don't use it
instance_id =

## Fallback authentication plugin. Set this to a plugin ID to force the usage
## of an authentication plugin even if it is disabled by its settings.
## This could be useful if you are unable to log in to the system due to broken
## authentication settings. Then you can enable e.g. the internal rhodecode auth
## module to log in again and fix the settings.
##
## Available builtin plugin IDs (hash is part of the ID):
## egg:rhodecode-enterprise-ce#rhodecode
## egg:rhodecode-enterprise-ce#pam
## egg:rhodecode-enterprise-ce#ldap
## egg:rhodecode-enterprise-ce#jasig_cas
## egg:rhodecode-enterprise-ce#headers
## egg:rhodecode-enterprise-ce#crowd
#rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode

## alternative HTTP response code for failed authentication. The default
## response is 401 HTTPUnauthorized. Currently HG clients have trouble
## handling that, causing a series of failed authentication calls.
## Set this variable to 403 to return HTTPForbidden, or any other HTTP code.
## This will be served instead of the default 401 on bad authentication.
auth_ret_code =

## use special detection method when serving auth_ret_code: instead of serving
## ret_code directly, use 401 initially (which triggers a credentials prompt)
## and then serve auth_ret_code to clients
auth_ret_code_detection = false

## locking return code. When a repository is locked, return this HTTP code. 2XX
## codes don't break the transactions while 4XX codes do
lock_ret_code = 423

## allows changing the repository location on the settings page
allow_repo_location_change = true

## allows setting up custom hooks on the settings page
allow_custom_hooks_settings = true

## generated license token; go to the license page in RhodeCode settings to
## obtain a new token
license_token =

## supervisor connection uri, for managing supervisor and logs.
supervisor.uri =
## supervisord group name/id we only want this RC instance to handle
supervisor.group_id = prod

## Display extended labs settings
labs_settings_active = true
####################################
###        CELERY CONFIG        ####
####################################
use_celery = false
broker.host = localhost
broker.vhost = rabbitmqhost
broker.port = 5672
broker.user = rabbitmq
broker.password = qweqwe

celery.imports = rhodecode.lib.celerylib.tasks

celery.result.backend = amqp
celery.result.dburi = amqp://
celery.result.serialier = json

#celery.send.task.error.emails = true
#celery.amqp.task.result.expires = 18000

celeryd.concurrency = 2
#celeryd.log.file = celeryd.log
celeryd.log.level = debug
celeryd.max.tasks.per.child = 1

## tasks will never be sent to the queue, but executed locally instead.
celery.always.eager = false
####################################
###        BEAKER CACHE         ####
####################################
# default cache dir for templates. Putting this into a ramdisk
## can boost performance, eg. %(here)s/data_ramdisk
cache_dir = %(here)s/data

## locking and default file storage for Beaker. Putting this into a ramdisk
## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data
beaker.cache.data_dir = %(here)s/data/cache/beaker_data
beaker.cache.lock_dir = %(here)s/data/cache/beaker_lock

beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long

beaker.cache.super_short_term.type = memory
beaker.cache.super_short_term.expire = 10
beaker.cache.super_short_term.key_length = 256

beaker.cache.short_term.type = memory
beaker.cache.short_term.expire = 60
beaker.cache.short_term.key_length = 256

beaker.cache.long_term.type = memory
beaker.cache.long_term.expire = 36000
beaker.cache.long_term.key_length = 256

beaker.cache.sql_cache_short.type = memory
beaker.cache.sql_cache_short.expire = 10
beaker.cache.sql_cache_short.key_length = 256

# default is memory cache, configure only if required
# using multi-node or multi-worker setup
#beaker.cache.auth_plugins.type = ext:database
#beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock
#beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode
#beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode
#beaker.cache.auth_plugins.sa.pool_recycle = 3600
#beaker.cache.auth_plugins.sa.pool_size = 10
#beaker.cache.auth_plugins.sa.max_overflow = 0

beaker.cache.repo_cache_long.type = memorylru_base
beaker.cache.repo_cache_long.max_items = 4096
beaker.cache.repo_cache_long.expire = 2592000

# default is memorylru_base cache, configure only if required
# using multi-node or multi-worker setup
#beaker.cache.repo_cache_long.type = ext:memcached
#beaker.cache.repo_cache_long.url = localhost:11211
#beaker.cache.repo_cache_long.expire = 1209600
#beaker.cache.repo_cache_long.key_length = 256
####################################
###       BEAKER SESSION        ####
####################################

## .session.type is the type of storage used for the session; currently allowed
## types are file, ext:memcached, ext:database, and memory (default).
beaker.session.type = file
beaker.session.data_dir = %(here)s/data/sessions/data

## db based session, fast, and allows easy management over logged in users ##
#beaker.session.type = ext:database
#beaker.session.table_name = db_session
#beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
#beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
#beaker.session.sa.pool_recycle = 3600
#beaker.session.sa.echo = false

beaker.session.key = rhodecode
beaker.session.secret = production-rc-uytcxaz
beaker.session.lock_dir = %(here)s/data/sessions/lock

## Secure encrypted cookie. Requires the AES python libraries.
## You must disable beaker.session.secret to use this.
#beaker.session.encrypt_key = <key_for_encryption>
#beaker.session.validate_key = <validation_key>

## sets the session as invalid (also logging the user out) if it has not been
## accessed for the given amount of time in seconds
beaker.session.timeout = 2592000
beaker.session.httponly = true
#beaker.session.cookie_path = /<your-prefix>

## uncomment for https secure cookie
beaker.session.secure = false

## auto save the session so that .save() does not need to be called
beaker.session.auto = false

## default cookie expiration time in seconds, set to `true` to expire
## at browser close
#beaker.session.cookie_expires = 3600
###################################
## SEARCH INDEXING CONFIGURATION ##
###################################
## Full text search indexer is available in rhodecode-tools under the
## `rhodecode-tools index` command

# WHOOSH Backend, doesn't require additional services to run
# it works well with a few dozen repos
search.module = rhodecode.lib.index.whoosh
search.location = %(here)s/data/index

###################################
##       APPENLIGHT CONFIG       ##
###################################

## Appenlight is tailored to work with RhodeCode, see
## http://appenlight.com for details how to obtain an account

## appenlight integration enabled
appenlight = false

appenlight.server_url = https://api.appenlight.com
appenlight.api_key = YOUR_API_KEY
#appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5

# used for JS client
appenlight.api_public_key = YOUR_API_PUBLIC_KEY

## TWEAK AMOUNT OF INFO SENT HERE

## enables 404 error logging (default False)
appenlight.report_404 = false

## time in seconds after which a request is considered slow (default 1)
appenlight.slow_request_time = 1

## record slow requests in application
## (needs to be enabled for slow datastore recording and time tracking)
appenlight.slow_requests = true

## enable hooking to application loggers
appenlight.logging = true

## minimum log level for log capture
appenlight.logging.level = WARNING

## send logs only from erroneous/slow requests
## (saves API quota for intensive logging)
appenlight.logging_on_error = false

## list of additional keywords that should be grabbed from the environ object;
## can be a string with a comma separated list of words in lowercase
## (by default the client will always send the following info:
## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
## start with HTTP*); this list can be extended with additional keywords here
appenlight.environ_keys_whitelist =

## list of keywords that should be blanked from the request object;
## can be a string with a comma separated list of words in lowercase
## (by default the client will always blank keys that contain the following
## words: 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf');
## this list can be extended with additional keywords set here
appenlight.request_keys_blacklist =

## list of namespaces that should be ignored when gathering log entries;
## can be a string with a comma separated list of namespaces
## (by default the client ignores its own entries: appenlight_client.client)
appenlight.log_namespace_blacklist =

################################################################################
## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT*  ##
## Debug mode will enable the interactive debugging tool, allowing ANYONE to  ##
## execute malicious code after an exception is raised.                       ##
################################################################################
set debug = false

#########################################################
### DB CONFIGS - EACH DB WILL HAVE ITS OWN CONFIG     ###
#########################################################
#sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
#sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode

# see sqlalchemy docs for other advanced settings

## print the sql statements to output
sqlalchemy.db1.echo = false
## recycle the connections after this amount of seconds
sqlalchemy.db1.pool_recycle = 3600
sqlalchemy.db1.convert_unicode = true

## the number of connections to keep open inside the connection pool.
## 0 indicates no limit
#sqlalchemy.db1.pool_size = 5

## the number of connections to allow in connection pool "overflow", that is
## connections that can be opened above and beyond the pool_size setting,
## which defaults to five.
#sqlalchemy.db1.max_overflow = 10

##################
### VCS CONFIG ###
##################
vcs.server.enable = true
vcs.server = localhost:9900

## Web server connectivity protocol, responsible for web based VCS operations
## Available protocols are:
## `pyro4` - using pyro4 server
## `http` - using http-rpc backend
#vcs.server.protocol = http

## Push/Pull operations protocol, available options are:
## `pyro4` - using pyro4 server
## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
## `vcsserver.scm_app` - internal app (EE only)
#vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http

## Push/Pull operations hooks protocol, available options are:
## `pyro4` - using pyro4 server
## `http` - using http-rpc backend
#vcs.hooks.protocol = http

vcs.server.log_level = info
## Start VCSServer with this instance as a subprocess, useful for development
vcs.start_server = false
vcs.backends = hg, git, svn
vcs.connection_timeout = 3600
## Compatibility version when creating SVN repositories. Defaults to the newest
## version when commented out. Available options are: pre-1.4-compatible,
## pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible
#vcs.svn.compatible_version = pre-1.8-compatible

################################
### LOGGING CONFIGURATION   ####
################################
[loggers]
keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates

[handlers]
keys = console, console_sql

[formatters]
keys = generic, color_formatter, color_formatter_sql

#############
## LOGGERS ##
#############
[logger_root]
level = NOTSET
handlers = console

[logger_routes]
level = DEBUG
handlers =
qualname = routes.middleware
## "level = DEBUG" logs the route matched and routing variables.
propagate = 1

[logger_beaker]
level = DEBUG
handlers =
qualname = beaker.container
propagate = 1

[logger_pyro4]
level = DEBUG
handlers =
qualname = Pyro4
propagate = 1

[logger_templates]
level = INFO
handlers =
qualname = pylons.templating
propagate = 1

[logger_rhodecode]
level = DEBUG
handlers =
qualname = rhodecode
propagate = 1

[logger_sqlalchemy]
level = INFO
handlers = console_sql
qualname = sqlalchemy.engine
propagate = 0

[logger_whoosh_indexer]
level = DEBUG
handlers =
qualname = whoosh_indexer
propagate = 1

##############
## HANDLERS ##
##############

[handler_console]
class = StreamHandler
args = (sys.stderr,)
551 | level = INFO |
|
555 | level = INFO | |
552 | formatter = generic |
|
556 | formatter = generic | |
553 |
|
557 | |||
554 | [handler_console_sql] |
|
558 | [handler_console_sql] | |
555 | class = StreamHandler |
|
559 | class = StreamHandler | |
556 | args = (sys.stderr,) |
|
560 | args = (sys.stderr,) | |
557 | level = WARN |
|
561 | level = WARN | |
558 | formatter = generic |
|
562 | formatter = generic | |
559 |
|
563 | |||
560 | ################ |
|
564 | ################ | |
561 | ## FORMATTERS ## |
|
565 | ## FORMATTERS ## | |
562 | ################ |
|
566 | ################ | |
563 |
|
567 | |||
564 | [formatter_generic] |
|
568 | [formatter_generic] | |
565 | class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter |
|
569 | class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter | |
566 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
570 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
567 | datefmt = %Y-%m-%d %H:%M:%S |
|
571 | datefmt = %Y-%m-%d %H:%M:%S | |
568 |
|
572 | |||
569 | [formatter_color_formatter] |
|
573 | [formatter_color_formatter] | |
570 | class = rhodecode.lib.logging_formatter.ColorFormatter |
|
574 | class = rhodecode.lib.logging_formatter.ColorFormatter | |
571 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
575 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
572 | datefmt = %Y-%m-%d %H:%M:%S |
|
576 | datefmt = %Y-%m-%d %H:%M:%S | |
573 |
|
577 | |||
574 | [formatter_color_formatter_sql] |
|
578 | [formatter_color_formatter_sql] | |
575 | class = rhodecode.lib.logging_formatter.ColorFormatterSql |
|
579 | class = rhodecode.lib.logging_formatter.ColorFormatterSql | |
576 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
580 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
577 | datefmt = %Y-%m-%d %H:%M:%S |
|
581 | datefmt = %Y-%m-%d %H:%M:%S |
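An ini file like the one above is consumed by Python's stdlib `logging.config.fileConfig` machinery. The sketch below is a pared-down, hypothetical config (one logger, one handler, one formatter; the RhodeCode-specific formatter classes are left out) showing how the `[loggers]`/`[handlers]`/`[formatters]` sections, `qualname`, and `propagate` fit together:

```python
import logging
import logging.config
import os
import tempfile

# Minimal stand-in for the config above. "qualname" names the logger in the
# hierarchy; "propagate = 0" keeps records from also reaching root's handlers.
INI = """
[loggers]
keys = root, rhodecode

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = NOTSET
handlers = console

[logger_rhodecode]
level = DEBUG
handlers = console
qualname = rhodecode
propagate = 0

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = INFO
formatter = generic

[formatter_generic]
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %Y-%m-%d %H:%M:%S
"""

# fileConfig wants a file name, so write the snippet to a temp file.
with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write(INI)
    path = f.name
try:
    logging.config.fileConfig(path, disable_existing_loggers=False)
finally:
    os.unlink(path)

log = logging.getLogger("rhodecode")
# The logger accepts DEBUG records, but the handler filters below INFO.
assert log.isEnabledFor(logging.DEBUG)
assert log.handlers and log.handlers[0].level == logging.INFO
```

Note how verbosity is gated twice: the logger's `level` decides what is emitted, the handler's `level` decides what is actually written out.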
@@ -1,219 +1,227 @@
 # Nix environment for the community edition
 #
 # This shall be as lean as possible, just producing the Enterprise
 # derivation. For advanced tweaks to pimp up the development environment we use
 # "shell.nix" so that it does not have to clutter this file.

 { pkgs ? (import <nixpkgs> {})
 , pythonPackages ? "python27Packages"
 , pythonExternalOverrides ? self: super: {}
 , doCheck ? true
 }:

 let pkgs_ = pkgs; in

 let
   pkgs = pkgs_.overridePackages (self: super: {
     # Override subversion derivation to
     # - activate python bindings
     # - set version to 1.8
     subversion = super.subversion18.override {
       httpSupport = true;
       pythonBindings = true;
       python = self.python27Packages.python;
     };
   });

   inherit (pkgs.lib) fix extends;

   basePythonPackages = with builtins; if isAttrs pythonPackages
     then pythonPackages
     else getAttr pythonPackages pkgs;

   elem = builtins.elem;
   basename = path: with pkgs.lib; last (splitString "/" path);
   startsWith = prefix: full: let
     actualPrefix = builtins.substring 0 (builtins.stringLength prefix) full;
   in actualPrefix == prefix;

   src-filter = path: type: with pkgs.lib;
     let
       ext = last (splitString "." path);
     in
       !elem (basename path) [
         ".git" ".hg" "__pycache__" ".eggs" "node_modules"
         "build" "data" "tmp"] &&
       !elem ext ["egg-info" "pyc"] &&
       !startsWith "result" path;

+  sources = pkgs.config.rc.sources or {};
   rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.;

   # Load the generated node packages
   nodePackages = pkgs.callPackage "${pkgs.path}/pkgs/top-level/node-packages.nix" rec {
     self = nodePackages;
     generated = pkgs.callPackage ./pkgs/node-packages.nix { inherit self; };
   };

   # TODO: Should be taken automatically out of the generates packages.
   # apps.nix has one solution for this, although I'd prefer to have the deps
   # from package.json mapped in here.
   nodeDependencies = with nodePackages; [
     grunt
     grunt-contrib-concat
     grunt-contrib-jshint
     grunt-contrib-less
     grunt-contrib-watch
     jshint
   ];

   pythonGeneratedPackages = self: basePythonPackages.override (a: {
     inherit self;
   })
   // (scopedImport {
     self = self;
     super = basePythonPackages;
     inherit pkgs;
     inherit (pkgs) fetchurl fetchgit;
   } ./pkgs/python-packages.nix);

   pythonOverrides = import ./pkgs/python-packages-overrides.nix {
     inherit
       basePythonPackages
       pkgs;
   };

   pythonLocalOverrides = self: super: {
     rhodecode-enterprise-ce =
       let
         version = builtins.readFile ./rhodecode/VERSION;
         linkNodeModules = ''
           echo "Link node packages"
           # TODO: check if this adds stuff as a dependency, closure size
           rm -fr node_modules
           mkdir -p node_modules
           ${pkgs.lib.concatMapStrings (dep: ''
             ln -sfv ${dep}/lib/node_modules/${dep.pkgName} node_modules/
           '') nodeDependencies}
           echo "DONE: Link node packages"
         '';
       in super.rhodecode-enterprise-ce.override (attrs: {

         inherit
+          doCheck
+          version;
         name = "rhodecode-enterprise-ce-${version}";
-        version = version;
+        releaseName = "RhodeCodeEnterpriseCE-${version}";
         src = rhodecode-enterprise-ce-src;

         buildInputs =
           attrs.buildInputs ++
           (with self; [
             pkgs.nodePackages.grunt-cli
             pkgs.subversion
             pytest-catchlog
-            rc_testdata
+            rhodecode-testdata
           ]);

         propagatedBuildInputs = attrs.propagatedBuildInputs ++ (with self; [
           rhodecode-tools
         ]);

         # TODO: johbo: Make a nicer way to expose the parts. Maybe
         # pkgs/default.nix?
         passthru = {
           inherit
             pythonLocalOverrides
             myPythonPackagesUnfix;
           pythonPackages = self;
         };

         LC_ALL = "en_US.UTF-8";
         LOCALE_ARCHIVE =
           if pkgs.stdenv ? glibc
           then "${pkgs.glibcLocales}/lib/locale/locale-archive"
           else "";

         # Somewhat snappier setup of the development environment
         # TODO: move into shell.nix
         # TODO: think of supporting a stable path again, so that multiple shells
         #       can share it.
         shellHook = ''
           tmp_path=$(mktemp -d)
           export PATH="$tmp_path/bin:$PATH"
           export PYTHONPATH="$tmp_path/${self.python.sitePackages}:$PYTHONPATH"
           mkdir -p $tmp_path/${self.python.sitePackages}
           python setup.py develop --prefix $tmp_path --allow-hosts ""
         '' + linkNodeModules;

         preCheck = ''
           export PATH="$out/bin:$PATH"
         '';

         postCheck = ''
           rm -rf $out/lib/${self.python.libPrefix}/site-packages/pytest_pylons
           rm -rf $out/lib/${self.python.libPrefix}/site-packages/rhodecode/tests
         '';

         preBuild = linkNodeModules + ''
           grunt
           rm -fr node_modules
         '';

         postInstall = ''
           # python based programs need to be wrapped
           ln -s ${self.supervisor}/bin/supervisor* $out/bin/
           ln -s ${self.gunicorn}/bin/gunicorn $out/bin/
           ln -s ${self.PasteScript}/bin/paster $out/bin/
           ln -s ${self.pyramid}/bin/* $out/bin/ #*/

           # rhodecode-tools
           # TODO: johbo: re-think this. Do the tools import anything from enterprise?
           ln -s ${self.rhodecode-tools}/bin/rhodecode-* $out/bin/

           # note that condition should be restricted when adding further tools
           for file in $out/bin/*; do #*/
             wrapProgram $file \
               --prefix PYTHONPATH : $PYTHONPATH \
               --prefix PATH : $PATH \
               --set PYTHONHASHSEED random
           done

           mkdir $out/etc
           cp configs/production.ini $out/etc

           echo "Writing meta information for rccontrol to nix-support/rccontrol"
           mkdir -p $out/nix-support/rccontrol
           cp -v rhodecode/VERSION $out/nix-support/rccontrol/version
           echo "DONE: Meta information for rccontrol written"

           # TODO: johbo: Make part of ac-tests
           if [ ! -f rhodecode/public/js/scripts.js ]; then
             echo "Missing scripts.js"
             exit 1
           fi
           if [ ! -f rhodecode/public/css/style.css ]; then
             echo "Missing style.css"
             exit 1
           fi
         '';

       });

-    rc_testdata = self.buildPythonPackage rec {
-      name = "rc_testdata-0.7.0";
-      src = pkgs.fetchhg {
-        url = "https://code.rhodecode.com/upstream/rc_testdata";
-        rev = "v0.7.0";
-        sha256 = "0w3z0zn8lagr707v67lgys23sl6pbi4xg7pfvdbw58h3q384h6rx";
-      };
-    };
+    rhodecode-testdata = import "${rhodecode-testdata-src}/default.nix" {
+      inherit
+        doCheck
+        pkgs
+        pythonPackages;
+    };

   };

+  rhodecode-testdata-src = sources.rhodecode-testdata or (
+    pkgs.fetchhg {
+      url = "https://code.rhodecode.com/upstream/rc_testdata";
+      rev = "v0.8.0";
+      sha256 = "0hy1ba134rq2f9si85yx7j4qhc9ky0hjzdk553s3q026i7km809m";
+    });
+
   # Apply all overrides and fix the final package set
   myPythonPackagesUnfix =
     (extends pythonExternalOverrides
     (extends pythonLocalOverrides
     (extends pythonOverrides
     pythonGeneratedPackages)));
   myPythonPackages = (fix myPythonPackagesUnfix);

 in myPythonPackages.rhodecode-enterprise-ce
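The Nix expression above assembles its final package set with `fix` and `extends`: each override layer sees both the eventual result (`self`) and the layer below it (`super`), and `fix` ties the knot. A small Python stand-in for those two combinators may make the pattern easier to follow; the `greeting`/`package` attributes are invented for illustration and are not part of the real package set:

```python
# Sketch of nixpkgs-style `fix`/`extends` in Python. Attributes are plain
# dict entries; laziness is emulated by wrapping derived values in lambdas
# that look attributes up through `self` only when called.

def fix(f):
    """Return the fixed point of f: a dict where lookups through `self`
    resolve against the final, fully-overridden result."""
    attrs = {}
    attrs.update(f(attrs))  # f's closures keep referring to `attrs`
    return attrs

def extends(overlay, f):
    """Stack an overlay(self, super) on top of a base layer f(self)."""
    def extended(self):
        super = f(self)
        return overlay(self, super)
    return extended

# Base layer, analogous to pythonGeneratedPackages.
def base(self):
    return {
        "greeting": "hello",
        # Derived lazily via `self`, so overrides are picked up.
        "package": lambda: self["greeting"] + " world",
    }

# Override layer, analogous to pythonLocalOverrides.
def overrides(self, super):
    return dict(super, greeting="HELLO")

result = fix(extends(overrides, base))
assert result["package"]() == "HELLO world"   # override seen through `self`
assert fix(base)["package"]() == "hello world"
```

This is why the order of the `extends` calls in the Nix file matters: later layers override earlier ones, yet every layer's derived values resolve against the final set.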
@@ -1,136 +1,130 @@
 .. _debug-mode:

 Enabling Debug Mode
 -------------------

 To enable debug mode on a |RCE| instance you need to set the debug property
 in the :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file. To
 do this, use the following steps:

 1. Open the file and set the ``debug`` line to ``true``
 2. Restart your instance using the ``rccontrol restart`` command,
    see the following example:

 You can also set the log level; the following are the valid options:
 ``debug``, ``info``, ``warning``, or ``fatal``.

 .. code-block:: ini

     [DEFAULT]
     debug = true
     pdebug = false

 .. code-block:: bash

     # Restart your instance
     $ rccontrol restart enterprise-1
     Instance "enterprise-1" successfully stopped.
     Instance "enterprise-1" successfully started.

 Debug and Logging Configuration
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

 Further debugging and logging settings can also be set in the
 :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file.

 In the logging section, the various packages that run with |RCE| can have
 different debug levels set. If you want to increase the logging level, change
 the ``level = DEBUG`` line to one of the valid options.

 You also need to change the log level for handlers. See the example
 ``##handler`` section below. The ``handler`` level takes the same options as
 the ``debug`` level.

 .. code-block:: ini

     ################################
     ### LOGGING CONFIGURATION ####
     ################################
     [loggers]
     keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates

     [handlers]
     keys = console, console_sql, file, file_rotating

     [formatters]
     keys = generic, color_formatter, color_formatter_sql

     #############
     ## LOGGERS ##
     #############
     [logger_root]
     level = NOTSET
     handlers = console

     [logger_routes]
     level = DEBUG
     handlers =
     qualname = routes.middleware
     ## "level = DEBUG" logs the route matched and routing variables.
     propagate = 1

     [logger_beaker]
     level = DEBUG
     handlers =
     qualname = beaker.container
     propagate = 1

     [logger_pyro4]
     level = DEBUG
     handlers =
     qualname = Pyro4
     propagate = 1

     [logger_templates]
     level = INFO
     handlers =
     qualname = pylons.templating
     propagate = 1

     [logger_rhodecode]
     level = DEBUG
     handlers =
     qualname = rhodecode
     propagate = 1

     [logger_sqlalchemy]
     level = INFO
     handlers = console_sql
     qualname = sqlalchemy.engine
     propagate = 0

-    [logger_whoosh_indexer]
-    level = DEBUG
-    handlers =
-    qualname = whoosh_indexer
-    propagate = 1
-
     ##############
     ## HANDLERS ##
     ##############

     [handler_console]
     class = StreamHandler
     args = (sys.stderr,)
     level = INFO
     formatter = generic

     [handler_console_sql]
     class = StreamHandler
     args = (sys.stderr,)
     level = WARN
     formatter = generic

     [handler_file]
     class = FileHandler
     args = ('rhodecode.log', 'a',)
     level = INFO
     formatter = generic

     [handler_file_rotating]
     class = logging.handlers.TimedRotatingFileHandler
     # 'D', 5 - rotate every 5days
     # you can set 'h', 'midnight'
     args = ('rhodecode.log', 'D', 5, 10,)
     level = INFO
     formatter = generic
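The point about handler levels in the documentation above can be demonstrated with plain stdlib `logging`: setting a logger to DEBUG is not enough while its handler still filters at INFO. A minimal sketch (the logger and handler names are illustrative only):

```python
import io
import logging

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setLevel(logging.INFO)          # like "level = INFO" in [handler_console]

log = logging.getLogger("rhodecode.demo")
log.addHandler(handler)
log.setLevel(logging.DEBUG)             # like "level = DEBUG" in [logger_rhodecode]
log.propagate = False                   # keep root's handlers out of the demo

log.debug("hidden")   # passes the logger's level check, dropped by the handler
log.info("visible")
assert "hidden" not in stream.getvalue()
assert "visible" in stream.getvalue()

handler.setLevel(logging.DEBUG)         # now the handler matches the logger
log.debug("now shown")
assert "now shown" in stream.getvalue()
```

So to see DEBUG output, both the relevant `[logger_*]` section and the `[handler_*]` section it writes through must be set to ``DEBUG``.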
@@ -1,19 +1,20 @@
 .. _contributing:

 Contributing to RhodeCode
 =========================



-Welcome to contribution guides and development docs of RhodeCode.
+Welcome to the contribution guides and development docs of RhodeCode.



 .. toctree::
    :maxdepth: 1

+   overview
    testing/index
    dev-setup
    db-schema
    dev-settings
    api
@@ -1,52 +1,52 b'' | |||||
1 | ======================= |
|
1 | ======================= | |
2 | DB Schema and Migration |
|
2 | DB Schema and Migration | |
3 | ======================= |
|
3 | ======================= | |
4 |
|
4 | |||
5 | To create or alter tables in the database it's necessary to change a couple of |
|
5 | To create or alter tables in the database, it's necessary to change a couple of | |
6 | files, apart from configuring the settings pointing to the latest database |
|
6 | files, apart from configuring the settings pointing to the latest database | |
7 | schema. |
|
7 | schema. | |
8 |
|
8 | |||
9 |
|
9 | |||
10 | Database Model and ORM |
|
10 | Database Model and ORM | |
11 | ---------------------- |
|
11 | ---------------------- | |
12 |
|
12 | |||
13 | On ``rhodecode.model.db`` you will find the database definition of all tables and |
|
13 | On ``rhodecode.model.db`` you will find the database definition of all tables and | |
14 | fields. Any fresh install database will be correctly created by the definitions |
|
14 | fields. Any freshly installed database will be correctly created by the definitions | |
15 |
here. So, any change to this file |
|
15 | here. So, any change to this file will affect the tests without having to change | |
16 | any other file. |
|
16 | any other file. | |
17 |
|
17 | |||
18 |
A second layer are the busin |
|
18 | A second layer are the business classes inside ``rhodecode.model``. | |
19 |
|
19 | |||
20 |
|
20 | |||
21 | Database Migration |
|
21 | Database Migration | |
22 | ------------------ |
|
22 | ------------------ | |
23 |
|
23 | |||
24 | Three files play a role when creating database migrations: |
|
24 | Three files play a role when creating database migrations: | |
25 |
|
25 | |||
26 | * Database schema inside ``rhodecode.lib.dbmigrate`` |
|
26 | * Database schema inside ``rhodecode.lib.dbmigrate`` | |
27 | * Database version inside ``rhodecode.lib.dbmigrate`` |
|
27 | * Database version inside ``rhodecode.lib.dbmigrate`` | |
28 | * Configuration ``__dbversion__`` at ``rhodecode.__init__`` |
|
28 | * Configuration ``__dbversion__`` at ``rhodecode.__init__`` | |
29 |
|
29 | |||
30 |
|
30 | |||
31 | Schema is a snapshot of the database version BEFORE the migration. So, it's |
|
31 | Schema is a snapshot of the database version BEFORE the migration. So, it's | |
32 | the initial state before any changes were added. The name convention is |
|
32 | the initial state before any changes were added. The name convention is | |
33 |
the latest release version where the snapshot w |
|
33 | the latest release version where the snapshot was created, and not the | |
34 | target version of this code. |
|
34 | target version of this code. | |
35 |
|
35 | |||
36 | Version is the method that will define how to UPGRADE/DOWNGRADE the database. |
|
36 | Version is the method that will define how to UPGRADE/DOWNGRADE the database. | |
37 |
|
37 | |||
38 | ``rhodecode.__init__`` contains only a variable that defines up to which version of |
|
38 | ``rhodecode.__init__`` contains only a variable that defines up to which version of | |
39 | the database will be used to upgrade. Eg.: ``__dbversion__ = 45`` |
|
39 | the database will be used to upgrade. Eg.: ``__dbversion__ = 45`` | |
40 |
|
40 | |||
41 |
|
41 | |||
42 | For examples on how to create those files, please see the existing code. |
|
42 | For examples on how to create those files, please see the existing code. | |
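The split between migration logic and the version marker can be illustrated with a toy sketch. This is conceptual only: the real migration scripts use the SQLAlchemy-Migrate helpers inside ``rhodecode.lib.dbmigrate``, not raw ``sqlite3``, and the ``user_bookmarks`` table here is invented for illustration:

```python
import sqlite3

# Conceptual sketch only: real RhodeCode migrations use SQLAlchemy-Migrate
# inside rhodecode.lib.dbmigrate; the table and version numbers are made up.

def upgrade(conn):
    # Bring the schema one step forward and record the new version.
    conn.execute("CREATE TABLE user_bookmarks (user_id INTEGER, position INTEGER)")
    conn.execute("UPDATE db_version SET version = 45")

def downgrade(conn):
    # Revert exactly what upgrade() did.
    conn.execute("DROP TABLE user_bookmarks")
    conn.execute("UPDATE db_version SET version = 44")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE db_version (version INTEGER)")
conn.execute("INSERT INTO db_version VALUES (44)")
upgrade(conn)
```

The point of the pattern is that every upgrade has a matching downgrade, so the database can be rolled back to any snapshot version.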
43 |
|
43 | |||
44 |
|
44 | |||
45 | Migration Command |
|
45 | Migration Command | |
46 | ^^^^^^^^^^^^^^^^^ |
|
46 | ^^^^^^^^^^^^^^^^^ | |
47 |
|
47 | |||
48 | After you changed the database ORM and migration files, you can run:: |
|
48 | After you've changed the database ORM and migration files, you can run:: | |
49 |
|
49 | |||
50 | paster upgrade-db <ini-file> |
|
50 | paster upgrade-db <ini-file> | |
51 |
|
51 | |||
52 |
|
|
52 | The database will be upgraded up to the version defined in the ``__init__`` file. No newline at end of file |
@@ -1,46 +1,46 b'' | |||||
1 |
|
1 | |||
2 | ========================== |
|
2 | ========================== | |
3 | Settings for Development |
|
3 | Settings for Development | |
4 | ========================== |
|
4 | ========================== | |
5 |
|
5 | |||
6 |
|
6 | |||
7 | We have a few settings which are intended to be used only for development |
|
7 | We have a few settings which are intended to be used only for development | |
8 | purposes. This section contains an overview of them. |
|
8 | purposes. This section contains an overview of them. | |
9 |
|
9 | |||
10 |
|
10 | |||
11 |
|
11 | |||
12 | `debug_style` |
|
12 | `debug_style` | |
13 | ============= |
|
13 | ============= | |
14 |
|
14 | |||
15 | Enables the section "Style" in the application. This section provides an |
|
15 | Enables the section "Style" in the application. This section provides an | |
16 |
overview of all components which are found in the frontend |
|
16 | overview of all components which are found in the frontend of the | |
17 | application. |
|
17 | application. | |
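Assuming the flag lives in the main application section of the ``.ini`` file (the exact section placement is an assumption here, not spelled out above), enabling it could look like:

```ini
[app:main]
## development only: expose the "Style" section in the application
debug_style = true
```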
18 |
|
18 | |||
19 |
|
19 | |||
20 |
|
20 | |||
21 | `vcs.start_server` |
|
21 | `vcs.start_server` | |
22 | ================== |
|
22 | ================== | |
23 |
|
23 | |||
24 | Starts the server as a subprocess while the system comes up. Intended usage is |
|
24 | Starts the server as a subprocess while the system comes up. Intended usage is | |
25 | to ease development. |
|
25 | to ease development. | |
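A minimal ``.ini`` fragment for this setting could look like the following (the host and port are example values):

```ini
### VCS CONFIG ###
vcs.start_server = true
vcs.server = localhost:9900
```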
26 |
|
26 | |||
27 |
|
27 | |||
28 |
|
28 | |||
29 | `[logging]` |
|
29 | `[logging]` | |
30 | =========== |
|
30 | =========== | |
31 |
|
31 | |||
32 | Use this to configure loggig to your current needs. The documentation of |
|
32 | Use this to configure logging to your current needs. The documentation of | |
33 |
Python's `logging` module explains all details. The following snippets |
|
33 | Python's `logging` module explains all of the details. The following snippets | |
34 | useful for day to day development work. |
|
34 | are useful for day to day development work. | |
35 |
|
35 | |||
36 |
|
36 | |||
37 | Mute SQL output |
|
37 | Mute SQL output | |
38 | --------------- |
|
38 | --------------- | |
39 |
|
39 | |||
40 | They come out of the package `sqlalchemy.engine`:: |
|
40 | They come out of the package `sqlalchemy.engine`:: | |
41 |
|
41 | |||
42 | [logger_sqlalchemy] |
|
42 | [logger_sqlalchemy] | |
43 | level = WARNING |
|
43 | level = WARNING | |
44 | handlers = console_sql |
|
44 | handlers = console_sql | |
45 | qualname = sqlalchemy.engine |
|
45 | qualname = sqlalchemy.engine | |
46 | propagate = 0 |
|
46 | propagate = 0 |
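The ``qualname`` and ``propagate`` keys map directly onto Python's standard ``logging`` module. The effect of the snippet above can be sketched in a few self-contained lines (the handler class and log messages are invented for illustration):

```python
import logging

# Equivalent of: qualname = sqlalchemy.engine, level = WARNING, propagate = 0
log = logging.getLogger("sqlalchemy.engine")
log.setLevel(logging.WARNING)
log.propagate = False

records = []

class ListHandler(logging.Handler):
    # Minimal handler that captures formatted messages in a list.
    def emit(self, record):
        records.append(record.getMessage())

log.addHandler(ListHandler())
log.info("SELECT 1")       # muted: INFO is below the WARNING threshold
log.warning("slow query")  # still gets through
```

Because ``propagate`` is off, the records also never reach the root logger's console handler.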
@@ -1,141 +1,144 b'' | |||||
|
1 | .. _dev-setup: | |||
1 |
|
2 | |||
2 | =================== |
|
3 | =================== | |
3 | Development setup |
|
4 | Development setup | |
4 | =================== |
|
5 | =================== | |
5 |
|
6 | |||
6 |
|
7 | |||
7 | RhodeCode Enterprise runs inside a Nix managed environment. This ensures build |
|
8 | RhodeCode Enterprise runs inside a Nix managed environment. This ensures build | |
8 | environment dependencies are correctly declared and installed during setup. |
|
9 | environment dependencies are correctly declared and installed during setup. | |
9 | It also enables atomic upgrades, rollbacks, and multiple instances of RhodeCode |
|
10 | It also enables atomic upgrades, rollbacks, and multiple instances of RhodeCode | |
10 | Enterprise for efficient cluster management. |
|
11 | Enterprise running in isolation. |
11 |
|
12 | |||
12 | To set up RhodeCode Enterprise inside the Nix environment use the following steps: |
|
13 | To set up RhodeCode Enterprise inside the Nix environment, use the following steps: | |
13 |
|
14 | |||
14 |
|
15 | |||
15 |
|
16 | |||
16 | Setup Nix Package Manager |
|
17 | Setup Nix Package Manager | |
17 | ------------------------- |
|
18 | ------------------------- | |
18 |
|
19 | |||
19 | To install the Nix Package Manager please run:: |
|
20 | To install the Nix Package Manager, please run:: | |
20 |
|
21 | |||
21 | $ curl https://nixos.org/nix/install | sh |
|
22 | $ curl https://nixos.org/nix/install | sh | |
22 |
|
23 | |||
23 |
or go to https://nixos.org/nix/ and follow the |
|
24 | or go to https://nixos.org/nix/ and follow the installation instructions. | |
24 | Once this is correctly set up on your system you should be able to use the |
|
25 | Once this is correctly set up on your system, you should be able to use the | |
25 | following commands: |
|
26 | following commands: | |
26 |
|
27 | |||
27 | * `nix-env` |
|
28 | * `nix-env` | |
28 |
|
29 | |||
29 | * `nix-shell` |
|
30 | * `nix-shell` | |
30 |
|
31 | |||
31 |
|
32 | |||
32 | .. tip:: |
|
33 | .. tip:: | |
33 |
|
34 | |||
34 | Update your channels frequently by running ``nix-channel --upgrade``. |
|
35 | Update your channels frequently by running ``nix-channel --upgrade``. | |
35 |
|
36 | |||
36 |
|
37 | |||
37 | Switch nix to latest STABLE channel |
|
38 | Switch nix to the latest STABLE channel | |
38 | ----------------------------------- |
|
39 | --------------------------------------- | |
39 |
|
40 | |||
40 | run:: |
|
41 | run:: | |
41 |
|
42 | |||
42 | nix-channel --add https://nixos.org/channels/nixos-16.03 nixpkgs |
|
43 | nix-channel --add https://nixos.org/channels/nixos-16.03 nixpkgs | |
43 |
|
44 | |||
44 | Followed by:: |
|
45 | Followed by:: | |
45 |
|
46 | |||
46 | nix-channel --update |
|
47 | nix-channel --update | |
47 |
|
48 | |||
48 |
|
49 | |||
49 | Clone the required repositories |
|
50 | Clone the required repositories | |
50 | ------------------------------- |
|
51 | ------------------------------- | |
51 |
|
52 | |||
52 |
After Nix is set up, clone the RhodeCode Enterprise Community Edition |
|
53 | After Nix is set up, clone the RhodeCode Enterprise Community Edition and | |
53 | RhodeCode VCSServer repositories into the same directory. |
|
54 | RhodeCode VCSServer repositories into the same directory. | |
54 | To do this, use the following example:: |
|
55 | To do this, use the following example:: | |
55 |
|
56 | |||
56 | mkdir rhodecode-develop && cd rhodecode-develop |
|
57 | mkdir rhodecode-develop && cd rhodecode-develop | |
57 | hg clone https://code.rhodecode.com/rhodecode-enterprise-ce |
|
58 | hg clone https://code.rhodecode.com/rhodecode-enterprise-ce | |
58 | hg clone https://code.rhodecode.com/rhodecode-vcsserver |
|
59 | hg clone https://code.rhodecode.com/rhodecode-vcsserver | |
59 |
|
60 | |||
60 | .. note:: |
|
61 | .. note:: | |
61 |
|
62 | |||
62 |
If you cannot clone the repository, please request read permissions |
|
63 | If you cannot clone the repository, please request read permissions | |
|
64 | via support@rhodecode.com | |||
63 |
|
65 | |||
64 |
|
66 | |||
65 |
|
67 | |||
66 | Enter the Development Shell |
|
68 | Enter the Development Shell | |
67 | --------------------------- |
|
69 | --------------------------- | |
68 |
|
70 | |||
69 |
The final step is to start |
|
71 | The final step is to start the development shell. To do this, run the | |
70 | following command from inside the cloned repository:: |
|
72 | following command from inside the cloned repository:: | |
71 |
|
73 | |||
72 | cd ~/rhodecode-enterprise-ce |
|
74 | cd ~/rhodecode-enterprise-ce | |
73 |
nix-shell |
|
75 | nix-shell | |
74 |
|
76 | |||
75 | .. note:: |
|
77 | .. note:: | |
76 |
|
78 | |||
77 | On the first run, this will take a while to download and optionally compile |
|
79 | On the first run, this will take a while to download and optionally compile | |
78 |
a few things. The |
|
80 | a few things. The following runs will be faster. The development shell works | |
|
81 | fine on MacOS and Linux platforms. | |||
79 |
|
82 | |||
80 |
|
83 | |||
81 |
|
84 | |||
82 | Creating a Development Configuration |
|
85 | Creating a Development Configuration | |
83 | ------------------------------------ |
|
86 | ------------------------------------ | |
84 |
|
87 | |||
85 | To create a development environment for RhodeCode Enterprise, |
|
88 | To create a development environment for RhodeCode Enterprise, | |
86 | use the following steps: |
|
89 | use the following steps: | |
87 |
|
90 | |||
88 | 1. Create a copy of `~/rhodecode-enterprise-ce/configs/development.ini` |
|
91 | 1. Create a copy of `~/rhodecode-enterprise-ce/configs/development.ini` | |
89 | 2. Adjust the configuration settings to your needs |
|
92 | 2. Adjust the configuration settings to your needs | |
90 |
|
93 | |||
91 | .. note:: |
|
94 | .. note:: | |
92 |
|
95 | |||
93 |
It is recommended to |
|
96 | It is recommended to use the name `dev.ini`. | |
94 |
|
97 | |||
95 |
|
98 | |||
96 | Setup the Development Database |
|
99 | Setup the Development Database | |
97 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
100 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
98 |
|
101 | |||
99 | To create a development database use the following example. This is a one |
|
102 | To create a development database, use the following example. This is a one | |
100 | time operation:: |
|
103 | time operation:: | |
101 |
|
104 | |||
102 | paster setup-rhodecode dev.ini \ |
|
105 | paster setup-rhodecode dev.ini \ | |
103 | --user=admin --password=secret \ |
|
106 | --user=admin --password=secret \ | |
104 | --email=admin@example.com \ |
|
107 | --email=admin@example.com \ | |
105 | --repos=~/my_dev_repos |
|
108 | --repos=~/my_dev_repos | |
106 |
|
109 | |||
107 |
|
110 | |||
108 | Start the Development Server |
|
111 | Start the Development Server | |
109 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
112 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
110 |
|
113 | |||
111 | When starting the development server, you should start the vcsserver as a |
|
114 | When starting the development server, you should start the vcsserver as a | |
112 | separate process. To do this use one of the following examples: |
|
115 | separate process. To do this, use one of the following examples: | |
113 |
|
116 | |||
114 | 1. Set the `start.vcs_server` flag in the ``dev.ini`` file to true. For example: |
|
117 | 1. Set the `start.vcs_server` flag in the ``dev.ini`` file to true. For example: | |
115 |
|
118 | |||
116 | .. code-block:: ini
|
119 | .. code-block:: ini |
117 |
|
120 | |||
118 | ### VCS CONFIG ### |
|
121 | ### VCS CONFIG ### | |
119 | ################## |
|
122 | ################## | |
120 | vcs.start_server = true |
|
123 | vcs.start_server = true | |
121 | vcs.server = localhost:9900 |
|
124 | vcs.server = localhost:9900 | |
122 | vcs.server.log_level = debug |
|
125 | vcs.server.log_level = debug | |
123 |
|
126 | |||
124 | Then start the server using the following command: ``rcserver dev.ini`` |
|
127 | Then start the server using the following command: ``rcserver dev.ini`` | |
125 |
|
128 | |||
126 | 2. Start the development server using the following example:: |
|
129 | 2. Start the development server using the following example:: | |
127 |
|
130 | |||
128 | rcserver --with-vcsserver dev.ini |
|
131 | rcserver --with-vcsserver dev.ini | |
129 |
|
132 | |||
130 | 3. Start the development server in a different terminal using the following |
|
133 | 3. Start the development server in a different terminal using the following | |
131 | example:: |
|
134 | example:: | |
132 |
|
135 | |||
133 | vcsserver |
|
136 | vcsserver | |
134 |
|
137 | |||
135 |
|
138 | |||
136 | Run the Environment Tests |
|
139 | Run the Environment Tests | |
137 | ^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
140 | ^^^^^^^^^^^^^^^^^^^^^^^^^ | |
138 |
|
141 | |||
139 | Please make sure that the test are passing to verify that your environment is |
|
142 | Please make sure that the tests are passing to verify that your environment is | |
140 | set up correctly. More details about the tests are described in: |
|
143 | set up correctly. RhodeCode uses py.test to run tests. | |
141 | :file:`/docs/dev/testing`. |
|
144 | Please simply run ``make test`` to run the basic test suite. |
@@ -1,28 +1,28 b'' | |||||
1 |
|
1 | |||
2 |
|
2 | |||
3 | ============================ |
|
3 | ============================ | |
4 | Testing and Specifications |
|
4 | Testing and Specifications | |
5 | ============================ |
|
5 | ============================ | |
6 |
|
6 | |||
7 |
|
7 | |||
8 | .. toctree:: |
|
8 | .. toctree:: | |
9 | :maxdepth: 2 |
|
9 | :maxdepth: 2 | |
10 |
|
10 | |||
11 | unit-and-functional |
|
11 | unit-and-functional | |
12 | spec-by-example |
|
12 | spec-by-example | |
13 | naming-conventions |
|
13 | naming-conventions | |
14 |
|
14 | |||
15 |
|
15 | |||
16 |
|
16 | |||
17 | Overview |
|
17 | Overview | |
18 | ======== |
|
18 | ======== | |
19 |
|
19 | |||
20 |
We have a quite |
|
20 | We have quite a large test suite inside of :file:`rhodecode/tests` which is a mix |
21 | of unit tests and functional or integration tests. More details are in |
|
21 | of unit tests and functional or integration tests. More details are in | |
22 | :ref:`test-unit-and-functional`. |
|
22 | :ref:`test-unit-and-functional`. | |
23 |
|
23 | |||
24 |
|
24 | |||
25 |
Apart from that we start to apply "Specification by Example" and maintain |
|
25 | Apart from that, we are starting to apply "Specification by Example" and maintain | |
26 |
collection of such specifications together with an implementation so that it |
|
26 | a collection of such specifications together with an implementation so that it | |
27 | be validated in an automatic way. The files can be found in |
|
27 | can be validated in an automatic way. The files can be found in | |
28 | :file:`acceptance_tests`. More details are in :ref:`test-spec-by-example`. |
|
28 | :file:`acceptance_tests`. More details are in :ref:`test-spec-by-example`. |
@@ -1,75 +1,75 b'' | |||||
1 |
|
1 | |||
2 | .. _test-spec-by-example: |
|
2 | .. _test-spec-by-example: | |
3 |
|
3 | |||
4 | ========================== |
|
4 | ========================== | |
5 | Specification by Example |
|
5 | Specification by Example | |
6 | ========================== |
|
6 | ========================== | |
7 |
|
7 | |||
8 |
|
8 | |||
9 | .. Avoid duplicating the quickstart instructions by importing the README |
|
9 | .. Avoid duplicating the quickstart instructions by importing the README | |
10 | file. |
|
10 | file. | |
11 |
|
11 | |||
12 | .. include:: ../../../acceptance_tests/README.rst |
|
12 | .. include:: ../../../acceptance_tests/README.rst | |
13 |
|
13 | |||
14 |
|
14 | |||
15 |
|
15 | |||
16 | Choices of technology and tools |
|
16 | Choices of technology and tools | |
17 | =============================== |
|
17 | =============================== | |
18 |
|
18 | |||
19 |
|
19 | |||
20 | `nix` as runtime environment |
|
20 | `nix` as runtime environment | |
21 | ---------------------------- |
|
21 | ---------------------------- | |
22 |
|
22 | |||
23 | We settled on the `nix` tools to provide the needed environment for
|
23 | We settled on the `nix` tools to provide the needed environment for |
24 | running the tests. |
|
24 | running the tests. | |
25 |
|
25 | |||
26 |
|
26 | |||
27 |
|
27 | |||
28 | `Gherkin` as specification language
|
28 | `Gherkin` as specification language |
29 | ------------------------------------ |
|
29 | ------------------------------------ | |
30 |
|
30 | |||
31 | To specify by example, we settled on Gherkin as the semi-formal specification
|
31 | To specify by example, we settled on Gherkin as the semi-formal specification |
32 | language. |
|
32 | language. | |
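For illustration, a specification in this language is a plain-text ``.feature`` file of the following shape (the feature and step names here are invented, not taken from :file:`acceptance_tests`):

```gherkin
Feature: Repository creation
  Scenario: A logged-in user creates a new repository
    Given the user "admin" is logged in
    When the user creates a repository named "test-repo"
    Then the repository "test-repo" shows up in the repository list
```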
33 |
|
33 | |||
34 |
|
34 | |||
35 | `py.test` as a runner |
|
35 | `py.test` as a runner | |
36 | --------------------- |
|
36 | --------------------- | |
37 |
|
37 | |||
38 | After experimenting with `behave` and `py.test`, our choice was `pytest-bdd`
|
38 | After experimenting with `behave` and `py.test`, our choice was `pytest-bdd` |
39 | because it allows us to use our existing knowledge about `py.test` and avoids |
|
39 | because it allows us to use our existing knowledge about `py.test` and avoids | |
40 | that we have to learn another tool. |
|
40 | that we have to learn another tool. | |
41 |
|
41 | |||
42 |
|
42 | |||
43 |
|
43 | |||
44 | Concepts |
|
44 | Concepts | |
45 | ======== |
|
45 | ======== | |
46 |
|
46 | |||
47 | The logic is structured around the design pattern of "page objects". The |
|
47 | The logic is structured around the design pattern of "page objects". The | |
48 | documentation of `python-selenium` contains a few more details about this
|
48 | documentation of `python-selenium` contains a few more details about this |
49 | pattern. |
|
49 | pattern. | |
50 |
|
50 | |||
51 |
|
51 | |||
52 |
|
52 | |||
53 | Page Objects |
|
53 | Page Objects | |
54 | ------------ |
|
54 | ------------ | |
55 |
|
55 | |||
56 | We introduce an abstraction class for every page which we have to interact with |
|
56 | We introduce an abstraction class for every page which we have to interact with | |
57 | in order to validate the specifications. |
|
57 | in order to validate the specifications. | |
58 |
|
58 | |||
59 | The implementation for the page objects is inside of the module |
|
59 | The implementation for the page objects is inside of the module | |
60 | :mod:`page_objects`. The class :class:`page_objects.base.BasePage` should be |
|
60 | :mod:`page_objects`. The class :class:`page_objects.base.BasePage` should be | |
61 | used as a base for all page object implementations. |
|
61 | used as a base for all page object implementations. | |
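A minimal sketch of the pattern (the class and locator names are invented, and a trivial fake driver stands in for the real Selenium WebDriver, so this is illustration rather than the actual :mod:`page_objects` API):

```python
class LoginPageLocators(object):
    # Pure data: how to find elements in the DOM, no logic.
    USERNAME_INPUT = ("id", "username")
    SUBMIT_BUTTON = ("css selector", "#login input[type=submit]")

class BasePage(object):
    def __init__(self, driver):
        self.driver = driver

class LoginPage(BasePage):
    locators = LoginPageLocators

    def login(self, username):
        # The page object interacts with the page only through its locators.
        self.driver.type(self.locators.USERNAME_INPUT, username)
        self.driver.click(self.locators.SUBMIT_BUTTON)

class FakeDriver(object):
    # Stand-in for a Selenium WebDriver; records actions for illustration.
    def __init__(self):
        self.actions = []

    def type(self, locator, text):
        self.actions.append(("type", locator, text))

    def click(self, locator):
        self.actions.append(("click", locator))

driver = FakeDriver()
LoginPage(driver).login("admin")
```

Tests then talk to ``LoginPage`` instead of raw locators, so a template change touches only the locator class.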
62 |
|
62 | |||
63 |
|
63 | |||
64 |
|
64 | |||
65 | Locators |
|
65 | Locators | |
66 | -------- |
|
66 | -------- | |
67 |
|
67 | |||
68 | The specific information about how to locate an element inside of the DOM tree of a
|
68 | The specific information about how to locate an element inside of the DOM tree of a |
69 |
page is kept in a separate class. This class serves mainly as a data container |
|
69 | page is kept in a separate class. This class serves mainly as a data container; | |
70 | it shall not contain any logic. |
|
70 | it shall not contain any logic. | |
71 |
|
71 | |||
72 | The reason for keeping the locators separate is that we expect a frequent need |
|
72 | The reason for keeping the locators separate is that we expect a frequent need | |
73 | for change whenever we work on our templates. In such a case it is more |
|
73 | for change whenever we work on our templates. In such a case, it is more | |
74 |
efficient to have all locators together and update them there instead of |
|
74 | efficient to have all of the locators together and update them there instead of |
75 |
to find |
|
75 | having to find every locator inside of the logic of a page object. |
@@ -1,61 +1,61 b'' | |||||
1 |
|
1 | |||
2 | .. _test-unit-and-functional: |
|
2 | .. _test-unit-and-functional: | |
3 |
|
3 | |||
4 | =========================== |
|
4 | =========================== | |
5 | Unit and Functional Tests |
|
5 | Unit and Functional Tests | |
6 | =========================== |
|
6 | =========================== | |
7 |
|
7 | |||
8 |
|
8 | |||
9 |
|
9 | |||
10 | py.test based test suite |
|
10 | py.test based test suite | |
11 | ======================== |
|
11 | ======================== | |
12 |
|
12 | |||
13 |
|
13 | |||
14 | The test suite is in the folder :file:`rhodecode/tests/` and should be run with |
|
14 | The test suite is in the folder :file:`rhodecode/tests/` and should be run with | |
15 | the test runner `py.test` inside of your `nix-shell` environment:: |
|
15 | the test runner `py.test` inside of your `nix-shell` environment:: | |
16 |
|
16 | |||
17 | # In case you need the cythonized version |
|
17 | # In case you need the cythonized version | |
18 | CYTHONIZE=1 python setup.py develop --prefix=$tmp_path |
|
18 | CYTHONIZE=1 python setup.py develop --prefix=$tmp_path | |
19 |
|
19 | |||
20 | py.test rhodecode |
|
20 | py.test rhodecode | |
21 |
|
21 | |||
22 |
|
22 | |||
23 |
|
23 | |||
24 | py.test integration |
|
24 | py.test integration | |
25 | ------------------- |
|
25 | ------------------- | |
26 |
|
26 | |||
27 | The integration with the test runner is based on the following three parts: |
|
27 | The integration with the test runner is based on the following three parts: | |
28 |
|
28 | |||
29 | - `pytest_pylons` is a py.test plugin which does the integration with the |
|
29 | - `pytest_pylons` is a py.test plugin which does the integration with the | |
30 |
Pylons web framework. It sets up the Pylons environment based on |
|
30 | Pylons web framework. It sets up the Pylons environment based on the given ini | |
31 | file. |
|
31 | file. | |
32 |
|
32 | |||
33 | Tests which depend on the Pylons environment to be set up must request the |
|
33 | Tests which depend on the Pylons environment to be set up must request the | |
34 | fixture `pylonsapp`. |
|
34 | fixture `pylonsapp`. | |
35 |
|
35 | |||
36 | - :file:`rhodecode/tests/plugin.py` contains the integration of py.test with |
|
36 | - :file:`rhodecode/tests/plugin.py` contains the integration of py.test with | |
37 | RhodeCode Enterprise itself. |
|
37 | RhodeCode Enterprise itself. | |
38 |
|
38 | |||
39 | - :file:`conftest.py` plugins are used to provide a special integration for |
|
39 | - :file:`conftest.py` plugins are used to provide a special integration for | |
40 | certain groups of tests based on the directory location. |
|
40 | certain groups of tests based on the directory location. | |
41 |
|
41 | |||
42 |
|
42 | |||
43 |
|
43 | |||
44 | VCS backend selection |
|
44 | VCS backend selection | |
45 | --------------------- |
|
45 | --------------------- | |
46 |
|
46 | |||
47 | The py.test integration provides a parameter `--backends`. It will skip all |
|
47 | The py.test integration provides a parameter `--backends`. It will skip all | |
48 | tests which are marked for other backends. |
|
48 | tests which are marked for other backends. | |
49 |
|
49 | |||
50 | To run only Subversion tests:: |
|
50 | To run only Subversion tests:: | |
51 |
|
51 | |||
52 | py.test rhodecode --backends=svn |
|
52 | py.test rhodecode --backends=svn | |
53 |
|
53 | |||
54 |
|
54 | |||
55 |
|
55 | |||
56 | Frontend / Styling support |
|
56 | Frontend / Styling support | |
57 | ========================== |
|
57 | ========================== | |
58 |
|
58 | |||
59 | All relevant style components have an example inside of the "Style" section |
|
59 | All relevant style components have an example inside of the "Style" section | |
60 | within the application. Enable the setting `debug_style` to make this section |
|
60 | within the application. Enable the setting `debug_style` to make this section | |
61 | visible in your local instance of the application. |
|
61 | visible in your local instance of the application. |
@@ -1,82 +1,83 b'' | |||||
1 | .. _rhodecode-release-notes-ref: |
|
1 | .. _rhodecode-release-notes-ref: | |
2 |
|
2 | |||
3 | Release Notes |
|
3 | Release Notes | |
4 | ============= |
|
4 | ============= | |
5 |
|
5 | |||
6 | |RCE| 4.x Versions |
|
6 | |RCE| 4.x Versions | |
7 | ------------------ |
|
7 | ------------------ | |
8 |
|
8 | |||
9 | .. toctree:: |
|
9 | .. toctree:: | |
10 | :maxdepth: 1 |
|
10 | :maxdepth: 1 | |
11 |
|
11 | |||
|
12 | release-notes-4.2.0.rst | |||
12 | release-notes-4.1.2.rst |
|
13 | release-notes-4.1.2.rst | |
13 | release-notes-4.1.1.rst |
|
14 | release-notes-4.1.1.rst | |
14 | release-notes-4.1.0.rst |
|
15 | release-notes-4.1.0.rst | |
15 | release-notes-4.0.1.rst |
|
16 | release-notes-4.0.1.rst | |
16 | release-notes-4.0.0.rst |
|
17 | release-notes-4.0.0.rst | |
17 |
|
18 | |||
18 | |RCE| 3.x Versions |
|
19 | |RCE| 3.x Versions | |
19 | ------------------ |
|
20 | ------------------ | |
20 |
|
21 | |||
21 | .. toctree:: |
|
22 | .. toctree:: | |
22 | :maxdepth: 1 |
|
23 | :maxdepth: 1 | |
23 |
|
24 | |||
24 | release-notes-3.8.4.rst |
|
25 | release-notes-3.8.4.rst | |
25 | release-notes-3.8.3.rst |
|
26 | release-notes-3.8.3.rst | |
26 | release-notes-3.8.2.rst |
|
27 | release-notes-3.8.2.rst | |
27 | release-notes-3.8.1.rst |
|
28 | release-notes-3.8.1.rst | |
28 | release-notes-3.8.0.rst |
|
29 | release-notes-3.8.0.rst | |
29 | release-notes-3.7.1.rst |
|
30 | release-notes-3.7.1.rst | |
30 | release-notes-3.7.0.rst |
|
31 | release-notes-3.7.0.rst | |
31 | release-notes-3.6.1.rst |
|
32 | release-notes-3.6.1.rst | |
32 | release-notes-3.6.0.rst |
|
33 | release-notes-3.6.0.rst | |
33 | release-notes-3.5.2.rst |
|
34 | release-notes-3.5.2.rst | |
34 | release-notes-3.5.1.rst |
|
35 | release-notes-3.5.1.rst | |
35 | release-notes-3.5.0.rst |
|
36 | release-notes-3.5.0.rst | |
36 | release-notes-3.4.1.rst |
|
37 | release-notes-3.4.1.rst | |
37 | release-notes-3.4.0.rst |
|
38 | release-notes-3.4.0.rst | |
38 | release-notes-3.3.4.rst |
|
39 | release-notes-3.3.4.rst | |
39 | release-notes-3.3.3.rst |
|
40 | release-notes-3.3.3.rst | |
40 | release-notes-3.3.2.rst |
|
41 | release-notes-3.3.2.rst | |
41 | release-notes-3.3.1.rst |
|
42 | release-notes-3.3.1.rst | |
42 | release-notes-3.3.0.rst |
|
43 | release-notes-3.3.0.rst | |
43 | release-notes-3.2.3.rst |
|
44 | release-notes-3.2.3.rst | |
44 | release-notes-3.2.2.rst |
|
45 | release-notes-3.2.2.rst | |
45 | release-notes-3.2.1.rst |
|
46 | release-notes-3.2.1.rst | |
46 | release-notes-3.2.0.rst |
|
47 | release-notes-3.2.0.rst | |
47 | release-notes-3.1.1.rst |
|
48 | release-notes-3.1.1.rst | |
48 | release-notes-3.1.0.rst |
|
49 | release-notes-3.1.0.rst | |
49 | release-notes-3.0.2.rst |
|
50 | release-notes-3.0.2.rst | |
50 | release-notes-3.0.1.rst |
|
51 | release-notes-3.0.1.rst | |
51 | release-notes-3.0.0.rst |
|
52 | release-notes-3.0.0.rst | |
52 |
|
53 | |||
53 | |RCE| 2.x Versions |
|
54 | |RCE| 2.x Versions | |
54 | ------------------ |
|
55 | ------------------ | |
55 |
|
56 | |||
56 | .. toctree:: |
|
57 | .. toctree:: | |
57 | :maxdepth: 1 |
|
58 | :maxdepth: 1 | |
58 |
|
59 | |||
59 | release-notes-2.2.8.rst |
|
60 | release-notes-2.2.8.rst | |
60 | release-notes-2.2.7.rst |
|
61 | release-notes-2.2.7.rst | |
61 | release-notes-2.2.6.rst |
|
62 | release-notes-2.2.6.rst | |
62 | release-notes-2.2.5.rst |
|
63 | release-notes-2.2.5.rst | |
63 | release-notes-2.2.4.rst |
|
64 | release-notes-2.2.4.rst | |
64 | release-notes-2.2.3.rst |
|
65 | release-notes-2.2.3.rst | |
65 | release-notes-2.2.2.rst |
|
66 | release-notes-2.2.2.rst | |
66 | release-notes-2.2.1.rst |
|
67 | release-notes-2.2.1.rst | |
67 | release-notes-2.2.0.rst |
|
68 | release-notes-2.2.0.rst | |
68 | release-notes-2.1.0.rst |
|
69 | release-notes-2.1.0.rst | |
69 | release-notes-2.0.2.rst |
|
70 | release-notes-2.0.2.rst | |
70 | release-notes-2.0.1.rst |
|
71 | release-notes-2.0.1.rst | |
71 | release-notes-2.0.0.rst |
|
72 | release-notes-2.0.0.rst | |
72 |
|
73 | |||
73 | |RCE| 1.x Versions |
|
74 | |RCE| 1.x Versions | |
74 | ------------------ |
|
75 | ------------------ | |
75 |
|
76 | |||
76 | .. toctree:: |
|
77 | .. toctree:: | |
77 | :maxdepth: 1 |
|
78 | :maxdepth: 1 | |
78 |
|
79 | |||
79 | release-notes-1.7.2.rst |
|
80 | release-notes-1.7.2.rst | |
80 | release-notes-1.7.1.rst |
|
81 | release-notes-1.7.1.rst | |
81 | release-notes-1.7.0.rst |
|
82 | release-notes-1.7.0.rst | |
82 | release-notes-1.6.0.rst |
|
83 | release-notes-1.6.0.rst |
@@ -1,163 +1,160 b'' | |||||
1 | # Utility to generate the license information |
|
1 | # Utility to generate the license information | |
2 | # |
|
2 | # | |
3 | # Usage: |
|
3 | # Usage: | |
4 | # |
|
4 | # | |
5 | # nix-build -I ~/dev license.nix -A result |
|
5 | # nix-build -I ~/dev license.nix -A result | |
6 | # |
|
6 | # | |
7 | # Afterwards ./result will contain the license information as JSON files. |
|
7 | # Afterwards ./result will contain the license information as JSON files. | |
8 | # |
|
8 | # | |
9 | # |
|
9 | # | |
10 | # Overview |
|
10 | # Overview | |
11 | # |
|
11 | # | |
12 | # Uses two steps to get the relevant license information: |
|
12 | # Uses two steps to get the relevant license information: | |
13 | # |
|
13 | # | |
14 | # 1. Walk down the derivations based on "buildInputs" and |
|
14 | # 1. Walk down the derivations based on "buildInputs" and | |
15 | # "propagatedBuildInputs". This results in all dependencies based on the nix |
|
15 | # "propagatedBuildInputs". This results in all dependencies based on the nix | |
16 | # declarations.
|
16 | # declarations. |
17 | # |
|
17 | # | |
18 | # 2. Build Enterprise and query nix-store to get a list of runtime |
|
18 | # 2. Build Enterprise and query nix-store to get a list of runtime | |
19 | # dependencies. The results from step 1 are then limited to the ones which |
|
19 | # dependencies. The results from step 1 are then limited to the ones which | |
20 | # are in this list. |
|
20 | # are in this list. | |
21 | # |
|
21 | # | |
22 | # The result is then available in ./result/license.json. |
|
22 | # The result is then available in ./result/license.json. | |
23 | # |
|
23 | # | |
24 |
|
24 | |||
25 |
|
25 | |||
26 | let |
|
26 | let | |
27 |
|
27 | |||
28 | nixpkgs = import <nixpkgs> {}; |
|
28 | nixpkgs = import <nixpkgs> {}; | |
29 |
|
29 | |||
30 | stdenv = nixpkgs.stdenv; |
|
30 | stdenv = nixpkgs.stdenv; | |
31 |
|
31 | |||
32 | # Build Enterprise as simply as possible; the goal here is just to identify the runtime
|
32 | # Build Enterprise as simply as possible; the goal here is just to identify the runtime |
   # dependencies. Ideally we could avoid building Enterprise at all and somehow
   # figure it out without calling into nix-store.
   enterprise = import ./default.nix {
     doCheck = false;
+    with_vcsserver = false;
+    with_pyramid = false;
+    cythonize = false;
   };

   # For a given derivation, return the list of all dependencies
   drvToDependencies = drv: nixpkgs.lib.flatten [
     drv.nativeBuildInputs or []
     drv.propagatedNativeBuildInputs or []
   ];

   # Transform the given derivation into the meta information which we need in
   # the resulting JSON files.
   drvToMeta = drv: {
     name = drv.name or "UNNAMED";
     license = if drv ? meta.license then drv.meta.license else "UNKNOWN";
   };

   # Walk the tree of buildInputs and propagatedBuildInputs and return it as a
   # flat list. Duplicates are avoided.
   listDrvDependencies = drv: let
     addElement = element: seen:
       if (builtins.elem element seen)
       then seen
       else let
         newSeen = seen ++ [ element ];
         newDeps = drvToDependencies element;
       in nixpkgs.lib.fold addElement newSeen newDeps;
     initialElements = drvToDependencies drv;
   in nixpkgs.lib.fold addElement [] initialElements;
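The fold in `listDrvDependencies` is a depth-first walk with an explicit "seen" list, so a dependency shared by several packages is collected only once. A rough Python equivalent of that walk is sketched below; the graph is invented for illustration (the real inputs are derivation attribute sets), and since `nixpkgs.lib.fold` is a right fold the real traversal order may differ, only membership matters here.

```python
# Hedged sketch of listDrvDependencies: depth-first walk with an
# explicit "seen" list so shared dependencies are collected only once.
# The graph below is invented; the real inputs are derivation sets.

def direct_deps(graph, node):
    """Counterpart of drvToDependencies: the direct dependencies only."""
    return graph.get(node, [])

def list_all_deps(graph, root):
    """Flatten the dependency tree of root, skipping duplicates."""
    seen = []

    def add_element(element):
        if element in seen:
            return
        seen.append(element)
        for dep in direct_deps(graph, element):
            add_element(dep)

    for dep in direct_deps(graph, root):
        add_element(dep)
    return seen

graph = {
    "enterprise": ["pyramid", "lxml"],
    "pyramid": ["translationstring", "venusian"],
    "venusian": ["translationstring"],  # shared dep, must appear once
    "lxml": ["libxml2"],
}
print(list_all_deps(graph, "enterprise"))
# -> ['pyramid', 'translationstring', 'venusian', 'lxml', 'libxml2']
```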

   # Reads in a file with store paths and returns a list of derivation names.
   #
   # Reads the file, splits the lines, then removes the prefix, so that we
   # end up with a list of derivation names in the end.
   storePathsToDrvNames = srcPath: let
     rawStorePaths = nixpkgs.lib.removeSuffix "\n" (
       builtins.readFile srcPath);
     storePaths = nixpkgs.lib.splitString "\n" rawStorePaths;
     # TODO: johbo: Would be nice to use some sort of utility here to convert
     # the path to a derivation name.
     storePathPrefix = (
       builtins.stringLength "/nix/store/zwy7aavnif9ayw30rya1k6xiacafzzl6-");
     storePathToName = path:
       builtins.substring storePathPrefix (builtins.stringLength path) path;
   in (map storePathToName storePaths);
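`storePathsToDrvNames` relies on the fact that every store path starts with `/nix/store/`, a 32-character hash, and a dash, so the prefix has a fixed width and can simply be cut off to leave the derivation name. A Python sketch of the same idea (the sample hash below is made up):

```python
# Hedged sketch of storePathsToDrvNames: the "/nix/store/<hash>-" prefix
# is the same width for every path, so a fixed-length cut yields the
# derivation name. The sample hash below is made up.

STORE_PATH_PREFIX = len("/nix/store/zwy7aavnif9ayw30rya1k6xiacafzzl6-")  # 44

def store_paths_to_drv_names(text):
    """Split a nix-store output file into lines and strip the store prefix."""
    raw = text.rstrip("\n")
    return [path[STORE_PATH_PREFIX:] for path in raw.split("\n")]

sample = "/nix/store/0123456789abcdfghijklmnpqrsvwxyz-python-2.7.11\n"
print(store_paths_to_drv_names(sample))
# -> ['python-2.7.11']
```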

 in rec {

   # Build Enterprise and call nix-store to retrieve the runtime
   # dependencies. The result is available in the nix store.
   runtimeDependencies = stdenv.mkDerivation {
     name = "runtime-dependencies";
     buildInputs = [
       # Needed to query the store
       nixpkgs.nix
     ];
     unpackPhase = ''
       echo "Nothing to unpack"
     '';
     buildPhase = ''
       # Get a list of runtime dependencies
       nix-store -q --references ${enterprise} > nix-store-references
     '';
     installPhase = ''
       mkdir -p $out
       cp -v nix-store-references $out/
     '';
   };

   # Produce the license overview files.
   result = let

     # Dependencies according to the nix-store
     runtimeDependencyNames = (
       storePathsToDrvNames "${runtimeDependencies}/nix-store-references");

     # Dependencies based on buildInputs and propagatedBuildInputs
     enterpriseAllDependencies = listDrvDependencies enterprise;
     enterpriseRuntimeDependencies = let
       elemName = element: element.name or "UNNAMED";
       isRuntime = element: builtins.elem (elemName element) runtimeDependencyNames;
     in builtins.filter isRuntime enterpriseAllDependencies;

     # Extract relevant meta information
     enterpriseAllLicenses = map drvToMeta enterpriseAllDependencies;
     enterpriseRuntimeLicenses = map drvToMeta enterpriseRuntimeDependencies;

   in stdenv.mkDerivation {

     name = "licenses";

     buildInputs = [];

     unpackPhase = ''
       echo "Nothing to unpack"
     '';

     buildPhase = ''
       mkdir build

       # Copy list of runtime dependencies for the Python processor
       cp "${runtimeDependencies}/nix-store-references" ./build/nix-store-references

       # All licenses which we found by walking buildInputs and
       # propagatedBuildInputs
       cat > build/all-licenses.json <<EOF
       ${builtins.toJSON enterpriseAllLicenses}
       EOF

       # License information for our runtime dependencies only. Basically all
       # licenses limited to the items which were also reported by nix-store as
       # a dependency.
       cat > build/licenses.json <<EOF
       ${builtins.toJSON enterpriseRuntimeLicenses}
       EOF
     '';

     installPhase = ''
       mkdir -p $out

       # Store it all, that helps when things go wrong
       cp -rv ./build/* $out
     '';
   };

 }
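The `result` derivation narrows `enterpriseAllDependencies` to the runtime set by keeping only those entries whose `name` also appears in the list that nix-store reported. A minimal Python sketch of that filter; the package names and licenses below are invented for illustration:

```python
# Hedged sketch of the isRuntime filter in `result`: of all walked
# dependencies, keep those whose name nix-store also reported.
# Package names and licenses below are invented for illustration.

def runtime_only(all_deps, runtime_names):
    """Keep entries whose name is in the nix-store report."""
    def elem_name(dep):
        return dep.get("name", "UNNAMED")
    return [dep for dep in all_deps if elem_name(dep) in runtime_names]

all_deps = [
    {"name": "pyramid-1.6", "license": "Repoze License"},
    {"name": "pytest-runner-2.6", "license": "MIT"},  # build-time only
    {"name": "lxml-3.4", "license": "BSD"},
]
runtime_names = ["pyramid-1.6", "lxml-3.4"]
print([dep["name"] for dep in runtime_only(all_deps, runtime_names)])
# -> ['pyramid-1.6', 'lxml-3.4']
```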
@@ -1,165 +1,273 @@
 # Overrides for the generated python-packages.nix
 #
 # This function is intended to be used as an extension to the generated file
 # python-packages.nix. The main objective is to add needed dependencies of C
 # libraries and tweak the build instructions where needed.

 { pkgs, basePythonPackages }:

 let
   sed = "sed -i";
+  localLicenses = {
+    repoze = {
+      fullName = "Repoze License";
+      url = http://www.repoze.org/LICENSE.txt;
+    };
+  };
 in

 self: super: {

+  appenlight-client = super.appenlight-client.override (attrs: {
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
+  });
+
+  future = super.future.override (attrs: {
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
+  });
+
   gnureadline = super.gnureadline.override (attrs: {
     buildInputs = attrs.buildInputs ++ [
       pkgs.ncurses
     ];
     patchPhase = ''
       substituteInPlace setup.py --replace "/bin/bash" "${pkgs.bash}/bin/bash"
     '';
   });

   gunicorn = super.gunicorn.override (attrs: {
     propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
       # johbo: futures is needed as long as we are on Python 2, otherwise
       # gunicorn explodes if used with multiple threads per worker.
       self.futures
     ];
   });

   ipython = super.ipython.override (attrs: {
     propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
       self.gnureadline
     ];
   });

   kombu = super.kombu.override (attrs: {
     # The current version of kombu needs some patching to work with the
     # other libs. Should be removed once we update celery and kombu.
     patches = [
       ./patch-kombu-py-2-7-11.diff
       ./patch-kombu-msgpack.diff
     ];
   });

   lxml = super.lxml.override (attrs: {
     buildInputs = with self; [
       pkgs.libxml2
       pkgs.libxslt
     ];
   });

   MySQL-python = super.MySQL-python.override (attrs: {
     buildInputs = attrs.buildInputs ++ [
       pkgs.openssl
     ];
     propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
       pkgs.mysql.lib
       pkgs.zlib
     ];
   });

   psutil = super.psutil.override (attrs: {
     buildInputs = attrs.buildInputs ++
       pkgs.lib.optional pkgs.stdenv.isDarwin pkgs.darwin.IOKit;
   });

   psycopg2 = super.psycopg2.override (attrs: {
     buildInputs = attrs.buildInputs ++
       pkgs.lib.optional pkgs.stdenv.isDarwin pkgs.openssl;
     propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
       pkgs.postgresql
     ];
+    meta = {
+      license = pkgs.lib.licenses.lgpl3Plus;
+    };
   });

   pycurl = super.pycurl.override (attrs: {
     propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
       pkgs.curl
       pkgs.openssl
     ];
     preConfigure = ''
       substituteInPlace setup.py --replace '--static-libs' '--libs'
       export PYCURL_SSL_LIBRARY=openssl
     '';
+    meta = {
+      # TODO: It is LGPL and MIT
+      license = pkgs.lib.licenses.mit;
+    };
   });

   Pylons = super.Pylons.override (attrs: {
     name = "Pylons-1.0.1-patch1";
     src = pkgs.fetchgit {
       url = "https://code.rhodecode.com/upstream/pylons";
       rev = "707354ee4261b9c10450404fc9852ccea4fd667d";
       sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e";
     };
   });

   pyramid = super.pyramid.override (attrs: {
     postFixup = ''
       wrapPythonPrograms
       # TODO: johbo: "wrapPython" adds this magic line which
       # confuses pserve.
       ${sed} '/import sys; sys.argv/d' $out/bin/.pserve-wrapped
     '';
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
+  pyramid-debugtoolbar = super.pyramid-debugtoolbar.override (attrs: {
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal localLicenses.repoze ];
+    };
   });

   Pyro4 = super.Pyro4.override (attrs: {
     # TODO: Was not able to generate this version, needs further
     # investigation.
     name = "Pyro4-4.35";
     src = pkgs.fetchurl {
       url = "https://pypi.python.org/packages/source/P/Pyro4/Pyro4-4.35.src.tar.gz";
       md5 = "cbe6cb855f086a0f092ca075005855f3";
     };
   });

   pysqlite = super.pysqlite.override (attrs: {
     propagatedBuildInputs = [
       pkgs.sqlite
     ];
+    meta = {
+      license = [ pkgs.lib.licenses.zlib pkgs.lib.licenses.libpng ];
+    };
   });

   pytest-runner = super.pytest-runner.override (attrs: {
     propagatedBuildInputs = [
       self.setuptools-scm
     ];
   });

   python-ldap = super.python-ldap.override (attrs: {
     propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
       pkgs.cyrus_sasl
       pkgs.openldap
       pkgs.openssl
     ];
     NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl}/include/sasl";
   });

   python-pam = super.python-pam.override (attrs:
     let
       includeLibPam = pkgs.stdenv.isLinux;
     in {
       # TODO: johbo: Move the option up into the default.nix, we should
       # include python-pam only on supported platforms.
       propagatedBuildInputs = attrs.propagatedBuildInputs ++
         pkgs.lib.optional includeLibPam [
           pkgs.pam
         ];
       # TODO: johbo: Check if this can be avoided, or transform into
       # a real patch
       patchPhase = pkgs.lib.optionals includeLibPam ''
         substituteInPlace pam.py \
           --replace 'find_library("pam")' '"${pkgs.pam}/lib/libpam.so.0"'
       '';
   });

   rhodecode-tools = super.rhodecode-tools.override (attrs: {
     patches = [
       ./patch-rhodecode-tools-setup.diff
     ];
   });

+  URLObject = super.URLObject.override (attrs: {
+    meta = {
+      license = {
+        spdxId = "Unlicense";
+        fullName = "The Unlicense";
+        url = http://unlicense.org/;
+      };
+    };
+  });
+
+  amqplib = super.amqplib.override (attrs: {
+    meta = {
+      license = pkgs.lib.licenses.lgpl3;
+    };
+  });
+
+  docutils = super.docutils.override (attrs: {
+    meta = {
+      license = pkgs.lib.licenses.bsd2;
+    };
+  });
+
+  colander = super.colander.override (attrs: {
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
+  pyramid-beaker = super.pyramid-beaker.override (attrs: {
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
+  pyramid-mako = super.pyramid-mako.override (attrs: {
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
+  repoze.lru = super.repoze.lru.override (attrs: {
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
+  recaptcha-client = super.recaptcha-client.override (attrs: {
+    meta = {
+      # TODO: It is MIT/X11
+      license = pkgs.lib.licenses.mit;
+    };
+  });
+
+  python-editor = super.python-editor.override (attrs: {
+    meta = {
+      license = pkgs.lib.licenses.asl20;
+    };
+  });
+
+  translationstring = super.translationstring.override (attrs: {
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
+  venusian = super.venusian.override (attrs: {
+    meta = {
+      license = localLicenses.repoze;
+    };
+  });
+
   # Avoid that setuptools is replaced, this leads to trouble
   # with buildPythonPackage.
   setuptools = basePythonPackages.setuptools;

 }
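The overrides above attach licenses in two shapes: nixpkgs attributes such as `pkgs.lib.licenses.mit` (which carry `fullName`, `spdxId`, and `url`) and ad-hoc sets such as `localLicenses.repoze`. A consumer of the generated license JSON has to tolerate both; the sketch below normalizes them into one shape. The dicts mirror the overrides but are stand-ins, not the exact serialized form:

```python
# Hedged sketch: coerce the two license shapes used by the overrides
# (nixpkgs attribute sets and local dicts like localLicenses.repoze)
# into one uniform record. Field values mirror the overrides above.

def normalize_license(lic):
    """Coerce a license entry (dict or bare string) into one shape."""
    if not isinstance(lic, dict):
        return {"fullName": str(lic), "spdxId": None, "url": None}
    return {
        "fullName": lic.get("fullName", "UNKNOWN"),
        "spdxId": lic.get("spdxId"),
        "url": lic.get("url"),
    }

repoze = {"fullName": "Repoze License", "url": "http://www.repoze.org/LICENSE.txt"}
unlicense = {"spdxId": "Unlicense", "fullName": "The Unlicense", "url": "http://unlicense.org/"}

print(normalize_license(repoze)["fullName"])
print(normalize_license("UNKNOWN"))
```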
@@ -1,1263 +1,1641 @@
 {
   Babel = super.buildPythonPackage {
     name = "Babel-1.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [pytz];
     src = fetchurl {
       url = "https://pypi.python.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
       md5 = "5264ceb02717843cbc9ffce8e6e06bdb";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   Beaker = super.buildPythonPackage {
     name = "Beaker-1.7.0";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/97/8e/409d2e7c009b8aa803dc9e6f239f1db7c3cdf578249087a404e7c27a505d/Beaker-1.7.0.tar.gz";
       md5 = "386be3f7fe427358881eee4622b428b3";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   CProfileV = super.buildPythonPackage {
     name = "CProfileV-1.0.6";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [bottle];
     src = fetchurl {
       url = "https://pypi.python.org/packages/eb/df/983a0b6cfd3ac94abf023f5011cb04f33613ace196e33f53c86cf91850d5/CProfileV-1.0.6.tar.gz";
       md5 = "08c7c242b6e64237bc53c5d13537e03d";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   Fabric = super.buildPythonPackage {
     name = "Fabric-1.10.0";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [paramiko];
     src = fetchurl {
       url = "https://pypi.python.org/packages/e3/5f/b6ebdb5241d5ec9eab582a5c8a01255c1107da396f849e538801d2fe64a5/Fabric-1.10.0.tar.gz";
       md5 = "2cb96473387f0e7aa035210892352f4a";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   FormEncode = super.buildPythonPackage {
     name = "FormEncode-1.2.4";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
       md5 = "6bc17fb9aed8aea198975e888e2077f4";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.psfl ];
+    };
   };
   Jinja2 = super.buildPythonPackage {
     name = "Jinja2-2.7.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [MarkupSafe];
     src = fetchurl {
       url = "https://pypi.python.org/packages/b0/73/eab0bca302d6d6a0b5c402f47ad1760dc9cb2dd14bbc1873ad48db258e4d/Jinja2-2.7.3.tar.gz";
       md5 = "b9dffd2f3b43d673802fe857c8445b1a";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   Mako = super.buildPythonPackage {
     name = "Mako-1.0.1";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [MarkupSafe];
     src = fetchurl {
       url = "https://pypi.python.org/packages/8e/a4/aa56533ecaa5f22ca92428f74e074d0c9337282933c722391902c8f9e0f8/Mako-1.0.1.tar.gz";
       md5 = "9f0aafd177b039ef67b90ea350497a54";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   Markdown = super.buildPythonPackage {
     name = "Markdown-2.6.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/62/8b/83658b5f6c220d5fcde9f9852d46ea54765d734cfbc5a9f4c05bfc36db4d/Markdown-2.6.2.tar.gz";
       md5 = "256d19afcc564dc4ce4c229bb762f7ae";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   MarkupSafe = super.buildPythonPackage {
     name = "MarkupSafe-0.23";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/c0/41/bae1254e0396c0cc8cf1751cb7d9afc90a602353695af5952530482c963f/MarkupSafe-0.23.tar.gz";
       md5 = "f5ab3deee4c37cd6a922fb81e730da6e";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   MySQL-python = super.buildPythonPackage {
     name = "MySQL-python-1.2.5";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
       md5 = "654f75b302db6ed8dc5a898c625e030c";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.gpl1 ];
+    };
   };
   Paste = super.buildPythonPackage {
     name = "Paste-2.0.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [six];
     src = fetchurl {
       url = "https://pypi.python.org/packages/d5/8d/0f8ac40687b97ff3e07ebd1369be20bdb3f93864d2dc3c2ff542edb4ce50/Paste-2.0.2.tar.gz";
       md5 = "4bfc8a7eaf858f6309d2ac0f40fc951c";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   PasteDeploy = super.buildPythonPackage {
     name = "PasteDeploy-1.5.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/0f/90/8e20cdae206c543ea10793cbf4136eb9a8b3f417e04e40a29d72d9922cbd/PasteDeploy-1.5.2.tar.gz";
       md5 = "352b7205c78c8de4987578d19431af3b";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
|
156 | }; | |||
121 | }; |
|
157 | }; | |
122 | PasteScript = super.buildPythonPackage { |
|
158 | PasteScript = super.buildPythonPackage { | |
123 | name = "PasteScript-1.7.5"; |
|
159 | name = "PasteScript-1.7.5"; | |
124 | buildInputs = with self; []; |
|
160 | buildInputs = with self; []; | |
125 | doCheck = false; |
|
161 | doCheck = false; | |
126 | propagatedBuildInputs = with self; [Paste PasteDeploy]; |
|
162 | propagatedBuildInputs = with self; [Paste PasteDeploy]; | |
127 | src = fetchurl { |
|
163 | src = fetchurl { | |
128 | url = "https://pypi.python.org/packages/a5/05/fc60efa7c2f17a1dbaeccb2a903a1e90902d92b9d00eebabe3095829d806/PasteScript-1.7.5.tar.gz"; |
|
164 | url = "https://pypi.python.org/packages/a5/05/fc60efa7c2f17a1dbaeccb2a903a1e90902d92b9d00eebabe3095829d806/PasteScript-1.7.5.tar.gz"; | |
129 | md5 = "4c72d78dcb6bb993f30536842c16af4d"; |
|
165 | md5 = "4c72d78dcb6bb993f30536842c16af4d"; | |
130 | }; |
|
166 | }; | |
|
167 | meta = { | |||
|
168 | license = [ pkgs.lib.licenses.mit ]; | |||
|
169 | }; | |||
131 | }; |
|
170 | }; | |
132 | Pygments = super.buildPythonPackage { |
|
171 | Pygments = super.buildPythonPackage { | |
133 | name = "Pygments-2.0.2"; |
|
172 | name = "Pygments-2.0.2"; | |
134 | buildInputs = with self; []; |
|
173 | buildInputs = with self; []; | |
135 | doCheck = false; |
|
174 | doCheck = false; | |
136 | propagatedBuildInputs = with self; []; |
|
175 | propagatedBuildInputs = with self; []; | |
137 | src = fetchurl { |
|
176 | src = fetchurl { | |
138 | url = "https://pypi.python.org/packages/f4/c6/bdbc5a8a112256b2b6136af304dbae93d8b1ef8738ff2d12a51018800e46/Pygments-2.0.2.tar.gz"; |
|
177 | url = "https://pypi.python.org/packages/f4/c6/bdbc5a8a112256b2b6136af304dbae93d8b1ef8738ff2d12a51018800e46/Pygments-2.0.2.tar.gz"; | |
139 | md5 = "238587a1370d62405edabd0794b3ec4a"; |
|
178 | md5 = "238587a1370d62405edabd0794b3ec4a"; | |
140 | }; |
|
179 | }; | |
|
180 | meta = { | |||
|
181 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
182 | }; | |||
141 | }; |
|
183 | }; | |
142 | Pylons = super.buildPythonPackage { |
|
184 | Pylons = super.buildPythonPackage { | |
143 | name = "Pylons-1.0.1"; |
|
185 | name = "Pylons-1.0.1"; | |
144 | buildInputs = with self; []; |
|
186 | buildInputs = with self; []; | |
145 | doCheck = false; |
|
187 | doCheck = false; | |
146 | propagatedBuildInputs = with self; [Routes WebHelpers Beaker Paste PasteDeploy PasteScript FormEncode simplejson decorator nose Mako WebError WebTest Tempita MarkupSafe WebOb]; |
|
188 | propagatedBuildInputs = with self; [Routes WebHelpers Beaker Paste PasteDeploy PasteScript FormEncode simplejson decorator nose Mako WebError WebTest Tempita MarkupSafe WebOb]; | |
147 | src = fetchurl { |
|
189 | src = fetchurl { | |
148 | url = "https://pypi.python.org/packages/a2/69/b835a6bad00acbfeed3f33c6e44fa3f936efc998c795bfb15c61a79ecf62/Pylons-1.0.1.tar.gz"; |
|
190 | url = "https://pypi.python.org/packages/a2/69/b835a6bad00acbfeed3f33c6e44fa3f936efc998c795bfb15c61a79ecf62/Pylons-1.0.1.tar.gz"; | |
149 | md5 = "6cb880d75fa81213192142b07a6e4915"; |
|
191 | md5 = "6cb880d75fa81213192142b07a6e4915"; | |
150 | }; |
|
192 | }; | |
|
193 | meta = { | |||
|
194 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
195 | }; | |||
151 | }; |
|
196 | }; | |
152 | Pyro4 = super.buildPythonPackage { |
|
197 | Pyro4 = super.buildPythonPackage { | |
153 | name = "Pyro4-4.41"; |
|
198 | name = "Pyro4-4.41"; | |
154 | buildInputs = with self; []; |
|
199 | buildInputs = with self; []; | |
155 | doCheck = false; |
|
200 | doCheck = false; | |
156 | propagatedBuildInputs = with self; [serpent]; |
|
201 | propagatedBuildInputs = with self; [serpent]; | |
157 | src = fetchurl { |
|
202 | src = fetchurl { | |
158 | url = "https://pypi.python.org/packages/56/2b/89b566b4bf3e7f8ba790db2d1223852f8cb454c52cab7693dd41f608ca2a/Pyro4-4.41.tar.gz"; |
|
203 | url = "https://pypi.python.org/packages/56/2b/89b566b4bf3e7f8ba790db2d1223852f8cb454c52cab7693dd41f608ca2a/Pyro4-4.41.tar.gz"; | |
159 | md5 = "ed69e9bfafa9c06c049a87cb0c4c2b6c"; |
|
204 | md5 = "ed69e9bfafa9c06c049a87cb0c4c2b6c"; | |
160 | }; |
|
205 | }; | |
|
206 | meta = { | |||
|
207 | license = [ pkgs.lib.licenses.mit ]; | |||
|
208 | }; | |||
161 | }; |
|
209 | }; | |
162 | Routes = super.buildPythonPackage { |
|
210 | Routes = super.buildPythonPackage { | |
163 | name = "Routes-1.13"; |
|
211 | name = "Routes-1.13"; | |
164 | buildInputs = with self; []; |
|
212 | buildInputs = with self; []; | |
165 | doCheck = false; |
|
213 | doCheck = false; | |
166 | propagatedBuildInputs = with self; [repoze.lru]; |
|
214 | propagatedBuildInputs = with self; [repoze.lru]; | |
167 | src = fetchurl { |
|
215 | src = fetchurl { | |
168 | url = "https://pypi.python.org/packages/88/d3/259c3b3cde8837eb9441ab5f574a660e8a4acea8f54a078441d4d2acac1c/Routes-1.13.tar.gz"; |
|
216 | url = "https://pypi.python.org/packages/88/d3/259c3b3cde8837eb9441ab5f574a660e8a4acea8f54a078441d4d2acac1c/Routes-1.13.tar.gz"; | |
169 | md5 = "d527b0ab7dd9172b1275a41f97448783"; |
|
217 | md5 = "d527b0ab7dd9172b1275a41f97448783"; | |
170 | }; |
|
218 | }; | |
|
219 | meta = { | |||
|
220 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
221 | }; | |||
171 | }; |
|
222 | }; | |
172 | SQLAlchemy = super.buildPythonPackage { |
|
223 | SQLAlchemy = super.buildPythonPackage { | |
173 | name = "SQLAlchemy-0.9.9"; |
|
224 | name = "SQLAlchemy-0.9.9"; | |
174 | buildInputs = with self; []; |
|
225 | buildInputs = with self; []; | |
175 | doCheck = false; |
|
226 | doCheck = false; | |
176 | propagatedBuildInputs = with self; []; |
|
227 | propagatedBuildInputs = with self; []; | |
177 | src = fetchurl { |
|
228 | src = fetchurl { | |
178 | url = "https://pypi.python.org/packages/28/f7/1bbfd0d8597e8c358d5e15a166a486ad82fc5579b4e67b6ef7c05b1d182b/SQLAlchemy-0.9.9.tar.gz"; |
|
229 | url = "https://pypi.python.org/packages/28/f7/1bbfd0d8597e8c358d5e15a166a486ad82fc5579b4e67b6ef7c05b1d182b/SQLAlchemy-0.9.9.tar.gz"; | |
179 | md5 = "8a10a9bd13ed3336ef7333ac2cc679ff"; |
|
230 | md5 = "8a10a9bd13ed3336ef7333ac2cc679ff"; | |
180 | }; |
|
231 | }; | |
|
232 | meta = { | |||
|
233 | license = [ pkgs.lib.licenses.mit ]; | |||
|
234 | }; | |||
181 | }; |
|
235 | }; | |
182 | Sphinx = super.buildPythonPackage { |
|
236 | Sphinx = super.buildPythonPackage { | |
183 | name = "Sphinx-1.2.2"; |
|
237 | name = "Sphinx-1.2.2"; | |
184 | buildInputs = with self; []; |
|
238 | buildInputs = with self; []; | |
185 | doCheck = false; |
|
239 | doCheck = false; | |
186 | propagatedBuildInputs = with self; [Pygments docutils Jinja2]; |
|
240 | propagatedBuildInputs = with self; [Pygments docutils Jinja2]; | |
187 | src = fetchurl { |
|
241 | src = fetchurl { | |
188 | url = "https://pypi.python.org/packages/0a/50/34017e6efcd372893a416aba14b84a1a149fc7074537b0e9cb6ca7b7abe9/Sphinx-1.2.2.tar.gz"; |
|
242 | url = "https://pypi.python.org/packages/0a/50/34017e6efcd372893a416aba14b84a1a149fc7074537b0e9cb6ca7b7abe9/Sphinx-1.2.2.tar.gz"; | |
189 | md5 = "3dc73ccaa8d0bfb2d62fb671b1f7e8a4"; |
|
243 | md5 = "3dc73ccaa8d0bfb2d62fb671b1f7e8a4"; | |
190 | }; |
|
244 | }; | |
|
245 | meta = { | |||
|
246 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
247 | }; | |||
191 | }; |
|
248 | }; | |
192 | Tempita = super.buildPythonPackage { |
|
249 | Tempita = super.buildPythonPackage { | |
193 | name = "Tempita-0.5.2"; |
|
250 | name = "Tempita-0.5.2"; | |
194 | buildInputs = with self; []; |
|
251 | buildInputs = with self; []; | |
195 | doCheck = false; |
|
252 | doCheck = false; | |
196 | propagatedBuildInputs = with self; []; |
|
253 | propagatedBuildInputs = with self; []; | |
197 | src = fetchurl { |
|
254 | src = fetchurl { | |
198 | url = "https://pypi.python.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz"; |
|
255 | url = "https://pypi.python.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz"; | |
199 | md5 = "4c2f17bb9d481821c41b6fbee904cea1"; |
|
256 | md5 = "4c2f17bb9d481821c41b6fbee904cea1"; | |
200 | }; |
|
257 | }; | |
|
258 | meta = { | |||
|
259 | license = [ pkgs.lib.licenses.mit ]; | |||
|
260 | }; | |||
201 | }; |
|
261 | }; | |
202 | URLObject = super.buildPythonPackage { |
|
262 | URLObject = super.buildPythonPackage { | |
203 | name = "URLObject-2.4.0"; |
|
263 | name = "URLObject-2.4.0"; | |
204 | buildInputs = with self; []; |
|
264 | buildInputs = with self; []; | |
205 | doCheck = false; |
|
265 | doCheck = false; | |
206 | propagatedBuildInputs = with self; []; |
|
266 | propagatedBuildInputs = with self; []; | |
207 | src = fetchurl { |
|
267 | src = fetchurl { | |
208 | url = "https://pypi.python.org/packages/cb/b6/e25e58500f9caef85d664bec71ec67c116897bfebf8622c32cb75d1ca199/URLObject-2.4.0.tar.gz"; |
|
268 | url = "https://pypi.python.org/packages/cb/b6/e25e58500f9caef85d664bec71ec67c116897bfebf8622c32cb75d1ca199/URLObject-2.4.0.tar.gz"; | |
209 | md5 = "2ed819738a9f0a3051f31dc9924e3065"; |
|
269 | md5 = "2ed819738a9f0a3051f31dc9924e3065"; | |
210 | }; |
|
270 | }; | |
|
271 | meta = { | |||
|
272 | license = [ ]; | |||
|
273 | }; | |||
211 | }; |
|
274 | }; | |
212 | WebError = super.buildPythonPackage { |
|
275 | WebError = super.buildPythonPackage { | |
213 | name = "WebError-0.10.3"; |
|
276 | name = "WebError-0.10.3"; | |
214 | buildInputs = with self; []; |
|
277 | buildInputs = with self; []; | |
215 | doCheck = false; |
|
278 | doCheck = false; | |
216 | propagatedBuildInputs = with self; [WebOb Tempita Pygments Paste]; |
|
279 | propagatedBuildInputs = with self; [WebOb Tempita Pygments Paste]; | |
217 | src = fetchurl { |
|
280 | src = fetchurl { | |
218 | url = "https://pypi.python.org/packages/35/76/e7e5c2ce7e9c7f31b54c1ff295a495886d1279a002557d74dd8957346a79/WebError-0.10.3.tar.gz"; |
|
281 | url = "https://pypi.python.org/packages/35/76/e7e5c2ce7e9c7f31b54c1ff295a495886d1279a002557d74dd8957346a79/WebError-0.10.3.tar.gz"; | |
219 | md5 = "84b9990b0baae6fd440b1e60cdd06f9a"; |
|
282 | md5 = "84b9990b0baae6fd440b1e60cdd06f9a"; | |
220 | }; |
|
283 | }; | |
|
284 | meta = { | |||
|
285 | license = [ pkgs.lib.licenses.mit ]; | |||
|
286 | }; | |||
221 | }; |
|
287 | }; | |
222 | WebHelpers = super.buildPythonPackage { |
|
288 | WebHelpers = super.buildPythonPackage { | |
223 | name = "WebHelpers-1.3"; |
|
289 | name = "WebHelpers-1.3"; | |
224 | buildInputs = with self; []; |
|
290 | buildInputs = with self; []; | |
225 | doCheck = false; |
|
291 | doCheck = false; | |
226 | propagatedBuildInputs = with self; [MarkupSafe]; |
|
292 | propagatedBuildInputs = with self; [MarkupSafe]; | |
227 | src = fetchurl { |
|
293 | src = fetchurl { | |
228 | url = "https://pypi.python.org/packages/ee/68/4d07672821d514184357f1552f2dad923324f597e722de3b016ca4f7844f/WebHelpers-1.3.tar.gz"; |
|
294 | url = "https://pypi.python.org/packages/ee/68/4d07672821d514184357f1552f2dad923324f597e722de3b016ca4f7844f/WebHelpers-1.3.tar.gz"; | |
229 | md5 = "32749ffadfc40fea51075a7def32588b"; |
|
295 | md5 = "32749ffadfc40fea51075a7def32588b"; | |
230 | }; |
|
296 | }; | |
|
297 | meta = { | |||
|
298 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
299 | }; | |||
231 | }; |
|
300 | }; | |
232 | WebHelpers2 = super.buildPythonPackage { |
|
301 | WebHelpers2 = super.buildPythonPackage { | |
233 | name = "WebHelpers2-2.0"; |
|
302 | name = "WebHelpers2-2.0"; | |
234 | buildInputs = with self; []; |
|
303 | buildInputs = with self; []; | |
235 | doCheck = false; |
|
304 | doCheck = false; | |
236 | propagatedBuildInputs = with self; [MarkupSafe six]; |
|
305 | propagatedBuildInputs = with self; [MarkupSafe six]; | |
237 | src = fetchurl { |
|
306 | src = fetchurl { | |
238 | url = "https://pypi.python.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz"; |
|
307 | url = "https://pypi.python.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz"; | |
239 | md5 = "0f6b68d70c12ee0aed48c00b24da13d3"; |
|
308 | md5 = "0f6b68d70c12ee0aed48c00b24da13d3"; | |
240 | }; |
|
309 | }; | |
|
310 | meta = { | |||
|
311 | license = [ pkgs.lib.licenses.mit ]; | |||
|
312 | }; | |||
241 | }; |
|
313 | }; | |
242 | WebOb = super.buildPythonPackage { |
|
314 | WebOb = super.buildPythonPackage { | |
243 | name = "WebOb-1.3.1"; |
|
315 | name = "WebOb-1.3.1"; | |
244 | buildInputs = with self; []; |
|
316 | buildInputs = with self; []; | |
245 | doCheck = false; |
|
317 | doCheck = false; | |
246 | propagatedBuildInputs = with self; []; |
|
318 | propagatedBuildInputs = with self; []; | |
247 | src = fetchurl { |
|
319 | src = fetchurl { | |
248 | url = "https://pypi.python.org/packages/16/78/adfc0380b8a0d75b2d543fa7085ba98a573b1ae486d9def88d172b81b9fa/WebOb-1.3.1.tar.gz"; |
|
320 | url = "https://pypi.python.org/packages/16/78/adfc0380b8a0d75b2d543fa7085ba98a573b1ae486d9def88d172b81b9fa/WebOb-1.3.1.tar.gz"; | |
249 | md5 = "20918251c5726956ba8fef22d1556177"; |
|
321 | md5 = "20918251c5726956ba8fef22d1556177"; | |
250 | }; |
|
322 | }; | |
|
323 | meta = { | |||
|
324 | license = [ pkgs.lib.licenses.mit ]; | |||
|
325 | }; | |||
251 | }; |
|
326 | }; | |
252 | WebTest = super.buildPythonPackage { |
|
327 | WebTest = super.buildPythonPackage { | |
253 | name = "WebTest-1.4.3"; |
|
328 | name = "WebTest-1.4.3"; | |
254 | buildInputs = with self; []; |
|
329 | buildInputs = with self; []; | |
255 | doCheck = false; |
|
330 | doCheck = false; | |
256 | propagatedBuildInputs = with self; [WebOb]; |
|
331 | propagatedBuildInputs = with self; [WebOb]; | |
257 | src = fetchurl { |
|
332 | src = fetchurl { | |
258 | url = "https://pypi.python.org/packages/51/3d/84fd0f628df10b30c7db87895f56d0158e5411206b721ca903cb51bfd948/WebTest-1.4.3.zip"; |
|
333 | url = "https://pypi.python.org/packages/51/3d/84fd0f628df10b30c7db87895f56d0158e5411206b721ca903cb51bfd948/WebTest-1.4.3.zip"; | |
259 | md5 = "631ce728bed92c681a4020a36adbc353"; |
|
334 | md5 = "631ce728bed92c681a4020a36adbc353"; | |
260 | }; |
|
335 | }; | |
|
336 | meta = { | |||
|
337 | license = [ pkgs.lib.licenses.mit ]; | |||
|
338 | }; | |||
261 | }; |
|
339 | }; | |
262 | Whoosh = super.buildPythonPackage { |
|
340 | Whoosh = super.buildPythonPackage { | |
263 | name = "Whoosh-2.7.0"; |
|
341 | name = "Whoosh-2.7.0"; | |
264 | buildInputs = with self; []; |
|
342 | buildInputs = with self; []; | |
265 | doCheck = false; |
|
343 | doCheck = false; | |
266 | propagatedBuildInputs = with self; []; |
|
344 | propagatedBuildInputs = with self; []; | |
267 | src = fetchurl { |
|
345 | src = fetchurl { | |
268 | url = "https://pypi.python.org/packages/1c/dc/2f0231ff3875ded36df8c1ab851451e51a237dc0e5a86d3d96036158da94/Whoosh-2.7.0.zip"; |
|
346 | url = "https://pypi.python.org/packages/1c/dc/2f0231ff3875ded36df8c1ab851451e51a237dc0e5a86d3d96036158da94/Whoosh-2.7.0.zip"; | |
269 | md5 = "7abfd970f16fadc7311960f3fa0bc7a9"; |
|
347 | md5 = "7abfd970f16fadc7311960f3fa0bc7a9"; | |
270 | }; |
|
348 | }; | |
|
349 | meta = { | |||
|
350 | license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ]; | |||
|
351 | }; | |||
271 | }; |
|
352 | }; | |
272 | alembic = super.buildPythonPackage { |
|
353 | alembic = super.buildPythonPackage { | |
273 | name = "alembic-0.8.4"; |
|
354 | name = "alembic-0.8.4"; | |
274 | buildInputs = with self; []; |
|
355 | buildInputs = with self; []; | |
275 | doCheck = false; |
|
356 | doCheck = false; | |
276 | propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor]; |
|
357 | propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor]; | |
277 | src = fetchurl { |
|
358 | src = fetchurl { | |
278 | url = "https://pypi.python.org/packages/ca/7e/299b4499b5c75e5a38c5845145ad24755bebfb8eec07a2e1c366b7181eeb/alembic-0.8.4.tar.gz"; |
|
359 | url = "https://pypi.python.org/packages/ca/7e/299b4499b5c75e5a38c5845145ad24755bebfb8eec07a2e1c366b7181eeb/alembic-0.8.4.tar.gz"; | |
279 | md5 = "5f95d8ee62b443f9b37eb5bee76c582d"; |
|
360 | md5 = "5f95d8ee62b443f9b37eb5bee76c582d"; | |
280 | }; |
|
361 | }; | |
|
362 | meta = { | |||
|
363 | license = [ pkgs.lib.licenses.mit ]; | |||
|
364 | }; | |||
281 | }; |
|
365 | }; | |
282 | amqplib = super.buildPythonPackage { |
|
366 | amqplib = super.buildPythonPackage { | |
283 | name = "amqplib-1.0.2"; |
|
367 | name = "amqplib-1.0.2"; | |
284 | buildInputs = with self; []; |
|
368 | buildInputs = with self; []; | |
285 | doCheck = false; |
|
369 | doCheck = false; | |
286 | propagatedBuildInputs = with self; []; |
|
370 | propagatedBuildInputs = with self; []; | |
287 | src = fetchurl { |
|
371 | src = fetchurl { | |
288 | url = "https://pypi.python.org/packages/75/b7/8c2429bf8d92354a0118614f9a4d15e53bc69ebedce534284111de5a0102/amqplib-1.0.2.tgz"; |
|
372 | url = "https://pypi.python.org/packages/75/b7/8c2429bf8d92354a0118614f9a4d15e53bc69ebedce534284111de5a0102/amqplib-1.0.2.tgz"; | |
289 | md5 = "5c92f17fbedd99b2b4a836d4352d1e2f"; |
|
373 | md5 = "5c92f17fbedd99b2b4a836d4352d1e2f"; | |
290 | }; |
|
374 | }; | |
|
375 | meta = { | |||
|
376 | license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |||
|
377 | }; | |||
291 | }; |
|
378 | }; | |
292 | anyjson = super.buildPythonPackage { |
|
379 | anyjson = super.buildPythonPackage { | |
293 | name = "anyjson-0.3.3"; |
|
380 | name = "anyjson-0.3.3"; | |
294 | buildInputs = with self; []; |
|
381 | buildInputs = with self; []; | |
295 | doCheck = false; |
|
382 | doCheck = false; | |
296 | propagatedBuildInputs = with self; []; |
|
383 | propagatedBuildInputs = with self; []; | |
297 | src = fetchurl { |
|
384 | src = fetchurl { | |
298 | url = "https://pypi.python.org/packages/c3/4d/d4089e1a3dd25b46bebdb55a992b0797cff657b4477bc32ce28038fdecbc/anyjson-0.3.3.tar.gz"; |
|
385 | url = "https://pypi.python.org/packages/c3/4d/d4089e1a3dd25b46bebdb55a992b0797cff657b4477bc32ce28038fdecbc/anyjson-0.3.3.tar.gz"; | |
299 | md5 = "2ea28d6ec311aeeebaf993cb3008b27c"; |
|
386 | md5 = "2ea28d6ec311aeeebaf993cb3008b27c"; | |
300 | }; |
|
387 | }; | |
|
388 | meta = { | |||
|
389 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
390 | }; | |||
301 | }; |
|
391 | }; | |
302 | appenlight-client = super.buildPythonPackage { |
|
392 | appenlight-client = super.buildPythonPackage { | |
303 | name = "appenlight-client-0.6.14"; |
|
393 | name = "appenlight-client-0.6.14"; | |
304 | buildInputs = with self; []; |
|
394 | buildInputs = with self; []; | |
305 | doCheck = false; |
|
395 | doCheck = false; | |
306 | propagatedBuildInputs = with self; [WebOb requests]; |
|
396 | propagatedBuildInputs = with self; [WebOb requests]; | |
307 | src = fetchurl { |
|
397 | src = fetchurl { | |
308 | url = "https://pypi.python.org/packages/4d/e0/23fee3ebada8143f707e65c06bcb82992040ee64ea8355e044ed55ebf0c1/appenlight_client-0.6.14.tar.gz"; |
|
398 | url = "https://pypi.python.org/packages/4d/e0/23fee3ebada8143f707e65c06bcb82992040ee64ea8355e044ed55ebf0c1/appenlight_client-0.6.14.tar.gz"; | |
309 | md5 = "578c69b09f4356d898fff1199b98a95c"; |
|
399 | md5 = "578c69b09f4356d898fff1199b98a95c"; | |
310 | }; |
|
400 | }; | |
|
401 | meta = { | |||
|
402 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "DFSG approved"; } ]; | |||
|
403 | }; | |||
311 | }; |
|
404 | }; | |
312 | authomatic = super.buildPythonPackage { |
|
405 | authomatic = super.buildPythonPackage { | |
313 | name = "authomatic-0.1.0.post1"; |
|
406 | name = "authomatic-0.1.0.post1"; | |
314 | buildInputs = with self; []; |
|
407 | buildInputs = with self; []; | |
315 | doCheck = false; |
|
408 | doCheck = false; | |
316 | propagatedBuildInputs = with self; []; |
|
409 | propagatedBuildInputs = with self; []; | |
317 | src = fetchurl { |
|
410 | src = fetchurl { | |
318 | url = "https://pypi.python.org/packages/08/1a/8a930461e604c2d5a7a871e1ac59fa82ccf994c32e807230c8d2fb07815a/Authomatic-0.1.0.post1.tar.gz"; |
|
411 | url = "https://pypi.python.org/packages/08/1a/8a930461e604c2d5a7a871e1ac59fa82ccf994c32e807230c8d2fb07815a/Authomatic-0.1.0.post1.tar.gz"; | |
319 | md5 = "be3f3ce08747d776aae6d6cc8dcb49a9"; |
|
412 | md5 = "be3f3ce08747d776aae6d6cc8dcb49a9"; | |
320 | }; |
|
413 | }; | |
|
414 | meta = { | |||
|
415 | license = [ pkgs.lib.licenses.mit ]; | |||
|
416 | }; | |||
321 | }; |
|
417 | }; | |
322 | backport-ipaddress = super.buildPythonPackage { |
|
418 | backport-ipaddress = super.buildPythonPackage { | |
323 | name = "backport-ipaddress-0.1"; |
|
419 | name = "backport-ipaddress-0.1"; | |
324 | buildInputs = with self; []; |
|
420 | buildInputs = with self; []; | |
325 | doCheck = false; |
|
421 | doCheck = false; | |
326 | propagatedBuildInputs = with self; []; |
|
422 | propagatedBuildInputs = with self; []; | |
327 | src = fetchurl { |
|
423 | src = fetchurl { | |
328 | url = "https://pypi.python.org/packages/d3/30/54c6dab05a4dec44db25ff309f1fbb6b7a8bde3f2bade38bb9da67bbab8f/backport_ipaddress-0.1.tar.gz"; |
|
424 | url = "https://pypi.python.org/packages/d3/30/54c6dab05a4dec44db25ff309f1fbb6b7a8bde3f2bade38bb9da67bbab8f/backport_ipaddress-0.1.tar.gz"; | |
329 | md5 = "9c1f45f4361f71b124d7293a60006c05"; |
|
425 | md5 = "9c1f45f4361f71b124d7293a60006c05"; | |
330 | }; |
|
426 | }; | |
|
427 | meta = { | |||
|
428 | license = [ pkgs.lib.licenses.psfl ]; | |||
|
429 | }; | |||
331 | }; |
|
430 | }; | |
332 | bottle = super.buildPythonPackage { |
|
431 | bottle = super.buildPythonPackage { | |
333 | name = "bottle-0.12.8"; |
|
432 | name = "bottle-0.12.8"; | |
334 | buildInputs = with self; []; |
|
433 | buildInputs = with self; []; | |
335 | doCheck = false; |
|
434 | doCheck = false; | |
336 | propagatedBuildInputs = with self; []; |
|
435 | propagatedBuildInputs = with self; []; | |
337 | src = fetchurl { |
|
436 | src = fetchurl { | |
338 | url = "https://pypi.python.org/packages/52/df/e4a408f3a7af396d186d4ecd3b389dd764f0f943b4fa8d257bfe7b49d343/bottle-0.12.8.tar.gz"; |
|
437 | url = "https://pypi.python.org/packages/52/df/e4a408f3a7af396d186d4ecd3b389dd764f0f943b4fa8d257bfe7b49d343/bottle-0.12.8.tar.gz"; | |
339 | md5 = "13132c0a8f607bf860810a6ee9064c5b"; |
|
438 | md5 = "13132c0a8f607bf860810a6ee9064c5b"; | |
340 | }; |
|
439 | }; | |
|
440 | meta = { | |||
|
441 | license = [ pkgs.lib.licenses.mit ]; | |||
|
442 | }; | |||
341 | }; |
|
443 | }; | |
342 | bumpversion = super.buildPythonPackage { |
|
444 | bumpversion = super.buildPythonPackage { | |
343 | name = "bumpversion-0.5.3"; |
|
445 | name = "bumpversion-0.5.3"; | |
344 | buildInputs = with self; []; |
|
446 | buildInputs = with self; []; | |
345 | doCheck = false; |
|
447 | doCheck = false; | |
346 | propagatedBuildInputs = with self; []; |
|
448 | propagatedBuildInputs = with self; []; | |
347 | src = fetchurl { |
|
449 | src = fetchurl { | |
348 | url = "https://pypi.python.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz"; |
|
450 | url = "https://pypi.python.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz"; | |
349 | md5 = "c66a3492eafcf5ad4b024be9fca29820"; |
|
451 | md5 = "c66a3492eafcf5ad4b024be9fca29820"; | |
350 | }; |
|
452 | }; | |
|
453 | meta = { | |||
|
454 | license = [ pkgs.lib.licenses.mit ]; | |||
|
455 | }; | |||
351 | }; |
|
456 | }; | |
352 | celery = super.buildPythonPackage { |
|
457 | celery = super.buildPythonPackage { | |
353 | name = "celery-2.2.10"; |
|
458 | name = "celery-2.2.10"; | |
354 | buildInputs = with self; []; |
|
459 | buildInputs = with self; []; | |
355 | doCheck = false; |
|
460 | doCheck = false; | |
356 | propagatedBuildInputs = with self; [python-dateutil anyjson kombu pyparsing]; |
|
461 | propagatedBuildInputs = with self; [python-dateutil anyjson kombu pyparsing]; | |
357 | src = fetchurl { |
|
462 | src = fetchurl { | |
358 | url = "https://pypi.python.org/packages/b1/64/860fd50e45844c83442e7953effcddeff66b2851d90b2d784f7201c111b8/celery-2.2.10.tar.gz"; |
|
463 | url = "https://pypi.python.org/packages/b1/64/860fd50e45844c83442e7953effcddeff66b2851d90b2d784f7201c111b8/celery-2.2.10.tar.gz"; | |
359 | md5 = "898bc87e54f278055b561316ba73e222"; |
|
464 | md5 = "898bc87e54f278055b561316ba73e222"; | |
360 | }; |
|
465 | }; | |
|
466 | meta = { | |||
|
467 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
468 | }; | |||
361 | }; |
|
469 | }; | |
362 | click = super.buildPythonPackage { |
|
470 | click = super.buildPythonPackage { | |
363 | name = "click-5.1"; |
|
471 | name = "click-5.1"; | |
364 | buildInputs = with self; []; |
|
472 | buildInputs = with self; []; | |
365 | doCheck = false; |
|
473 | doCheck = false; | |
366 | propagatedBuildInputs = with self; []; |
|
474 | propagatedBuildInputs = with self; []; | |
367 | src = fetchurl { |
|
475 | src = fetchurl { | |
368 | url = "https://pypi.python.org/packages/b7/34/a496632c4fb6c1ee76efedf77bb8d28b29363d839953d95095b12defe791/click-5.1.tar.gz"; |
|
476 | url = "https://pypi.python.org/packages/b7/34/a496632c4fb6c1ee76efedf77bb8d28b29363d839953d95095b12defe791/click-5.1.tar.gz"; | |
369 | md5 = "9c5323008cccfe232a8b161fc8196d41"; |
|
477 | md5 = "9c5323008cccfe232a8b161fc8196d41"; | |
370 | }; |
|
478 | }; | |
|
479 | meta = { | |||
|
480 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
481 | }; | |||
371 | }; |
|
482 | }; | |
372 | colander = super.buildPythonPackage { |
|
483 | colander = super.buildPythonPackage { | |
373 | name = "colander-1.2"; |
|
484 | name = "colander-1.2"; | |
374 | buildInputs = with self; []; |
|
485 | buildInputs = with self; []; | |
375 | doCheck = false; |
|
486 | doCheck = false; | |
376 | propagatedBuildInputs = with self; [translationstring iso8601]; |
|
487 | propagatedBuildInputs = with self; [translationstring iso8601]; | |
377 | src = fetchurl { |
|
488 | src = fetchurl { | |
378 | url = "https://pypi.python.org/packages/14/23/c9ceba07a6a1dc0eefbb215fc0dc64aabc2b22ee756bc0f0c13278fa0887/colander-1.2.tar.gz"; |
|
489 | url = "https://pypi.python.org/packages/14/23/c9ceba07a6a1dc0eefbb215fc0dc64aabc2b22ee756bc0f0c13278fa0887/colander-1.2.tar.gz"; | |
379 | md5 = "83db21b07936a0726e588dae1914b9ed"; |
|
490 | md5 = "83db21b07936a0726e588dae1914b9ed"; | |
380 | }; |
|
491 | }; | |
|
492 | meta = { | |||
|
493 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |||
|
494 | }; | |||
381 | }; |
|
495 | }; | |
382 | configobj = super.buildPythonPackage { |
|
496 | configobj = super.buildPythonPackage { | |
383 | name = "configobj-5.0.6"; |
|
497 | name = "configobj-5.0.6"; | |
384 | buildInputs = with self; []; |
|
498 | buildInputs = with self; []; | |
385 | doCheck = false; |
|
499 | doCheck = false; | |
386 | propagatedBuildInputs = with self; [six]; |
|
500 | propagatedBuildInputs = with self; [six]; | |
387 | src = fetchurl { |
|
501 | src = fetchurl { | |
388 | url = "https://pypi.python.org/packages/64/61/079eb60459c44929e684fa7d9e2fdca403f67d64dd9dbac27296be2e0fab/configobj-5.0.6.tar.gz"; |
|
502 | url = "https://pypi.python.org/packages/64/61/079eb60459c44929e684fa7d9e2fdca403f67d64dd9dbac27296be2e0fab/configobj-5.0.6.tar.gz"; | |
389 | md5 = "e472a3a1c2a67bb0ec9b5d54c13a47d6"; |
|
503 | md5 = "e472a3a1c2a67bb0ec9b5d54c13a47d6"; | |
390 | }; |
|
504 | }; | |
|
505 | meta = { | |||
|
506 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
507 | }; | |||
391 | }; |
|
508 | }; | |
392 | cov-core = super.buildPythonPackage { |
|
509 | cov-core = super.buildPythonPackage { | |
393 | name = "cov-core-1.15.0"; |
|
510 | name = "cov-core-1.15.0"; | |
394 | buildInputs = with self; []; |
|
511 | buildInputs = with self; []; | |
395 | doCheck = false; |
|
512 | doCheck = false; | |
396 | propagatedBuildInputs = with self; [coverage]; |
|
513 | propagatedBuildInputs = with self; [coverage]; | |
397 | src = fetchurl { |
|
514 | src = fetchurl { | |
      url = "https://pypi.python.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
      md5 = "f519d4cb4c4e52856afb14af52919fe6";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  coverage = super.buildPythonPackage {
    name = "coverage-3.7.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/09/4f/89b06c7fdc09687bca507dc411c342556ef9c5a3b26756137a4878ff19bf/coverage-3.7.1.tar.gz";
      md5 = "c47b36ceb17eaff3ecfab3bcd347d0df";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  cssselect = super.buildPythonPackage {
    name = "cssselect-0.9.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz";
      md5 = "c74f45966277dc7a0f768b9b0f3522ac";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  decorator = super.buildPythonPackage {
    name = "decorator-3.4.2";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/35/3a/42566eb7a2cbac774399871af04e11d7ae3fc2579e7dae85213b8d1d1c57/decorator-3.4.2.tar.gz";
      md5 = "9e0536870d2b83ae27d58dbf22582f4d";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  docutils = super.buildPythonPackage {
    name = "docutils-0.12";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/37/38/ceda70135b9144d84884ae2fc5886c6baac4edea39550f28bcd144c1234d/docutils-0.12.tar.gz";
      md5 = "4622263b62c5c771c03502afa3157768";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
    };
  };
  dogpile.cache = super.buildPythonPackage {
    name = "dogpile.cache-0.5.7";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [dogpile.core];
    src = fetchurl {
      url = "https://pypi.python.org/packages/07/74/2a83bedf758156d9c95d112691bbad870d3b77ccbcfb781b4ef836ea7d96/dogpile.cache-0.5.7.tar.gz";
      md5 = "3e58ce41af574aab41d78e9c4190f194";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  dogpile.core = super.buildPythonPackage {
    name = "dogpile.core-0.4.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
      md5 = "01cb19f52bba3e95c9b560f39341f045";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  dulwich = super.buildPythonPackage {
    name = "dulwich-0.12.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/6f/04/fbe561b6d45c0ec758330d5b7f5ba4b6cb4f1ca1ab49859d2fc16320da75/dulwich-0.12.0.tar.gz";
      md5 = "f3a8a12bd9f9dd8c233e18f3d49436fa";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl2Plus ];
    };
  };
  ecdsa = super.buildPythonPackage {
    name = "ecdsa-0.11";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/6c/3f/92fe5dcdcaa7bd117be21e5520c9a54375112b66ec000d209e9e9519fad1/ecdsa-0.11.tar.gz";
      md5 = "8ef586fe4dbb156697d756900cb41d7c";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  elasticsearch = super.buildPythonPackage {
    name = "elasticsearch-2.3.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [urllib3];
    src = fetchurl {
      url = "https://pypi.python.org/packages/10/35/5fd52c5f0b0ee405ed4b5195e8bce44c5e041787680dc7b94b8071cac600/elasticsearch-2.3.0.tar.gz";
      md5 = "2550f3b51629cf1ef9636608af92c340";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  elasticsearch-dsl = super.buildPythonPackage {
    name = "elasticsearch-dsl-2.0.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [six python-dateutil elasticsearch];
    src = fetchurl {
      url = "https://pypi.python.org/packages/4e/5d/e788ae8dbe2ff4d13426db0a027533386a5c276c77a2654dc0e2007ce04a/elasticsearch-dsl-2.0.0.tar.gz";
      md5 = "4cdfec81bb35383dd3b7d02d7dc5ee68";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  flake8 = super.buildPythonPackage {
    name = "flake8-2.4.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [pyflakes pep8 mccabe];
    src = fetchurl {
      url = "https://pypi.python.org/packages/8f/b5/9a73c66c7dba273bac8758398f060c008a25f3e84531063b42503b5d0a95/flake8-2.4.1.tar.gz";
      md5 = "ed45d3db81a3b7c88bd63c6e37ca1d65";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  future = super.buildPythonPackage {
    name = "future-0.14.3";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
      md5 = "e94079b0bd1fc054929e8769fc0f6083";
    };
    meta = {
      license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
    };
  };
  futures = super.buildPythonPackage {
    name = "futures-3.0.2";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
      md5 = "42aaf1e4de48d6e871d77dc1f9d96d5a";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  gnureadline = super.buildPythonPackage {
    name = "gnureadline-6.3.3";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/3a/ee/2c3f568b0a74974791ac590ec742ef6133e2fbd287a074ba72a53fa5e97c/gnureadline-6.3.3.tar.gz";
      md5 = "c4af83c9a3fbeac8f2da9b5a7c60e51c";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl1 ];
    };
  };
  gprof2dot = super.buildPythonPackage {
    name = "gprof2dot-2015.12.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/b9/34/7bf93c1952d40fa5c95ad963f4d8344b61ef58558632402eca18e6c14127/gprof2dot-2015.12.1.tar.gz";
      md5 = "e23bf4e2f94db032750c193384b4165b";
    };
    meta = {
      license = [ { fullName = "LGPL"; } ];
    };
  };
  greenlet = super.buildPythonPackage {
    name = "greenlet-0.4.9";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/4e/3d/9d421539b74e33608b245092870156b2e171fb49f2b51390aa4641eecb4a/greenlet-0.4.9.zip";
      md5 = "c6659cdb2a5e591723e629d2eef22e82";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  gunicorn = super.buildPythonPackage {
    name = "gunicorn-19.6.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/84/ce/7ea5396efad1cef682bbc4068e72a0276341d9d9d0f501da609fab9fcb80/gunicorn-19.6.0.tar.gz";
      md5 = "338e5e8a83ea0f0625f768dba4597530";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  infrae.cache = super.buildPythonPackage {
    name = "infrae.cache-1.0.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [Beaker repoze.lru];
    src = fetchurl {
      url = "https://pypi.python.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
      md5 = "b09076a766747e6ed2a755cc62088e32";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpt21 ];
    };
  };
  invoke = super.buildPythonPackage {
    name = "invoke-0.13.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
      md5 = "c0d1ed4bfb34eaab551662d8cfee6540";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  ipdb = super.buildPythonPackage {
    name = "ipdb-0.8";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [ipython];
    src = fetchurl {
      url = "https://pypi.python.org/packages/f0/25/d7dd430ced6cd8dc242a933c8682b5dbf32eb4011d82f87e34209e5ec845/ipdb-0.8.zip";
      md5 = "96dca0712efa01aa5eaf6b22071dd3ed";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl1 ];
    };
  };
  ipython = super.buildPythonPackage {
    name = "ipython-3.1.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/06/91/120c0835254c120af89f066afaabf81289bc2726c1fc3ca0555df6882f58/ipython-3.1.0.tar.gz";
      md5 = "a749d90c16068687b0ec45a27e72ef8f";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  iso8601 = super.buildPythonPackage {
    name = "iso8601-0.1.11";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/c0/75/c9209ee4d1b5975eb8c2cba4428bde6b61bd55664a98290dd015cdb18e98/iso8601-0.1.11.tar.gz";
      md5 = "b06d11cd14a64096f907086044f0fe38";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  itsdangerous = super.buildPythonPackage {
    name = "itsdangerous-0.24";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/dc/b4/a60bcdba945c00f6d608d8975131ab3f25b22f2bcfe1dab221165194b2d4/itsdangerous-0.24.tar.gz";
      md5 = "a3d55aa79369aef5345c036a8a26307f";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  kombu = super.buildPythonPackage {
    name = "kombu-1.5.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [anyjson amqplib];
    src = fetchurl {
      url = "https://pypi.python.org/packages/19/53/74bf2a624644b45f0850a638752514fc10a8e1cbd738f10804951a6df3f5/kombu-1.5.1.tar.gz";
      md5 = "50662f3c7e9395b3d0721fb75d100b63";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  lxml = super.buildPythonPackage {
    name = "lxml-3.4.4";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz";
      md5 = "a9a65972afc173ec7a39c585f4eea69c";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  mccabe = super.buildPythonPackage {
    name = "mccabe-0.3";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/c9/2e/75231479e11a906b64ac43bad9d0bb534d00080b18bdca8db9da46e1faf7/mccabe-0.3.tar.gz";
      md5 = "81640948ff226f8c12b3277059489157";
    };
    meta = {
      license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
    };
  };
  meld3 = super.buildPythonPackage {
    name = "meld3-1.0.2";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/45/a0/317c6422b26c12fe0161e936fc35f36552069ba8e6f7ecbd99bbffe32a5f/meld3-1.0.2.tar.gz";
      md5 = "3ccc78cd79cffd63a751ad7684c02c91";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  mock = super.buildPythonPackage {
    name = "mock-1.0.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/15/45/30273ee91feb60dabb8fbb2da7868520525f02cf910279b3047182feed80/mock-1.0.1.zip";
      md5 = "869f08d003c289a97c1a6610faf5e913";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  msgpack-python = super.buildPythonPackage {
    name = "msgpack-python-0.4.6";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/15/ce/ff2840885789ef8035f66cd506ea05bdb228340307d5e71a7b1e3f82224c/msgpack-python-0.4.6.tar.gz";
      md5 = "8b317669314cf1bc881716cccdaccb30";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  nose = super.buildPythonPackage {
    name = "nose-1.3.6";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/70/c7/469e68148d17a0d3db5ed49150242fd70a74a8147b8f3f8b87776e028d99/nose-1.3.6.tar.gz";
      md5 = "0ca546d81ca8309080fc80cb389e7a16";
    };
    meta = {
      license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "GNU LGPL"; } ];
    };
  };
  objgraph = super.buildPythonPackage {
    name = "objgraph-2.0.0";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/d7/33/ace750b59247496ed769b170586c5def7202683f3d98e737b75b767ff29e/objgraph-2.0.0.tar.gz";
      md5 = "25b0d5e5adc74aa63ead15699614159c";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  packaging = super.buildPythonPackage {
    name = "packaging-15.2";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/24/c4/185da1304f07047dc9e0c46c31db75c0351bd73458ac3efad7da3dbcfbe1/packaging-15.2.tar.gz";
      md5 = "c16093476f6ced42128bf610e5db3784";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  paramiko = super.buildPythonPackage {
    name = "paramiko-1.15.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [pycrypto ecdsa];
    src = fetchurl {
      url = "https://pypi.python.org/packages/04/2b/a22d2a560c1951abbbf95a0628e245945565f70dc082d9e784666887222c/paramiko-1.15.1.tar.gz";
      md5 = "48c274c3f9b1282932567b21f6acf3b5";
    };
    meta = {
      license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
    };
  };
  pep8 = super.buildPythonPackage {
    name = "pep8-1.5.7";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/8b/de/259f5e735897ada1683489dd514b2a1c91aaa74e5e6b68f80acf128a6368/pep8-1.5.7.tar.gz";
      md5 = "f6adbdd69365ecca20513c709f9b7c93";
    };
    meta = {
      license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
    };
  };
  psutil = super.buildPythonPackage {
    name = "psutil-2.2.1";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/df/47/ee54ef14dd40f8ce831a7581001a5096494dc99fe71586260ca6b531fe86/psutil-2.2.1.tar.gz";
      md5 = "1a2b58cd9e3a53528bb6148f0c4d5244";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  psycopg2 = super.buildPythonPackage {
    name = "psycopg2-2.6";
    buildInputs = with self; [];
    doCheck = false;
    propagatedBuildInputs = with self; [];
    src = fetchurl {
      url = "https://pypi.python.org/packages/dd/c7/9016ff8ff69da269b1848276eebfb264af5badf6b38caad805426771f04d/psycopg2-2.6.tar.gz";
759 | md5 = "fbbb039a8765d561a1c04969bbae7c74"; |
|
984 | md5 = "fbbb039a8765d561a1c04969bbae7c74"; | |
760 | }; |
|
985 | }; | |
|
986 | meta = { | |||
|
987 | license = [ pkgs.lib.licenses.zpt21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ]; | |||
|
988 | }; | |||
761 | }; |
|
989 | }; | |
762 | py = super.buildPythonPackage { |
|
990 | py = super.buildPythonPackage { | |
763 | name = "py-1.4.29"; |
|
991 | name = "py-1.4.29"; | |
764 | buildInputs = with self; []; |
|
992 | buildInputs = with self; []; | |
765 | doCheck = false; |
|
993 | doCheck = false; | |
766 | propagatedBuildInputs = with self; []; |
|
994 | propagatedBuildInputs = with self; []; | |
767 | src = fetchurl { |
|
995 | src = fetchurl { | |
768 | url = "https://pypi.python.org/packages/2a/bc/a1a4a332ac10069b8e5e25136a35e08a03f01fd6ab03d819889d79a1fd65/py-1.4.29.tar.gz"; |
|
996 | url = "https://pypi.python.org/packages/2a/bc/a1a4a332ac10069b8e5e25136a35e08a03f01fd6ab03d819889d79a1fd65/py-1.4.29.tar.gz"; | |
769 | md5 = "c28e0accba523a29b35a48bb703fb96c"; |
|
997 | md5 = "c28e0accba523a29b35a48bb703fb96c"; | |
770 | }; |
|
998 | }; | |
|
999 | meta = { | |||
|
1000 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1001 | }; | |||
771 | }; |
|
1002 | }; | |
772 | py-bcrypt = super.buildPythonPackage { |
|
1003 | py-bcrypt = super.buildPythonPackage { | |
773 | name = "py-bcrypt-0.4"; |
|
1004 | name = "py-bcrypt-0.4"; | |
774 | buildInputs = with self; []; |
|
1005 | buildInputs = with self; []; | |
775 | doCheck = false; |
|
1006 | doCheck = false; | |
776 | propagatedBuildInputs = with self; []; |
|
1007 | propagatedBuildInputs = with self; []; | |
777 | src = fetchurl { |
|
1008 | src = fetchurl { | |
778 | url = "https://pypi.python.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz"; |
|
1009 | url = "https://pypi.python.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz"; | |
779 | md5 = "dd8b367d6b716a2ea2e72392525f4e36"; |
|
1010 | md5 = "dd8b367d6b716a2ea2e72392525f4e36"; | |
780 | }; |
|
1011 | }; | |
|
1012 | meta = { | |||
|
1013 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |||
|
1014 | }; | |||
781 | }; |
|
1015 | }; | |
782 | pycrypto = super.buildPythonPackage { |
|
1016 | pycrypto = super.buildPythonPackage { | |
783 | name = "pycrypto-2.6.1"; |
|
1017 | name = "pycrypto-2.6.1"; | |
784 | buildInputs = with self; []; |
|
1018 | buildInputs = with self; []; | |
785 | doCheck = false; |
|
1019 | doCheck = false; | |
786 | propagatedBuildInputs = with self; []; |
|
1020 | propagatedBuildInputs = with self; []; | |
787 | src = fetchurl { |
|
1021 | src = fetchurl { | |
788 | url = "https://pypi.python.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz"; |
|
1022 | url = "https://pypi.python.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz"; | |
789 | md5 = "55a61a054aa66812daf5161a0d5d7eda"; |
|
1023 | md5 = "55a61a054aa66812daf5161a0d5d7eda"; | |
790 | }; |
|
1024 | }; | |
|
1025 | meta = { | |||
|
1026 | license = [ pkgs.lib.licenses.publicDomain ]; | |||
|
1027 | }; | |||
791 | }; |
|
1028 | }; | |
792 | pycurl = super.buildPythonPackage { |
|
1029 | pycurl = super.buildPythonPackage { | |
793 | name = "pycurl-7.19.5"; |
|
1030 | name = "pycurl-7.19.5"; | |
794 | buildInputs = with self; []; |
|
1031 | buildInputs = with self; []; | |
795 | doCheck = false; |
|
1032 | doCheck = false; | |
796 | propagatedBuildInputs = with self; []; |
|
1033 | propagatedBuildInputs = with self; []; | |
797 | src = fetchurl { |
|
1034 | src = fetchurl { | |
798 | url = "https://pypi.python.org/packages/6c/48/13bad289ef6f4869b1d8fc11ae54de8cfb3cc4a2eb9f7419c506f763be46/pycurl-7.19.5.tar.gz"; |
|
1035 | url = "https://pypi.python.org/packages/6c/48/13bad289ef6f4869b1d8fc11ae54de8cfb3cc4a2eb9f7419c506f763be46/pycurl-7.19.5.tar.gz"; | |
799 | md5 = "47b4eac84118e2606658122104e62072"; |
|
1036 | md5 = "47b4eac84118e2606658122104e62072"; | |
800 | }; |
|
1037 | }; | |
|
1038 | meta = { | |||
|
1039 | license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |||
|
1040 | }; | |||
801 | }; |
|
1041 | }; | |
802 | pyflakes = super.buildPythonPackage { |
|
1042 | pyflakes = super.buildPythonPackage { | |
803 | name = "pyflakes-0.8.1"; |
|
1043 | name = "pyflakes-0.8.1"; | |
804 | buildInputs = with self; []; |
|
1044 | buildInputs = with self; []; | |
805 | doCheck = false; |
|
1045 | doCheck = false; | |
806 | propagatedBuildInputs = with self; []; |
|
1046 | propagatedBuildInputs = with self; []; | |
807 | src = fetchurl { |
|
1047 | src = fetchurl { | |
808 | url = "https://pypi.python.org/packages/75/22/a90ec0252f4f87f3ffb6336504de71fe16a49d69c4538dae2f12b9360a38/pyflakes-0.8.1.tar.gz"; |
|
1048 | url = "https://pypi.python.org/packages/75/22/a90ec0252f4f87f3ffb6336504de71fe16a49d69c4538dae2f12b9360a38/pyflakes-0.8.1.tar.gz"; | |
809 | md5 = "905fe91ad14b912807e8fdc2ac2e2c23"; |
|
1049 | md5 = "905fe91ad14b912807e8fdc2ac2e2c23"; | |
810 | }; |
|
1050 | }; | |
|
1051 | meta = { | |||
|
1052 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1053 | }; | |||
811 | }; |
|
1054 | }; | |
812 | pyparsing = super.buildPythonPackage { |
|
1055 | pyparsing = super.buildPythonPackage { | |
813 | name = "pyparsing-1.5.7"; |
|
1056 | name = "pyparsing-1.5.7"; | |
814 | buildInputs = with self; []; |
|
1057 | buildInputs = with self; []; | |
815 | doCheck = false; |
|
1058 | doCheck = false; | |
816 | propagatedBuildInputs = with self; []; |
|
1059 | propagatedBuildInputs = with self; []; | |
817 | src = fetchurl { |
|
1060 | src = fetchurl { | |
818 | url = "https://pypi.python.org/packages/2e/26/e8fb5b4256a5f5036be7ce115ef8db8d06bc537becfbdc46c6af008314ee/pyparsing-1.5.7.zip"; |
|
1061 | url = "https://pypi.python.org/packages/2e/26/e8fb5b4256a5f5036be7ce115ef8db8d06bc537becfbdc46c6af008314ee/pyparsing-1.5.7.zip"; | |
819 | md5 = "b86854857a368d6ccb4d5b6e76d0637f"; |
|
1062 | md5 = "b86854857a368d6ccb4d5b6e76d0637f"; | |
820 | }; |
|
1063 | }; | |
|
1064 | meta = { | |||
|
1065 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1066 | }; | |||
821 | }; |
|
1067 | }; | |
822 | pyramid = super.buildPythonPackage { |
|
1068 | pyramid = super.buildPythonPackage { | |
823 | name = "pyramid-1.6.1"; |
|
1069 | name = "pyramid-1.6.1"; | |
824 | buildInputs = with self; []; |
|
1070 | buildInputs = with self; []; | |
825 | doCheck = false; |
|
1071 | doCheck = false; | |
826 | propagatedBuildInputs = with self; [setuptools WebOb repoze.lru zope.interface zope.deprecation venusian translationstring PasteDeploy]; |
|
1072 | propagatedBuildInputs = with self; [setuptools WebOb repoze.lru zope.interface zope.deprecation venusian translationstring PasteDeploy]; | |
827 | src = fetchurl { |
|
1073 | src = fetchurl { | |
828 | url = "https://pypi.python.org/packages/30/b3/fcc4a2a4800cbf21989e00454b5828cf1f7fe35c63e0810b350e56d4c475/pyramid-1.6.1.tar.gz"; |
|
1074 | url = "https://pypi.python.org/packages/30/b3/fcc4a2a4800cbf21989e00454b5828cf1f7fe35c63e0810b350e56d4c475/pyramid-1.6.1.tar.gz"; | |
829 | md5 = "b18688ff3cc33efdbb098a35b45dd122"; |
|
1075 | md5 = "b18688ff3cc33efdbb098a35b45dd122"; | |
830 | }; |
|
1076 | }; | |
|
1077 | meta = { | |||
|
1078 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |||
|
1079 | }; | |||
831 | }; |
|
1080 | }; | |
832 | pyramid-beaker = super.buildPythonPackage { |
|
1081 | pyramid-beaker = super.buildPythonPackage { | |
833 | name = "pyramid-beaker-0.8"; |
|
1082 | name = "pyramid-beaker-0.8"; | |
834 | buildInputs = with self; []; |
|
1083 | buildInputs = with self; []; | |
835 | doCheck = false; |
|
1084 | doCheck = false; | |
836 | propagatedBuildInputs = with self; [pyramid Beaker]; |
|
1085 | propagatedBuildInputs = with self; [pyramid Beaker]; | |
837 | src = fetchurl { |
|
1086 | src = fetchurl { | |
838 | url = "https://pypi.python.org/packages/d9/6e/b85426e00fd3d57f4545f74e1c3828552d8700f13ededeef9233f7bca8be/pyramid_beaker-0.8.tar.gz"; |
|
1087 | url = "https://pypi.python.org/packages/d9/6e/b85426e00fd3d57f4545f74e1c3828552d8700f13ededeef9233f7bca8be/pyramid_beaker-0.8.tar.gz"; | |
839 | md5 = "22f14be31b06549f80890e2c63a93834"; |
|
1088 | md5 = "22f14be31b06549f80890e2c63a93834"; | |
840 | }; |
|
1089 | }; | |
|
1090 | meta = { | |||
|
1091 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |||
|
1092 | }; | |||
841 | }; |
|
1093 | }; | |
842 | pyramid-debugtoolbar = super.buildPythonPackage { |
|
1094 | pyramid-debugtoolbar = super.buildPythonPackage { | |
843 | name = "pyramid-debugtoolbar-2.4.2"; |
|
1095 | name = "pyramid-debugtoolbar-2.4.2"; | |
844 | buildInputs = with self; []; |
|
1096 | buildInputs = with self; []; | |
845 | doCheck = false; |
|
1097 | doCheck = false; | |
846 | propagatedBuildInputs = with self; [pyramid pyramid-mako repoze.lru Pygments]; |
|
1098 | propagatedBuildInputs = with self; [pyramid pyramid-mako repoze.lru Pygments]; | |
847 | src = fetchurl { |
|
1099 | src = fetchurl { | |
848 | url = "https://pypi.python.org/packages/89/00/ed5426ee41ed747ba3ffd30e8230841a6878286ea67d480b1444d24f06a2/pyramid_debugtoolbar-2.4.2.tar.gz"; |
|
1100 | url = "https://pypi.python.org/packages/89/00/ed5426ee41ed747ba3ffd30e8230841a6878286ea67d480b1444d24f06a2/pyramid_debugtoolbar-2.4.2.tar.gz"; | |
849 | md5 = "073ea67086cc4bd5decc3a000853642d"; |
|
1101 | md5 = "073ea67086cc4bd5decc3a000853642d"; | |
850 | }; |
|
1102 | }; | |
|
1103 | meta = { | |||
|
1104 | license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ]; | |||
|
1105 | }; | |||
851 | }; |
|
1106 | }; | |
852 | pyramid-jinja2 = super.buildPythonPackage { |
|
1107 | pyramid-jinja2 = super.buildPythonPackage { | |
853 | name = "pyramid-jinja2-2.5"; |
|
1108 | name = "pyramid-jinja2-2.5"; | |
854 | buildInputs = with self; []; |
|
1109 | buildInputs = with self; []; | |
855 | doCheck = false; |
|
1110 | doCheck = false; | |
856 | propagatedBuildInputs = with self; [pyramid zope.deprecation Jinja2 MarkupSafe]; |
|
1111 | propagatedBuildInputs = with self; [pyramid zope.deprecation Jinja2 MarkupSafe]; | |
857 | src = fetchurl { |
|
1112 | src = fetchurl { | |
858 | url = "https://pypi.python.org/packages/a1/80/595e26ffab7deba7208676b6936b7e5a721875710f982e59899013cae1ed/pyramid_jinja2-2.5.tar.gz"; |
|
1113 | url = "https://pypi.python.org/packages/a1/80/595e26ffab7deba7208676b6936b7e5a721875710f982e59899013cae1ed/pyramid_jinja2-2.5.tar.gz"; | |
859 | md5 = "07cb6547204ac5e6f0b22a954ccee928"; |
|
1114 | md5 = "07cb6547204ac5e6f0b22a954ccee928"; | |
860 | }; |
|
1115 | }; | |
|
1116 | meta = { | |||
|
1117 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |||
|
1118 | }; | |||
861 | }; |
|
1119 | }; | |
862 | pyramid-mako = super.buildPythonPackage { |
|
1120 | pyramid-mako = super.buildPythonPackage { | |
863 | name = "pyramid-mako-1.0.2"; |
|
1121 | name = "pyramid-mako-1.0.2"; | |
864 | buildInputs = with self; []; |
|
1122 | buildInputs = with self; []; | |
865 | doCheck = false; |
|
1123 | doCheck = false; | |
866 | propagatedBuildInputs = with self; [pyramid Mako]; |
|
1124 | propagatedBuildInputs = with self; [pyramid Mako]; | |
867 | src = fetchurl { |
|
1125 | src = fetchurl { | |
868 | url = "https://pypi.python.org/packages/f1/92/7e69bcf09676d286a71cb3bbb887b16595b96f9ba7adbdc239ffdd4b1eb9/pyramid_mako-1.0.2.tar.gz"; |
|
1126 | url = "https://pypi.python.org/packages/f1/92/7e69bcf09676d286a71cb3bbb887b16595b96f9ba7adbdc239ffdd4b1eb9/pyramid_mako-1.0.2.tar.gz"; | |
869 | md5 = "ee25343a97eb76bd90abdc2a774eb48a"; |
|
1127 | md5 = "ee25343a97eb76bd90abdc2a774eb48a"; | |
870 | }; |
|
1128 | }; | |
|
1129 | meta = { | |||
|
1130 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |||
|
1131 | }; | |||
871 | }; |
|
1132 | }; | |
872 | pysqlite = super.buildPythonPackage { |
|
1133 | pysqlite = super.buildPythonPackage { | |
873 | name = "pysqlite-2.6.3"; |
|
1134 | name = "pysqlite-2.6.3"; | |
874 | buildInputs = with self; []; |
|
1135 | buildInputs = with self; []; | |
875 | doCheck = false; |
|
1136 | doCheck = false; | |
876 | propagatedBuildInputs = with self; []; |
|
1137 | propagatedBuildInputs = with self; []; | |
877 | src = fetchurl { |
|
1138 | src = fetchurl { | |
878 | url = "https://pypi.python.org/packages/5c/a6/1c429cd4c8069cf4bfbd0eb4d592b3f4042155a8202df83d7e9b93aa3dc2/pysqlite-2.6.3.tar.gz"; |
|
1139 | url = "https://pypi.python.org/packages/5c/a6/1c429cd4c8069cf4bfbd0eb4d592b3f4042155a8202df83d7e9b93aa3dc2/pysqlite-2.6.3.tar.gz"; | |
879 | md5 = "7ff1cedee74646b50117acff87aa1cfa"; |
|
1140 | md5 = "7ff1cedee74646b50117acff87aa1cfa"; | |
880 | }; |
|
1141 | }; | |
|
1142 | meta = { | |||
|
1143 | license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ]; | |||
|
1144 | }; | |||
881 | }; |
|
1145 | }; | |
882 | pytest = super.buildPythonPackage { |
|
1146 | pytest = super.buildPythonPackage { | |
883 | name = "pytest-2.8.5"; |
|
1147 | name = "pytest-2.8.5"; | |
884 | buildInputs = with self; []; |
|
1148 | buildInputs = with self; []; | |
885 | doCheck = false; |
|
1149 | doCheck = false; | |
886 | propagatedBuildInputs = with self; [py]; |
|
1150 | propagatedBuildInputs = with self; [py]; | |
887 | src = fetchurl { |
|
1151 | src = fetchurl { | |
888 | url = "https://pypi.python.org/packages/b1/3d/d7ea9b0c51e0cacded856e49859f0a13452747491e842c236bbab3714afe/pytest-2.8.5.zip"; |
|
1152 | url = "https://pypi.python.org/packages/b1/3d/d7ea9b0c51e0cacded856e49859f0a13452747491e842c236bbab3714afe/pytest-2.8.5.zip"; | |
889 | md5 = "8493b06f700862f1294298d6c1b715a9"; |
|
1153 | md5 = "8493b06f700862f1294298d6c1b715a9"; | |
890 | }; |
|
1154 | }; | |
|
1155 | meta = { | |||
|
1156 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1157 | }; | |||
891 | }; |
|
1158 | }; | |
892 | pytest-catchlog = super.buildPythonPackage { |
|
1159 | pytest-catchlog = super.buildPythonPackage { | |
893 | name = "pytest-catchlog-1.2.2"; |
|
1160 | name = "pytest-catchlog-1.2.2"; | |
894 | buildInputs = with self; []; |
|
1161 | buildInputs = with self; []; | |
895 | doCheck = false; |
|
1162 | doCheck = false; | |
896 | propagatedBuildInputs = with self; [py pytest]; |
|
1163 | propagatedBuildInputs = with self; [py pytest]; | |
897 | src = fetchurl { |
|
1164 | src = fetchurl { | |
898 | url = "https://pypi.python.org/packages/f2/2b/2faccdb1a978fab9dd0bf31cca9f6847fbe9184a0bdcc3011ac41dd44191/pytest-catchlog-1.2.2.zip"; |
|
1165 | url = "https://pypi.python.org/packages/f2/2b/2faccdb1a978fab9dd0bf31cca9f6847fbe9184a0bdcc3011ac41dd44191/pytest-catchlog-1.2.2.zip"; | |
899 | md5 = "09d890c54c7456c818102b7ff8c182c8"; |
|
1166 | md5 = "09d890c54c7456c818102b7ff8c182c8"; | |
900 | }; |
|
1167 | }; | |
|
1168 | meta = { | |||
|
1169 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1170 | }; | |||
901 | }; |
|
1171 | }; | |
902 | pytest-cov = super.buildPythonPackage { |
|
1172 | pytest-cov = super.buildPythonPackage { | |
903 | name = "pytest-cov-1.8.1"; |
|
1173 | name = "pytest-cov-1.8.1"; | |
904 | buildInputs = with self; []; |
|
1174 | buildInputs = with self; []; | |
905 | doCheck = false; |
|
1175 | doCheck = false; | |
906 | propagatedBuildInputs = with self; [py pytest coverage cov-core]; |
|
1176 | propagatedBuildInputs = with self; [py pytest coverage cov-core]; | |
907 | src = fetchurl { |
|
1177 | src = fetchurl { | |
908 | url = "https://pypi.python.org/packages/11/4b/b04646e97f1721878eb21e9f779102d84dd044d324382263b1770a3e4838/pytest-cov-1.8.1.tar.gz"; |
|
1178 | url = "https://pypi.python.org/packages/11/4b/b04646e97f1721878eb21e9f779102d84dd044d324382263b1770a3e4838/pytest-cov-1.8.1.tar.gz"; | |
909 | md5 = "76c778afa2494088270348be42d759fc"; |
|
1179 | md5 = "76c778afa2494088270348be42d759fc"; | |
910 | }; |
|
1180 | }; | |
|
1181 | meta = { | |||
|
1182 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1183 | }; | |||
911 | }; |
|
1184 | }; | |
912 | pytest-profiling = super.buildPythonPackage { |
|
1185 | pytest-profiling = super.buildPythonPackage { | |
913 | name = "pytest-profiling-1.0.1"; |
|
1186 | name = "pytest-profiling-1.0.1"; | |
914 | buildInputs = with self; []; |
|
1187 | buildInputs = with self; []; | |
915 | doCheck = false; |
|
1188 | doCheck = false; | |
916 | propagatedBuildInputs = with self; [six pytest gprof2dot]; |
|
1189 | propagatedBuildInputs = with self; [six pytest gprof2dot]; | |
917 | src = fetchurl { |
|
1190 | src = fetchurl { | |
918 | url = "https://pypi.python.org/packages/d8/67/8ffab73406e22870e07fa4dc8dce1d7689b26dba8efd00161c9b6fc01ec0/pytest-profiling-1.0.1.tar.gz"; |
|
1191 | url = "https://pypi.python.org/packages/d8/67/8ffab73406e22870e07fa4dc8dce1d7689b26dba8efd00161c9b6fc01ec0/pytest-profiling-1.0.1.tar.gz"; | |
919 | md5 = "354404eb5b3fd4dc5eb7fffbb3d9b68b"; |
|
1192 | md5 = "354404eb5b3fd4dc5eb7fffbb3d9b68b"; | |
920 | }; |
|
1193 | }; | |
|
1194 | meta = { | |||
|
1195 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1196 | }; | |||
921 | }; |
|
1197 | }; | |
922 | pytest-runner = super.buildPythonPackage { |
|
1198 | pytest-runner = super.buildPythonPackage { | |
923 | name = "pytest-runner-2.7.1"; |
|
1199 | name = "pytest-runner-2.7.1"; | |
924 | buildInputs = with self; []; |
|
1200 | buildInputs = with self; []; | |
925 | doCheck = false; |
|
1201 | doCheck = false; | |
926 | propagatedBuildInputs = with self; []; |
|
1202 | propagatedBuildInputs = with self; []; | |
927 | src = fetchurl { |
|
1203 | src = fetchurl { | |
928 | url = "https://pypi.python.org/packages/99/6b/c4ff4418d3424d4475b7af60724fd4a5cdd91ed8e489dc9443281f0052bc/pytest-runner-2.7.1.tar.gz"; |
|
1204 | url = "https://pypi.python.org/packages/99/6b/c4ff4418d3424d4475b7af60724fd4a5cdd91ed8e489dc9443281f0052bc/pytest-runner-2.7.1.tar.gz"; | |
929 | md5 = "e56f0bc8d79a6bd91772b44ef4215c7e"; |
|
1205 | md5 = "e56f0bc8d79a6bd91772b44ef4215c7e"; | |
930 | }; |
|
1206 | }; | |
|
1207 | meta = { | |||
|
1208 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1209 | }; | |||
931 | }; |
|
1210 | }; | |
932 | pytest-timeout = super.buildPythonPackage { |
|
1211 | pytest-timeout = super.buildPythonPackage { | |
933 | name = "pytest-timeout-0.4"; |
|
1212 | name = "pytest-timeout-0.4"; | |
934 | buildInputs = with self; []; |
|
1213 | buildInputs = with self; []; | |
935 | doCheck = false; |
|
1214 | doCheck = false; | |
936 | propagatedBuildInputs = with self; [pytest]; |
|
1215 | propagatedBuildInputs = with self; [pytest]; | |
937 | src = fetchurl { |
|
1216 | src = fetchurl { | |
938 | url = "https://pypi.python.org/packages/24/48/5f6bd4b8026a26e1dd427243d560a29a0f1b24a5c7cffca4bf049a7bb65b/pytest-timeout-0.4.tar.gz"; |
|
1217 | url = "https://pypi.python.org/packages/24/48/5f6bd4b8026a26e1dd427243d560a29a0f1b24a5c7cffca4bf049a7bb65b/pytest-timeout-0.4.tar.gz"; | |
939 | md5 = "03b28aff69cbbfb959ed35ade5fde262"; |
|
1218 | md5 = "03b28aff69cbbfb959ed35ade5fde262"; | |
940 | }; |
|
1219 | }; | |
|
1220 | meta = { | |||
|
1221 | license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ]; | |||
|
1222 | }; | |||
941 | }; |
|
1223 | }; | |
942 | python-dateutil = super.buildPythonPackage { |
|
1224 | python-dateutil = super.buildPythonPackage { | |
943 | name = "python-dateutil-1.5"; |
|
1225 | name = "python-dateutil-1.5"; | |
944 | buildInputs = with self; []; |
|
1226 | buildInputs = with self; []; | |
945 | doCheck = false; |
|
1227 | doCheck = false; | |
946 | propagatedBuildInputs = with self; []; |
|
1228 | propagatedBuildInputs = with self; []; | |
947 | src = fetchurl { |
|
1229 | src = fetchurl { | |
948 | url = "https://pypi.python.org/packages/b4/7c/df59c89a753eb33c7c44e1dd42de0e9bc2ccdd5a4d576e0bfad97cc280cb/python-dateutil-1.5.tar.gz"; |
|
1230 | url = "https://pypi.python.org/packages/b4/7c/df59c89a753eb33c7c44e1dd42de0e9bc2ccdd5a4d576e0bfad97cc280cb/python-dateutil-1.5.tar.gz"; | |
949 | md5 = "0dcb1de5e5cad69490a3b6ab63f0cfa5"; |
|
1231 | md5 = "0dcb1de5e5cad69490a3b6ab63f0cfa5"; | |
950 | }; |
|
1232 | }; | |
|
1233 | meta = { | |||
|
1234 | license = [ pkgs.lib.licenses.psfl ]; | |||
|
1235 | }; | |||
951 | }; |
|
1236 | }; | |
952 | python-editor = super.buildPythonPackage { |
|
1237 | python-editor = super.buildPythonPackage { | |
953 | name = "python-editor-1.0.1"; |
|
1238 | name = "python-editor-1.0.1"; | |
954 | buildInputs = with self; []; |
|
1239 | buildInputs = with self; []; | |
955 | doCheck = false; |
|
1240 | doCheck = false; | |
956 | propagatedBuildInputs = with self; []; |
|
1241 | propagatedBuildInputs = with self; []; | |
957 | src = fetchurl { |
|
1242 | src = fetchurl { | |
958 | url = "https://pypi.python.org/packages/2b/c0/df7b87d5cf016f82eab3b05cd35f53287c1178ad8c42bfb6fa61b89b22f6/python-editor-1.0.1.tar.gz"; |
|
1243 | url = "https://pypi.python.org/packages/2b/c0/df7b87d5cf016f82eab3b05cd35f53287c1178ad8c42bfb6fa61b89b22f6/python-editor-1.0.1.tar.gz"; | |
959 | md5 = "e1fa63535b40e022fa4fd646fd8b511a"; |
|
1244 | md5 = "e1fa63535b40e022fa4fd646fd8b511a"; | |
960 | }; |
|
1245 | }; | |
|
1246 | meta = { | |||
|
1247 | license = [ pkgs.lib.licenses.asl20 ]; | |||
|
1248 | }; | |||
961 | }; |
|
1249 | }; | |
962 | python-ldap = super.buildPythonPackage { |
|
1250 | python-ldap = super.buildPythonPackage { | |
963 | name = "python-ldap-2.4.19"; |
|
1251 | name = "python-ldap-2.4.19"; | |
964 | buildInputs = with self; []; |
|
1252 | buildInputs = with self; []; | |
965 | doCheck = false; |
|
1253 | doCheck = false; | |
966 | propagatedBuildInputs = with self; [setuptools]; |
|
1254 | propagatedBuildInputs = with self; [setuptools]; | |
967 | src = fetchurl { |
|
1255 | src = fetchurl { | |
968 | url = "https://pypi.python.org/packages/42/81/1b64838c82e64f14d4e246ff00b52e650a35c012551b891ada2b85d40737/python-ldap-2.4.19.tar.gz"; |
|
1256 | url = "https://pypi.python.org/packages/42/81/1b64838c82e64f14d4e246ff00b52e650a35c012551b891ada2b85d40737/python-ldap-2.4.19.tar.gz"; | |
969 | md5 = "b941bf31d09739492aa19ef679e94ae3"; |
|
1257 | md5 = "b941bf31d09739492aa19ef679e94ae3"; | |
970 | }; |
|
1258 | }; | |
|
1259 | meta = { | |||
|
1260 | license = [ pkgs.lib.licenses.psfl ]; | |||
|
1261 | }; | |||
971 | }; |
|
1262 | }; | |
972 | python-memcached = super.buildPythonPackage { |
|
1263 | python-memcached = super.buildPythonPackage { | |
973 | name = "python-memcached-1.57"; |
|
1264 | name = "python-memcached-1.57"; | |
974 | buildInputs = with self; []; |
|
1265 | buildInputs = with self; []; | |
975 | doCheck = false; |
|
1266 | doCheck = false; | |
976 | propagatedBuildInputs = with self; [six]; |
|
1267 | propagatedBuildInputs = with self; [six]; | |
977 | src = fetchurl { |
|
1268 | src = fetchurl { | |
978 | url = "https://pypi.python.org/packages/52/9d/eebc0dcbc5c7c66840ad207dfc1baa376dadb74912484bff73819cce01e6/python-memcached-1.57.tar.gz"; |
|
1269 | url = "https://pypi.python.org/packages/52/9d/eebc0dcbc5c7c66840ad207dfc1baa376dadb74912484bff73819cce01e6/python-memcached-1.57.tar.gz"; | |
979 | md5 = "de21f64b42b2d961f3d4ad7beb5468a1"; |
|
1270 | md5 = "de21f64b42b2d961f3d4ad7beb5468a1"; | |
980 | }; |
|
1271 | }; | |
|
1272 | meta = { | |||
|
1273 | license = [ pkgs.lib.licenses.psfl ]; | |||
|
1274 | }; | |||
981 | }; |
|
1275 | }; | |
982 | python-pam = super.buildPythonPackage { |
|
1276 | python-pam = super.buildPythonPackage { | |
983 | name = "python-pam-1.8.2"; |
|
1277 | name = "python-pam-1.8.2"; | |
984 | buildInputs = with self; []; |
|
1278 | buildInputs = with self; []; | |
985 | doCheck = false; |
|
1279 | doCheck = false; | |
986 | propagatedBuildInputs = with self; []; |
|
1280 | propagatedBuildInputs = with self; []; | |
987 | src = fetchurl { |
|
1281 | src = fetchurl { | |
988 | url = "https://pypi.python.org/packages/de/8c/f8f5d38b4f26893af267ea0b39023d4951705ab0413a39e0cf7cf4900505/python-pam-1.8.2.tar.gz"; |
|
1282 | url = "https://pypi.python.org/packages/de/8c/f8f5d38b4f26893af267ea0b39023d4951705ab0413a39e0cf7cf4900505/python-pam-1.8.2.tar.gz"; | |
989 | md5 = "db71b6b999246fb05d78ecfbe166629d"; |
|
1283 | md5 = "db71b6b999246fb05d78ecfbe166629d"; | |
990 | }; |
|
1284 | }; | |
|
1285 | meta = { | |||
|
1286 | license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ]; | |||
|
1287 | }; | |||
991 | }; |
|
1288 | }; | |
992 | pytz = super.buildPythonPackage { |
|
1289 | pytz = super.buildPythonPackage { | |
993 | name = "pytz-2015.4"; |
|
1290 | name = "pytz-2015.4"; | |
994 | buildInputs = with self; []; |
|
1291 | buildInputs = with self; []; | |
995 | doCheck = false; |
|
1292 | doCheck = false; | |
996 | propagatedBuildInputs = with self; []; |
|
1293 | propagatedBuildInputs = with self; []; | |
997 | src = fetchurl { |
|
1294 | src = fetchurl { | |
998 | url = "https://pypi.python.org/packages/7e/1a/f43b5c92df7b156822030fed151327ea096bcf417e45acc23bd1df43472f/pytz-2015.4.zip"; |
|
1295 | url = "https://pypi.python.org/packages/7e/1a/f43b5c92df7b156822030fed151327ea096bcf417e45acc23bd1df43472f/pytz-2015.4.zip"; | |
999 | md5 = "233f2a2b370d03f9b5911700cc9ebf3c"; |
|
1296 | md5 = "233f2a2b370d03f9b5911700cc9ebf3c"; | |
1000 | }; |
|
1297 | }; | |
|
1298 | meta = { | |||
|
1299 | license = [ pkgs.lib.licenses.mit ]; | |||
|
1300 | }; | |||
1001 | }; |
|
1301 | }; | |
1002 | pyzmq = super.buildPythonPackage { |
|
1302 | pyzmq = super.buildPythonPackage { | |
1003 | name = "pyzmq-14.6.0"; |
|
1303 | name = "pyzmq-14.6.0"; | |
1004 | buildInputs = with self; []; |
|
1304 | buildInputs = with self; []; | |
1005 | doCheck = false; |
|
1305 | doCheck = false; | |
1006 | propagatedBuildInputs = with self; []; |
|
1306 | propagatedBuildInputs = with self; []; | |
1007 | src = fetchurl { |
|
1307 | src = fetchurl { | |
1008 | url = "https://pypi.python.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz"; |
|
1308 | url = "https://pypi.python.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz"; | |
1009 | md5 = "395b5de95a931afa5b14c9349a5b8024"; |
|
1309 | md5 = "395b5de95a931afa5b14c9349a5b8024"; | |
1010 | }; |
|
1310 | }; | |
|
1311 | meta = { | |||
|
1312 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |||
|
1313 | }; | |||
1011 | }; |
|
1314 | }; | |
1012 | recaptcha-client = super.buildPythonPackage { |
|
1315 | recaptcha-client = super.buildPythonPackage { | |
1013 | name = "recaptcha-client-1.0.6"; |
|
1316 | name = "recaptcha-client-1.0.6"; | |
1014 | buildInputs = with self; []; |
|
1317 | buildInputs = with self; []; | |
1015 | doCheck = false; |
|
1318 | doCheck = false; | |
1016 | propagatedBuildInputs = with self; []; |
|
1319 | propagatedBuildInputs = with self; []; | |
1017 | src = fetchurl { |
|
1320 | src = fetchurl { | |
1018 | url = "https://pypi.python.org/packages/0a/ea/5f2fbbfd894bdac1c68ef8d92019066cfcf9fbff5fe3d728d2b5c25c8db4/recaptcha-client-1.0.6.tar.gz"; |
|
1321 | url = "https://pypi.python.org/packages/0a/ea/5f2fbbfd894bdac1c68ef8d92019066cfcf9fbff5fe3d728d2b5c25c8db4/recaptcha-client-1.0.6.tar.gz"; | |
1019 | md5 = "74228180f7e1fb76c4d7089160b0d919"; |
|
1322 | md5 = "74228180f7e1fb76c4d7089160b0d919"; | |
1020 | }; |
|
1323 | }; | |
|
1324 | meta = { | |||
|
1325 | license = [ { fullName = "MIT/X11"; } ]; | |||
|
1326 | }; | |||
1021 | }; |
|
1327 | }; | |
@@ -1022,242 +1328,314 @@
   repoze.lru = super.buildPythonPackage {
     name = "repoze.lru-0.6";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/6e/1e/aa15cc90217e086dc8769872c8778b409812ff036bf021b15795638939e4/repoze.lru-0.6.tar.gz";
       md5 = "2c3b64b17a8e18b405f55d46173e14dd";
     };
+    meta = {
+      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
+    };
   };
   requests = super.buildPythonPackage {
     name = "requests-2.9.1";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/f9/6d/07c44fb1ebe04d069459a189e7dab9e4abfe9432adcd4477367c25332748/requests-2.9.1.tar.gz";
       md5 = "0b7f480d19012ec52bab78292efd976d";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.asl20 ];
+    };
   };
   rhodecode-enterprise-ce = super.buildPythonPackage {
-    name = "rhodecode-enterprise-ce-4.
+    name = "rhodecode-enterprise-ce-4.2.0";
     buildInputs = with self; [WebTest configobj cssselect flake8 lxml mock pytest pytest-cov pytest-runner];
     doCheck = true;
     propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery colander decorator docutils gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson waitress zope.cachedescriptors psutil py-bcrypt];
     src = ./.;
+    meta = {
+      license = [ { fullName = "AGPLv3, and Commercial License"; } ];
+    };
   };
   rhodecode-tools = super.buildPythonPackage {
     name = "rhodecode-tools-0.8.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [click future six Mako MarkupSafe requests Whoosh elasticsearch elasticsearch-dsl];
     src = fetchurl {
       url = "https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.8.3.zip";
       md5 = "9acdfd71b8ddf4056057065f37ab9ccb";
     };
+    meta = {
+      license = [ { fullName = "AGPLv3 and Proprietary"; } ];
+    };
   };
   serpent = super.buildPythonPackage {
     name = "serpent-1.12";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/3b/19/1e0e83b47c09edaef8398655088036e7e67386b5c48770218ebb339fbbd5/serpent-1.12.tar.gz";
       md5 = "05869ac7b062828b34f8f927f0457b65";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   setproctitle = super.buildPythonPackage {
     name = "setproctitle-1.1.8";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/33/c3/ad367a4f4f1ca90468863ae727ac62f6edb558fc09a003d344a02cfc6ea6/setproctitle-1.1.8.tar.gz";
       md5 = "728f4c8c6031bbe56083a48594027edd";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal ];
+    };
   };
   setuptools = super.buildPythonPackage {
     name = "setuptools-20.8.1";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/c4/19/c1bdc88b53da654df43770f941079dbab4e4788c2dcb5658fb86259894c7/setuptools-20.8.1.zip";
       md5 = "fe58a5cac0df20bb83942b252a4b0543";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   setuptools-scm = super.buildPythonPackage {
     name = "setuptools-scm-1.11.0";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/cd/5f/e3a038292358058d83d764a47d09114aa5a8003ed4529518f9e580f1a94f/setuptools_scm-1.11.0.tar.gz";
       md5 = "4c5c896ba52e134bbc3507bac6400087";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   simplejson = super.buildPythonPackage {
     name = "simplejson-3.7.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/6d/89/7f13f099344eea9d6722779a1f165087cb559598107844b1ac5dbd831fb1/simplejson-3.7.2.tar.gz";
       md5 = "a5fc7d05d4cb38492285553def5d4b46";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit pkgs.lib.licenses.afl21 ];
+    };
   };
   six = super.buildPythonPackage {
     name = "six-1.9.0";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/16/64/1dc5e5976b17466fd7d712e59cbe9fb1e18bec153109e5ba3ed6c9102f1a/six-1.9.0.tar.gz";
       md5 = "476881ef4012262dfc8adc645ee786c4";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   subprocess32 = super.buildPythonPackage {
     name = "subprocess32-3.2.6";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/28/8d/33ccbff51053f59ae6c357310cac0e79246bbed1d345ecc6188b176d72c3/subprocess32-3.2.6.tar.gz";
       md5 = "754c5ab9f533e764f931136974b618f1";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.psfl ];
+    };
   };
   supervisor = super.buildPythonPackage {
     name = "supervisor-3.1.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [meld3];
     src = fetchurl {
       url = "https://pypi.python.org/packages/a6/41/65ad5bd66230b173eb4d0b8810230f3a9c59ef52ae066e540b6b99895db7/supervisor-3.1.3.tar.gz";
       md5 = "aad263c4fbc070de63dd354864d5e552";
     };
+    meta = {
+      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
+    };
   };
   transifex-client = super.buildPythonPackage {
     name = "transifex-client-0.10";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/f3/4e/7b925192aee656fb3e04fa6381c8b3dc40198047c3b4a356f6cfd642c809/transifex-client-0.10.tar.gz";
       md5 = "5549538d84b8eede6b254cd81ae024fa";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.gpl2 ];
+    };
   };
   translationstring = super.buildPythonPackage {
     name = "translationstring-1.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
       md5 = "a4b62e0f3c189c783a1685b3027f7c90";
     };
+    meta = {
+      license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
+    };
   };
   trollius = super.buildPythonPackage {
     name = "trollius-1.0.4";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [futures];
     src = fetchurl {
       url = "https://pypi.python.org/packages/aa/e6/4141db437f55e6ee7a3fb69663239e3fde7841a811b4bef293145ad6c836/trollius-1.0.4.tar.gz";
       md5 = "3631a464d49d0cbfd30ab2918ef2b783";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.asl20 ];
+    };
   };
   uWSGI = super.buildPythonPackage {
     name = "uWSGI-2.0.11.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/9b/78/918db0cfab0546afa580c1e565209c49aaf1476bbfe491314eadbe47c556/uwsgi-2.0.11.2.tar.gz";
       md5 = "1f02dcbee7f6f61de4b1fd68350cf16f";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.gpl2 ];
+    };
   };
   urllib3 = super.buildPythonPackage {
     name = "urllib3-1.16";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/3b/f0/e763169124e3f5db0926bc3dbfcd580a105f9ca44cf5d8e6c7a803c9f6b5/urllib3-1.16.tar.gz";
       md5 = "fcaab1c5385c57deeb7053d3d7d81d59";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.mit ];
+    };
   };
   venusian = super.buildPythonPackage {
     name = "venusian-1.0";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/86/20/1948e0dfc4930ddde3da8c33612f6a5717c0b4bc28f591a5c5cf014dd390/venusian-1.0.tar.gz";
       md5 = "dccf2eafb7113759d60c86faf5538756";
     };
+    meta = {
+      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
+    };
   };
   waitress = super.buildPythonPackage {
     name = "waitress-0.8.9";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [setuptools];
     src = fetchurl {
       url = "https://pypi.python.org/packages/ee/65/fc9dee74a909a1187ca51e4f15ad9c4d35476e4ab5813f73421505c48053/waitress-0.8.9.tar.gz";
       md5 = "da3f2e62b3676be5dd630703a68e2a04";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.zpt21 ];
+    };
   };
   wsgiref = super.buildPythonPackage {
     name = "wsgiref-0.1.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [];
     src = fetchurl {
       url = "https://pypi.python.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
       md5 = "29b146e6ebd0f9fb119fe321f7bcf6cb";
     };
+    meta = {
+      license = [ { fullName = "PSF or ZPL"; } ];
+    };
   };
   zope.cachedescriptors = super.buildPythonPackage {
     name = "zope.cachedescriptors-4.0.0";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [setuptools];
     src = fetchurl {
       url = "https://pypi.python.org/packages/40/33/694b6644c37f28553f4b9f20b3c3a20fb709a22574dff20b5bdffb09ecd5/zope.cachedescriptors-4.0.0.tar.gz";
       md5 = "8d308de8c936792c8e758058fcb7d0f0";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.zpt21 ];
+    };
   };
   zope.deprecation = super.buildPythonPackage {
     name = "zope.deprecation-4.1.2";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [setuptools];
     src = fetchurl {
       url = "https://pypi.python.org/packages/c1/d3/3919492d5e57d8dd01b36f30b34fc8404a30577392b1eb817c303499ad20/zope.deprecation-4.1.2.tar.gz";
       md5 = "e9a663ded58f4f9f7881beb56cae2782";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.zpt21 ];
+    };
   };
   zope.event = super.buildPythonPackage {
     name = "zope.event-4.0.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [setuptools];
     src = fetchurl {
       url = "https://pypi.python.org/packages/c1/29/91ba884d7d6d96691df592e9e9c2bfa57a47040ec1ff47eff18c85137152/zope.event-4.0.3.tar.gz";
       md5 = "9a3780916332b18b8b85f522bcc3e249";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.zpt21 ];
+    };
   };
   zope.interface = super.buildPythonPackage {
     name = "zope.interface-4.1.3";
     buildInputs = with self; [];
     doCheck = false;
     propagatedBuildInputs = with self; [setuptools];
     src = fetchurl {
       url = "https://pypi.python.org/packages/9d/81/2509ca3c6f59080123c1a8a97125eb48414022618cec0e64eb1313727bfe/zope.interface-4.1.3.tar.gz";
       md5 = "9ae3d24c0c7415deb249dd1a132f0f79";
     };
+    meta = {
+      license = [ pkgs.lib.licenses.zpt21 ];
+    };
   };
 
   ### Test requirements
 
 
 }
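Every hunk above makes the same kind of change: each generated `buildPythonPackage` entry gains a `meta.license` attribute. Note that the `license` values mix two forms — predefined entries from `pkgs.lib.licenses` (such as `mit` or `asl20`) and ad-hoc attribute sets carrying only a `fullName`, used where nixpkgs has no matching predefined license. Both forms are plain attribute sets, so they can be freely combined in one list. A minimal, standalone sketch of the two forms (not taken from this repository):

```nix
# Both values below are acceptable members of a meta.license list:
# a predefined nixpkgs license set, and a bare { fullName = ...; } set
# for licenses nixpkgs does not know about.
let
  pkgs = import <nixpkgs> {};
in {
  fromNixpkgs = [ pkgs.lib.licenses.mit ];
  adHoc = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
}
```

Keeping this metadata in the generated expressions lets downstream tooling report or filter packages by license without consulting PyPI.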
@@ -1,231 +1,232 @@
 #
 # About
 # =====
 #
 # This file defines jobs for our CI system and the attribute "build" is used
 # as the input for packaging.
 #
 #
 # CI details
 # ==========
 #
 # This file defines an attribute set of derivations. Each of these attributes is
 # then used in our CI system as one job to run. This way we keep the
 # configuration for the CI jobs as well under version control.
 #
 # Run CI jobs locally
 # -------------------
 #
 # Since it is all based on normal Nix derivations, the jobs can be tested
 # locally with a run of "nix-build" like the following example:
 #
 #    nix-build release.nix -A test-api -I vcsserver=~/rhodecode-vcsserver
 #
 # Note: Replace "~/rhodecode-vcsserver" with a path where a clone of the
 # vcsserver resides.
 
 { pkgs ? import <nixpkgs> {}
+, doCheck ? true
 }:
 
 let
 
   inherit (pkgs)
     stdenv
     system;
 
   testing = import <nixpkgs/nixos/lib/testing.nix> {
     inherit system;
   };
 
   runInMachine = testing.runInMachine;
 
   sphinx = import ./docs/default.nix {};
 
   mkDocs = kind: stdenv.mkDerivation {
     name = kind;
     srcs = [
       (./. + (builtins.toPath "/${kind}"))
       (builtins.filterSource
         (path: type: baseNameOf path == "VERSION")
         ./rhodecode)
     ];
     sourceRoot = kind;
     buildInputs = [ sphinx ];
     configurePhase = null;
     buildPhase = ''
       make SPHINXBUILD=sphinx-build html
     '';
     installPhase = ''
       mkdir -p $out
       mv _build/html $out/
 
       mkdir -p $out/nix-support
       echo "doc manual $out/html index.html" >> \
         "$out/nix-support/hydra-build-products"
     '';
   };
 
   enterprise = import ./default.nix {
     inherit
       pkgs;
 
     # TODO: for quick local testing
     doCheck = false;
   };
 
   test-cfg = stdenv.mkDerivation {
     name = "test-cfg";
     unpackPhase = "true";
     buildInputs = [
       enterprise.src
     ];
     installPhase = ''
       mkdir -p $out/etc
       cp ${enterprise.src}/test.ini $out/etc/enterprise.ini
       # TODO: johbo: Needed, so that the login works, this causes
       # probably some side effects
       substituteInPlace $out/etc/enterprise.ini --replace "is_test = True" ""
 
       # Gevent configuration
       cp $out/etc/enterprise.ini $out/etc/enterprise-gevent.ini;
       cat >> $out/etc/enterprise-gevent.ini <<EOF
 
       [server:main]
       use = egg:gunicorn#main
       worker_class = gevent
       EOF
 
       cp ${enterprise.src}/vcsserver/test.ini $out/etc/vcsserver.ini
     '';
   };
 
   ac-test-drv = import ./acceptance_tests {
     withExternals = false;
   };
 
   # TODO: johbo: Currently abusing buildPythonPackage to make the
   # needed environment for the ac-test tools.
   mkAcTests = {
     # Path to an INI file which will be used to run Enterprise.
     #
     # Intended usage is to provide different configuration files to
     # run the tests against a different configuration.
     enterpriseCfg ? "${test-cfg}/etc/enterprise.ini"
 
     # Path to an INI file which will be used to run the VCSServer.
   , vcsserverCfg ? "${test-cfg}/etc/vcsserver.ini"
   }: pkgs.pythonPackages.buildPythonPackage {
     name = "enterprise-ac-tests";
     src = ./acceptance_tests;
 
     buildInputs = with pkgs; [
       curl
       enterprise
       ac-test-drv
     ];
 
     buildPhase = ''
       cp ${enterpriseCfg} enterprise.ini
 
       echo "Creating a fake home directory"
       mkdir fake-home
       export HOME=$PWD/fake-home
 
       echo "Creating a repository directory"
       mkdir repos
 
       echo "Preparing the database"
       paster setup-rhodecode \
         --user=admin \
         --email=admin@example.com \
         --password=secret \
         --api-key=9999999999999999999999999999999999999999 \
         --force-yes \
         --repos=$PWD/repos \
         enterprise.ini > /dev/null
 
       echo "Starting rcserver"
       vcsserver --config ${vcsserverCfg} >vcsserver.log 2>&1 &
       rcserver enterprise.ini >rcserver.log 2>&1 &
 
       while ! curl -f -s http://localhost:5000 > /dev/null
       do
         echo "Waiting for server to be ready..."
         sleep 3
       done
       echo "Webserver is ready."
 
       echo "Starting the test run"
       py.test -c example.ini -vs --maxfail=5 tests
|
160 | py.test -c example.ini -vs --maxfail=5 tests | |
160 |
|
161 | |||
161 | echo "Kill rcserver" |
|
162 | echo "Kill rcserver" | |
162 | kill %2 |
|
163 | kill %2 | |
163 | kill %1 |
|
164 | kill %1 | |
164 | ''; |
|
165 | ''; | |
165 |
|
166 | |||
166 | # TODO: johbo: Use the install phase again once the normal mkDerivation |
|
167 | # TODO: johbo: Use the install phase again once the normal mkDerivation | |
167 | # can be used again. |
|
168 | # can be used again. | |
168 | postInstall = '' |
|
169 | postInstall = '' | |
169 | mkdir -p $out |
|
170 | mkdir -p $out | |
170 | cp enterprise.ini $out |
|
171 | cp enterprise.ini $out | |
171 | cp ${vcsserverCfg} $out/vcsserver.ini |
|
172 | cp ${vcsserverCfg} $out/vcsserver.ini | |
172 | cp rcserver.log $out |
|
173 | cp rcserver.log $out | |
173 | cp vcsserver.log $out |
|
174 | cp vcsserver.log $out | |
174 |
|
175 | |||
175 | mkdir -p $out/nix-support |
|
176 | mkdir -p $out/nix-support | |
176 | echo "report config $out enterprise.ini" >> $out/nix-support/hydra-build-products |
|
177 | echo "report config $out enterprise.ini" >> $out/nix-support/hydra-build-products | |
177 | echo "report config $out vcsserver.ini" >> $out/nix-support/hydra-build-products |
|
178 | echo "report config $out vcsserver.ini" >> $out/nix-support/hydra-build-products | |
178 | echo "report rcserver $out rcserver.log" >> $out/nix-support/hydra-build-products |
|
179 | echo "report rcserver $out rcserver.log" >> $out/nix-support/hydra-build-products | |
179 | echo "report vcsserver $out vcsserver.log" >> $out/nix-support/hydra-build-products |
|
180 | echo "report vcsserver $out vcsserver.log" >> $out/nix-support/hydra-build-products | |
180 | ''; |
|
181 | ''; | |
181 | }; |
|
182 | }; | |
182 |
|
183 | |||
183 | vcsserver = import <vcsserver> { |
|
184 | vcsserver = import <vcsserver> { | |
184 | inherit pkgs; |
|
185 | inherit pkgs; | |
185 |
|
186 | |||
186 | # TODO: johbo: Think of a more elegant solution to this problem |
|
187 | # TODO: johbo: Think of a more elegant solution to this problem | |
187 | pythonExternalOverrides = self: super: (enterprise.myPythonPackagesUnfix self); |
|
188 | pythonExternalOverrides = self: super: (enterprise.myPythonPackagesUnfix self); | |
188 | }; |
|
189 | }; | |
189 |
|
190 | |||
190 | runTests = optionString: (enterprise.override (attrs: { |
|
191 | runTests = optionString: (enterprise.override (attrs: { | |
191 | doCheck = true; |
|
192 | doCheck = true; | |
192 | name = "test-run"; |
|
193 | name = "test-run"; | |
193 | buildInputs = attrs.buildInputs ++ [ |
|
194 | buildInputs = attrs.buildInputs ++ [ | |
194 | vcsserver |
|
195 | vcsserver | |
195 | ]; |
|
196 | ]; | |
196 | checkPhase = '' |
|
197 | checkPhase = '' | |
197 | py.test ${optionString} -vv -ra |
|
198 | py.test ${optionString} -vv -ra | |
198 | ''; |
|
199 | ''; | |
199 | buildPhase = attrs.shellHook; |
|
200 | buildPhase = attrs.shellHook; | |
200 | installPhase = '' |
|
201 | installPhase = '' | |
201 | echo "Intentionally not installing anything" |
|
202 | echo "Intentionally not installing anything" | |
202 | ''; |
|
203 | ''; | |
203 | meta.description = "Enterprise test run ${optionString}"; |
|
204 | meta.description = "Enterprise test run ${optionString}"; | |
204 | })); |
|
205 | })); | |
205 |
|
206 | |||
206 | jobs = { |
|
207 | jobs = { | |
207 |
|
208 | |||
208 | build = enterprise; |
|
209 | build = enterprise; | |
209 |
|
210 | |||
210 | # johbo: Currently this is simply running the tests against the sources. Nicer |
|
211 | # johbo: Currently this is simply running the tests against the sources. Nicer | |
211 | # would be to run xdist and against the installed application, so that we also |
|
212 | # would be to run xdist and against the installed application, so that we also | |
212 | # cover the impact of installing the application. |
|
213 | # cover the impact of installing the application. | |
213 | test-api = runTests "rhodecode/api"; |
|
214 | test-api = runTests "rhodecode/api"; | |
214 | test-functional = runTests "rhodecode/tests/functional"; |
|
215 | test-functional = runTests "rhodecode/tests/functional"; | |
215 | test-rest = runTests "rhodecode/tests --ignore=rhodecode/tests/functional"; |
|
216 | test-rest = runTests "rhodecode/tests --ignore=rhodecode/tests/functional"; | |
216 | test-full = runTests "rhodecode"; |
|
217 | test-full = runTests "rhodecode"; | |
217 |
|
218 | |||
218 | docs = mkDocs "docs"; |
|
219 | docs = mkDocs "docs"; | |
219 |
|
220 | |||
220 | aggregate = pkgs.releaseTools.aggregate { |
|
221 | aggregate = pkgs.releaseTools.aggregate { | |
221 | name = "aggregated-jobs"; |
|
222 | name = "aggregated-jobs"; | |
222 | constituents = [ |
|
223 | constituents = [ | |
223 | jobs.build |
|
224 | jobs.build | |
224 | jobs.test-api |
|
225 | jobs.test-api | |
225 | jobs.test-rest |
|
226 | jobs.test-rest | |
226 | jobs.docs |
|
227 | jobs.docs | |
227 | ]; |
|
228 | ]; | |
228 | }; |
|
229 | }; | |
229 | }; |
|
230 | }; | |
230 |
|
231 | |||
231 | in jobs |
|
232 | in jobs |
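The `buildPhase` above gates the test run on a curl loop that polls `http://localhost:5000` until the freshly started server answers. The same readiness-wait pattern can be sketched in Python; this is a hedged stand-alone sketch, where `probe` is a hypothetical callable standing in for the real `curl -f -s` check:

```python
import time

def wait_until_ready(probe, timeout=60.0, interval=3.0):
    """Poll `probe` until it returns True; give up after `timeout` seconds.

    Mirrors the shell loop above: `probe` plays the role of the curl
    health check, `interval` the `sleep 3` between attempts.
    """
    deadline = time.monotonic() + timeout
    while True:
        if probe():
            return True            # server answered: ready
        if time.monotonic() >= deadline:
            return False           # gave up waiting
        time.sleep(interval)
```

A real caller would pass a probe that issues the HTTP request, e.g. `lambda: requests.get(url).ok` wrapped in a try/except.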
@@ -1,151 +1,150 @@
 Babel==1.3
 Beaker==1.7.0
 CProfileV==1.0.6
 Fabric==1.10.0
 FormEncode==1.2.4
 Jinja2==2.7.3
 Mako==1.0.1
 Markdown==2.6.2
 MarkupSafe==0.23
 MySQL-python==1.2.5
 Paste==2.0.2
 PasteDeploy==1.5.2
 PasteScript==1.7.5
 Pygments==2.0.2
 
 # TODO: This version is not available on PyPI
 # Pylons==1.0.2.dev20160108
 Pylons==1.0.1
 
 # TODO: This version is not available, but newer ones are
 # Pyro4==4.35
 Pyro4==4.41
 
 # TODO: This should probably not be in here
 # -e hg+https://johbo@code.rhodecode.com/johbo/rhodecode-fork@3a454bd1f17c0b2b2a951cf2b111e0320d7942a9#egg=RhodeCodeEnterprise-dev
 
 # TODO: This is not really a dependency, we should add it only
 # into the development environment, since there it is useful.
 # RhodeCodeVCSServer==3.9.0
 
 Routes==1.13
 SQLAlchemy==0.9.9
 Sphinx==1.2.2
 Tempita==0.5.2
 URLObject==2.4.0
 WebError==0.10.3
 
 # TODO: This is modified by us, needs a better integration. For now
 # using the latest version before.
 # WebHelpers==1.3.dev20150807
 WebHelpers==1.3
 
 WebHelpers2==2.0
 WebOb==1.3.1
 WebTest==1.4.3
 Whoosh==2.7.0
 alembic==0.8.4
 amqplib==1.0.2
 anyjson==0.3.3
 appenlight-client==0.6.14
 authomatic==0.1.0.post1;
 backport-ipaddress==0.1
 bottle==0.12.8
 bumpversion==0.5.3
 celery==2.2.10
 click==5.1
 colander==1.2
 configobj==5.0.6
 cov-core==1.15.0
 coverage==3.7.1
 cssselect==0.9.1
 decorator==3.4.2
 docutils==0.12
 dogpile.cache==0.5.7
 dogpile.core==0.4.1
 dulwich==0.12.0
 ecdsa==0.11
 flake8==2.4.1
 future==0.14.3
 futures==3.0.2
 gprof2dot==2015.12.1
-greenlet==0.4.9
 gunicorn==19.6.0
 
 # TODO: Needs subvertpy and blows up without Subversion headers,
 # actually we should not need this for Enterprise at all.
 # hgsubversion==1.8.2
 
 gnureadline==6.3.3
 infrae.cache==1.0.1
-invoke==0.1
+invoke==0.13.0
 ipdb==0.8
 ipython==3.1.0
 iso8601==0.1.11
 itsdangerous==0.24
 kombu==1.5.1
 lxml==3.4.4
 mccabe==0.3
 meld3==1.0.2
 mock==1.0.1
 msgpack-python==0.4.6
 nose==1.3.6
 objgraph==2.0.0
 packaging==15.2
 paramiko==1.15.1
 pep8==1.5.7
 psutil==2.2.1
 psycopg2==2.6
 py==1.4.29
 py-bcrypt==0.4
 pycrypto==2.6.1
 pycurl==7.19.5
 pyflakes==0.8.1
 pyparsing==1.5.7
 pyramid==1.6.1
 pyramid-beaker==0.8
 pyramid-debugtoolbar==2.4.2
 pyramid-jinja2==2.5
 pyramid-mako==1.0.2
 pysqlite==2.6.3
 pytest==2.8.5
 pytest-runner==2.7.1
 pytest-catchlog==1.2.2
 pytest-cov==1.8.1
 pytest-profiling==1.0.1
 pytest-timeout==0.4
 python-dateutil==1.5
 python-ldap==2.4.19
 python-memcached==1.57
 python-pam==1.8.2
 pytz==2015.4
 pyzmq==14.6.0
 
 # TODO: This is not available in public
 # rc-testdata==0.2.0
 
 https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.8.3.zip#md5=9acdfd71b8ddf4056057065f37ab9ccb
 
 
 recaptcha-client==1.0.6
 repoze.lru==0.6
 requests==2.9.1
 serpent==1.12
 setproctitle==1.1.8
 setuptools==20.8.1
 setuptools-scm==1.11.0
 simplejson==3.7.2
 six==1.9.0
 subprocess32==3.2.6
 supervisor==3.1.3
 transifex-client==0.10
 translationstring==1.3
 trollius==1.0.4
 uWSGI==2.0.11.2
 venusian==1.0
 waitress==0.8.9
 wsgiref==0.1.2
 zope.cachedescriptors==4.0.0
 zope.deprecation==4.1.2
 zope.event==4.0.3
 zope.interface==4.1.3
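The pin list above mixes exact `pkg==version` pins with comments, an editable-install remnant, and one direct archive URL. Reading such a file can be sketched as follows; this is a hedged illustration, not pip's actual parser, and `parse_pin` is a hypothetical helper:

```python
def parse_pin(line):
    """Parse one line of a pip requirements file.

    Returns (name, version) for exact `pkg==version` pins; returns None
    for blank lines, comments, editable installs, and direct URLs, all of
    which appear in the file above.
    """
    line = line.strip()
    if not line or line.startswith(('#', '-e ', 'http://', 'https://')):
        return None
    name, sep, version = line.partition('==')
    if not sep:
        return None
    # A few pins carry a trailing environment-marker separator,
    # e.g. "authomatic==0.1.0.post1;" -- drop it from the version.
    return name, version.split(';')[0]
```

Applied to the diff above, such a helper makes version bumps like `invoke==0.1` to `invoke==0.13.0` easy to detect mechanically.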
@@ -1,609 +1,615 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 |
|
2 | |||
3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
4 | # |
|
4 | # | |
5 | # This program is free software: you can redistribute it and/or modify |
|
5 | # This program is free software: you can redistribute it and/or modify | |
6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
7 | # (only), as published by the Free Software Foundation. |
|
7 | # (only), as published by the Free Software Foundation. | |
8 | # |
|
8 | # | |
9 | # This program is distributed in the hope that it will be useful, |
|
9 | # This program is distributed in the hope that it will be useful, | |
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
12 | # GNU General Public License for more details. |
|
12 | # GNU General Public License for more details. | |
13 | # |
|
13 | # | |
14 | # You should have received a copy of the GNU Affero General Public License |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
16 | # |
|
16 | # | |
17 | # This program is dual-licensed. If you wish to learn more about the |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
20 |
|
20 | |||
21 | """ |
|
21 | """ | |
22 | Authentication modules |
|
22 | Authentication modules | |
23 | """ |
|
23 | """ | |
24 |
|
24 | |||
|
25 | import colander | |||
25 | import logging |
|
26 | import logging | |
26 | import time |
|
27 | import time | |
27 | import traceback |
|
28 | import traceback | |
28 | import warnings |
|
29 | import warnings | |
29 |
|
30 | |||
30 | from pyramid.threadlocal import get_current_registry |
|
31 | from pyramid.threadlocal import get_current_registry | |
31 | from sqlalchemy.ext.hybrid import hybrid_property |
|
32 | from sqlalchemy.ext.hybrid import hybrid_property | |
32 |
|
33 | |||
33 | from rhodecode.authentication.interface import IAuthnPluginRegistry |
|
34 | from rhodecode.authentication.interface import IAuthnPluginRegistry | |
34 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase |
|
35 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase | |
35 | from rhodecode.lib import caches |
|
36 | from rhodecode.lib import caches | |
36 | from rhodecode.lib.auth import PasswordGenerator, _RhodeCodeCryptoBCrypt |
|
37 | from rhodecode.lib.auth import PasswordGenerator, _RhodeCodeCryptoBCrypt | |
37 | from rhodecode.lib.utils2 import md5_safe, safe_int |
|
38 | from rhodecode.lib.utils2 import md5_safe, safe_int | |
38 | from rhodecode.lib.utils2 import safe_str |
|
39 | from rhodecode.lib.utils2 import safe_str | |
39 | from rhodecode.model.db import User |
|
40 | from rhodecode.model.db import User | |
40 | from rhodecode.model.meta import Session |
|
41 | from rhodecode.model.meta import Session | |
41 | from rhodecode.model.settings import SettingsModel |
|
42 | from rhodecode.model.settings import SettingsModel | |
42 | from rhodecode.model.user import UserModel |
|
43 | from rhodecode.model.user import UserModel | |
43 | from rhodecode.model.user_group import UserGroupModel |
|
44 | from rhodecode.model.user_group import UserGroupModel | |
44 |
|
45 | |||
45 |
|
46 | |||
46 | log = logging.getLogger(__name__) |
|
47 | log = logging.getLogger(__name__) | |
47 |
|
48 | |||
48 | # auth types that authenticate() function can receive |
|
49 | # auth types that authenticate() function can receive | |
49 | VCS_TYPE = 'vcs' |
|
50 | VCS_TYPE = 'vcs' | |
50 | HTTP_TYPE = 'http' |
|
51 | HTTP_TYPE = 'http' | |
51 |
|
52 | |||
52 |
|
53 | |||
53 | class LazyFormencode(object): |
|
54 | class LazyFormencode(object): | |
54 | def __init__(self, formencode_obj, *args, **kwargs): |
|
55 | def __init__(self, formencode_obj, *args, **kwargs): | |
55 | self.formencode_obj = formencode_obj |
|
56 | self.formencode_obj = formencode_obj | |
56 | self.args = args |
|
57 | self.args = args | |
57 | self.kwargs = kwargs |
|
58 | self.kwargs = kwargs | |
58 |
|
59 | |||
59 | def __call__(self, *args, **kwargs): |
|
60 | def __call__(self, *args, **kwargs): | |
60 | from inspect import isfunction |
|
61 | from inspect import isfunction | |
61 | formencode_obj = self.formencode_obj |
|
62 | formencode_obj = self.formencode_obj | |
62 | if isfunction(formencode_obj): |
|
63 | if isfunction(formencode_obj): | |
63 | # case we wrap validators into functions |
|
64 | # case we wrap validators into functions | |
64 | formencode_obj = self.formencode_obj(*args, **kwargs) |
|
65 | formencode_obj = self.formencode_obj(*args, **kwargs) | |
65 | return formencode_obj(*self.args, **self.kwargs) |
|
66 | return formencode_obj(*self.args, **self.kwargs) | |
66 |
|
67 | |||
67 |
|
68 | |||
68 | class RhodeCodeAuthPluginBase(object): |
|
69 | class RhodeCodeAuthPluginBase(object): | |
69 | # cache the authentication request for N amount of seconds. Some kind |
|
70 | # cache the authentication request for N amount of seconds. Some kind | |
70 | # of authentication methods are very heavy and it's very efficient to cache |
|
71 | # of authentication methods are very heavy and it's very efficient to cache | |
71 | # the result of a call. If it's set to None (default) cache is off |
|
72 | # the result of a call. If it's set to None (default) cache is off | |
72 | AUTH_CACHE_TTL = None |
|
73 | AUTH_CACHE_TTL = None | |
73 | AUTH_CACHE = {} |
|
74 | AUTH_CACHE = {} | |
74 |
|
75 | |||
75 | auth_func_attrs = { |
|
76 | auth_func_attrs = { | |
76 | "username": "unique username", |
|
77 | "username": "unique username", | |
77 | "firstname": "first name", |
|
78 | "firstname": "first name", | |
78 | "lastname": "last name", |
|
79 | "lastname": "last name", | |
79 | "email": "email address", |
|
80 | "email": "email address", | |
80 | "groups": '["list", "of", "groups"]', |
|
81 | "groups": '["list", "of", "groups"]', | |
81 | "extern_name": "name in external source of record", |
|
82 | "extern_name": "name in external source of record", | |
82 | "extern_type": "type of external source of record", |
|
83 | "extern_type": "type of external source of record", | |
83 | "admin": 'True|False defines if user should be RhodeCode super admin', |
|
84 | "admin": 'True|False defines if user should be RhodeCode super admin', | |
84 | "active": |
|
85 | "active": | |
85 | 'True|False defines active state of user internally for RhodeCode', |
|
86 | 'True|False defines active state of user internally for RhodeCode', | |
86 | "active_from_extern": |
|
87 | "active_from_extern": | |
87 | "True|False\None, active state from the external auth, " |
|
88 | "True|False\None, active state from the external auth, " | |
88 | "None means use definition from RhodeCode extern_type active value" |
|
89 | "None means use definition from RhodeCode extern_type active value" | |
89 | } |
|
90 | } | |
90 | # set on authenticate() method and via set_auth_type func. |
|
91 | # set on authenticate() method and via set_auth_type func. | |
91 | auth_type = None |
|
92 | auth_type = None | |
92 |
|
93 | |||
93 | # List of setting names to store encrypted. Plugins may override this list |
|
94 | # List of setting names to store encrypted. Plugins may override this list | |
94 | # to store settings encrypted. |
|
95 | # to store settings encrypted. | |
95 | _settings_encrypted = [] |
|
96 | _settings_encrypted = [] | |
96 |
|
97 | |||
97 | # Mapping of python to DB settings model types. Plugins may override or |
|
98 | # Mapping of python to DB settings model types. Plugins may override or | |
98 | # extend this mapping. |
|
99 | # extend this mapping. | |
99 | _settings_type_map = { |
|
100 | _settings_type_map = { | |
100 | str: 'str', |
|
101 | colander.String: 'unicode', | |
101 |
|
|
102 | colander.Integer: 'int', | |
102 |
|
|
103 | colander.Boolean: 'bool', | |
103 |
|
|
104 | colander.List: 'list', | |
104 | list: 'list', |
|
|||
105 | } |
|
105 | } | |
106 |
|
106 | |||
107 | def __init__(self, plugin_id): |
|
107 | def __init__(self, plugin_id): | |
108 | self._plugin_id = plugin_id |
|
108 | self._plugin_id = plugin_id | |
109 |
|
109 | |||
|
110 | def __str__(self): | |||
|
111 | return self.get_id() | |||
|
112 | ||||
110 | def _get_setting_full_name(self, name): |
|
113 | def _get_setting_full_name(self, name): | |
111 | """ |
|
114 | """ | |
112 | Return the full setting name used for storing values in the database. |
|
115 | Return the full setting name used for storing values in the database. | |
113 | """ |
|
116 | """ | |
114 | # TODO: johbo: Using the name here is problematic. It would be good to |
|
117 | # TODO: johbo: Using the name here is problematic. It would be good to | |
115 | # introduce either new models in the database to hold Plugin and |
|
118 | # introduce either new models in the database to hold Plugin and | |
116 | # PluginSetting or to use the plugin id here. |
|
119 | # PluginSetting or to use the plugin id here. | |
117 | return 'auth_{}_{}'.format(self.name, name) |
|
120 | return 'auth_{}_{}'.format(self.name, name) | |
118 |
|
121 | |||
119 |
def _get_setting_type(self, name |
|
122 | def _get_setting_type(self, name): | |
|
123 | """ | |||
|
124 | Return the type of a setting. This type is defined by the SettingsModel | |||
|
125 | and determines how the setting is stored in DB. Optionally the suffix | |||
|
126 | `.encrypted` is appended to instruct SettingsModel to store it | |||
|
127 | encrypted. | |||
120 |
|
|
128 | """ | |
121 | Get the type as used by the SettingsModel accordingly to type of passed |
|
129 | schema_node = self.get_settings_schema().get(name) | |
122 | value. Optionally the suffix `.encrypted` is appended to instruct |
|
130 | db_type = self._settings_type_map.get( | |
123 | SettingsModel to store it encrypted. |
|
131 | type(schema_node.typ), 'unicode') | |
124 | """ |
|
|||
125 | type_ = self._settings_type_map.get(type(value), 'unicode') |
|
|||
126 | if name in self._settings_encrypted: |
|
132 | if name in self._settings_encrypted: | |
127 |
type |
|
133 | db_type = '{}.encrypted'.format(db_type) | |
128 |
return type |
|
134 | return db_type | |
129 |
|
135 | |||
130 | def is_enabled(self): |
|
136 | def is_enabled(self): | |
131 | """ |
|
137 | """ | |
132 | Returns true if this plugin is enabled. An enabled plugin can be |
|
138 | Returns true if this plugin is enabled. An enabled plugin can be | |
133 | configured in the admin interface but it is not consulted during |
|
139 | configured in the admin interface but it is not consulted during | |
134 | authentication. |
|
140 | authentication. | |
135 | """ |
|
141 | """ | |
136 | auth_plugins = SettingsModel().get_auth_plugins() |
|
142 | auth_plugins = SettingsModel().get_auth_plugins() | |
137 | return self.get_id() in auth_plugins |
|
143 | return self.get_id() in auth_plugins | |
138 |
|
144 | |||
139 | def is_active(self): |
|
145 | def is_active(self): | |
140 | """ |
|
146 | """ | |
141 | Returns true if the plugin is activated. An activated plugin is |
|
147 | Returns true if the plugin is activated. An activated plugin is | |
142 | consulted during authentication, assumed it is also enabled. |
|
148 | consulted during authentication, assumed it is also enabled. | |
143 | """ |
|
149 | """ | |
144 | return self.get_setting_by_name('enabled') |
|
150 | return self.get_setting_by_name('enabled') | |
145 |
|
151 | |||
146 | def get_id(self): |
|
152 | def get_id(self): | |
147 | """ |
|
153 | """ | |
148 | Returns the plugin id. |
|
154 | Returns the plugin id. | |
149 | """ |
|
155 | """ | |
150 | return self._plugin_id |
|
156 | return self._plugin_id | |
151 |
|
157 | |||
152 | def get_display_name(self): |
|
158 | def get_display_name(self): | |
153 | """ |
|
159 | """ | |
154 | Returns a translation string for displaying purposes. |
|
160 | Returns a translation string for displaying purposes. | |
155 | """ |
|
161 | """ | |
156 | raise NotImplementedError('Not implemented in base class') |
|
162 | raise NotImplementedError('Not implemented in base class') | |
157 |
|
163 | |||
158 | def get_settings_schema(self): |
|
164 | def get_settings_schema(self): | |
159 | """ |
|
165 | """ | |
160 | Returns a colander schema, representing the plugin settings. |
|
166 | Returns a colander schema, representing the plugin settings. | |
161 | """ |
|
167 | """ | |
162 | return AuthnPluginSettingsSchemaBase() |
|
168 | return AuthnPluginSettingsSchemaBase() | |
163 |
|
169 | |||
164 | def get_setting_by_name(self, name): |
|
170 | def get_setting_by_name(self, name, default=None): | |
165 | """ |
|
171 | """ | |
166 | Returns a plugin setting by name. |
|
172 | Returns a plugin setting by name. | |
167 | """ |
|
173 | """ | |
168 | full_name = self._get_setting_full_name(name) |
|
174 | full_name = self._get_setting_full_name(name) | |
169 | db_setting = SettingsModel().get_setting_by_name(full_name) |
|
175 | db_setting = SettingsModel().get_setting_by_name(full_name) | |
170 |
return db_setting.app_settings_value if db_setting else |
|
176 | return db_setting.app_settings_value if db_setting else default | |
171 |
|
177 | |||
172 | def create_or_update_setting(self, name, value): |
|
178 | def create_or_update_setting(self, name, value): | |
173 | """ |
|
179 | """ | |
174 | Create or update a setting for this plugin in the persistent storage. |
|
180 | Create or update a setting for this plugin in the persistent storage. | |
175 | """ |
|
181 | """ | |
176 | full_name = self._get_setting_full_name(name) |
|
182 | full_name = self._get_setting_full_name(name) | |
177 |
type_ = self._get_setting_type(name |
|
183 | type_ = self._get_setting_type(name) | |
178 | db_setting = SettingsModel().create_or_update_setting( |
|
184 | db_setting = SettingsModel().create_or_update_setting( | |
179 | full_name, value, type_) |
|
185 | full_name, value, type_) | |
180 | return db_setting.app_settings_value |
|
186 | return db_setting.app_settings_value | |
181 |
|
187 | |||
182 | def get_settings(self): |
|
188 | def get_settings(self): | |
183 | """ |
|
189 | """ | |
184 | Returns the plugin settings as dictionary. |
|
190 | Returns the plugin settings as dictionary. | |
185 | """ |
|
191 | """ | |
186 | settings = {} |
|
192 | settings = {} | |
187 | for node in self.get_settings_schema(): |
|
193 | for node in self.get_settings_schema(): | |
188 | settings[node.name] = self.get_setting_by_name(node.name) |
|
194 | settings[node.name] = self.get_setting_by_name(node.name) | |
189 | return settings |
|
195 | return settings | |
190 |
|
196 | |||
    @property
    def validators(self):
        """
        Exposes RhodeCode validators modules
        """
        # this is a hack to overcome issues with pylons threadlocals and
        # translator object _() not being registered properly.
        class LazyCaller(object):
            def __init__(self, name):
                self.validator_name = name

            def __call__(self, *args, **kwargs):
                from rhodecode.model import validators as v
                obj = getattr(v, self.validator_name)
                # log.debug('Initializing lazy formencode object: %s', obj)
                return LazyFormencode(obj, *args, **kwargs)

        class ProxyGet(object):
            def __getattribute__(self, name):
                return LazyCaller(name)

        return ProxyGet()

    @hybrid_property
    def name(self):
        """
        Returns the name of this authentication plugin.

        :returns: string
        """
        raise NotImplementedError("Not implemented in base class")

    @property
    def is_headers_auth(self):
        """
        Returns True if this authentication plugin uses HTTP headers as
        authentication method.
        """
        return False

    @hybrid_property
    def is_container_auth(self):
        """
        Deprecated method that indicates if this authentication plugin uses
        HTTP headers as authentication method.
        """
        warnings.warn(
            'Use is_headers_auth instead.', category=DeprecationWarning)
        return self.is_headers_auth

    @hybrid_property
    def allows_creating_users(self):
        """
        Defines if Plugin allows users to be created on-the-fly when
        authentication is called. Controls how external plugins should behave
        in terms of whether they are allowed to create new users, or not.
        Base plugins should not be allowed to, but external ones should be!

        :return: bool
        """
        return False

    def set_auth_type(self, auth_type):
        self.auth_type = auth_type

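The `is_container_auth` property above shows the standard deprecation-shim pattern: the old name warns and delegates to the new one. A self-contained sketch (the `Plugin` class here is a stand-in, not the RhodeCode base class):

```python
import warnings

# Minimal sketch of the deprecation pattern used by is_container_auth:
# the old property emits a DeprecationWarning, then delegates to the new one.
class Plugin(object):
    @property
    def is_headers_auth(self):
        return False

    @property
    def is_container_auth(self):
        warnings.warn(
            'Use is_headers_auth instead.', category=DeprecationWarning)
        return self.is_headers_auth


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    value = Plugin().is_container_auth

print(value, caught[0].category.__name__)
```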
    def allows_authentication_from(
            self, user, allows_non_existing_user=True,
            allowed_auth_plugins=None, allowed_auth_sources=None):
        """
        Checks if this authentication module should accept a request for
        the current user.

        :param user: user object fetched using plugin's get_user() method.
        :param allows_non_existing_user: if False, don't allow the
            user to be empty, meaning not existing in our database
        :param allowed_auth_plugins: if provided, users extern_type will be
            checked against a list of provided extern types, which are plugin
            auth_names in the end
        :param allowed_auth_sources: authentication type allowed,
            `http` or `vcs`, default is both.
            Defines if plugin will accept only http authentication, vcs
            authentication (git/hg), or both.
        :returns: boolean
        """
        if not user and not allows_non_existing_user:
            log.debug('User is empty but plugin does not allow empty users, '
                      'not allowed to authenticate')
            return False

        expected_auth_plugins = allowed_auth_plugins or [self.name]
        if user and (user.extern_type and
                     user.extern_type not in expected_auth_plugins):
            log.debug(
                'User `%s` is bound to `%s` auth type. Plugin allows only '
                '%s, skipping', user, user.extern_type, expected_auth_plugins)
            return False

        # by default accept both
        expected_auth_from = allowed_auth_sources or [HTTP_TYPE, VCS_TYPE]
        if self.auth_type not in expected_auth_from:
            log.debug('Current auth source is %s but plugin only allows %s',
                      self.auth_type, expected_auth_from)
            return False

        return True

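The method above applies three gates in order: empty-user policy, `extern_type` binding, and allowed auth source. The gates can be sketched as a standalone function (the flat-argument signature here is illustrative; the real method works on a user object and plugin state):

```python
HTTP_TYPE = 'http'
VCS_TYPE = 'vcs'


# Standalone sketch of the three gate checks in allows_authentication_from:
# 1) reject empty users when not allowed, 2) reject users bound to another
# plugin via extern_type, 3) reject disallowed auth sources.
def allows_authentication_from(user_extern_type, plugin_name, auth_type,
                               user_exists=True,
                               allows_non_existing_user=True,
                               allowed_auth_plugins=None,
                               allowed_auth_sources=None):
    if not user_exists and not allows_non_existing_user:
        return False
    expected_plugins = allowed_auth_plugins or [plugin_name]
    if (user_exists and user_extern_type and
            user_extern_type not in expected_plugins):
        return False
    expected_sources = allowed_auth_sources or [HTTP_TYPE, VCS_TYPE]
    if auth_type not in expected_sources:
        return False
    return True


# a user bound to the `ldap` plugin is rejected by a different plugin
print(allows_authentication_from('ldap', 'egg:rhodecode', HTTP_TYPE))
```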
    def get_user(self, username=None, **kwargs):
        """
        Helper method for user fetching in plugins. By default it uses a
        simple fetch by username, but this method can be customized in
        plugins, eg. the headers auth plugin fetches the user by environ
        params.

        :param username: username if given to fetch from database
        :param kwargs: extra arguments needed for user fetching.
        """
        user = None
        log.debug(
            'Trying to fetch user `%s` from RhodeCode database', username)
        if username:
            user = User.get_by_username(username)
            if not user:
                log.debug('User not found, fallback to fetch user in '
                          'case insensitive mode')
                user = User.get_by_username(username, case_insensitive=True)
        else:
            log.debug('provided username:`%s` is empty, skipping...', username)
        if not user:
            log.debug('User `%s` not found in database', username)
        return user

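The lookup strategy above (exact match first, then a case-insensitive retry) can be sketched against a plain dict store; the `USERS` dict and `get_user` helper below are illustrative stand-ins for `User.get_by_username`:

```python
# Sketch of get_user()'s lookup strategy: try an exact username match,
# then fall back to a case-insensitive scan before giving up.
USERS = {'Admin': {'username': 'Admin', 'active': True}}


def get_user(username):
    if not username:
        return None
    user = USERS.get(username)
    if not user:
        # fallback: case insensitive mode
        for name, record in USERS.items():
            if name.lower() == username.lower():
                return record
    return user


print(get_user('admin'))
```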
    def user_activation_state(self):
        """
        Defines user activation state when creating new users

        :returns: boolean
        """
        raise NotImplementedError("Not implemented in base class")

    def auth(self, userobj, username, passwd, settings, **kwargs):
        """
        Given a user object (which may be null), username, a plaintext
        password, and a settings object (containing all the keys needed as
        listed in settings()), authenticate this user's login attempt.

        Return None on failure. On success, return a dictionary of the form
        described by RhodeCodeAuthPluginBase.auth_func_attrs.
        This is later validated for correctness.
        """
        raise NotImplementedError("not implemented in base class")

    def _authenticate(self, userobj, username, passwd, settings, **kwargs):
        """
        Wrapper to call self.auth() that validates call on it

        :param userobj: userobj
        :param username: username
        :param passwd: plaintext password
        :param settings: plugin settings
        """
        auth = self.auth(userobj, username, passwd, settings, **kwargs)
        if auth:
            # check if hash should be migrated?
            new_hash = auth.get('_hash_migrate')
            if new_hash:
                self._migrate_hash_to_bcrypt(username, passwd, new_hash)
            return self._validate_auth_return(auth)
        return auth

    def _migrate_hash_to_bcrypt(self, username, password, new_hash):
        new_hash_cypher = _RhodeCodeCryptoBCrypt()
        # extra checks, so make sure new hash is correct.
        password_encoded = safe_str(password)
        if new_hash and new_hash_cypher.hash_check(
                password_encoded, new_hash):
            cur_user = User.get_by_username(username)
            cur_user.password = new_hash
            Session().add(cur_user)
            Session().flush()
            log.info('Migrated user %s hash to bcrypt', cur_user)

    def _validate_auth_return(self, ret):
        if not isinstance(ret, dict):
            raise Exception('returned value from auth must be a dict')
        for k in self.auth_func_attrs:
            if k not in ret:
                raise Exception('Missing %s attribute from returned data' % k)
        return ret


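`_validate_auth_return()` enforces the contract between a plugin's `auth()` and the rest of the module. The check can be exercised standalone; note the attribute list below is assembled from the keys this module actually reads off the `auth` dict (`auth_func_attrs` itself is defined outside this hunk), so treat it as illustrative:

```python
# Sketch of the _validate_auth_return contract check: auth() must return a
# dict carrying every expected attribute, otherwise authentication is
# treated as broken rather than merely failed.
AUTH_FUNC_ATTRS = ['username', 'firstname', 'lastname', 'email', 'groups',
                   'extern_name', 'admin', 'active', 'active_from_extern']


def validate_auth_return(ret):
    if not isinstance(ret, dict):
        raise Exception('returned value from auth must be a dict')
    for k in AUTH_FUNC_ATTRS:
        if k not in ret:
            raise Exception('Missing %s attribute from returned data' % k)
    return ret
```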
class RhodeCodeExternalAuthPlugin(RhodeCodeAuthPluginBase):

    @hybrid_property
    def allows_creating_users(self):
        return True

    def use_fake_password(self):
        """
        Return a boolean that indicates whether or not we should set the
        user's password to a random value when it is authenticated by this
        plugin. If your plugin provides authentication, then you will
        generally want this.

        :returns: boolean
        """
        raise NotImplementedError("Not implemented in base class")

    def _authenticate(self, userobj, username, passwd, settings, **kwargs):
        # at this point _authenticate calls plugin's `auth()` function
        auth = super(RhodeCodeExternalAuthPlugin, self)._authenticate(
            userobj, username, passwd, settings, **kwargs)
        if auth:
            # maybe the plugin cleaned the username?
            # we should use the return value
            username = auth['username']

            # if the external source tells us that the user is not active, we
            # should skip the rest of the process. This can prevent creating
            # users in RhodeCode when using external authentication, and if
            # it's an inactive user we shouldn't create that user anyway
            if auth['active_from_extern'] is False:
                log.warning(
                    "User %s authenticated against %s, but is inactive",
                    username, self.__module__)
                return None

            cur_user = User.get_by_username(username, case_insensitive=True)
            is_user_existing = cur_user is not None

            if is_user_existing:
                log.debug('Syncing user `%s` from '
                          '`%s` plugin', username, self.name)
            else:
                log.debug('Creating non existing user `%s` from '
                          '`%s` plugin', username, self.name)

            if self.allows_creating_users:
                log.debug('Plugin `%s` allows to '
                          'create new users', self.name)
            else:
                log.debug('Plugin `%s` does not allow to '
                          'create new users', self.name)

            user_parameters = {
                'username': username,
                'email': auth["email"],
                'firstname': auth["firstname"],
                'lastname': auth["lastname"],
                'active': auth["active"],
                'admin': auth["admin"],
                'extern_name': auth["extern_name"],
                'extern_type': self.name,
                'plugin': self,
                'allow_to_create_user': self.allows_creating_users,
            }

            if not is_user_existing:
                if self.use_fake_password():
                    # Randomize the PW because we don't need it, but don't
                    # want them blank either
                    passwd = PasswordGenerator().gen_password(length=16)
                user_parameters['password'] = passwd
            else:
                # Since the password is required by the create_or_update
                # method of UserModel, we need to set it explicitly.
                # The create_or_update method is smart and recognises the
                # password hashes as well.
                user_parameters['password'] = cur_user.password

            # we either create or update users, and we also pass the flag
            # that controls if this method can actually do that.
            # raises NotAllowedToCreateUserError if it cannot, and we try to.
            user = UserModel().create_or_update(**user_parameters)
            Session().flush()
            # enforce that the user is only in the given groups, all of them
            # have to be ones created from plugins. We store this info in
            # the _group_data JSON field
            try:
                groups = auth['groups'] or []
                UserGroupModel().enforce_groups(user, groups, self.name)
            except Exception:
                # if for any reason group syncing fails, we should
                # proceed with login
                log.error(traceback.format_exc())
            Session().commit()
        return auth


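The password handling during external-user sync above has three branches: a random throwaway password for new users when `use_fake_password()` is true, the supplied password otherwise, and the stored hash for existing users. A standalone sketch (the `pick_password` helper name and flat signature are mine, and `random.choice` stands in for `PasswordGenerator`):

```python
import random
import string


# Sketch of the password choice during external-user sync: new users get a
# random throwaway password when the plugin uses fake passwords, otherwise
# the supplied one; existing users keep their stored (hashed) password.
def pick_password(is_user_existing, use_fake_password, supplied_passwd,
                  stored_password=None):
    if not is_user_existing:
        if use_fake_password:
            # randomize the PW because we don't need it, but don't want
            # it blank either
            return ''.join(random.choice(string.ascii_letters)
                           for _ in range(16))
        return supplied_passwd
    return stored_password


print(len(pick_password(False, True, 'secret')))
```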
def loadplugin(plugin_id):
    """
    Loads and returns an instantiated authentication plugin.
    Returns the RhodeCodeAuthPluginBase subclass on success,
    or None on failure.
    """
    # TODO: Avoid using pyramids thread locals to retrieve the registry.
    authn_registry = get_current_registry().getUtility(IAuthnPluginRegistry)
    plugin = authn_registry.get_plugin(plugin_id)
    if plugin is None:
        log.error('Authentication plugin not found: "%s"', plugin_id)
    return plugin


def get_auth_cache_manager(custom_ttl=None):
    return caches.get_cache_manager(
        'auth_plugins', 'rhodecode.authentication', custom_ttl)


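The cache manager returned here is later keyed, in `authenticate()`, by an md5 over plugin name plus credentials (`md5_safe(plugin.name + username + (password or ''))`). A self-contained sketch of that key derivation, assuming `md5_safe` is a plain md5-hexdigest helper (the `auth_cache_key` name is mine):

```python
import hashlib


# Sketch of the cache key used for cached auth results: an md5 hexdigest
# over plugin name + username + password (password may be empty for
# headers-based auth, hence the `or ''`).
def auth_cache_key(plugin_name, username, password):
    raw = plugin_name + username + (password or '')
    return hashlib.md5(raw.encode('utf-8')).hexdigest()


print(auth_cache_key('egg:demo-plugin', 'dora', None))
```

Note the design consequence: a changed password yields a new key, so stale positive results for old credentials are simply never looked up again and expire with the TTL.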
def authenticate(username, password, environ=None, auth_type=None,
                 skip_missing=False):
    """
    Authentication function used for access control.
    It tries to authenticate based on enabled authentication modules.

    :param username: username can be empty for headers auth
    :param password: password can be empty for headers auth
    :param environ: environ headers passed for headers auth
    :param auth_type: type of authentication, either `HTTP_TYPE` or `VCS_TYPE`
    :param skip_missing: ignores plugins that are in db but not in environment
    :returns: None if auth failed, plugin_user dict if auth is correct
    """
    if not auth_type or auth_type not in [HTTP_TYPE, VCS_TYPE]:
        raise ValueError('auth type must be one of http, vcs, got "%s" '
                         'instead' % auth_type)
    headers_only = environ and not (username and password)

    authn_registry = get_current_registry().getUtility(IAuthnPluginRegistry)
    for plugin in authn_registry.get_plugins_for_authentication():
        plugin.set_auth_type(auth_type)
        user = plugin.get_user(username)
        display_user = user.username if user else username

        if headers_only and not plugin.is_headers_auth:
            log.debug('Auth type is for headers only and plugin `%s` is not '
                      'headers plugin, skipping...', plugin.get_id())
            continue

        # load plugin settings from RhodeCode database
        plugin_settings = plugin.get_settings()
        log.debug('Plugin settings:%s', plugin_settings)

        log.debug('Trying authentication using ** %s **', plugin.get_id())
        # use plugin's method of user extraction.
        user = plugin.get_user(username, environ=environ,
                               settings=plugin_settings)
        display_user = user.username if user else username
        log.debug(
            'Plugin %s extracted user is `%s`', plugin.get_id(), display_user)

        if not plugin.allows_authentication_from(user):
            log.debug('Plugin %s does not accept user `%s` for '
                      'authentication', plugin.get_id(), display_user)
            continue
        else:
            log.debug('Plugin %s accepted user `%s` for authentication',
                      plugin.get_id(), display_user)

        log.info('Authenticating user `%s` using %s plugin',
                 display_user, plugin.get_id())

        _cache_ttl = 0

        if isinstance(plugin.AUTH_CACHE_TTL, (int, long)):
            # plugin cache set inside is more important than the settings value
            _cache_ttl = plugin.AUTH_CACHE_TTL
        elif plugin_settings.get('auth_cache_ttl'):
            _cache_ttl = safe_int(plugin_settings.get('auth_cache_ttl'), 0)

        plugin_cache_active = bool(_cache_ttl and _cache_ttl > 0)

        # get instance of cache manager configured for a namespace
        cache_manager = get_auth_cache_manager(custom_ttl=_cache_ttl)

        log.debug('Cache for plugin `%s` active: %s', plugin.get_id(),
                  plugin_cache_active)

        # for environ based auth the password can be empty, but then the
        # validation is on the server that fills in the env data needed
        # for authentication
        _password_hash = md5_safe(plugin.name + username + (password or ''))

        # _authenticate is a wrapper for the .auth() method of the plugin.
        # it checks if .auth() sends proper data.
        # For RhodeCodeExternalAuthPlugin it also maps users to the
        # database and maps the attributes returned from .auth()
        # to the RhodeCode database. If this function returns data
        # then auth is correct.
        start = time.time()
        log.debug('Running plugin `%s` _authenticate method',
                  plugin.get_id())

        def auth_func():
            """
            This function is used internally in the Beaker cache to
            calculate results
            """
            return plugin._authenticate(
                user, username, password, plugin_settings,
                environ=environ or {})

        if plugin_cache_active:
            plugin_user = cache_manager.get(
                _password_hash, createfunc=auth_func)
        else:
            plugin_user = auth_func()

        auth_time = time.time() - start
        log.debug('Authentication for plugin `%s` completed in %.3fs, '
                  'expiration time of fetched cache %.1fs.',
                  plugin.get_id(), auth_time, _cache_ttl)

        log.debug('PLUGIN USER DATA: %s', plugin_user)

        if plugin_user:
            log.debug('Plugin returned proper authentication data')
            return plugin_user
        # we failed to auth because the .auth() method didn't return a
        # proper user
        log.debug("User `%s` failed to authenticate against %s",
                  display_user, plugin.get_id())
    return None
|
615 | return None |
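The `cache_manager.get(_password_hash, createfunc=auth_func)` call above uses Beaker's get-or-create pattern: the expensive `auth_func` (which wraps `plugin._authenticate`) runs only on a cache miss or after the TTL expires. A minimal self-contained sketch of that pattern, with a hypothetical `TTLCache` standing in for RhodeCode's actual cache manager:

```python
import time


class TTLCache(object):
    """Get-or-create cache: createfunc runs only on a miss or expiry."""

    def __init__(self, ttl):
        self.ttl = ttl
        self._store = {}  # key -> (value, stored_at)

    def get(self, key, createfunc):
        entry = self._store.get(key)
        if entry is not None:
            value, stored_at = entry
            if time.time() - stored_at < self.ttl:
                return value  # fresh hit, createfunc is skipped
        value = createfunc()
        self._store[key] = (value, time.time())
        return value


calls = []

def auth_func():
    # stands in for the expensive plugin._authenticate() call
    calls.append(1)
    return {'username': 'demo'}

cache = TTLCache(ttl=60)
first = cache.get('pwd-hash', createfunc=auth_func)
second = cache.get('pwd-hash', createfunc=auth_func)
# auth_func ran only once; the second lookup was served from cache
```

This is why the cache key is an `md5_safe` hash over plugin name, username, and password: a changed password produces a different key, so stale credentials never authenticate from cache.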
@@ -1,188 +1,192 @@
 # -*- coding: utf-8 -*-
 
 # Copyright (C) 2012-2016 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/
 
 import colander
 import formencode.htmlfill
 import logging
 
 from pyramid.httpexceptions import HTTPFound
 from pyramid.renderers import render
 from pyramid.response import Response
 
 from rhodecode.authentication.base import get_auth_cache_manager
 from rhodecode.authentication.interface import IAuthnPluginRegistry
 from rhodecode.lib import auth
 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
 from rhodecode.model.forms import AuthSettingsForm
 from rhodecode.model.meta import Session
 from rhodecode.model.settings import SettingsModel
 from rhodecode.translation import _
 
 log = logging.getLogger(__name__)
 
 
 class AuthnPluginViewBase(object):
 
     def __init__(self, context, request):
         self.request = request
         self.context = context
         self.plugin = context.plugin
         self._rhodecode_user = request.user
 
     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
     def settings_get(self, defaults=None, errors=None):
         """
         View that displays the plugin settings as a form.
         """
         defaults = defaults or {}
         errors = errors or {}
         schema = self.plugin.get_settings_schema()
 
-        #
+        # Compute default values for the form. Priority is:
+        # 1. Passed to this method 2. DB value 3. Schema default
         for node in schema:
-            db_value = self.plugin.get_setting_by_name(node.name)
-            defaults.setdefault(node.name, db_value)
+            if node.name not in defaults:
+                defaults[node.name] = self.plugin.get_setting_by_name(
+                    node.name, node.default)
 
         template_context = {
             'defaults': defaults,
             'errors': errors,
             'plugin': self.context.plugin,
             'resource': self.context,
         }
 
         return template_context
 
     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
     @auth.CSRFRequired()
     def settings_post(self):
         """
         View that validates and stores the plugin settings.
         """
         schema = self.plugin.get_settings_schema()
+        data = self.request.params
+
         try:
-            valid_data = schema.deserialize(self.request.params)
+            valid_data = schema.deserialize(data)
         except colander.Invalid, e:
             # Display error message and display form again.
             self.request.session.flash(
                 _('Errors exist when saving plugin settings. '
                   'Please check the form inputs.'),
                 queue='error')
-            defaults = schema.flatten(self.request.params)
+            defaults = {key: data[key] for key in data if key in schema}
             return self.settings_get(errors=e.asdict(), defaults=defaults)
 
         # Store validated data.
         for name, value in valid_data.items():
             self.plugin.create_or_update_setting(name, value)
         Session.commit()
 
         # Display success message and redirect.
         self.request.session.flash(
             _('Auth settings updated successfully.'),
             queue='success')
         redirect_to = self.request.resource_path(
             self.context, route_name='auth_home')
         return HTTPFound(redirect_to)
 
 
 # TODO: Ongoing migration in these views.
 # - Maybe we should also use a colander schema for these views.
 class AuthSettingsView(object):
     def __init__(self, context, request):
         self.context = context
         self.request = request
 
         # TODO: Move this into a utility function. It is needed in all view
         # classes during migration. Maybe a mixin?
 
         # Some of the decorators rely on this attribute to be present on the
         # class of the decorated method.
         self._rhodecode_user = request.user
 
     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
     def index(self, defaults=None, errors=None, prefix_error=False):
         defaults = defaults or {}
         authn_registry = self.request.registry.getUtility(IAuthnPluginRegistry)
         enabled_plugins = SettingsModel().get_auth_plugins()
 
         # Create template context and render it.
         template_context = {
             'resource': self.context,
             'available_plugins': authn_registry.get_plugins(),
             'enabled_plugins': enabled_plugins,
         }
         html = render('rhodecode:templates/admin/auth/auth_settings.html',
                       template_context,
                       request=self.request)
 
         # Create form default values and fill the form.
         form_defaults = {
             'auth_plugins': ','.join(enabled_plugins)
         }
         form_defaults.update(defaults)
         html = formencode.htmlfill.render(
             html,
             defaults=form_defaults,
             errors=errors,
             prefix_error=prefix_error,
             encoding="UTF-8",
             force_defaults=False)
 
         return Response(html)
 
     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
     @auth.CSRFRequired()
     def auth_settings(self):
         try:
             form = AuthSettingsForm()()
             form_result = form.to_python(self.request.params)
             plugins = ','.join(form_result['auth_plugins'])
             setting = SettingsModel().create_or_update_setting(
                 'auth_plugins', plugins)
             Session().add(setting)
             Session().commit()
 
             cache_manager = get_auth_cache_manager()
             cache_manager.clear()
             self.request.session.flash(
                 _('Auth settings updated successfully.'),
                 queue='success')
         except formencode.Invalid as errors:
             e = errors.error_dict or {}
             self.request.session.flash(
                 _('Errors exist when saving plugin setting. '
                   'Please check the form inputs.'),
                 queue='error')
             return self.index(
                 defaults=errors.value,
                 errors=e,
                 prefix_error=False)
         except Exception:
             log.exception('Exception in auth_settings')
             self.request.session.flash(
                 _('Error occurred during update of auth settings.'),
                 queue='error')
 
         redirect_to = self.request.resource_path(
             self.context, route_name='auth_home')
         return HTTPFound(redirect_to)
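The rewritten `settings_get` loop above changes how form defaults are resolved: a value passed into the view wins, then a value stored in the DB, then the schema node's default (the old `setdefault` call consulted the DB unconditionally and never fell back to the schema default). A standalone sketch of that priority, where `Node` and `FakePlugin` are hypothetical stand-ins for the colander schema and the auth plugin:

```python
# Hypothetical stand-ins for the colander schema nodes and the plugin.
class Node(object):
    def __init__(self, name, default):
        self.name = name
        self.default = default


class FakePlugin(object):
    def __init__(self, db_settings):
        self._db = db_settings

    def get_setting_by_name(self, name, default):
        # stored DB value if present, otherwise the schema default
        return self._db.get(name, default)


schema = [Node('host', 'localhost'), Node('port', 389), Node('tls', False)]
plugin = FakePlugin(db_settings={'port': 636})


def compute_defaults(passed):
    defaults = dict(passed)
    # Priority: 1. passed to the view  2. DB value  3. schema default
    for node in schema:
        if node.name not in defaults:
            defaults[node.name] = plugin.get_setting_by_name(
                node.name, node.default)
    return defaults


resolved = compute_defaults({'host': 'ldap.example.com'})
# host comes from the caller, port from the DB, tls from the schema
```

The same priority is what lets the error path re-render the form with the user's submitted (invalid) values instead of the stored ones.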
@@ -1,192 +1,192 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 |
|
2 | |||
3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
4 | # |
|
4 | # | |
5 | # This program is free software: you can redistribute it and/or modify |
|
5 | # This program is free software: you can redistribute it and/or modify | |
6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
7 | # (only), as published by the Free Software Foundation. |
|
7 | # (only), as published by the Free Software Foundation. | |
8 | # |
|
8 | # | |
9 | # This program is distributed in the hope that it will be useful, |
|
9 | # This program is distributed in the hope that it will be useful, | |
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
12 | # GNU General Public License for more details. |
|
12 | # GNU General Public License for more details. | |
13 | # |
|
13 | # | |
14 | # You should have received a copy of the GNU Affero General Public License |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
16 | # |
|
16 | # | |
17 | # This program is dual-licensed. If you wish to learn more about the |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
20 |
|
20 | |||
21 | """ |
|
21 | """ | |
22 | Pylons environment configuration |
|
22 | Pylons environment configuration | |
23 | """ |
|
23 | """ | |
24 |
|
24 | |||
25 | import os |
|
25 | import os | |
26 | import logging |
|
26 | import logging | |
27 | import rhodecode |
|
27 | import rhodecode | |
28 | import platform |
|
28 | import platform | |
29 | import re |
|
29 | import re | |
30 | import io |
|
30 | import io | |
31 |
|
31 | |||
32 | from mako.lookup import TemplateLookup |
|
32 | from mako.lookup import TemplateLookup | |
33 | from pylons.configuration import PylonsConfig |
|
33 | from pylons.configuration import PylonsConfig | |
34 | from pylons.error import handle_mako_error |
|
34 | from pylons.error import handle_mako_error | |
35 | from pyramid.settings import asbool |
|
35 | from pyramid.settings import asbool | |
36 |
|
36 | |||
37 | # don't remove this import it does magic for celery |
|
37 | # don't remove this import it does magic for celery | |
38 | from rhodecode.lib import celerypylons # noqa |
|
38 | from rhodecode.lib import celerypylons # noqa | |
39 |
|
39 | |||
40 | import rhodecode.lib.app_globals as app_globals |
|
40 | import rhodecode.lib.app_globals as app_globals | |
41 |
|
41 | |||
42 | from rhodecode.config import utils |
|
42 | from rhodecode.config import utils | |
43 | from rhodecode.config.routing import make_map |
|
43 | from rhodecode.config.routing import make_map | |
44 | from rhodecode.config.jsroutes import generate_jsroutes_content |
|
44 | from rhodecode.config.jsroutes import generate_jsroutes_content | |
45 |
|
45 | |||
46 | from rhodecode.lib import helpers |
|
46 | from rhodecode.lib import helpers | |
47 | from rhodecode.lib.auth import set_available_permissions |
|
47 | from rhodecode.lib.auth import set_available_permissions | |
48 | from rhodecode.lib.utils import ( |
|
48 | from rhodecode.lib.utils import ( | |
49 | repo2db_mapper, make_db_config, set_rhodecode_config, |
|
49 | repo2db_mapper, make_db_config, set_rhodecode_config, | |
50 | load_rcextensions) |
|
50 | load_rcextensions) | |
51 | from rhodecode.lib.utils2 import str2bool, aslist |
|
51 | from rhodecode.lib.utils2 import str2bool, aslist | |
52 | from rhodecode.lib.vcs import connect_vcs, start_vcs_server |
|
52 | from rhodecode.lib.vcs import connect_vcs, start_vcs_server | |
53 | from rhodecode.model.scm import ScmModel |
|
53 | from rhodecode.model.scm import ScmModel | |
54 |
|
54 | |||
55 | log = logging.getLogger(__name__) |
|
55 | log = logging.getLogger(__name__) | |
56 |
|
56 | |||
57 | def load_environment(global_conf, app_conf, initial=False, |
|
57 | def load_environment(global_conf, app_conf, initial=False, | |
58 | test_env=None, test_index=None): |
|
58 | test_env=None, test_index=None): | |
59 | """ |
|
59 | """ | |
60 | Configure the Pylons environment via the ``pylons.config`` |
|
60 | Configure the Pylons environment via the ``pylons.config`` | |
61 | object |
|
61 | object | |
62 | """ |
|
62 | """ | |
63 | config = PylonsConfig() |
|
63 | config = PylonsConfig() | |
64 |
|
64 | |||
65 |
|
65 | |||
66 | # Pylons paths |
|
66 | # Pylons paths | |
67 | root = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) |
|
67 | root = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) | |
68 | paths = { |
|
68 | paths = { | |
69 | 'root': root, |
|
69 | 'root': root, | |
70 | 'controllers': os.path.join(root, 'controllers'), |
|
70 | 'controllers': os.path.join(root, 'controllers'), | |
71 | 'static_files': os.path.join(root, 'public'), |
|
71 | 'static_files': os.path.join(root, 'public'), | |
72 | 'templates': [os.path.join(root, 'templates')], |
|
72 | 'templates': [os.path.join(root, 'templates')], | |
73 | } |
|
73 | } | |
74 |
|
74 | |||
75 | # Initialize config with the basic options |
|
75 | # Initialize config with the basic options | |
76 | config.init_app(global_conf, app_conf, package='rhodecode', paths=paths) |
|
76 | config.init_app(global_conf, app_conf, package='rhodecode', paths=paths) | |
77 |
|
77 | |||
78 | # store some globals into rhodecode |
|
78 | # store some globals into rhodecode | |
79 | rhodecode.CELERY_ENABLED = str2bool(config['app_conf'].get('use_celery')) |
|
79 | rhodecode.CELERY_ENABLED = str2bool(config['app_conf'].get('use_celery')) | |
80 | rhodecode.CELERY_EAGER = str2bool( |
|
80 | rhodecode.CELERY_EAGER = str2bool( | |
81 | config['app_conf'].get('celery.always.eager')) |
|
81 | config['app_conf'].get('celery.always.eager')) | |
82 |
|
82 | |||
83 | config['routes.map'] = make_map(config) |
|
83 | config['routes.map'] = make_map(config) | |
84 |
|
84 | |||
85 | if asbool(config['debug']): |
|
85 | if asbool(config.get('generate_js_files', 'false')): | |
86 | jsroutes = config['routes.map'].jsroutes() |
|
86 | jsroutes = config['routes.map'].jsroutes() | |
87 | jsroutes_file_content = generate_jsroutes_content(jsroutes) |
|
87 | jsroutes_file_content = generate_jsroutes_content(jsroutes) | |
88 | jsroutes_file_path = os.path.join( |
|
88 | jsroutes_file_path = os.path.join( | |
89 | paths['static_files'], 'js', 'rhodecode', 'routes.js') |
|
89 | paths['static_files'], 'js', 'rhodecode', 'routes.js') | |
90 |
|
90 | |||
91 | with io.open(jsroutes_file_path, 'w', encoding='utf-8') as f: |
|
91 | with io.open(jsroutes_file_path, 'w', encoding='utf-8') as f: | |
92 | f.write(jsroutes_file_content) |
|
92 | f.write(jsroutes_file_content) | |
93 |
|
93 | |||
94 | config['pylons.app_globals'] = app_globals.Globals(config) |
|
94 | config['pylons.app_globals'] = app_globals.Globals(config) | |
95 | config['pylons.h'] = helpers |
|
95 | config['pylons.h'] = helpers | |
96 | rhodecode.CONFIG = config |
|
96 | rhodecode.CONFIG = config | |
97 |
|
97 | |||
98 | load_rcextensions(root_path=config['here']) |
|
98 | load_rcextensions(root_path=config['here']) | |
99 |
|
99 | |||
100 | # Setup cache object as early as possible |
|
100 | # Setup cache object as early as possible | |
101 | import pylons |
|
101 | import pylons | |
102 | pylons.cache._push_object(config['pylons.app_globals'].cache) |
|
102 | pylons.cache._push_object(config['pylons.app_globals'].cache) | |
103 |
|
103 | |||
104 | # Create the Mako TemplateLookup, with the default auto-escaping |
|
104 | # Create the Mako TemplateLookup, with the default auto-escaping | |
105 | config['pylons.app_globals'].mako_lookup = TemplateLookup( |
|
105 | config['pylons.app_globals'].mako_lookup = TemplateLookup( | |
106 | directories=paths['templates'], |
|
106 | directories=paths['templates'], | |
107 | error_handler=handle_mako_error, |
|
107 | error_handler=handle_mako_error, | |
108 | module_directory=os.path.join(app_conf['cache_dir'], 'templates'), |
|
108 | module_directory=os.path.join(app_conf['cache_dir'], 'templates'), | |
109 | input_encoding='utf-8', default_filters=['escape'], |
|
109 | input_encoding='utf-8', default_filters=['escape'], | |
110 | imports=['from webhelpers.html import escape']) |
|
110 | imports=['from webhelpers.html import escape']) | |
111 |
|
111 | |||
112 | # sets the c attribute access when don't existing attribute are accessed |
|
112 | # sets the c attribute access when don't existing attribute are accessed | |
113 | config['pylons.strict_tmpl_context'] = True |
|
113 | config['pylons.strict_tmpl_context'] = True | |
114 |
|
114 | |||
115 | # Limit backends to "vcs.backends" from configuration |
|
115 | # Limit backends to "vcs.backends" from configuration | |
116 | backends = config['vcs.backends'] = aslist( |
|
116 | backends = config['vcs.backends'] = aslist( | |
117 | config.get('vcs.backends', 'hg,git'), sep=',') |
|
117 | config.get('vcs.backends', 'hg,git'), sep=',') | |
118 | for alias in rhodecode.BACKENDS.keys(): |
|
118 | for alias in rhodecode.BACKENDS.keys(): | |
119 | if alias not in backends: |
|
119 | if alias not in backends: | |
120 | del rhodecode.BACKENDS[alias] |
|
120 | del rhodecode.BACKENDS[alias] | |
121 | log.info("Enabled backends: %s", backends) |
|
121 | log.info("Enabled backends: %s", backends) | |
122 |
|
122 | |||
123 | # initialize vcs client and optionally run the server if enabled |
|
123 | # initialize vcs client and optionally run the server if enabled | |
124 | vcs_server_uri = config.get('vcs.server', '') |
|
124 | vcs_server_uri = config.get('vcs.server', '') | |
125 | vcs_server_enabled = str2bool(config.get('vcs.server.enable', 'true')) |
|
125 | vcs_server_enabled = str2bool(config.get('vcs.server.enable', 'true')) | |
126 | start_server = ( |
|
126 | start_server = ( | |
127 | str2bool(config.get('vcs.start_server', 'false')) and |
|
127 | str2bool(config.get('vcs.start_server', 'false')) and | |
128 | not int(os.environ.get('RC_VCSSERVER_TEST_DISABLE', '0'))) |
|
128 | not int(os.environ.get('RC_VCSSERVER_TEST_DISABLE', '0'))) | |
129 | if vcs_server_enabled and start_server: |
|
129 | if vcs_server_enabled and start_server: | |
130 | log.info("Starting vcsserver") |
|
130 | log.info("Starting vcsserver") | |
131 | start_vcs_server(server_and_port=vcs_server_uri, |
|
131 | start_vcs_server(server_and_port=vcs_server_uri, | |
132 | protocol=utils.get_vcs_server_protocol(config), |
|
132 | protocol=utils.get_vcs_server_protocol(config), | |
133 | log_level=config['vcs.server.log_level']) |
|
133 | log_level=config['vcs.server.log_level']) | |
134 |
|
134 | |||
135 | set_available_permissions(config) |
|
135 | set_available_permissions(config) | |
136 | db_cfg = make_db_config(clear_session=True) |
|
136 | db_cfg = make_db_config(clear_session=True) | |
137 |
|
137 | |||
138 | repos_path = list(db_cfg.items('paths'))[0][1] |
|
138 | repos_path = list(db_cfg.items('paths'))[0][1] | |
139 | config['base_path'] = repos_path |
|
139 | config['base_path'] = repos_path | |
140 |
|
140 | |||
141 | config['vcs.hooks.direct_calls'] = _use_direct_hook_calls(config) |
|
141 | config['vcs.hooks.direct_calls'] = _use_direct_hook_calls(config) | |
142 | config['vcs.hooks.protocol'] = _get_vcs_hooks_protocol(config) |
|
142 | config['vcs.hooks.protocol'] = _get_vcs_hooks_protocol(config) | |
143 |
|
143 | |||
144 | # store db config also in main global CONFIG |
|
144 | # store db config also in main global CONFIG | |
145 | set_rhodecode_config(config) |
|
145 | set_rhodecode_config(config) | |
146 |
|
146 | |||
147 | # configure instance id |
|
147 | # configure instance id | |
148 | utils.set_instance_id(config) |
|
148 | utils.set_instance_id(config) | |
149 |
|
149 | |||
150 | # CONFIGURATION OPTIONS HERE (note: all config options will override |
|
150 | # CONFIGURATION OPTIONS HERE (note: all config options will override | |
151 | # any Pylons config options) |
|
151 | # any Pylons config options) | |
152 |
|
152 | |||
153 | # store config reference into our module to skip import magic of pylons |
|
153 | # store config reference into our module to skip import magic of pylons | |
154 | rhodecode.CONFIG.update(config) |
|
154 | rhodecode.CONFIG.update(config) | |
155 |
|
155 | |||
156 | utils.configure_pyro4(config) |
|
156 | utils.configure_pyro4(config) | |
157 | utils.configure_vcs(config) |
|
    utils.configure_vcs(config)
    if vcs_server_enabled:
        connect_vcs(vcs_server_uri, utils.get_vcs_server_protocol(config))

    import_on_startup = str2bool(config.get('startup.import_repos', False))
    if vcs_server_enabled and import_on_startup:
        repo2db_mapper(ScmModel().repo_scan(repos_path), remove_obsolete=False)
    return config


def _use_direct_hook_calls(config):
    default_direct_hook_calls = 'false'
    direct_hook_calls = str2bool(
        config.get('vcs.hooks.direct_calls', default_direct_hook_calls))
    return direct_hook_calls


def _get_vcs_hooks_protocol(config):
    protocol = config.get('vcs.hooks.protocol', 'pyro4').lower()
    return protocol


def load_pyramid_environment(global_config, settings):
    # Some parts of the code expect a merge of global and app settings.
    settings_merged = global_config.copy()
    settings_merged.update(settings)

    # If this is a test run we prepare the test environment like
    # creating a test database, test search index and test repositories.
    # This has to be done before the database connection is initialized.
    if settings['is_test']:
        rhodecode.is_test = True
        utils.initialize_test_environment(settings_merged)

    # Initialize the database connection.
    utils.initialize_database(settings_merged)
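The context lines above coerce ini-file strings with `str2bool` (for settings such as `startup.import_repos` and `vcs.hooks.direct_calls`). A minimal sketch of such a helper, assuming common truthy spellings — this is an illustrative reimplementation, not RhodeCode's actual utility:

```python
# Illustrative str2bool-style helper (assumed behaviour, not the real
# rhodecode utility): booleans pass through, strings are matched against
# a small set of truthy spellings, everything else is False.

def str2bool(value):
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')

# Mirrors the config lookup in the hunk above; the config dict is made up.
config = {'startup.import_repos': 'false'}
import_on_startup = str2bool(config.get('startup.import_repos', False))
print(import_on_startup)  # -> False
```

Note the lookup's default is the boolean `False`, which the pass-through branch handles without string conversion.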
@@ -1,42 +1,43 @@
 # -*- coding: utf-8 -*-
 
 # Copyright (C) 2010-2016 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/
 
 def generate_jsroutes_content(jsroutes):
     statements = []
     for url_name, url, fields in jsroutes:
         statements.append(
             "pyroutes.register('%s', '%s', %s);" % (url_name, url, fields))
     return u'''
 /******************************************************************************
 * *
 * DO NOT CHANGE THIS FILE MANUALLY *
 * *
 * *
-* This file is automatically generated when the app starts up *
+* This file is automatically generated when the app starts up with *
+* generate_js_files = true *
 * *
 * To add a route here pass jsroute=True to the route definition in the app *
 * *
 ******************************************************************************/
 function registerRCRoutes() {
     // routes registration
     %s
 }
 ''' % '\n '.join(statements)
 
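The generator in the hunk above can be exercised standalone. This reproduces the new-side function (with a shortened header comment) and feeds it sample route tuples — the route names and URLs are made-up values:

```python
# Standalone reproduction of generate_jsroutes_content from the diff above,
# with the banner comment abbreviated; the sample routes are invented.

def generate_jsroutes_content(jsroutes):
    statements = []
    for url_name, url, fields in jsroutes:
        statements.append(
            "pyroutes.register('%s', '%s', %s);" % (url_name, url, fields))
    return u'''
/* This file is automatically generated -- do not change it manually. */
function registerRCRoutes() {
    // routes registration
    %s
}
''' % '\n    '.join(statements)

content = generate_jsroutes_content(
    [('home', '/', []),
     ('user_profile', '/_profiles/%(username)s', ['username'])])
print(content)
```

Each tuple becomes one `pyroutes.register(...)` call inside `registerRCRoutes()`; URL placeholders like `%(username)s` pass through untouched because they are interpolation arguments, never format strings.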
@@ -1,217 +1,256 @@
 {
-  "cyrus-sasl-2.1.26": {
-    "cyrus": "http://cyrusimap.web.cmu.edu/mediawiki/index.php/Downloads#Licensing"
+  "nodejs-4.3.1": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "
-    "
+  "postgresql-9.5.1": {
+    "PostgreSQL License": "http://spdx.org/licenses/PostgreSQL"
   },
-  "
-    "OpenSSL": "http://spdx.org/licenses/OpenSSL"
-  },
-  "python-2.7.10": {
-    "Python-2.0": "http://spdx.org/licenses/Python-2.0"
+  "python-2.7.11": {
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0"
   },
   "python2.7-Babel-1.3": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-Beaker-1.7.0": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-FormEncode-1.2.4": {
-    "Python
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0"
   },
   "python2.7-Mako-1.0.1": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-Markdown-2.6.2": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-MarkupSafe-0.23": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-Paste-2.0.2": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-PasteDeploy-1.5.2": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-PasteScript-1.7.5": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-Pygments-2.0.2": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
-  "python2.7-Pylons-1.0.
-    "BSD
+  "python2.7-Pylons-1.0.1-patch1": {
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-Pyro4-4.35": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-Routes-1.13": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-SQLAlchemy-0.9.9": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-Tempita-0.5.2": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-URLObject-2.4.0": {
-    "Unlicense": "http://
+    "The Unlicense": "http://unlicense.org/"
   },
   "python2.7-WebError-0.10.3": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-WebHelpers-1.3
-    "BSD
+  "python2.7-WebHelpers-1.3": {
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-WebHelpers2-2.0": {
-    "
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-WebOb-1.3.1": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-Whoosh-2.7.0
-    "BSD
+  "python2.7-Whoosh-2.7.0": {
+    "BSD 2-clause \"Simplified\" License": "http://spdx.org/licenses/BSD-2-Clause",
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-alembic-0.8.4": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-amqplib-1.0.2": {
-    "
+    "GNU Lesser General Public License v3.0 only": "http://spdx.org/licenses/LGPL-3.0"
   },
   "python2.7-anyjson-0.3.3": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
+  },
+  "python2.7-appenlight-client-0.6.14": {
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
-  "python2.7-a
-    "
+  "python2.7-authomatic-0.1.0.post1": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-backport
-    "Python
+  "python2.7-backport-ipaddress-0.1": {
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0"
   },
   "python2.7-celery-2.2.10": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
-  "python2.7-click-
-    "BSD
+  "python2.7-click-5.1": {
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
+  },
+  "python2.7-colander-1.2": {
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
   },
   "python2.7-configobj-5.0.6": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-cssselect-0.9.1": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-decorator-3.4.2": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-docutils-0.12": {
-    "BSD
+    "BSD 2-clause \"Simplified\" License": "http://spdx.org/licenses/BSD-2-Clause"
+  },
+  "python2.7-elasticsearch-2.3.0": {
+    "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0"
+  },
+  "python2.7-elasticsearch-dsl-2.0.0": {
+    "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0"
   },
   "python2.7-future-0.14.3": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-futures-3.0.2": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
+  },
+  "python2.7-gnureadline-6.3.3": {
+    "GNU General Public License v1.0 only": "http://spdx.org/licenses/GPL-1.0"
   },
-  "python2.7-g
-    "MIT": "http://spdx.org/licenses/MIT"
+  "python2.7-gunicorn-19.6.0": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-
-    "
+  "python2.7-infrae.cache-1.0.1": {
+    "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1"
   },
   "python2.7-ipython-3.1.0": {
-    "BSD
-  },
-  "python2.7-kombu-1.5.1-patch1": {
-    "BSD-3-Clause": "http://spdx.org/licenses/BSD-3-Clause"
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
-  "python2.7-
-    "expat": "http://directory.fsf.org/wiki/License:Expat"
+  "python2.7-iso8601-0.1.11": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-
-    "repoze": "http://repoze.org/license.html"
+  "python2.7-kombu-1.5.1": {
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-msgpack-python-0.4.6": {
-    "Apache
-  },
-  "python2.7-objgraph-2.0.0": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0"
   },
   "python2.7-packaging-15.2": {
-    "Apache
+    "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0"
   },
   "python2.7-psutil-2.2.1": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-psycopg2-2.6": {
-    "
+    "GNU Lesser General Public License v3.0 or later": "http://spdx.org/licenses/LGPL-3.0+"
   },
   "python2.7-py-1.4.29": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-py-bcrypt-0.4": {
-    "BSD
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause"
   },
   "python2.7-pycrypto-2.6.1": {
-    "
+    "Public Domain": null
+  },
+  "python2.7-pycurl-7.19.5": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-pyparsing-1.5.7": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
+  },
+  "python2.7-pyramid-1.6.1": {
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
+  },
+  "python2.7-pyramid-beaker-0.8": {
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
+  },
+  "python2.7-pyramid-debugtoolbar-2.4.2": {
+    "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause",
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
+  },
+  "python2.7-pyramid-mako-1.0.2": {
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
   },
   "python2.7-pysqlite-2.6.3": {
-    "
-    "
+    "libpng License": "http://spdx.org/licenses/Libpng",
+    "zlib License": "http://spdx.org/licenses/Zlib"
   },
   "python2.7-pytest-2.8.5": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
+  },
+  "python2.7-pytest-runner-2.7.1": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-python-dateutil-1.5": {
-    "BSD-2-Clause": "http://spdx.org/licenses/BSD-2-Clause"
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0"
+  },
+  "python2.7-python-editor-1.0.1": {
+    "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0"
   },
   "python2.7-python-ldap-2.4.19": {
-    "Python
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0"
+  },
+  "python2.7-python-memcached-1.57": {
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0"
   },
   "python2.7-pytz-2015.4": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-recaptcha-client-1.0.6": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-repoze.lru-0.6": {
-    "
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
   },
-  "python2.7-requests-2.
-    "A
+  "python2.7-requests-2.9.1": {
+    "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0"
   },
-  "python2.7-serpent-1.1
-    "MIT": "http://spdx.org/licenses/MIT"
+  "python2.7-serpent-1.12": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-set
-    "BSD-2-Clause": "http://spdx.org/licenses/BSD-2-Clause"
+  "python2.7-setuptools-19.4": {
+    "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0",
+    "Zope Public License 2.0": "http://spdx.org/licenses/ZPL-2.0"
   },
-  "python2.7-setuptools-
-    "PSF": null,
-    "ZPL": null
+  "python2.7-setuptools-scm-1.11.0": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-simplejson-3.7.2": {
-    "
+    "Academic Free License": "http://spdx.org/licenses/AFL-2.1",
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
   "python2.7-six-1.9.0": {
-    "MIT": "http://spdx.org/licenses/MIT"
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-subprocess32-3.2.6": {
-    "Python-2.0": "http://spdx.org/licenses/Python-2.0"
+  "python2.7-translationstring-1.3": {
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
   },
-  "python2.7-
-    "
+  "python2.7-urllib3-1.16": {
+    "MIT License": "http://spdx.org/licenses/MIT"
   },
-  "python2.7-
-    "APSL-2.0": "http://spdx.org/licenses/APSL-2.0"
+  "python2.7-venusian-1.0": {
+    "Repoze License": "http://www.repoze.org/LICENSE.txt"
   },
   "python2.7-waitress-0.8.9": {
-    "Z
+    "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1"
   },
   "python2.7-zope.cachedescriptors-4.0.0": {
-    "Z
+    "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1"
+  },
+  "python2.7-zope.deprecation-4.1.2": {
+    "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1"
+  },
+  "python2.7-zope.interface-4.1.3": {
+    "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1"
   }
 }
\ No newline at end of file
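The new-side entries above all follow one shape: `"<package>"` maps to an object of `"<full license name>": "<SPDX URL or null>"` pairs. A small sketch that parses a sample in that shape (the three entries are copied from the diff above) and groups packages by license name — the helper is illustrative, not part of RhodeCode:

```python
import json

# Sample in the new licenses.json shape, taken from the diff above.
sample = json.loads('''
{
  "python2.7-six-1.9.0": {"MIT License": "http://spdx.org/licenses/MIT"},
  "python2.7-pycrypto-2.6.1": {"Public Domain": null},
  "python2.7-waitress-0.8.9": {"Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1"}
}
''')

def packages_by_license(licenses):
    """Invert the package -> {license: url} mapping to license -> [packages]."""
    grouped = {}
    for package, license_map in licenses.items():
        for license_name in license_map:
            grouped.setdefault(license_name, []).append(package)
    return grouped

print(packages_by_license(sample)['MIT License'])  # -> ['python2.7-six-1.9.0']
```

A `null` URL (as for pycrypto's Public Domain entry) still yields a valid license key, so consumers should not assume every license name carries an SPDX link.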
@@ -1,316 +1,387 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 |
|
2 | |||
3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
4 | # |
|
4 | # | |
5 | # This program is free software: you can redistribute it and/or modify |
|
5 | # This program is free software: you can redistribute it and/or modify | |
6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
7 | # (only), as published by the Free Software Foundation. |
|
7 | # (only), as published by the Free Software Foundation. | |
8 | # |
|
8 | # | |
9 | # This program is distributed in the hope that it will be useful, |
|
9 | # This program is distributed in the hope that it will be useful, | |
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
12 | # GNU General Public License for more details. |
|
12 | # GNU General Public License for more details. | |
13 | # |
|
13 | # | |
14 | # You should have received a copy of the GNU Affero General Public License |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
16 | # |
|
16 | # | |
17 | # This program is dual-licensed. If you wish to learn more about the |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
20 |
|
20 | |||
21 | """ |
|
21 | """ | |
22 | Pylons middleware initialization |
|
22 | Pylons middleware initialization | |
23 | """ |
|
23 | """ | |
24 | import logging |
|
24 | import logging | |
25 |
|
25 | |||
26 | from paste.registry import RegistryManager |
|
26 | from paste.registry import RegistryManager | |
27 | from paste.gzipper import make_gzip_middleware |
|
27 | from paste.gzipper import make_gzip_middleware | |
28 | from pylons.middleware import ErrorHandler, StatusCodeRedirect |
|
|||
29 | from pylons.wsgiapp import PylonsApp |
|
28 | from pylons.wsgiapp import PylonsApp | |
30 | from pyramid.authorization import ACLAuthorizationPolicy |
|
29 | from pyramid.authorization import ACLAuthorizationPolicy | |
31 | from pyramid.config import Configurator |
|
30 | from pyramid.config import Configurator | |
32 | from pyramid.static import static_view |
|
31 | from pyramid.static import static_view | |
33 | from pyramid.settings import asbool, aslist |
|
32 | from pyramid.settings import asbool, aslist | |
34 | from pyramid.wsgi import wsgiapp |
|
33 | from pyramid.wsgi import wsgiapp | |
|
34 | from pyramid.httpexceptions import HTTPError, HTTPInternalServerError | |||
|
35 | import pyramid.httpexceptions as httpexceptions | |||
|
36 | from pyramid.renderers import render_to_response, render | |||
35 | from routes.middleware import RoutesMiddleware |
|
37 | from routes.middleware import RoutesMiddleware | |
36 | import routes.util |
|
38 | import routes.util | |
37 |
|
39 | |||
38 | import rhodecode |
|
40 | import rhodecode | |
39 | from rhodecode.config import patches |
|
41 | from rhodecode.config import patches | |
40 | from rhodecode.config.environment import ( |
|
42 | from rhodecode.config.environment import ( | |
41 | load_environment, load_pyramid_environment) |
|
43 | load_environment, load_pyramid_environment) | |
42 | from rhodecode.lib.middleware import csrf |
|
44 | from rhodecode.lib.middleware import csrf | |
43 | from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled |
|
45 | from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled | |
44 | from rhodecode.lib.middleware.disable_vcs import DisableVCSPagesWrapper |
|
46 | from rhodecode.lib.middleware.disable_vcs import DisableVCSPagesWrapper | |
45 | from rhodecode.lib.middleware.https_fixup import HttpsFixup |
|
47 | from rhodecode.lib.middleware.https_fixup import HttpsFixup | |
46 | from rhodecode.lib.middleware.vcs import VCSMiddleware |
|
48 | from rhodecode.lib.middleware.vcs import VCSMiddleware | |
47 | from rhodecode.lib.plugins.utils import register_rhodecode_plugin |
|
49 | from rhodecode.lib.plugins.utils import register_rhodecode_plugin | |
48 |
|
50 | |||
49 |
|
51 | |||
50 | log = logging.getLogger(__name__) |
|
52 | log = logging.getLogger(__name__) | |
51 |
|
53 | |||
52 |
|
54 | |||
53 | def make_app(global_conf, full_stack=True, static_files=True, **app_conf): |
|
55 | def make_app(global_conf, full_stack=True, static_files=True, **app_conf): | |
54 | """Create a Pylons WSGI application and return it |
|
56 | """Create a Pylons WSGI application and return it | |
55 |
|
57 | |||
56 | ``global_conf`` |
|
58 | ``global_conf`` | |
57 | The inherited configuration for this application. Normally from |
|
59 | The inherited configuration for this application. Normally from | |
58 | the [DEFAULT] section of the Paste ini file. |
|
60 | the [DEFAULT] section of the Paste ini file. | |
59 |
|
61 | |||
60 | ``full_stack`` |
|
62 | ``full_stack`` | |
61 | Whether or not this application provides a full WSGI stack (by |
|
63 | Whether or not this application provides a full WSGI stack (by | |
62 | default, meaning it handles its own exceptions and errors). |
|
64 | default, meaning it handles its own exceptions and errors). | |
63 | Disable full_stack when this application is "managed" by |
|
65 | Disable full_stack when this application is "managed" by | |
64 | another WSGI middleware. |
|
66 | another WSGI middleware. | |
65 |
|
67 | |||
66 | ``app_conf`` |
|
68 | ``app_conf`` | |
67 | The application's local configuration. Normally specified in |
|
69 | The application's local configuration. Normally specified in | |
68 | the [app:<name>] section of the Paste ini file (where <name> |
|
70 | the [app:<name>] section of the Paste ini file (where <name> | |
69 | defaults to main). |
|
71 | defaults to main). | |
70 |
|
72 | |||
71 | """ |
|
73 | """ | |
72 | # Apply compatibility patches |
|
74 | # Apply compatibility patches | |
73 | patches.kombu_1_5_1_python_2_7_11() |
|
75 | patches.kombu_1_5_1_python_2_7_11() | |
74 | patches.inspect_getargspec() |
|
76 | patches.inspect_getargspec() | |
75 |
|
77 | |||
76 | # Configure the Pylons environment |
|
78 | # Configure the Pylons environment | |
77 | config = load_environment(global_conf, app_conf) |
|
79 | config = load_environment(global_conf, app_conf) | |
78 |
|
80 | |||
79 | # The Pylons WSGI app |
|
81 | # The Pylons WSGI app | |
80 | app = PylonsApp(config=config) |
|
82 | app = PylonsApp(config=config) | |
81 | if rhodecode.is_test: |
|
83 | if rhodecode.is_test: | |
82 | app = csrf.CSRFDetector(app) |
|
84 | app = csrf.CSRFDetector(app) | |
83 |
|
85 | |||
84 | expected_origin = config.get('expected_origin') |
|
86 | expected_origin = config.get('expected_origin') | |
85 | if expected_origin: |
|
87 | if expected_origin: | |
86 | # The API can be accessed from other Origins. |
|
88 | # The API can be accessed from other Origins. | |
87 | app = csrf.OriginChecker(app, expected_origin, |
|
89 | app = csrf.OriginChecker(app, expected_origin, | |
88 | skip_urls=[routes.util.url_for('api')]) |
|
90 | skip_urls=[routes.util.url_for('api')]) | |
89 |
|
91 | |||
90 | # Add RoutesMiddleware. Currently we have two instances in the stack. This |
|
|||
91 | # is the lower one to make the StatusCodeRedirect middleware happy. |
|
|||
92 | # TODO: johbo: This is not optimal, search for a better solution. |
|
|||
93 | app = RoutesMiddleware(app, config['routes.map']) |
|
|||
    # CUSTOM MIDDLEWARE HERE (filtered by error handling middlewares)
    if asbool(config['pdebug']):
        from rhodecode.lib.profiler import ProfilingMiddleware
        app = ProfilingMiddleware(app)

    # Protect from VCS Server error related pages when server is not available
    vcs_server_enabled = asbool(config.get('vcs.server.enable', 'true'))
    if not vcs_server_enabled:
        app = DisableVCSPagesWrapper(app)

    if asbool(full_stack):

        # Appenlight monitoring and error handler
        app, appenlight_client = wrap_in_appenlight_if_enabled(app, config)

        # Handle Python exceptions
        app = ErrorHandler(app, global_conf, **config['pylons.errorware'])

        # we want our low level middleware to get to the request ASAP. We don't
        # need any pylons stack middleware in them
        app = VCSMiddleware(app, config, appenlight_client)

        # Display error documents for 401, 403, 404 status codes (and
        # 500 when debug is disabled)
        if asbool(config['debug']):
            app = StatusCodeRedirect(app)
        else:
            app = StatusCodeRedirect(app, [400, 401, 403, 404, 500])

    # Establish the Registry for this application
    app = RegistryManager(app)

    app.config = config

    return app

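The `make_app` tail above wraps the application in one middleware after another, so the order of the assignments determines the order in which a request travels through the stack. A minimal sketch of that property, using plain callables instead of real WSGI apps (all names here are illustrative, not RhodeCode code):

```python
# Each wrapper is applied around the previous app, so the middleware
# applied *last* is the *first* to see an incoming request.
def base_app(environ):
    return ['base']

def make_wrapper(tag):
    def wrap(app):
        def wrapped(environ):
            # record this layer, then delegate inward
            return [tag] + app(environ)
        return wrapped
    return wrap

app = base_app
for tag in ('error_handler', 'vcs', 'registry'):
    app = make_wrapper(tag)(app)

# app({}) -> ['registry', 'vcs', 'error_handler', 'base']
```

This is why the error-handling middlewares sit outside the custom ones: they must see the response on its way out.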
def make_pyramid_app(global_config, **settings):
    """
    Constructs the WSGI application based on Pyramid and wraps the Pylons based
    application.

    Specials:

    * We migrate from Pylons to Pyramid. While doing this, we keep both
      frameworks functional. This involves moving some WSGI middlewares around
      and providing access to some data internals, so that the old code is
      still functional.

    * The application can also be integrated like a plugin via the call to
      `includeme`. This is accompanied with the other utility functions which
      are called. Changing this should be done with great care to not break
      cases when these fragments are assembled from another place.

    """
    # The edition string should be available in pylons too, so we add it here
    # before copying the settings.
    settings.setdefault('rhodecode.edition', 'Community Edition')

    # As long as our Pylons application does expect "unprepared" settings, make
    # sure that we keep an unmodified copy. This avoids unintentional change of
    # behavior in the old application.
    settings_pylons = settings.copy()

    sanitize_settings_and_apply_defaults(settings)
    config = Configurator(settings=settings)
    add_pylons_compat_data(config.registry, global_config, settings_pylons)

    load_pyramid_environment(global_config, settings)

    includeme(config)
    includeme_last(config)
    pyramid_app = config.make_wsgi_app()
    pyramid_app = wrap_app_in_wsgi_middlewares(pyramid_app, config)
    return pyramid_app


def add_pylons_compat_data(registry, global_config, settings):
    """
    Attach data to the registry to support the Pylons integration.
    """
    registry._pylons_compat_global_config = global_config
    registry._pylons_compat_settings = settings


def webob_to_pyramid_http_response(webob_response):
    ResponseClass = httpexceptions.status_map[webob_response.status_int]
    pyramid_response = ResponseClass(webob_response.status)
    pyramid_response.status = webob_response.status
    pyramid_response.headers.update(webob_response.headers)
    if pyramid_response.headers['content-type'] == 'text/html':
        pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8'
    return pyramid_response


def error_handler(exception, request):
    # TODO: dan: replace the old pylons error controller with this
    from rhodecode.model.settings import SettingsModel
    from rhodecode.lib.utils2 import AttributeDict

    try:
        rc_config = SettingsModel().get_all_settings()
    except Exception:
        log.exception('failed to fetch settings')
        rc_config = {}

    base_response = HTTPInternalServerError()
    # prefer original exception for the response since it may have headers set
    if isinstance(exception, HTTPError):
        base_response = exception

    c = AttributeDict()
    c.error_message = base_response.status
    c.error_explanation = base_response.explanation or str(base_response)
    c.visual = AttributeDict()

    c.visual.rhodecode_support_url = (
        request.registry.settings.get('rhodecode_support_url') or
        request.route_url('rhodecode_support')
    )
    c.redirect_time = 0
    c.rhodecode_name = rc_config.get('rhodecode_title', '')
    if not c.rhodecode_name:
        c.rhodecode_name = 'Rhodecode'

    response = render_to_response(
        '/errors/error_document.html', {'c': c}, request=request,
        response=base_response)

    return response

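The one non-obvious step in `webob_to_pyramid_http_response` above is the charset fix-up: a bare `text/html` content type gets an explicit UTF-8 charset. A minimal standalone sketch of just that normalization (hypothetical helper name, plain dict instead of a real headers object):

```python
def normalize_content_type(headers):
    # a bare text/html content type gets an explicit charset appended;
    # any other content type passes through untouched
    if headers.get('content-type') == 'text/html':
        headers['content-type'] = 'text/html; charset=UTF-8'
    return headers
```

Without this, HTML pages converted from WebOb responses could be served without a declared encoding.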
def includeme(config):
    settings = config.registry.settings

    if asbool(settings.get('appenlight', 'false')):
        config.include('appenlight_client.ext.pyramid_tween')

    # Includes which are required. The application would fail without them.
    config.include('pyramid_mako')
    config.include('pyramid_beaker')
    config.include('rhodecode.admin')
    config.include('rhodecode.authentication')
    config.include('rhodecode.login')
    config.include('rhodecode.tweens')
    config.include('rhodecode.api')
    config.add_route(
        'rhodecode_support', 'https://rhodecode.com/help/', static=True)

    # Set the authorization policy.
    authz_policy = ACLAuthorizationPolicy()
    config.set_authorization_policy(authz_policy)

    # Set the default renderer for HTML templates to mako.
    config.add_mako_renderer('.html')

    # plugin information
    config.registry.rhodecode_plugins = {}

    config.add_directive(
        'register_rhodecode_plugin', register_rhodecode_plugin)
    # include RhodeCode plugins
    includes = aslist(settings.get('rhodecode.includes', []))
    for inc in includes:
        config.include(inc)

    pylons_app = make_app(
        config.registry._pylons_compat_global_config,
        **config.registry._pylons_compat_settings)
    config.registry._pylons_compat_config = pylons_app.config

    pylons_app_as_view = wsgiapp(pylons_app)

    # Protect from VCS Server error related pages when server is not available
    vcs_server_enabled = asbool(settings.get('vcs.server.enable', 'true'))
    if not vcs_server_enabled:
        pylons_app_as_view = DisableVCSPagesWrapper(pylons_app_as_view)

    def pylons_app_with_error_handler(context, request):
        """
        Handle exceptions from rc pylons app:

        - old webob type exceptions get converted to pyramid exceptions
        - pyramid exceptions are passed to the error handler view
        """
        try:
            response = pylons_app_as_view(context, request)
            if 400 <= response.status_int <= 599:  # webob type error responses
                return error_handler(
                    webob_to_pyramid_http_response(response), request)
        except HTTPError as e:  # pyramid type exceptions
            return error_handler(e, request)
        except Exception:
            if settings.get('debugtoolbar.enabled', False):
                raise
            return error_handler(HTTPInternalServerError(), request)
        return response

    # This is the glue which allows us to migrate in chunks. By registering the
    # pylons based application as the "Not Found" view in Pyramid, we will
    # fallback to the old application each time the new one does not yet know
    # how to handle a request.
    config.add_notfound_view(pylons_app_with_error_handler)

    if settings.get('debugtoolbar.enabled', False):
        # if toolbar, then only http type exceptions get caught and rendered
        ExcClass = HTTPError
    else:
        # if no toolbar, then any exception gets caught and rendered
        ExcClass = Exception
    config.add_view(error_handler, context=ExcClass)

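The migration glue in `includeme` above registers the whole Pylons application as Pyramid's "Not Found" view, so any route the new stack cannot handle falls through to the old one. A minimal sketch of that dispatch pattern in plain Python, with no Pyramid dependency (all names here are hypothetical stand-ins):

```python
def old_pylons_app(path):
    # stands in for the wrapped legacy WSGI application
    return 'pylons:%s' % path

# routes already ported to the new stack (illustrative)
ported_views = {
    '/login': lambda path: 'pyramid:%s' % path,
}

def dispatch(path):
    view = ported_views.get(path)
    if view is not None:
        return view(path)
    # "not found" in the new stack falls back to the old application,
    # which is what add_notfound_view achieves in the real code
    return old_pylons_app(path)
```

As views are ported, entries move into the new stack one by one while everything else keeps working, which is exactly the "migrate in chunks" strategy the comment describes.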
def includeme_last(config):
    """
    The static file catchall needs to be last in the view configuration.
    """
    settings = config.registry.settings

    # Note: johbo: I would prefer to register a prefix for static files at some
    # point, e.g. move them under '_static/'. This would fully avoid that we
    # can have name clashes with a repository name. Imaging someone calling his
    # repo "css" ;-) Also having an external web server to serve out the static
    # files seems to be easier to set up if they have a common prefix.
    #
    # Example: config.add_static_view('_static', path='rhodecode:public')
    #
    # It might be an option to register both paths for a while and then migrate
    # over to the new location.

    # Serving static files with a catchall.
    if settings['static_files']:
        config.add_route('catchall_static', '/*subpath')
        config.add_view(
            static_view('rhodecode:public'), route_name='catchall_static')

def wrap_app_in_wsgi_middlewares(pyramid_app, config):
    """
    Apply outer WSGI middlewares around the application.

    Part of this has been moved up from the Pylons layer, so that the
    data is also available if old Pylons code is hit through an already ported
    view.
    """
    settings = config.registry.settings

    # enable https redirects based on HTTP_X_URL_SCHEME set by proxy
    pyramid_app = HttpsFixup(pyramid_app, settings)

    # Add RoutesMiddleware to support the pylons compatibility tween during
    # migration to pyramid.
    pyramid_app = RoutesMiddleware(
        pyramid_app, config.registry._pylons_compat_config['routes.map'])

    if asbool(settings.get('appenlight', 'false')):
        pyramid_app, _ = wrap_in_appenlight_if_enabled(
            pyramid_app, config.registry._pylons_compat_config)

    # TODO: johbo: Don't really see why we enable the gzip middleware when
    # serving static files, might be something that should have its own setting
    # as well?
    if settings['static_files']:
        pyramid_app = make_gzip_middleware(
            pyramid_app, settings, compress_level=1)

    return pyramid_app

def sanitize_settings_and_apply_defaults(settings):
    """
    Applies settings defaults and does all type conversion.

    We would move all settings parsing and preparation into this place, so that
    we have only one place left which deals with this part. The remaining parts
    of the application would start to rely fully on well prepared settings.

    This piece would later be split up per topic to avoid a big fat monster
    function.
    """

    # Pyramid's mako renderer has to search in the templates folder so that the
    # old templates still work. Ported and new templates are expected to use
    # real asset specifications for the includes.
    mako_directories = settings.setdefault('mako.directories', [
        # Base templates of the original Pylons application
        'rhodecode:templates',
    ])
    log.debug(
        "Using the following Mako template directories: %s",
        mako_directories)

    # Default includes, possible to change as a user
    pyramid_includes = settings.setdefault('pyramid.includes', [
        'rhodecode.lib.middleware.request_wrapper',
    ])
    log.debug(
        "Using the following pyramid.includes: %s",
        pyramid_includes)

    # TODO: johbo: Re-think this, usually the call to config.include
    # should allow to pass in a prefix.
    settings.setdefault('rhodecode.api.url', '/_admin/api')

    _bool_setting(settings, 'vcs.server.enable', 'true')
    _bool_setting(settings, 'static_files', 'true')
    _bool_setting(settings, 'is_test', 'false')

    return settings


def _bool_setting(settings, name, default):
    settings[name] = asbool(settings.get(name, default))
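The `_bool_setting` helper above coerces string settings such as `'true'` into real booleans in place, using the imported `asbool` (from PasteDeploy in the original code). A self-contained sketch with a simplified, hypothetical `asbool` stand-in:

```python
# values treated as true by this simplified coercion (illustrative only;
# the real asbool also rejects unrecognized values with an error)
TRUTHY = frozenset(('true', 'yes', 'on', 'y', 't', '1'))

def simple_asbool(value):
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in TRUTHY

def bool_setting(settings, name, default):
    # replace the raw string with its boolean value, applying the default
    # when the key is missing
    settings[name] = simple_asbool(settings.get(name, default))
```

After sanitizing, the rest of the application can rely on `settings['vcs.server.enable']` being a `bool` rather than re-parsing strings at every use site.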
@@ -1,1149 +1,1141 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

"""
Routes configuration

The more specific and detailed routes should be defined first so they
may take precedent over the more generic routes. For more information
refer to the routes manual at http://routes.groovie.org/docs/

IMPORTANT: if you change any routing here, make sure to take a look at lib/base.py
and _route_name variable which uses some of stored naming here to do redirects.
"""
import os
import re
from routes import Mapper

from rhodecode.config import routing_links

# prefix for non repository related links needs to be prefixed with `/`
ADMIN_PREFIX = '/_admin'

# Default requirements for URL parts
URL_NAME_REQUIREMENTS = {
    # group name can have a slash in them, but they must not end with a slash
    'group_name': r'.*?[^/]',
    # repo names can have a slash in them, but they must not end with a slash
    'repo_name': r'.*?[^/]',
    # file path eats up everything at the end
    'f_path': r'.*',
    # reference types
    'source_ref_type': '(branch|book|tag|rev|\%\(source_ref_type\)s)',
    'target_ref_type': '(branch|book|tag|rev|\%\(target_ref_type\)s)',
}


class JSRoutesMapper(Mapper):
    """
    Wrapper for routes.Mapper to make pyroutes compatible url definitions
    """
    _named_route_regex = re.compile(r'^[a-z-_0-9A-Z]+$')
    _argument_prog = re.compile('\{(.*?)\}|:\((.*)\)')

    def __init__(self, *args, **kw):
        super(JSRoutesMapper, self).__init__(*args, **kw)
        self._jsroutes = []

    def connect(self, *args, **kw):
        """
        Wrapper for connect to take an extra argument jsroute=True

        :param jsroute: boolean, if True will add the route to the pyroutes list
        """
        if kw.pop('jsroute', False):
            if not self._named_route_regex.match(args[0]):
                raise Exception('only named routes can be added to pyroutes')
            self._jsroutes.append(args[0])

        super(JSRoutesMapper, self).connect(*args, **kw)

    def _extract_route_information(self, route):
        """
        Convert a route into tuple(name, path, args), eg:
        ('user_profile', '/profile/%(username)s', ['username'])
        """
        routepath = route.routepath

        def replace(matchobj):
            if matchobj.group(1):
                return "%%(%s)s" % matchobj.group(1).split(':')[0]
            else:
                return "%%(%s)s" % matchobj.group(2)

        routepath = self._argument_prog.sub(replace, routepath)
        return (
            route.name,
            routepath,
            [(arg[0].split(':')[0] if arg[0] != '' else arg[1])
             for arg in self._argument_prog.findall(route.routepath)]
        )

    def jsroutes(self):
        """
        Return a list of pyroutes.js compatible routes
        """
        for route_name in self._jsroutes:
            yield self._extract_route_information(self._routenames[route_name])

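The `_argument_prog` rewrite inside `_extract_route_information` above is the core of the pyroutes conversion: Routes-style `{name}` placeholders become `%(name)s` templates that pyroutes.js can fill in client-side. A standalone sketch of that substitution (hypothetical function name, same regex as the class attribute):

```python
import re

# same pattern as JSRoutesMapper._argument_prog: matches {name} and :(name)
_argument_prog = re.compile(r'\{(.*?)\}|:\((.*)\)')

def to_js_template(routepath):
    def replace(matchobj):
        if matchobj.group(1):
            # strip an optional ':requirement' suffix, e.g. {id:\d+} -> id
            return "%%(%s)s" % matchobj.group(1).split(':')[0]
        return "%%(%s)s" % matchobj.group(2)
    return _argument_prog.sub(replace, routepath)

# to_js_template('/profile/{username}') -> '/profile/%(username)s'
```

The non-greedy `(.*?)` is what keeps multiple placeholders in one path from being swallowed into a single match.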

def make_map(config):
    """Create, configure and return the routes Mapper"""
    rmap = JSRoutesMapper(directory=config['pylons.paths']['controllers'],
                          always_scan=config['debug'])
    rmap.minimization = False
    rmap.explicit = False

    from rhodecode.lib.utils2 import str2bool
    from rhodecode.model import repo, repo_group

    def check_repo(environ, match_dict):
        """
        check for valid repository for proper 404 handling

        :param environ:
        :param match_dict:
        """
        repo_name = match_dict.get('repo_name')

        if match_dict.get('f_path'):
            # fix for multiple initial slashes that causes errors
            match_dict['f_path'] = match_dict['f_path'].lstrip('/')
        repo_model = repo.RepoModel()
        by_name_match = repo_model.get_by_repo_name(repo_name)
        # if we match quickly from database, short circuit the operation,
        # and validate repo based on the type.
        if by_name_match:
            return True

        by_id_match = repo_model.get_repo_by_id(repo_name)
        if by_id_match:
            repo_name = by_id_match.repo_name
            match_dict['repo_name'] = repo_name
            return True

        return False

    def check_group(environ, match_dict):
        """
        check for valid repository group path for proper 404 handling

        :param environ:
        :param match_dict:
        """
        repo_group_name = match_dict.get('group_name')
        repo_group_model = repo_group.RepoGroupModel()
        by_name_match = repo_group_model.get_by_group_name(repo_group_name)
        if by_name_match:
            return True

        return False

    def check_user_group(environ, match_dict):
        """
        check for valid user group for proper 404 handling

        :param environ:
        :param match_dict:
        """
        return True

    def check_int(environ, match_dict):
        return match_dict.get('id').isdigit()

    #==========================================================================
    # CUSTOM ROUTES HERE
176 | #========================================================================== |
|
172 | #========================================================================== | |
177 |
|
173 | |||
178 | # MAIN PAGE |
|
174 | # MAIN PAGE | |
179 | rmap.connect('home', '/', controller='home', action='index', jsroute=True) |
|
175 | rmap.connect('home', '/', controller='home', action='index', jsroute=True) | |
180 | rmap.connect('goto_switcher_data', '/_goto_data', controller='home', |
|
176 | rmap.connect('goto_switcher_data', '/_goto_data', controller='home', | |
181 | action='goto_switcher_data') |
|
177 | action='goto_switcher_data') | |
182 | rmap.connect('repo_list_data', '/_repos', controller='home', |
|
178 | rmap.connect('repo_list_data', '/_repos', controller='home', | |
183 | action='repo_list_data') |
|
179 | action='repo_list_data') | |
184 |
|
180 | |||
185 | rmap.connect('user_autocomplete_data', '/_users', controller='home', |
|
181 | rmap.connect('user_autocomplete_data', '/_users', controller='home', | |
186 | action='user_autocomplete_data', jsroute=True) |
|
182 | action='user_autocomplete_data', jsroute=True) | |
187 | rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home', |
|
183 | rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home', | |
188 | action='user_group_autocomplete_data') |
|
184 | action='user_group_autocomplete_data') | |
189 |
|
185 | |||
190 | rmap.connect( |
|
186 | rmap.connect( | |
191 | 'user_profile', '/_profiles/{username}', controller='users', |
|
187 | 'user_profile', '/_profiles/{username}', controller='users', | |
192 | action='user_profile') |
|
188 | action='user_profile') | |
193 |
|
189 | |||
194 | # TODO: johbo: Static links, to be replaced by our redirection mechanism |
|
190 | # TODO: johbo: Static links, to be replaced by our redirection mechanism | |
195 | rmap.connect('rst_help', |
|
191 | rmap.connect('rst_help', | |
196 | 'http://docutils.sourceforge.net/docs/user/rst/quickref.html', |
|
192 | 'http://docutils.sourceforge.net/docs/user/rst/quickref.html', | |
197 | _static=True) |
|
193 | _static=True) | |
198 | rmap.connect('markdown_help', |
|
194 | rmap.connect('markdown_help', | |
199 | 'http://daringfireball.net/projects/markdown/syntax', |
|
195 | 'http://daringfireball.net/projects/markdown/syntax', | |
200 | _static=True) |
|
196 | _static=True) | |
201 | rmap.connect('rhodecode_official', 'https://rhodecode.com', _static=True) |
|
197 | rmap.connect('rhodecode_official', 'https://rhodecode.com', _static=True) | |
202 | rmap.connect('rhodecode_support', 'https://rhodecode.com/help/', _static=True) |
|
198 | rmap.connect('rhodecode_support', 'https://rhodecode.com/help/', _static=True) | |
203 | rmap.connect('rhodecode_translations', 'https://rhodecode.com/translate/enterprise', _static=True) |
|
199 | rmap.connect('rhodecode_translations', 'https://rhodecode.com/translate/enterprise', _static=True) | |
204 | # TODO: anderson - making this a static link since redirect won't play |
|
200 | # TODO: anderson - making this a static link since redirect won't play | |
205 | # nice with POST requests |
|
201 | # nice with POST requests | |
206 | rmap.connect('enterprise_license_convert_from_old', |
|
202 | rmap.connect('enterprise_license_convert_from_old', | |
207 | 'https://rhodecode.com/u/license-upgrade', |
|
203 | 'https://rhodecode.com/u/license-upgrade', | |
208 | _static=True) |
|
204 | _static=True) | |
209 |
|
205 | |||
210 | routing_links.connect_redirection_links(rmap) |
|
206 | routing_links.connect_redirection_links(rmap) | |
211 |
|
207 | |||
212 | rmap.connect('ping', '%s/ping' % (ADMIN_PREFIX,), controller='home', action='ping') |
|
208 | rmap.connect('ping', '%s/ping' % (ADMIN_PREFIX,), controller='home', action='ping') | |
213 | rmap.connect('error_test', '%s/error_test' % (ADMIN_PREFIX,), controller='home', action='error_test') |
|
209 | rmap.connect('error_test', '%s/error_test' % (ADMIN_PREFIX,), controller='home', action='error_test') | |
214 |
|
210 | |||
215 | # ADMIN REPOSITORY ROUTES |
|
211 | # ADMIN REPOSITORY ROUTES | |
216 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
212 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
217 | controller='admin/repos') as m: |
|
213 | controller='admin/repos') as m: | |
218 | m.connect('repos', '/repos', |
|
214 | m.connect('repos', '/repos', | |
219 | action='create', conditions={'method': ['POST']}) |
|
215 | action='create', conditions={'method': ['POST']}) | |
220 | m.connect('repos', '/repos', |
|
216 | m.connect('repos', '/repos', | |
221 | action='index', conditions={'method': ['GET']}) |
|
217 | action='index', conditions={'method': ['GET']}) | |
222 | m.connect('new_repo', '/create_repository', jsroute=True, |
|
218 | m.connect('new_repo', '/create_repository', jsroute=True, | |
223 | action='create_repository', conditions={'method': ['GET']}) |
|
219 | action='create_repository', conditions={'method': ['GET']}) | |
224 | m.connect('/repos/{repo_name}', |
|
220 | m.connect('/repos/{repo_name}', | |
225 | action='update', conditions={'method': ['PUT'], |
|
221 | action='update', conditions={'method': ['PUT'], | |
226 | 'function': check_repo}, |
|
222 | 'function': check_repo}, | |
227 | requirements=URL_NAME_REQUIREMENTS) |
|
223 | requirements=URL_NAME_REQUIREMENTS) | |
228 | m.connect('delete_repo', '/repos/{repo_name}', |
|
224 | m.connect('delete_repo', '/repos/{repo_name}', | |
229 | action='delete', conditions={'method': ['DELETE']}, |
|
225 | action='delete', conditions={'method': ['DELETE']}, | |
230 | requirements=URL_NAME_REQUIREMENTS) |
|
226 | requirements=URL_NAME_REQUIREMENTS) | |
231 | m.connect('repo', '/repos/{repo_name}', |
|
227 | m.connect('repo', '/repos/{repo_name}', | |
232 | action='show', conditions={'method': ['GET'], |
|
228 | action='show', conditions={'method': ['GET'], | |
233 | 'function': check_repo}, |
|
229 | 'function': check_repo}, | |
234 | requirements=URL_NAME_REQUIREMENTS) |
|
230 | requirements=URL_NAME_REQUIREMENTS) | |
235 |
|
231 | |||
236 | # ADMIN REPOSITORY GROUPS ROUTES |
|
232 | # ADMIN REPOSITORY GROUPS ROUTES | |
237 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
233 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
238 | controller='admin/repo_groups') as m: |
|
234 | controller='admin/repo_groups') as m: | |
239 | m.connect('repo_groups', '/repo_groups', |
|
235 | m.connect('repo_groups', '/repo_groups', | |
240 | action='create', conditions={'method': ['POST']}) |
|
236 | action='create', conditions={'method': ['POST']}) | |
241 | m.connect('repo_groups', '/repo_groups', |
|
237 | m.connect('repo_groups', '/repo_groups', | |
242 | action='index', conditions={'method': ['GET']}) |
|
238 | action='index', conditions={'method': ['GET']}) | |
243 | m.connect('new_repo_group', '/repo_groups/new', |
|
239 | m.connect('new_repo_group', '/repo_groups/new', | |
244 | action='new', conditions={'method': ['GET']}) |
|
240 | action='new', conditions={'method': ['GET']}) | |
245 | m.connect('update_repo_group', '/repo_groups/{group_name}', |
|
241 | m.connect('update_repo_group', '/repo_groups/{group_name}', | |
246 | action='update', conditions={'method': ['PUT'], |
|
242 | action='update', conditions={'method': ['PUT'], | |
247 | 'function': check_group}, |
|
243 | 'function': check_group}, | |
248 | requirements=URL_NAME_REQUIREMENTS) |
|
244 | requirements=URL_NAME_REQUIREMENTS) | |
249 |
|
245 | |||
250 | # EXTRAS REPO GROUP ROUTES |
|
246 | # EXTRAS REPO GROUP ROUTES | |
251 | m.connect('edit_repo_group', '/repo_groups/{group_name}/edit', |
|
247 | m.connect('edit_repo_group', '/repo_groups/{group_name}/edit', | |
252 | action='edit', |
|
248 | action='edit', | |
253 | conditions={'method': ['GET'], 'function': check_group}, |
|
249 | conditions={'method': ['GET'], 'function': check_group}, | |
254 | requirements=URL_NAME_REQUIREMENTS) |
|
250 | requirements=URL_NAME_REQUIREMENTS) | |
255 | m.connect('edit_repo_group', '/repo_groups/{group_name}/edit', |
|
251 | m.connect('edit_repo_group', '/repo_groups/{group_name}/edit', | |
256 | action='edit', |
|
252 | action='edit', | |
257 | conditions={'method': ['PUT'], 'function': check_group}, |
|
253 | conditions={'method': ['PUT'], 'function': check_group}, | |
258 | requirements=URL_NAME_REQUIREMENTS) |
|
254 | requirements=URL_NAME_REQUIREMENTS) | |
259 |
|
255 | |||
260 | m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced', |
|
256 | m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced', | |
261 | action='edit_repo_group_advanced', |
|
257 | action='edit_repo_group_advanced', | |
262 | conditions={'method': ['GET'], 'function': check_group}, |
|
258 | conditions={'method': ['GET'], 'function': check_group}, | |
263 | requirements=URL_NAME_REQUIREMENTS) |
|
259 | requirements=URL_NAME_REQUIREMENTS) | |
264 | m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced', |
|
260 | m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced', | |
265 | action='edit_repo_group_advanced', |
|
261 | action='edit_repo_group_advanced', | |
266 | conditions={'method': ['PUT'], 'function': check_group}, |
|
262 | conditions={'method': ['PUT'], 'function': check_group}, | |
267 | requirements=URL_NAME_REQUIREMENTS) |
|
263 | requirements=URL_NAME_REQUIREMENTS) | |
268 |
|
264 | |||
269 | m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions', |
|
265 | m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions', | |
270 | action='edit_repo_group_perms', |
|
266 | action='edit_repo_group_perms', | |
271 | conditions={'method': ['GET'], 'function': check_group}, |
|
267 | conditions={'method': ['GET'], 'function': check_group}, | |
272 | requirements=URL_NAME_REQUIREMENTS) |
|
268 | requirements=URL_NAME_REQUIREMENTS) | |
273 | m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions', |
|
269 | m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions', | |
274 | action='update_perms', |
|
270 | action='update_perms', | |
275 | conditions={'method': ['PUT'], 'function': check_group}, |
|
271 | conditions={'method': ['PUT'], 'function': check_group}, | |
276 | requirements=URL_NAME_REQUIREMENTS) |
|
272 | requirements=URL_NAME_REQUIREMENTS) | |
277 |
|
273 | |||
278 | m.connect('delete_repo_group', '/repo_groups/{group_name}', |
|
274 | m.connect('delete_repo_group', '/repo_groups/{group_name}', | |
279 | action='delete', conditions={'method': ['DELETE'], |
|
275 | action='delete', conditions={'method': ['DELETE'], | |
280 | 'function': check_group}, |
|
276 | 'function': check_group}, | |
281 | requirements=URL_NAME_REQUIREMENTS) |
|
277 | requirements=URL_NAME_REQUIREMENTS) | |
282 |
|
278 | |||
283 | # ADMIN USER ROUTES |
|
279 | # ADMIN USER ROUTES | |
284 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
280 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
285 | controller='admin/users') as m: |
|
281 | controller='admin/users') as m: | |
286 | m.connect('users', '/users', |
|
282 | m.connect('users', '/users', | |
287 | action='create', conditions={'method': ['POST']}) |
|
283 | action='create', conditions={'method': ['POST']}) | |
288 | m.connect('users', '/users', |
|
284 | m.connect('users', '/users', | |
289 | action='index', conditions={'method': ['GET']}) |
|
285 | action='index', conditions={'method': ['GET']}) | |
290 | m.connect('new_user', '/users/new', |
|
286 | m.connect('new_user', '/users/new', | |
291 | action='new', conditions={'method': ['GET']}) |
|
287 | action='new', conditions={'method': ['GET']}) | |
292 | m.connect('update_user', '/users/{user_id}', |
|
288 | m.connect('update_user', '/users/{user_id}', | |
293 | action='update', conditions={'method': ['PUT']}) |
|
289 | action='update', conditions={'method': ['PUT']}) | |
294 | m.connect('delete_user', '/users/{user_id}', |
|
290 | m.connect('delete_user', '/users/{user_id}', | |
295 | action='delete', conditions={'method': ['DELETE']}) |
|
291 | action='delete', conditions={'method': ['DELETE']}) | |
296 | m.connect('edit_user', '/users/{user_id}/edit', |
|
292 | m.connect('edit_user', '/users/{user_id}/edit', | |
297 | action='edit', conditions={'method': ['GET']}) |
|
293 | action='edit', conditions={'method': ['GET']}) | |
298 | m.connect('user', '/users/{user_id}', |
|
294 | m.connect('user', '/users/{user_id}', | |
299 | action='show', conditions={'method': ['GET']}) |
|
295 | action='show', conditions={'method': ['GET']}) | |
300 | m.connect('force_password_reset_user', '/users/{user_id}/password_reset', |
|
296 | m.connect('force_password_reset_user', '/users/{user_id}/password_reset', | |
301 | action='reset_password', conditions={'method': ['POST']}) |
|
297 | action='reset_password', conditions={'method': ['POST']}) | |
302 | m.connect('create_personal_repo_group', '/users/{user_id}/create_repo_group', |
|
298 | m.connect('create_personal_repo_group', '/users/{user_id}/create_repo_group', | |
303 | action='create_personal_repo_group', conditions={'method': ['POST']}) |
|
299 | action='create_personal_repo_group', conditions={'method': ['POST']}) | |
304 |
|
300 | |||
305 | # EXTRAS USER ROUTES |
|
301 | # EXTRAS USER ROUTES | |
306 | m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced', |
|
302 | m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced', | |
307 | action='edit_advanced', conditions={'method': ['GET']}) |
|
303 | action='edit_advanced', conditions={'method': ['GET']}) | |
308 | m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced', |
|
304 | m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced', | |
309 | action='update_advanced', conditions={'method': ['PUT']}) |
|
305 | action='update_advanced', conditions={'method': ['PUT']}) | |
310 |
|
306 | |||
311 | m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens', |
|
307 | m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens', | |
312 | action='edit_auth_tokens', conditions={'method': ['GET']}) |
|
308 | action='edit_auth_tokens', conditions={'method': ['GET']}) | |
313 | m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens', |
|
309 | m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens', | |
314 | action='add_auth_token', conditions={'method': ['PUT']}) |
|
310 | action='add_auth_token', conditions={'method': ['PUT']}) | |
315 | m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens', |
|
311 | m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens', | |
316 | action='delete_auth_token', conditions={'method': ['DELETE']}) |
|
312 | action='delete_auth_token', conditions={'method': ['DELETE']}) | |
317 |
|
313 | |||
318 | m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions', |
|
314 | m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions', | |
319 | action='edit_global_perms', conditions={'method': ['GET']}) |
|
315 | action='edit_global_perms', conditions={'method': ['GET']}) | |
320 | m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions', |
|
316 | m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions', | |
321 | action='update_global_perms', conditions={'method': ['PUT']}) |
|
317 | action='update_global_perms', conditions={'method': ['PUT']}) | |
322 |
|
318 | |||
323 | m.connect('edit_user_perms_summary', '/users/{user_id}/edit/permissions_summary', |
|
319 | m.connect('edit_user_perms_summary', '/users/{user_id}/edit/permissions_summary', | |
324 | action='edit_perms_summary', conditions={'method': ['GET']}) |
|
320 | action='edit_perms_summary', conditions={'method': ['GET']}) | |
325 |
|
321 | |||
326 | m.connect('edit_user_emails', '/users/{user_id}/edit/emails', |
|
322 | m.connect('edit_user_emails', '/users/{user_id}/edit/emails', | |
327 | action='edit_emails', conditions={'method': ['GET']}) |
|
323 | action='edit_emails', conditions={'method': ['GET']}) | |
328 | m.connect('edit_user_emails', '/users/{user_id}/edit/emails', |
|
324 | m.connect('edit_user_emails', '/users/{user_id}/edit/emails', | |
329 | action='add_email', conditions={'method': ['PUT']}) |
|
325 | action='add_email', conditions={'method': ['PUT']}) | |
330 | m.connect('edit_user_emails', '/users/{user_id}/edit/emails', |
|
326 | m.connect('edit_user_emails', '/users/{user_id}/edit/emails', | |
331 | action='delete_email', conditions={'method': ['DELETE']}) |
|
327 | action='delete_email', conditions={'method': ['DELETE']}) | |
332 |
|
328 | |||
333 | m.connect('edit_user_ips', '/users/{user_id}/edit/ips', |
|
329 | m.connect('edit_user_ips', '/users/{user_id}/edit/ips', | |
334 | action='edit_ips', conditions={'method': ['GET']}) |
|
330 | action='edit_ips', conditions={'method': ['GET']}) | |
335 | m.connect('edit_user_ips', '/users/{user_id}/edit/ips', |
|
331 | m.connect('edit_user_ips', '/users/{user_id}/edit/ips', | |
336 | action='add_ip', conditions={'method': ['PUT']}) |
|
332 | action='add_ip', conditions={'method': ['PUT']}) | |
337 | m.connect('edit_user_ips', '/users/{user_id}/edit/ips', |
|
333 | m.connect('edit_user_ips', '/users/{user_id}/edit/ips', | |
338 | action='delete_ip', conditions={'method': ['DELETE']}) |
|
334 | action='delete_ip', conditions={'method': ['DELETE']}) | |
339 |
|
335 | |||
340 | # ADMIN USER GROUPS REST ROUTES |
|
336 | # ADMIN USER GROUPS REST ROUTES | |
341 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
337 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
342 | controller='admin/user_groups') as m: |
|
338 | controller='admin/user_groups') as m: | |
343 | m.connect('users_groups', '/user_groups', |
|
339 | m.connect('users_groups', '/user_groups', | |
344 | action='create', conditions={'method': ['POST']}) |
|
340 | action='create', conditions={'method': ['POST']}) | |
345 | m.connect('users_groups', '/user_groups', |
|
341 | m.connect('users_groups', '/user_groups', | |
346 | action='index', conditions={'method': ['GET']}) |
|
342 | action='index', conditions={'method': ['GET']}) | |
347 | m.connect('new_users_group', '/user_groups/new', |
|
343 | m.connect('new_users_group', '/user_groups/new', | |
348 | action='new', conditions={'method': ['GET']}) |
|
344 | action='new', conditions={'method': ['GET']}) | |
349 | m.connect('update_users_group', '/user_groups/{user_group_id}', |
|
345 | m.connect('update_users_group', '/user_groups/{user_group_id}', | |
350 | action='update', conditions={'method': ['PUT']}) |
|
346 | action='update', conditions={'method': ['PUT']}) | |
351 | m.connect('delete_users_group', '/user_groups/{user_group_id}', |
|
347 | m.connect('delete_users_group', '/user_groups/{user_group_id}', | |
352 | action='delete', conditions={'method': ['DELETE']}) |
|
348 | action='delete', conditions={'method': ['DELETE']}) | |
353 | m.connect('edit_users_group', '/user_groups/{user_group_id}/edit', |
|
349 | m.connect('edit_users_group', '/user_groups/{user_group_id}/edit', | |
354 | action='edit', conditions={'method': ['GET']}, |
|
350 | action='edit', conditions={'method': ['GET']}, | |
355 | function=check_user_group) |
|
351 | function=check_user_group) | |
356 |
|
352 | |||
357 | # EXTRAS USER GROUP ROUTES |
|
353 | # EXTRAS USER GROUP ROUTES | |
358 | m.connect('edit_user_group_global_perms', |
|
354 | m.connect('edit_user_group_global_perms', | |
359 | '/user_groups/{user_group_id}/edit/global_permissions', |
|
355 | '/user_groups/{user_group_id}/edit/global_permissions', | |
360 | action='edit_global_perms', conditions={'method': ['GET']}) |
|
356 | action='edit_global_perms', conditions={'method': ['GET']}) | |
361 | m.connect('edit_user_group_global_perms', |
|
357 | m.connect('edit_user_group_global_perms', | |
362 | '/user_groups/{user_group_id}/edit/global_permissions', |
|
358 | '/user_groups/{user_group_id}/edit/global_permissions', | |
363 | action='update_global_perms', conditions={'method': ['PUT']}) |
|
359 | action='update_global_perms', conditions={'method': ['PUT']}) | |
364 | m.connect('edit_user_group_perms_summary', |
|
360 | m.connect('edit_user_group_perms_summary', | |
365 | '/user_groups/{user_group_id}/edit/permissions_summary', |
|
361 | '/user_groups/{user_group_id}/edit/permissions_summary', | |
366 | action='edit_perms_summary', conditions={'method': ['GET']}) |
|
362 | action='edit_perms_summary', conditions={'method': ['GET']}) | |
367 |
|
363 | |||
368 | m.connect('edit_user_group_perms', |
|
364 | m.connect('edit_user_group_perms', | |
369 | '/user_groups/{user_group_id}/edit/permissions', |
|
365 | '/user_groups/{user_group_id}/edit/permissions', | |
370 | action='edit_perms', conditions={'method': ['GET']}) |
|
366 | action='edit_perms', conditions={'method': ['GET']}) | |
371 | m.connect('edit_user_group_perms', |
|
367 | m.connect('edit_user_group_perms', | |
372 | '/user_groups/{user_group_id}/edit/permissions', |
|
368 | '/user_groups/{user_group_id}/edit/permissions', | |
373 | action='update_perms', conditions={'method': ['PUT']}) |
|
369 | action='update_perms', conditions={'method': ['PUT']}) | |
374 |
|
370 | |||
375 | m.connect('edit_user_group_advanced', |
|
371 | m.connect('edit_user_group_advanced', | |
376 | '/user_groups/{user_group_id}/edit/advanced', |
|
372 | '/user_groups/{user_group_id}/edit/advanced', | |
377 | action='edit_advanced', conditions={'method': ['GET']}) |
|
373 | action='edit_advanced', conditions={'method': ['GET']}) | |
378 |
|
374 | |||
379 | m.connect('edit_user_group_members', |
|
375 | m.connect('edit_user_group_members', | |
380 | '/user_groups/{user_group_id}/edit/members', jsroute=True, |
|
376 | '/user_groups/{user_group_id}/edit/members', jsroute=True, | |
381 | action='edit_members', conditions={'method': ['GET']}) |
|
377 | action='edit_members', conditions={'method': ['GET']}) | |
382 |
|
378 | |||
383 | # ADMIN PERMISSIONS ROUTES |
|
379 | # ADMIN PERMISSIONS ROUTES | |
384 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
380 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
385 | controller='admin/permissions') as m: |
|
381 | controller='admin/permissions') as m: | |
386 | m.connect('admin_permissions_application', '/permissions/application', |
|
382 | m.connect('admin_permissions_application', '/permissions/application', | |
387 | action='permission_application_update', conditions={'method': ['POST']}) |
|
383 | action='permission_application_update', conditions={'method': ['POST']}) | |
388 | m.connect('admin_permissions_application', '/permissions/application', |
|
384 | m.connect('admin_permissions_application', '/permissions/application', | |
389 | action='permission_application', conditions={'method': ['GET']}) |
|
385 | action='permission_application', conditions={'method': ['GET']}) | |
390 |
|
386 | |||
391 | m.connect('admin_permissions_global', '/permissions/global', |
|
387 | m.connect('admin_permissions_global', '/permissions/global', | |
392 | action='permission_global_update', conditions={'method': ['POST']}) |
|
388 | action='permission_global_update', conditions={'method': ['POST']}) | |
393 | m.connect('admin_permissions_global', '/permissions/global', |
|
389 | m.connect('admin_permissions_global', '/permissions/global', | |
394 | action='permission_global', conditions={'method': ['GET']}) |
|
390 | action='permission_global', conditions={'method': ['GET']}) | |
395 |
|
391 | |||
396 | m.connect('admin_permissions_object', '/permissions/object', |
|
392 | m.connect('admin_permissions_object', '/permissions/object', | |
397 | action='permission_objects_update', conditions={'method': ['POST']}) |
|
393 | action='permission_objects_update', conditions={'method': ['POST']}) | |
398 | m.connect('admin_permissions_object', '/permissions/object', |
|
394 | m.connect('admin_permissions_object', '/permissions/object', | |
399 | action='permission_objects', conditions={'method': ['GET']}) |
|
395 | action='permission_objects', conditions={'method': ['GET']}) | |
400 |
|
396 | |||
401 | m.connect('admin_permissions_ips', '/permissions/ips', |
|
397 | m.connect('admin_permissions_ips', '/permissions/ips', | |
402 | action='permission_ips', conditions={'method': ['POST']}) |
|
398 | action='permission_ips', conditions={'method': ['POST']}) | |
403 | m.connect('admin_permissions_ips', '/permissions/ips', |
|
399 | m.connect('admin_permissions_ips', '/permissions/ips', | |
404 | action='permission_ips', conditions={'method': ['GET']}) |
|
400 | action='permission_ips', conditions={'method': ['GET']}) | |
405 |
|
401 | |||
406 | m.connect('admin_permissions_overview', '/permissions/overview', |
|
402 | m.connect('admin_permissions_overview', '/permissions/overview', | |
407 | action='permission_perms', conditions={'method': ['GET']}) |
|
403 | action='permission_perms', conditions={'method': ['GET']}) | |
408 |
|
404 | |||
409 | # ADMIN DEFAULTS REST ROUTES |
|
405 | # ADMIN DEFAULTS REST ROUTES | |
410 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
406 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
411 | controller='admin/defaults') as m: |
|
407 | controller='admin/defaults') as m: | |
412 | m.connect('admin_defaults_repositories', '/defaults/repositories', |
|
408 | m.connect('admin_defaults_repositories', '/defaults/repositories', | |
413 | action='update_repository_defaults', conditions={'method': ['POST']}) |
|
409 | action='update_repository_defaults', conditions={'method': ['POST']}) | |
414 | m.connect('admin_defaults_repositories', '/defaults/repositories', |
|
410 | m.connect('admin_defaults_repositories', '/defaults/repositories', | |
415 | action='index', conditions={'method': ['GET']}) |
|
411 | action='index', conditions={'method': ['GET']}) | |
416 |
|
412 | |||
417 | # ADMIN DEBUG STYLE ROUTES |
|
413 | # ADMIN DEBUG STYLE ROUTES | |
418 | if str2bool(config.get('debug_style')): |
|
414 | if str2bool(config.get('debug_style')): | |
419 | with rmap.submapper(path_prefix=ADMIN_PREFIX + '/debug_style', |
|
415 | with rmap.submapper(path_prefix=ADMIN_PREFIX + '/debug_style', | |
420 | controller='debug_style') as m: |
|
416 | controller='debug_style') as m: | |
421 | m.connect('debug_style_home', '', |
|
417 | m.connect('debug_style_home', '', | |
422 | action='index', conditions={'method': ['GET']}) |
|
418 | action='index', conditions={'method': ['GET']}) | |
423 | m.connect('debug_style_template', '/t/{t_path}', |
|
419 | m.connect('debug_style_template', '/t/{t_path}', | |
424 | action='template', conditions={'method': ['GET']}) |
|
420 | action='template', conditions={'method': ['GET']}) | |
425 |
|
421 | |||
426 | # ADMIN SETTINGS ROUTES |
|
422 | # ADMIN SETTINGS ROUTES | |
427 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
|
423 | with rmap.submapper(path_prefix=ADMIN_PREFIX, | |
428 | controller='admin/settings') as m: |
|
424 | controller='admin/settings') as m: | |
429 |
|
425 | |||
430 | # default |
|
426 | # default | |
431 | m.connect('admin_settings', '/settings', |
|
427 | m.connect('admin_settings', '/settings', | |
432 | action='settings_global_update', |
|
428 | action='settings_global_update', | |
433 | conditions={'method': ['POST']}) |
|
429 | conditions={'method': ['POST']}) | |
434 | m.connect('admin_settings', '/settings', |
|
430 | m.connect('admin_settings', '/settings', | |
435 | action='settings_global', conditions={'method': ['GET']}) |
|
431 | action='settings_global', conditions={'method': ['GET']}) | |
436 |
|
432 | |||
437 | m.connect('admin_settings_vcs', '/settings/vcs', |
|
433 | m.connect('admin_settings_vcs', '/settings/vcs', | |
438 | action='settings_vcs_update', |
|
434 | action='settings_vcs_update', | |
439 | conditions={'method': ['POST']}) |
|
435 | conditions={'method': ['POST']}) | |
440 | m.connect('admin_settings_vcs', '/settings/vcs', |
|
436 | m.connect('admin_settings_vcs', '/settings/vcs', | |
                  action='settings_vcs',
                  conditions={'method': ['GET']})
        m.connect('admin_settings_vcs', '/settings/vcs',
                  action='delete_svn_pattern',
                  conditions={'method': ['DELETE']})

        m.connect('admin_settings_mapping', '/settings/mapping',
                  action='settings_mapping_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_mapping', '/settings/mapping',
                  action='settings_mapping', conditions={'method': ['GET']})

        m.connect('admin_settings_global', '/settings/global',
                  action='settings_global_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_global', '/settings/global',
                  action='settings_global', conditions={'method': ['GET']})

        m.connect('admin_settings_visual', '/settings/visual',
                  action='settings_visual_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_visual', '/settings/visual',
                  action='settings_visual', conditions={'method': ['GET']})

        m.connect('admin_settings_issuetracker',
                  '/settings/issue-tracker', action='settings_issuetracker',
                  conditions={'method': ['GET']})
        m.connect('admin_settings_issuetracker_save',
                  '/settings/issue-tracker/save',
                  action='settings_issuetracker_save',
                  conditions={'method': ['POST']})
        m.connect('admin_issuetracker_test', '/settings/issue-tracker/test',
                  action='settings_issuetracker_test',
                  conditions={'method': ['POST']})
        m.connect('admin_issuetracker_delete',
                  '/settings/issue-tracker/delete',
                  action='settings_issuetracker_delete',
                  conditions={'method': ['DELETE']})

        m.connect('admin_settings_email', '/settings/email',
                  action='settings_email_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_email', '/settings/email',
                  action='settings_email', conditions={'method': ['GET']})

        m.connect('admin_settings_hooks', '/settings/hooks',
                  action='settings_hooks_update',
                  conditions={'method': ['POST', 'DELETE']})
        m.connect('admin_settings_hooks', '/settings/hooks',
                  action='settings_hooks', conditions={'method': ['GET']})

        m.connect('admin_settings_search', '/settings/search',
                  action='settings_search', conditions={'method': ['GET']})

        m.connect('admin_settings_system', '/settings/system',
                  action='settings_system', conditions={'method': ['GET']})

        m.connect('admin_settings_system_update', '/settings/system/updates',
                  action='settings_system_update', conditions={'method': ['GET']})

        m.connect('admin_settings_supervisor', '/settings/supervisor',
                  action='settings_supervisor', conditions={'method': ['GET']})
        m.connect('admin_settings_supervisor_log', '/settings/supervisor/{procid}/log',
                  action='settings_supervisor_log', conditions={'method': ['GET']})

        m.connect('admin_settings_labs', '/settings/labs',
                  action='settings_labs_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_labs', '/settings/labs',
                  action='settings_labs', conditions={'method': ['GET']})

        m.connect('admin_settings_open_source', '/settings/open_source',
                  action='settings_open_source',
                  conditions={'method': ['GET']})

    # ADMIN MY ACCOUNT
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/my_account') as m:

        m.connect('my_account', '/my_account',
                  action='my_account', conditions={'method': ['GET']})
        m.connect('my_account_edit', '/my_account/edit',
                  action='my_account_edit', conditions={'method': ['GET']})
        m.connect('my_account', '/my_account',
                  action='my_account_update', conditions={'method': ['POST']})

        m.connect('my_account_password', '/my_account/password',
                  action='my_account_password', conditions={'method': ['GET']})
        m.connect('my_account_password', '/my_account/password',
                  action='my_account_password_update', conditions={'method': ['POST']})

        m.connect('my_account_repos', '/my_account/repos',
                  action='my_account_repos', conditions={'method': ['GET']})

        m.connect('my_account_watched', '/my_account/watched',
                  action='my_account_watched', conditions={'method': ['GET']})

        m.connect('my_account_pullrequests', '/my_account/pull_requests',
                  action='my_account_pullrequests', conditions={'method': ['GET']})

        m.connect('my_account_perms', '/my_account/perms',
                  action='my_account_perms', conditions={'method': ['GET']})

        m.connect('my_account_emails', '/my_account/emails',
                  action='my_account_emails', conditions={'method': ['GET']})
        m.connect('my_account_emails', '/my_account/emails',
                  action='my_account_emails_add', conditions={'method': ['POST']})
        m.connect('my_account_emails', '/my_account/emails',
                  action='my_account_emails_delete', conditions={'method': ['DELETE']})

        m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
                  action='my_account_auth_tokens', conditions={'method': ['GET']})
        m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
                  action='my_account_auth_tokens_add', conditions={'method': ['POST']})
        m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
                  action='my_account_auth_tokens_delete', conditions={'method': ['DELETE']})

    # NOTIFICATION REST ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/notifications') as m:
        m.connect('notifications', '/notifications',
                  action='index', conditions={'method': ['GET']})
        m.connect('notifications_mark_all_read', '/notifications/mark_all_read',
                  action='mark_all_read', conditions={'method': ['POST']})

        m.connect('/notifications/{notification_id}',
                  action='update', conditions={'method': ['PUT']})
        m.connect('/notifications/{notification_id}',
                  action='delete', conditions={'method': ['DELETE']})
        m.connect('notification', '/notifications/{notification_id}',
                  action='show', conditions={'method': ['GET']})

    # ADMIN GIST
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/gists') as m:
        m.connect('gists', '/gists',
                  action='create', conditions={'method': ['POST']})
        m.connect('gists', '/gists', jsroute=True,
                  action='index', conditions={'method': ['GET']})
        m.connect('new_gist', '/gists/new', jsroute=True,
                  action='new', conditions={'method': ['GET']})

        m.connect('/gists/{gist_id}',
                  action='delete', conditions={'method': ['DELETE']})
        m.connect('edit_gist', '/gists/{gist_id}/edit',
                  action='edit_form', conditions={'method': ['GET']})
        m.connect('edit_gist', '/gists/{gist_id}/edit',
                  action='edit', conditions={'method': ['POST']})
        m.connect(
            'edit_gist_check_revision', '/gists/{gist_id}/edit/check_revision',
            action='check_revision', conditions={'method': ['GET']})

        m.connect('gist', '/gists/{gist_id}',
                  action='show', conditions={'method': ['GET']})
        m.connect('gist_rev', '/gists/{gist_id}/{revision}',
                  revision='tip',
                  action='show', conditions={'method': ['GET']})
        m.connect('formatted_gist', '/gists/{gist_id}/{revision}/{format}',
                  revision='tip',
                  action='show', conditions={'method': ['GET']})
        m.connect('formatted_gist_file', '/gists/{gist_id}/{revision}/{format}/{f_path}',
                  revision='tip',
                  action='show', conditions={'method': ['GET']},
                  requirements=URL_NAME_REQUIREMENTS)

    # ADMIN MAIN PAGES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/admin') as m:
        m.connect('admin_home', '', action='index')
        m.connect('admin_add_repo', '/add_repo/{new_repo:[a-z0-9\. _-]*}',
                  action='add_repo')
        m.connect(
            'pull_requests_global_0', '/pull_requests/{pull_request_id:[0-9]+}',
            action='pull_requests')
        m.connect(
            'pull_requests_global', '/pull-requests/{pull_request_id:[0-9]+}',
            action='pull_requests')


    # USER JOURNAL
    rmap.connect('journal', '%s/journal' % (ADMIN_PREFIX,),
                 controller='journal', action='index')
    rmap.connect('journal_rss', '%s/journal/rss' % (ADMIN_PREFIX,),
                 controller='journal', action='journal_rss')
    rmap.connect('journal_atom', '%s/journal/atom' % (ADMIN_PREFIX,),
                 controller='journal', action='journal_atom')

    rmap.connect('public_journal', '%s/public_journal' % (ADMIN_PREFIX,),
                 controller='journal', action='public_journal')

    rmap.connect('public_journal_rss', '%s/public_journal/rss' % (ADMIN_PREFIX,),
                 controller='journal', action='public_journal_rss')

    rmap.connect('public_journal_rss_old', '%s/public_journal_rss' % (ADMIN_PREFIX,),
                 controller='journal', action='public_journal_rss')

    rmap.connect('public_journal_atom',
                 '%s/public_journal/atom' % (ADMIN_PREFIX,), controller='journal',
                 action='public_journal_atom')

    rmap.connect('public_journal_atom_old',
                 '%s/public_journal_atom' % (ADMIN_PREFIX,), controller='journal',
                 action='public_journal_atom')

    rmap.connect('toggle_following', '%s/toggle_following' % (ADMIN_PREFIX,),
                 controller='journal', action='toggle_following', jsroute=True,
                 conditions={'method': ['POST']})

    # FULL TEXT SEARCH
    rmap.connect('search', '%s/search' % (ADMIN_PREFIX,),
                 controller='search')
    rmap.connect('search_repo_home', '/{repo_name}/search',
                 controller='search',
                 action='index',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    # FEEDS
    rmap.connect('rss_feed_home', '/{repo_name}/feed/rss',
                 controller='feed', action='rss',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('atom_feed_home', '/{repo_name}/feed/atom',
                 controller='feed', action='atom',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    #==========================================================================
    # REPOSITORY ROUTES
    #==========================================================================

    rmap.connect('repo_creating_home', '/{repo_name}/repo_creating',
                 controller='admin/repos', action='repo_creating',
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('repo_check_home', '/{repo_name}/crepo_check',
                 controller='admin/repos', action='repo_check',
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('repo_stats', '/{repo_name}/repo_stats/{commit_id}',
                 controller='summary', action='repo_stats',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS, jsroute=True)

    rmap.connect('repo_refs_data', '/{repo_name}/refs-data',
                 controller='summary', action='repo_refs_data', jsroute=True,
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('repo_refs_changelog_data', '/{repo_name}/refs-data-changelog',
                 controller='summary', action='repo_refs_changelog_data',
                 requirements=URL_NAME_REQUIREMENTS, jsroute=True)

    rmap.connect('changeset_home', '/{repo_name}/changeset/{revision}',
                 controller='changeset', revision='tip', jsroute=True,
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('changeset_children', '/{repo_name}/changeset_children/{revision}',
                 controller='changeset', revision='tip', action='changeset_children',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('changeset_parents', '/{repo_name}/changeset_parents/{revision}',
                 controller='changeset', revision='tip', action='changeset_parents',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    # repo edit options
    rmap.connect('edit_repo', '/{repo_name}/settings', jsroute=True,
                 controller='admin/repos', action='edit',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_perms', '/{repo_name}/settings/permissions',
                 jsroute=True,
                 controller='admin/repos', action='edit_permissions',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('edit_repo_perms_update', '/{repo_name}/settings/permissions',
                 controller='admin/repos', action='edit_permissions_update',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_fields', '/{repo_name}/settings/fields',
                 controller='admin/repos', action='edit_fields',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('create_repo_fields', '/{repo_name}/settings/fields/new',
                 controller='admin/repos', action='create_repo_field',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('delete_repo_fields', '/{repo_name}/settings/fields/{field_id}',
                 controller='admin/repos', action='delete_repo_field',
                 conditions={'method': ['DELETE'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced', '/{repo_name}/settings/advanced',
                 controller='admin/repos', action='edit_advanced',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced_locking', '/{repo_name}/settings/advanced/locking',
                 controller='admin/repos', action='edit_advanced_locking',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('toggle_locking', '/{repo_name}/settings/advanced/locking_toggle',
                 controller='admin/repos', action='toggle_locking',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced_journal', '/{repo_name}/settings/advanced/journal',
                 controller='admin/repos', action='edit_advanced_journal',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced_fork', '/{repo_name}/settings/advanced/fork',
                 controller='admin/repos', action='edit_advanced_fork',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_caches', '/{repo_name}/settings/caches',
                 controller='admin/repos', action='edit_caches_form',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('edit_repo_caches', '/{repo_name}/settings/caches',
                 controller='admin/repos', action='edit_caches',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_remote', '/{repo_name}/settings/remote',
                 controller='admin/repos', action='edit_remote_form',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('edit_repo_remote', '/{repo_name}/settings/remote',
                 controller='admin/repos', action='edit_remote',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_statistics', '/{repo_name}/settings/statistics',
                 controller='admin/repos', action='edit_statistics_form',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('edit_repo_statistics', '/{repo_name}/settings/statistics',
                 controller='admin/repos', action='edit_statistics',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('repo_settings_issuetracker',
                 '/{repo_name}/settings/issue-tracker',
                 controller='admin/repos', action='repo_issuetracker',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('repo_issuetracker_test',
                 '/{repo_name}/settings/issue-tracker/test',
                 controller='admin/repos', action='repo_issuetracker_test',
                 conditions={'method': ['POST'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
|
785 | requirements=URL_NAME_REQUIREMENTS) | |
794 | rmap.connect('repo_issuetracker_delete', |
|
786 | rmap.connect('repo_issuetracker_delete', | |
795 | '/{repo_name}/settings/issue-tracker/delete', |
|
787 | '/{repo_name}/settings/issue-tracker/delete', | |
796 | controller='admin/repos', action='repo_issuetracker_delete', |
|
788 | controller='admin/repos', action='repo_issuetracker_delete', | |
797 | conditions={'method': ['DELETE'], 'function': check_repo}, |
|
789 | conditions={'method': ['DELETE'], 'function': check_repo}, | |
798 | requirements=URL_NAME_REQUIREMENTS) |
|
790 | requirements=URL_NAME_REQUIREMENTS) | |
799 | rmap.connect('repo_issuetracker_save', |
|
791 | rmap.connect('repo_issuetracker_save', | |
800 | '/{repo_name}/settings/issue-tracker/save', |
|
792 | '/{repo_name}/settings/issue-tracker/save', | |
801 | controller='admin/repos', action='repo_issuetracker_save', |
|
793 | controller='admin/repos', action='repo_issuetracker_save', | |
802 | conditions={'method': ['POST'], 'function': check_repo}, |
|
794 | conditions={'method': ['POST'], 'function': check_repo}, | |
803 | requirements=URL_NAME_REQUIREMENTS) |
|
795 | requirements=URL_NAME_REQUIREMENTS) | |
804 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', |
|
796 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', | |
805 | controller='admin/repos', action='repo_settings_vcs_update', |
|
797 | controller='admin/repos', action='repo_settings_vcs_update', | |
806 | conditions={'method': ['POST'], 'function': check_repo}, |
|
798 | conditions={'method': ['POST'], 'function': check_repo}, | |
807 | requirements=URL_NAME_REQUIREMENTS) |
|
799 | requirements=URL_NAME_REQUIREMENTS) | |
808 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', |
|
800 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', | |
809 | controller='admin/repos', action='repo_settings_vcs', |
|
801 | controller='admin/repos', action='repo_settings_vcs', | |
810 | conditions={'method': ['GET'], 'function': check_repo}, |
|
802 | conditions={'method': ['GET'], 'function': check_repo}, | |
811 | requirements=URL_NAME_REQUIREMENTS) |
|
803 | requirements=URL_NAME_REQUIREMENTS) | |
812 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', |
|
804 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', | |
813 | controller='admin/repos', action='repo_delete_svn_pattern', |
|
805 | controller='admin/repos', action='repo_delete_svn_pattern', | |
814 | conditions={'method': ['DELETE'], 'function': check_repo}, |
|
806 | conditions={'method': ['DELETE'], 'function': check_repo}, | |
815 | requirements=URL_NAME_REQUIREMENTS) |
|
807 | requirements=URL_NAME_REQUIREMENTS) | |
816 |
|
808 | |||
817 | # still working url for backward compat. |
|
809 | # still working url for backward compat. | |
818 | rmap.connect('raw_changeset_home_depraced', |
|
810 | rmap.connect('raw_changeset_home_depraced', | |
819 | '/{repo_name}/raw-changeset/{revision}', |
|
811 | '/{repo_name}/raw-changeset/{revision}', | |
820 | controller='changeset', action='changeset_raw', |
|
812 | controller='changeset', action='changeset_raw', | |
821 | revision='tip', conditions={'function': check_repo}, |
|
813 | revision='tip', conditions={'function': check_repo}, | |
822 | requirements=URL_NAME_REQUIREMENTS) |
|
814 | requirements=URL_NAME_REQUIREMENTS) | |
823 |
|
815 | |||
824 | # new URLs |
|
816 | # new URLs | |
825 | rmap.connect('changeset_raw_home', |
|
817 | rmap.connect('changeset_raw_home', | |
826 | '/{repo_name}/changeset-diff/{revision}', |
|
818 | '/{repo_name}/changeset-diff/{revision}', | |
827 | controller='changeset', action='changeset_raw', |
|
819 | controller='changeset', action='changeset_raw', | |
828 | revision='tip', conditions={'function': check_repo}, |
|
820 | revision='tip', conditions={'function': check_repo}, | |
829 | requirements=URL_NAME_REQUIREMENTS) |
|
821 | requirements=URL_NAME_REQUIREMENTS) | |
830 |
|
822 | |||
831 | rmap.connect('changeset_patch_home', |
|
823 | rmap.connect('changeset_patch_home', | |
832 | '/{repo_name}/changeset-patch/{revision}', |
|
824 | '/{repo_name}/changeset-patch/{revision}', | |
833 | controller='changeset', action='changeset_patch', |
|
825 | controller='changeset', action='changeset_patch', | |
834 | revision='tip', conditions={'function': check_repo}, |
|
826 | revision='tip', conditions={'function': check_repo}, | |
835 | requirements=URL_NAME_REQUIREMENTS) |
|
827 | requirements=URL_NAME_REQUIREMENTS) | |
836 |
|
828 | |||
837 | rmap.connect('changeset_download_home', |
|
829 | rmap.connect('changeset_download_home', | |
838 | '/{repo_name}/changeset-download/{revision}', |
|
830 | '/{repo_name}/changeset-download/{revision}', | |
839 | controller='changeset', action='changeset_download', |
|
831 | controller='changeset', action='changeset_download', | |
840 | revision='tip', conditions={'function': check_repo}, |
|
832 | revision='tip', conditions={'function': check_repo}, | |
841 | requirements=URL_NAME_REQUIREMENTS) |
|
833 | requirements=URL_NAME_REQUIREMENTS) | |
842 |
|
834 | |||
843 | rmap.connect('changeset_comment', |
|
835 | rmap.connect('changeset_comment', | |
844 | '/{repo_name}/changeset/{revision}/comment', jsroute=True, |
|
836 | '/{repo_name}/changeset/{revision}/comment', jsroute=True, | |
845 | controller='changeset', revision='tip', action='comment', |
|
837 | controller='changeset', revision='tip', action='comment', | |
846 | conditions={'function': check_repo}, |
|
838 | conditions={'function': check_repo}, | |
847 | requirements=URL_NAME_REQUIREMENTS) |
|
839 | requirements=URL_NAME_REQUIREMENTS) | |
848 |
|
840 | |||
849 | rmap.connect('changeset_comment_preview', |
|
841 | rmap.connect('changeset_comment_preview', | |
850 | '/{repo_name}/changeset/comment/preview', jsroute=True, |
|
842 | '/{repo_name}/changeset/comment/preview', jsroute=True, | |
851 | controller='changeset', action='preview_comment', |
|
843 | controller='changeset', action='preview_comment', | |
852 | conditions={'function': check_repo, 'method': ['POST']}, |
|
844 | conditions={'function': check_repo, 'method': ['POST']}, | |
853 | requirements=URL_NAME_REQUIREMENTS) |
|
845 | requirements=URL_NAME_REQUIREMENTS) | |
854 |
|
846 | |||
855 | rmap.connect('changeset_comment_delete', |
|
847 | rmap.connect('changeset_comment_delete', | |
856 | '/{repo_name}/changeset/comment/{comment_id}/delete', |
|
848 | '/{repo_name}/changeset/comment/{comment_id}/delete', | |
857 | controller='changeset', action='delete_comment', |
|
849 | controller='changeset', action='delete_comment', | |
858 | conditions={'function': check_repo, 'method': ['DELETE']}, |
|
850 | conditions={'function': check_repo, 'method': ['DELETE']}, | |
859 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
851 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
860 |
|
852 | |||
861 | rmap.connect('changeset_info', '/changeset_info/{repo_name}/{revision}', |
|
853 | rmap.connect('changeset_info', '/changeset_info/{repo_name}/{revision}', | |
862 | controller='changeset', action='changeset_info', |
|
854 | controller='changeset', action='changeset_info', | |
863 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
855 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
864 |
|
856 | |||
865 | rmap.connect('compare_home', |
|
857 | rmap.connect('compare_home', | |
866 | '/{repo_name}/compare', |
|
858 | '/{repo_name}/compare', | |
867 | controller='compare', action='index', |
|
859 | controller='compare', action='index', | |
868 | conditions={'function': check_repo}, |
|
860 | conditions={'function': check_repo}, | |
869 | requirements=URL_NAME_REQUIREMENTS) |
|
861 | requirements=URL_NAME_REQUIREMENTS) | |
870 |
|
862 | |||
871 | rmap.connect('compare_url', |
|
863 | rmap.connect('compare_url', | |
872 | '/{repo_name}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', |
|
864 | '/{repo_name}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', | |
873 | controller='compare', action='compare', |
|
865 | controller='compare', action='compare', | |
874 | conditions={'function': check_repo}, |
|
866 | conditions={'function': check_repo}, | |
875 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
867 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
876 |
|
868 | |||
877 | rmap.connect('pullrequest_home', |
|
869 | rmap.connect('pullrequest_home', | |
878 | '/{repo_name}/pull-request/new', controller='pullrequests', |
|
870 | '/{repo_name}/pull-request/new', controller='pullrequests', | |
879 | action='index', conditions={'function': check_repo, |
|
871 | action='index', conditions={'function': check_repo, | |
880 | 'method': ['GET']}, |
|
872 | 'method': ['GET']}, | |
881 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
873 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
882 |
|
874 | |||
883 | rmap.connect('pullrequest', |
|
875 | rmap.connect('pullrequest', | |
884 | '/{repo_name}/pull-request/new', controller='pullrequests', |
|
876 | '/{repo_name}/pull-request/new', controller='pullrequests', | |
885 | action='create', conditions={'function': check_repo, |
|
877 | action='create', conditions={'function': check_repo, | |
886 | 'method': ['POST']}, |
|
878 | 'method': ['POST']}, | |
887 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
879 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
888 |
|
880 | |||
889 | rmap.connect('pullrequest_repo_refs', |
|
881 | rmap.connect('pullrequest_repo_refs', | |
890 | '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}', |
|
882 | '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}', | |
891 | controller='pullrequests', |
|
883 | controller='pullrequests', | |
892 | action='get_repo_refs', |
|
884 | action='get_repo_refs', | |
893 | conditions={'function': check_repo, 'method': ['GET']}, |
|
885 | conditions={'function': check_repo, 'method': ['GET']}, | |
894 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
886 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
895 |
|
887 | |||
896 | rmap.connect('pullrequest_repo_destinations', |
|
888 | rmap.connect('pullrequest_repo_destinations', | |
897 | '/{repo_name}/pull-request/repo-destinations', |
|
889 | '/{repo_name}/pull-request/repo-destinations', | |
898 | controller='pullrequests', |
|
890 | controller='pullrequests', | |
899 | action='get_repo_destinations', |
|
891 | action='get_repo_destinations', | |
900 | conditions={'function': check_repo, 'method': ['GET']}, |
|
892 | conditions={'function': check_repo, 'method': ['GET']}, | |
901 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
893 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
902 |
|
894 | |||
903 | rmap.connect('pullrequest_show', |
|
895 | rmap.connect('pullrequest_show', | |
904 | '/{repo_name}/pull-request/{pull_request_id}', |
|
896 | '/{repo_name}/pull-request/{pull_request_id}', | |
905 | controller='pullrequests', |
|
897 | controller='pullrequests', | |
906 | action='show', conditions={'function': check_repo, |
|
898 | action='show', conditions={'function': check_repo, | |
907 | 'method': ['GET']}, |
|
899 | 'method': ['GET']}, | |
908 | requirements=URL_NAME_REQUIREMENTS) |
|
900 | requirements=URL_NAME_REQUIREMENTS) | |
909 |
|
901 | |||
910 | rmap.connect('pullrequest_update', |
|
902 | rmap.connect('pullrequest_update', | |
911 | '/{repo_name}/pull-request/{pull_request_id}', |
|
903 | '/{repo_name}/pull-request/{pull_request_id}', | |
912 | controller='pullrequests', |
|
904 | controller='pullrequests', | |
913 | action='update', conditions={'function': check_repo, |
|
905 | action='update', conditions={'function': check_repo, | |
914 | 'method': ['PUT']}, |
|
906 | 'method': ['PUT']}, | |
915 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
907 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
916 |
|
908 | |||
917 | rmap.connect('pullrequest_merge', |
|
909 | rmap.connect('pullrequest_merge', | |
918 | '/{repo_name}/pull-request/{pull_request_id}', |
|
910 | '/{repo_name}/pull-request/{pull_request_id}', | |
919 | controller='pullrequests', |
|
911 | controller='pullrequests', | |
920 | action='merge', conditions={'function': check_repo, |
|
912 | action='merge', conditions={'function': check_repo, | |
921 | 'method': ['POST']}, |
|
913 | 'method': ['POST']}, | |
922 | requirements=URL_NAME_REQUIREMENTS) |
|
914 | requirements=URL_NAME_REQUIREMENTS) | |
923 |
|
915 | |||
924 | rmap.connect('pullrequest_delete', |
|
916 | rmap.connect('pullrequest_delete', | |
925 | '/{repo_name}/pull-request/{pull_request_id}', |
|
917 | '/{repo_name}/pull-request/{pull_request_id}', | |
926 | controller='pullrequests', |
|
918 | controller='pullrequests', | |
927 | action='delete', conditions={'function': check_repo, |
|
919 | action='delete', conditions={'function': check_repo, | |
928 | 'method': ['DELETE']}, |
|
920 | 'method': ['DELETE']}, | |
929 | requirements=URL_NAME_REQUIREMENTS) |
|
921 | requirements=URL_NAME_REQUIREMENTS) | |
930 |
|
922 | |||
931 | rmap.connect('pullrequest_show_all', |
|
923 | rmap.connect('pullrequest_show_all', | |
932 | '/{repo_name}/pull-request', |
|
924 | '/{repo_name}/pull-request', | |
933 | controller='pullrequests', |
|
925 | controller='pullrequests', | |
934 | action='show_all', conditions={'function': check_repo, |
|
926 | action='show_all', conditions={'function': check_repo, | |
935 | 'method': ['GET']}, |
|
927 | 'method': ['GET']}, | |
936 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
928 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
937 |
|
929 | |||
938 | rmap.connect('pullrequest_comment', |
|
930 | rmap.connect('pullrequest_comment', | |
939 | '/{repo_name}/pull-request-comment/{pull_request_id}', |
|
931 | '/{repo_name}/pull-request-comment/{pull_request_id}', | |
940 | controller='pullrequests', |
|
932 | controller='pullrequests', | |
941 | action='comment', conditions={'function': check_repo, |
|
933 | action='comment', conditions={'function': check_repo, | |
942 | 'method': ['POST']}, |
|
934 | 'method': ['POST']}, | |
943 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
935 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
944 |
|
936 | |||
945 | rmap.connect('pullrequest_comment_delete', |
|
937 | rmap.connect('pullrequest_comment_delete', | |
946 | '/{repo_name}/pull-request-comment/{comment_id}/delete', |
|
938 | '/{repo_name}/pull-request-comment/{comment_id}/delete', | |
947 | controller='pullrequests', action='delete_comment', |
|
939 | controller='pullrequests', action='delete_comment', | |
948 | conditions={'function': check_repo, 'method': ['DELETE']}, |
|
940 | conditions={'function': check_repo, 'method': ['DELETE']}, | |
949 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
941 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
950 |
|
942 | |||
951 | rmap.connect('summary_home_explicit', '/{repo_name}/summary', |
|
943 | rmap.connect('summary_home_explicit', '/{repo_name}/summary', | |
952 | controller='summary', conditions={'function': check_repo}, |
|
944 | controller='summary', conditions={'function': check_repo}, | |
953 | requirements=URL_NAME_REQUIREMENTS) |
|
945 | requirements=URL_NAME_REQUIREMENTS) | |
954 |
|
946 | |||
955 | rmap.connect('branches_home', '/{repo_name}/branches', |
|
947 | rmap.connect('branches_home', '/{repo_name}/branches', | |
956 | controller='branches', conditions={'function': check_repo}, |
|
948 | controller='branches', conditions={'function': check_repo}, | |
957 | requirements=URL_NAME_REQUIREMENTS) |
|
949 | requirements=URL_NAME_REQUIREMENTS) | |
958 |
|
950 | |||
959 | rmap.connect('tags_home', '/{repo_name}/tags', |
|
951 | rmap.connect('tags_home', '/{repo_name}/tags', | |
960 | controller='tags', conditions={'function': check_repo}, |
|
952 | controller='tags', conditions={'function': check_repo}, | |
961 | requirements=URL_NAME_REQUIREMENTS) |
|
953 | requirements=URL_NAME_REQUIREMENTS) | |
962 |
|
954 | |||
963 | rmap.connect('bookmarks_home', '/{repo_name}/bookmarks', |
|
955 | rmap.connect('bookmarks_home', '/{repo_name}/bookmarks', | |
964 | controller='bookmarks', conditions={'function': check_repo}, |
|
956 | controller='bookmarks', conditions={'function': check_repo}, | |
965 | requirements=URL_NAME_REQUIREMENTS) |
|
957 | requirements=URL_NAME_REQUIREMENTS) | |
966 |
|
958 | |||
967 | rmap.connect('changelog_home', '/{repo_name}/changelog', jsroute=True, |
|
959 | rmap.connect('changelog_home', '/{repo_name}/changelog', jsroute=True, | |
968 | controller='changelog', conditions={'function': check_repo}, |
|
960 | controller='changelog', conditions={'function': check_repo}, | |
969 | requirements=URL_NAME_REQUIREMENTS) |
|
961 | requirements=URL_NAME_REQUIREMENTS) | |
970 |
|
962 | |||
971 | rmap.connect('changelog_summary_home', '/{repo_name}/changelog_summary', |
|
963 | rmap.connect('changelog_summary_home', '/{repo_name}/changelog_summary', | |
972 | controller='changelog', action='changelog_summary', |
|
964 | controller='changelog', action='changelog_summary', | |
973 | conditions={'function': check_repo}, |
|
965 | conditions={'function': check_repo}, | |
974 | requirements=URL_NAME_REQUIREMENTS) |
|
966 | requirements=URL_NAME_REQUIREMENTS) | |
975 |
|
967 | |||
976 | rmap.connect('changelog_file_home', |
|
968 | rmap.connect('changelog_file_home', | |
977 | '/{repo_name}/changelog/{revision}/{f_path}', |
|
969 | '/{repo_name}/changelog/{revision}/{f_path}', | |
978 | controller='changelog', f_path=None, |
|
970 | controller='changelog', f_path=None, | |
979 | conditions={'function': check_repo}, |
|
971 | conditions={'function': check_repo}, | |
980 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
972 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
981 |
|
973 | |||
982 | rmap.connect('changelog_details', '/{repo_name}/changelog_details/{cs}', |
|
974 | rmap.connect('changelog_details', '/{repo_name}/changelog_details/{cs}', | |
983 | controller='changelog', action='changelog_details', |
|
975 | controller='changelog', action='changelog_details', | |
984 | conditions={'function': check_repo}, |
|
976 | conditions={'function': check_repo}, | |
985 | requirements=URL_NAME_REQUIREMENTS) |
|
977 | requirements=URL_NAME_REQUIREMENTS) | |
986 |
|
978 | |||
987 | rmap.connect('files_home', '/{repo_name}/files/{revision}/{f_path}', |
|
979 | rmap.connect('files_home', '/{repo_name}/files/{revision}/{f_path}', | |
988 | controller='files', revision='tip', f_path='', |
|
980 | controller='files', revision='tip', f_path='', | |
989 | conditions={'function': check_repo}, |
|
981 | conditions={'function': check_repo}, | |
990 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
982 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
991 |
|
983 | |||
992 | rmap.connect('files_home_simple_catchrev', |
|
984 | rmap.connect('files_home_simple_catchrev', | |
993 | '/{repo_name}/files/{revision}', |
|
985 | '/{repo_name}/files/{revision}', | |
994 | controller='files', revision='tip', f_path='', |
|
986 | controller='files', revision='tip', f_path='', | |
995 | conditions={'function': check_repo}, |
|
987 | conditions={'function': check_repo}, | |
996 | requirements=URL_NAME_REQUIREMENTS) |
|
988 | requirements=URL_NAME_REQUIREMENTS) | |
997 |
|
989 | |||
998 | rmap.connect('files_home_simple_catchall', |
|
990 | rmap.connect('files_home_simple_catchall', | |
999 | '/{repo_name}/files', |
|
991 | '/{repo_name}/files', | |
1000 | controller='files', revision='tip', f_path='', |
|
992 | controller='files', revision='tip', f_path='', | |
1001 | conditions={'function': check_repo}, |
|
993 | conditions={'function': check_repo}, | |
1002 | requirements=URL_NAME_REQUIREMENTS) |
|
994 | requirements=URL_NAME_REQUIREMENTS) | |
1003 |
|
995 | |||
1004 | rmap.connect('files_history_home', |
|
996 | rmap.connect('files_history_home', | |
1005 | '/{repo_name}/history/{revision}/{f_path}', |
|
997 | '/{repo_name}/history/{revision}/{f_path}', | |
1006 | controller='files', action='history', revision='tip', f_path='', |
|
998 | controller='files', action='history', revision='tip', f_path='', | |
1007 | conditions={'function': check_repo}, |
|
999 | conditions={'function': check_repo}, | |
1008 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1000 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
1009 |
|
1001 | |||
1010 | rmap.connect('files_authors_home', |
|
1002 | rmap.connect('files_authors_home', | |
1011 | '/{repo_name}/authors/{revision}/{f_path}', |
|
1003 | '/{repo_name}/authors/{revision}/{f_path}', | |
1012 | controller='files', action='authors', revision='tip', f_path='', |
|
1004 | controller='files', action='authors', revision='tip', f_path='', | |
1013 | conditions={'function': check_repo}, |
|
1005 | conditions={'function': check_repo}, | |
1014 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1006 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) | |
1015 |
|
1007 | |||
1016 | rmap.connect('files_diff_home', '/{repo_name}/diff/{f_path}', |
|
1008 | rmap.connect('files_diff_home', '/{repo_name}/diff/{f_path}', | |
1017 | controller='files', action='diff', f_path='', |
|
1009 | controller='files', action='diff', f_path='', | |
1018 | conditions={'function': check_repo}, |
|
1010 | conditions={'function': check_repo}, | |
1019 | requirements=URL_NAME_REQUIREMENTS) |
|
1011 | requirements=URL_NAME_REQUIREMENTS) | |
1020 |
|
1012 | |||
1021 | rmap.connect('files_diff_2way_home', |
|
1013 | rmap.connect('files_diff_2way_home', | |
1022 | '/{repo_name}/diff-2way/{f_path}', |
|
1014 | '/{repo_name}/diff-2way/{f_path}', | |
1023 | controller='files', action='diff_2way', f_path='', |
|
1015 | controller='files', action='diff_2way', f_path='', | |
1024 | conditions={'function': check_repo}, |
|
1016 | conditions={'function': check_repo}, | |
1025 | requirements=URL_NAME_REQUIREMENTS) |
|
1017 | requirements=URL_NAME_REQUIREMENTS) | |
1026 |
|
1018 | |||
1027 | rmap.connect('files_rawfile_home', |
|
1019 | rmap.connect('files_rawfile_home', | |
1028 | '/{repo_name}/rawfile/{revision}/{f_path}', |
|
1020 | '/{repo_name}/rawfile/{revision}/{f_path}', | |
1029 | controller='files', action='rawfile', revision='tip', |
|
1021 | controller='files', action='rawfile', revision='tip', | |
1030 | f_path='', conditions={'function': check_repo}, |
|
1022 | f_path='', conditions={'function': check_repo}, | |
1031 | requirements=URL_NAME_REQUIREMENTS) |
|
1023 | requirements=URL_NAME_REQUIREMENTS) | |
1032 |
|
1024 | |||
1033 | rmap.connect('files_raw_home', |
|
1025 | rmap.connect('files_raw_home', | |
1034 | '/{repo_name}/raw/{revision}/{f_path}', |
|
1026 | '/{repo_name}/raw/{revision}/{f_path}', | |
1035 | controller='files', action='raw', revision='tip', f_path='', |
|
1027 | controller='files', action='raw', revision='tip', f_path='', | |
1036 | conditions={'function': check_repo}, |
|
1028 | conditions={'function': check_repo}, | |
1037 | requirements=URL_NAME_REQUIREMENTS) |
|
1029 | requirements=URL_NAME_REQUIREMENTS) | |
1038 |
|
1030 | |||
1039 | rmap.connect('files_render_home', |
|
1031 | rmap.connect('files_render_home', | |
1040 | '/{repo_name}/render/{revision}/{f_path}', |
|
1032 | '/{repo_name}/render/{revision}/{f_path}', | |
1041 | controller='files', action='index', revision='tip', f_path='', |
|
1033 | controller='files', action='index', revision='tip', f_path='', | |
1042 | rendered=True, conditions={'function': check_repo}, |
|
1034 | rendered=True, conditions={'function': check_repo}, | |
1043 | requirements=URL_NAME_REQUIREMENTS) |
|
1035 | requirements=URL_NAME_REQUIREMENTS) | |
1044 |
|
1036 | |||
1045 | rmap.connect('files_annotate_home', |
|
1037 | rmap.connect('files_annotate_home', | |
1046 | '/{repo_name}/annotate/{revision}/{f_path}', |
|
1038 | '/{repo_name}/annotate/{revision}/{f_path}', | |
1047 | controller='files', action='index', revision='tip', |
|
1039 | controller='files', action='index', revision='tip', | |
1048 | f_path='', annotate=True, conditions={'function': check_repo}, |
|
1040 | f_path='', annotate=True, conditions={'function': check_repo}, | |
1049 | requirements=URL_NAME_REQUIREMENTS) |
|
1041 | requirements=URL_NAME_REQUIREMENTS) | |
1050 |
|
1042 | |||
1051 | rmap.connect('files_edit', |
|
1043 | rmap.connect('files_edit', | |
1052 | '/{repo_name}/edit/{revision}/{f_path}', |
|
1044 | '/{repo_name}/edit/{revision}/{f_path}', | |
1053 | controller='files', action='edit', revision='tip', |
|
1045 | controller='files', action='edit', revision='tip', | |
1054 | f_path='', |
|
1046 | f_path='', | |
1055 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1047 | conditions={'function': check_repo, 'method': ['POST']}, | |
1056 | requirements=URL_NAME_REQUIREMENTS) |
|
1048 | requirements=URL_NAME_REQUIREMENTS) | |
1057 |
|
1049 | |||
1058 | rmap.connect('files_edit_home', |
|
1050 | rmap.connect('files_edit_home', | |
1059 | '/{repo_name}/edit/{revision}/{f_path}', |
|
1051 | '/{repo_name}/edit/{revision}/{f_path}', | |
1060 | controller='files', action='edit_home', revision='tip', |
|
1052 | controller='files', action='edit_home', revision='tip', | |
1061 | f_path='', conditions={'function': check_repo}, |
|
1053 | f_path='', conditions={'function': check_repo}, | |
1062 | requirements=URL_NAME_REQUIREMENTS) |
|
1054 | requirements=URL_NAME_REQUIREMENTS) | |
1063 |
|
1055 | |||
1064 | rmap.connect('files_add', |
|
1056 | rmap.connect('files_add', | |
1065 | '/{repo_name}/add/{revision}/{f_path}', |
|
1057 | '/{repo_name}/add/{revision}/{f_path}', | |
1066 | controller='files', action='add', revision='tip', |
|
1058 | controller='files', action='add', revision='tip', | |
1067 | f_path='', |
|
1059 | f_path='', | |
1068 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1060 | conditions={'function': check_repo, 'method': ['POST']}, | |
1069 | requirements=URL_NAME_REQUIREMENTS) |
|
1061 | requirements=URL_NAME_REQUIREMENTS) | |
1070 |
|
1062 | |||
1071 | rmap.connect('files_add_home', |
|
1063 | rmap.connect('files_add_home', | |
1072 | '/{repo_name}/add/{revision}/{f_path}', |
|
1064 | '/{repo_name}/add/{revision}/{f_path}', | |
1073 | controller='files', action='add_home', revision='tip', |
|
1065 | controller='files', action='add_home', revision='tip', | |
1074 | f_path='', conditions={'function': check_repo}, |
|
1066 | f_path='', conditions={'function': check_repo}, | |
1075 | requirements=URL_NAME_REQUIREMENTS) |
|
1067 | requirements=URL_NAME_REQUIREMENTS) | |
1076 |
|
1068 | |||
1077 | rmap.connect('files_delete', |
|
1069 | rmap.connect('files_delete', | |
1078 | '/{repo_name}/delete/{revision}/{f_path}', |
|
1070 | '/{repo_name}/delete/{revision}/{f_path}', | |
1079 | controller='files', action='delete', revision='tip', |
|
1071 | controller='files', action='delete', revision='tip', | |
1080 | f_path='', |
|
1072 | f_path='', | |
1081 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1073 | conditions={'function': check_repo, 'method': ['POST']}, | |
1082 | requirements=URL_NAME_REQUIREMENTS) |
|
1074 | requirements=URL_NAME_REQUIREMENTS) | |
1083 |
|
1075 | |||
1084 | rmap.connect('files_delete_home', |
|
1076 | rmap.connect('files_delete_home', | |
1085 | '/{repo_name}/delete/{revision}/{f_path}', |
|
1077 | '/{repo_name}/delete/{revision}/{f_path}', | |
1086 | controller='files', action='delete_home', revision='tip', |
|
1078 | controller='files', action='delete_home', revision='tip', | |
1087 | f_path='', conditions={'function': check_repo}, |
|
1079 | f_path='', conditions={'function': check_repo}, | |
1088 | requirements=URL_NAME_REQUIREMENTS) |
|
1080 | requirements=URL_NAME_REQUIREMENTS) | |
1089 |
|
1081 | |||
1090 | rmap.connect('files_archive_home', '/{repo_name}/archive/{fname}', |
|
1082 | rmap.connect('files_archive_home', '/{repo_name}/archive/{fname}', | |
1091 |                  controller='files', action='archivefile',
1092 |                  conditions={'function': check_repo},
1093 |                  requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1094 | 
1095 |     rmap.connect('files_nodelist_home',
1096 |                  '/{repo_name}/nodelist/{revision}/{f_path}',
1097 |                  controller='files', action='nodelist',
1098 |                  conditions={'function': check_repo},
1099 |                  requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1100 | 
1101 |     rmap.connect('files_metadata_list_home',
1102 |                  '/{repo_name}/metadata_list/{revision}/{f_path}',
1103 |                  controller='files', action='metadata_list',
1104 |                  conditions={'function': check_repo},
1105 |                  requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1106 | 
1107 |     rmap.connect('repo_fork_create_home', '/{repo_name}/fork',
1108 |                  controller='forks', action='fork_create',
1109 |                  conditions={'function': check_repo, 'method': ['POST']},
1110 |                  requirements=URL_NAME_REQUIREMENTS)
1111 | 
1112 |     rmap.connect('repo_fork_home', '/{repo_name}/fork',
1113 |                  controller='forks', action='fork',
1114 |                  conditions={'function': check_repo},
1115 |                  requirements=URL_NAME_REQUIREMENTS)
1116 | 
1117 |     rmap.connect('repo_forks_home', '/{repo_name}/forks',
1118 |                  controller='forks', action='forks',
1119 |                  conditions={'function': check_repo},
1120 |                  requirements=URL_NAME_REQUIREMENTS)
1121 | 
1122 |     rmap.connect('repo_followers_home', '/{repo_name}/followers',
1123 |                  controller='followers', action='followers',
1124 |                  conditions={'function': check_repo},
1125 |                  requirements=URL_NAME_REQUIREMENTS)
1126 | 
1127 |     # must be here for proper group/repo catching pattern
1128 |     _connect_with_slash(
1129 |         rmap, 'repo_group_home', '/{group_name}',
1130 |         controller='home', action='index_repo_group',
1131 |         conditions={'function': check_group},
1132 |         requirements=URL_NAME_REQUIREMENTS)
1133 | 
1134 |     # catch all, at the end
1135 |     _connect_with_slash(
1136 |         rmap, 'summary_home', '/{repo_name}', jsroute=True,
1137 |         controller='summary', action='index',
1138 |         conditions={'function': check_repo},
1139 |         requirements=URL_NAME_REQUIREMENTS)
1140 | 
1141 |     return rmap
1142 | 
1143 | 
1144 | def _connect_with_slash(mapper, name, path, *args, **kwargs):
1145 |     """
1146 |     Connect a route with an optional trailing slash in `path`.
1147 |     """
1148 |     mapper.connect(name + '_slash', path + '/', *args, **kwargs)
1149 |     mapper.connect(name, path, *args, **kwargs)
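The `_connect_with_slash` helper at the end of this file registers every route twice, once with and once without a trailing slash, so both URL forms resolve to the same controller. A minimal standalone sketch of that behaviour (the `FakeMapper` class below is an illustrative stand-in for the Routes mapper, not part of RhodeCode):

```python
# Illustrative stand-in for a Routes mapper: it only records the
# (name, path) pairs that connect() receives.
class FakeMapper(object):
    def __init__(self):
        self.routes = []

    def connect(self, name, path, *args, **kwargs):
        self.routes.append((name, path))


def _connect_with_slash(mapper, name, path, *args, **kwargs):
    """
    Connect a route with an optional trailing slash in `path`.
    """
    mapper.connect(name + '_slash', path + '/', *args, **kwargs)
    mapper.connect(name, path, *args, **kwargs)


rmap = FakeMapper()
_connect_with_slash(rmap, 'summary_home', '/{repo_name}',
                    controller='summary', action='index')

# Both variants are registered, the slashed one first so it is
# matched before the plain catch-all form.
print(rmap.routes)
# [('summary_home_slash', '/{repo_name}/'), ('summary_home', '/{repo_name}')]
```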
@@ -1,99 +1,103 b''
1 | # -*- coding: utf-8 -*-
2 | 
3 | # Copyright (C) 2010-2016 RhodeCode GmbH
4 | #
5 | # This program is free software: you can redistribute it and/or modify
6 | # it under the terms of the GNU Affero General Public License, version 3
7 | # (only), as published by the Free Software Foundation.
8 | #
9 | # This program is distributed in the hope that it will be useful,
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 | # GNU General Public License for more details.
13 | #
14 | # You should have received a copy of the GNU Affero General Public License
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 | #
17 | # This program is dual-licensed. If you wish to learn more about the
18 | # RhodeCode Enterprise Edition, including its added features, Support services,
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/
20 | 
21 | import os
22 | import shlex
23 | import Pyro4
24 | import platform
25 | 
26 | from rhodecode.model import init_model
27 | 
28 | 
29 | def configure_pyro4(config):
30 |     """
31 |     Configure Pyro4 based on `config`.
32 | 
33 |     This will mainly set the different configuration parameters of the Pyro4
34 |     library based on the settings in our INI files. The Pyro4 documentation
35 |     lists more details about the specific settings and their meaning.
36 |     """
37 |     Pyro4.config.COMMTIMEOUT = float(config['vcs.connection_timeout'])
38 |     Pyro4.config.SERIALIZER = 'pickle'
39 |     Pyro4.config.SERIALIZERS_ACCEPTED.add('pickle')
40 | 
41 |     # Note: We need server configuration in the WSGI processes
42 |     # because we provide a callback server in certain vcs operations.
43 |     Pyro4.config.SERVERTYPE = "multiplex"
44 |     Pyro4.config.POLLTIMEOUT = 0.01
45 | 
46 | 
47 | def configure_vcs(config):
48 |     """
49 |     Patch VCS config with some RhodeCode specific stuff
50 |     """
51 |     from rhodecode.lib.vcs import conf
52 |     from rhodecode.lib.utils2 import aslist
53 |     conf.settings.BACKENDS = {
54 |         'hg': 'rhodecode.lib.vcs.backends.hg.MercurialRepository',
55 |         'git': 'rhodecode.lib.vcs.backends.git.GitRepository',
56 |         'svn': 'rhodecode.lib.vcs.backends.svn.SubversionRepository',
57 |     }
58 | 
59 |     conf.settings.HG_USE_REBASE_FOR_MERGING = config.get(
60 |         'rhodecode_hg_use_rebase_for_merging', False)
61 |     conf.settings.GIT_REV_FILTER = shlex.split(
62 |         config.get('git_rev_filter', '--all').strip())
63 |     conf.settings.DEFAULT_ENCODINGS = aslist(config.get('default_encoding',
64 |                                                         'UTF-8'), sep=',')
65 |     conf.settings.ALIASES[:] = config.get('vcs.backends')
66 |     conf.settings.SVN_COMPATIBLE_VERSION = config.get(
67 |         'vcs.svn.compatible_version')
68 | 
69 | 
70 | def initialize_database(config):
71 |     from rhodecode.lib.utils2 import engine_from_config, get_encryption_key
72 |     engine = engine_from_config(config, 'sqlalchemy.db1.')
73 |     init_model(engine, encryption_key=get_encryption_key(config))
74 | 
75 | 
76 | def initialize_test_environment(settings, test_env=None):
77 |     if test_env is None:
78 |         test_env = not int(os.environ.get('RC_NO_TMP_PATH', 0))
79 | 
80 |     from rhodecode.lib.utils import (
81 |         create_test_directory, create_test_database, create_test_repositories,
82 |         create_test_index)
83 |     from rhodecode.tests import TESTS_TMP_PATH
84 |     # test repos
85 |     if test_env:
86 |         create_test_directory(TESTS_TMP_PATH)
87 |         create_test_database(TESTS_TMP_PATH, settings)
88 |         create_test_repositories(TESTS_TMP_PATH, settings)
89 |         create_test_index(TESTS_TMP_PATH, settings)
90 | 
91 | 
92 | def get_vcs_server_protocol(config):
93 |     protocol = config.get('vcs.server.protocol', 'pyro4')
94 |     return protocol
95 | 
96 | 
97 | def set_instance_id(config):
98 |     """ Sets a dynamic generated config['instance_id'] if missing or '*' """
99 | 
100 |     config['instance_id'] = config.get('instance_id') or ''
101 |     if config['instance_id'] == '*' or not config['instance_id']:
102 |         _platform_id = platform.uname()[1] or 'instance'
103 |         config['instance_id'] = '%s-%s' % (_platform_id, os.getpid())
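The `set_instance_id` helper at the end of this file keeps an explicitly configured id and only generates a hostname-plus-pid identifier when the setting is missing, empty, or `'*'`. A self-contained sketch of the same logic, runnable outside RhodeCode:

```python
import os
import platform


def set_instance_id(config):
    """Set a dynamically generated config['instance_id'] if missing or '*'."""
    config['instance_id'] = config.get('instance_id') or ''
    if config['instance_id'] == '*' or not config['instance_id']:
        # Fall back to <hostname>-<pid>; 'instance' if the hostname is empty.
        _platform_id = platform.uname()[1] or 'instance'
        config['instance_id'] = '%s-%s' % (_platform_id, os.getpid())


# An explicit id is kept as-is.
cfg = {'instance_id': 'node-1'}
set_instance_id(cfg)
print(cfg['instance_id'])  # node-1

# '*' (or a missing/empty value) is replaced by hostname-pid.
cfg = {'instance_id': '*'}
set_instance_id(cfg)
print(cfg['instance_id'])  # e.g. 'myhost-12345'
```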
@@ -1,407 +1,406 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 |
|
2 | |||
3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
4 | # |
|
4 | # | |
5 | # This program is free software: you can redistribute it and/or modify |
|
5 | # This program is free software: you can redistribute it and/or modify | |
6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
7 | # (only), as published by the Free Software Foundation. |
|
7 | # (only), as published by the Free Software Foundation. | |
8 | # |
|
8 | # | |
9 | # This program is distributed in the hope that it will be useful, |
|
9 | # This program is distributed in the hope that it will be useful, | |
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
12 | # GNU General Public License for more details. |
|
12 | # GNU General Public License for more details. | |
13 | # |
|
13 | # | |
14 | # You should have received a copy of the GNU Affero General Public License |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
16 | # |
|
16 | # | |
17 | # This program is dual-licensed. If you wish to learn more about the |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
20 |
|
20 | |||
21 |
|
21 | |||
22 | """ |
|
22 | """ | |
23 | Repository groups controller for RhodeCode |
|
23 | Repository groups controller for RhodeCode | |
24 | """ |
|
24 | """ | |
25 |
|
25 | |||
26 | import logging |
|
26 | import logging | |
27 | import formencode |
|
27 | import formencode | |
28 |
|
28 | |||
29 | from formencode import htmlfill |
|
29 | from formencode import htmlfill | |
30 |
|
30 | |||
31 | from pylons import request, tmpl_context as c, url |
|
31 | from pylons import request, tmpl_context as c, url | |
32 | from pylons.controllers.util import abort, redirect |
|
32 | from pylons.controllers.util import abort, redirect | |
33 | from pylons.i18n.translation import _, ungettext |
|
33 | from pylons.i18n.translation import _, ungettext | |
34 |
|
34 | |||
35 | from rhodecode.lib import auth |
|
35 | from rhodecode.lib import auth | |
36 | from rhodecode.lib import helpers as h |
|
36 | from rhodecode.lib import helpers as h | |
37 | from rhodecode.lib.ext_json import json |
|
37 | from rhodecode.lib.ext_json import json | |
38 | from rhodecode.lib.auth import ( |
|
38 | from rhodecode.lib.auth import ( | |
39 | LoginRequired, NotAnonymous, HasPermissionAll, |
|
39 | LoginRequired, NotAnonymous, HasPermissionAll, | |
40 | HasRepoGroupPermissionAll, HasRepoGroupPermissionAnyDecorator) |
|
40 | HasRepoGroupPermissionAll, HasRepoGroupPermissionAnyDecorator) | |
41 | from rhodecode.lib.base import BaseController, render |
|
41 | from rhodecode.lib.base import BaseController, render | |
42 | from rhodecode.model.db import RepoGroup, User |
|
42 | from rhodecode.model.db import RepoGroup, User | |
43 | from rhodecode.model.scm import RepoGroupList |
|
43 | from rhodecode.model.scm import RepoGroupList | |
44 | from rhodecode.model.repo_group import RepoGroupModel |
|
44 | from rhodecode.model.repo_group import RepoGroupModel | |
45 | from rhodecode.model.forms import RepoGroupForm, RepoGroupPermsForm |
|
45 | from rhodecode.model.forms import RepoGroupForm, RepoGroupPermsForm | |
46 | from rhodecode.model.meta import Session |
|
46 | from rhodecode.model.meta import Session | |
47 | from rhodecode.lib.utils2 import safe_int |
|
47 | from rhodecode.lib.utils2 import safe_int | |
48 |
|
48 | |||
49 |
|
49 | |||
50 | log = logging.getLogger(__name__) |
|
50 | log = logging.getLogger(__name__) | |
51 |
|
51 | |||
52 |
|
52 | |||
53 | class RepoGroupsController(BaseController): |
|
53 | class RepoGroupsController(BaseController): | |
54 | """REST Controller styled on the Atom Publishing Protocol""" |
|
54 | """REST Controller styled on the Atom Publishing Protocol""" | |
55 |
|
55 | |||
56 | @LoginRequired() |
|
56 | @LoginRequired() | |
57 | def __before__(self): |
|
57 | def __before__(self): | |
58 | super(RepoGroupsController, self).__before__() |
|
58 | super(RepoGroupsController, self).__before__() | |
59 |
|
59 | |||
60 | def __load_defaults(self, allow_empty_group=False, repo_group=None): |
|
60 | def __load_defaults(self, allow_empty_group=False, repo_group=None): | |
61 | if self._can_create_repo_group(): |
|
61 | if self._can_create_repo_group(): | |
62 | # we're global admin, we're ok and we can create TOP level groups |
|
62 | # we're global admin, we're ok and we can create TOP level groups | |
63 | allow_empty_group = True |
|
63 | allow_empty_group = True | |
64 |
|
64 | |||
65 | # override the choices for this form, we need to filter choices |
|
65 | # override the choices for this form, we need to filter choices | |
66 | # and display only those we have ADMIN right |
|
66 | # and display only those we have ADMIN right | |
67 | groups_with_admin_rights = RepoGroupList( |
|
67 | groups_with_admin_rights = RepoGroupList( | |
68 | RepoGroup.query().all(), |
|
68 | RepoGroup.query().all(), | |
69 | perm_set=['group.admin']) |
|
69 | perm_set=['group.admin']) | |
70 | c.repo_groups = RepoGroup.groups_choices( |
|
70 | c.repo_groups = RepoGroup.groups_choices( | |
71 | groups=groups_with_admin_rights, |
|
71 | groups=groups_with_admin_rights, | |
72 | show_empty_group=allow_empty_group) |
|
72 | show_empty_group=allow_empty_group) | |
73 |
|
73 | |||
74 | if repo_group: |
|
74 | if repo_group: | |
75 | # exclude filtered ids |
|
75 | # exclude filtered ids | |
76 | exclude_group_ids = [repo_group.group_id] |
|
76 | exclude_group_ids = [repo_group.group_id] | |
77 | c.repo_groups = filter(lambda x: x[0] not in exclude_group_ids, |
|
77 | c.repo_groups = filter(lambda x: x[0] not in exclude_group_ids, | |
78 | c.repo_groups) |
|
78 | c.repo_groups) | |
79 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
79 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) | |
80 | parent_group = repo_group.parent_group |
|
80 | parent_group = repo_group.parent_group | |
81 |
|
81 | |||
82 | add_parent_group = (parent_group and ( |
|
82 | add_parent_group = (parent_group and ( | |
83 | unicode(parent_group.group_id) not in c.repo_groups_choices)) |
|
83 | unicode(parent_group.group_id) not in c.repo_groups_choices)) | |
84 | if add_parent_group: |
|
84 | if add_parent_group: | |
85 | c.repo_groups_choices.append(unicode(parent_group.group_id)) |
|
85 | c.repo_groups_choices.append(unicode(parent_group.group_id)) | |
86 | c.repo_groups.append(RepoGroup._generate_choice(parent_group)) |
|
86 | c.repo_groups.append(RepoGroup._generate_choice(parent_group)) | |
87 |
|
87 | |||
88 | def __load_data(self, group_id): |
|
88 | def __load_data(self, group_id): | |
89 | """ |
|
89 | """ | |
90 | Load defaults settings for edit, and update |
|
90 | Load defaults settings for edit, and update | |
91 |
|
91 | |||
92 | :param group_id: |
|
92 | :param group_id: | |
93 | """ |
|
93 | """ | |
94 | repo_group = RepoGroup.get_or_404(group_id) |
|
94 | repo_group = RepoGroup.get_or_404(group_id) | |
95 | data = repo_group.get_dict() |
|
95 | data = repo_group.get_dict() | |
96 | data['group_name'] = repo_group.name |
|
96 | data['group_name'] = repo_group.name | |
97 |
|
97 | |||
98 | # fill owner |
|
98 | # fill owner | |
99 | if repo_group.user: |
|
99 | if repo_group.user: | |
100 | data.update({'user': repo_group.user.username}) |
|
100 | data.update({'user': repo_group.user.username}) | |
101 | else: |
|
101 | else: | |
102 | replacement_user = User.get_first_admin().username |
|
102 | replacement_user = User.get_first_super_admin().username | |
103 | data.update({'user': replacement_user}) |
|
103 | data.update({'user': replacement_user}) | |
104 |
|
104 | |||
105 | # fill repository group users |
|
105 | # fill repository group users | |
106 | for p in repo_group.repo_group_to_perm: |
|
106 | for p in repo_group.repo_group_to_perm: | |
107 | data.update({ |
|
107 | data.update({ | |
108 | 'u_perm_%s' % p.user.user_id: p.permission.permission_name}) |
|
108 | 'u_perm_%s' % p.user.user_id: p.permission.permission_name}) | |
109 |
|
109 | |||
110 | # fill repository group user groups |
|
110 | # fill repository group user groups | |
111 | for p in repo_group.users_group_to_perm: |
|
111 | for p in repo_group.users_group_to_perm: | |
112 | data.update({ |
|
112 | data.update({ | |
113 | 'g_perm_%s' % p.users_group.users_group_id: |
|
113 | 'g_perm_%s' % p.users_group.users_group_id: | |
114 | p.permission.permission_name}) |
|
114 | p.permission.permission_name}) | |
115 | # html and form expects -1 as empty parent group |
|
115 | # html and form expects -1 as empty parent group | |
116 | data['group_parent_id'] = data['group_parent_id'] or -1 |
|
116 | data['group_parent_id'] = data['group_parent_id'] or -1 | |
117 | return data |
|
117 | return data | |
118 |
|
118 | |||
119 | def _revoke_perms_on_yourself(self, form_result): |
|
119 | def _revoke_perms_on_yourself(self, form_result): | |
120 | _updates = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
120 | _updates = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), | |
121 | form_result['perm_updates']) |
|
121 | form_result['perm_updates']) | |
122 | _additions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
122 | _additions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), | |
123 | form_result['perm_additions']) |
|
123 | form_result['perm_additions']) | |
124 | _deletions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
124 | _deletions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), | |
125 | form_result['perm_deletions']) |
|
125 | form_result['perm_deletions']) | |
126 | admin_perm = 'group.admin' |
|
126 | admin_perm = 'group.admin' | |
127 | if _updates and _updates[0][1] != admin_perm or \ |
|
127 | if _updates and _updates[0][1] != admin_perm or \ | |
128 | _additions and _additions[0][1] != admin_perm or \ |
|
128 | _additions and _additions[0][1] != admin_perm or \ | |
129 | _deletions and _deletions[0][1] != admin_perm: |
|
129 | _deletions and _deletions[0][1] != admin_perm: | |
130 | return True |
|
130 | return True | |
131 | return False |
|
131 | return False | |
132 |
|
132 | |||
133 | def _can_create_repo_group(self, parent_group_id=None): |
|
133 | def _can_create_repo_group(self, parent_group_id=None): | |
134 | is_admin = HasPermissionAll('hg.admin')('group create controller') |
|
134 | is_admin = HasPermissionAll('hg.admin')('group create controller') | |
135 | create_repo_group = HasPermissionAll( |
|
135 | create_repo_group = HasPermissionAll( | |
136 | 'hg.repogroup.create.true')('group create controller') |
|
136 | 'hg.repogroup.create.true')('group create controller') | |
137 | if is_admin or (create_repo_group and not parent_group_id): |
|
137 | if is_admin or (create_repo_group and not parent_group_id): | |
138 | # we're global admin, or we have global repo group create |
|
138 | # we're global admin, or we have global repo group create | |
139 | # permission |
|
139 | # permission | |
140 | # we're ok and we can create TOP level groups |
|
140 | # we're ok and we can create TOP level groups | |
141 | return True |
|
141 | return True | |
142 | elif parent_group_id: |
|
142 | elif parent_group_id: | |
143 | # we check the permission if we can write to parent group |
|
143 | # we check the permission if we can write to parent group | |
144 | group = RepoGroup.get(parent_group_id) |
|
144 | group = RepoGroup.get(parent_group_id) | |
145 | group_name = group.group_name if group else None |
|
145 | group_name = group.group_name if group else None | |
146 | if HasRepoGroupPermissionAll('group.admin')( |
|
146 | if HasRepoGroupPermissionAll('group.admin')( | |
147 | group_name, 'check if user is an admin of group'): |
|
147 | group_name, 'check if user is an admin of group'): | |
148 | # we're an admin of passed in group, we're ok. |
|
148 | # we're an admin of passed in group, we're ok. | |
149 | return True |
|
149 | return True | |
150 | else: |
|
150 | else: | |
151 | return False |
|
151 | return False | |
152 | return False |
|
152 | return False | |
153 |
|
153 | |||
154 | @NotAnonymous() |
|
154 | @NotAnonymous() | |
155 | def index(self): |
|
155 | def index(self): | |
156 | """GET /repo_groups: All items in the collection""" |
|
156 | """GET /repo_groups: All items in the collection""" | |
157 | # url('repo_groups') |
|
157 | # url('repo_groups') | |
158 |
|
158 | |||
159 | repo_group_list = RepoGroup.get_all_repo_groups() |
|
159 | repo_group_list = RepoGroup.get_all_repo_groups() | |
160 | _perms = ['group.admin'] |
|
160 | _perms = ['group.admin'] | |
161 | repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms) |
|
161 | repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms) | |
162 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( |
|
162 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( | |
163 | repo_group_list=repo_group_list_acl, admin=True) |
|
163 | repo_group_list=repo_group_list_acl, admin=True) | |
164 | c.data = json.dumps(repo_group_data) |
|
164 | c.data = json.dumps(repo_group_data) | |
165 | return render('admin/repo_groups/repo_groups.html') |
|
165 | return render('admin/repo_groups/repo_groups.html') | |
166 |
|
166 | |||
167 | # perm checks inside |
|
167 | # perm checks inside | |
168 | @NotAnonymous() |
|
168 | @NotAnonymous() | |
169 | @auth.CSRFRequired() |
|
169 | @auth.CSRFRequired() | |
170 | def create(self): |
|
170 | def create(self): | |
171 | """POST /repo_groups: Create a new item""" |
|
171 | """POST /repo_groups: Create a new item""" | |
172 | # url('repo_groups') |
|
172 | # url('repo_groups') | |
173 |
|
173 | |||
174 | parent_group_id = safe_int(request.POST.get('group_parent_id')) |
|
174 | parent_group_id = safe_int(request.POST.get('group_parent_id')) | |
175 | can_create = self._can_create_repo_group(parent_group_id) |
|
175 | can_create = self._can_create_repo_group(parent_group_id) | |
176 |
|
176 | |||
177 | self.__load_defaults() |
|
177 | self.__load_defaults() | |
178 | # permissions for can create group based on parent_id are checked |
|
178 | # permissions for can create group based on parent_id are checked | |
179 | # here in the Form |
|
179 | # here in the Form | |
180 | available_groups = map(lambda k: unicode(k[0]), c.repo_groups) |
|
180 | available_groups = map(lambda k: unicode(k[0]), c.repo_groups) | |
181 | repo_group_form = RepoGroupForm(available_groups=available_groups, |
|
181 | repo_group_form = RepoGroupForm(available_groups=available_groups, | |
182 | can_create_in_root=can_create)() |
|
182 | can_create_in_root=can_create)() | |
183 | try: |
|
183 | try: | |
184 | owner = c.rhodecode_user |
|
184 | owner = c.rhodecode_user | |
185 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
185 | form_result = repo_group_form.to_python(dict(request.POST)) | |
186 | RepoGroupModel().create( |
|
186 | RepoGroupModel().create( | |
187 | group_name=form_result['group_name_full'], |
|
187 | group_name=form_result['group_name_full'], | |
188 | group_description=form_result['group_description'], |
|
188 | group_description=form_result['group_description'], | |
189 | owner=owner.user_id, |
|
189 | owner=owner.user_id, | |
190 | copy_permissions=form_result['group_copy_permissions'] |
|
190 | copy_permissions=form_result['group_copy_permissions'] | |
191 | ) |
|
191 | ) | |
192 | Session().commit() |
|
192 | Session().commit() | |
193 | _new_group_name = form_result['group_name_full'] |
|
193 | _new_group_name = form_result['group_name_full'] | |
194 | repo_group_url = h.link_to( |
|
194 | repo_group_url = h.link_to( | |
195 | _new_group_name, |
|
195 | _new_group_name, | |
196 | h.url('repo_group_home', group_name=_new_group_name)) |
|
196 | h.url('repo_group_home', group_name=_new_group_name)) | |
197 | h.flash(h.literal(_('Created repository group %s') |
|
197 | h.flash(h.literal(_('Created repository group %s') | |
198 | % repo_group_url), category='success') |
|
198 | % repo_group_url), category='success') | |
199 | # TODO: in futureaction_logger(, '', '', '', self.sa) |
|
199 | # TODO: in futureaction_logger(, '', '', '', self.sa) | |
200 | except formencode.Invalid as errors: |
|
200 | except formencode.Invalid as errors: | |
201 | return htmlfill.render( |
|
201 | return htmlfill.render( | |
202 | render('admin/repo_groups/repo_group_add.html'), |
|
202 | render('admin/repo_groups/repo_group_add.html'), | |
203 | defaults=errors.value, |
|
203 | defaults=errors.value, | |
204 | errors=errors.error_dict or {}, |
|
204 | errors=errors.error_dict or {}, | |
205 | prefix_error=False, |
|
205 | prefix_error=False, | |
206 | encoding="UTF-8", |
|
206 | encoding="UTF-8", | |
207 | force_defaults=False) |
|
207 | force_defaults=False) | |
208 | except Exception: |
|
208 | except Exception: | |
209 | log.exception("Exception during creation of repository group") |
|
209 | log.exception("Exception during creation of repository group") | |
210 | h.flash(_('Error occurred during creation of repository group %s') |
|
210 | h.flash(_('Error occurred during creation of repository group %s') | |
211 | % request.POST.get('group_name'), category='error') |
|
211 | % request.POST.get('group_name'), category='error') | |
212 |
|
212 | |||
213 | # TODO: maybe we should get back to the main view, not the admin one |
|
213 | # TODO: maybe we should get back to the main view, not the admin one | |
214 | return redirect(url('repo_groups', parent_group=parent_group_id)) |
|
214 | return redirect(url('repo_groups', parent_group=parent_group_id)) | |
215 |
|
215 | |||
216 | # perm checks inside |
|
216 | # perm checks inside | |
217 | @NotAnonymous() |
|
217 | @NotAnonymous() | |
218 | def new(self): |
|
218 | def new(self): | |
219 | """GET /repo_groups/new: Form to create a new item""" |
|
219 | """GET /repo_groups/new: Form to create a new item""" | |
220 | # url('new_repo_group') |
|
220 | # url('new_repo_group') | |
221 | # perm check for admin, create_group perm or admin of parent_group |
|
221 | # perm check for admin, create_group perm or admin of parent_group | |
222 | parent_group_id = safe_int(request.GET.get('parent_group')) |
|
222 | parent_group_id = safe_int(request.GET.get('parent_group')) | |
223 | if not self._can_create_repo_group(parent_group_id): |
|
223 | if not self._can_create_repo_group(parent_group_id): | |
224 | return abort(403) |
|
224 | return abort(403) | |
225 |
|
225 | |||
226 | self.__load_defaults() |
|
226 | self.__load_defaults() | |
227 | return render('admin/repo_groups/repo_group_add.html') |
|
227 | return render('admin/repo_groups/repo_group_add.html') | |
228 |
|
228 | |||
    @HasRepoGroupPermissionAnyDecorator('group.admin')
    @auth.CSRFRequired()
    def update(self, group_name):
        """PUT /repo_groups/group_name: Update an existing item"""
        # Forms posted to this method should contain a hidden field:
        # <input type="hidden" name="_method" value="PUT" />
        # Or using helpers:
        # h.form(url('repos_group', group_name=GROUP_NAME), method='put')
        # url('repo_group_home', group_name=GROUP_NAME)

        c.repo_group = RepoGroupModel()._get_repo_group(group_name)
        can_create_in_root = self._can_create_repo_group()
        show_root_location = can_create_in_root
        if not c.repo_group.parent_group:
            # this group doesn't have a parent, so we should show an empty value
            show_root_location = True
        self.__load_defaults(allow_empty_group=show_root_location,
                             repo_group=c.repo_group)

        repo_group_form = RepoGroupForm(
            edit=True, old_data=c.repo_group.get_dict(),
            available_groups=c.repo_groups_choices,
            can_create_in_root=can_create_in_root, allow_disabled=True)()

        try:
            form_result = repo_group_form.to_python(dict(request.POST))
            gr_name = form_result['group_name']
            new_gr = RepoGroupModel().update(group_name, form_result)
            Session().commit()
            h.flash(_('Updated repository group %s') % (gr_name,),
                    category='success')
            # we now have the new name!
            group_name = new_gr.group_name
            # TODO: in future action_logger(, '', '', '', self.sa)
        except formencode.Invalid as errors:
            c.active = 'settings'
            return htmlfill.render(
                render('admin/repo_groups/repo_group_edit.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)
        except Exception:
            log.exception("Exception during update of repository group")
            h.flash(_('Error occurred during update of repository group %s')
                    % request.POST.get('group_name'), category='error')

        return redirect(url('edit_repo_group', group_name=group_name))

    @HasRepoGroupPermissionAnyDecorator('group.admin')
    @auth.CSRFRequired()
    def delete(self, group_name):
        """DELETE /repo_groups/group_name: Delete an existing item"""
        # Forms posted to this method should contain a hidden field:
        # <input type="hidden" name="_method" value="DELETE" />
        # Or using helpers:
        # h.form(url('repos_group', group_name=GROUP_NAME), method='delete')
        # url('repo_group_home', group_name=GROUP_NAME)

        gr = c.repo_group = RepoGroupModel()._get_repo_group(group_name)
        repos = gr.repositories.all()
        if repos:
            msg = ungettext(
                'This group contains %(num)d repository and cannot be deleted',
                'This group contains %(num)d repositories and cannot be'
                ' deleted',
                len(repos)) % {'num': len(repos)}
            h.flash(msg, category='warning')
            return redirect(url('repo_groups'))

        children = gr.children.all()
        if children:
            msg = ungettext(
                'This group contains %(num)d subgroup and cannot be deleted',
                'This group contains %(num)d subgroups and cannot be deleted',
                len(children)) % {'num': len(children)}
            h.flash(msg, category='warning')
            return redirect(url('repo_groups'))

        try:
            RepoGroupModel().delete(group_name)
            Session().commit()
            h.flash(_('Removed repository group %s') % group_name,
                    category='success')
            # TODO: in future action_logger(, '', '', '', self.sa)
        except Exception:
            log.exception("Exception during deletion of repository group")
            h.flash(_('Error occurred during deletion of repository group %s')
                    % group_name, category='error')

        return redirect(url('repo_groups'))

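The `delete()` view above uses `ungettext()` to pick a singular or plural message based on the repository count. The same selection can be sketched with the standard library's `gettext.ngettext` (shown here without a translation catalog, so it simply falls back to the English forms):

```python
import gettext

# With no translation catalog installed, gettext.ngettext returns the
# first string when n == 1 and the second otherwise -- the same
# selection ungettext() performs in delete() above.
def contains_msg(num):
    return gettext.ngettext(
        'This group contains %(num)d repository and cannot be deleted',
        'This group contains %(num)d repositories and cannot be deleted',
        num) % {'num': num}

print(contains_msg(1))  # This group contains 1 repository and cannot be deleted
print(contains_msg(3))  # This group contains 3 repositories and cannot be deleted
```

Passing the count both to `ngettext` (to select the form) and to the `%` formatting (to fill in `%(num)d`) is the usual pattern, since languages differ in how they pluralize.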
    @HasRepoGroupPermissionAnyDecorator('group.admin')
    def edit(self, group_name):
        """GET /repo_groups/group_name/edit: Form to edit an existing item"""
        # url('edit_repo_group', group_name=GROUP_NAME)
        c.active = 'settings'

        c.repo_group = RepoGroupModel()._get_repo_group(group_name)
        # we can only allow moving an empty group if it's already a top-level
        # group, i.e. has no parents, or we're admin
        can_create_in_root = self._can_create_repo_group()
        show_root_location = can_create_in_root
        if not c.repo_group.parent_group:
            # this group doesn't have a parent, so we should show an empty value
            show_root_location = True
        self.__load_defaults(allow_empty_group=show_root_location,
                             repo_group=c.repo_group)
        defaults = self.__load_data(c.repo_group.group_id)

        return htmlfill.render(
            render('admin/repo_groups/repo_group_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )

    @HasRepoGroupPermissionAnyDecorator('group.admin')
    def edit_repo_group_advanced(self, group_name):
        """GET /repo_groups/group_name/edit: Form to edit an existing item"""
        # url('edit_repo_group', group_name=GROUP_NAME)
        c.active = 'advanced'
        c.repo_group = RepoGroupModel()._get_repo_group(group_name)

        return render('admin/repo_groups/repo_group_edit.html')

    @HasRepoGroupPermissionAnyDecorator('group.admin')
    def edit_repo_group_perms(self, group_name):
        """GET /repo_groups/group_name/edit: Form to edit an existing item"""
        # url('edit_repo_group', group_name=GROUP_NAME)
        c.active = 'perms'
        c.repo_group = RepoGroupModel()._get_repo_group(group_name)
        self.__load_defaults()
        defaults = self.__load_data(c.repo_group.group_id)

        return htmlfill.render(
            render('admin/repo_groups/repo_group_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )

    @HasRepoGroupPermissionAnyDecorator('group.admin')
    @auth.CSRFRequired()
    def update_perms(self, group_name):
        """
        Update permissions for the given repository group

        :param group_name:
        """

        c.repo_group = RepoGroupModel()._get_repo_group(group_name)
        valid_recursive_choices = ['none', 'repos', 'groups', 'all']
        form = RepoGroupPermsForm(valid_recursive_choices)().to_python(
            request.POST)

        if not c.rhodecode_user.is_admin:
            if self._revoke_perms_on_yourself(form):
                msg = _('Cannot change permission for yourself as admin')
                h.flash(msg, category='warning')
                return redirect(
                    url('edit_repo_group_perms', group_name=group_name))

        # iterate over all members (if in recursive mode) of this group and
        # set the permissions!
        # this can be a potentially heavy operation
        RepoGroupModel().update_permissions(
            c.repo_group,
            form['perm_additions'], form['perm_updates'],
            form['perm_deletions'], form['recursive'])

        # TODO: implement this
        # action_logger(c.rhodecode_user, 'admin_changed_repo_permissions',
        #               repo_name, self.ip_addr, self.sa)
        Session().commit()
        h.flash(_('Repository Group permissions updated'), category='success')
        return redirect(url('edit_repo_group_perms', group_name=group_name))
@@ -1,878 +1,878 b'' | |||||
# -*- coding: utf-8 -*-

# Copyright (C) 2013-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


"""
Repositories controller for RhodeCode
"""

import logging
import traceback

import formencode
from formencode import htmlfill
from pylons import request, tmpl_context as c, url
from pylons.controllers.util import redirect
from pylons.i18n.translation import _
from webob.exc import HTTPForbidden, HTTPNotFound, HTTPBadRequest

from rhodecode.lib import auth, helpers as h
from rhodecode.lib.auth import (
    LoginRequired, HasPermissionAllDecorator,
    HasRepoPermissionAllDecorator, NotAnonymous, HasPermissionAny,
    HasRepoGroupPermissionAny, HasRepoPermissionAnyDecorator)
from rhodecode.lib.base import BaseRepoController, render
from rhodecode.lib.ext_json import json
from rhodecode.lib.exceptions import AttachedForksError
from rhodecode.lib.utils import action_logger, repo_name_slug, jsonify
from rhodecode.lib.utils2 import safe_int
from rhodecode.lib.vcs import RepositoryError
from rhodecode.model.db import (
    User, Repository, UserFollowing, RepoGroup, RepositoryField)
from rhodecode.model.forms import (
    RepoForm, RepoFieldForm, RepoPermsForm, RepoVcsSettingsForm,
    IssueTrackerPatternsForm)
from rhodecode.model.meta import Session
from rhodecode.model.repo import RepoModel
from rhodecode.model.scm import ScmModel, RepoGroupList, RepoList
from rhodecode.model.settings import (
    SettingsModel, IssueTrackerSettingsModel, VcsSettingsModel,
    SettingNotFound)

log = logging.getLogger(__name__)

class ReposController(BaseRepoController):
    """
    REST Controller styled on the Atom Publishing Protocol"""
    # To properly map this controller, ensure your config/routing.py
    # file has a resource setup:
    #     map.resource('repo', 'repos')

    @LoginRequired()
    def __before__(self):
        super(ReposController, self).__before__()

    def _load_repo(self, repo_name):
        repo_obj = Repository.get_by_repo_name(repo_name)

        if repo_obj is None:
            h.not_mapped_error(repo_name)
            return redirect(url('repos'))

        return repo_obj

    def __load_defaults(self, repo=None):
        acl_groups = RepoGroupList(RepoGroup.query().all(),
                                   perm_set=['group.write', 'group.admin'])
        c.repo_groups = RepoGroup.groups_choices(groups=acl_groups)
        c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups)

        # in case someone no longer has group.write access to a repository,
        # pre-fill the list with this entry; we don't care if this is the
        # same, but it will allow saving repo data properly.
        repo_group = None
        if repo:
            repo_group = repo.group
        if repo_group and unicode(repo_group.group_id) not in c.repo_groups_choices:
            c.repo_groups_choices.append(unicode(repo_group.group_id))
            c.repo_groups.append(RepoGroup._generate_choice(repo_group))

        choices, c.landing_revs = ScmModel().get_repo_landing_revs()
        c.landing_revs_choices = choices

    def __load_data(self, repo_name=None):
        """
        Load default settings for edit and update

        :param repo_name:
        """
        c.repo_info = self._load_repo(repo_name)
        self.__load_defaults(c.repo_info)

        # override defaults for exact repo info here, git/hg etc
        if not c.repository_requirements_missing:
            choices, c.landing_revs = ScmModel().get_repo_landing_revs(
                c.repo_info)
            c.landing_revs_choices = choices
        defaults = RepoModel()._get_defaults(repo_name)

        return defaults

    def _log_creation_exception(self, e, repo_name):
        reason = None
        if len(e.args) == 2:
            reason = e.args[1]

        if reason == 'INVALID_CERTIFICATE':
            log.exception(
                'Exception creating a repository: invalid certificate')
            msg = (_('Error creating repository %s: invalid certificate')
                   % repo_name)
        else:
            log.exception("Exception creating a repository")
            msg = (_('Error creating repository %s')
                   % repo_name)

        return msg

    @NotAnonymous()
    def index(self, format='html'):
        """GET /repos: All items in the collection"""
        # url('repos')

        repo_list = Repository.get_all_repos()
        c.repo_list = RepoList(repo_list, perm_set=['repository.admin'])
        repos_data = RepoModel().get_repos_as_dict(
            repo_list=c.repo_list, admin=True, super_user_actions=True)
        # json used to render the grid
        c.data = json.dumps(repos_data)

        return render('admin/repos/repos.html')

    # perms check inside
    @NotAnonymous()
    @auth.CSRFRequired()
    def create(self):
        """
        POST /repos: Create a new item"""
        # url('repos')

        self.__load_defaults()
        form_result = {}
        task_id = None
        try:
            # the CanWriteToGroup validator checks permissions of this POST
            form_result = RepoForm(repo_groups=c.repo_groups_choices,
                                   landing_revs=c.landing_revs_choices)()\
                .to_python(dict(request.POST))

            # create is sometimes done async on celery; db transaction
            # management is handled there.
            task = RepoModel().create(form_result, c.rhodecode_user.user_id)
            from celery.result import BaseAsyncResult
            if isinstance(task, BaseAsyncResult):
                task_id = task.task_id
        except formencode.Invalid as errors:
            c.personal_repo_group = RepoGroup.get_by_group_name(
                c.rhodecode_user.username)
            return htmlfill.render(
                render('admin/repos/repo_add.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)

        except Exception as e:
            msg = self._log_creation_exception(e, form_result.get('repo_name'))
            h.flash(msg, category='error')
            return redirect(url('home'))

        return redirect(h.url('repo_creating_home',
                              repo_name=form_result['repo_name_full'],
                              task_id=task_id))

    # perms check inside
    @NotAnonymous()
    def create_repository(self):
        """GET /_admin/create_repository: Form to create a new item"""
        new_repo = request.GET.get('repo', '')
        parent_group = request.GET.get('parent_group')
        if not HasPermissionAny('hg.admin', 'hg.create.repository')():
            # you're not a super admin and don't have global create
            # permissions, but maybe you have at least write permission
            # to a parent group?
            _gr = RepoGroup.get(parent_group)
            gr_name = _gr.group_name if _gr else None
            # "create repositories with write permission on group" is set to true
            create_on_write = HasPermissionAny('hg.create.write_on_repogroup.true')()
            group_admin = HasRepoGroupPermissionAny('group.admin')(group_name=gr_name)
            group_write = HasRepoGroupPermissionAny('group.write')(group_name=gr_name)
            if not (group_admin or (group_write and create_on_write)):
                raise HTTPForbidden

        acl_groups = RepoGroupList(RepoGroup.query().all(),
                                   perm_set=['group.write', 'group.admin'])
        c.repo_groups = RepoGroup.groups_choices(groups=acl_groups)
        c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups)
        choices, c.landing_revs = ScmModel().get_repo_landing_revs()
        c.personal_repo_group = RepoGroup.get_by_group_name(c.rhodecode_user.username)
        c.new_repo = repo_name_slug(new_repo)

        ## apply the defaults from defaults page
221 | defaults = SettingsModel().get_default_repo_settings(strip_prefix=True) |
|
221 | defaults = SettingsModel().get_default_repo_settings(strip_prefix=True) | |
222 | # set checkbox to autochecked |
|
222 | # set checkbox to autochecked | |
223 | defaults['repo_copy_permissions'] = True |
|
223 | defaults['repo_copy_permissions'] = True | |
224 | if parent_group: |
|
224 | if parent_group: | |
225 | defaults.update({'repo_group': parent_group}) |
|
225 | defaults.update({'repo_group': parent_group}) | |
226 |
|
226 | |||
227 | return htmlfill.render( |
|
227 | return htmlfill.render( | |
228 | render('admin/repos/repo_add.html'), |
|
228 | render('admin/repos/repo_add.html'), | |
229 | defaults=defaults, |
|
229 | defaults=defaults, | |
230 | errors={}, |
|
230 | errors={}, | |
231 | prefix_error=False, |
|
231 | prefix_error=False, | |
232 | encoding="UTF-8", |
|
232 | encoding="UTF-8", | |
233 | force_defaults=False |
|
233 | force_defaults=False | |
234 | ) |
|
234 | ) | |
235 |
|
235 | |||
236 | @NotAnonymous() |
|
236 | @NotAnonymous() | |
237 | def repo_creating(self, repo_name): |
|
237 | def repo_creating(self, repo_name): | |
238 | c.repo = repo_name |
|
238 | c.repo = repo_name | |
239 | c.task_id = request.GET.get('task_id') |
|
239 | c.task_id = request.GET.get('task_id') | |
240 | if not c.repo: |
|
240 | if not c.repo: | |
241 | raise HTTPNotFound() |
|
241 | raise HTTPNotFound() | |
242 | return render('admin/repos/repo_creating.html') |
|
242 | return render('admin/repos/repo_creating.html') | |
243 |
|
243 | |||
244 | @NotAnonymous() |
|
244 | @NotAnonymous() | |
245 | @jsonify |
|
245 | @jsonify | |
246 | def repo_check(self, repo_name): |
|
246 | def repo_check(self, repo_name): | |
247 | c.repo = repo_name |
|
247 | c.repo = repo_name | |
248 | task_id = request.GET.get('task_id') |
|
248 | task_id = request.GET.get('task_id') | |
249 |
|
249 | |||
250 | if task_id and task_id not in ['None']: |
|
250 | if task_id and task_id not in ['None']: | |
251 |
|
|
251 | import rhodecode | |
252 | from celery.result import AsyncResult |
|
252 | from celery.result import AsyncResult | |
253 | if CELERY_ENABLED: |
|
253 | if rhodecode.CELERY_ENABLED: | |
254 | task = AsyncResult(task_id) |
|
254 | task = AsyncResult(task_id) | |
255 | if task.failed(): |
|
255 | if task.failed(): | |
256 | msg = self._log_creation_exception(task.result, c.repo) |
|
256 | msg = self._log_creation_exception(task.result, c.repo) | |
257 | h.flash(msg, category='error') |
|
257 | h.flash(msg, category='error') | |
258 | return redirect(url('home'), code=501) |
|
258 | return redirect(url('home'), code=501) | |
259 |
|
259 | |||
260 | repo = Repository.get_by_repo_name(repo_name) |
|
260 | repo = Repository.get_by_repo_name(repo_name) | |
261 | if repo and repo.repo_state == Repository.STATE_CREATED: |
|
261 | if repo and repo.repo_state == Repository.STATE_CREATED: | |
262 | if repo.clone_uri: |
|
262 | if repo.clone_uri: | |
263 | clone_uri = repo.clone_uri_hidden |
|
263 | clone_uri = repo.clone_uri_hidden | |
264 | h.flash(_('Created repository %s from %s') |
|
264 | h.flash(_('Created repository %s from %s') | |
265 | % (repo.repo_name, clone_uri), category='success') |
|
265 | % (repo.repo_name, clone_uri), category='success') | |
266 | else: |
|
266 | else: | |
267 | repo_url = h.link_to(repo.repo_name, |
|
267 | repo_url = h.link_to(repo.repo_name, | |
268 | h.url('summary_home', |
|
268 | h.url('summary_home', | |
269 | repo_name=repo.repo_name)) |
|
269 | repo_name=repo.repo_name)) | |
270 | fork = repo.fork |
|
270 | fork = repo.fork | |
271 | if fork: |
|
271 | if fork: | |
272 | fork_name = fork.repo_name |
|
272 | fork_name = fork.repo_name | |
273 | h.flash(h.literal(_('Forked repository %s as %s') |
|
273 | h.flash(h.literal(_('Forked repository %s as %s') | |
274 | % (fork_name, repo_url)), category='success') |
|
274 | % (fork_name, repo_url)), category='success') | |
275 | else: |
|
275 | else: | |
276 | h.flash(h.literal(_('Created repository %s') % repo_url), |
|
276 | h.flash(h.literal(_('Created repository %s') % repo_url), | |
277 | category='success') |
|
277 | category='success') | |
278 | return {'result': True} |
|
278 | return {'result': True} | |
279 | return {'result': False} |
|
279 | return {'result': False} | |
280 |
|
280 | |||
281 | @HasRepoPermissionAllDecorator('repository.admin') |
|
281 | @HasRepoPermissionAllDecorator('repository.admin') | |
282 | @auth.CSRFRequired() |
|
282 | @auth.CSRFRequired() | |
283 | def update(self, repo_name): |
|
283 | def update(self, repo_name): | |
284 | """ |
|
284 | """ | |
285 | PUT /repos/repo_name: Update an existing item""" |
|
285 | PUT /repos/repo_name: Update an existing item""" | |
286 | # Forms posted to this method should contain a hidden field: |
|
286 | # Forms posted to this method should contain a hidden field: | |
287 | # <input type="hidden" name="_method" value="PUT" /> |
|
287 | # <input type="hidden" name="_method" value="PUT" /> | |
288 | # Or using helpers: |
|
288 | # Or using helpers: | |
289 | # h.form(url('repo', repo_name=ID), |
|
289 | # h.form(url('repo', repo_name=ID), | |
290 | # method='put') |
|
290 | # method='put') | |
291 | # url('repo', repo_name=ID) |
|
291 | # url('repo', repo_name=ID) | |
292 |
|
292 | |||
293 | self.__load_data(repo_name) |
|
293 | self.__load_data(repo_name) | |
294 | c.active = 'settings' |
|
294 | c.active = 'settings' | |
295 | c.repo_fields = RepositoryField.query()\ |
|
295 | c.repo_fields = RepositoryField.query()\ | |
296 | .filter(RepositoryField.repository == c.repo_info).all() |
|
296 | .filter(RepositoryField.repository == c.repo_info).all() | |
297 |
|
297 | |||
298 | repo_model = RepoModel() |
|
298 | repo_model = RepoModel() | |
299 | changed_name = repo_name |
|
299 | changed_name = repo_name | |
300 |
|
300 | |||
301 | # override the choices with extracted revisions ! |
|
301 | # override the choices with extracted revisions ! | |
302 | c.personal_repo_group = RepoGroup.get_by_group_name( |
|
302 | c.personal_repo_group = RepoGroup.get_by_group_name( | |
303 | c.rhodecode_user.username) |
|
303 | c.rhodecode_user.username) | |
304 | repo = Repository.get_by_repo_name(repo_name) |
|
304 | repo = Repository.get_by_repo_name(repo_name) | |
305 | old_data = { |
|
305 | old_data = { | |
306 | 'repo_name': repo_name, |
|
306 | 'repo_name': repo_name, | |
307 | 'repo_group': repo.group.get_dict() if repo.group else {}, |
|
307 | 'repo_group': repo.group.get_dict() if repo.group else {}, | |
308 | 'repo_type': repo.repo_type, |
|
308 | 'repo_type': repo.repo_type, | |
309 | } |
|
309 | } | |
310 |
_form = RepoForm( |
|
310 | _form = RepoForm( | |
311 |
|
|
311 | edit=True, old_data=old_data, repo_groups=c.repo_groups_choices, | |
312 |
|
|
312 | landing_revs=c.landing_revs_choices, allow_disabled=True)() | |
313 |
|
313 | |||
314 | try: |
|
314 | try: | |
315 | form_result = _form.to_python(dict(request.POST)) |
|
315 | form_result = _form.to_python(dict(request.POST)) | |
316 | repo = repo_model.update(repo_name, **form_result) |
|
316 | repo = repo_model.update(repo_name, **form_result) | |
317 | ScmModel().mark_for_invalidation(repo_name) |
|
317 | ScmModel().mark_for_invalidation(repo_name) | |
318 | h.flash(_('Repository %s updated successfully') % repo_name, |
|
318 | h.flash(_('Repository %s updated successfully') % repo_name, | |
319 | category='success') |
|
319 | category='success') | |
320 | changed_name = repo.repo_name |
|
320 | changed_name = repo.repo_name | |
321 | action_logger(c.rhodecode_user, 'admin_updated_repo', |
|
321 | action_logger(c.rhodecode_user, 'admin_updated_repo', | |
322 | changed_name, self.ip_addr, self.sa) |
|
322 | changed_name, self.ip_addr, self.sa) | |
323 | Session().commit() |
|
323 | Session().commit() | |
324 | except formencode.Invalid as errors: |
|
324 | except formencode.Invalid as errors: | |
325 | defaults = self.__load_data(repo_name) |
|
325 | defaults = self.__load_data(repo_name) | |
326 | defaults.update(errors.value) |
|
326 | defaults.update(errors.value) | |
327 | return htmlfill.render( |
|
327 | return htmlfill.render( | |
328 | render('admin/repos/repo_edit.html'), |
|
328 | render('admin/repos/repo_edit.html'), | |
329 | defaults=defaults, |
|
329 | defaults=defaults, | |
330 | errors=errors.error_dict or {}, |
|
330 | errors=errors.error_dict or {}, | |
331 | prefix_error=False, |
|
331 | prefix_error=False, | |
332 | encoding="UTF-8", |
|
332 | encoding="UTF-8", | |
333 | force_defaults=False) |
|
333 | force_defaults=False) | |
334 |
|
334 | |||
335 | except Exception: |
|
335 | except Exception: | |
336 | log.exception("Exception during update of repository") |
|
336 | log.exception("Exception during update of repository") | |
337 | h.flash(_('Error occurred during update of repository %s') \ |
|
337 | h.flash(_('Error occurred during update of repository %s') \ | |
338 | % repo_name, category='error') |
|
338 | % repo_name, category='error') | |
339 | return redirect(url('edit_repo', repo_name=changed_name)) |
|
339 | return redirect(url('edit_repo', repo_name=changed_name)) | |
340 |
|
340 | |||
341 | @HasRepoPermissionAllDecorator('repository.admin') |
|
341 | @HasRepoPermissionAllDecorator('repository.admin') | |
342 | @auth.CSRFRequired() |
|
342 | @auth.CSRFRequired() | |
343 | def delete(self, repo_name): |
|
343 | def delete(self, repo_name): | |
344 | """ |
|
344 | """ | |
345 | DELETE /repos/repo_name: Delete an existing item""" |
|
345 | DELETE /repos/repo_name: Delete an existing item""" | |
346 | # Forms posted to this method should contain a hidden field: |
|
346 | # Forms posted to this method should contain a hidden field: | |
347 | # <input type="hidden" name="_method" value="DELETE" /> |
|
347 | # <input type="hidden" name="_method" value="DELETE" /> | |
348 | # Or using helpers: |
|
348 | # Or using helpers: | |
349 | # h.form(url('repo', repo_name=ID), |
|
349 | # h.form(url('repo', repo_name=ID), | |
350 | # method='delete') |
|
350 | # method='delete') | |
351 | # url('repo', repo_name=ID) |
|
351 | # url('repo', repo_name=ID) | |
352 |
|
352 | |||
353 | repo_model = RepoModel() |
|
353 | repo_model = RepoModel() | |
354 | repo = repo_model.get_by_repo_name(repo_name) |
|
354 | repo = repo_model.get_by_repo_name(repo_name) | |
355 | if not repo: |
|
355 | if not repo: | |
356 | h.not_mapped_error(repo_name) |
|
356 | h.not_mapped_error(repo_name) | |
357 | return redirect(url('repos')) |
|
357 | return redirect(url('repos')) | |
358 | try: |
|
358 | try: | |
359 | _forks = repo.forks.count() |
|
359 | _forks = repo.forks.count() | |
360 | handle_forks = None |
|
360 | handle_forks = None | |
361 | if _forks and request.POST.get('forks'): |
|
361 | if _forks and request.POST.get('forks'): | |
362 | do = request.POST['forks'] |
|
362 | do = request.POST['forks'] | |
363 | if do == 'detach_forks': |
|
363 | if do == 'detach_forks': | |
364 | handle_forks = 'detach' |
|
364 | handle_forks = 'detach' | |
365 | h.flash(_('Detached %s forks') % _forks, category='success') |
|
365 | h.flash(_('Detached %s forks') % _forks, category='success') | |
366 | elif do == 'delete_forks': |
|
366 | elif do == 'delete_forks': | |
367 | handle_forks = 'delete' |
|
367 | handle_forks = 'delete' | |
368 | h.flash(_('Deleted %s forks') % _forks, category='success') |
|
368 | h.flash(_('Deleted %s forks') % _forks, category='success') | |
369 | repo_model.delete(repo, forks=handle_forks) |
|
369 | repo_model.delete(repo, forks=handle_forks) | |
370 | action_logger(c.rhodecode_user, 'admin_deleted_repo', |
|
370 | action_logger(c.rhodecode_user, 'admin_deleted_repo', | |
371 | repo_name, self.ip_addr, self.sa) |
|
371 | repo_name, self.ip_addr, self.sa) | |
372 | ScmModel().mark_for_invalidation(repo_name) |
|
372 | ScmModel().mark_for_invalidation(repo_name) | |
373 | h.flash(_('Deleted repository %s') % repo_name, category='success') |
|
373 | h.flash(_('Deleted repository %s') % repo_name, category='success') | |
374 | Session().commit() |
|
374 | Session().commit() | |
375 | except AttachedForksError: |
|
375 | except AttachedForksError: | |
376 | h.flash(_('Cannot delete %s it still contains attached forks') |
|
376 | h.flash(_('Cannot delete %s it still contains attached forks') | |
377 | % repo_name, category='warning') |
|
377 | % repo_name, category='warning') | |
378 |
|
378 | |||
379 | except Exception: |
|
379 | except Exception: | |
380 | log.exception("Exception during deletion of repository") |
|
380 | log.exception("Exception during deletion of repository") | |
381 | h.flash(_('An error occurred during deletion of %s') % repo_name, |
|
381 | h.flash(_('An error occurred during deletion of %s') % repo_name, | |
382 | category='error') |
|
382 | category='error') | |
383 |
|
383 | |||
384 | return redirect(url('repos')) |
|
384 | return redirect(url('repos')) | |
385 |
|
385 | |||
386 | @HasPermissionAllDecorator('hg.admin') |
|
386 | @HasPermissionAllDecorator('hg.admin') | |
387 | def show(self, repo_name, format='html'): |
|
387 | def show(self, repo_name, format='html'): | |
388 | """GET /repos/repo_name: Show a specific item""" |
|
388 | """GET /repos/repo_name: Show a specific item""" | |
389 | # url('repo', repo_name=ID) |
|
389 | # url('repo', repo_name=ID) | |
390 |
|
390 | |||
391 | @HasRepoPermissionAllDecorator('repository.admin') |
|
391 | @HasRepoPermissionAllDecorator('repository.admin') | |
392 | def edit(self, repo_name): |
|
392 | def edit(self, repo_name): | |
393 | """GET /repo_name/settings: Form to edit an existing item""" |
|
393 | """GET /repo_name/settings: Form to edit an existing item""" | |
394 | # url('edit_repo', repo_name=ID) |
|
394 | # url('edit_repo', repo_name=ID) | |
395 | defaults = self.__load_data(repo_name) |
|
395 | defaults = self.__load_data(repo_name) | |
396 | if 'clone_uri' in defaults: |
|
396 | if 'clone_uri' in defaults: | |
397 | del defaults['clone_uri'] |
|
397 | del defaults['clone_uri'] | |
398 |
|
398 | |||
399 | c.repo_fields = RepositoryField.query()\ |
|
399 | c.repo_fields = RepositoryField.query()\ | |
400 | .filter(RepositoryField.repository == c.repo_info).all() |
|
400 | .filter(RepositoryField.repository == c.repo_info).all() | |
401 | c.personal_repo_group = RepoGroup.get_by_group_name( |
|
401 | c.personal_repo_group = RepoGroup.get_by_group_name( | |
402 | c.rhodecode_user.username) |
|
402 | c.rhodecode_user.username) | |
403 | c.active = 'settings' |
|
403 | c.active = 'settings' | |
404 | return htmlfill.render( |
|
404 | return htmlfill.render( | |
405 | render('admin/repos/repo_edit.html'), |
|
405 | render('admin/repos/repo_edit.html'), | |
406 | defaults=defaults, |
|
406 | defaults=defaults, | |
407 | encoding="UTF-8", |
|
407 | encoding="UTF-8", | |
408 | force_defaults=False) |
|
408 | force_defaults=False) | |
409 |
|
409 | |||
410 | @HasRepoPermissionAllDecorator('repository.admin') |
|
410 | @HasRepoPermissionAllDecorator('repository.admin') | |
411 | def edit_permissions(self, repo_name): |
|
411 | def edit_permissions(self, repo_name): | |
412 | """GET /repo_name/settings: Form to edit an existing item""" |
|
412 | """GET /repo_name/settings: Form to edit an existing item""" | |
413 | # url('edit_repo', repo_name=ID) |
|
413 | # url('edit_repo', repo_name=ID) | |
414 | c.repo_info = self._load_repo(repo_name) |
|
414 | c.repo_info = self._load_repo(repo_name) | |
415 | c.active = 'permissions' |
|
415 | c.active = 'permissions' | |
416 | defaults = RepoModel()._get_defaults(repo_name) |
|
416 | defaults = RepoModel()._get_defaults(repo_name) | |
417 |
|
417 | |||
418 | return htmlfill.render( |
|
418 | return htmlfill.render( | |
419 | render('admin/repos/repo_edit.html'), |
|
419 | render('admin/repos/repo_edit.html'), | |
420 | defaults=defaults, |
|
420 | defaults=defaults, | |
421 | encoding="UTF-8", |
|
421 | encoding="UTF-8", | |
422 | force_defaults=False) |
|
422 | force_defaults=False) | |
423 |
|
423 | |||
424 | @HasRepoPermissionAllDecorator('repository.admin') |
|
424 | @HasRepoPermissionAllDecorator('repository.admin') | |
425 | @auth.CSRFRequired() |
|
425 | @auth.CSRFRequired() | |
426 | def edit_permissions_update(self, repo_name): |
|
426 | def edit_permissions_update(self, repo_name): | |
427 | form = RepoPermsForm()().to_python(request.POST) |
|
427 | form = RepoPermsForm()().to_python(request.POST) | |
428 | RepoModel().update_permissions(repo_name, |
|
428 | RepoModel().update_permissions(repo_name, | |
429 | form['perm_additions'], form['perm_updates'], form['perm_deletions']) |
|
429 | form['perm_additions'], form['perm_updates'], form['perm_deletions']) | |
430 |
|
430 | |||
431 | #TODO: implement this |
|
431 | #TODO: implement this | |
432 | #action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', |
|
432 | #action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', | |
433 | # repo_name, self.ip_addr, self.sa) |
|
433 | # repo_name, self.ip_addr, self.sa) | |
434 | Session().commit() |
|
434 | Session().commit() | |
435 | h.flash(_('Repository permissions updated'), category='success') |
|
435 | h.flash(_('Repository permissions updated'), category='success') | |
436 | return redirect(url('edit_repo_perms', repo_name=repo_name)) |
|
436 | return redirect(url('edit_repo_perms', repo_name=repo_name)) | |
437 |
|
437 | |||
438 | @HasRepoPermissionAllDecorator('repository.admin') |
|
438 | @HasRepoPermissionAllDecorator('repository.admin') | |
439 | def edit_fields(self, repo_name): |
|
439 | def edit_fields(self, repo_name): | |
440 | """GET /repo_name/settings: Form to edit an existing item""" |
|
440 | """GET /repo_name/settings: Form to edit an existing item""" | |
441 | # url('edit_repo', repo_name=ID) |
|
441 | # url('edit_repo', repo_name=ID) | |
442 | c.repo_info = self._load_repo(repo_name) |
|
442 | c.repo_info = self._load_repo(repo_name) | |
443 | c.repo_fields = RepositoryField.query()\ |
|
443 | c.repo_fields = RepositoryField.query()\ | |
444 | .filter(RepositoryField.repository == c.repo_info).all() |
|
444 | .filter(RepositoryField.repository == c.repo_info).all() | |
445 | c.active = 'fields' |
|
445 | c.active = 'fields' | |
446 | if request.POST: |
|
446 | if request.POST: | |
447 |
|
447 | |||
448 | return redirect(url('repo_edit_fields')) |
|
448 | return redirect(url('repo_edit_fields')) | |
449 | return render('admin/repos/repo_edit.html') |
|
449 | return render('admin/repos/repo_edit.html') | |
450 |
|
450 | |||
451 | @HasRepoPermissionAllDecorator('repository.admin') |
|
451 | @HasRepoPermissionAllDecorator('repository.admin') | |
452 | @auth.CSRFRequired() |
|
452 | @auth.CSRFRequired() | |
453 | def create_repo_field(self, repo_name): |
|
453 | def create_repo_field(self, repo_name): | |
454 | try: |
|
454 | try: | |
455 | form_result = RepoFieldForm()().to_python(dict(request.POST)) |
|
455 | form_result = RepoFieldForm()().to_python(dict(request.POST)) | |
456 | RepoModel().add_repo_field( |
|
456 | RepoModel().add_repo_field( | |
457 | repo_name, form_result['new_field_key'], |
|
457 | repo_name, form_result['new_field_key'], | |
458 | field_type=form_result['new_field_type'], |
|
458 | field_type=form_result['new_field_type'], | |
459 | field_value=form_result['new_field_value'], |
|
459 | field_value=form_result['new_field_value'], | |
460 | field_label=form_result['new_field_label'], |
|
460 | field_label=form_result['new_field_label'], | |
461 | field_desc=form_result['new_field_desc']) |
|
461 | field_desc=form_result['new_field_desc']) | |
462 |
|
462 | |||
463 | Session().commit() |
|
463 | Session().commit() | |
464 | except Exception as e: |
|
464 | except Exception as e: | |
465 | log.exception("Exception creating field") |
|
465 | log.exception("Exception creating field") | |
466 | msg = _('An error occurred during creation of field') |
|
466 | msg = _('An error occurred during creation of field') | |
467 | if isinstance(e, formencode.Invalid): |
|
467 | if isinstance(e, formencode.Invalid): | |
468 | msg += ". " + e.msg |
|
468 | msg += ". " + e.msg | |
469 | h.flash(msg, category='error') |
|
469 | h.flash(msg, category='error') | |
470 | return redirect(url('edit_repo_fields', repo_name=repo_name)) |
|
470 | return redirect(url('edit_repo_fields', repo_name=repo_name)) | |
471 |
|
471 | |||
472 | @HasRepoPermissionAllDecorator('repository.admin') |
|
472 | @HasRepoPermissionAllDecorator('repository.admin') | |
473 | @auth.CSRFRequired() |
|
473 | @auth.CSRFRequired() | |
474 | def delete_repo_field(self, repo_name, field_id): |
|
474 | def delete_repo_field(self, repo_name, field_id): | |
475 | field = RepositoryField.get_or_404(field_id) |
|
475 | field = RepositoryField.get_or_404(field_id) | |
476 | try: |
|
476 | try: | |
477 | RepoModel().delete_repo_field(repo_name, field.field_key) |
|
477 | RepoModel().delete_repo_field(repo_name, field.field_key) | |
478 | Session().commit() |
|
478 | Session().commit() | |
479 | except Exception as e: |
|
479 | except Exception as e: | |
480 | log.exception("Exception during removal of field") |
|
480 | log.exception("Exception during removal of field") | |
481 | msg = _('An error occurred during removal of field') |
|
481 | msg = _('An error occurred during removal of field') | |
482 | h.flash(msg, category='error') |
|
482 | h.flash(msg, category='error') | |
483 | return redirect(url('edit_repo_fields', repo_name=repo_name)) |
|
483 | return redirect(url('edit_repo_fields', repo_name=repo_name)) | |
484 |
|
484 | |||
485 | @HasRepoPermissionAllDecorator('repository.admin') |
|
485 | @HasRepoPermissionAllDecorator('repository.admin') | |
486 | def edit_advanced(self, repo_name): |
|
486 | def edit_advanced(self, repo_name): | |
487 | """GET /repo_name/settings: Form to edit an existing item""" |
|
487 | """GET /repo_name/settings: Form to edit an existing item""" | |
488 | # url('edit_repo', repo_name=ID) |
|
488 | # url('edit_repo', repo_name=ID) | |
489 | c.repo_info = self._load_repo(repo_name) |
|
489 | c.repo_info = self._load_repo(repo_name) | |
490 | c.default_user_id = User.get_default_user().user_id |
|
490 | c.default_user_id = User.get_default_user().user_id | |
491 | c.in_public_journal = UserFollowing.query()\ |
|
491 | c.in_public_journal = UserFollowing.query()\ | |
492 | .filter(UserFollowing.user_id == c.default_user_id)\ |
|
492 | .filter(UserFollowing.user_id == c.default_user_id)\ | |
493 | .filter(UserFollowing.follows_repository == c.repo_info).scalar() |
|
493 | .filter(UserFollowing.follows_repository == c.repo_info).scalar() | |
494 |
|
494 | |||
495 | c.active = 'advanced' |
|
495 | c.active = 'advanced' | |
496 | c.has_origin_repo_read_perm = False |
|
496 | c.has_origin_repo_read_perm = False | |
497 | if c.repo_info.fork: |
|
497 | if c.repo_info.fork: | |
498 | c.has_origin_repo_read_perm = h.HasRepoPermissionAny( |
|
498 | c.has_origin_repo_read_perm = h.HasRepoPermissionAny( | |
499 | 'repository.write', 'repository.read', 'repository.admin')( |
|
499 | 'repository.write', 'repository.read', 'repository.admin')( | |
500 | c.repo_info.fork.repo_name, 'repo set as fork page') |
|
500 | c.repo_info.fork.repo_name, 'repo set as fork page') | |
501 |
|
501 | |||
502 | if request.POST: |
|
502 | if request.POST: | |
503 | return redirect(url('repo_edit_advanced')) |
|
503 | return redirect(url('repo_edit_advanced')) | |
504 | return render('admin/repos/repo_edit.html') |
|
504 | return render('admin/repos/repo_edit.html') | |
505 |
|
505 | |||
506 | @HasRepoPermissionAllDecorator('repository.admin') |
|
506 | @HasRepoPermissionAllDecorator('repository.admin') | |
507 | @auth.CSRFRequired() |
|
507 | @auth.CSRFRequired() | |
508 | def edit_advanced_journal(self, repo_name): |
|
508 | def edit_advanced_journal(self, repo_name): | |
509 | """ |
|
509 | """ | |
510 | Set's this repository to be visible in public journal, |
|
510 | Set's this repository to be visible in public journal, | |
511 | in other words assing default user to follow this repo |
|
511 | in other words assing default user to follow this repo | |
512 |
|
512 | |||
513 | :param repo_name: |
|
513 | :param repo_name: | |
514 | """ |
|
514 | """ | |
515 |
|
515 | |||
516 | try: |
|
516 | try: | |
517 | repo_id = Repository.get_by_repo_name(repo_name).repo_id |
|
517 | repo_id = Repository.get_by_repo_name(repo_name).repo_id | |
518 | user_id = User.get_default_user().user_id |
|
518 | user_id = User.get_default_user().user_id | |
519 | self.scm_model.toggle_following_repo(repo_id, user_id) |
|
519 | self.scm_model.toggle_following_repo(repo_id, user_id) | |
520 | h.flash(_('Updated repository visibility in public journal'), |
|
520 | h.flash(_('Updated repository visibility in public journal'), | |
521 | category='success') |
|
521 | category='success') | |
522 | Session().commit() |
|
522 | Session().commit() | |
523 | except Exception: |
|
523 | except Exception: | |
524 | h.flash(_('An error occurred during setting this' |
|
524 | h.flash(_('An error occurred during setting this' | |
525 | ' repository in public journal'), |
|
525 | ' repository in public journal'), | |
526 | category='error') |
|
526 | category='error') | |
527 |
|
527 | |||
528 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) |
|
528 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) | |
529 |
|
529 | |||
530 | @HasRepoPermissionAllDecorator('repository.admin') |
|
530 | @HasRepoPermissionAllDecorator('repository.admin') | |
531 | @auth.CSRFRequired() |
|
531 | @auth.CSRFRequired() | |
532 | def edit_advanced_fork(self, repo_name): |
|
532 | def edit_advanced_fork(self, repo_name): | |
533 | """ |
|
533 | """ | |
534 | Mark given repository as a fork of another |
|
534 | Mark given repository as a fork of another | |
535 |
|
535 | |||
536 | :param repo_name: |
|
536 | :param repo_name: | |
537 | """ |
|
537 | """ | |
538 |
|
538 | |||
539 | new_fork_id = request.POST.get('id_fork_of') |
|
539 | new_fork_id = request.POST.get('id_fork_of') | |
540 | try: |
|
540 | try: | |
541 |
|
541 | |||
542 | if new_fork_id and not new_fork_id.isdigit(): |
|
542 | if new_fork_id and not new_fork_id.isdigit(): | |
543 | log.error('Given fork id %s is not an INT', new_fork_id) |
|
543 | log.error('Given fork id %s is not an INT', new_fork_id) | |
544 |
|
544 | |||
545 | fork_id = safe_int(new_fork_id) |
|
545 | fork_id = safe_int(new_fork_id) | |
546 | repo = ScmModel().mark_as_fork(repo_name, fork_id, |
|
546 | repo = ScmModel().mark_as_fork(repo_name, fork_id, | |
547 | c.rhodecode_user.username) |
|
547 | c.rhodecode_user.username) | |
548 | fork = repo.fork.repo_name if repo.fork else _('Nothing') |
|
548 | fork = repo.fork.repo_name if repo.fork else _('Nothing') | |
549 | Session().commit() |
|
549 | Session().commit() | |
550 | h.flash(_('Marked repo %s as fork of %s') % (repo_name, fork), |
|
550 | h.flash(_('Marked repo %s as fork of %s') % (repo_name, fork), | |
551 | category='success') |
|
551 | category='success') | |
552 | except RepositoryError as e: |
|
552 | except RepositoryError as e: | |
553 | log.exception("Repository Error occurred") |
|
553 | log.exception("Repository Error occurred") | |
554 | h.flash(str(e), category='error') |
|
554 | h.flash(str(e), category='error') | |
555 | except Exception as e: |
|
555 | except Exception as e: | |
556 | log.exception("Exception while editing fork") |
|
556 | log.exception("Exception while editing fork") | |
557 | h.flash(_('An error occurred during this operation'), |
|
557 | h.flash(_('An error occurred during this operation'), | |
558 | category='error') |
|
558 | category='error') | |
559 |
|
559 | |||
560 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) |
|
560 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) | |
561 |
|
561 | |||
562 | @HasRepoPermissionAllDecorator('repository.admin') |
|
562 | @HasRepoPermissionAllDecorator('repository.admin') | |
563 | @auth.CSRFRequired() |
|
563 | @auth.CSRFRequired() | |
564 | def edit_advanced_locking(self, repo_name): |
|
564 | def edit_advanced_locking(self, repo_name): | |
565 | """ |
|
565 | """ | |
566 | Unlock repository when it is locked ! |
|
566 | Unlock repository when it is locked ! | |
567 |
|
567 | |||
568 | :param repo_name: |
|
568 | :param repo_name: | |
569 | """ |
|
569 | """ | |
570 | try: |
|
570 | try: | |
571 | repo = Repository.get_by_repo_name(repo_name) |
|
571 | repo = Repository.get_by_repo_name(repo_name) | |
572 | if request.POST.get('set_lock'): |
|
572 | if request.POST.get('set_lock'): | |
573 | Repository.lock(repo, c.rhodecode_user.user_id, |
|
573 | Repository.lock(repo, c.rhodecode_user.user_id, | |
574 | lock_reason=Repository.LOCK_WEB) |
|
574 | lock_reason=Repository.LOCK_WEB) | |
575 | h.flash(_('Locked repository'), category='success') |
|
575 | h.flash(_('Locked repository'), category='success') | |
576 | elif request.POST.get('set_unlock'): |
|
576 | elif request.POST.get('set_unlock'): | |
577 | Repository.unlock(repo) |
|
577 | Repository.unlock(repo) | |
578 | h.flash(_('Unlocked repository'), category='success') |
|
578 | h.flash(_('Unlocked repository'), category='success') | |
579 | except Exception as e: |
|
579 | except Exception as e: | |
580 | log.exception("Exception during unlocking") |
|
580 | log.exception("Exception during unlocking") | |
581 | h.flash(_('An error occurred during unlocking'), |
|
581 | h.flash(_('An error occurred during unlocking'), | |
582 | category='error') |
|
582 | category='error') | |
583 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) |
|
583 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) | |
584 |
|
584 | |||
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    @auth.CSRFRequired()
    def toggle_locking(self, repo_name):
        """
        Toggle the repository lock state.

        :param repo_name: name of the repository to toggle
        """

        try:
            repo = Repository.get_by_repo_name(repo_name)

            if repo.enable_locking:
                if repo.locked[0]:
                    Repository.unlock(repo)
                    action = _('Unlocked')
                else:
                    Repository.lock(repo, c.rhodecode_user.user_id,
                                    lock_reason=Repository.LOCK_WEB)
                    action = _('Locked')

                h.flash(_('Repository has been %s') % action,
                        category='success')
        except Exception:
            log.exception("Exception during unlocking")
            h.flash(_('An error occurred during unlocking'),
                    category='error')
        return redirect(url('summary_home', repo_name=repo_name))

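The branching in `toggle_locking` above can be summarized as a small pure function. This is an illustrative sketch only, not part of the RhodeCode API; the names `next_lock_action`, `enable_locking`, and `locked_by` are made up for this example:

```python
def next_lock_action(enable_locking, locked_by):
    # Mirrors the decision in toggle_locking: repo.locked[0] holds the
    # id of the locking user (truthy) when locked, or None when free.
    if not enable_locking:
        return None  # locking feature switched off for this repo
    return 'unlock' if locked_by else 'lock'
```

The controller then performs the chosen action and flashes the matching message.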
    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def edit_caches(self, repo_name):
        """PUT /{repo_name}/settings/caches: invalidate the repo caches."""
        try:
            ScmModel().mark_for_invalidation(repo_name, delete=True)
            Session().commit()
            h.flash(_('Cache invalidation successful'),
                    category='success')
        except Exception:
            log.exception("Exception during cache invalidation")
            h.flash(_('An error occurred during cache invalidation'),
                    category='error')

        return redirect(url('edit_repo_caches', repo_name=c.repo_name))

    @HasRepoPermissionAllDecorator('repository.admin')
    def edit_caches_form(self, repo_name):
        """GET /repo_name/settings: Form to edit an existing item"""
        # url('edit_repo', repo_name=ID)
        c.repo_info = self._load_repo(repo_name)
        c.active = 'caches'

        return render('admin/repos/repo_edit.html')

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def edit_remote(self, repo_name):
        """PUT /{repo_name}/settings/remote: edit the repo remote."""
        try:
            ScmModel().pull_changes(repo_name, c.rhodecode_user.username)
            h.flash(_('Pulled from remote location'), category='success')
        except Exception:
            log.exception("Exception during pull from remote")
            h.flash(_('An error occurred during pull from remote location'),
                    category='error')
        return redirect(url('edit_repo_remote', repo_name=c.repo_name))

    @HasRepoPermissionAllDecorator('repository.admin')
    def edit_remote_form(self, repo_name):
        """GET /repo_name/settings: Form to edit an existing item"""
        # url('edit_repo', repo_name=ID)
        c.repo_info = self._load_repo(repo_name)
        c.active = 'remote'

        return render('admin/repos/repo_edit.html')

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def edit_statistics(self, repo_name):
        """PUT /{repo_name}/settings/statistics: reset the repo statistics."""
        try:
            RepoModel().delete_stats(repo_name)
            Session().commit()
        except Exception:
            log.error(traceback.format_exc())
            h.flash(_('An error occurred during deletion of repository stats'),
                    category='error')
        return redirect(url('edit_repo_statistics', repo_name=c.repo_name))

    @HasRepoPermissionAllDecorator('repository.admin')
    def edit_statistics_form(self, repo_name):
        """GET /repo_name/settings: Form to edit an existing item"""
        # url('edit_repo', repo_name=ID)
        c.repo_info = self._load_repo(repo_name)
        repo = c.repo_info.scm_instance()

        if c.repo_info.stats:
            # this is on what revision we ended up so we add +1 for count
            last_rev = c.repo_info.stats.stat_on_revision + 1
        else:
            last_rev = 0
        c.stats_revision = last_rev

        c.repo_last_rev = repo.count()

        if last_rev == 0 or c.repo_last_rev == 0:
            c.stats_percentage = 0
        else:
            c.stats_percentage = '%.2f' % (
                (float(last_rev) / c.repo_last_rev) * 100)

        c.active = 'statistics'

        return render('admin/repos/repo_edit.html')

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def repo_issuetracker_test(self, repo_name):
        if request.is_xhr:
            return h.urlify_commit_message(
                request.POST.get('test_text', ''),
                repo_name)
        else:
            raise HTTPBadRequest()

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def repo_issuetracker_delete(self, repo_name):
        uid = request.POST.get('uid')
        repo_settings = IssueTrackerSettingsModel(repo=repo_name)
        try:
            repo_settings.delete_entries(uid)
        except Exception:
            h.flash(_('Error occurred during deleting issue tracker entry'),
                    category='error')
        else:
            h.flash(_('Removed issue tracker entry'), category='success')
        return redirect(url('repo_settings_issuetracker',
                            repo_name=repo_name))

    def _update_patterns(self, form, repo_settings):
        for uid in form['delete_patterns']:
            repo_settings.delete_entries(uid)

        for pattern in form['patterns']:
            for setting, value, type_ in pattern:
                sett = repo_settings.create_or_update_setting(
                    setting, value, type_)
                Session().add(sett)

        Session().commit()

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def repo_issuetracker_save(self, repo_name):
        # Save inheritance
        repo_settings = IssueTrackerSettingsModel(repo=repo_name)
        inherited = (request.POST.get('inherit_global_issuetracker')
                     == "inherited")
        repo_settings.inherit_global_settings = inherited
        Session().commit()

        form = IssueTrackerPatternsForm()().to_python(request.POST)
        if form:
            self._update_patterns(form, repo_settings)

        h.flash(_('Updated issue tracker entries'), category='success')
        return redirect(url('repo_settings_issuetracker',
                            repo_name=repo_name))

    @HasRepoPermissionAllDecorator('repository.admin')
    def repo_issuetracker(self, repo_name):
        """GET /admin/settings/issue-tracker: All items in the collection"""
        c.active = 'issuetracker'
        c.data = 'data'
        c.repo_info = self._load_repo(repo_name)

        repo = Repository.get_by_repo_name(repo_name)
        c.settings_model = IssueTrackerSettingsModel(repo=repo)
        c.global_patterns = c.settings_model.get_global_settings()
        c.repo_patterns = c.settings_model.get_repo_settings()

        return render('admin/repos/repo_edit.html')

    @HasRepoPermissionAllDecorator('repository.admin')
    def repo_settings_vcs(self, repo_name):
        """GET /{repo_name}/settings/vcs/: All items in the collection"""

        model = VcsSettingsModel(repo=repo_name)

        c.active = 'vcs'
        c.global_svn_branch_patterns = model.get_global_svn_branch_patterns()
        c.global_svn_tag_patterns = model.get_global_svn_tag_patterns()
        c.svn_branch_patterns = model.get_repo_svn_branch_patterns()
        c.svn_tag_patterns = model.get_repo_svn_tag_patterns()
        c.repo_info = self._load_repo(repo_name)
        defaults = self._vcs_form_defaults(repo_name)
        c.inherit_global_settings = defaults['inherit_global_settings']

        return htmlfill.render(
            render('admin/repos/repo_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    def repo_settings_vcs_update(self, repo_name):
        """POST /{repo_name}/settings/vcs/: All items in the collection"""
        c.active = 'vcs'

        model = VcsSettingsModel(repo=repo_name)
        c.global_svn_branch_patterns = model.get_global_svn_branch_patterns()
        c.global_svn_tag_patterns = model.get_global_svn_tag_patterns()
        c.svn_branch_patterns = model.get_repo_svn_branch_patterns()
        c.svn_tag_patterns = model.get_repo_svn_tag_patterns()
        c.repo_info = self._load_repo(repo_name)
        defaults = self._vcs_form_defaults(repo_name)
        c.inherit_global_settings = defaults['inherit_global_settings']

        application_form = RepoVcsSettingsForm(repo_name)()
        try:
            form_result = application_form.to_python(dict(request.POST))
        except formencode.Invalid as errors:
            h.flash(
                _("Some form inputs contain invalid data."),
                category='error')
            return htmlfill.render(
                render('admin/repos/repo_edit.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )

        try:
            inherit_global_settings = form_result['inherit_global_settings']
            model.create_or_update_repo_settings(
                form_result, inherit_global_settings=inherit_global_settings)
        except Exception:
            log.exception("Exception while updating settings")
            h.flash(
                _('Error occurred during updating repository VCS settings'),
                category='error')
        else:
            Session().commit()
            h.flash(_('Updated VCS settings'), category='success')
            return redirect(url('repo_vcs_settings', repo_name=repo_name))

        return htmlfill.render(
            render('admin/repos/repo_edit.html'),
            defaults=self._vcs_form_defaults(repo_name),
            encoding="UTF-8",
            force_defaults=False)

    @HasRepoPermissionAllDecorator('repository.admin')
    @auth.CSRFRequired()
    @jsonify
    def repo_delete_svn_pattern(self, repo_name):
        if not request.is_xhr:
            return False

        delete_pattern_id = request.POST.get('delete_svn_pattern')
        model = VcsSettingsModel(repo=repo_name)
        try:
            model.delete_repo_svn_pattern(delete_pattern_id)
        except SettingNotFound:
            raise HTTPBadRequest()

        Session().commit()
        return True

    def _vcs_form_defaults(self, repo_name):
        model = VcsSettingsModel(repo=repo_name)
        global_defaults = model.get_global_settings()

        repo_defaults = {}
        repo_defaults.update(global_defaults)
        repo_defaults.update(model.get_repo_settings())

        global_defaults = {
            '{}_inherited'.format(k): global_defaults[k]
            for k in global_defaults}

        defaults = {
            'inherit_global_settings': model.inherit_global_settings
        }
        defaults.update(global_defaults)
        defaults.update(repo_defaults)
        defaults.update({
            'new_svn_branch': '',
            'new_svn_tag': '',
        })
        return defaults
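The merge order in `_vcs_form_defaults` means repo-level values override global ones, while the `*_inherited` keys always carry the global value for comparison in the form. A minimal standalone sketch of that layering (the setting names below are hypothetical, not the actual RhodeCode settings keys):

```python
# Hypothetical settings, for illustration only.
global_settings = {'hooks_push': True, 'push_ssl': False}
repo_settings = {'push_ssl': True}  # repo-level override

repo_defaults = {}
repo_defaults.update(global_settings)
repo_defaults.update(repo_settings)  # repo values win on conflict

# '<key>_inherited' keys keep the global value so the form can show
# what would apply if inheritance were enabled.
inherited = {'{}_inherited'.format(k): v
             for k, v in global_settings.items()}

defaults = {'inherit_global_settings': False}
defaults.update(inherited)
defaults.update(repo_defaults)
```

Here `defaults['push_ssl']` reflects the repo override while `defaults['push_ssl_inherited']` still reflects the global value.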
@@ -1,866 +1,813 @@
 # -*- coding: utf-8 -*-

 # Copyright (C) 2010-2016 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/


 """
 settings controller for rhodecode admin
 """

 import collections
 import logging
 import urllib2

 import datetime
 import formencode
 from formencode import htmlfill
 import packaging.version
 from pylons import request, tmpl_context as c, url, config
 from pylons.controllers.util import redirect
 from pylons.i18n.translation import _, lazy_ugettext
 from webob.exc import HTTPBadRequest

 import rhodecode
+from rhodecode.admin.navigation import navigation_list
 from rhodecode.lib import auth
 from rhodecode.lib import helpers as h
 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.celerylib import tasks, run_task
 from rhodecode.lib.utils import repo2db_mapper
 from rhodecode.lib.utils2 import (
     str2bool, safe_unicode, AttributeDict, safe_int)
 from rhodecode.lib.compat import OrderedDict
 from rhodecode.lib.ext_json import json
 from rhodecode.lib.utils import jsonify

 from rhodecode.model.db import RhodeCodeUi, Repository
 from rhodecode.model.forms import ApplicationSettingsForm, \
     ApplicationUiSettingsForm, ApplicationVisualisationForm, \
     LabsSettingsForm, IssueTrackerPatternsForm

 from rhodecode.model.scm import ScmModel
 from rhodecode.model.notification import EmailNotificationModel
 from rhodecode.model.meta import Session
 from rhodecode.model.settings import (
     IssueTrackerSettingsModel, VcsSettingsModel, SettingNotFound,
     SettingsModel)
+
 from rhodecode.model.supervisor import SupervisorModel, SUPERVISOR_MASTER
-from rhodecode.model.user import UserModel
+

 log = logging.getLogger(__name__)


69 | class SettingsController(BaseController): |
|
71 | class SettingsController(BaseController): | |
70 | """REST Controller styled on the Atom Publishing Protocol""" |
|
72 | """REST Controller styled on the Atom Publishing Protocol""" | |
71 | # To properly map this controller, ensure your config/routing.py |
|
73 | # To properly map this controller, ensure your config/routing.py | |
72 | # file has a resource setup: |
|
74 | # file has a resource setup: | |
    #     map.resource('setting', 'settings', controller='admin/settings',
    #         path_prefix='/admin', name_prefix='admin_')

    @LoginRequired()
    def __before__(self):
        super(SettingsController, self).__before__()
        c.labs_active = str2bool(
            rhodecode.CONFIG.get('labs_settings_active', 'false'))
        c.navlist = navigation_list(request)

    def _get_hg_ui_settings(self):
        ret = RhodeCodeUi.query().all()

        if not ret:
            raise Exception('Could not get application ui settings !')
        settings = {}
        for each in ret:
            k = each.ui_key
            v = each.ui_value
            if k == '/':
                k = 'root_path'

            if k in ['push_ssl', 'publish']:
                v = str2bool(v)

            if k.find('.') != -1:
                k = k.replace('.', '_')

            if each.ui_section in ['hooks', 'extensions']:
                v = each.ui_active

            settings[each.ui_section + '_' + k] = v
        return settings

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    @jsonify
    def delete_svn_pattern(self):
        if not request.is_xhr:
            raise HTTPBadRequest()

        delete_pattern_id = request.POST.get('delete_svn_pattern')
        model = VcsSettingsModel()
        try:
            model.delete_global_svn_pattern(delete_pattern_id)
        except SettingNotFound:
            raise HTTPBadRequest()

        Session().commit()
        return True
    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_vcs_update(self):
        """POST /admin/settings: All items in the collection"""
        # url('admin_settings_vcs')
        c.active = 'vcs'

        model = VcsSettingsModel()
        c.svn_branch_patterns = model.get_global_svn_branch_patterns()
        c.svn_tag_patterns = model.get_global_svn_tag_patterns()

        application_form = ApplicationUiSettingsForm()()
        try:
            form_result = application_form.to_python(dict(request.POST))
        except formencode.Invalid as errors:
            h.flash(
                _("Some form inputs contain invalid data."),
                category='error')
            return htmlfill.render(
                render('admin/settings/settings.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )

        try:
            model.update_global_ssl_setting(form_result['web_push_ssl'])
            if c.visual.allow_repo_location_change:
                model.update_global_path_setting(
                    form_result['paths_root_path'])
            model.update_global_hook_settings(form_result)
            model.create_global_svn_settings(form_result)
            model.create_or_update_global_hg_settings(form_result)
            model.create_or_update_global_pr_settings(form_result)
        except Exception:
            log.exception("Exception while updating settings")
            h.flash(_('Error occurred during updating '
                      'application settings'), category='error')
        else:
            Session().commit()
            h.flash(_('Updated VCS settings'), category='success')
            return redirect(url('admin_settings_vcs'))

        return htmlfill.render(
            render('admin/settings/settings.html'),
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    def settings_vcs(self):
        """GET /admin/settings: All items in the collection"""
        # url('admin_settings_vcs')
        c.active = 'vcs'
        model = VcsSettingsModel()
        c.svn_branch_patterns = model.get_global_svn_branch_patterns()
        c.svn_tag_patterns = model.get_global_svn_tag_patterns()

        return htmlfill.render(
            render('admin/settings/settings.html'),
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_mapping_update(self):
        """POST /admin/settings/mapping: All items in the collection"""
        # url('admin_settings_mapping')
        c.active = 'mapping'
        rm_obsolete = request.POST.get('destroy', False)
        invalidate_cache = request.POST.get('invalidate', False)
        log.debug(
            'rescanning repo location with destroy obsolete=%s', rm_obsolete)

        if invalidate_cache:
            log.debug('invalidating all repositories cache')
            for repo in Repository.get_all():
                ScmModel().mark_for_invalidation(repo.repo_name, delete=True)

        filesystem_repos = ScmModel().repo_scan()
        added, removed = repo2db_mapper(filesystem_repos, rm_obsolete)
        _repr = lambda l: ', '.join(map(safe_unicode, l)) or '-'
        h.flash(_('Repositories successfully '
                  'rescanned added: %s ; removed: %s') %
                (_repr(added), _repr(removed)),
                category='success')
        return redirect(url('admin_settings_mapping'))

    @HasPermissionAllDecorator('hg.admin')
    def settings_mapping(self):
        """GET /admin/settings/mapping: All items in the collection"""
        # url('admin_settings_mapping')
        c.active = 'mapping'

        return htmlfill.render(
            render('admin/settings/settings.html'),
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_global_update(self):
        """POST /admin/settings/global: All items in the collection"""
        # url('admin_settings_global')
        c.active = 'global'
        application_form = ApplicationSettingsForm()()
        try:
            form_result = application_form.to_python(dict(request.POST))
        except formencode.Invalid as errors:
            return htmlfill.render(
                render('admin/settings/settings.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)

        try:
            settings = [
                ('title', 'rhodecode_title'),
                ('realm', 'rhodecode_realm'),
                ('pre_code', 'rhodecode_pre_code'),
                ('post_code', 'rhodecode_post_code'),
                ('captcha_public_key', 'rhodecode_captcha_public_key'),
                ('captcha_private_key', 'rhodecode_captcha_private_key'),
            ]
            for setting, form_key in settings:
                sett = SettingsModel().create_or_update_setting(
                    setting, form_result[form_key])
                Session().add(sett)

            Session().commit()
            SettingsModel().invalidate_settings_cache()
            h.flash(_('Updated application settings'), category='success')
        except Exception:
            log.exception("Exception while updating application settings")
            h.flash(
                _('Error occurred during updating application settings'),
                category='error')

        return redirect(url('admin_settings_global'))
    @HasPermissionAllDecorator('hg.admin')
    def settings_global(self):
        """GET /admin/settings/global: All items in the collection"""
        # url('admin_settings_global')
        c.active = 'global'

        return htmlfill.render(
            render('admin/settings/settings.html'),
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_visual_update(self):
        """POST /admin/settings/visual: All items in the collection"""
        # url('admin_settings_visual')
        c.active = 'visual'
        application_form = ApplicationVisualisationForm()()
        try:
            form_result = application_form.to_python(dict(request.POST))
        except formencode.Invalid as errors:
            return htmlfill.render(
                render('admin/settings/settings.html'),
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )

        try:
            settings = [
                ('show_public_icon', 'rhodecode_show_public_icon', 'bool'),
                ('show_private_icon', 'rhodecode_show_private_icon', 'bool'),
                ('stylify_metatags', 'rhodecode_stylify_metatags', 'bool'),
                ('repository_fields', 'rhodecode_repository_fields', 'bool'),
                ('dashboard_items', 'rhodecode_dashboard_items', 'int'),
                ('admin_grid_items', 'rhodecode_admin_grid_items', 'int'),
                ('show_version', 'rhodecode_show_version', 'bool'),
                ('use_gravatar', 'rhodecode_use_gravatar', 'bool'),
                ('markup_renderer', 'rhodecode_markup_renderer', 'unicode'),
                ('gravatar_url', 'rhodecode_gravatar_url', 'unicode'),
                ('clone_uri_tmpl', 'rhodecode_clone_uri_tmpl', 'unicode'),
                ('support_url', 'rhodecode_support_url', 'unicode'),
                ('show_revision_number', 'rhodecode_show_revision_number', 'bool'),
                ('show_sha_length', 'rhodecode_show_sha_length', 'int'),
            ]
            for setting, form_key, type_ in settings:
                sett = SettingsModel().create_or_update_setting(
                    setting, form_result[form_key], type_)
                Session().add(sett)

            Session().commit()
            SettingsModel().invalidate_settings_cache()
            h.flash(_('Updated visualisation settings'), category='success')
        except Exception:
            log.exception("Exception updating visualization settings")
            h.flash(_('Error occurred during updating '
                      'visualisation settings'),
                    category='error')

        return redirect(url('admin_settings_visual'))

    @HasPermissionAllDecorator('hg.admin')
    def settings_visual(self):
        """GET /admin/settings/visual: All items in the collection"""
        # url('admin_settings_visual')
        c.active = 'visual'

        return htmlfill.render(
            render('admin/settings/settings.html'),
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_issuetracker_test(self):
        if request.is_xhr:
            return h.urlify_commit_message(
                request.POST.get('test_text', ''),
                'repo_group/test_repo1')
        else:
            raise HTTPBadRequest()

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_issuetracker_delete(self):
        uid = request.POST.get('uid')
        IssueTrackerSettingsModel().delete_entries(uid)
        h.flash(_('Removed issue tracker entry'), category='success')
        return redirect(url('admin_settings_issuetracker'))

    @HasPermissionAllDecorator('hg.admin')
    def settings_issuetracker(self):
        """GET /admin/settings/issue-tracker: All items in the collection"""
        # url('admin_settings_issuetracker')
        c.active = 'issuetracker'
        defaults = SettingsModel().get_all_settings()

        entry_key = 'rhodecode_issuetracker_pat_'

        c.issuetracker_entries = {}
        for k, v in defaults.items():
            if k.startswith(entry_key):
                uid = k[len(entry_key):]
                c.issuetracker_entries[uid] = None

        for uid in c.issuetracker_entries:
            c.issuetracker_entries[uid] = AttributeDict({
                'pat': defaults.get('rhodecode_issuetracker_pat_' + uid),
                'url': defaults.get('rhodecode_issuetracker_url_' + uid),
                'pref': defaults.get('rhodecode_issuetracker_pref_' + uid),
                'desc': defaults.get('rhodecode_issuetracker_desc_' + uid),
            })

        return render('admin/settings/settings.html')

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def settings_issuetracker_save(self):
        settings_model = IssueTrackerSettingsModel()

        form = IssueTrackerPatternsForm()().to_python(request.POST)
        for uid in form['delete_patterns']:
            settings_model.delete_entries(uid)

        for pattern in form['patterns']:
            for setting, value, type_ in pattern:
                sett = settings_model.create_or_update_setting(
                    setting, value, type_)
                Session().add(sett)

        Session().commit()

        SettingsModel().invalidate_settings_cache()
        h.flash(_('Updated issue tracker entries'), category='success')
        return redirect(url('admin_settings_issuetracker'))
409 | @HasPermissionAllDecorator('hg.admin') |
|
412 | @HasPermissionAllDecorator('hg.admin') | |
410 | @auth.CSRFRequired() |
|
413 | @auth.CSRFRequired() | |
411 | def settings_email_update(self): |
|
414 | def settings_email_update(self): | |
412 | """POST /admin/settings/email: All items in the collection""" |
|
415 | """POST /admin/settings/email: All items in the collection""" | |
413 | # url('admin_settings_email') |
|
416 | # url('admin_settings_email') | |
414 | c.active = 'email' |
|
417 | c.active = 'email' | |
415 |
|
418 | |||
416 | test_email = request.POST.get('test_email') |
|
419 | test_email = request.POST.get('test_email') | |
417 |
|
420 | |||
418 | if not test_email: |
|
421 | if not test_email: | |
419 | h.flash(_('Please enter email address'), category='error') |
|
422 | h.flash(_('Please enter email address'), category='error') | |
420 | return redirect(url('admin_settings_email')) |
|
423 | return redirect(url('admin_settings_email')) | |
421 |
|
424 | |||
422 | email_kwargs = { |
|
425 | email_kwargs = { | |
423 | 'date': datetime.datetime.now(), |
|
426 | 'date': datetime.datetime.now(), | |
424 | 'user': c.rhodecode_user, |
|
427 | 'user': c.rhodecode_user, | |
425 | 'rhodecode_version': c.rhodecode_version |
|
428 | 'rhodecode_version': c.rhodecode_version | |
426 | } |
|
429 | } | |
427 |
|
430 | |||
428 | (subject, headers, email_body, |
|
431 | (subject, headers, email_body, | |
429 | email_body_plaintext) = EmailNotificationModel().render_email( |
|
432 | email_body_plaintext) = EmailNotificationModel().render_email( | |
430 | EmailNotificationModel.TYPE_EMAIL_TEST, **email_kwargs) |
|
433 | EmailNotificationModel.TYPE_EMAIL_TEST, **email_kwargs) | |
431 |
|
434 | |||
432 | recipients = [test_email] if test_email else None |
|
435 | recipients = [test_email] if test_email else None | |
433 |
|
436 | |||
434 | run_task(tasks.send_email, recipients, subject, |
|
437 | run_task(tasks.send_email, recipients, subject, | |
435 | email_body_plaintext, email_body) |
|
438 | email_body_plaintext, email_body) | |
436 |
|
439 | |||
437 | h.flash(_('Send email task created'), category='success') |
|
440 | h.flash(_('Send email task created'), category='success') | |
438 | return redirect(url('admin_settings_email')) |
|
441 | return redirect(url('admin_settings_email')) | |
439 |
|
442 | |||
440 | @HasPermissionAllDecorator('hg.admin') |
|
443 | @HasPermissionAllDecorator('hg.admin') | |
441 | def settings_email(self): |
|
444 | def settings_email(self): | |
442 | """GET /admin/settings/email: All items in the collection""" |
|
445 | """GET /admin/settings/email: All items in the collection""" | |
443 | # url('admin_settings_email') |
|
446 | # url('admin_settings_email') | |
444 | c.active = 'email' |
|
447 | c.active = 'email' | |
445 | c.rhodecode_ini = rhodecode.CONFIG |
|
448 | c.rhodecode_ini = rhodecode.CONFIG | |
446 |
|
449 | |||
447 | return htmlfill.render( |
|
450 | return htmlfill.render( | |
448 | render('admin/settings/settings.html'), |
|
451 | render('admin/settings/settings.html'), | |
449 | defaults=self._form_defaults(), |
|
452 | defaults=self._form_defaults(), | |
450 | encoding="UTF-8", |
|
453 | encoding="UTF-8", | |
451 | force_defaults=False) |
|
454 | force_defaults=False) | |
452 |
|
455 | |||
453 | @HasPermissionAllDecorator('hg.admin') |
|
456 | @HasPermissionAllDecorator('hg.admin') | |
454 | @auth.CSRFRequired() |
|
457 | @auth.CSRFRequired() | |
455 | def settings_hooks_update(self): |
|
458 | def settings_hooks_update(self): | |
456 | """POST or DELETE /admin/settings/hooks: All items in the collection""" |
|
459 | """POST or DELETE /admin/settings/hooks: All items in the collection""" | |
457 | # url('admin_settings_hooks') |
|
460 | # url('admin_settings_hooks') | |
458 | c.active = 'hooks' |
|
461 | c.active = 'hooks' | |
459 | if c.visual.allow_custom_hooks_settings: |
|
462 | if c.visual.allow_custom_hooks_settings: | |
460 | ui_key = request.POST.get('new_hook_ui_key') |
|
463 | ui_key = request.POST.get('new_hook_ui_key') | |
461 | ui_value = request.POST.get('new_hook_ui_value') |
|
464 | ui_value = request.POST.get('new_hook_ui_value') | |
462 |
|
465 | |||
463 | hook_id = request.POST.get('hook_id') |
|
466 | hook_id = request.POST.get('hook_id') | |
464 | new_hook = False |
|
467 | new_hook = False | |
465 |
|
468 | |||
466 | model = SettingsModel() |
|
469 | model = SettingsModel() | |
467 | try: |
|
470 | try: | |
468 | if ui_value and ui_key: |
|
471 | if ui_value and ui_key: | |
469 | model.create_or_update_hook(ui_key, ui_value) |
|
472 | model.create_or_update_hook(ui_key, ui_value) | |
470 | h.flash(_('Added new hook'), category='success') |
|
473 | h.flash(_('Added new hook'), category='success') | |
471 | new_hook = True |
|
474 | new_hook = True | |
472 | elif hook_id: |
|
475 | elif hook_id: | |
473 | RhodeCodeUi.delete(hook_id) |
|
476 | RhodeCodeUi.delete(hook_id) | |
474 | Session().commit() |
|
477 | Session().commit() | |
475 |
|
478 | |||
476 | # check for edits |
|
479 | # check for edits | |
477 | update = False |
|
480 | update = False | |
478 | _d = request.POST.dict_of_lists() |
|
481 | _d = request.POST.dict_of_lists() | |
479 | for k, v in zip(_d.get('hook_ui_key', []), |
|
482 | for k, v in zip(_d.get('hook_ui_key', []), | |
480 | _d.get('hook_ui_value_new', [])): |
|
483 | _d.get('hook_ui_value_new', [])): | |
481 | model.create_or_update_hook(k, v) |
|
484 | model.create_or_update_hook(k, v) | |
482 | update = True |
|
485 | update = True | |
483 |
|
486 | |||
484 | if update and not new_hook: |
|
487 | if update and not new_hook: | |
485 | h.flash(_('Updated hooks'), category='success') |
|
488 | h.flash(_('Updated hooks'), category='success') | |
486 | Session().commit() |
|
489 | Session().commit() | |
487 | except Exception: |
|
490 | except Exception: | |
488 | log.exception("Exception during hook creation") |
|
491 | log.exception("Exception during hook creation") | |
489 | h.flash(_('Error occurred during hook creation'), |
|
492 | h.flash(_('Error occurred during hook creation'), | |
490 | category='error') |
|
493 | category='error') | |
491 |
|
494 | |||
492 | return redirect(url('admin_settings_hooks')) |
|
495 | return redirect(url('admin_settings_hooks')) | |
493 |
|
496 | |||
494 | @HasPermissionAllDecorator('hg.admin') |
|
497 | @HasPermissionAllDecorator('hg.admin') | |
495 | def settings_hooks(self): |
|
498 | def settings_hooks(self): | |
496 | """GET /admin/settings/hooks: All items in the collection""" |
|
499 | """GET /admin/settings/hooks: All items in the collection""" | |
497 | # url('admin_settings_hooks') |
|
500 | # url('admin_settings_hooks') | |
498 | c.active = 'hooks' |
|
501 | c.active = 'hooks' | |
499 |
|
502 | |||
500 | model = SettingsModel() |
|
503 | model = SettingsModel() | |
501 | c.hooks = model.get_builtin_hooks() |
|
504 | c.hooks = model.get_builtin_hooks() | |
502 | c.custom_hooks = model.get_custom_hooks() |
|
505 | c.custom_hooks = model.get_custom_hooks() | |
503 |
|
506 | |||
504 | return htmlfill.render( |
|
507 | return htmlfill.render( | |
505 | render('admin/settings/settings.html'), |
|
508 | render('admin/settings/settings.html'), | |
506 | defaults=self._form_defaults(), |
|
509 | defaults=self._form_defaults(), | |
507 | encoding="UTF-8", |
|
510 | encoding="UTF-8", | |
508 | force_defaults=False) |
|
511 | force_defaults=False) | |

    @HasPermissionAllDecorator('hg.admin')
    def settings_search(self):
        """GET /admin/settings/search: All items in the collection"""
        # url('admin_settings_search')
        c.active = 'search'

        from rhodecode.lib.index import searcher_from_config
        searcher = searcher_from_config(config)
        c.statistics = searcher.statistics()

        return render('admin/settings/settings.html')

     @HasPermissionAllDecorator('hg.admin')
     def settings_system(self):
         """GET /admin/settings/system: All items in the collection"""
         # url('admin_settings_system')
+        snapshot = str2bool(request.GET.get('snapshot'))
         c.active = 'system'

         defaults = self._form_defaults()
         c.rhodecode_ini = rhodecode.CONFIG
         c.rhodecode_update_url = defaults.get('rhodecode_update_url')
         server_info = ScmModel().get_server_info(request.environ)
         for key, val in server_info.iteritems():
             setattr(c, key, val)

         if c.disk['percent'] > 90:
             h.flash(h.literal(_(
                 'Critical: your disk space is very low <b>%s%%</b> used' %
                 c.disk['percent'])), 'error')
         elif c.disk['percent'] > 70:
             h.flash(h.literal(_(
                 'Warning: your disk space is running low <b>%s%%</b> used' %
                 c.disk['percent'])), 'warning')

         try:
             c.uptime_age = h._age(
                 h.time_to_datetime(c.boot_time), False, show_suffix=False)
         except TypeError:
             c.uptime_age = c.boot_time

         try:
             c.system_memory = '%s/%s, %s%% (%s%%) used%s' % (
                 h.format_byte_size_binary(c.memory['used']),
                 h.format_byte_size_binary(c.memory['total']),
                 c.memory['percent2'],
                 c.memory['percent'],
                 ' %s' % c.memory['error'] if 'error' in c.memory else '')
         except TypeError:
             c.system_memory = 'NOT AVAILABLE'

+        rhodecode_ini_safe = rhodecode.CONFIG.copy()
+        blacklist = [
+            'rhodecode_license_key',
+            'routes.map',
+            'pylons.h',
+            'pylons.app_globals',
+            'pylons.environ_config',
+            'sqlalchemy.db1.url',
+            ('app_conf', 'sqlalchemy.db1.url')
+        ]
+        for k in blacklist:
+            if isinstance(k, tuple):
+                section, key = k
+                if section in rhodecode_ini_safe:
+                    rhodecode_ini_safe[section].pop(key, None)
+            else:
+                rhodecode_ini_safe.pop(k, None)
+
+        c.rhodecode_ini_safe = rhodecode_ini_safe
+
+        # TODO: marcink, figure out how to allow only selected users to do this
+        c.allowed_to_snapshot = False
+
+        if snapshot:
+            if c.allowed_to_snapshot:
+                return render('admin/settings/settings_system_snapshot.html')
+            else:
+                h.flash('You are not allowed to do this', category='warning')
+
         return htmlfill.render(
             render('admin/settings/settings.html'),
             defaults=defaults,
             encoding="UTF-8",
             force_defaults=False)

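The sanitising loop added to `settings_system` above strips secrets from the config before exposing it. A standalone sketch of the same pattern (not RhodeCode's actual code): flat keys are dropped from the top level, while `(section, key)` tuples drop a key from a nested section dict. Note that it also copies the nested section, which the original shallow `CONFIG.copy()` does not.

```python
def sanitize_config(config, blacklist):
    """Return a copy of config with blacklisted entries removed."""
    safe = dict(config)
    for item in blacklist:
        if isinstance(item, tuple):
            section, key = item
            if section in safe:
                # copy the section so the original nested dict is untouched
                safe[section] = dict(safe[section])
                safe[section].pop(key, None)
        else:
            safe.pop(item, None)
    return safe
```

The `pop(key, None)` form keeps the loop tolerant of keys that are absent in a given deployment.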
    @staticmethod
    def get_update_data(update_url):
        """Return the JSON update data."""
        ver = rhodecode.__version__
        log.debug('Checking for upgrade on `%s` server', update_url)
        opener = urllib2.build_opener()
        opener.addheaders = [('User-agent', 'RhodeCode-SCM/%s' % ver)]
        response = opener.open(update_url)
        response_data = response.read()
        data = json.loads(response_data)

        return data
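`get_update_data` fetches and decodes a JSON document listing releases, and the consuming handler treats `data['versions'][0]` as the latest one. A minimal sketch of parsing such a payload; the field layout here is a hypothetical example inferred from how the handler reads it, not the documented feed format:

```python
import json

# hypothetical update-feed payload, newest release first
payload = '{"versions": [{"version": "4.0.0", "general": "release notes"}]}'
data = json.loads(payload)
latest = data['versions'][0]
```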

    @HasPermissionAllDecorator('hg.admin')
    def settings_system_update(self):
        """GET /admin/settings/system/updates: All items in the collection"""
        # url('admin_settings_system_update')
        defaults = self._form_defaults()
        update_url = defaults.get('rhodecode_update_url', '')

        _err = lambda s: '<div style="color:#ff8888; padding:4px 0px">%s</div>' % (s)
        try:
            data = self.get_update_data(update_url)
        except urllib2.URLError as e:
            log.exception("Exception contacting upgrade server")
            return _err('Failed to contact upgrade server: %r' % e)
        except ValueError as e:
            log.exception("Bad data sent from update server")
            return _err('Bad data sent from update server')

        latest = data['versions'][0]

        c.update_url = update_url
        c.latest_data = latest
        c.latest_ver = latest['version']
        c.cur_ver = rhodecode.__version__
        c.should_upgrade = False

        if (packaging.version.Version(c.latest_ver) >
                packaging.version.Version(c.cur_ver)):
            c.should_upgrade = True
            c.important_notices = latest['general']

        return render('admin/settings/settings_system_update.html')
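The upgrade check above compares versions with `packaging.version.Version`, which orders releases numerically rather than lexically. A stdlib-only sketch of the same idea, assuming plain dotted versions (it does not handle pre-release tags the way `packaging.version` does):

```python
def should_upgrade(current, latest):
    """True when `latest` is a newer plain dotted version than `current`."""
    def parse(v):
        # '4.0.10' -> (4, 0, 10); tuple comparison orders components numerically
        return tuple(int(part) for part in v.split('.'))
    return parse(latest) > parse(current)
```

String comparison would get this wrong ('4.0.10' < '4.0.9' lexically), which is why a parsed comparison is used.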

    @HasPermissionAllDecorator('hg.admin')
    def settings_supervisor(self):
        c.rhodecode_ini = rhodecode.CONFIG
        c.active = 'supervisor'

        c.supervisor_procs = OrderedDict([
            (SUPERVISOR_MASTER, {}),
        ])

        c.log_size = 10240
        supervisor = SupervisorModel()

        _connection = supervisor.get_connection(
            c.rhodecode_ini.get('supervisor.uri'))
        c.connection_error = None
        try:
            _connection.supervisor.getAllProcessInfo()
        except Exception as e:
            c.connection_error = str(e)
            log.exception("Exception reading supervisor data")
            return render('admin/settings/settings.html')

        groupid = c.rhodecode_ini.get('supervisor.group_id')

        # feed our group processes to the main
        for proc in supervisor.get_group_processes(_connection, groupid):
            c.supervisor_procs[proc['name']] = {}

        for k in c.supervisor_procs.keys():
            try:
                # master process info
                if k == SUPERVISOR_MASTER:
                    _data = supervisor.get_master_state(_connection)
                    _data['name'] = 'supervisor master'
                    _data['description'] = 'pid %s, id: %s, ver: %s' % (
                        _data['pid'], _data['id'], _data['ver'])
                    c.supervisor_procs[k] = _data
                else:
                    procid = groupid + ":" + k
                    c.supervisor_procs[k] = supervisor.get_process_info(_connection, procid)
            except Exception as e:
                log.exception("Exception reading supervisor data")
                c.supervisor_procs[k] = {'_rhodecode_error': str(e)}

        return render('admin/settings/settings.html')

    @HasPermissionAllDecorator('hg.admin')
    def settings_supervisor_log(self, procid):
        import rhodecode
        c.rhodecode_ini = rhodecode.CONFIG
        c.active = 'supervisor_tail'

        supervisor = SupervisorModel()
        _connection = supervisor.get_connection(c.rhodecode_ini.get('supervisor.uri'))
        groupid = c.rhodecode_ini.get('supervisor.group_id')
        procid = groupid + ":" + procid if procid != SUPERVISOR_MASTER else procid

        c.log_size = 10240
        offset = abs(safe_int(request.GET.get('offset', c.log_size))) * -1
        c.log = supervisor.read_process_log(_connection, procid, offset, 0)

        return render('admin/settings/settings.html')
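`settings_supervisor_log` passes a negative offset so the read starts `log_size` bytes before the end of the process log, i.e. a tail read. A self-contained sketch of that offset computation, with a minimal stand-in for `rhodecode.lib.utils2.safe_int` (an assumption; the real helper may differ in detail):

```python
def safe_int(val, default=None):
    # minimal stand-in for rhodecode.lib.utils2.safe_int
    try:
        return int(val)
    except (TypeError, ValueError):
        return default

def tail_offset(raw_offset, log_size=10240):
    # query parameters arrive as strings; fall back to log_size on bad or
    # missing input, then negate so the read starts that far before the end
    return abs(safe_int(raw_offset, log_size)) * -1
```

`abs(...) * -1` guarantees a negative offset even if a caller passes a negative number explicitly.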

     @HasPermissionAllDecorator('hg.admin')
     @auth.CSRFRequired()
     def settings_labs_update(self):
         """POST /admin/settings/labs: All items in the collection"""
         # url('admin_settings/labs', method={'POST'})
         c.active = 'labs'

         application_form = LabsSettingsForm()()
         try:
             form_result = application_form.to_python(dict(request.POST))
         except formencode.Invalid as errors:
             h.flash(
                 _('Some form inputs contain invalid data.'),
                 category='error')
             return htmlfill.render(
                 render('admin/settings/settings.html'),
                 defaults=errors.value,
                 errors=errors.error_dict or {},
                 prefix_error=False,
                 encoding='UTF-8',
                 force_defaults=False
             )

         try:
             session = Session()
             for setting in _LAB_SETTINGS:
                 setting_name = setting.key[len('rhodecode_'):]
                 sett = SettingsModel().create_or_update_setting(
                     setting_name, form_result[setting.key], setting.type)
                 session.add(sett)

         except Exception:
             log.exception('Exception while updating lab settings')
             h.flash(_('Error occurred during updating labs settings'),
                     category='error')
         else:
             Session().commit()
+            SettingsModel().invalidate_settings_cache()
             h.flash(_('Updated Labs settings'), category='success')
             return redirect(url('admin_settings_labs'))

         return htmlfill.render(
             render('admin/settings/settings.html'),
             defaults=self._form_defaults(),
             encoding='UTF-8',
             force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    def settings_labs(self):
        """GET /admin/settings/labs: All items in the collection"""
        # url('admin_settings_labs')
        if not c.labs_active:
            redirect(url('admin_settings'))

        c.active = 'labs'
        c.lab_settings = _LAB_SETTINGS

        return htmlfill.render(
            render('admin/settings/settings.html'),
            defaults=self._form_defaults(),
            encoding='UTF-8',
            force_defaults=False)

-    @HasPermissionAllDecorator('hg.admin')
-    def settings_open_source(self):
-        # url('admin_settings_open_source')
-
-        c.active = 'open_source'
-        c.opensource_licenses = collections.OrderedDict(
-            sorted(read_opensource_licenses().items(), key=lambda t: t[0]))
-
-        return htmlfill.render(
-            render('admin/settings/settings.html'),
-            defaults=self._form_defaults(),
-            encoding='UTF-8',
-            force_defaults=False)
-
    def _form_defaults(self):
        defaults = SettingsModel().get_all_settings()
        defaults.update(self._get_hg_ui_settings())
        defaults.update({
            'new_svn_branch': '',
            'new_svn_tag': '',
        })
        return defaults


# :param key: name of the setting including the 'rhodecode_' prefix
# :param type: the RhodeCodeSetting type to use.
# :param group: the i18ned group in which we should display this setting
# :param label: the i18ned label we should display for this setting
# :param help: the i18ned help we should display for this setting
LabSetting = collections.namedtuple(
    'LabSetting', ('key', 'type', 'group', 'label', 'help'))


# This list has to be kept in sync with the form
# rhodecode.model.forms.LabsSettingsForm.
_LAB_SETTINGS = [
    LabSetting(
        key='rhodecode_hg_use_rebase_for_merging',
        type='bool',
        group=lazy_ugettext('Mercurial server-side merge'),
        label=lazy_ugettext('Use rebase instead of creating a merge commit when merging via web interface'),
        help=''  # Do not translate the empty string!
    ),
    LabSetting(
        key='rhodecode_proxy_subversion_http_requests',
        type='bool',
        group=lazy_ugettext('Subversion HTTP Support'),
        label=lazy_ugettext('Proxy subversion HTTP requests'),
        help=''  # Do not translate the empty string!
    ),
    LabSetting(
        key='rhodecode_subversion_http_server_url',
        type='str',
        group=lazy_ugettext('Subversion HTTP Server URL'),
        label='',  # Do not translate the empty string!
        help=lazy_ugettext('e.g. http://localhost:8080/')
    ),
]
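Each `LabSetting.key` carries the `rhodecode_` prefix used by forms, while the update handler strips it before persisting, because settings are stored without the prefix. A minimal illustration using the same namedtuple shape:

```python
import collections

LabSetting = collections.namedtuple(
    'LabSetting', ('key', 'type', 'group', 'label', 'help'))

setting = LabSetting(
    key='rhodecode_hg_use_rebase_for_merging', type='bool',
    group='Mercurial server-side merge', label='Use rebase', help='')

# the persisted name drops the 'rhodecode_' form prefix
storage_name = setting.key[len('rhodecode_'):]
```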


-NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url'])
-
-
-class NavEntry(object):
-
-    def __init__(self, key, name, view_name, pyramid=False):
-        self.key = key
-        self.name = name
-        self.view_name = view_name
-        self.pyramid = pyramid
-
-    def generate_url(self, request):
-        if self.pyramid:
-            if hasattr(request, 'route_path'):
-                return request.route_path(self.view_name)
-            else:
-                # TODO: johbo: Remove this after migrating to pyramid.
-                # We need the pyramid request here to generate URLs to pyramid
-                # views from within pylons views.
-                from pyramid.threadlocal import get_current_request
-                pyramid_request = get_current_request()
-                return pyramid_request.route_path(self.view_name)
-        else:
-            return url(self.view_name)
-
-
-class NavigationRegistry(object):
-
-    _base_entries = [
-        NavEntry('global', lazy_ugettext('Global'), 'admin_settings_global'),
-        NavEntry('vcs', lazy_ugettext('VCS'), 'admin_settings_vcs'),
-        NavEntry('visual', lazy_ugettext('Visual'), 'admin_settings_visual'),
-        NavEntry('mapping', lazy_ugettext('Remap and Rescan'),
-                 'admin_settings_mapping'),
-        NavEntry('issuetracker', lazy_ugettext('Issue Tracker'),
-                 'admin_settings_issuetracker'),
-        NavEntry('email', lazy_ugettext('Email'), 'admin_settings_email'),
-        NavEntry('hooks', lazy_ugettext('Hooks'), 'admin_settings_hooks'),
-        NavEntry('search', lazy_ugettext('Full Text Search'),
-                 'admin_settings_search'),
-        NavEntry('system', lazy_ugettext('System Info'),
-                 'admin_settings_system'),
-        NavEntry('open_source', lazy_ugettext('Open Source Licenses'),
-                 'admin_settings_open_source'),
-        # TODO: marcink: we disable supervisor now until the supervisor stats
-        # page is fixed in the nix configuration
-        # NavEntry('supervisor', lazy_ugettext('Supervisor'),
-        #          'admin_settings_supervisor'),
-    ]
-
-    def __init__(self):
-        self._registered_entries = collections.OrderedDict([
-            (item.key, item) for item in self.__class__._base_entries
-        ])
-
-        # Add the labs entry when it's activated.
-        labs_active = str2bool(
-            rhodecode.CONFIG.get('labs_settings_active', 'false'))
-        if labs_active:
-            self.add_entry(
-                NavEntry('labs', lazy_ugettext('Labs'), 'admin_settings_labs'))
-
-    def add_entry(self, entry):
-        self._registered_entries[entry.key] = entry
-
-    def get_navlist(self, request):
-        navlist = [NavListEntry(i.key, i.name, i.generate_url(request))
-                   for i in self._registered_entries.values()]
-        return navlist
-
-
-navigation = NavigationRegistry()
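The `NavigationRegistry` removed above keeps entries in an `OrderedDict` keyed by entry key: registering an existing key replaces the entry in place while insertion order is preserved, and new keys append. A condensed, self-contained sketch of that registry pattern (simplified tuples instead of `NavEntry` objects):

```python
import collections

class Registry(object):
    """Ordered key->entry registry; re-registering a key replaces in place."""

    def __init__(self, entries):
        self._entries = collections.OrderedDict((e[0], e) for e in entries)

    def add_entry(self, entry):
        self._entries[entry[0]] = entry

    def keys(self):
        return list(self._entries)
```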
@@ -1,480 +1,480 b'' | |||||
# -*- coding: utf-8 -*-

# Copyright (C) 2011-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

"""
User Groups crud controller for pylons
"""

import logging
import formencode

from formencode import htmlfill
from pylons import request, tmpl_context as c, url, config
from pylons.controllers.util import redirect
from pylons.i18n.translation import _

from sqlalchemy.orm import joinedload

from rhodecode.lib import auth
from rhodecode.lib import helpers as h
from rhodecode.lib.exceptions import UserGroupAssignedException,\
    RepoGroupAssignmentError
from rhodecode.lib.utils import jsonify, action_logger
from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int
from rhodecode.lib.auth import (
    LoginRequired, NotAnonymous, HasUserGroupPermissionAnyDecorator,
    HasPermissionAnyDecorator)
from rhodecode.lib.base import BaseController, render
from rhodecode.model.permission import PermissionModel
from rhodecode.model.scm import UserGroupList
from rhodecode.model.user_group import UserGroupModel
from rhodecode.model.db import (
    User, UserGroup, UserGroupRepoToPerm, UserGroupRepoGroupToPerm)
from rhodecode.model.forms import (
    UserGroupForm, UserGroupPermsForm, UserIndividualPermissionsForm,
    UserPermissionsForm)
from rhodecode.model.meta import Session
from rhodecode.lib.utils import action_logger
from rhodecode.lib.ext_json import json

log = logging.getLogger(__name__)


class UserGroupsController(BaseController):
    """REST Controller styled on the Atom Publishing Protocol"""

    @LoginRequired()
    def __before__(self):
        super(UserGroupsController, self).__before__()
        c.available_permissions = config['available_permissions']
        PermissionModel().set_global_permission_choices(c, translator=_)

    def __load_data(self, user_group_id):
        c.group_members_obj = [x.user for x in c.user_group.members]
        c.group_members_obj.sort(key=lambda u: u.username.lower())

        c.group_members = [(x.user_id, x.username) for x in c.group_members_obj]

        c.available_members = [(x.user_id, x.username)
                               for x in User.query().all()]
        c.available_members.sort(key=lambda u: u[1].lower())

    def __load_defaults(self, user_group_id):
        """
        Load defaults settings for edit, and update

        :param user_group_id:
        """
        user_group = UserGroup.get_or_404(user_group_id)
        data = user_group.get_dict()
        # fill owner
88 | if user_group.user: |
|
88 | if user_group.user: | |
89 | data.update({'user': user_group.user.username}) |
|
89 | data.update({'user': user_group.user.username}) | |
90 | else: |
|
90 | else: | |
91 | replacement_user = User.get_first_admin().username |
|
91 | replacement_user = User.get_first_super_admin().username | |
92 | data.update({'user': replacement_user}) |
|
92 | data.update({'user': replacement_user}) | |
93 | return data |
|
93 | return data | |
94 |
|
94 | |||
95 | def _revoke_perms_on_yourself(self, form_result): |
|
95 | def _revoke_perms_on_yourself(self, form_result): | |
96 | _updates = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
96 | _updates = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), | |
97 | form_result['perm_updates']) |
|
97 | form_result['perm_updates']) | |
98 | _additions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
98 | _additions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), | |
99 | form_result['perm_additions']) |
|
99 | form_result['perm_additions']) | |
100 | _deletions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
100 | _deletions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), | |
101 | form_result['perm_deletions']) |
|
101 | form_result['perm_deletions']) | |
102 | admin_perm = 'usergroup.admin' |
|
102 | admin_perm = 'usergroup.admin' | |
103 | if _updates and _updates[0][1] != admin_perm or \ |
|
103 | if _updates and _updates[0][1] != admin_perm or \ | |
104 | _additions and _additions[0][1] != admin_perm or \ |
|
104 | _additions and _additions[0][1] != admin_perm or \ | |
105 | _deletions and _deletions[0][1] != admin_perm: |
|
105 | _deletions and _deletions[0][1] != admin_perm: | |
106 | return True |
|
106 | return True | |
107 | return False |
|
107 | return False | |
108 |
|
108 | |||
109 | # permission check inside |
|
109 | # permission check inside | |
110 | @NotAnonymous() |
|
110 | @NotAnonymous() | |
111 | def index(self): |
|
111 | def index(self): | |
112 | """GET /users_groups: All items in the collection""" |
|
112 | """GET /users_groups: All items in the collection""" | |
113 | # url('users_groups') |
|
113 | # url('users_groups') | |
114 |
|
114 | |||
115 | from rhodecode.lib.utils import PartialRenderer |
|
115 | from rhodecode.lib.utils import PartialRenderer | |
116 | _render = PartialRenderer('data_table/_dt_elements.html') |
|
116 | _render = PartialRenderer('data_table/_dt_elements.html') | |
117 |
|
117 | |||
118 | def user_group_name(user_group_id, user_group_name): |
|
118 | def user_group_name(user_group_id, user_group_name): | |
119 | return _render("user_group_name", user_group_id, user_group_name) |
|
119 | return _render("user_group_name", user_group_id, user_group_name) | |
120 |
|
120 | |||
121 | def user_group_actions(user_group_id, user_group_name): |
|
121 | def user_group_actions(user_group_id, user_group_name): | |
122 | return _render("user_group_actions", user_group_id, user_group_name) |
|
122 | return _render("user_group_actions", user_group_id, user_group_name) | |
123 |
|
123 | |||
124 | ## json generate |
|
124 | ## json generate | |
125 | group_iter = UserGroupList(UserGroup.query().all(), |
|
125 | group_iter = UserGroupList(UserGroup.query().all(), | |
126 | perm_set=['usergroup.admin']) |
|
126 | perm_set=['usergroup.admin']) | |
127 |
|
127 | |||
128 | user_groups_data = [] |
|
128 | user_groups_data = [] | |
129 | for user_gr in group_iter: |
|
129 | for user_gr in group_iter: | |
130 | user_groups_data.append({ |
|
130 | user_groups_data.append({ | |
131 | "group_name": user_group_name( |
|
131 | "group_name": user_group_name( | |
132 | user_gr.users_group_id, h.escape(user_gr.users_group_name)), |
|
132 | user_gr.users_group_id, h.escape(user_gr.users_group_name)), | |
133 | "group_name_raw": user_gr.users_group_name, |
|
133 | "group_name_raw": user_gr.users_group_name, | |
134 | "desc": h.escape(user_gr.user_group_description), |
|
134 | "desc": h.escape(user_gr.user_group_description), | |
135 | "members": len(user_gr.members), |
|
135 | "members": len(user_gr.members), | |
136 | "active": h.bool2icon(user_gr.users_group_active), |
|
136 | "active": h.bool2icon(user_gr.users_group_active), | |
137 | "owner": h.escape(h.link_to_user(user_gr.user.username)), |
|
137 | "owner": h.escape(h.link_to_user(user_gr.user.username)), | |
138 | "action": user_group_actions( |
|
138 | "action": user_group_actions( | |
139 | user_gr.users_group_id, user_gr.users_group_name) |
|
139 | user_gr.users_group_id, user_gr.users_group_name) | |
140 | }) |
|
140 | }) | |
141 |
|
141 | |||
142 | c.data = json.dumps(user_groups_data) |
|
142 | c.data = json.dumps(user_groups_data) | |
143 | return render('admin/user_groups/user_groups.html') |
|
143 | return render('admin/user_groups/user_groups.html') | |
144 |
|
144 | |||
145 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') |
|
145 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') | |
146 | @auth.CSRFRequired() |
|
146 | @auth.CSRFRequired() | |
147 | def create(self): |
|
147 | def create(self): | |
148 | """POST /users_groups: Create a new item""" |
|
148 | """POST /users_groups: Create a new item""" | |
149 | # url('users_groups') |
|
149 | # url('users_groups') | |
150 |
|
150 | |||
151 | users_group_form = UserGroupForm()() |
|
151 | users_group_form = UserGroupForm()() | |
152 | try: |
|
152 | try: | |
153 | form_result = users_group_form.to_python(dict(request.POST)) |
|
153 | form_result = users_group_form.to_python(dict(request.POST)) | |
154 | user_group = UserGroupModel().create( |
|
154 | user_group = UserGroupModel().create( | |
155 | name=form_result['users_group_name'], |
|
155 | name=form_result['users_group_name'], | |
156 | description=form_result['user_group_description'], |
|
156 | description=form_result['user_group_description'], | |
157 | owner=c.rhodecode_user.user_id, |
|
157 | owner=c.rhodecode_user.user_id, | |
158 | active=form_result['users_group_active']) |
|
158 | active=form_result['users_group_active']) | |
159 | Session().flush() |
|
159 | Session().flush() | |
160 |
|
160 | |||
161 | user_group_name = form_result['users_group_name'] |
|
161 | user_group_name = form_result['users_group_name'] | |
162 | action_logger(c.rhodecode_user, |
|
162 | action_logger(c.rhodecode_user, | |
163 | 'admin_created_users_group:%s' % user_group_name, |
|
163 | 'admin_created_users_group:%s' % user_group_name, | |
164 | None, self.ip_addr, self.sa) |
|
164 | None, self.ip_addr, self.sa) | |
165 | user_group_link = h.link_to(h.escape(user_group_name), |
|
165 | user_group_link = h.link_to(h.escape(user_group_name), | |
166 | url('edit_users_group', |
|
166 | url('edit_users_group', | |
167 | user_group_id=user_group.users_group_id)) |
|
167 | user_group_id=user_group.users_group_id)) | |
168 | h.flash(h.literal(_('Created user group %(user_group_link)s') |
|
168 | h.flash(h.literal(_('Created user group %(user_group_link)s') | |
169 | % {'user_group_link': user_group_link}), |
|
169 | % {'user_group_link': user_group_link}), | |
170 | category='success') |
|
170 | category='success') | |
171 | Session().commit() |
|
171 | Session().commit() | |
172 | except formencode.Invalid as errors: |
|
172 | except formencode.Invalid as errors: | |
173 | return htmlfill.render( |
|
173 | return htmlfill.render( | |
174 | render('admin/user_groups/user_group_add.html'), |
|
174 | render('admin/user_groups/user_group_add.html'), | |
175 | defaults=errors.value, |
|
175 | defaults=errors.value, | |
176 | errors=errors.error_dict or {}, |
|
176 | errors=errors.error_dict or {}, | |
177 | prefix_error=False, |
|
177 | prefix_error=False, | |
178 | encoding="UTF-8", |
|
178 | encoding="UTF-8", | |
179 | force_defaults=False) |
|
179 | force_defaults=False) | |
180 | except Exception: |
|
180 | except Exception: | |
181 | log.exception("Exception creating user group") |
|
181 | log.exception("Exception creating user group") | |
182 | h.flash(_('Error occurred during creation of user group %s') \ |
|
182 | h.flash(_('Error occurred during creation of user group %s') \ | |
183 | % request.POST.get('users_group_name'), category='error') |
|
183 | % request.POST.get('users_group_name'), category='error') | |
184 |
|
184 | |||
185 | return redirect( |
|
185 | return redirect( | |
186 | url('edit_users_group', user_group_id=user_group.users_group_id)) |
|
186 | url('edit_users_group', user_group_id=user_group.users_group_id)) | |
187 |
|
187 | |||
188 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') |
|
188 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') | |
189 | def new(self): |
|
189 | def new(self): | |
190 | """GET /user_groups/new: Form to create a new item""" |
|
190 | """GET /user_groups/new: Form to create a new item""" | |
191 | # url('new_users_group') |
|
191 | # url('new_users_group') | |
192 | return render('admin/user_groups/user_group_add.html') |
|
192 | return render('admin/user_groups/user_group_add.html') | |
193 |
|
193 | |||
194 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
194 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
195 | @auth.CSRFRequired() |
|
195 | @auth.CSRFRequired() | |
196 | def update(self, user_group_id): |
|
196 | def update(self, user_group_id): | |
197 | """PUT /user_groups/user_group_id: Update an existing item""" |
|
197 | """PUT /user_groups/user_group_id: Update an existing item""" | |
198 | # Forms posted to this method should contain a hidden field: |
|
198 | # Forms posted to this method should contain a hidden field: | |
199 | # <input type="hidden" name="_method" value="PUT" /> |
|
199 | # <input type="hidden" name="_method" value="PUT" /> | |
200 | # Or using helpers: |
|
200 | # Or using helpers: | |
201 | # h.form(url('users_group', user_group_id=ID), |
|
201 | # h.form(url('users_group', user_group_id=ID), | |
202 | # method='put') |
|
202 | # method='put') | |
203 | # url('users_group', user_group_id=ID) |
|
203 | # url('users_group', user_group_id=ID) | |
204 |
|
204 | |||
205 | user_group_id = safe_int(user_group_id) |
|
205 | user_group_id = safe_int(user_group_id) | |
206 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
206 | c.user_group = UserGroup.get_or_404(user_group_id) | |
207 | c.active = 'settings' |
|
207 | c.active = 'settings' | |
208 | self.__load_data(user_group_id) |
|
208 | self.__load_data(user_group_id) | |
209 |
|
209 | |||
210 | available_members = [safe_unicode(x[0]) for x in c.available_members] |
|
210 | available_members = [safe_unicode(x[0]) for x in c.available_members] | |
211 |
|
211 | |||
212 |
users_group_form = UserGroupForm( |
|
212 | users_group_form = UserGroupForm( | |
213 |
|
|
213 | edit=True, old_data=c.user_group.get_dict(), | |
214 |
|
|
214 | available_members=available_members, allow_disabled=True)() | |
215 |
|
215 | |||
216 | try: |
|
216 | try: | |
217 | form_result = users_group_form.to_python(request.POST) |
|
217 | form_result = users_group_form.to_python(request.POST) | |
218 | UserGroupModel().update(c.user_group, form_result) |
|
218 | UserGroupModel().update(c.user_group, form_result) | |
219 | gr = form_result['users_group_name'] |
|
219 | gr = form_result['users_group_name'] | |
220 | action_logger(c.rhodecode_user, |
|
220 | action_logger(c.rhodecode_user, | |
221 | 'admin_updated_users_group:%s' % gr, |
|
221 | 'admin_updated_users_group:%s' % gr, | |
222 | None, self.ip_addr, self.sa) |
|
222 | None, self.ip_addr, self.sa) | |
223 | h.flash(_('Updated user group %s') % gr, category='success') |
|
223 | h.flash(_('Updated user group %s') % gr, category='success') | |
224 | Session().commit() |
|
224 | Session().commit() | |
225 | except formencode.Invalid as errors: |
|
225 | except formencode.Invalid as errors: | |
226 | defaults = errors.value |
|
226 | defaults = errors.value | |
227 | e = errors.error_dict or {} |
|
227 | e = errors.error_dict or {} | |
228 |
|
228 | |||
229 | return htmlfill.render( |
|
229 | return htmlfill.render( | |
230 | render('admin/user_groups/user_group_edit.html'), |
|
230 | render('admin/user_groups/user_group_edit.html'), | |
231 | defaults=defaults, |
|
231 | defaults=defaults, | |
232 | errors=e, |
|
232 | errors=e, | |
233 | prefix_error=False, |
|
233 | prefix_error=False, | |
234 | encoding="UTF-8", |
|
234 | encoding="UTF-8", | |
235 | force_defaults=False) |
|
235 | force_defaults=False) | |
236 | except Exception: |
|
236 | except Exception: | |
237 | log.exception("Exception during update of user group") |
|
237 | log.exception("Exception during update of user group") | |
238 | h.flash(_('Error occurred during update of user group %s') |
|
238 | h.flash(_('Error occurred during update of user group %s') | |
239 | % request.POST.get('users_group_name'), category='error') |
|
239 | % request.POST.get('users_group_name'), category='error') | |
240 |
|
240 | |||
241 | return redirect(url('edit_users_group', user_group_id=user_group_id)) |
|
241 | return redirect(url('edit_users_group', user_group_id=user_group_id)) | |
242 |
|
242 | |||
243 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
243 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
244 | @auth.CSRFRequired() |
|
244 | @auth.CSRFRequired() | |
245 | def delete(self, user_group_id): |
|
245 | def delete(self, user_group_id): | |
246 | """DELETE /user_groups/user_group_id: Delete an existing item""" |
|
246 | """DELETE /user_groups/user_group_id: Delete an existing item""" | |
247 | # Forms posted to this method should contain a hidden field: |
|
247 | # Forms posted to this method should contain a hidden field: | |
248 | # <input type="hidden" name="_method" value="DELETE" /> |
|
248 | # <input type="hidden" name="_method" value="DELETE" /> | |
249 | # Or using helpers: |
|
249 | # Or using helpers: | |
250 | # h.form(url('users_group', user_group_id=ID), |
|
250 | # h.form(url('users_group', user_group_id=ID), | |
251 | # method='delete') |
|
251 | # method='delete') | |
252 | # url('users_group', user_group_id=ID) |
|
252 | # url('users_group', user_group_id=ID) | |
253 | user_group_id = safe_int(user_group_id) |
|
253 | user_group_id = safe_int(user_group_id) | |
254 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
254 | c.user_group = UserGroup.get_or_404(user_group_id) | |
255 | force = str2bool(request.POST.get('force')) |
|
255 | force = str2bool(request.POST.get('force')) | |
256 |
|
256 | |||
257 | try: |
|
257 | try: | |
258 | UserGroupModel().delete(c.user_group, force=force) |
|
258 | UserGroupModel().delete(c.user_group, force=force) | |
259 | Session().commit() |
|
259 | Session().commit() | |
260 | h.flash(_('Successfully deleted user group'), category='success') |
|
260 | h.flash(_('Successfully deleted user group'), category='success') | |
261 | except UserGroupAssignedException as e: |
|
261 | except UserGroupAssignedException as e: | |
262 | h.flash(str(e), category='error') |
|
262 | h.flash(str(e), category='error') | |
263 | except Exception: |
|
263 | except Exception: | |
264 | log.exception("Exception during deletion of user group") |
|
264 | log.exception("Exception during deletion of user group") | |
265 | h.flash(_('An error occurred during deletion of user group'), |
|
265 | h.flash(_('An error occurred during deletion of user group'), | |
266 | category='error') |
|
266 | category='error') | |
267 | return redirect(url('users_groups')) |
|
267 | return redirect(url('users_groups')) | |
268 |
|
268 | |||
269 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
269 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
270 | def edit(self, user_group_id): |
|
270 | def edit(self, user_group_id): | |
271 | """GET /user_groups/user_group_id/edit: Form to edit an existing item""" |
|
271 | """GET /user_groups/user_group_id/edit: Form to edit an existing item""" | |
272 | # url('edit_users_group', user_group_id=ID) |
|
272 | # url('edit_users_group', user_group_id=ID) | |
273 |
|
273 | |||
274 | user_group_id = safe_int(user_group_id) |
|
274 | user_group_id = safe_int(user_group_id) | |
275 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
275 | c.user_group = UserGroup.get_or_404(user_group_id) | |
276 | c.active = 'settings' |
|
276 | c.active = 'settings' | |
277 | self.__load_data(user_group_id) |
|
277 | self.__load_data(user_group_id) | |
278 |
|
278 | |||
279 | defaults = self.__load_defaults(user_group_id) |
|
279 | defaults = self.__load_defaults(user_group_id) | |
280 |
|
280 | |||
281 | return htmlfill.render( |
|
281 | return htmlfill.render( | |
282 | render('admin/user_groups/user_group_edit.html'), |
|
282 | render('admin/user_groups/user_group_edit.html'), | |
283 | defaults=defaults, |
|
283 | defaults=defaults, | |
284 | encoding="UTF-8", |
|
284 | encoding="UTF-8", | |
285 | force_defaults=False |
|
285 | force_defaults=False | |
286 | ) |
|
286 | ) | |
287 |
|
287 | |||
288 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
288 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
289 | def edit_perms(self, user_group_id): |
|
289 | def edit_perms(self, user_group_id): | |
290 | user_group_id = safe_int(user_group_id) |
|
290 | user_group_id = safe_int(user_group_id) | |
291 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
291 | c.user_group = UserGroup.get_or_404(user_group_id) | |
292 | c.active = 'perms' |
|
292 | c.active = 'perms' | |
293 |
|
293 | |||
294 | defaults = {} |
|
294 | defaults = {} | |
295 | # fill user group users |
|
295 | # fill user group users | |
296 | for p in c.user_group.user_user_group_to_perm: |
|
296 | for p in c.user_group.user_user_group_to_perm: | |
297 | defaults.update({'u_perm_%s' % p.user.user_id: |
|
297 | defaults.update({'u_perm_%s' % p.user.user_id: | |
298 | p.permission.permission_name}) |
|
298 | p.permission.permission_name}) | |
299 |
|
299 | |||
300 | for p in c.user_group.user_group_user_group_to_perm: |
|
300 | for p in c.user_group.user_group_user_group_to_perm: | |
301 | defaults.update({'g_perm_%s' % p.user_group.users_group_id: |
|
301 | defaults.update({'g_perm_%s' % p.user_group.users_group_id: | |
302 | p.permission.permission_name}) |
|
302 | p.permission.permission_name}) | |
303 |
|
303 | |||
304 | return htmlfill.render( |
|
304 | return htmlfill.render( | |
305 | render('admin/user_groups/user_group_edit.html'), |
|
305 | render('admin/user_groups/user_group_edit.html'), | |
306 | defaults=defaults, |
|
306 | defaults=defaults, | |
307 | encoding="UTF-8", |
|
307 | encoding="UTF-8", | |
308 | force_defaults=False |
|
308 | force_defaults=False | |
309 | ) |
|
309 | ) | |
310 |
|
310 | |||
311 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
311 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
312 | @auth.CSRFRequired() |
|
312 | @auth.CSRFRequired() | |
313 | def update_perms(self, user_group_id): |
|
313 | def update_perms(self, user_group_id): | |
314 | """ |
|
314 | """ | |
315 | grant permission for given usergroup |
|
315 | grant permission for given usergroup | |
316 |
|
316 | |||
317 | :param user_group_id: |
|
317 | :param user_group_id: | |
318 | """ |
|
318 | """ | |
319 | user_group_id = safe_int(user_group_id) |
|
319 | user_group_id = safe_int(user_group_id) | |
320 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
320 | c.user_group = UserGroup.get_or_404(user_group_id) | |
321 | form = UserGroupPermsForm()().to_python(request.POST) |
|
321 | form = UserGroupPermsForm()().to_python(request.POST) | |
322 |
|
322 | |||
323 | if not c.rhodecode_user.is_admin: |
|
323 | if not c.rhodecode_user.is_admin: | |
324 | if self._revoke_perms_on_yourself(form): |
|
324 | if self._revoke_perms_on_yourself(form): | |
325 | msg = _('Cannot change permission for yourself as admin') |
|
325 | msg = _('Cannot change permission for yourself as admin') | |
326 | h.flash(msg, category='warning') |
|
326 | h.flash(msg, category='warning') | |
327 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) |
|
327 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) | |
328 |
|
328 | |||
329 | try: |
|
329 | try: | |
330 | UserGroupModel().update_permissions(user_group_id, |
|
330 | UserGroupModel().update_permissions(user_group_id, | |
331 | form['perm_additions'], form['perm_updates'], form['perm_deletions']) |
|
331 | form['perm_additions'], form['perm_updates'], form['perm_deletions']) | |
332 | except RepoGroupAssignmentError: |
|
332 | except RepoGroupAssignmentError: | |
333 | h.flash(_('Target group cannot be the same'), category='error') |
|
333 | h.flash(_('Target group cannot be the same'), category='error') | |
334 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) |
|
334 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) | |
335 | #TODO: implement this |
|
335 | #TODO: implement this | |
336 | #action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', |
|
336 | #action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', | |
337 | # repo_name, self.ip_addr, self.sa) |
|
337 | # repo_name, self.ip_addr, self.sa) | |
338 | Session().commit() |
|
338 | Session().commit() | |
339 | h.flash(_('User Group permissions updated'), category='success') |
|
339 | h.flash(_('User Group permissions updated'), category='success') | |
340 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) |
|
340 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) | |
341 |
|
341 | |||
342 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
342 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
343 | def edit_perms_summary(self, user_group_id): |
|
343 | def edit_perms_summary(self, user_group_id): | |
344 | user_group_id = safe_int(user_group_id) |
|
344 | user_group_id = safe_int(user_group_id) | |
345 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
345 | c.user_group = UserGroup.get_or_404(user_group_id) | |
346 | c.active = 'perms_summary' |
|
346 | c.active = 'perms_summary' | |
347 | permissions = { |
|
347 | permissions = { | |
348 | 'repositories': {}, |
|
348 | 'repositories': {}, | |
349 | 'repositories_groups': {}, |
|
349 | 'repositories_groups': {}, | |
350 | } |
|
350 | } | |
351 | ugroup_repo_perms = UserGroupRepoToPerm.query()\ |
|
351 | ugroup_repo_perms = UserGroupRepoToPerm.query()\ | |
352 | .options(joinedload(UserGroupRepoToPerm.permission))\ |
|
352 | .options(joinedload(UserGroupRepoToPerm.permission))\ | |
353 | .options(joinedload(UserGroupRepoToPerm.repository))\ |
|
353 | .options(joinedload(UserGroupRepoToPerm.repository))\ | |
354 | .filter(UserGroupRepoToPerm.users_group_id == user_group_id)\ |
|
354 | .filter(UserGroupRepoToPerm.users_group_id == user_group_id)\ | |
355 | .all() |
|
355 | .all() | |
356 |
|
356 | |||
357 | for gr in ugroup_repo_perms: |
|
357 | for gr in ugroup_repo_perms: | |
358 | permissions['repositories'][gr.repository.repo_name] \ |
|
358 | permissions['repositories'][gr.repository.repo_name] \ | |
359 | = gr.permission.permission_name |
|
359 | = gr.permission.permission_name | |
360 |
|
360 | |||
361 | ugroup_group_perms = UserGroupRepoGroupToPerm.query()\ |
|
361 | ugroup_group_perms = UserGroupRepoGroupToPerm.query()\ | |
362 | .options(joinedload(UserGroupRepoGroupToPerm.permission))\ |
|
362 | .options(joinedload(UserGroupRepoGroupToPerm.permission))\ | |
363 | .options(joinedload(UserGroupRepoGroupToPerm.group))\ |
|
363 | .options(joinedload(UserGroupRepoGroupToPerm.group))\ | |
364 | .filter(UserGroupRepoGroupToPerm.users_group_id == user_group_id)\ |
|
364 | .filter(UserGroupRepoGroupToPerm.users_group_id == user_group_id)\ | |
365 | .all() |
|
365 | .all() | |
366 |
|
366 | |||
367 | for gr in ugroup_group_perms: |
|
367 | for gr in ugroup_group_perms: | |
368 | permissions['repositories_groups'][gr.group.group_name] \ |
|
368 | permissions['repositories_groups'][gr.group.group_name] \ | |
369 | = gr.permission.permission_name |
|
369 | = gr.permission.permission_name | |
370 | c.permissions = permissions |
|
370 | c.permissions = permissions | |
371 | return render('admin/user_groups/user_group_edit.html') |
|
371 | return render('admin/user_groups/user_group_edit.html') | |
372 |
|
372 | |||
373 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
373 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
374 | def edit_global_perms(self, user_group_id): |
|
374 | def edit_global_perms(self, user_group_id): | |
375 | user_group_id = safe_int(user_group_id) |
|
375 | user_group_id = safe_int(user_group_id) | |
376 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
376 | c.user_group = UserGroup.get_or_404(user_group_id) | |
377 | c.active = 'global_perms' |
|
377 | c.active = 'global_perms' | |
378 |
|
378 | |||
379 | c.default_user = User.get_default_user() |
|
379 | c.default_user = User.get_default_user() | |
380 | defaults = c.user_group.get_dict() |
|
380 | defaults = c.user_group.get_dict() | |
381 | defaults.update(c.default_user.get_default_perms(suffix='_inherited')) |
|
381 | defaults.update(c.default_user.get_default_perms(suffix='_inherited')) | |
382 | defaults.update(c.user_group.get_default_perms()) |
|
382 | defaults.update(c.user_group.get_default_perms()) | |
383 |
|
383 | |||
384 | return htmlfill.render( |
|
384 | return htmlfill.render( | |
385 | render('admin/user_groups/user_group_edit.html'), |
|
385 | render('admin/user_groups/user_group_edit.html'), | |
386 | defaults=defaults, |
|
386 | defaults=defaults, | |
387 | encoding="UTF-8", |
|
387 | encoding="UTF-8", | |
388 | force_defaults=False |
|
388 | force_defaults=False | |
389 | ) |
|
389 | ) | |
390 |
|
390 | |||
391 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
391 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') | |
392 | @auth.CSRFRequired() |
|
392 | @auth.CSRFRequired() | |
393 | def update_global_perms(self, user_group_id): |
|
393 | def update_global_perms(self, user_group_id): | |
394 | """PUT /users_perm/user_group_id: Update an existing item""" |
|
394 | """PUT /users_perm/user_group_id: Update an existing item""" | |
395 | # url('users_group_perm', user_group_id=ID, method='put') |
|
395 | # url('users_group_perm', user_group_id=ID, method='put') | |
396 | user_group_id = safe_int(user_group_id) |
|
396 | user_group_id = safe_int(user_group_id) | |
397 | user_group = UserGroup.get_or_404(user_group_id) |
|
397 | user_group = UserGroup.get_or_404(user_group_id) | |
398 | c.active = 'global_perms' |
|
398 | c.active = 'global_perms' | |
399 |
|
399 | |||
400 | try: |
|
400 | try: | |
401 | # first stage that verifies the checkbox |
|
401 | # first stage that verifies the checkbox | |
402 | _form = UserIndividualPermissionsForm() |
|
402 | _form = UserIndividualPermissionsForm() | |
403 | form_result = _form.to_python(dict(request.POST)) |
|
403 | form_result = _form.to_python(dict(request.POST)) | |
404 | inherit_perms = form_result['inherit_default_permissions'] |
|
404 | inherit_perms = form_result['inherit_default_permissions'] | |
405 | user_group.inherit_default_permissions = inherit_perms |
|
405 | user_group.inherit_default_permissions = inherit_perms | |
406 | Session().add(user_group) |
|
406 | Session().add(user_group) | |
407 |
|
407 | |||
408 | if not inherit_perms: |
|
408 | if not inherit_perms: | |
409 | # only update the individual ones if we un check the flag |
|
409 | # only update the individual ones if we un check the flag | |
410 | _form = UserPermissionsForm( |
|
410 | _form = UserPermissionsForm( | |
411 | [x[0] for x in c.repo_create_choices], |
|
411 | [x[0] for x in c.repo_create_choices], | |
412 | [x[0] for x in c.repo_create_on_write_choices], |
|
412 | [x[0] for x in c.repo_create_on_write_choices], | |
413 | [x[0] for x in c.repo_group_create_choices], |
|
                [x[0] for x in c.repo_group_create_choices],
                [x[0] for x in c.user_group_create_choices],
                [x[0] for x in c.fork_choices],
                [x[0] for x in c.inherit_default_permission_choices])()

            form_result = _form.to_python(dict(request.POST))
            form_result.update({'perm_user_group_id': user_group.users_group_id})

            PermissionModel().update_user_group_permissions(form_result)

            Session().commit()
            h.flash(_('User Group global permissions updated successfully'),
                    category='success')

        except formencode.Invalid as errors:
            defaults = errors.value
            c.user_group = user_group
            return htmlfill.render(
                render('admin/user_groups/user_group_edit.html'),
                defaults=defaults,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)

        except Exception:
            log.exception("Exception during permissions saving")
            h.flash(_('An error occurred during permissions saving'),
                    category='error')

        return redirect(url('edit_user_group_global_perms', user_group_id=user_group_id))

    @HasUserGroupPermissionAnyDecorator('usergroup.admin')
    def edit_advanced(self, user_group_id):
        user_group_id = safe_int(user_group_id)
        c.user_group = UserGroup.get_or_404(user_group_id)
        c.active = 'advanced'
        c.group_members_obj = sorted(
            (x.user for x in c.user_group.members),
            key=lambda u: u.username.lower())

        c.group_to_repos = sorted(
            (x.repository for x in c.user_group.users_group_repo_to_perm),
            key=lambda u: u.repo_name.lower())

        c.group_to_repo_groups = sorted(
            (x.group for x in c.user_group.users_group_repo_group_to_perm),
            key=lambda u: u.group_name.lower())

        return render('admin/user_groups/user_group_edit.html')

    @HasUserGroupPermissionAnyDecorator('usergroup.admin')
    def edit_members(self, user_group_id):
        user_group_id = safe_int(user_group_id)
        c.user_group = UserGroup.get_or_404(user_group_id)
        c.active = 'members'
        c.group_members_obj = sorted((x.user for x in c.user_group.members),
                                     key=lambda u: u.username.lower())

        group_members = [(x.user_id, x.username) for x in c.group_members_obj]

        if request.is_xhr:
            return jsonify(lambda *a, **k: {
                'members': group_members
            })

        c.group_members = group_members
        return render('admin/user_groups/user_group_edit.html')
@@ -1,717 +1,719 b'' | |||||
1 | # -*- coding: utf-8 -*- |
|
1 | # -*- coding: utf-8 -*- | |
2 |
|
2 | |||
3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
3 | # Copyright (C) 2010-2016 RhodeCode GmbH | |
4 | # |
|
4 | # | |
5 | # This program is free software: you can redistribute it and/or modify |
|
5 | # This program is free software: you can redistribute it and/or modify | |
6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
7 | # (only), as published by the Free Software Foundation. |
|
7 | # (only), as published by the Free Software Foundation. | |
8 | # |
|
8 | # | |
9 | # This program is distributed in the hope that it will be useful, |
|
9 | # This program is distributed in the hope that it will be useful, | |
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
12 | # GNU General Public License for more details. |
|
12 | # GNU General Public License for more details. | |
13 | # |
|
13 | # | |
14 | # You should have received a copy of the GNU Affero General Public License |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
16 | # |
|
16 | # | |
17 | # This program is dual-licensed. If you wish to learn more about the |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
20 |
|
20 | |||
21 | """ |
|
21 | """ | |
22 | Users crud controller for pylons |
|
22 | Users crud controller for pylons | |
23 | """ |
|
23 | """ | |
24 |
|
24 | |||
25 | import logging |
|
25 | import logging | |
26 | import formencode |
|
26 | import formencode | |
27 |
|
27 | |||
28 | from formencode import htmlfill |
|
28 | from formencode import htmlfill | |
29 | from pylons import request, tmpl_context as c, url, config |
|
29 | from pylons import request, tmpl_context as c, url, config | |
30 | from pylons.controllers.util import redirect |
|
30 | from pylons.controllers.util import redirect | |
31 | from pylons.i18n.translation import _ |
|
31 | from pylons.i18n.translation import _ | |
32 |
|
32 | |||
33 | from rhodecode.authentication.plugins import auth_rhodecode |
|
33 | from rhodecode.authentication.plugins import auth_rhodecode | |
34 | from rhodecode.lib.exceptions import ( |
|
34 | from rhodecode.lib.exceptions import ( | |
35 | DefaultUserException, UserOwnsReposException, UserOwnsRepoGroupsException, |
|
35 | DefaultUserException, UserOwnsReposException, UserOwnsRepoGroupsException, | |
36 | UserOwnsUserGroupsException, UserCreationError) |
|
36 | UserOwnsUserGroupsException, UserCreationError) | |
37 | from rhodecode.lib import helpers as h |
|
37 | from rhodecode.lib import helpers as h | |
38 | from rhodecode.lib import auth |
|
38 | from rhodecode.lib import auth | |
39 | from rhodecode.lib.auth import ( |
|
39 | from rhodecode.lib.auth import ( | |
40 | LoginRequired, HasPermissionAllDecorator, AuthUser, generate_auth_token) |
|
40 | LoginRequired, HasPermissionAllDecorator, AuthUser, generate_auth_token) | |
41 | from rhodecode.lib.base import BaseController, render |
|
41 | from rhodecode.lib.base import BaseController, render | |
42 | from rhodecode.model.auth_token import AuthTokenModel |
|
42 | from rhodecode.model.auth_token import AuthTokenModel | |
43 |
|
43 | |||
44 | from rhodecode.model.db import ( |
|
44 | from rhodecode.model.db import ( | |
45 | PullRequestReviewers, User, UserEmailMap, UserIpMap, RepoGroup) |
|
45 | PullRequestReviewers, User, UserEmailMap, UserIpMap, RepoGroup) | |
46 | from rhodecode.model.forms import ( |
|
46 | from rhodecode.model.forms import ( | |
47 | UserForm, UserPermissionsForm, UserIndividualPermissionsForm) |
|
47 | UserForm, UserPermissionsForm, UserIndividualPermissionsForm) | |
48 | from rhodecode.model.user import UserModel |
|
48 | from rhodecode.model.user import UserModel | |
49 | from rhodecode.model.meta import Session |
|
49 | from rhodecode.model.meta import Session | |
50 | from rhodecode.model.permission import PermissionModel |
|
50 | from rhodecode.model.permission import PermissionModel | |
51 | from rhodecode.lib.utils import action_logger |
|
51 | from rhodecode.lib.utils import action_logger | |
52 | from rhodecode.lib.ext_json import json |
|
52 | from rhodecode.lib.ext_json import json | |
53 | from rhodecode.lib.utils2 import datetime_to_time, safe_int |
|
53 | from rhodecode.lib.utils2 import datetime_to_time, safe_int | |
54 |
|
54 | |||
55 | log = logging.getLogger(__name__) |
|
55 | log = logging.getLogger(__name__) | |
56 |
|
56 | |||
57 |
|
57 | |||
58 | class UsersController(BaseController): |
|
58 | class UsersController(BaseController): | |
59 | """REST Controller styled on the Atom Publishing Protocol""" |
|
59 | """REST Controller styled on the Atom Publishing Protocol""" | |
60 |
|
60 | |||
61 | @LoginRequired() |
|
61 | @LoginRequired() | |
62 | def __before__(self): |
|
62 | def __before__(self): | |
63 | super(UsersController, self).__before__() |
|
63 | super(UsersController, self).__before__() | |
64 | c.available_permissions = config['available_permissions'] |
|
64 | c.available_permissions = config['available_permissions'] | |
65 | c.allowed_languages = [ |
|
65 | c.allowed_languages = [ | |
66 | ('en', 'English (en)'), |
|
66 | ('en', 'English (en)'), | |
67 | ('de', 'German (de)'), |
|
67 | ('de', 'German (de)'), | |
68 | ('fr', 'French (fr)'), |
|
68 | ('fr', 'French (fr)'), | |
69 | ('it', 'Italian (it)'), |
|
69 | ('it', 'Italian (it)'), | |
70 | ('ja', 'Japanese (ja)'), |
|
70 | ('ja', 'Japanese (ja)'), | |
71 | ('pl', 'Polish (pl)'), |
|
71 | ('pl', 'Polish (pl)'), | |
72 | ('pt', 'Portuguese (pt)'), |
|
72 | ('pt', 'Portuguese (pt)'), | |
73 | ('ru', 'Russian (ru)'), |
|
73 | ('ru', 'Russian (ru)'), | |
74 | ('zh', 'Chinese (zh)'), |
|
74 | ('zh', 'Chinese (zh)'), | |
75 | ] |
|
75 | ] | |
76 | PermissionModel().set_global_permission_choices(c, translator=_) |
|
76 | PermissionModel().set_global_permission_choices(c, translator=_) | |
77 |
|
77 | |||
78 | @HasPermissionAllDecorator('hg.admin') |
|
78 | @HasPermissionAllDecorator('hg.admin') | |
79 | def index(self): |
|
79 | def index(self): | |
80 | """GET /users: All items in the collection""" |
|
80 | """GET /users: All items in the collection""" | |
81 | # url('users') |
|
81 | # url('users') | |
82 |
|
82 | |||
83 | from rhodecode.lib.utils import PartialRenderer |
|
83 | from rhodecode.lib.utils import PartialRenderer | |
84 | _render = PartialRenderer('data_table/_dt_elements.html') |
|
84 | _render = PartialRenderer('data_table/_dt_elements.html') | |
85 |
|
85 | |||
86 | def grav_tmpl(user_email, size): |
|
86 | def grav_tmpl(user_email, size): | |
87 | return _render("user_gravatar", user_email, size) |
|
87 | return _render("user_gravatar", user_email, size) | |
88 |
|
88 | |||
89 | def username(user_id, username): |
|
89 | def username(user_id, username): | |
90 | return _render("user_name", user_id, username) |
|
90 | return _render("user_name", user_id, username) | |
91 |
|
91 | |||
92 | def user_actions(user_id, username): |
|
92 | def user_actions(user_id, username): | |
93 | return _render("user_actions", user_id, username) |
|
93 | return _render("user_actions", user_id, username) | |
94 |
|
94 | |||
95 | # json generate |
|
95 | # json generate | |
96 | c.users_list = User.query()\ |
|
96 | c.users_list = User.query()\ | |
97 | .filter(User.username != User.DEFAULT_USER) \ |
|
97 | .filter(User.username != User.DEFAULT_USER) \ | |
98 | .all() |
|
98 | .all() | |
99 |
|
99 | |||
100 | users_data = [] |
|
100 | users_data = [] | |
101 | for user in c.users_list: |
|
101 | for user in c.users_list: | |
102 | users_data.append({ |
|
102 | users_data.append({ | |
103 | "gravatar": grav_tmpl(user.email, 20), |
|
103 | "gravatar": grav_tmpl(user.email, 20), | |
104 | "username": h.link_to( |
|
104 | "username": h.link_to( | |
105 | user.username, h.url('user_profile', username=user.username)), |
|
105 | user.username, h.url('user_profile', username=user.username)), | |
106 | "username_raw": user.username, |
|
106 | "username_raw": user.username, | |
107 | "email": user.email, |
|
107 | "email": user.email, | |
108 | "first_name": h.escape(user.name), |
|
108 | "first_name": h.escape(user.name), | |
109 | "last_name": h.escape(user.lastname), |
|
109 | "last_name": h.escape(user.lastname), | |
110 | "last_login": h.format_date(user.last_login), |
|
110 | "last_login": h.format_date(user.last_login), | |
111 | "last_login_raw": datetime_to_time(user.last_login), |
|
111 | "last_login_raw": datetime_to_time(user.last_login), | |
112 | "last_activity": h.format_date( |
|
112 | "last_activity": h.format_date( | |
113 | h.time_to_datetime(user.user_data.get('last_activity', 0))), |
|
113 | h.time_to_datetime(user.user_data.get('last_activity', 0))), | |
114 | "last_activity_raw": user.user_data.get('last_activity', 0), |
|
114 | "last_activity_raw": user.user_data.get('last_activity', 0), | |
115 | "active": h.bool2icon(user.active), |
|
115 | "active": h.bool2icon(user.active), | |
116 | "active_raw": user.active, |
|
116 | "active_raw": user.active, | |
117 | "admin": h.bool2icon(user.admin), |
|
117 | "admin": h.bool2icon(user.admin), | |
118 | "admin_raw": user.admin, |
|
118 | "admin_raw": user.admin, | |
119 | "extern_type": user.extern_type, |
|
119 | "extern_type": user.extern_type, | |
120 | "extern_name": user.extern_name, |
|
120 | "extern_name": user.extern_name, | |
121 | "action": user_actions(user.user_id, user.username), |
|
121 | "action": user_actions(user.user_id, user.username), | |
122 | }) |
|
122 | }) | |
123 |
|
123 | |||
124 |
|
124 | |||
125 | c.data = json.dumps(users_data) |
|
125 | c.data = json.dumps(users_data) | |
126 | return render('admin/users/users.html') |
|
126 | return render('admin/users/users.html') | |
127 |
|
127 | |||
128 | @HasPermissionAllDecorator('hg.admin') |
|
128 | @HasPermissionAllDecorator('hg.admin') | |
129 | @auth.CSRFRequired() |
|
129 | @auth.CSRFRequired() | |
130 | def create(self): |
|
130 | def create(self): | |
131 | """POST /users: Create a new item""" |
|
131 | """POST /users: Create a new item""" | |
132 | # url('users') |
|
132 | # url('users') | |
133 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name |
|
133 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name | |
134 | user_model = UserModel() |
|
134 | user_model = UserModel() | |
135 | user_form = UserForm()() |
|
135 | user_form = UserForm()() | |
136 | try: |
|
136 | try: | |
137 | form_result = user_form.to_python(dict(request.POST)) |
|
137 | form_result = user_form.to_python(dict(request.POST)) | |
138 | user = user_model.create(form_result) |
|
138 | user = user_model.create(form_result) | |
139 | Session().flush() |
|
139 | Session().flush() | |
140 | username = form_result['username'] |
|
140 | username = form_result['username'] | |
141 | action_logger(c.rhodecode_user, 'admin_created_user:%s' % username, |
|
141 | action_logger(c.rhodecode_user, 'admin_created_user:%s' % username, | |
142 | None, self.ip_addr, self.sa) |
|
142 | None, self.ip_addr, self.sa) | |
143 |
|
143 | |||
144 | user_link = h.link_to(h.escape(username), |
|
144 | user_link = h.link_to(h.escape(username), | |
145 | url('edit_user', |
|
145 | url('edit_user', | |
146 | user_id=user.user_id)) |
|
146 | user_id=user.user_id)) | |
147 | h.flash(h.literal(_('Created user %(user_link)s') |
|
147 | h.flash(h.literal(_('Created user %(user_link)s') | |
148 | % {'user_link': user_link}), category='success') |
|
148 | % {'user_link': user_link}), category='success') | |
149 | Session().commit() |
|
149 | Session().commit() | |
150 | except formencode.Invalid as errors: |
|
150 | except formencode.Invalid as errors: | |
151 | return htmlfill.render( |
|
151 | return htmlfill.render( | |
152 | render('admin/users/user_add.html'), |
|
152 | render('admin/users/user_add.html'), | |
153 | defaults=errors.value, |
|
153 | defaults=errors.value, | |
154 | errors=errors.error_dict or {}, |
|
154 | errors=errors.error_dict or {}, | |
155 | prefix_error=False, |
|
155 | prefix_error=False, | |
156 | encoding="UTF-8", |
|
156 | encoding="UTF-8", | |
157 | force_defaults=False) |
|
157 | force_defaults=False) | |
158 | except UserCreationError as e: |
|
158 | except UserCreationError as e: | |
159 | h.flash(e, 'error') |
|
159 | h.flash(e, 'error') | |
160 | except Exception: |
|
160 | except Exception: | |
161 | log.exception("Exception creation of user") |
|
161 | log.exception("Exception creation of user") | |
162 | h.flash(_('Error occurred during creation of user %s') |
|
162 | h.flash(_('Error occurred during creation of user %s') | |
163 | % request.POST.get('username'), category='error') |
|
163 | % request.POST.get('username'), category='error') | |
164 | return redirect(url('users')) |
|
164 | return redirect(url('users')) | |
165 |
|
165 | |||
166 | @HasPermissionAllDecorator('hg.admin') |
|
166 | @HasPermissionAllDecorator('hg.admin') | |
167 | def new(self): |
|
167 | def new(self): | |
168 | """GET /users/new: Form to create a new item""" |
|
168 | """GET /users/new: Form to create a new item""" | |
169 | # url('new_user') |
|
169 | # url('new_user') | |
170 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name |
|
170 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name | |
171 | return render('admin/users/user_add.html') |
|
171 | return render('admin/users/user_add.html') | |
172 |
|
172 | |||
173 | @HasPermissionAllDecorator('hg.admin') |
|
173 | @HasPermissionAllDecorator('hg.admin') | |
174 | @auth.CSRFRequired() |
|
174 | @auth.CSRFRequired() | |
175 | def update(self, user_id): |
|
175 | def update(self, user_id): | |
176 | """PUT /users/user_id: Update an existing item""" |
|
176 | """PUT /users/user_id: Update an existing item""" | |
177 | # Forms posted to this method should contain a hidden field: |
|
177 | # Forms posted to this method should contain a hidden field: | |
178 | # <input type="hidden" name="_method" value="PUT" /> |
|
178 | # <input type="hidden" name="_method" value="PUT" /> | |
179 | # Or using helpers: |
|
179 | # Or using helpers: | |
180 | # h.form(url('update_user', user_id=ID), |
|
180 | # h.form(url('update_user', user_id=ID), | |
181 | # method='put') |
|
181 | # method='put') | |
182 | # url('user', user_id=ID) |
|
182 | # url('user', user_id=ID) | |
183 | user_id = safe_int(user_id) |
|
183 | user_id = safe_int(user_id) | |
184 | c.user = User.get_or_404(user_id) |
|
184 | c.user = User.get_or_404(user_id) | |
185 | c.active = 'profile' |
|
185 | c.active = 'profile' | |
186 | c.extern_type = c.user.extern_type |
|
186 | c.extern_type = c.user.extern_type | |
187 | c.extern_name = c.user.extern_name |
|
187 | c.extern_name = c.user.extern_name | |
188 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) |
|
188 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) | |
189 | available_languages = [x[0] for x in c.allowed_languages] |
|
189 | available_languages = [x[0] for x in c.allowed_languages] | |
190 | _form = UserForm(edit=True, available_languages=available_languages, |
|
190 | _form = UserForm(edit=True, available_languages=available_languages, | |
191 | old_data={'user_id': user_id, |
|
191 | old_data={'user_id': user_id, | |
192 | 'email': c.user.email})() |
|
192 | 'email': c.user.email})() | |
193 | form_result = {} |
|
193 | form_result = {} | |
194 | try: |
|
194 | try: | |
195 | form_result = _form.to_python(dict(request.POST)) |
|
195 | form_result = _form.to_python(dict(request.POST)) | |
196 | skip_attrs = ['extern_type', 'extern_name'] |
|
196 | skip_attrs = ['extern_type', 'extern_name'] | |
197 | # TODO: plugin should define if username can be updated |
|
197 | # TODO: plugin should define if username can be updated | |
198 | if c.extern_type != "rhodecode": |
|
198 | if c.extern_type != "rhodecode": | |
199 | # forbid updating username for external accounts |
|
199 | # forbid updating username for external accounts | |
200 | skip_attrs.append('username') |
|
200 | skip_attrs.append('username') | |
201 |
|
201 | |||
202 | UserModel().update_user(user_id, skip_attrs=skip_attrs, **form_result) |
|
202 | UserModel().update_user(user_id, skip_attrs=skip_attrs, **form_result) | |
203 | usr = form_result['username'] |
|
203 | usr = form_result['username'] | |
204 | action_logger(c.rhodecode_user, 'admin_updated_user:%s' % usr, |
|
204 | action_logger(c.rhodecode_user, 'admin_updated_user:%s' % usr, | |
205 | None, self.ip_addr, self.sa) |
|
205 | None, self.ip_addr, self.sa) | |
206 | h.flash(_('User updated successfully'), category='success') |
|
206 | h.flash(_('User updated successfully'), category='success') | |
207 | Session().commit() |
|
207 | Session().commit() | |
208 | except formencode.Invalid as errors: |
|
208 | except formencode.Invalid as errors: | |
209 | defaults = errors.value |
|
209 | defaults = errors.value | |
210 | e = errors.error_dict or {} |
|
210 | e = errors.error_dict or {} | |
211 |
|
211 | |||
212 | return htmlfill.render( |
|
212 | return htmlfill.render( | |
213 | render('admin/users/user_edit.html'), |
|
213 | render('admin/users/user_edit.html'), | |
214 | defaults=defaults, |
|
214 | defaults=defaults, | |
215 | errors=e, |
|
215 | errors=e, | |
216 | prefix_error=False, |
|
216 | prefix_error=False, | |
217 | encoding="UTF-8", |
|
217 | encoding="UTF-8", | |
218 | force_defaults=False) |
|
218 | force_defaults=False) | |
|
219 | except UserCreationError as e: | |||
|
220 | h.flash(e, 'error') | |||
219 | except Exception: |
|
221 | except Exception: | |
220 | log.exception("Exception updating user") |
|
222 | log.exception("Exception updating user") | |
221 | h.flash(_('Error occurred during update of user %s') |
|
223 | h.flash(_('Error occurred during update of user %s') | |
222 | % form_result.get('username'), category='error') |
|
224 | % form_result.get('username'), category='error') | |
223 | return redirect(url('edit_user', user_id=user_id)) |
|
225 | return redirect(url('edit_user', user_id=user_id)) | |
224 |
|
226 | |||
225 | @HasPermissionAllDecorator('hg.admin') |
|
227 | @HasPermissionAllDecorator('hg.admin') | |
226 | @auth.CSRFRequired() |
|
228 | @auth.CSRFRequired() | |
227 | def delete(self, user_id): |
|
229 | def delete(self, user_id): | |
228 | """DELETE /users/user_id: Delete an existing item""" |
|
230 | """DELETE /users/user_id: Delete an existing item""" | |
229 | # Forms posted to this method should contain a hidden field: |
|
231 | # Forms posted to this method should contain a hidden field: | |
230 | # <input type="hidden" name="_method" value="DELETE" /> |
|
232 | # <input type="hidden" name="_method" value="DELETE" /> | |
231 | # Or using helpers: |
|
233 | # Or using helpers: | |
232 | # h.form(url('delete_user', user_id=ID), |
|
234 | # h.form(url('delete_user', user_id=ID), | |
233 | # method='delete') |
|
235 | # method='delete') | |
234 | # url('user', user_id=ID) |
|
236 | # url('user', user_id=ID) | |
235 | user_id = safe_int(user_id) |
|
237 | user_id = safe_int(user_id) | |
236 | c.user = User.get_or_404(user_id) |
|
238 | c.user = User.get_or_404(user_id) | |
237 |
|
239 | |||
238 | _repos = c.user.repositories |
|
240 | _repos = c.user.repositories | |
239 | _repo_groups = c.user.repository_groups |
|
241 | _repo_groups = c.user.repository_groups | |
240 | _user_groups = c.user.user_groups |
|
242 | _user_groups = c.user.user_groups | |
241 |
|
243 | |||
242 | handle_repos = None |
|
244 | handle_repos = None | |
243 | handle_repo_groups = None |
|
245 | handle_repo_groups = None | |
244 | handle_user_groups = None |
|
246 | handle_user_groups = None | |
245 | # dummy call for flash of handle |
|
247 | # dummy call for flash of handle | |
246 | set_handle_flash_repos = lambda: None |
|
248 | set_handle_flash_repos = lambda: None | |
247 | set_handle_flash_repo_groups = lambda: None |
|
249 | set_handle_flash_repo_groups = lambda: None | |
248 | set_handle_flash_user_groups = lambda: None |
|
250 | set_handle_flash_user_groups = lambda: None | |
249 |
|
251 | |||
250 | if _repos and request.POST.get('user_repos'): |
|
252 | if _repos and request.POST.get('user_repos'): | |
251 | do = request.POST['user_repos'] |
|
253 | do = request.POST['user_repos'] | |
252 | if do == 'detach': |
|
254 | if do == 'detach': | |
253 | handle_repos = 'detach' |
|
255 | handle_repos = 'detach' | |
254 | set_handle_flash_repos = lambda: h.flash( |
|
256 | set_handle_flash_repos = lambda: h.flash( | |
255 | _('Detached %s repositories') % len(_repos), |
|
257 | _('Detached %s repositories') % len(_repos), | |
256 | category='success') |
|
258 | category='success') | |
257 | elif do == 'delete': |
|
259 | elif do == 'delete': | |
258 | handle_repos = 'delete' |
|
260 | handle_repos = 'delete' | |
259 | set_handle_flash_repos = lambda: h.flash( |
|
261 | set_handle_flash_repos = lambda: h.flash( | |
260 | _('Deleted %s repositories') % len(_repos), |
|
262 | _('Deleted %s repositories') % len(_repos), | |
261 | category='success') |
|
263 | category='success') | |
262 |
|
264 | |||
263 | if _repo_groups and request.POST.get('user_repo_groups'): |
|
265 | if _repo_groups and request.POST.get('user_repo_groups'): | |
264 | do = request.POST['user_repo_groups'] |
|
266 | do = request.POST['user_repo_groups'] | |
265 | if do == 'detach': |
|
267 | if do == 'detach': | |
266 | handle_repo_groups = 'detach' |
|
268 | handle_repo_groups = 'detach' | |
267 | set_handle_flash_repo_groups = lambda: h.flash( |
|
269 | set_handle_flash_repo_groups = lambda: h.flash( | |
268 | _('Detached %s repository groups') % len(_repo_groups), |
|
270 | _('Detached %s repository groups') % len(_repo_groups), | |
269 | category='success') |
|
271 | category='success') | |
270 | elif do == 'delete': |
|
272 | elif do == 'delete': | |
271 | handle_repo_groups = 'delete' |
|
273 | handle_repo_groups = 'delete' | |
272 | set_handle_flash_repo_groups = lambda: h.flash( |
|
274 | set_handle_flash_repo_groups = lambda: h.flash( | |
273 | _('Deleted %s repository groups') % len(_repo_groups), |
|
275 | _('Deleted %s repository groups') % len(_repo_groups), | |
274 | category='success') |
|
276 | category='success') | |
275 |
|
277 | |||
276 | if _user_groups and request.POST.get('user_user_groups'): |
|
278 | if _user_groups and request.POST.get('user_user_groups'): | |
277 | do = request.POST['user_user_groups'] |
|
279 | do = request.POST['user_user_groups'] | |
278 | if do == 'detach': |
|
280 | if do == 'detach': | |
279 | handle_user_groups = 'detach' |
|
281 | handle_user_groups = 'detach' | |
280 | set_handle_flash_user_groups = lambda: h.flash( |
|
282 | set_handle_flash_user_groups = lambda: h.flash( | |
281 | _('Detached %s user groups') % len(_user_groups), |
|
283 | _('Detached %s user groups') % len(_user_groups), | |
282 | category='success') |
|
284 | category='success') | |
283 | elif do == 'delete': |
|
285 | elif do == 'delete': | |
284 | handle_user_groups = 'delete' |
|
286 | handle_user_groups = 'delete' | |
285 | set_handle_flash_user_groups = lambda: h.flash( |
|
287 | set_handle_flash_user_groups = lambda: h.flash( | |
286 | _('Deleted %s user groups') % len(_user_groups), |
|
288 | _('Deleted %s user groups') % len(_user_groups), | |
287 | category='success') |
|
289 | category='success') | |
288 |
|
290 | |||
289 | try: |
|
291 | try: | |
290 | UserModel().delete(c.user, handle_repos=handle_repos, |
|
292 | UserModel().delete(c.user, handle_repos=handle_repos, | |
291 | handle_repo_groups=handle_repo_groups, |
|
293 | handle_repo_groups=handle_repo_groups, | |
292 | handle_user_groups=handle_user_groups) |
|
294 | handle_user_groups=handle_user_groups) | |
293 | Session().commit() |
|
295 | Session().commit() | |
294 | set_handle_flash_repos() |
|
296 | set_handle_flash_repos() | |
295 | set_handle_flash_repo_groups() |
|
297 | set_handle_flash_repo_groups() | |
296 | set_handle_flash_user_groups() |
|
298 | set_handle_flash_user_groups() | |
297 | h.flash(_('Successfully deleted user'), category='success') |
|
299 | h.flash(_('Successfully deleted user'), category='success') | |
298 | except (UserOwnsReposException, UserOwnsRepoGroupsException, |
|
300 | except (UserOwnsReposException, UserOwnsRepoGroupsException, | |
299 | UserOwnsUserGroupsException, DefaultUserException) as e: |
|
301 | UserOwnsUserGroupsException, DefaultUserException) as e: | |
300 | h.flash(e, category='warning') |
|
302 | h.flash(e, category='warning') | |
301 | except Exception: |
|
303 | except Exception: | |
302 | log.exception("Exception during deletion of user") |
|
304 | log.exception("Exception during deletion of user") | |
303 | h.flash(_('An error occurred during deletion of user'), |
|
305 | h.flash(_('An error occurred during deletion of user'), | |
304 | category='error') |
|
306 | category='error') | |
305 | return redirect(url('users')) |
|
307 | return redirect(url('users')) | |
306 |
|
308 | |||
307 | @HasPermissionAllDecorator('hg.admin') |
|
309 | @HasPermissionAllDecorator('hg.admin') | |
308 | @auth.CSRFRequired() |
|
310 | @auth.CSRFRequired() | |
309 | def reset_password(self, user_id): |
|
311 | def reset_password(self, user_id): | |
310 | """ |
|
312 | """ | |
311 | toggle reset password flag for this user |
|
313 | toggle reset password flag for this user | |
312 |
|
314 | |||
313 | :param user_id: |
|
315 | :param user_id: | |
314 | """ |
|
316 | """ | |
315 | user_id = safe_int(user_id) |
|
317 | user_id = safe_int(user_id) | |
316 | c.user = User.get_or_404(user_id) |
|
318 | c.user = User.get_or_404(user_id) | |
317 | try: |
|
319 | try: | |
318 | old_value = c.user.user_data.get('force_password_change') |
|
320 | old_value = c.user.user_data.get('force_password_change') | |
319 | c.user.update_userdata(force_password_change=not old_value) |
|
321 | c.user.update_userdata(force_password_change=not old_value) | |
320 | Session().commit() |
|
322 | Session().commit() | |
321 | if old_value: |
|
323 | if old_value: | |
322 | msg = _('Force password change disabled for user') |
|
324 | msg = _('Force password change disabled for user') | |
323 | else: |
|
325 | else: | |
324 | msg = _('Force password change enabled for user') |
|
326 | msg = _('Force password change enabled for user') | |
325 | h.flash(msg, category='success') |
|
327 | h.flash(msg, category='success') | |
326 | except Exception: |
|
328 | except Exception: | |
327 | log.exception("Exception during password reset for user") |
|
329 | log.exception("Exception during password reset for user") | |
328 | h.flash(_('An error occurred during password reset for user'), |
|
330 | h.flash(_('An error occurred during password reset for user'), | |
329 | category='error') |
|
331 | category='error') | |
330 |
|
332 | |||
331 | return redirect(url('edit_user_advanced', user_id=user_id)) |
|
333 | return redirect(url('edit_user_advanced', user_id=user_id)) | |
332 |
|
334 | |||
    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def create_personal_repo_group(self, user_id):
        """
        Create personal repository group for this user

        :param user_id:
        """
        from rhodecode.model.repo_group import RepoGroupModel

        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)

        try:
            desc = RepoGroupModel.PERSONAL_GROUP_DESC % {
                'username': c.user.username}
            if not RepoGroup.get_by_group_name(c.user.username):
                RepoGroupModel().create(group_name=c.user.username,
                                        group_description=desc,
                                        owner=c.user.username)

            msg = _('Created repository group `%s`' % (c.user.username,))
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during repository group creation")
            msg = _(
                'An error occurred during repository group creation for user')
            h.flash(msg, category='error')

        return redirect(url('edit_user_advanced', user_id=user_id))

    @HasPermissionAllDecorator('hg.admin')
    def show(self, user_id):
        """GET /users/user_id: Show a specific item"""
        # url('user', user_id=ID)
        User.get_or_404(-1)

    @HasPermissionAllDecorator('hg.admin')
    def edit(self, user_id):
        """GET /users/user_id/edit: Form to edit an existing item"""
        # url('edit_user', user_id=ID)
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr)

        defaults = c.user.get_dict()
        defaults.update({'language': c.user.user_data.get('language')})
        return htmlfill.render(
            render('admin/users/user_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    def edit_advanced(self, user_id):
        user_id = safe_int(user_id)
        user = c.user = User.get_or_404(user_id)
        if user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'advanced'
        c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr)
        c.personal_repo_group = RepoGroup.get_by_group_name(user.username)
-        c.first_admin = User.get_first_admin()
+        c.first_admin = User.get_first_super_admin()
        defaults = user.get_dict()

        # Interim workaround if the user participated on any pull requests as a
        # reviewer.
        has_review = bool(PullRequestReviewers.query().filter(
            PullRequestReviewers.user_id == user_id).first())
        c.can_delete_user = not has_review
        c.can_delete_user_message = _(
            'The user participates as reviewer in pull requests and '
            'cannot be deleted. You can set the user to '
            '"inactive" instead of deleting it.') if has_review else ''

        return htmlfill.render(
            render('admin/users/user_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    def edit_auth_tokens(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'auth_tokens'
        show_expired = True
        c.lifetime_values = [
            (str(-1), _('forever')),
            (str(5), _('5 minutes')),
            (str(60), _('1 hour')),
            (str(60 * 24), _('1 day')),
            (str(60 * 24 * 30), _('1 month')),
        ]
        c.lifetime_options = [(c.lifetime_values, _("Lifetime"))]
        c.role_values = [(x, AuthTokenModel.cls._get_role_name(x))
                         for x in AuthTokenModel.cls.ROLES]
        c.role_options = [(c.role_values, _("Role"))]
        c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
            c.user.user_id, show_expired=show_expired)
        defaults = c.user.get_dict()
        return htmlfill.render(
            render('admin/users/user_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def add_auth_token(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        lifetime = safe_int(request.POST.get('lifetime'), -1)
        description = request.POST.get('description')
        role = request.POST.get('role')
        AuthTokenModel().create(c.user.user_id, description, lifetime, role)
        Session().commit()
        h.flash(_("Auth token successfully created"), category='success')
        return redirect(url('edit_user_auth_tokens', user_id=c.user.user_id))

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def delete_auth_token(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        auth_token = request.POST.get('del_auth_token')
        if request.POST.get('del_auth_token_builtin'):
            user = User.get(c.user.user_id)
            if user:
                user.api_key = generate_auth_token(user.username)
                Session().add(user)
                Session().commit()
                h.flash(_("Auth token successfully reset"), category='success')
        elif auth_token:
            AuthTokenModel().delete(auth_token, c.user.user_id)
            Session().commit()
            h.flash(_("Auth token successfully deleted"), category='success')

        return redirect(url('edit_user_auth_tokens', user_id=c.user.user_id))

    @HasPermissionAllDecorator('hg.admin')
    def edit_global_perms(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'global_perms'

        c.default_user = User.get_default_user()
        defaults = c.user.get_dict()
        defaults.update(c.default_user.get_default_perms(suffix='_inherited'))
        defaults.update(c.default_user.get_default_perms())
        defaults.update(c.user.get_default_perms())

        return htmlfill.render(
            render('admin/users/user_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def update_global_perms(self, user_id):
        """PUT /users_perm/user_id: Update an existing item"""
        # url('user_perm', user_id=ID, method='put')
        user_id = safe_int(user_id)
        user = User.get_or_404(user_id)
        c.active = 'global_perms'
        try:
            # first stage that verifies the checkbox
            _form = UserIndividualPermissionsForm()
            form_result = _form.to_python(dict(request.POST))
            inherit_perms = form_result['inherit_default_permissions']
            user.inherit_default_permissions = inherit_perms
            Session().add(user)

            if not inherit_perms:
                # only update the individual ones if we un check the flag
                _form = UserPermissionsForm(
                    [x[0] for x in c.repo_create_choices],
                    [x[0] for x in c.repo_create_on_write_choices],
                    [x[0] for x in c.repo_group_create_choices],
                    [x[0] for x in c.user_group_create_choices],
                    [x[0] for x in c.fork_choices],
                    [x[0] for x in c.inherit_default_permission_choices])()

                form_result = _form.to_python(dict(request.POST))
                form_result.update({'perm_user_id': user.user_id})

                PermissionModel().update_user_permissions(form_result)

            Session().commit()
            h.flash(_('User global permissions updated successfully'),
                    category='success')

            Session().commit()
        except formencode.Invalid as errors:
            defaults = errors.value
            c.user = user
            return htmlfill.render(
                render('admin/users/user_edit.html'),
                defaults=defaults,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False)
        except Exception:
            log.exception("Exception during permissions saving")
            h.flash(_('An error occurred during permissions saving'),
                    category='error')
        return redirect(url('edit_user_global_perms', user_id=user_id))

    @HasPermissionAllDecorator('hg.admin')
    def edit_perms_summary(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'perms_summary'
        c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr)

        return render('admin/users/user_edit.html')

    @HasPermissionAllDecorator('hg.admin')
    def edit_emails(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'emails'
        c.user_email_map = UserEmailMap.query() \
            .filter(UserEmailMap.user == c.user).all()

        defaults = c.user.get_dict()
        return htmlfill.render(
            render('admin/users/user_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def add_email(self, user_id):
        """POST /user_emails:Add an existing item"""
        # url('user_emails', user_id=ID, method='put')
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)

        email = request.POST.get('new_email')
        user_model = UserModel()

        try:
            user_model.add_extra_email(user_id, email)
            Session().commit()
            h.flash(_("Added new email address `%s` for user account") % email,
                    category='success')
        except formencode.Invalid as error:
            msg = error.error_dict['email']
            h.flash(msg, category='error')
        except Exception:
            log.exception("Exception during email saving")
            h.flash(_('An error occurred during email saving'),
                    category='error')
        return redirect(url('edit_user_emails', user_id=user_id))

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def delete_email(self, user_id):
        """DELETE /user_emails_delete/user_id: Delete an existing item"""
        # url('user_emails_delete', user_id=ID, method='delete')
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        email_id = request.POST.get('del_email_id')
        user_model = UserModel()
        user_model.delete_extra_email(user_id, email_id)
        Session().commit()
        h.flash(_("Removed email address from user account"), category='success')
        return redirect(url('edit_user_emails', user_id=user_id))

    @HasPermissionAllDecorator('hg.admin')
    def edit_ips(self, user_id):
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        if c.user.username == User.DEFAULT_USER:
            h.flash(_("You can't edit this user"), category='warning')
            return redirect(url('users'))

        c.active = 'ips'
        c.user_ip_map = UserIpMap.query() \
            .filter(UserIpMap.user == c.user).all()

        c.inherit_default_ips = c.user.inherit_default_permissions
        c.default_user_ip_map = UserIpMap.query() \
            .filter(UserIpMap.user == User.get_default_user()).all()

        defaults = c.user.get_dict()
        return htmlfill.render(
            render('admin/users/user_edit.html'),
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False)

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def add_ip(self, user_id):
        """POST /user_ips:Add an existing item"""
        # url('user_ips', user_id=ID, method='put')

        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)
        user_model = UserModel()
        try:
            ip_list = user_model.parse_ip_range(request.POST.get('new_ip'))
        except Exception as e:
            ip_list = []
            log.exception("Exception during ip saving")
            h.flash(_('An error occurred during ip saving:%s' % (e,)),
                    category='error')

        desc = request.POST.get('description')
        added = []
        for ip in ip_list:
            try:
                user_model.add_extra_ip(user_id, ip, desc)
                Session().commit()
                added.append(ip)
            except formencode.Invalid as error:
                msg = error.error_dict['ip']
                h.flash(msg, category='error')
            except Exception:
                log.exception("Exception during ip saving")
                h.flash(_('An error occurred during ip saving'),
                        category='error')
        if added:
            h.flash(
                _("Added ips %s to user whitelist") % (', '.join(ip_list), ),
                category='success')
        if 'default_user' in request.POST:
            return redirect(url('admin_permissions_ips'))
        return redirect(url('edit_user_ips', user_id=user_id))

    @HasPermissionAllDecorator('hg.admin')
    @auth.CSRFRequired()
    def delete_ip(self, user_id):
        """DELETE /user_ips_delete/user_id: Delete an existing item"""
        # url('user_ips_delete', user_id=ID, method='delete')
        user_id = safe_int(user_id)
        c.user = User.get_or_404(user_id)

        ip_id = request.POST.get('del_ip_id')
        user_model = UserModel()
        user_model.delete_extra_ip(user_id, ip_id)
        Session().commit()
        h.flash(_("Removed ip address from user whitelist"), category='success')

        if 'default_user' in request.POST:
            return redirect(url('admin_permissions_ips'))
        return redirect(url('edit_user_ips', user_id=user_id))
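
The only functional change in the hunk above is the rename of `User.get_first_admin()` to `User.get_first_super_admin()`. A hedged sketch of how such a rename can stay backwards compatible via a deprecated alias — only the two method names come from the diff; the class body and return value here are hypothetical stand-ins, not the real RhodeCode model:

```python
import warnings


class User(object):
    """Hypothetical stand-in model; only the rename is taken from the diff."""

    @classmethod
    def get_first_super_admin(cls):
        # stand-in lookup; the real model would query the database
        return 'admin'

    @classmethod
    def get_first_admin(cls):
        # deprecated alias so existing callers keep working after the rename
        warnings.warn(
            'get_first_admin() is deprecated, use get_first_super_admin()',
            DeprecationWarning, stacklevel=2)
        return cls.get_first_super_admin()
```

Callers migrate at their own pace: old call sites emit a `DeprecationWarning` but keep returning the same result as the new name.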
@@ -1,277 +1,289 @@
 # -*- coding: utf-8 -*-

 # Copyright (C) 2010-2016 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/

 """
 Home controller for RhodeCode Enterprise
 """

 import logging
 import time
 import re

 from pylons import tmpl_context as c, request, url, config
 from pylons.i18n.translation import _
 from sqlalchemy.sql import func

 from rhodecode.lib.auth import (
     LoginRequired, HasPermissionAllDecorator, AuthUser,
     HasRepoGroupPermissionAnyDecorator, XHRRequired)
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.index import searcher_from_config
 from rhodecode.lib.ext_json import json
 from rhodecode.lib.utils import jsonify
-from rhodecode.lib.utils2 import safe_unicode
+from rhodecode.lib.utils2 import safe_unicode, str2bool
 from rhodecode.model.db import Repository, RepoGroup
 from rhodecode.model.repo import RepoModel
 from rhodecode.model.repo_group import RepoGroupModel
 from rhodecode.model.scm import RepoList, RepoGroupList


 log = logging.getLogger(__name__)


 class HomeController(BaseController):
     def __before__(self):
         super(HomeController, self).__before__()

     def ping(self):
         """
         Ping, doesn't require login, good for checking out the platform
         """
         instance_id = getattr(c, 'rhodecode_instanceid', '')
         return 'pong[%s] => %s' % (instance_id, self.ip_addr,)

     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
     def error_test(self):
         """
         Test exception handling and emails on errors
         """
         class TestException(Exception):
             pass

         msg = ('RhodeCode Enterprise %s test exception. Generation time: %s'
                % (c.rhodecode_name, time.time()))
         raise TestException(msg)

     def _get_groups_and_repos(self, repo_group_id=None):
         # repo groups groups
         repo_group_list = RepoGroup.get_all_repo_groups(group_id=repo_group_id)
         _perms = ['group.read', 'group.write', 'group.admin']
         repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms)
         repo_group_data = RepoGroupModel().get_repo_groups_as_dict(
             repo_group_list=repo_group_list_acl, admin=False)

         # repositories
         repo_list = Repository.get_all_repos(group_id=repo_group_id)
         _perms = ['repository.read', 'repository.write', 'repository.admin']
         repo_list_acl = RepoList(repo_list, perm_set=_perms)
         repo_data = RepoModel().get_repos_as_dict(
             repo_list=repo_list_acl, admin=False)

         return repo_data, repo_group_data

     @LoginRequired()
     def index(self):
         c.repo_group = None

         repo_data, repo_group_data = self._get_groups_and_repos()
         # json used to render the grids
         c.repos_data = json.dumps(repo_data)
         c.repo_groups_data = json.dumps(repo_group_data)

         return render('/index.html')

     @LoginRequired()
     @HasRepoGroupPermissionAnyDecorator('group.read', 'group.write',
                                         'group.admin')
     def index_repo_group(self, group_name):
         """GET /repo_group_name: Show a specific item"""
         c.repo_group = RepoGroupModel()._get_repo_group(group_name)
         repo_data, repo_group_data = self._get_groups_and_repos(
             c.repo_group.group_id)

         # json used to render the grids
         c.repos_data = json.dumps(repo_data)
         c.repo_groups_data = json.dumps(repo_group_data)

         return render('index_repo_group.html')

     def _get_repo_list(self, name_contains=None, repo_type=None, limit=20):
         query = Repository.query()\
             .order_by(func.length(Repository.repo_name))\
             .order_by(Repository.repo_name)

         if repo_type:
             query = query.filter(Repository.repo_type == repo_type)

         if name_contains:
             ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
             query = query.filter(
                 Repository.repo_name.ilike(ilike_expression))
             query = query.limit(limit)

         all_repos = query.all()
         repo_iter = self.scm_model.get_repos(all_repos)
         return [
             {
                 'id': obj['name'],
                 'text': obj['name'],
                 'type': 'repo',
                 'obj': obj['dbrepo'],
                 'url': url('summary_home', repo_name=obj['name'])
             }
             for obj in repo_iter]

     def _get_repo_group_list(self, name_contains=None, limit=20):
         query = RepoGroup.query()\
             .order_by(func.length(RepoGroup.group_name))\
             .order_by(RepoGroup.group_name)

         if name_contains:
             ilike_expression = u'%{}%'.format(safe_unicode(name_contains))
             query = query.filter(
                 RepoGroup.group_name.ilike(ilike_expression))
             query = query.limit(limit)

         all_groups = query.all()
         repo_groups_iter = self.scm_model.get_repo_groups(all_groups)
         return [
             {
                 'id': obj.group_name,
                 'text': obj.group_name,
                 'type': 'group',
                 'obj': {},
                 'url': url('repo_group_home', group_name=obj.group_name)
             }
             for obj in repo_groups_iter]

     def _get_hash_commit_list(self, hash_starts_with=None, limit=20):
         if not hash_starts_with or len(hash_starts_with) < 3:
             return []

         commit_hashes = re.compile('([0-9a-f]{2,40})').findall(hash_starts_with)

         if len(commit_hashes) != 1:
             return []

         commit_hash_prefix = commit_hashes[0]

         auth_user = AuthUser(
             user_id=c.rhodecode_user.user_id, ip_addr=self.ip_addr)
         searcher = searcher_from_config(config)
         result = searcher.search(
             'commit_id:%s*' % commit_hash_prefix, 'commit', auth_user)

         return [
             {
                 'id': entry['commit_id'],
                 'text': entry['commit_id'],
                 'type': 'commit',
                 'obj': {'repo': entry['repository']},
                 'url': url('changeset_home',
                            repo_name=entry['repository'],
                            revision=entry['commit_id'])
             }
             for entry in result['results']]

     @LoginRequired()
     @XHRRequired()
     @jsonify
     def goto_switcher_data(self):
         query = request.GET.get('query')
         log.debug('generating goto switcher list, query %s', query)

         res = []
         repo_groups = self._get_repo_group_list(query)
         if repo_groups:
             res.append({
                 'text': _('Groups'),
                 'children': repo_groups
             })

         repos = self._get_repo_list(query)
         if repos:
             res.append({
                 'text': _('Repositories'),
                 'children': repos
             })

         commits = self._get_hash_commit_list(query)
         if commits:
             unique_repos = {}
             for commit in commits:
                 unique_repos.setdefault(commit['obj']['repo'], []
                                         ).append(commit)

             for repo in unique_repos:
                 res.append({
                     'text': _('Commits in %(repo)s') % {'repo': repo},
                     'children': unique_repos[repo]
                 })

         data = {
             'more': False,
             'results': res
         }
         return data

     @LoginRequired()
     @XHRRequired()
     @jsonify
     def repo_list_data(self):
         query = request.GET.get('query')
         repo_type = request.GET.get('repo_type')
         log.debug('generating repo list, query:%s', query)

         res = []
         repos = self._get_repo_list(query, repo_type=repo_type)
         if repos:
             res.append({
                 'text': _('Repositories'),
                 'children': repos
             })

         data = {
             'more': False,
             'results': res
         }
         return data

     @LoginRequired()
     @XHRRequired()
     @jsonify
     def user_autocomplete_data(self):
         query = request.GET.get('query')
+        active = str2bool(request.GET.get('active') or True)

         repo_model = RepoModel()
-        _users = repo_model.get_users(name_contains=query)
+        _users = repo_model.get_users(
+            name_contains=query, only_active=active)

         if request.GET.get('user_groups'):
             # extend with user groups
-            _user_groups = repo_model.get_user_groups(name_contains=query)
+            _user_groups = repo_model.get_user_groups(
+                name_contains=query, only_active=active)
             _users = _users + _user_groups

         return {'suggestions': _users}

     @LoginRequired()
     @XHRRequired()
     @jsonify
     def user_group_autocomplete_data(self):
-        return {'suggestions': []}
+        query = request.GET.get('query')
+        active = str2bool(request.GET.get('active') or True)
+
+        repo_model = RepoModel()
+        _user_groups = repo_model.get_user_groups(
+            name_contains=query, only_active=active)
+        _user_groups = _user_groups
+
+        return {'suggestions': _user_groups}
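A note on the new ``active`` parameter above: ``str2bool(request.GET.get('active') or True)`` means a missing or empty query parameter falls back to ``True``, so the autocomplete endpoints suggest only active users and user groups unless the caller explicitly opts out. A minimal sketch of this behaviour, assuming RhodeCode's ``str2bool`` has the usual truthy-string semantics (the ``str2bool`` and ``parse_active`` helpers here are illustrative stand-ins, not the library code):

```python
def str2bool(value):
    """Coerce common string representations (and real booleans) to bool.

    Illustrative stand-in for rhodecode.lib.utils2.str2bool; the exact
    accepted spellings in the real helper may differ.
    """
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')


def parse_active(params):
    # Mirrors: active = str2bool(request.GET.get('active') or True)
    # An absent/empty 'active' parameter short-circuits to True.
    return str2bool(params.get('active') or True)


print(parse_active({}))                   # no parameter -> True
print(parse_active({'active': 'false'}))  # explicit opt-out -> False
print(parse_active({'active': '1'}))      # truthy string -> True
```

The ``or True`` default is why inactive accounts are hidden by default: only a caller that sends an explicitly falsy ``active`` value widens the results.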
@@ -1,31 +1,53 @@
 # Copyright (C) 2016-2016 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/

 from zope.interface import implementer

-from rhodecode.interfaces import IUserRegistered
+from rhodecode.interfaces import (
+    IUserRegistered, IUserPreCreate, IUserPreUpdate)


 @implementer(IUserRegistered)
 class UserRegistered(object):
     """
     An instance of this class is emitted as an :term:`event` whenever a user
     account is registered.
     """
     def __init__(self, user, session):
         self.user = user
         self.session = session
+
+
+@implementer(IUserPreCreate)
+class UserPreCreate(object):
+    """
+    An instance of this class is emitted as an :term:`event` before a new user
+    object is created.
+    """
+    def __init__(self, user_data):
+        self.user_data = user_data
+
+
+@implementer(IUserPreUpdate)
+class UserPreUpdate(object):
+    """
+    An instance of this class is emitted as an :term:`event` before a user
+    object is updated.
+    """
+    def __init__(self, user, user_data):
+        self.user = user
+        self.user_data = user_data
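Pre-create/pre-update events like the ones added above are typically fired with the pending data before the model is touched, so subscribers can inspect or adjust that data. A hypothetical, self-contained sketch of that pattern (the ``subscribe``/``trigger`` dispatcher and the handler are invented for illustration and omit the ``zope.interface`` declarations; they are not RhodeCode's event machinery):

```python
class UserPreCreate(object):
    """Emitted before a new user object is created (mirrors the event class
    above, minus the @implementer declaration)."""
    def __init__(self, user_data):
        self.user_data = user_data


# Hypothetical minimal dispatcher: handlers registered per process,
# called synchronously in registration order.
_subscribers = []


def subscribe(handler):
    _subscribers.append(handler)


def trigger(event):
    for handler in _subscribers:
        handler(event)


def force_lowercase_username(event):
    # A pre-create handler may normalise the pending data in place.
    event.user_data['username'] = event.user_data['username'].lower()


subscribe(force_lowercase_username)

pending = {'username': 'NewUser', 'email': 'new@example.com'}
trigger(UserPreCreate(pending))
print(pending['username'])  # -> newuser
```

Because the event carries the mutable ``user_data`` mapping rather than a finished user object, handlers run before persistence and their changes take effect in the subsequent create.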
[The remaining files in this commit are not shown: the diff viewer truncated the content of several more modified files, removed files, and removed binary files.]