@@ -0,0 +1,137 b'' | |||
|
1 | .. _checklist-tickets: | |
|
2 | ||
|
3 | ================= | |
|
4 | Ticket Checklists | |
|
5 | ================= | |
|
6 | ||
|
7 | ||
|
8 | Ticket Description | |
|
9 | ================== | |
|
10 | ||
|
11 | In general, these things really matter in the description: | |
|
12 | ||
|
13 | - Reasoning / Rationale. Explain "WHY" it makes sense and is important. | |
|
14 | ||
|
15 | - How to reproduce: clear, easy-to-follow steps. | |
|
16 | ||
|
17 | - Observation: The problem (short) | |
|
18 | ||
|
19 | - Expectation: How it should be (short) | |
|
20 | ||
|
21 | - Specs: it is fine to draft them as well as you can. | |
|
22 | ||
|
23 | If anything is unclear, please ask for a review or help on this via the | |
|
24 | Community Portal or Slack channel. | |
|
25 | ||
|
26 | ||
|
27 | Checklists for Tickets | |
|
28 | ====================== | |
|
29 | ||
|
30 | BUG | |
|
31 | --- | |
|
32 | ||
|
33 | Definition: An existing function that does not work as expected for the user. | |
|
34 | ||
|
35 | - Problem description | |
|
36 | - Steps needed to recreate (gherkin) | |
|
37 | - Link to the screen in question and/or description of how to find it via | |
|
38 | navigation | |
|
39 | - Explanation of what the expected outcome is | |
|
40 | - Any hints into the source of the problem | |
|
41 | - Information about platform/browser/db/etc. where applicable | |
|
42 | - Examples of other similar cases which have different behaviour | |
|
43 | ||
|
44 | DESIGN | |
|
45 | ------ | |
|
46 | ||
|
47 | Definition: Styling and user interface issues, including cosmetic improvements | |
|
48 | or appearance and behaviour of frontend functionality. | |
|
49 | ||
|
50 | - Screenshot/animation of existing page/behaviour | |
|
51 | - Sketches or wireframes if available | |
|
52 | - Link to the screen in question and/or description of how to find it via | |
|
53 | navigation | |
|
54 | - Problem description | |
|
55 | - Explanation of what the expected outcome is | |
|
56 | - Since this may be examined by a designer, it should be written in a way that a | |
|
57 | non-developer can understand | |
|
58 | ||
|
59 | EPIC | |
|
60 | ---- | |
|
61 | ||
|
62 | Definition: A collection of tickets which together complete a larger overall | |
|
63 | project. | |
|
64 | ||
|
65 | - Benefit explanation | |
|
66 | - Clear objective - when is this complete? | |
|
67 | - Explanations of exceptions/corner cases | |
|
68 | - Documentation subtask | |
|
69 | - Comprehensive wireframes and/or design subtasks | |
|
70 | - Links to subtasks | |
|
71 | ||
|
72 | FEATURE | |
|
73 | ------- | |
|
74 | ||
|
75 | Definition: A new function in the software which previously did not exist. | |
|
76 | ||
|
77 | - Benefit explanation | |
|
78 | - Clear objective | |
|
79 | - Explanations of exceptions/corner cases | |
|
80 | - Documentation subtask | |
|
81 | - Comprehensive wireframes and/or design subtasks | |
|
82 | ||
|
83 | SUPPORT | |
|
84 | ------- | |
|
85 | ||
|
86 | Definition: An issue related to a customer report. | |
|
87 | ||
|
88 | - Link to support ticket, if available | |
|
89 | - Problem description | |
|
90 | - Steps needed to recreate (gherkin) | |
|
91 | - Link to the screen in question and/or description of how to find it via | |
|
92 | navigation | |
|
93 | - Explanation of what the expected outcome is | |
|
94 | - Any hints into the source of the problem | |
|
95 | - Information about platform/browser/db/etc. where applicable | |
|
96 | - Examples of other similar cases which have different behaviour | |
|
97 | ||
|
98 | TASK | |
|
99 | ---- | |
|
100 | ||
|
101 | Definition: An improvement or step towards implementing a feature or fixing | |
|
102 | a bug. Includes refactoring and other tech debt. | |
|
103 | ||
|
104 | - Clear objective | |
|
105 | - Benefit explanation | |
|
106 | - Links to parent/related tickets | |
|
107 | ||
|
108 | ||
|
109 | All details below. | |
|
110 | ||
|
111 | ||
|
112 | External links: | |
|
113 | ||
|
114 | - Avoid linking to external images; they disappear over time. Please attach any | |
|
115 | relevant images to the ticket itself. | |
|
116 | ||
|
117 | - External links in general: They also disappear over time, consider copying the | |
|
118 | relevant bit of information into a comment or write a paragraph to sum up the | |
|
119 | general idea. | |
|
120 | ||
|
121 | ||
|
122 | Hints | |
|
123 | ===== | |
|
124 | ||
|
125 | Change Description | |
|
126 | ------------------ | |
|
127 | ||
|
128 | It can be tricky to figure out how to change the description of a ticket: a | |
|
129 | very small pencil icon has to be clicked once you open the edit form of a | |
|
130 | ticket. | |
|
131 | ||
|
132 | ||
|
133 | .. figure:: images/redmine-description.png | |
|
134 | :alt: Example of pencil to change the ticket description | |
|
135 | ||
|
136 | Shows an example of the pencil which lets you change the description. | |
|
137 |
@@ -0,0 +1,153 b'' | |||
|
1 | ||
|
2 | ================================================== | |
|
3 | Code style and structure guide for frontend work | |
|
4 | ================================================== | |
|
5 | ||
|
6 | About: Outline of frontend development practices. | |
|
7 | ||
|
8 | ||
|
9 | ||
|
10 | ||
|
11 | Templates | |
|
12 | ========= | |
|
13 | ||
|
14 | - Indent with 4 spaces in general. | |
|
15 | - Embedded Python code follows the same conventions as in the backend. | |
|
16 | ||
|
17 | A common problem is missed spaces around operators. | |
|
18 | ||
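As a small illustration of the operator-spacing convention mentioned above (the function and names here are invented, not from the codebase):

```python
# Hypothetical snippet illustrating PEP8-style spacing around operators.

# A common mistake (missed spaces around operators):
#     total=price*quantity+shipping

# Correct: one space on each side of binary operators.
def order_total(price, quantity, shipping):
    total = price * quantity + shipping
    return total
```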
|
19 | ||
|
20 | ||
|
21 | ||
|
22 | Grunt | |
|
23 | ===== | |
|
24 | ||
|
25 | We use Grunt to compile our JavaScript and LESS files. This is done automatically | |
|
26 | when you start an instance. If you are changing these files, however, it is | |
|
27 | recommended to append `--reload` to the `runserver` command, or to use `grunt watch` | |
|
28 | - the Gruntfile is located in the base directory. For more info on Grunt, see | |
|
29 | http://gruntjs.com/ | |
|
30 | ||
|
31 | ||
|
32 | ||
|
33 | ||
|
34 | LESS CSS | |
|
35 | ======== | |
|
36 | ||
|
37 | ||
|
38 | Style | |
|
39 | ----- | |
|
40 | ||
|
41 | - Use 4 spaces instead of tabs. | |
|
42 | - Avoid ``!important``; it is very often an indicator for a problem. | |
|
43 | ||
|
44 | ||
|
45 | ||
|
46 | ||
|
47 | Structure | |
|
48 | --------- | |
|
49 | ||
|
50 | It is important that we maintain consistency in the LESS files so that things | |
|
51 | scale properly. CSS is organized using LESS and then compiled into a CSS file | |
|
52 | to be used in production. Find the class you need to change and change it | |
|
53 | there. Do not add overriding styles at the end of the file. The LESS file will | |
|
54 | be minified; use plenty of spacing and comments for readability. | |
|
55 | ||
|
56 | These are kept in auxiliary LESS files to be imported (in this order) at the top: | |
|
57 | ||
|
58 | - `fonts.less` (font-face declarations) | |
|
59 | - `mixins` (place all LESS mixins here) | |
|
60 | - `helpers` (basic classes for hiding mobile elements, centering, etc) | |
|
61 | - `variables` (theme-specific colors, spacing, and fonts which might change later) | |
|
62 | ||
|
63 | ||
|
64 | Sections of the primary LESS file are as follows. Add comments describing | |
|
65 | layout and modules. | |
|
66 | ||
|
67 | .. code-block:: css | |
|
68 | ||
|
69 | //--- BASE ------------------// | |
|
70 | Very basic, sitewide styles. | |
|
71 | ||
|
72 | //--- LAYOUT ------------------// | |
|
73 | Essential layout, ex. containers and wrappers. | |
|
74 | Do not put type styles in here. | |
|
75 | ||
|
76 | //--- MODULES ------------------// | |
|
77 | Reusable sections, such as sidebars and menus. | |
|
78 | ||
|
79 | //--- THEME ------------------// | |
|
80 | Theme styles, typography, etc. | |
|
81 | ||
|
82 | ||
|
83 | ||
|
84 | Formatting rules | |
|
85 | ~~~~~~~~~~~~~~~~ | |
|
86 | ||
|
87 | - Each rule should be indented on a separate line (this is helpful for diff | |
|
88 | checking). | |
|
89 | ||
|
90 | - Use a space after each colon, and end every rule, including the last, with a semicolon. | |
|
91 | ||
|
92 | - Put a blank line between each class. | |
|
93 | ||
|
94 | - Nested classes should be listed after the parent class' rules, separated with a | |
|
95 | blank line, and indented. | |
|
96 | ||
|
97 | - Using the below as a guide, place each rule in order of its effect on content, | |
|
98 | layout, sizing, and last listing minor style changes such as font color and | |
|
99 | backgrounds. Not every possible rule is listed here; when adding new ones, | |
|
100 | judge where it should go in the list based on that hierarchy. | |
|
101 | ||
|
102 | .. code-block:: scss | |
|
103 | ||
|
104 | .class { | |
|
105 | content | |
|
106 | list-style-type | |
|
107 | position | |
|
108 | float | |
|
109 | top | |
|
110 | right | |
|
111 | bottom | |
|
112 | left | |
|
113 | height | |
|
114 | max-height | |
|
115 | min-height | |
|
116 | width | |
|
117 | max-width | |
|
118 | min-width | |
|
119 | margin | |
|
120 | padding | |
|
121 | text-indent | |
|
122 | vertical-align | |
|
123 | text-align | |
|
124 | border | |
|
125 | border-radius | |
|
126 | font-size | |
|
127 | line-height | |
|
128 | font | |
|
129 | font-style | |
|
130 | font-variant | |
|
131 | font-weight | |
|
132 | color | |
|
133 | text-shadow | |
|
134 | background | |
|
135 | background-color | |
|
136 | box-shadow | |
|
137 | background-image | |
|
138 | background-position | |
|
139 | background-repeat | |
|
140 | background-size | |
|
141 | transition | |
|
142 | cursor | |
|
143 | pointer-events | |
|
144 | ||
|
145 | .nested-class { | |
|
146 | position | |
|
147 | background-color | |
|
148 | ||
|
149 | &:hover { | |
|
150 | color | |
|
151 | } | |
|
152 | } | |
|
153 | } |
@@ -0,0 +1,111 b'' | |||
|
1 | ||
|
2 | ======================= | |
|
3 | Contributing Overview | |
|
4 | ======================= | |
|
5 | ||
|
6 | ||
|
7 | RhodeCode Community Edition is an open source code management platform. We | |
|
8 | encourage contributions to our project from the community. This is a basic | |
|
9 | overview of the procedures for adding your contribution to RhodeCode. | |
|
10 | ||
|
11 | ||
|
12 | ||
|
13 | Check the Issue Tracker | |
|
14 | ======================= | |
|
15 | ||
|
16 | Make an account at https://issues.rhodecode.com/account/register and browse the | |
|
17 | current tickets for bugs to fix and tasks to do. Have a bug or feature that you | |
|
18 | can't find in the tracker? Create a new issue for it. When you select a ticket, | |
|
19 | make sure to assign it to yourself and mark it "in progress" to avoid duplicated | |
|
20 | work. | |
|
21 | ||
|
22 | ||
|
23 | ||
|
24 | Sign Up at code.rhodecode.com | |
|
25 | ============================= | |
|
26 | ||
|
27 | Make an account at https://code.rhodecode.com/ using an email or your existing | |
|
28 | GitHub, Bitbucket, Google, or Twitter account. Fork the repo you'd like to | |
|
29 | contribute to; we suggest adding your username to the fork name. Clone your fork | |
|
30 | to your computer. We use Mercurial for source control management; see | |
|
31 | https://www.mercurial-scm.org/guide to get started quickly. | |
|
32 | ||
|
33 | ||
|
34 | ||
|
35 | Set Up A Local Instance | |
|
36 | ======================= | |
|
37 | ||
|
38 | You will need to set up an instance of RhodeCode CE using VCSServer so that you | |
|
39 | can see your work locally as you make changes. We recommend using Linux for this | |
|
40 | but it can also be built on OSX. | |
|
41 | ||
|
42 | See :doc:`dev-setup` for instructions. | |
|
43 | ||
|
44 | ||
|
45 | ||
|
46 | Code! | |
|
47 | ===== | |
|
48 | ||
|
49 | You can now make, see, and test your changes locally. We work continually to | |
|
50 | keep our code clean and the cost of maintaining it low. This applies in the same | |
|
51 | way to contributions. We run automated checks on our pull requests, and expect | |
|
52 | understandable code. We also aim to provide test coverage for as much of our | |
|
53 | codebase as possible; any new features should be augmented with tests. | |
|
54 | ||
|
55 | Keep in mind that when we accept your contribution, we also take responsibility | |
|
56 | for it; we must understand it to take on that responsibility. | |
|
57 | ||
|
58 | See :doc:`standards` for more detailed information. | |
|
59 | ||
|
60 | ||
|
61 | ||
|
62 | Commit And Push Your Changes | |
|
63 | ============================ | |
|
64 | ||
|
65 | We highly recommend making a new bookmark for each feature, bug, or set of | |
|
66 | commits you make so that you can point to it when creating your pull request. | |
|
67 | Please also reference the ticket number in your commit messages. Don't forget to | |
|
68 | push the bookmark! | |
|
69 | ||
|
70 | ||
|
71 | ||
|
72 | Submit a Pull Request | |
|
73 | ===================== | |
|
74 | ||
|
75 | Go to your fork, and choose "Create Pull Request" from the Options menu. Use | |
|
76 | your bookmark as the source, and choose someone to review it. Don't worry about | |
|
77 | choosing the right person; we'll assign the best contributor for the job. You'll | |
|
78 | get feedback and an assigned status. | |
|
79 | ||
|
80 | Be prepared to make updates to your pull request after some feedback. | |
|
81 | Collaboration is part of the process and improvements can often be made. | |
|
82 | ||
|
83 | ||
|
84 | ||
|
85 | Sign the Contributor License Agreement | |
|
86 | ====================================== | |
|
87 | ||
|
88 | If your contribution is approved, you will need to virtually sign the license | |
|
89 | agreement in order for it to be merged into the project's codebase. You can read | |
|
90 | it on our website here: https://rhodecode.com/static/pdf/RhodeCode-CLA.pdf | |
|
91 | ||
|
92 | To sign, go to code.rhodecode.com | |
|
93 | and clone the CLA repository. Add your name and make a pull request to add it to | |
|
94 | the contributor agreement; this serves as your virtual signature. Once your | |
|
95 | signature is merged, add a link to the relevant commit to your contribution | |
|
96 | pull request. | |
|
97 | ||
|
98 | ||
|
99 | ||
|
100 | That's it! We'll take it from there. Thanks for your contribution! | |
|
101 | ------------------------------------------------------------------ | |
|
102 | ||
|
103 | .. note:: If you have any questions or comments, feel free to contact us through | |
|
104 | either the community portal (community.rhodecode.com), IRC | |
|
105 | (irc.freenode.net), or Slack (rhodecode.com/join). | |
|
106 | ||
|
107 | ||
|
108 | ||
|
109 | ||
|
110 | ||
|
111 |
@@ -0,0 +1,177 b'' | |||
|
1 | ||
|
2 | ====================== | |
|
3 | Contribution Standards | |
|
4 | ====================== | |
|
5 | ||
|
6 | Standards help to improve the quality of our product and its development. Herein | |
|
7 | we define our standards for processes and development to maintain consistency | |
|
8 | and function well as a community. It is a work in progress; modifications to this | |
|
9 | document should be discussed and agreed upon by the community. | |
|
10 | ||
|
11 | ||
|
12 | -------------------------------------------------------------------------------- | |
|
13 | ||
|
14 | Code | |
|
15 | ==== | |
|
16 | ||
|
17 | This provides an outline for standards we use in our codebase to keep our code | |
|
18 | easy to read and easy to maintain. Much of our code guidelines are based on the | |
|
19 | book `Clean Code <http://www.pearsonhighered.com/educator/product/Clean-Code-A-Handbook-of-Agile-Software-Craftsmanship/9780132350884.page>`_ | |
|
20 | by Robert Martin. | |
|
21 | ||
|
22 | We maintain a Tech Glossary to provide consistency in terms and symbolic names | |
|
23 | used for items and concepts within the application. This is found in the CE | |
|
24 | project in /docs-internal/glossary.rst | |
|
25 | ||
|
26 | ||
|
27 | Refactoring | |
|
28 | ----------- | |
|
29 | Make it better than you found it! | |
|
30 | ||
|
31 | Our codebase can always use improvement and often benefits from refactoring. | |
|
32 | New code should be refactored as it is being written, and old code should be | |
|
33 | treated with the same care as if it was new. Before doing any refactoring, | |
|
34 | ensure that there is test coverage on the affected code; this will help | |
|
35 | minimize issues. | |
|
36 | ||
|
37 | ||
|
38 | Python | |
|
39 | ------ | |
|
40 | For Python, we use `PEP8 <https://www.python.org/dev/peps/pep-0008/>`_. | |
|
41 | We adjust lines of code to under 80 characters and use 4 spaces for indentation. | |
|
42 | ||
|
43 | ||
|
44 | JavaScript | |
|
45 | ---------- | |
|
46 | This currently remains undefined. Suggestions welcome! | |
|
47 | ||
|
48 | ||
|
49 | HTML | |
|
50 | ---- | |
|
51 | Unfortunately, we currently have no strict HTML standards, but there are a few | |
|
52 | guidelines we do follow. Templates must work in all modern browsers. HTML should | |
|
53 | be clean and easy to read, and additionally should be free of inline CSS or | |
|
54 | JavaScript. It is recommended to use data attributes for JS actions where | |
|
55 | possible in order to separate it from styling and prevent unintentional changes. | |
|
56 | ||
|
57 | ||
|
58 | LESS/CSS | |
|
59 | -------- | |
|
60 | We use LESS for our CSS; see :doc:`frontend` for structure and formatting | |
|
61 | guidelines. | |
|
62 | ||
|
63 | ||
|
64 | Linters | |
|
65 | ------- | |
|
66 | Currently, we have a linter for pull requests which checks code against PEP8. | |
|
67 | We intend to add more in the future as we clarify standards. | |
|
68 | ||
|
69 | ||
|
70 | -------------------------------------------------------------------------------- | |
|
71 | ||
|
72 | Naming Conventions | |
|
73 | ================== | |
|
74 | ||
|
75 | These still need to be defined for naming everything from Python variables to | |
|
76 | HTML classes to files and folders. | |
|
77 | ||
|
78 | ||
|
79 | -------------------------------------------------------------------------------- | |
|
80 | ||
|
81 | Testing | |
|
82 | ======= | |
|
83 | ||
|
84 | Testing is a very important aspect of our process, especially as we are our own | |
|
85 | quality control team. While it is of course unrealistic to hit every potential | |
|
86 | combination, our goal is to cover every line of Python code with a test. | |
|
87 | ||
|
88 | The following is a brief introduction to our test suite. Our tests are primarily | |
|
89 | written using `py.test <http://pytest.org/>`_ | |
|
90 | ||
|
91 | ||
|
92 | Acceptance Tests | |
|
93 | ---------------- | |
|
94 | Also known as "ac tests", these test from the user and business perspective to | |
|
95 | check if the requirements of a feature are met. Scenarios are created at a | |
|
96 | feature's inception and help to define its value. | |
|
97 | ||
|
98 | py.test is used for ac tests; they are located in a repo separate from the | |
|
99 | other tests which follow. Each feature has a .feature file which contains a | |
|
100 | brief description and the scenarios to be tested. | |
|
101 | ||
|
102 | ||
|
103 | Functional Tests | |
|
104 | ---------------- | |
|
105 | These test specific functionality in the application which checks through the | |
|
106 | entire stack. Typically these are user actions or permissions which go through | |
|
107 | the web browser. They are located in rhodecode/tests. | |
|
108 | ||
|
109 | ||
|
110 | Unit Tests | |
|
111 | ---------- | |
|
112 | These test isolated, individual modules to ensure that they function correctly. | |
|
113 | They are located in rhodecode/tests. | |
|
114 | ||
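As a minimal sketch of the py.test style used for these tests (the `slugify` helper is invented for illustration and is not from the RhodeCode codebase):

```python
# Hypothetical module under test.
def slugify(name):
    # Lowercase a name, trim whitespace, and replace spaces with dashes.
    return name.strip().lower().replace(' ', '-')


# py.test collects plain functions named test_* and runs bare asserts;
# no TestCase subclassing or assert helpers are needed.
def test_slugify_replaces_spaces():
    assert slugify('My Repo Name') == 'my-repo-name'


def test_slugify_strips_whitespace():
    assert slugify('  padded  ') == 'padded'
```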
|
115 | ||
|
116 | Integration Tests | |
|
117 | ----------------- | |
|
118 | These test how larger subsystems behave together, beyond the scope of unit tests. | |
|
119 | They are located in rhodecode/tests. | |
|
120 | ||
|
121 | ||
|
122 | JavaScript Testing | |
|
123 | ------------------ | |
|
124 | Currently, we have not defined how to test our JavaScript code. | |
|
125 | ||
|
126 | ||
|
127 | -------------------------------------------------------------------------------- | |
|
128 | ||
|
129 | Pull Requests | |
|
130 | ============= | |
|
131 | ||
|
132 | Pull requests should generally contain only one thing: a single feature, one bug | |
|
133 | fix, etc. The commit history should be concise and clean, and the pull request | |
|
134 | should contain the ticket number (also a good idea for the commits themselves) | |
|
135 | to provide context for the reviewer. | |
|
136 | ||
|
137 | See also: :doc:`checklist-pull-request` | |
|
138 | ||
|
139 | ||
|
140 | Reviewers | |
|
141 | --------- | |
|
142 | Each pull request must be approved by at least one member of the RhodeCode | |
|
143 | team. Assignments may be based on expertise or familiarity with a particular | |
|
144 | area of code, or simply availability. Switching up or adding extra community | |
|
145 | members for different pull requests helps to share knowledge as well as provide | |
|
146 | other perspectives. | |
|
147 | ||
|
148 | ||
|
149 | Responsibility | |
|
150 | -------------- | |
|
151 | The community is responsible for maintaining features and this must be taken | |
|
152 | into consideration. External contributions must be held to the same standards | |
|
153 | as internal contributions. | |
|
154 | ||
|
155 | ||
|
156 | Feature Switch | |
|
157 | -------------- | |
|
158 | Experimental and work-in-progress features can be hidden behind one of two | |
|
159 | switches: | |
|
160 | ||
|
161 | * A setting can be added to the Labs page in the Admin section which may allow | |
|
162 | customers to access and toggle additional features. | |
|
163 | * For work-in-progress or other features where customer access is not desired, | |
|
164 | use a custom setting in the .ini file as a trigger. | |
|
165 | ||
|
166 | ||
|
167 | -------------------------------------------------------------------------------- | |
|
168 | ||
|
169 | Tickets | |
|
170 | ======= | |
|
171 | ||
|
172 | Redmine tickets are a crucial part of our development process. Any code added or | |
|
173 | changed in our codebase should have a corresponding ticket to document it. With | |
|
174 | this in mind, it is important that tickets be as clear and concise as possible, | |
|
175 | including what the expected outcome is. | |
|
176 | ||
|
177 | See also: :doc:`checklist-tickets` |
@@ -0,0 +1,70 b'' | |||
|
1 | |RCE| 4.2.0 |RNS| | |
|
2 | ----------------- | |
|
3 | ||
|
4 | Release Date | |
|
5 | ^^^^^^^^^^^^ | |
|
6 | ||
|
7 | - 2016-06-30 | |
|
8 | ||
|
9 | ||
|
10 | General | |
|
11 | ^^^^^^^ | |
|
12 | ||
|
13 | - Autocomplete: add GET flag support to show/hide active users on autocomplete, | |
|
14 | also display this information in autocomplete data. ref #3374 | |
|
15 | - Gravatar: add flag to show current gravatar + user as disabled user (non-active) | |
|
16 | - Repos, repo groups, user groups: allow using disabled users in the owner field. | |
|
17 | This fixes #3374. | |
|
18 | - Repos, repo groups, user groups: visually show which user owns a | |
|
19 | resource, and whether that user is disabled in the system. | |
|
20 | - Pull requests: reorder navigation on repo pull requests, fixes #2995 | |
|
21 | - Dependencies: bump dulwich to 0.13.0 | |
|
22 | ||
|
23 | New Features | |
|
24 | ^^^^^^^^^^^^ | |
|
25 | ||
|
26 | - My account: made the aggregate pull requests view look like it was not | |
|
27 | created in 1995. It now uses a look consistent with the repo view. | |
|
28 | - Emails: expose profile link on the registration email that super-admins receive. | |
|
29 | Implements #3999. | |
|
30 | - Social auth: move the buttons to the top of nav so they are easier to reach. | |
|
31 | ||
|
32 | ||
|
33 | Security | |
|
34 | ^^^^^^^^ | |
|
35 | ||
|
36 | - Encryption: allow passing in an alternative key for encrypted values. Users | |
|
37 | can now use `rhodecode.encrypted_values.secret` as an alternative key, | |
|
38 | decoupled from the `beaker.session` key. | |
|
39 | - Encryption: Implement a slightly improved AesCipher encryption. | |
|
40 | This addresses issues from #4036. | |
|
41 | - Encryption: encrypted values now use HMAC signatures by default to detect | |
|
42 | whether the data or secret key is incorrect. The strict checks can be | |
|
43 | disabled with the `rhodecode.encrypted_values.strict = false` .ini setting. | |
|
44 | ||
|
45 | ||
|
46 | Performance | |
|
47 | ^^^^^^^^^^^ | |
|
48 | ||
|
49 | - Sql: use smarter JOINs when fetching repository information | |
|
50 | - Helpers: optimize slight calls for link_to_user to save some intense queries. | |
|
51 | - App settings: use computed caches for repository settings, this in some cases | |
|
52 | brings almost 4x performance increase for large repos with a lot of issue | |
|
53 | tracker patterns. | |
|
54 | ||
|
55 | ||
|
56 | Fixes | |
|
57 | ^^^^^ | |
|
58 | ||
|
59 | - Fixed events on user pre/post create actions | |
|
60 | - Authentication: fixed problem with saving forms with errors on auth plugins | |
|
61 | - Svn: Avoid chunked transfer for Subversion that caused checkout issues in some cases. | |
|
62 | - Users: fix generate new user password helper. | |
|
63 | - Celery: fixed problem with workers running actions in sync mode in some cases. | |
|
64 | - Setup-db: fix redundant question on writable dir. The question needs to be | |
|
65 | asked only when the dir is actually not writable. | |
|
66 | - Elasticsearch: fixed issues when searching single repo using elastic search | |
|
67 | - Social auth: fix issues with non-active users using social authentication | |
|
68 | causing a 500 error. | |
|
69 | - Fixed problem with largefiles extensions on per-repo settings using local | |
|
70 | .hgrc files present inside the repo directory. |
@@ -0,0 +1,40 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | ||
|
22 | from rhodecode.admin.navigation import NavigationRegistry | |
|
23 | from rhodecode.config.routing import ADMIN_PREFIX | |
|
24 | from rhodecode.lib.utils2 import str2bool | |
|
25 | ||
|
26 | ||
|
27 | def includeme(config): | |
|
28 | settings = config.get_settings() | |
|
29 | ||
|
30 | # Create admin navigation registry and add it to the pyramid registry. | |
|
31 | labs_active = str2bool(settings.get('labs_settings_active', False)) | |
|
32 | navigation_registry = NavigationRegistry(labs_active=labs_active) | |
|
33 | config.registry.registerUtility(navigation_registry) | |
|
34 | ||
|
35 | config.add_route( | |
|
36 | name='admin_settings_open_source', | |
|
37 | pattern=ADMIN_PREFIX + '/settings/open_source') | |
|
38 | ||
|
39 | # Scan module for configuration decorators. | |
|
40 | config.scan() |
@@ -0,0 +1,29 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | from zope.interface import Interface | |
|
22 | ||
|
23 | ||
|
24 | class IAdminNavigationRegistry(Interface): | |
|
25 | """ | |
|
26 | Interface for the admin navigation registry. Currently this is only | |
|
27 | used to register and retrieve it via pyramid's registry. | |
|
28 | """ | |
|
29 | pass |
@@ -0,0 +1,124 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | ||
|
22 | import logging | |
|
23 | import collections | |
|
24 | ||
|
25 | from pylons import url | |
|
26 | from zope.interface import implementer | |
|
27 | ||
|
28 | from rhodecode.admin.interfaces import IAdminNavigationRegistry | |
|
29 | from rhodecode.lib.utils import get_registry | |
|
30 | from rhodecode.translation import _ | |
|
31 | ||
|
32 | ||
|
33 | log = logging.getLogger(__name__) | |
|
34 | ||
|
35 | NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url']) | |
|
36 | ||
|
37 | ||
|
38 | class NavEntry(object): | |
|
39 | """ | |
|
40 | Represents an entry in the admin navigation. | |
|
41 | ||
|
42 | :param key: Unique identifier used to store reference in an OrderedDict. | |
|
43 | :param name: Display name, usually a translation string. | |
|
44 | :param view_name: Name of the view, used generate the URL. | |
|
45 | :param pyramid: Indicator to use pyramid for URL generation. This should | |
|
46 | be removed as soon as we are fully migrated to pyramid. | |
|
47 | """ | |
|
48 | ||
|
49 | def __init__(self, key, name, view_name, pyramid=False): | |
|
50 | self.key = key | |
|
51 | self.name = name | |
|
52 | self.view_name = view_name | |
|
53 | self.pyramid = pyramid | |
|
54 | ||
|
55 | def generate_url(self, request): | |
|
56 | if self.pyramid: | |
|
57 | if hasattr(request, 'route_path'): | |
|
58 | return request.route_path(self.view_name) | |
|
59 | else: | |
|
60 | # TODO: johbo: Remove this after migrating to pyramid. | |
|
61 | # We need the pyramid request here to generate URLs to pyramid | |
|
62 | # views from within pylons views. | |
|
63 | from pyramid.threadlocal import get_current_request | |
|
64 | pyramid_request = get_current_request() | |
|
65 | return pyramid_request.route_path(self.view_name) | |
|
66 | else: | |
|
67 | return url(self.view_name) | |
|
68 | ||
|
69 | ||
|
70 | @implementer(IAdminNavigationRegistry) | |
|
71 | class NavigationRegistry(object): | |
|
72 | ||
|
73 | _base_entries = [ | |
|
74 | NavEntry('global', _('Global'), 'admin_settings_global'), | |
|
75 | NavEntry('vcs', _('VCS'), 'admin_settings_vcs'), | |
|
76 | NavEntry('visual', _('Visual'), 'admin_settings_visual'), | |
|
77 | NavEntry('mapping', _('Remap and Rescan'), 'admin_settings_mapping'), | |
|
78 | NavEntry('issuetracker', _('Issue Tracker'), | |
|
79 | 'admin_settings_issuetracker'), | |
|
80 | NavEntry('email', _('Email'), 'admin_settings_email'), | |
|
81 | NavEntry('hooks', _('Hooks'), 'admin_settings_hooks'), | |
|
82 | NavEntry('search', _('Full Text Search'), 'admin_settings_search'), | |
|
83 | NavEntry('system', _('System Info'), 'admin_settings_system'), | |
|
84 | NavEntry('open_source', _('Open Source Licenses'), | |
|
85 | 'admin_settings_open_source', pyramid=True), | |
|
86 | # TODO: marcink: we disable supervisor now until the supervisor stats | |
|
87 | # page is fixed in the nix configuration | |
|
88 | # NavEntry('supervisor', _('Supervisor'), 'admin_settings_supervisor'), | |
|
89 | ] | |
|
90 | ||
|
91 | _labs_entry = NavEntry('labs', _('Labs'), | |
|
92 | 'admin_settings_labs') | |
|
93 | ||
|
94 | def __init__(self, labs_active=False): | |
|
95 | self._registered_entries = collections.OrderedDict([ | |
|
96 | (item.key, item) for item in self.__class__._base_entries | |
|
97 | ]) | |
|
98 | ||
|
99 | if labs_active: | |
|
100 | self.add_entry(self._labs_entry) | |
|
101 | ||
|
102 | def add_entry(self, entry): | |
|
103 | self._registered_entries[entry.key] = entry | |
|
104 | ||
|
105 | def get_navlist(self, request): | |
|
106 | navlist = [NavListEntry(i.key, i.name, i.generate_url(request)) | |
|
107 | for i in self._registered_entries.values()] | |
|
108 | return navlist | |
|
109 | ||
|
110 | ||
|
111 | def navigation_registry(request): | |
|
112 | """ | |
|
113 | Helper that returns the admin navigation registry. | |
|
114 | """ | |
|
115 | pyramid_registry = get_registry(request) | |
|
116 | nav_registry = pyramid_registry.queryUtility(IAdminNavigationRegistry) | |
|
117 | return nav_registry | |
|
118 | ||
|
119 | ||
|
120 | def navigation_list(request): | |
|
121 | """ | |
|
122 | Helper that returns the admin navigation as list of NavListEntry objects. | |
|
123 | """ | |
|
124 | return navigation_registry(request).get_navlist(request) |
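The navigation module above keys its entries in an `OrderedDict`, so insertion order is preserved and re-registering an existing key replaces the entry in place. A minimal, self-contained sketch of that registry pattern (using plain tuples instead of the RhodeCode `NavEntry`/request machinery, which is assumed here):

```python
import collections

# Same shape as the NavListEntry namedtuple in the module above.
NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url'])


class SimpleNavRegistry(object):
    """Toy stand-in for NavigationRegistry: an ordered, key-addressed list."""

    def __init__(self, base_entries):
        # OrderedDict keeps the order base entries were declared in.
        self._entries = collections.OrderedDict(
            (entry.key, entry) for entry in base_entries)

    def add_entry(self, entry):
        # A duplicate key overwrites in place; a new key appends at the end.
        self._entries[entry.key] = entry

    def get_navlist(self):
        return list(self._entries.values())


reg = SimpleNavRegistry([
    NavListEntry('global', 'Global', '/admin/settings/global'),
    NavListEntry('vcs', 'VCS', '/admin/settings/vcs'),
])
reg.add_entry(NavListEntry('labs', 'Labs', '/admin/settings/labs'))
print([entry.key for entry in reg.get_navlist()])  # ['global', 'vcs', 'labs']
```

This mirrors how the optional `labs` entry is appended after the `_base_entries` during `__init__` when `labs_active` is set.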
@@ -0,0 +1,55 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | import collections | |
|
22 | import logging | |
|
23 | ||
|
24 | from pylons import tmpl_context as c | |
|
25 | from pyramid.view import view_config | |
|
26 | ||
|
27 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator | |
|
28 | from rhodecode.lib.utils import read_opensource_licenses | |
|
29 | ||
|
30 | from .navigation import navigation_list | |
|
31 | ||
|
32 | ||
|
33 | log = logging.getLogger(__name__) | |
|
34 | ||
|
35 | ||
|
36 | class AdminSettingsView(object): | |
|
37 | ||
|
38 | def __init__(self, context, request): | |
|
39 | self.request = request | |
|
40 | self.context = context | |
|
41 | self.session = request.session | |
|
42 | self._rhodecode_user = request.user | |
|
43 | ||
|
44 | @LoginRequired() | |
|
45 | @HasPermissionAllDecorator('hg.admin') | |
|
46 | @view_config( | |
|
47 | route_name='admin_settings_open_source', request_method='GET', | |
|
48 | renderer='rhodecode:templates/admin/settings/settings.html') | |
|
49 | def open_source_licenses(self): | |
|
50 | c.active = 'open_source' | |
|
51 | c.navlist = navigation_list(self.request) | |
|
52 | c.opensource_licenses = collections.OrderedDict( | |
|
53 | sorted(read_opensource_licenses().items(), key=lambda t: t[0])) | |
|
54 | ||
|
55 | return {} |
|
1 | NO CONTENT: new file 100644 |
@@ -0,0 +1,92 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | ||
|
22 | import pytest | |
|
23 | ||
|
24 | ||
|
25 | class EnabledAuthPlugin(): | |
|
26 | """ | |
|
27 | Context manager that updates the 'auth_plugins' setting in DB to enable | |
|
28 | a plugin. Previous setting is restored on exit. The rhodecode auth plugin | |
|
29 | is also enabled because it is needed to log in the test users. | |
|
30 | """ | |
|
31 | ||
|
32 | def __init__(self, plugin): | |
|
33 | self.new_value = set([ | |
|
34 | 'egg:rhodecode-enterprise-ce#rhodecode', | |
|
35 | plugin.get_id() | |
|
36 | ]) | |
|
37 | ||
|
38 | def __enter__(self): | |
|
39 | from rhodecode.model.settings import SettingsModel | |
|
40 | self._old_value = SettingsModel().get_auth_plugins() | |
|
41 | SettingsModel().create_or_update_setting( | |
|
42 | 'auth_plugins', ','.join(self.new_value)) | |
|
43 | ||
|
44 | def __exit__(self, type, value, traceback): | |
|
45 | from rhodecode.model.settings import SettingsModel | |
|
46 | SettingsModel().create_or_update_setting( | |
|
47 | 'auth_plugins', ','.join(self._old_value)) | |
|
48 | ||
|
49 | ||
|
50 | class DisabledAuthPlugin(): | |
|
51 | """ | |
|
52 | Context manager that updates the 'auth_plugins' setting in DB to disable | |
|
53 | a plugin. Previous setting is restored on exit. | |
|
54 | """ | |
|
55 | ||
|
56 | def __init__(self, plugin): | |
|
57 | self.plugin_id = plugin.get_id() | |
|
58 | ||
|
59 | def __enter__(self): | |
|
60 | from rhodecode.model.settings import SettingsModel | |
|
61 | self._old_value = SettingsModel().get_auth_plugins() | |
|
62 | new_value = [id_ for id_ in self._old_value if id_ != self.plugin_id] | |
|
63 | SettingsModel().create_or_update_setting( | |
|
64 | 'auth_plugins', ','.join(new_value)) | |
|
65 | ||
|
66 | def __exit__(self, type, value, traceback): | |
|
67 | from rhodecode.model.settings import SettingsModel | |
|
68 | SettingsModel().create_or_update_setting( | |
|
69 | 'auth_plugins', ','.join(self._old_value)) | |
|
70 | ||
|
71 | ||
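Both context managers above follow the same snapshot/restore pattern: capture the old setting in `__enter__`, write the new value, and restore the snapshot in `__exit__` so the change cannot leak out of the `with` block, even when the body raises. A generic sketch of that pattern, with a plain dict standing in for the hypothetical `SettingsModel` storage:

```python
# Hypothetical settings store; the real code persists via SettingsModel.
SETTINGS = {'auth_plugins': 'egg:ce#rhodecode,egg:ce#ldap'}


class TemporarySetting(object):
    """Temporarily override one setting; the old value is always restored."""

    def __init__(self, key, new_value):
        self.key = key
        self.new_value = new_value

    def __enter__(self):
        # Snapshot the previous value before overwriting it.
        self._old_value = SETTINGS[self.key]
        SETTINGS[self.key] = self.new_value
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Runs on normal exit and on exceptions alike.
        SETTINGS[self.key] = self._old_value


with TemporarySetting('auth_plugins', 'egg:ce#rhodecode'):
    assert SETTINGS['auth_plugins'] == 'egg:ce#rhodecode'
# Outside the block the original value is back.
assert SETTINGS['auth_plugins'] == 'egg:ce#rhodecode,egg:ce#ldap'
```

`EnabledAuthPlugin` additionally forces the built-in `rhodecode` plugin into the new value so the test users can still log in while the plugin under test is active.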
|
72 | @pytest.fixture(params=[ | |
|
73 | ('rhodecode.authentication.plugins.auth_crowd', 'egg:rhodecode-enterprise-ce#crowd'), | |
|
74 | ('rhodecode.authentication.plugins.auth_headers', 'egg:rhodecode-enterprise-ce#headers'), | |
|
75 | ('rhodecode.authentication.plugins.auth_jasig_cas', 'egg:rhodecode-enterprise-ce#jasig_cas'), | |
|
76 | ('rhodecode.authentication.plugins.auth_ldap', 'egg:rhodecode-enterprise-ce#ldap'), | |
|
77 | ('rhodecode.authentication.plugins.auth_pam', 'egg:rhodecode-enterprise-ce#pam'), | |
|
78 | ('rhodecode.authentication.plugins.auth_rhodecode', 'egg:rhodecode-enterprise-ce#rhodecode'), | |
|
79 | ('rhodecode.authentication.plugins.auth_token', 'egg:rhodecode-enterprise-ce#token'), | |
|
80 | ]) | |
|
81 | def auth_plugin(request): | |
|
82 | """ | |
|
83 | Fixture that provides an instance for each authentication plugin. These | |
|
84 | instances are NOT the instances which are registered to the authentication | |
|
85 | registry. | |
|
86 | """ | |
|
87 | from importlib import import_module | |
|
88 | ||
|
89 | # Create plugin instance. | |
|
90 | module, plugin_id = request.param | |
|
91 | plugin_module = import_module(module) | |
|
92 | return plugin_module.plugin_factory(plugin_id) |
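The fixture above resolves each parametrized `(module, plugin_id)` pair at run time with `importlib.import_module` and then calls the module's `plugin_factory`. The dynamic-import half of that pattern can be sketched in isolation; `json.loads` stands in for `plugin_factory` here, since the real plugin modules are not available outside RhodeCode:

```python
from importlib import import_module


def load_attr(module_path, attr_name):
    """Import a module by its dotted path and return a named attribute."""
    module = import_module(module_path)
    return getattr(module, attr_name)


# 'json'/'loads' are placeholders for the plugin module and its factory.
loads = load_attr('json', 'loads')
print(loads('{"enabled": true}'))  # {'enabled': True}
```

Resolving the callable via `getattr` is what lets a single fixture cover every plugin listed in `params` without importing any of them at collection time.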
|
1 | NO CONTENT: new file 100644 |
@@ -0,0 +1,77 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2016-2016 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | ||
|
22 | import pytest | |
|
23 | ||
|
24 | from rhodecode.authentication.tests.conftest import ( | |
|
25 | EnabledAuthPlugin, DisabledAuthPlugin) | |
|
26 | from rhodecode.config.routing import ADMIN_PREFIX | |
|
27 | ||
|
28 | ||
|
29 | @pytest.mark.usefixtures('autologin_user', 'app') | |
|
30 | class TestAuthenticationSettings: | |
|
31 | ||
|
32 | def test_auth_settings_global_view_get(self, app): | |
|
33 | url = '{prefix}/auth/'.format(prefix=ADMIN_PREFIX) | |
|
34 | response = app.get(url) | |
|
35 | assert response.status_code == 200 | |
|
36 | ||
|
37 | def test_plugin_settings_view_get(self, app, auth_plugin): | |
|
38 | url = '{prefix}/auth/{name}'.format( | |
|
39 | prefix=ADMIN_PREFIX, | |
|
40 | name=auth_plugin.name) | |
|
41 | with EnabledAuthPlugin(auth_plugin): | |
|
42 | response = app.get(url) | |
|
43 | assert response.status_code == 200 | |
|
44 | ||
|
45 | def test_plugin_settings_view_post(self, app, auth_plugin, csrf_token): | |
|
46 | url = '{prefix}/auth/{name}'.format( | |
|
47 | prefix=ADMIN_PREFIX, | |
|
48 | name=auth_plugin.name) | |
|
49 | params = { | |
|
50 | 'enabled': True, | |
|
51 | 'cache_ttl': 0, | |
|
52 | 'csrf_token': csrf_token, | |
|
53 | } | |
|
54 | with EnabledAuthPlugin(auth_plugin): | |
|
55 | response = app.post(url, params=params) | |
|
56 | assert response.status_code in [200, 302] | |
|
57 | ||
|
58 | def test_plugin_settings_view_get_404(self, app, auth_plugin): | |
|
59 | url = '{prefix}/auth/{name}'.format( | |
|
60 | prefix=ADMIN_PREFIX, | |
|
61 | name=auth_plugin.name) | |
|
62 | with DisabledAuthPlugin(auth_plugin): | |
|
63 | response = app.get(url, status=404) | |
|
64 | assert response.status_code == 404 | |
|
65 | ||
|
66 | def test_plugin_settings_view_post_404(self, app, auth_plugin, csrf_token): | |
|
67 | url = '{prefix}/auth/{name}'.format( | |
|
68 | prefix=ADMIN_PREFIX, | |
|
69 | name=auth_plugin.name) | |
|
70 | params = { | |
|
71 | 'enabled': True, | |
|
72 | 'cache_ttl': 0, | |
|
73 | 'csrf_token': csrf_token, | |
|
74 | } | |
|
75 | with DisabledAuthPlugin(auth_plugin): | |
|
76 | response = app.post(url, params=params, status=404) | |
|
77 | assert response.status_code == 404 |
@@ -0,0 +1,28 b'' | |||
|
1 | # Copyright (C) 2016 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | """ | |
|
20 | Checks around the API of the class RhodeCodeAuthPluginBase. | |
|
21 | """ | |
|
22 | ||
|
23 | from rhodecode.authentication.base import RhodeCodeAuthPluginBase | |
|
24 | ||
|
25 | ||
|
26 | def test_str_returns_plugin_id(): | |
|
27 | plugin = RhodeCodeAuthPluginBase(plugin_id='stub_plugin_id') | |
|
28 | assert str(plugin) == 'stub_plugin_id' |
|
1 | NO CONTENT: new file 100644 | |
The requested commit or file is too big and content was truncated.
@@ -1,6 +1,6 b'' | |||
|
1 | 1 | [bumpversion] |
|
2 |
current_version = 4. |
|
|
2 | current_version = 4.2.0 | |
|
3 | 3 | message = release: Bump version {current_version} to {new_version} |
|
4 | 4 | |
|
5 | 5 | [bumpversion:file:rhodecode/VERSION] |
|
6 | 6 |
@@ -1,34 +1,28 b'' | |||
|
1 | 1 | [DEFAULT] |
|
2 | 2 | done = false |
|
3 | 3 | |
|
4 | [task:fixes_on_stable] | |
|
5 | ||
|
6 | [task:changelog_updated] | |
|
7 | ||
|
4 | 8 | [task:bump_version] |
|
5 | 9 | done = true |
|
6 | 10 | |
|
7 | [task:rc_tools_pinned] | |
|
8 | done = true | |
|
11 | [task:generate_api_docs] | |
|
12 | ||
|
13 | [task:updated_translation] | |
|
9 | 14 | |
|
10 | [task:fixes_on_stable] | |
|
11 | done = true | |
|
15 | [release] | |
|
16 | state = in_progress | |
|
17 | version = 4.2.0 | |
|
18 | ||
|
19 | [task:rc_tools_pinned] | |
|
12 | 20 | |
|
13 | 21 | [task:pip2nix_generated] |
|
14 | done = true | |
|
15 | ||
|
16 | [task:changelog_updated] | |
|
17 | done = true | |
|
18 | ||
|
19 | [task:generate_api_docs] | |
|
20 | done = true | |
|
21 | 22 | |
|
22 | 23 | [task:generate_js_routes] |
|
23 | done = true | |
|
24 | ||
|
25 | [release] | |
|
26 | state = prepared | |
|
27 | version = 4.1.2 | |
|
28 | ||
|
29 | [task:updated_translation] | |
|
30 | 24 | |
|
31 | 25 | [task:updated_trial_license] |
|
32 | 26 | |
|
33 | 27 | [task:generate_oss_licenses] |
|
34 | 28 |
@@ -1,101 +1,114 b'' | |||
|
1 | 1 | ========= |
|
2 | 2 | RhodeCode |
|
3 | 3 | ========= |
|
4 | 4 | |
|
5 | 5 | About |
|
6 | 6 | ----- |
|
7 | 7 | |
|
8 | 8 | ``RhodeCode`` is a fast and powerful management tool for Mercurial_ and GIT_ |
|
9 | 9 | and Subversion_ with a built in push/pull server, full text search, |
|
10 | 10 | pull requests and powerfull code-review system. It works on http/https and |
|
11 | 11 | has a few unique features like: |
|
12 | 12 | - pluggable architecture
|
13 | 13 | - advanced permission system with IP restrictions |
|
14 | 14 | - rich set of authentication plugins including LDAP, |
|
15 | 15 | ActiveDirectory, Atlassian Crowd, Http-Headers, Pam, Token-Auth. |
|
16 | 16 | - live code-review chat |
|
17 | 17 | - full web based file editing |
|
18 | 18 | - unified multi vcs support |
|
19 | 19 | - snippets (gist) system |
|
20 | 20 | - integration with all 3rd party issue trackers |
|
21 | 21 | |
|
22 | 22 | RhodeCode also provides a rich API and multiple event hooks, so it is easily

23 | 23 | integrable with existing external systems.
|
24 | 24 | |
|
25 | 25 | RhodeCode is similar in some respects to gitlab_, github_ or bitbucket_, |
|
26 | 26 | however RhodeCode can be run as a standalone hosted application on your own server.
|
27 | 27 | RhodeCode can be installed on \*nix or Windows systems. |
|
28 | 28 | |
|
29 | 29 | RhodeCode uses `PEP386 versioning <http://www.python.org/dev/peps/pep-0386/>`_ |
|
30 | 30 | |
|
31 | 31 | Installation |
|
32 | 32 | ------------ |
|
33 | 33 | Please visit https://docs.rhodecode.com/RhodeCode-Control/tasks/install-cli.html |
|
34 | 34 | for more details |
|
35 | 35 | |
|
36 | 36 | |
|
37 | 37 | Source code |
|
38 | 38 | ----------- |
|
39 | 39 | |
|
40 | 40 | The latest sources can be obtained from official RhodeCode instance |
|
41 | 41 | https://code.rhodecode.com |
|
42 | 42 | |
|
43 | 43 | |
|
44 | Contributions | |
|
45 | ------------- | |
|
46 | ||
|
47 | RhodeCode is open-source; contributions are welcome! | |
|
48 | ||
|
49 | Please see the contribution documentation inside of the docs folder, which is | |
|
50 | also available at | |
|
51 | https://docs.rhodecode.com/RhodeCode-Enterprise/contributing/contributing.html | |
|
52 | ||
|
53 | For additional information about collaboration tools, our issue tracker, | |
|
54 | licensing, and contribution credit, visit https://rhodecode.com/open-source | |
|
55 | ||
|
56 | ||
|
44 | 57 | RhodeCode Features |
|
45 | 58 | ------------------ |
|
46 | 59 | |
|
47 | 60 | Check out all features of RhodeCode at https://rhodecode.com/features |
|
48 | 61 | |
|
49 | 62 | License |
|
50 | 63 | ------- |
|
51 | 64 | |
|
52 | 65 | ``RhodeCode`` is dual-licensed with AGPLv3 and commercial license. |
|
53 | 66 | Please see LICENSE.txt file for details. |
|
54 | 67 | |
|
55 | 68 | |
|
56 | 69 | Getting help |
|
57 | 70 | ------------ |
|
58 | 71 | |
|
59 | 72 | Listed below are various support resources that should help.
|
60 | 73 | |
|
61 | 74 | .. note:: |
|
62 | 75 | |
|
63 | 76 | Please try to read the documentation before posting any issues, especially |
|
64 | 77 | the **troubleshooting section** |
|
65 | 78 | |
|
66 | 79 | - Official issue tracker `RhodeCode Issue tracker <https://issues.rhodecode.com>`_ |
|
67 | 80 | |
|
68 | 81 | - Search our community portal `Community portal <https://community.rhodecode.com>`_ |
|
69 | 82 | |
|
70 | 83 | - Join #rhodecode on FreeNode (irc.freenode.net) |
|
71 | 84 | or use http://webchat.freenode.net/?channels=rhodecode for web access to irc. |
|
72 | 85 | |
|
73 | 86 | - You can also follow RhodeCode on twitter **@RhodeCode** where we often post |
|
74 | 87 | news and other interesting stuff about RhodeCode. |
|
75 | 88 | |
|
76 | 89 | |
|
77 | 90 | Online documentation |
|
78 | 91 | -------------------- |
|
79 | 92 | |
|
80 | 93 | Online documentation for the current version of RhodeCode is available at |
|
81 | 94 | - http://rhodecode.com/docs |
|
82 | 95 | |
|
83 | 96 | You may also build the documentation for yourself - go into ``docs/`` and run:: |
|
84 | 97 | |
|
85 | 98 | nix-build default.nix -o result && make clean html |
|
86 | 99 | |
|
87 | 100 | (You need to have sphinx_ installed to build the documentation. If you don't |
|
88 | 101 | have sphinx_ installed you can install it via the command: |
|
89 | 102 | ``pip install sphinx``) |
|
90 | 103 | |
|
91 | 104 | .. _virtualenv: http://pypi.python.org/pypi/virtualenv |
|
92 | 105 | .. _python: http://www.python.org/ |
|
93 | 106 | .. _sphinx: http://sphinx.pocoo.org/ |
|
94 | 107 | .. _mercurial: http://mercurial.selenic.com/ |
|
95 | 108 | .. _bitbucket: http://bitbucket.org/ |
|
96 | 109 | .. _github: http://github.com/ |
|
97 | 110 | .. _gitlab: http://gitlab.com/ |
|
98 | 111 | .. _subversion: http://subversion.tigris.org/ |
|
99 | 112 | .. _git: http://git-scm.com/ |
|
100 | 113 | .. _celery: http://celeryproject.org/ |
|
101 | 114 | .. _vcs: http://pypi.python.org/pypi/vcs |
@@ -1,47 +1,47 b'' | |||
|
1 | 1 | README - Quickstart |
|
2 | 2 | =================== |
|
3 | 3 | |
|
4 |
This folder contains functional tests and |
|
|
4 | This folder contains the functional tests and automation of specification | |
|
5 | 5 | examples. Details about testing can be found in |
|
6 | 6 | `/docs-internal/testing/index.rst`. |
|
7 | 7 | |
|
8 | 8 | |
|
9 | 9 | Setting up your Rhodecode Enterprise instance |
|
10 | 10 | --------------------------------------------- |
|
11 | 11 | |
|
12 | 12 | The tests will create users and repositories as needed, so you can start with a |
|
13 | 13 | new and empty instance. |
|
14 | 14 | |
|
15 | 15 | Use the following example call for the database setup of Enterprise:: |
|
16 | 16 | |
|
17 | 17 | paster setup-rhodecode \ |
|
18 | 18 | --user=admin \ |
|
19 | 19 | --email=admin@example.com \ |
|
20 | 20 | --password=secret \ |
|
21 | 21 | --api-key=9999999999999999999999999999999999999999 \ |
|
22 | 22 | your-enterprise-config.ini |
|
23 | 23 | |
|
24 | This way the username, password and auth token of the admin user will match the | |
|
24 | This way the username, password, and auth token of the admin user will match the | |
|
25 | 25 | defaults from the test run. |
|
26 | 26 | |
|
27 | 27 | |
|
28 | 28 | Usage |
|
29 | 29 | ----- |
|
30 | 30 | |
|
31 | 31 | 1. Make sure your Rhodecode Enterprise instance is running at |
|
32 | 32 | http://localhost:5000. |
|
33 | 33 | |
|
34 | 34 | 2. Enter `nix-shell` from the acceptance_tests folder:: |
|
35 | 35 | |
|
36 | 36 | cd acceptance_tests |
|
37 |
nix-shell |
|
|
37 | nix-shell | |
|
38 | 38 | |
|
39 | 39 | Make sure that `rcpkgs` and `rcnixpkgs` are available on the nix path. |
|
40 | 40 | |
|
41 | 41 | 3. Run the tests:: |
|
42 | 42 | |
|
43 | 43 | py.test -c example.ini -vs |
|
44 | 44 | |
|
45 | 45 | The parameter ``-vs`` allows you to see debugging output during the test |
|
46 | 46 | run. Check ``py.test --help`` and the documentation at http://pytest.org to |
|
47 | 47 | learn all details about the test runner. |
@@ -1,608 +1,612 b'' | |||
|
1 | 1 | ################################################################################ |
|
2 | 2 | ################################################################################ |
|
3 | 3 | # RhodeCode Enterprise - configuration file # |
|
4 | 4 | # Built-in functions and variables # |
|
5 | 5 | # The %(here)s variable will be replaced with the parent directory of this file# |
|
6 | 6 | # # |
|
7 | 7 | ################################################################################ |
|
8 | 8 | |
|
9 | 9 | [DEFAULT] |
|
10 | 10 | debug = true |
|
11 | pdebug = false | |
|
12 | 11 | ################################################################################ |
|
13 | 12 | ## Uncomment and replace with the email address which should receive ## |
|
14 | 13 | ## any error reports after an application crash ## |
|
15 | 14 | ## Additionally these settings will be used by the RhodeCode mailing system ## |
|
16 | 15 | ################################################################################ |
|
17 | 16 | #email_to = admin@localhost |
|
18 | 17 | #error_email_from = paste_error@localhost |
|
19 | 18 | #app_email_from = rhodecode-noreply@localhost |
|
20 | 19 | #error_message = |
|
21 | 20 | #email_prefix = [RhodeCode] |
|
22 | 21 | |
|
23 | 22 | #smtp_server = mail.server.com |
|
24 | 23 | #smtp_username = |
|
25 | 24 | #smtp_password = |
|
26 | 25 | #smtp_port = |
|
27 | 26 | #smtp_use_tls = false |
|
28 | 27 | #smtp_use_ssl = true |
|
29 | 28 | ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.) |
|
30 | 29 | #smtp_auth = |
|
31 | 30 | |
|
32 | 31 | [server:main] |
|
33 | 32 | ## COMMON ## |
|
34 | 33 | host = 127.0.0.1 |
|
35 | 34 | port = 5000 |
|
36 | 35 | |
|
37 | 36 | ################################## |
|
38 | 37 | ## WAITRESS WSGI SERVER ## |
|
39 | 38 | ## Recommended for Development ## |
|
40 | 39 | ################################## |
|
41 | 40 | use = egg:waitress#main |
|
42 | 41 | ## number of worker threads |
|
43 | 42 | threads = 5 |
|
44 | 43 | ## MAX BODY SIZE 100GB |
|
45 | 44 | max_request_body_size = 107374182400 |
|
46 | 45 | ## Use poll instead of select, fixes file descriptors limits problems. |
|
47 | 46 | ## May not work on old windows systems. |
|
48 | 47 | asyncore_use_poll = true |
|
49 | 48 | |
|
50 | 49 | |
|
51 | 50 | ########################## |
|
52 | 51 | ## GUNICORN WSGI SERVER ## |
|
53 | 52 | ########################## |
|
54 | 53 | ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini> |
|
55 | 54 | #use = egg:gunicorn#main |
|
56 | 55 | ## Sets the number of process workers. You must set `instance_id = *` |
|
57 | 56 | ## when this option is set to more than one worker, recommended |
|
58 | 57 | ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers |
|
59 | 58 | ## The `instance_id = *` must be set in the [app:main] section below |
|
60 | 59 | #workers = 2 |
|
61 | 60 | ## number of threads for each of the worker, must be set to 1 for gevent |
|
62 | 61 | ## generally recommended to be at 1
|
63 | 62 | #threads = 1 |
|
64 | 63 | ## process name |
|
65 | 64 | #proc_name = rhodecode |
|
66 | 65 | ## type of worker class, one of sync, gevent |
|
67 | 66 | ## for bigger setups it is recommended to use a worker class other than sync
|
68 | 67 | #worker_class = sync |
|
69 | 68 | ## The maximum number of simultaneous clients. Valid only for Gevent |
|
70 | 69 | #worker_connections = 10 |
|
71 | 70 | ## max number of requests that worker will handle before being gracefully |
|
72 | 71 | ## restarted, could prevent memory leaks |
|
73 | 72 | #max_requests = 1000 |
|
74 | 73 | #max_requests_jitter = 30 |
|
75 | 74 | ## amount of time a worker can spend with handling a request before it |
|
76 | 75 | ## gets killed and restarted. Set to 6hrs |
|
77 | 76 | #timeout = 21600 |
|
78 | 77 | |
|
79 | 78 | |
|
80 | 79 | ## prefix middleware for RhodeCode, disables force_https flag. |
|
81 | 80 | ## allows to set RhodeCode under a prefix in server. |
|
82 | 81 | ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well. |
|
83 | 82 | #[filter:proxy-prefix] |
|
84 | 83 | #use = egg:PasteDeploy#prefix |
|
85 | 84 | #prefix = /<your-prefix> |
|
86 | 85 | |
|
87 | 86 | [app:main] |
|
88 | 87 | use = egg:rhodecode-enterprise-ce |
|
89 | 88 | ## enable proxy prefix middleware, defined below |
|
90 | 89 | #filter-with = proxy-prefix |
|
91 | 90 | |
|
92 | 91 | # During development the we want to have the debug toolbar enabled |
|
93 | 92 | pyramid.includes = |
|
94 | 93 | pyramid_debugtoolbar |
|
95 | 94 | rhodecode.utils.debugtoolbar |
|
96 | 95 | rhodecode.lib.middleware.request_wrapper |
|
97 | 96 | |
|
98 | 97 | pyramid.reload_templates = true |
|
99 | 98 | |
|
100 | 99 | debugtoolbar.hosts = 0.0.0.0/0 |
|
101 | 100 | debugtoolbar.exclude_prefixes = |
|
102 | 101 | /css |
|
103 | 102 | /fonts |
|
104 | 103 | /images |
|
105 | 104 | /js |
|
106 | 105 | |
|
107 | 106 | ## RHODECODE PLUGINS ## |
|
108 | 107 | rhodecode.includes = |
|
109 | 108 | rhodecode.api |
|
110 | 109 | |
|
111 | 110 | |
|
112 | 111 | # api prefix url |
|
113 | 112 | rhodecode.api.url = /_admin/api |
|
114 | 113 | |
|
115 | 114 | |
|
116 | 115 | ## END RHODECODE PLUGINS ## |
|
117 | 116 | |
|
117 | ## encryption key used to encrypt social plugin tokens, | |
|
118 | ## remote_urls with credentials etc, if not set it defaults to | |
|
119 | ## `beaker.session.secret` | |
|
120 | #rhodecode.encrypted_values.secret = | |
|
121 | ||
|
122 | ## decryption strict mode (enabled by default). It controls if decryption raises | |
|
123 | ## `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |
|
124 | #rhodecode.encrypted_values.strict = false | |
|
125 | ||
|
118 | 126 | full_stack = true |
|
119 | 127 | |
|
120 | 128 | ## Serve static files via RhodeCode, disable to serve them via HTTP server |
|
121 | 129 | static_files = true |
|
122 | 130 | |
|
131 | # autogenerate javascript routes file on startup | |
|
132 | generate_js_files = false | |
|
133 | ||
|
123 | 134 | ## Optional Languages |
|
124 | 135 | ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh |
|
125 | 136 | lang = en |
|
126 | 137 | |
|
127 | 138 | ## perform a full repository scan on each server start, this should be |
|
128 | 139 | ## set to false after first startup, to allow faster server restarts. |
|
129 | 140 | startup.import_repos = false |
|
130 | 141 | |
|
131 | 142 | ## Uncomment and set this path to use archive download cache. |
|
132 | 143 | ## Once enabled, generated archives will be cached at this location |
|
133 | 144 | ## and served from the cache during subsequent requests for the same archive of |
|
134 | 145 | ## the repository. |
|
135 | 146 | #archive_cache_dir = /tmp/tarballcache |
|
136 | 147 | |
|
137 | 148 | ## change this to unique ID for security |
|
138 | 149 | app_instance_uuid = rc-production |
|
139 | 150 | |
|
140 | 151 | ## cut off limit for large diffs (size in bytes) |
|
141 | 152 | cut_off_limit_diff = 1024000 |
|
142 | 153 | cut_off_limit_file = 256000 |
|
143 | 154 | |
|
144 | 155 | ## use cache version of scm repo everywhere |
|
145 | 156 | vcs_full_cache = true |
|
146 | 157 | |
|
147 | 158 | ## force https in RhodeCode, fixes https redirects, assumes it's always https |
|
148 | 159 | ## Normally this is controlled by proper http flags sent from http server |
|
149 | 160 | force_https = false |
|
150 | 161 | |
|
151 | 162 | ## use Strict-Transport-Security headers |
|
152 | 163 | use_htsts = false |
|
153 | 164 | |
|
154 | 165 | ## number of commits stats will parse on each iteration |
|
155 | 166 | commit_parse_limit = 25 |
|
156 | 167 | |
|
157 | 168 | ## git rev filter option, --all is the default filter, if you need to |
|
158 | 169 | ## hide all refs in changelog switch this to --branches --tags |
|
159 | 170 | git_rev_filter = --branches --tags |
|
160 | 171 | |
|
161 | 172 | # Set to true if your repos are exposed using the dumb protocol |
|
162 | 173 | git_update_server_info = false |
|
163 | 174 | |
|
164 | 175 | ## RSS/ATOM feed options |
|
165 | 176 | rss_cut_off_limit = 256000 |
|
166 | 177 | rss_items_per_page = 10 |
|
167 | 178 | rss_include_diff = false |
|
168 | 179 | |
|
169 | 180 | ## gist URL alias, used to create nicer urls for gist. This should be an |
|
170 | 181 | ## url that does rewrites to _admin/gists/<gistid>. |
|
171 | 182 | ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal |
|
172 | 183 | ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid> |
|
173 | 184 | gist_alias_url = |
|
174 | 185 | |
|
175 | 186 | ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be |
|
176 | 187 | ## used for access. |
|
177 | 188 | ## Adding ?auth_token = <token> to the url authenticates this request as if it |
|
178 | 189 | ## came from the the logged in user who own this authentication token. |
|
179 | 190 | ## |
|
180 | 191 | ## Syntax is <ControllerClass>:<function_pattern>. |
|
181 | 192 | ## To enable access to raw_files put `FilesController:raw`. |
|
182 | 193 | ## To enable access to patches add `ChangesetController:changeset_patch`. |
|
183 | 194 | ## The list should be "," separated and on a single line. |
|
184 | 195 | ## |
|
185 | 196 | ## Recommended controllers to enable: |
|
186 | 197 | # ChangesetController:changeset_patch, |
|
187 | 198 | # ChangesetController:changeset_raw, |
|
188 | 199 | # FilesController:raw, |
|
189 | 200 | # FilesController:archivefile, |
|
190 | 201 | # GistsController:*, |
|
191 | 202 | api_access_controllers_whitelist = |
|
192 | 203 | |
|
193 | 204 | ## default encoding used to convert from and to unicode |
|
194 | 205 | ## can also be a comma-separated list of encodings in case of mixed encodings
|
195 | 206 | default_encoding = UTF-8 |
|
196 | 207 | |
|
197 | 208 | ## instance-id prefix |
|
198 | 209 | ## a prefix key for this instance used for cache invalidation when running |
|
199 | 210 | ## multiple instances of rhodecode, make sure it's globally unique for |
|
200 | 211 | ## all running rhodecode instances. Leave empty if you don't use it |
|
201 | 212 | instance_id = |
|
202 | 213 | |
|
203 | 214 | ## Fallback authentication plugin. Set this to a plugin ID to force the usage |
|
204 | 215 | ## of an authentication plugin even if it is disabled by its settings.
|
205 | 216 | ## This could be useful if you are unable to log in to the system due to broken |
|
206 | 217 | ## authentication settings. Then you can enable e.g. the internal rhodecode auth |
|
207 | 218 | ## module to log in again and fix the settings. |
|
208 | 219 | ## |
|
209 | 220 | ## Available builtin plugin IDs (hash is part of the ID): |
|
210 | 221 | ## egg:rhodecode-enterprise-ce#rhodecode |
|
211 | 222 | ## egg:rhodecode-enterprise-ce#pam |
|
212 | 223 | ## egg:rhodecode-enterprise-ce#ldap |
|
213 | 224 | ## egg:rhodecode-enterprise-ce#jasig_cas |
|
214 | 225 | ## egg:rhodecode-enterprise-ce#headers |
|
215 | 226 | ## egg:rhodecode-enterprise-ce#crowd |
|
216 | 227 | #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode |
|
217 | 228 | |
|
218 | 229 | ## alternative return HTTP header for failed authentication. Default HTTP |
|
219 | 230 | ## response is 401 HTTPUnauthorized. Currently HG clients have trouble

220 | 231 | ## handling that, causing a series of failed authentication calls.
|
221 | 232 | ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code |
|
222 | 233 | ## This will be served instead of the default 401 on bad authentication
|
223 | 234 | auth_ret_code = |
|
224 | 235 | |
|
225 | 236 | ## use special detection method when serving auth_ret_code, instead of serving |
|
226 | 237 | ## ret_code directly, use 401 initially (which triggers the credentials prompt)
|
227 | 238 | ## and then serve auth_ret_code to clients |
|
228 | 239 | auth_ret_code_detection = false |
|
229 | 240 | |
|
230 | 241 | ## locking return code. When repository is locked return this HTTP code. 2XX |
|
231 | 242 | ## codes don't break the transactions while 4XX codes do |
|
232 | 243 | lock_ret_code = 423 |
|
233 | 244 | |
|
234 | 245 | ## allows changing the repository location on the settings page
|
235 | 246 | allow_repo_location_change = true |
|
236 | 247 | |
|
237 | 248 | ## allows setting up custom hooks on the settings page
|
238 | 249 | allow_custom_hooks_settings = true |
|
239 | 250 | |
|
240 | 251 | ## generated license token, go to the license page in RhodeCode settings to obtain
|
241 | 252 | ## new token |
|
242 | 253 | license_token = |
|
243 | 254 | |
|
244 | 255 | ## supervisor connection uri, for managing supervisor and logs. |
|
245 | 256 | supervisor.uri = |
|
246 | 257 | ## supervisord group name/id we only want this RC instance to handle |
|
247 | 258 | supervisor.group_id = dev |
|
248 | 259 | |
|
249 | 260 | ## Display extended labs settings |
|
250 | 261 | labs_settings_active = true |
|
251 | 262 | |
|
252 | 263 | #################################### |
|
253 | 264 | ### CELERY CONFIG #### |
|
254 | 265 | #################################### |
|
255 | 266 | use_celery = false |
|
256 | 267 | broker.host = localhost |
|
257 | 268 | broker.vhost = rabbitmqhost |
|
258 | 269 | broker.port = 5672 |
|
259 | 270 | broker.user = rabbitmq |
|
260 | 271 | broker.password = qweqwe |
|
261 | 272 | |
|
262 | 273 | celery.imports = rhodecode.lib.celerylib.tasks |
|
263 | 274 | |
|
264 | 275 | celery.result.backend = amqp |
|
265 | 276 | celery.result.dburi = amqp:// |
|
266 | 277 | celery.result.serialier = json |
|
267 | 278 | |
|
268 | 279 | #celery.send.task.error.emails = true |
|
269 | 280 | #celery.amqp.task.result.expires = 18000 |
|
270 | 281 | |
|
271 | 282 | celeryd.concurrency = 2 |
|
272 | 283 | #celeryd.log.file = celeryd.log |
|
273 | 284 | celeryd.log.level = debug |
|
274 | 285 | celeryd.max.tasks.per.child = 1 |
|
275 | 286 | |
|
276 | 287 | ## tasks will never be sent to the queue, but executed locally instead. |
|
277 | 288 | celery.always.eager = false |
|
278 | 289 | |
|
279 | 290 | #################################### |
|
280 | 291 | ### BEAKER CACHE #### |
|
281 | 292 | #################################### |
|
282 | 293 | # default cache dir for templates. Putting this into a ramdisk |
|
283 | 294 | ## can boost performance, eg. %(here)s/data_ramdisk |
|
284 | 295 | cache_dir = %(here)s/data |
|
285 | 296 | |
|
286 | 297 | ## locking and default file storage for Beaker. Putting this into a ramdisk |
|
287 | 298 | ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data |
|
288 | 299 | beaker.cache.data_dir = %(here)s/data/cache/beaker_data |
|
289 | 300 | beaker.cache.lock_dir = %(here)s/data/cache/beaker_lock |
|
290 | 301 | |
|
291 | 302 | beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long |
|
292 | 303 | |
|
293 | 304 | beaker.cache.super_short_term.type = memory |
|
294 | 305 | beaker.cache.super_short_term.expire = 10 |
|
295 | 306 | beaker.cache.super_short_term.key_length = 256 |
|
296 | 307 | |
|
297 | 308 | beaker.cache.short_term.type = memory |
|
298 | 309 | beaker.cache.short_term.expire = 60 |
|
299 | 310 | beaker.cache.short_term.key_length = 256 |
|
300 | 311 | |
|
301 | 312 | beaker.cache.long_term.type = memory |
|
302 | 313 | beaker.cache.long_term.expire = 36000 |
|
303 | 314 | beaker.cache.long_term.key_length = 256 |
|
304 | 315 | |
|
305 | 316 | beaker.cache.sql_cache_short.type = memory |
|
306 | 317 | beaker.cache.sql_cache_short.expire = 10 |
|
307 | 318 | beaker.cache.sql_cache_short.key_length = 256 |
|
308 | 319 | |
|
309 | 320 | # default is memory cache, configure only if required |
|
310 | 321 | # using multi-node or multi-worker setup |
|
311 | 322 | #beaker.cache.auth_plugins.type = ext:database |
|
312 | 323 | #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock |
|
313 | 324 | #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode |
|
314 | 325 | #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode |
|
315 | 326 | #beaker.cache.auth_plugins.sa.pool_recycle = 3600 |
|
316 | 327 | #beaker.cache.auth_plugins.sa.pool_size = 10 |
|
317 | 328 | #beaker.cache.auth_plugins.sa.max_overflow = 0 |
|
318 | 329 | |
|
319 | 330 | beaker.cache.repo_cache_long.type = memorylru_base |
|
320 | 331 | beaker.cache.repo_cache_long.max_items = 4096 |
|
321 | 332 | beaker.cache.repo_cache_long.expire = 2592000 |
|
322 | 333 | |
|
323 | 334 | # default is memorylru_base cache, configure only if required |
|
324 | 335 | # using multi-node or multi-worker setup |
|
325 | 336 | #beaker.cache.repo_cache_long.type = ext:memcached |
|
326 | 337 | #beaker.cache.repo_cache_long.url = localhost:11211 |
|
327 | 338 | #beaker.cache.repo_cache_long.expire = 1209600 |
|
328 | 339 | #beaker.cache.repo_cache_long.key_length = 256 |
|
329 | 340 | |
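The `%(here)s` placeholders in the cache paths above use standard ini-file interpolation, with `here` substituted by the parent directory of the config file. A minimal sketch using Python's `configparser` (the `/srv/rhodecode` path is only an example value, not a RhodeCode default):

```python
import configparser

# A small excerpt of the settings above; %(here)s is resolved via the
# DEFAULT-section value supplied in `defaults`.
ini_text = """
[app:main]
cache_dir = %(here)s/data
beaker.cache.data_dir = %(here)s/data/cache/beaker_data
"""

parser = configparser.ConfigParser(defaults={"here": "/srv/rhodecode"})
parser.read_string(ini_text)
print(parser.get("app:main", "beaker.cache.data_dir"))
# /srv/rhodecode/data/cache/beaker_data
```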
|
330 | 341 | #################################### |
|
331 | 342 | ### BEAKER SESSION #### |
|
332 | 343 | #################################### |
|
333 | 344 | |
|
334 | 345 | ## .session.type is the type of storage used for the session; currently allowed
|
335 | 346 | ## types are file, ext:memcached, ext:database, and memory (default). |
|
336 | 347 | beaker.session.type = file |
|
337 | 348 | beaker.session.data_dir = %(here)s/data/sessions/data |
|
338 | 349 | |
|
339 | 350 | ## db based session, fast, and allows easy management over logged in users ## |
|
340 | 351 | #beaker.session.type = ext:database |
|
341 | 352 | #beaker.session.table_name = db_session |
|
342 | 353 | #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode |
|
343 | 354 | #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode |
|
344 | 355 | #beaker.session.sa.pool_recycle = 3600 |
|
345 | 356 | #beaker.session.sa.echo = false |
|
346 | 357 | |
|
347 | 358 | beaker.session.key = rhodecode |
|
348 | 359 | beaker.session.secret = develop-rc-uytcxaz |
|
349 | 360 | beaker.session.lock_dir = %(here)s/data/sessions/lock |
|
350 | 361 | |
|
351 | 362 | ## Secure encrypted cookie. Requires AES and AES python libraries |
|
352 | 363 | ## you must disable beaker.session.secret to use this |
|
353 | 364 | #beaker.session.encrypt_key = <key_for_encryption> |
|
354 | 365 | #beaker.session.validate_key = <validation_key> |
|
355 | 366 | |
|
356 | 367 | ## sets session as invalid (also logging out the user) if it has not been
|
357 | 368 | ## accessed for given amount of time in seconds |
|
358 | 369 | beaker.session.timeout = 2592000 |
|
359 | 370 | beaker.session.httponly = true |
|
360 | 371 | #beaker.session.cookie_path = /<your-prefix> |
|
361 | 372 | |
|
362 | 373 | ## uncomment for https secure cookie |
|
363 | 374 | beaker.session.secure = false |
|
364 | 375 | |
|
365 | 376 | ## auto-save the session so there is no need to call .save()
|
366 | 377 | beaker.session.auto = false |
|
367 | 378 | |
|
368 | 379 | ## default cookie expiration time in seconds, set to `true` to set expire |
|
369 | 380 | ## at browser close |
|
370 | 381 | #beaker.session.cookie_expires = 3600 |
|
371 | 382 | |
|
372 | 383 | ################################### |
|
373 | 384 | ## SEARCH INDEXING CONFIGURATION ## |
|
374 | 385 | ################################### |
|
375 | 386 | ## Full text search indexer is available in rhodecode-tools under |
|
376 | 387 | ## `rhodecode-tools index` command |
|
377 | 388 | |
|
378 | 389 | # WHOOSH Backend, doesn't require additional services to run |
|
379 | 390 | # it works well with a few dozen repos
|
380 | 391 | search.module = rhodecode.lib.index.whoosh |
|
381 | 392 | search.location = %(here)s/data/index |
|
382 | 393 | |
|
383 | ||
|
384 | 394 | ################################### |
|
385 | 395 | ## APPENLIGHT CONFIG ## |
|
386 | 396 | ################################### |
|
387 | 397 | |
|
388 | 398 | ## Appenlight is tailored to work with RhodeCode, see |
|
389 | 399 | ## http://appenlight.com for details on how to obtain an account
|
390 | 400 | |
|
391 | 401 | ## appenlight integration enabled |
|
392 | 402 | appenlight = false |
|
393 | 403 | |
|
394 | 404 | appenlight.server_url = https://api.appenlight.com |
|
395 | 405 | appenlight.api_key = YOUR_API_KEY |
|
396 | 406 | #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5 |
|
397 | 407 | |
|
398 | 408 | # used for JS client |
|
399 | 409 | appenlight.api_public_key = YOUR_API_PUBLIC_KEY |
|
400 | 410 | |
|
401 | 411 | ## TWEAK AMOUNT OF INFO SENT HERE |
|
402 | 412 | |
|
403 | 413 | ## enables 404 error logging (default False) |
|
404 | 414 | appenlight.report_404 = false |
|
405 | 415 | |
|
406 | 416 | ## time in seconds after request is considered being slow (default 1) |
|
407 | 417 | appenlight.slow_request_time = 1 |
|
408 | 418 | |
|
409 | 419 | ## record slow requests in application |
|
410 | 420 | ## (needs to be enabled for slow datastore recording and time tracking) |
|
411 | 421 | appenlight.slow_requests = true |
|
412 | 422 | |
|
413 | 423 | ## enable hooking to application loggers |
|
414 | 424 | appenlight.logging = true |
|
415 | 425 | |
|
416 | 426 | ## minimum log level for log capture |
|
417 | 427 | appenlight.logging.level = WARNING |
|
418 | 428 | |
|
419 | 429 | ## send logs only from erroneous/slow requests |
|
420 | 430 | ## (saves API quota for intensive logging) |
|
421 | 431 | appenlight.logging_on_error = false |
|
422 | 432 | |
|
423 | 433 | ## list of additional keywords that should be grabbed from environ object
|
424 | 434 | ## can be string with comma separated list of words in lowercase |
|
425 | 435 | ## (by default client will always send following info: |
|
426 | 436 | ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that |
|
427 | 437 | ## start with HTTP*); this list can be extended with additional keywords here
|
428 | 438 | appenlight.environ_keys_whitelist = |
|
429 | 439 | |
|
430 | 440 | ## list of keywords that should be blanked from request object |
|
431 | 441 | ## can be string with comma separated list of words in lowercase |
|
432 | 442 | ## (by default client will always blank keys that contain following words |
|
433 | 443 | ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' |
|
434 | 444 | ## this list can be extended with additional keywords set here
|
435 | 445 | appenlight.request_keys_blacklist = |
|
436 | 446 | |
|
437 | 447 | ## list of namespaces that should be ignored when gathering log entries
|
438 | 448 | ## can be string with comma separated list of namespaces |
|
439 | 449 | ## (by default the client ignores own entries: appenlight_client.client) |
|
440 | 450 | appenlight.log_namespace_blacklist = |
|
441 | 451 | |
|
442 | 452 | |
|
443 | 453 | ################################################################################ |
|
444 | 454 | ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ## |
|
445 | 455 | ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ## |
|
446 | 456 | ## execute malicious code after an exception is raised. ## |
|
447 | 457 | ################################################################################ |
|
448 | 458 | #set debug = false |
|
449 | 459 | |
|
450 | 460 | |
|
451 | 461 | ############## |
|
452 | 462 | ## STYLING ## |
|
453 | 463 | ############## |
|
454 | 464 | debug_style = true |
|
455 | 465 | |
|
456 | 466 | ######################################################### |
|
457 | 467 | ### DB CONFIGS - EACH DB WILL HAVE ITS OWN CONFIG  ###
|
458 | 468 | ######################################################### |
|
459 | 469 | sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 |
|
460 | 470 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode |
|
461 | 471 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode |
|
462 | 472 | |
|
463 | 473 | # see sqlalchemy docs for other advanced settings |
|
464 | 474 | |
|
465 | 475 | ## print the sql statements to output |
|
466 | 476 | sqlalchemy.db1.echo = false |
|
467 | 477 | ## recycle the connections after this amount of seconds
|
468 | 478 | sqlalchemy.db1.pool_recycle = 3600 |
|
469 | 479 | sqlalchemy.db1.convert_unicode = true |
|
470 | 480 | |
|
471 | 481 | ## the number of connections to keep open inside the connection pool. |
|
472 | 482 | ## 0 indicates no limit |
|
473 | 483 | #sqlalchemy.db1.pool_size = 5 |
|
474 | 484 | |
|
475 | 485 | ## the number of connections to allow in connection pool "overflow", that is |
|
476 | 486 | ## connections that can be opened above and beyond the pool_size setting, |
|
477 | 487 | ## which defaults to five. |
|
478 | 488 | #sqlalchemy.db1.max_overflow = 10 |
|
479 | 489 | |
|
480 | 490 | |
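The `sqlalchemy.db1.*` keys above are conventionally collected by stripping the prefix and passing the remainder as engine options. A minimal sketch of that pattern; the `engine_options` helper is illustrative, not RhodeCode's actual loader:

```python
def engine_options(settings, prefix="sqlalchemy.db1."):
    # Keep only keys carrying the given prefix and strip it, yielding the
    # keyword form expected by engine_from_config-style helpers.
    return {key[len(prefix):]: value
            for key, value in settings.items()
            if key.startswith(prefix)}

opts = engine_options({
    "sqlalchemy.db1.url": "sqlite:///rhodecode.db?timeout=30",
    "sqlalchemy.db1.pool_recycle": "3600",
    "instance_id": "",
})
print(sorted(opts))  # ['pool_recycle', 'url']
```

Unrelated keys such as `instance_id` are filtered out, so the whole `[app:main]` mapping can be passed in unchanged.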
|
481 | 491 | ################## |
|
482 | 492 | ### VCS CONFIG ### |
|
483 | 493 | ################## |
|
484 | 494 | vcs.server.enable = true |
|
485 | 495 | vcs.server = localhost:9900 |
|
486 | 496 | |
|
487 | 497 | ## Web server connectivity protocol, responsible for web based VCS operations
|
488 | 498 | ## Available protocols are: |
|
489 | 499 | ## `pyro4` - using pyro4 server |
|
490 | 500 | ## `http` - using http-rpc backend |
|
491 | 501 | #vcs.server.protocol = http |
|
492 | 502 | |
|
493 | 503 | ## Push/Pull operations protocol, available options are: |
|
494 | 504 | ## `pyro4` - using pyro4 server |
|
495 | 505 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended |
|
496 | 506 | ## `vcsserver.scm_app` - internal app (EE only) |
|
497 | 507 | #vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http |
|
498 | 508 | |
|
499 | 509 | ## Push/Pull operations hooks protocol, available options are: |
|
500 | 510 | ## `pyro4` - using pyro4 server |
|
501 | 511 | ## `http` - using http-rpc backend |
|
502 | 512 | #vcs.hooks.protocol = http |
|
503 | 513 | |
|
504 | 514 | vcs.server.log_level = debug |
|
505 | 515 | ## Start VCSServer with this instance as a subprocess, useful for development
|
506 | 516 | vcs.start_server = true |
|
507 | 517 | vcs.backends = hg, git, svn |
|
508 | 518 | vcs.connection_timeout = 3600 |
|
509 | 519 | ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out. |
|
510 | 520 | ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible |
|
511 | 521 | #vcs.svn.compatible_version = pre-1.8-compatible |
|
512 | 522 | |
|
513 | 523 | ################################ |
|
514 | 524 | ### LOGGING CONFIGURATION #### |
|
515 | 525 | ################################ |
|
516 | 526 | [loggers] |
|
517 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |

527 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |
|
518 | 528 | |
|
519 | 529 | [handlers] |
|
520 | 530 | keys = console, console_sql |
|
521 | 531 | |
|
522 | 532 | [formatters] |
|
523 | 533 | keys = generic, color_formatter, color_formatter_sql |
|
524 | 534 | |
|
525 | 535 | ############# |
|
526 | 536 | ## LOGGERS ## |
|
527 | 537 | ############# |
|
528 | 538 | [logger_root] |
|
529 | 539 | level = NOTSET |
|
530 | 540 | handlers = console |
|
531 | 541 | |
|
532 | 542 | [logger_routes] |
|
533 | 543 | level = DEBUG |
|
534 | 544 | handlers = |
|
535 | 545 | qualname = routes.middleware |
|
536 | 546 | ## "level = DEBUG" logs the route matched and routing variables. |
|
537 | 547 | propagate = 1 |
|
538 | 548 | |
|
539 | 549 | [logger_beaker] |
|
540 | 550 | level = DEBUG |
|
541 | 551 | handlers = |
|
542 | 552 | qualname = beaker.container |
|
543 | 553 | propagate = 1 |
|
544 | 554 | |
|
545 | 555 | [logger_pyro4] |
|
546 | 556 | level = DEBUG |
|
547 | 557 | handlers = |
|
548 | 558 | qualname = Pyro4 |
|
549 | 559 | propagate = 1 |
|
550 | 560 | |
|
551 | 561 | [logger_templates] |
|
552 | 562 | level = INFO |
|
553 | 563 | handlers = |
|
554 | 564 | qualname = pylons.templating |
|
555 | 565 | propagate = 1 |
|
556 | 566 | |
|
557 | 567 | [logger_rhodecode] |
|
558 | 568 | level = DEBUG |
|
559 | 569 | handlers = |
|
560 | 570 | qualname = rhodecode |
|
561 | 571 | propagate = 1 |
|
562 | 572 | |
|
563 | 573 | [logger_sqlalchemy] |
|
564 | 574 | level = INFO |
|
565 | 575 | handlers = console_sql |
|
566 | 576 | qualname = sqlalchemy.engine |
|
567 | 577 | propagate = 0 |
|
568 | 578 | |
|
569 | [logger_whoosh_indexer] | |
|
570 | level = DEBUG | |
|
571 | handlers = | |
|
572 | qualname = whoosh_indexer | |
|
573 | propagate = 1 | |
|
574 | ||
|
575 | 579 | ############## |
|
576 | 580 | ## HANDLERS ## |
|
577 | 581 | ############## |
|
578 | 582 | |
|
579 | 583 | [handler_console] |
|
580 | 584 | class = StreamHandler |
|
581 | 585 | args = (sys.stderr,) |
|
582 | 586 | level = DEBUG |
|
583 | 587 | formatter = color_formatter |
|
584 | 588 | |
|
585 | 589 | [handler_console_sql] |
|
586 | 590 | class = StreamHandler |
|
587 | 591 | args = (sys.stderr,) |
|
588 | 592 | level = DEBUG |
|
589 | 593 | formatter = color_formatter_sql |
|
590 | 594 | |
|
591 | 595 | ################ |
|
592 | 596 | ## FORMATTERS ## |
|
593 | 597 | ################ |
|
594 | 598 | |
|
595 | 599 | [formatter_generic] |
|
596 | 600 | class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter |
|
597 | 601 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
598 | 602 | datefmt = %Y-%m-%d %H:%M:%S |
|
599 | 603 | |
|
600 | 604 | [formatter_color_formatter] |
|
601 | 605 | class = rhodecode.lib.logging_formatter.ColorFormatter |
|
602 | 606 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
603 | 607 | datefmt = %Y-%m-%d %H:%M:%S |
|
604 | 608 | |
|
605 | 609 | [formatter_color_formatter_sql] |
|
606 | 610 | class = rhodecode.lib.logging_formatter.ColorFormatterSql |
|
607 | 611 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
608 | 612 | datefmt = %Y-%m-%d %H:%M:%S |
@@ -1,577 +1,581 b'' | |||
|
1 | 1 | ################################################################################ |
|
2 | 2 | ################################################################################ |
|
3 | 3 | # RhodeCode Enterprise - configuration file # |
|
4 | 4 | # Built-in functions and variables # |
|
5 | 5 | # The %(here)s variable will be replaced with the parent directory of this file# |
|
6 | 6 | # # |
|
7 | 7 | ################################################################################ |
|
8 | 8 | |
|
9 | 9 | [DEFAULT] |
|
10 | 10 | debug = true |
|
11 | pdebug = false | |
|
12 | 11 | ################################################################################ |
|
13 | 12 | ## Uncomment and replace with the email address which should receive ## |
|
14 | 13 | ## any error reports after an application crash ## |
|
15 | 14 | ## Additionally these settings will be used by the RhodeCode mailing system ## |
|
16 | 15 | ################################################################################ |
|
17 | 16 | #email_to = admin@localhost |
|
18 | 17 | #error_email_from = paste_error@localhost |
|
19 | 18 | #app_email_from = rhodecode-noreply@localhost |
|
20 | 19 | #error_message = |
|
21 | 20 | #email_prefix = [RhodeCode] |
|
22 | 21 | |
|
23 | 22 | #smtp_server = mail.server.com |
|
24 | 23 | #smtp_username = |
|
25 | 24 | #smtp_password = |
|
26 | 25 | #smtp_port = |
|
27 | 26 | #smtp_use_tls = false |
|
28 | 27 | #smtp_use_ssl = true |
|
29 | 28 | ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.) |
|
30 | 29 | #smtp_auth = |
|
31 | 30 | |
|
32 | 31 | [server:main] |
|
33 | 32 | ## COMMON ## |
|
34 | 33 | host = 127.0.0.1 |
|
35 | 34 | port = 5000 |
|
36 | 35 | |
|
37 | 36 | ################################## |
|
38 | 37 | ## WAITRESS WSGI SERVER ## |
|
39 | 38 | ## Recommended for Development ## |
|
40 | 39 | ################################## |
|
41 | 40 | #use = egg:waitress#main |
|
42 | 41 | ## number of worker threads |
|
43 | 42 | #threads = 5 |
|
44 | 43 | ## MAX BODY SIZE 100GB |
|
45 | 44 | #max_request_body_size = 107374182400 |
|
46 | 45 | ## Use poll instead of select, fixes file descriptors limits problems. |
|
47 | 46 | ## May not work on old windows systems. |
|
48 | 47 | #asyncore_use_poll = true |
|
49 | 48 | |
|
50 | 49 | |
|
51 | 50 | ########################## |
|
52 | 51 | ## GUNICORN WSGI SERVER ## |
|
53 | 52 | ########################## |
|
54 | 53 | ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini> |
|
55 | 54 | use = egg:gunicorn#main |
|
56 | 55 | ## Sets the number of process workers. You must set `instance_id = *` |
|
57 | 56 | ## when this option is set to more than one worker, recommended |
|
58 | 57 | ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers |
|
59 | 58 | ## The `instance_id = *` must be set in the [app:main] section below |
|
60 | 59 | workers = 2 |
|
61 | 60 | ## number of threads for each of the worker, must be set to 1 for gevent |
|
62 | 61 | ## generally recommended to be 1
|
63 | 62 | #threads = 1 |
|
64 | 63 | ## process name |
|
65 | 64 | proc_name = rhodecode |
|
66 | 65 | ## type of worker class, one of sync, gevent |
|
67 | 66 | ## for bigger setups, a worker class other than sync is recommended
|
68 | 67 | worker_class = sync |
|
69 | 68 | ## The maximum number of simultaneous clients. Valid only for Gevent |
|
70 | 69 | #worker_connections = 10 |
|
71 | 70 | ## max number of requests that worker will handle before being gracefully |
|
72 | 71 | ## restarted, could prevent memory leaks |
|
73 | 72 | max_requests = 1000 |
|
74 | 73 | max_requests_jitter = 30 |
|
75 | 74 | ## amount of time a worker can spend handling a request before it
|
76 | 75 | ## gets killed and restarted. Set to 6hrs |
|
77 | 76 | timeout = 21600 |
|
78 | 77 | |
|
79 | 78 | |
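The worker-count heuristic quoted in the gunicorn comments above, `(2 * NUMBER_OF_CPUS + 1)`, can be computed directly. A minimal sketch; the `recommended_workers` helper is illustrative only:

```python
import os

def recommended_workers(cpu_count=None):
    # Heuristic from the gunicorn section: (2 * NUMBER_OF_CPUS + 1).
    # Falls back to the local CPU count when none is given.
    cpus = cpu_count if cpu_count is not None else (os.cpu_count() or 1)
    return 2 * cpus + 1

print(recommended_workers(2))  # 5, matching the "2CPU = 5 workers" example
```

Remember that `instance_id = *` must be set in `[app:main]` whenever the resulting worker count is greater than one.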
|
80 | 79 | ## prefix middleware for RhodeCode, disables force_https flag. |
|
81 | 80 | ## allows serving RhodeCode under a URL prefix on the server.
|
82 | 81 | ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well. |
|
83 | 82 | #[filter:proxy-prefix] |
|
84 | 83 | #use = egg:PasteDeploy#prefix |
|
85 | 84 | #prefix = /<your-prefix> |
|
86 | 85 | |
|
87 | 86 | [app:main] |
|
88 | 87 | use = egg:rhodecode-enterprise-ce |
|
89 | 88 | ## enable proxy prefix middleware, defined below |
|
90 | 89 | #filter-with = proxy-prefix |
|
91 | 90 | |
|
91 | ## encryption key used to encrypt social plugin tokens, | |
|
92 | ## remote_urls with credentials etc, if not set it defaults to | |
|
93 | ## `beaker.session.secret` | |
|
94 | #rhodecode.encrypted_values.secret = | |
|
95 | ||
|
96 | ## decryption strict mode (enabled by default). It controls if decryption raises | |
|
97 | ## `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |
|
98 | #rhodecode.encrypted_values.strict = false | |
|
99 | ||
|
92 | 100 | full_stack = true |
|
93 | 101 | |
|
94 | 102 | ## Serve static files via RhodeCode, disable to serve them via HTTP server |
|
95 | 103 | static_files = true |
|
96 | 104 | |
|
105 | # autogenerate javascript routes file on startup | |
|
106 | generate_js_files = false | |
|
107 | ||
|
97 | 108 | ## Optional Languages |
|
98 | 109 | ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh |
|
99 | 110 | lang = en |
|
100 | 111 | |
|
101 | 112 | ## perform a full repository scan on each server start, this should be |
|
102 | 113 | ## set to false after first startup, to allow faster server restarts. |
|
103 | 114 | startup.import_repos = false |
|
104 | 115 | |
|
105 | 116 | ## Uncomment and set this path to use archive download cache. |
|
106 | 117 | ## Once enabled, generated archives will be cached at this location |
|
107 | 118 | ## and served from the cache during subsequent requests for the same archive of |
|
108 | 119 | ## the repository. |
|
109 | 120 | #archive_cache_dir = /tmp/tarballcache |
|
110 | 121 | |
|
111 | 122 | ## change this to unique ID for security |
|
112 | 123 | app_instance_uuid = rc-production |
|
113 | 124 | |
|
114 | 125 | ## cut off limit for large diffs (size in bytes) |
|
115 | 126 | cut_off_limit_diff = 1024000 |
|
116 | 127 | cut_off_limit_file = 256000 |
|
117 | 128 | |
|
118 | 129 | ## use cache version of scm repo everywhere |
|
119 | 130 | vcs_full_cache = true |
|
120 | 131 | |
|
121 | 132 | ## force https in RhodeCode, fixes https redirects, assumes it's always https |
|
122 | 133 | ## Normally this is controlled by proper http flags sent from http server |
|
123 | 134 | force_https = false |
|
124 | 135 | |
|
125 | 136 | ## use Strict-Transport-Security headers |
|
126 | 137 | use_htsts = false |
|
127 | 138 | |
|
128 | 139 | ## number of commits stats will parse on each iteration |
|
129 | 140 | commit_parse_limit = 25 |
|
130 | 141 | |
|
131 | 142 | ## git rev filter option, --all is the default filter, if you need to |
|
132 | 143 | ## hide all refs in changelog switch this to --branches --tags |
|
133 | 144 | git_rev_filter = --branches --tags |
|
134 | 145 | |
|
135 | 146 | # Set to true if your repos are exposed using the dumb protocol |
|
136 | 147 | git_update_server_info = false |
|
137 | 148 | |
|
138 | 149 | ## RSS/ATOM feed options |
|
139 | 150 | rss_cut_off_limit = 256000 |
|
140 | 151 | rss_items_per_page = 10 |
|
141 | 152 | rss_include_diff = false |
|
142 | 153 | |
|
143 | 154 | ## gist URL alias, used to create nicer urls for gist. This should be a

144 | 155 | ## URL that does rewrites to _admin/gists/<gistid>.
|
145 | 156 | ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal |
|
146 | 157 | ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid> |
|
147 | 158 | gist_alias_url = |
|
148 | 159 | |
|
149 | 160 | ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be |
|
150 | 161 | ## used for access. |
|
151 | 162 | ## Adding ?auth_token = <token> to the url authenticates this request as if it |
|
152 | 163 | ## came from the logged-in user who owns this authentication token.
|
153 | 164 | ## |
|
154 | 165 | ## Syntax is <ControllerClass>:<function_pattern>. |
|
155 | 166 | ## To enable access to raw_files put `FilesController:raw`. |
|
156 | 167 | ## To enable access to patches add `ChangesetController:changeset_patch`. |
|
157 | 168 | ## The list should be "," separated and on a single line. |
|
158 | 169 | ## |
|
159 | 170 | ## Recommended controllers to enable: |
|
160 | 171 | # ChangesetController:changeset_patch, |
|
161 | 172 | # ChangesetController:changeset_raw, |
|
162 | 173 | # FilesController:raw, |
|
163 | 174 | # FilesController:archivefile, |
|
164 | 175 | # GistsController:*, |
|
165 | 176 | api_access_controllers_whitelist = |
|
166 | 177 | |
|
167 | 178 | ## default encoding used to convert from and to unicode |
|
168 | 179 | ## can also be a comma-separated list of encodings in case of mixed encodings
|
169 | 180 | default_encoding = UTF-8 |
|
170 | 181 | |
|
171 | 182 | ## instance-id prefix |
|
172 | 183 | ## a prefix key for this instance used for cache invalidation when running |
|
173 | 184 | ## multiple instances of rhodecode, make sure it's globally unique for |
|
174 | 185 | ## all running rhodecode instances. Leave empty if you don't use it |
|
175 | 186 | instance_id = |
|
176 | 187 | |
|
177 | 188 | ## Fallback authentication plugin. Set this to a plugin ID to force the usage |
|
178 | 189 | ## of an authentication plugin even if it is disabled by its settings.
|
179 | 190 | ## This could be useful if you are unable to log in to the system due to broken |
|
180 | 191 | ## authentication settings. Then you can enable e.g. the internal rhodecode auth |
|
181 | 192 | ## module to log in again and fix the settings. |
|
182 | 193 | ## |
|
183 | 194 | ## Available builtin plugin IDs (hash is part of the ID): |
|
184 | 195 | ## egg:rhodecode-enterprise-ce#rhodecode |
|
185 | 196 | ## egg:rhodecode-enterprise-ce#pam |
|
186 | 197 | ## egg:rhodecode-enterprise-ce#ldap |
|
187 | 198 | ## egg:rhodecode-enterprise-ce#jasig_cas |
|
188 | 199 | ## egg:rhodecode-enterprise-ce#headers |
|
189 | 200 | ## egg:rhodecode-enterprise-ce#crowd |
|
190 | 201 | #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode |
|
191 | 202 | |
|
192 | 203 | ## alternative return HTTP header for failed authentication. Default HTTP |
|
193 | 204 | ## response is 401 HTTPUnauthorized. Currently HG clients have trouble

194 | 205 | ## handling that, causing a series of failed authentication calls.
|
195 | 206 | ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code |
|
196 | 207 | ## This will be served instead of the default 401 on bad authentication
|
197 | 208 | auth_ret_code = |
|
198 | 209 | |
|
199 | 210 | ## use special detection method when serving auth_ret_code, instead of serving |
|
200 | 211 | ## ret_code directly, use 401 initially (which triggers the credentials prompt)
|
201 | 212 | ## and then serve auth_ret_code to clients |
|
202 | 213 | auth_ret_code_detection = false |
|
203 | 214 | |
|
204 | 215 | ## locking return code. When repository is locked return this HTTP code. 2XX |
|
205 | 216 | ## codes don't break the transactions while 4XX codes do |
|
206 | 217 | lock_ret_code = 423 |
|
207 | 218 | |
|
208 | 219 | ## allows changing the repository location in the settings page |
|
209 | 220 | allow_repo_location_change = true |
|
210 | 221 | |
|
211 | 222 | ## allows setting up custom hooks in the settings page |
|
212 | 223 | allow_custom_hooks_settings = true |
|
213 | 224 | |
|
214 | 225 | ## generated license token, go to the license page in RhodeCode settings to obtain a |
|
215 | 226 | ## new token |
|
216 | 227 | license_token = |
|
217 | 228 | |
|
218 | 229 | ## supervisor connection uri, for managing supervisor and logs. |
|
219 | 230 | supervisor.uri = |
|
220 | 231 | ## supervisord group name/id we only want this RC instance to handle |
|
221 | 232 | supervisor.group_id = prod |
|
222 | 233 | |
|
223 | 234 | ## Display extended labs settings |
|
224 | 235 | labs_settings_active = true |
|
225 | 236 | |
|
226 | 237 | #################################### |
|
227 | 238 | ### CELERY CONFIG #### |
|
228 | 239 | #################################### |
|
229 | 240 | use_celery = false |
|
230 | 241 | broker.host = localhost |
|
231 | 242 | broker.vhost = rabbitmqhost |
|
232 | 243 | broker.port = 5672 |
|
233 | 244 | broker.user = rabbitmq |
|
234 | 245 | broker.password = qweqwe |
|
235 | 246 | |
|
236 | 247 | celery.imports = rhodecode.lib.celerylib.tasks |
|
237 | 248 | |
|
238 | 249 | celery.result.backend = amqp |
|
239 | 250 | celery.result.dburi = amqp:// |
|
240 | 251 | celery.result.serialier = json |
|
241 | 252 | |
|
242 | 253 | #celery.send.task.error.emails = true |
|
243 | 254 | #celery.amqp.task.result.expires = 18000 |
|
244 | 255 | |
|
245 | 256 | celeryd.concurrency = 2 |
|
246 | 257 | #celeryd.log.file = celeryd.log |
|
247 | 258 | celeryd.log.level = debug |
|
248 | 259 | celeryd.max.tasks.per.child = 1 |
|
249 | 260 | |
|
250 | 261 | ## tasks will never be sent to the queue, but executed locally instead. |
|
251 | 262 | celery.always.eager = false |
|
252 | 263 | |
|
253 | 264 | #################################### |
|
254 | 265 | ### BEAKER CACHE #### |
|
255 | 266 | #################################### |
|
256 | 267 | ## default cache dir for templates. Putting this into a ramdisk |
|
257 | 268 | ## can boost performance, eg. %(here)s/data_ramdisk |
|
258 | 269 | cache_dir = %(here)s/data |
|
259 | 270 | |
|
260 | 271 | ## locking and default file storage for Beaker. Putting this into a ramdisk |
|
261 | 272 | ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data |
|
262 | 273 | beaker.cache.data_dir = %(here)s/data/cache/beaker_data |
|
263 | 274 | beaker.cache.lock_dir = %(here)s/data/cache/beaker_lock |
|
264 | 275 | |
|
265 | 276 | beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long |
|
266 | 277 | |
|
267 | 278 | beaker.cache.super_short_term.type = memory |
|
268 | 279 | beaker.cache.super_short_term.expire = 10 |
|
269 | 280 | beaker.cache.super_short_term.key_length = 256 |
|
270 | 281 | |
|
271 | 282 | beaker.cache.short_term.type = memory |
|
272 | 283 | beaker.cache.short_term.expire = 60 |
|
273 | 284 | beaker.cache.short_term.key_length = 256 |
|
274 | 285 | |
|
275 | 286 | beaker.cache.long_term.type = memory |
|
276 | 287 | beaker.cache.long_term.expire = 36000 |
|
277 | 288 | beaker.cache.long_term.key_length = 256 |
|
278 | 289 | |
|
279 | 290 | beaker.cache.sql_cache_short.type = memory |
|
280 | 291 | beaker.cache.sql_cache_short.expire = 10 |
|
281 | 292 | beaker.cache.sql_cache_short.key_length = 256 |
|
282 | 293 | |
|
283 | 294 | # default is memory cache, configure only if required |
|
284 | 295 | # using multi-node or multi-worker setup |
|
285 | 296 | #beaker.cache.auth_plugins.type = ext:database |
|
286 | 297 | #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock |
|
287 | 298 | #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode |
|
288 | 299 | #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode |
|
289 | 300 | #beaker.cache.auth_plugins.sa.pool_recycle = 3600 |
|
290 | 301 | #beaker.cache.auth_plugins.sa.pool_size = 10 |
|
291 | 302 | #beaker.cache.auth_plugins.sa.max_overflow = 0 |
|
292 | 303 | |
|
293 | 304 | beaker.cache.repo_cache_long.type = memorylru_base |
|
294 | 305 | beaker.cache.repo_cache_long.max_items = 4096 |
|
295 | 306 | beaker.cache.repo_cache_long.expire = 2592000 |
|
296 | 307 | |
|
297 | 308 | # default is memorylru_base cache, configure only if required |
|
298 | 309 | # using multi-node or multi-worker setup |
|
299 | 310 | #beaker.cache.repo_cache_long.type = ext:memcached |
|
300 | 311 | #beaker.cache.repo_cache_long.url = localhost:11211 |
|
301 | 312 | #beaker.cache.repo_cache_long.expire = 1209600 |
|
302 | 313 | #beaker.cache.repo_cache_long.key_length = 256 |
|
303 | 314 | |
|
304 | 315 | #################################### |
|
305 | 316 | ### BEAKER SESSION #### |
|
306 | 317 | #################################### |
|
307 | 318 | |
|
308 | 319 | ## .session.type is the type of storage used for the session; currently allowed |
|
309 | 320 | ## types are file, ext:memcached, ext:database, and memory (default). |
|
310 | 321 | beaker.session.type = file |
|
311 | 322 | beaker.session.data_dir = %(here)s/data/sessions/data |
|
312 | 323 | |
|
313 | 324 | ## db based session, fast, and allows easy management over logged in users ## |
|
314 | 325 | #beaker.session.type = ext:database |
|
315 | 326 | #beaker.session.table_name = db_session |
|
316 | 327 | #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode |
|
317 | 328 | #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode |
|
318 | 329 | #beaker.session.sa.pool_recycle = 3600 |
|
319 | 330 | #beaker.session.sa.echo = false |
|
320 | 331 | |
|
321 | 332 | beaker.session.key = rhodecode |
|
322 | 333 | beaker.session.secret = production-rc-uytcxaz |
|
323 | 334 | beaker.session.lock_dir = %(here)s/data/sessions/lock |
|
324 | 335 | |
|
325 | 336 | ## Secure encrypted cookie. Requires AES and AES python libraries |
|
326 | 337 | ## you must disable beaker.session.secret to use this |
|
327 | 338 | #beaker.session.encrypt_key = <key_for_encryption> |
|
328 | 339 | #beaker.session.validate_key = <validation_key> |
|
329 | 340 | |
|
330 | 341 | ## sets session as invalid (also logging out the user) if it has not been |
|
331 | 342 | ## accessed for given amount of time in seconds |
|
332 | 343 | beaker.session.timeout = 2592000 |
|
333 | 344 | beaker.session.httponly = true |
|
334 | 345 | #beaker.session.cookie_path = /<your-prefix> |
|
335 | 346 | |
|
336 | 347 | ## uncomment for https secure cookie |
|
337 | 348 | beaker.session.secure = false |
|
338 | 349 | |
|
339 | 350 | ## auto save the session so you don't have to call .save() |
|
340 | 351 | beaker.session.auto = false |
|
341 | 352 | |
|
342 | 353 | ## default cookie expiration time in seconds, set to `true` to set expire |
|
343 | 354 | ## at browser close |
|
344 | 355 | #beaker.session.cookie_expires = 3600 |
|
345 | 356 | |
|
346 | 357 | ################################### |
|
347 | 358 | ## SEARCH INDEXING CONFIGURATION ## |
|
348 | 359 | ################################### |
|
349 | 360 | ## Full text search indexer is available in rhodecode-tools under |
|
350 | 361 | ## `rhodecode-tools index` command |
|
351 | 362 | |
|
352 | 363 | # WHOOSH Backend, doesn't require additional services to run |
|
353 | 364 | # it works well with a few dozen repos |
|
354 | 365 | search.module = rhodecode.lib.index.whoosh |
|
355 | 366 | search.location = %(here)s/data/index |
|
356 | 367 | |
|
357 | ||
|
358 | 368 | ################################### |
|
359 | 369 | ## APPENLIGHT CONFIG ## |
|
360 | 370 | ################################### |
|
361 | 371 | |
|
362 | 372 | ## Appenlight is tailored to work with RhodeCode, see |
|
363 | 373 | ## http://appenlight.com for details on how to obtain an account |
|
364 | 374 | |
|
365 | 375 | ## appenlight integration enabled |
|
366 | 376 | appenlight = false |
|
367 | 377 | |
|
368 | 378 | appenlight.server_url = https://api.appenlight.com |
|
369 | 379 | appenlight.api_key = YOUR_API_KEY |
|
370 | 380 | #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5 |
|
371 | 381 | |
|
372 | 382 | # used for JS client |
|
373 | 383 | appenlight.api_public_key = YOUR_API_PUBLIC_KEY |
|
374 | 384 | |
|
375 | 385 | ## TWEAK AMOUNT OF INFO SENT HERE |
|
376 | 386 | |
|
377 | 387 | ## enables 404 error logging (default False) |
|
378 | 388 | appenlight.report_404 = false |
|
379 | 389 | |
|
380 | 390 | ## time in seconds after request is considered being slow (default 1) |
|
381 | 391 | appenlight.slow_request_time = 1 |
|
382 | 392 | |
|
383 | 393 | ## record slow requests in application |
|
384 | 394 | ## (needs to be enabled for slow datastore recording and time tracking) |
|
385 | 395 | appenlight.slow_requests = true |
|
386 | 396 | |
|
387 | 397 | ## enable hooking to application loggers |
|
388 | 398 | appenlight.logging = true |
|
389 | 399 | |
|
390 | 400 | ## minimum log level for log capture |
|
391 | 401 | appenlight.logging.level = WARNING |
|
392 | 402 | |
|
393 | 403 | ## send logs only from erroneous/slow requests |
|
394 | 404 | ## (saves API quota for intensive logging) |
|
395 | 405 | appenlight.logging_on_error = false |
|
396 | 406 | |
|
397 | 407 | ## list of additional keywords that should be grabbed from environ object |
|
398 | 408 | ## can be string with comma separated list of words in lowercase |
|
399 | 409 | ## (by default client will always send following info: |
|
400 | 410 | ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that |
|
401 | 411 | ## start with HTTP*); this list can be extended with additional keywords here |
|
402 | 412 | appenlight.environ_keys_whitelist = |
|
403 | 413 | |
|
404 | 414 | ## list of keywords that should be blanked from request object |
|
405 | 415 | ## can be string with comma separated list of words in lowercase |
|
406 | 416 | ## (by default client will always blank keys that contain following words |
|
407 | 417 | ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' |
|
408 | 418 | ## this list can be extended with additional keywords set here |
|
409 | 419 | appenlight.request_keys_blacklist = |
|
410 | 420 | |
|
411 | 421 | ## list of namespaces that should be ignored when gathering log entries |
|
412 | 422 | ## can be string with comma separated list of namespaces |
|
413 | 423 | ## (by default the client ignores own entries: appenlight_client.client) |
|
414 | 424 | appenlight.log_namespace_blacklist = |
|
415 | 425 | |
|
416 | 426 | |
|
417 | 427 | ################################################################################ |
|
418 | 428 | ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ## |
|
419 | 429 | ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ## |
|
420 | 430 | ## execute malicious code after an exception is raised. ## |
|
421 | 431 | ################################################################################ |
|
422 | 432 | set debug = false |
|
423 | 433 | |
|
424 | 434 | |
|
425 | 435 | ######################################################### |
|
426 | 436 | ### DB CONFIGS - EACH DB WILL HAVE ITS OWN CONFIG ### |
|
427 | 437 | ######################################################### |
|
428 | 438 | #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 |
|
429 | 439 | sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode |
|
430 | 440 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode |
|
431 | 441 | |
|
432 | 442 | # see sqlalchemy docs for other advanced settings |
|
433 | 443 | |
|
434 | 444 | ## print the sql statements to output |
|
435 | 445 | sqlalchemy.db1.echo = false |
|
436 | 446 | ## recycle the connections after this amount of seconds |
|
437 | 447 | sqlalchemy.db1.pool_recycle = 3600 |
|
438 | 448 | sqlalchemy.db1.convert_unicode = true |
|
439 | 449 | |
|
440 | 450 | ## the number of connections to keep open inside the connection pool. |
|
441 | 451 | ## 0 indicates no limit |
|
442 | 452 | #sqlalchemy.db1.pool_size = 5 |
|
443 | 453 | |
|
444 | 454 | ## the number of connections to allow in connection pool "overflow", that is |
|
445 | 455 | ## connections that can be opened above and beyond the pool_size setting, |
|
446 | 456 | ## which defaults to five. |
|
447 | 457 | #sqlalchemy.db1.max_overflow = 10 |
|
448 | 458 | |
|
449 | 459 | |
|
450 | 460 | ################## |
|
451 | 461 | ### VCS CONFIG ### |
|
452 | 462 | ################## |
|
453 | 463 | vcs.server.enable = true |
|
454 | 464 | vcs.server = localhost:9900 |
|
455 | 465 | |
|
456 | 466 | ## Web server connectivity protocol, responsible for web based VCS operations |
|
457 | 467 | ## Available protocols are: |
|
458 | 468 | ## `pyro4` - using pyro4 server |
|
459 | 469 | ## `http` - using http-rpc backend |
|
460 | 470 | #vcs.server.protocol = http |
|
461 | 471 | |
|
462 | 472 | ## Push/Pull operations protocol, available options are: |
|
463 | 473 | ## `pyro4` - using pyro4 server |
|
464 | 474 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended |
|
465 | 475 | ## `vcsserver.scm_app` - internal app (EE only) |
|
466 | 476 | #vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http |
|
467 | 477 | |
|
468 | 478 | ## Push/Pull operations hooks protocol, available options are: |
|
469 | 479 | ## `pyro4` - using pyro4 server |
|
470 | 480 | ## `http` - using http-rpc backend |
|
471 | 481 | #vcs.hooks.protocol = http |
|
472 | 482 | |
|
473 | 483 | vcs.server.log_level = info |
|
474 | 484 | ## Start VCSServer with this instance as a subprocess, useful for development |
|
475 | 485 | vcs.start_server = false |
|
476 | 486 | vcs.backends = hg, git, svn |
|
477 | 487 | vcs.connection_timeout = 3600 |
|
478 | 488 | ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out. |
|
479 | 489 | ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible |
|
480 | 490 | #vcs.svn.compatible_version = pre-1.8-compatible |
|
481 | 491 | |
|
482 | 492 | ################################ |
|
483 | 493 | ### LOGGING CONFIGURATION #### |
|
484 | 494 | ################################ |
|
485 | 495 | [loggers] |
|
486 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |

496 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |
|
487 | 497 | |
|
488 | 498 | [handlers] |
|
489 | 499 | keys = console, console_sql |
|
490 | 500 | |
|
491 | 501 | [formatters] |
|
492 | 502 | keys = generic, color_formatter, color_formatter_sql |
|
493 | 503 | |
|
494 | 504 | ############# |
|
495 | 505 | ## LOGGERS ## |
|
496 | 506 | ############# |
|
497 | 507 | [logger_root] |
|
498 | 508 | level = NOTSET |
|
499 | 509 | handlers = console |
|
500 | 510 | |
|
501 | 511 | [logger_routes] |
|
502 | 512 | level = DEBUG |
|
503 | 513 | handlers = |
|
504 | 514 | qualname = routes.middleware |
|
505 | 515 | ## "level = DEBUG" logs the route matched and routing variables. |
|
506 | 516 | propagate = 1 |
|
507 | 517 | |
|
508 | 518 | [logger_beaker] |
|
509 | 519 | level = DEBUG |
|
510 | 520 | handlers = |
|
511 | 521 | qualname = beaker.container |
|
512 | 522 | propagate = 1 |
|
513 | 523 | |
|
514 | 524 | [logger_pyro4] |
|
515 | 525 | level = DEBUG |
|
516 | 526 | handlers = |
|
517 | 527 | qualname = Pyro4 |
|
518 | 528 | propagate = 1 |
|
519 | 529 | |
|
520 | 530 | [logger_templates] |
|
521 | 531 | level = INFO |
|
522 | 532 | handlers = |
|
523 | 533 | qualname = pylons.templating |
|
524 | 534 | propagate = 1 |
|
525 | 535 | |
|
526 | 536 | [logger_rhodecode] |
|
527 | 537 | level = DEBUG |
|
528 | 538 | handlers = |
|
529 | 539 | qualname = rhodecode |
|
530 | 540 | propagate = 1 |
|
531 | 541 | |
|
532 | 542 | [logger_sqlalchemy] |
|
533 | 543 | level = INFO |
|
534 | 544 | handlers = console_sql |
|
535 | 545 | qualname = sqlalchemy.engine |
|
536 | 546 | propagate = 0 |
|
537 | 547 | |
|
538 | [logger_whoosh_indexer] | |
|
539 | level = DEBUG | |
|
540 | handlers = | |
|
541 | qualname = whoosh_indexer | |
|
542 | propagate = 1 | |
|
543 | ||
|
544 | 548 | ############## |
|
545 | 549 | ## HANDLERS ## |
|
546 | 550 | ############## |
|
547 | 551 | |
|
548 | 552 | [handler_console] |
|
549 | 553 | class = StreamHandler |
|
550 | 554 | args = (sys.stderr,) |
|
551 | 555 | level = INFO |
|
552 | 556 | formatter = generic |
|
553 | 557 | |
|
554 | 558 | [handler_console_sql] |
|
555 | 559 | class = StreamHandler |
|
556 | 560 | args = (sys.stderr,) |
|
557 | 561 | level = WARN |
|
558 | 562 | formatter = generic |
|
559 | 563 | |
|
560 | 564 | ################ |
|
561 | 565 | ## FORMATTERS ## |
|
562 | 566 | ################ |
|
563 | 567 | |
|
564 | 568 | [formatter_generic] |
|
565 | 569 | class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter |
|
566 | 570 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
567 | 571 | datefmt = %Y-%m-%d %H:%M:%S |
|
568 | 572 | |
|
569 | 573 | [formatter_color_formatter] |
|
570 | 574 | class = rhodecode.lib.logging_formatter.ColorFormatter |
|
571 | 575 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
572 | 576 | datefmt = %Y-%m-%d %H:%M:%S |
|
573 | 577 | |
|
574 | 578 | [formatter_color_formatter_sql] |
|
575 | 579 | class = rhodecode.lib.logging_formatter.ColorFormatterSql |
|
576 | 580 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
577 | 581 | datefmt = %Y-%m-%d %H:%M:%S |
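The ``[loggers]``/``[handlers]``/``[formatters]`` sections above follow the stdlib ``logging.config.fileConfig`` format. A minimal sketch of loading such a configuration — the ini content below is a trimmed, hypothetical excerpt, not the full production file:

```python
import logging
import logging.config
import os
import tempfile
import textwrap

# Trimmed ini mirroring the [loggers]/[handlers]/[formatters] layout above.
ini = textwrap.dedent("""\
    [loggers]
    keys = root

    [handlers]
    keys = console

    [formatters]
    keys = generic

    [logger_root]
    level = NOTSET
    handlers = console

    [handler_console]
    class = StreamHandler
    args = (sys.stderr,)
    level = INFO
    formatter = generic

    [formatter_generic]
    format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
    datefmt = %Y-%m-%d %H:%M:%S
    """)

fd, path = tempfile.mkstemp(suffix='.ini')
with os.fdopen(fd, 'w') as f:
    f.write(ini)

# fileConfig evaluates handler args in the logging module's namespace,
# which is why `sys.stderr` works without an explicit import in the ini.
logging.config.fileConfig(path, disable_existing_loggers=False)
logging.getLogger('demo').info('logging configured from ini')
os.unlink(path)
```

The same loader is what the application server runs against the full file, so a quick round-trip like this is a cheap way to catch typos in the logging sections before restarting an instance.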
@@ -1,219 +1,227 b'' | |||
|
1 | 1 | # Nix environment for the community edition |
|
2 | 2 | # |
|
3 | 3 | # This shall be as lean as possible, just producing the Enterprise |
|
4 | 4 | # derivation. For advanced tweaks to pimp up the development environment we use |
|
5 | 5 | # "shell.nix" so that it does not have to clutter this file. |
|
6 | 6 | |
|
7 | 7 | { pkgs ? (import <nixpkgs> {}) |
|
8 | 8 | , pythonPackages ? "python27Packages" |
|
9 | 9 | , pythonExternalOverrides ? self: super: {} |
|
10 | 10 | , doCheck ? true |
|
11 | 11 | }: |
|
12 | 12 | |
|
13 | 13 | let pkgs_ = pkgs; in |
|
14 | 14 | |
|
15 | 15 | let |
|
16 | 16 | pkgs = pkgs_.overridePackages (self: super: { |
|
17 | 17 | # Override subversion derivation to |
|
18 | 18 | # - activate python bindings |
|
19 | 19 | # - set version to 1.8 |
|
20 | 20 | subversion = super.subversion18.override { |
|
21 | 21 | httpSupport = true; |
|
22 | 22 | pythonBindings = true; |
|
23 | 23 | python = self.python27Packages.python; |
|
24 | 24 | }; |
|
25 | 25 | }); |
|
26 | 26 | |
|
27 | 27 | inherit (pkgs.lib) fix extends; |
|
28 | 28 | |
|
29 | 29 | basePythonPackages = with builtins; if isAttrs pythonPackages |
|
30 | 30 | then pythonPackages |
|
31 | 31 | else getAttr pythonPackages pkgs; |
|
32 | 32 | |
|
33 | 33 | elem = builtins.elem; |
|
34 | 34 | basename = path: with pkgs.lib; last (splitString "/" path); |
|
35 | 35 | startsWith = prefix: full: let |
|
36 | 36 | actualPrefix = builtins.substring 0 (builtins.stringLength prefix) full; |
|
37 | 37 | in actualPrefix == prefix; |
|
38 | 38 | |
|
39 | 39 | src-filter = path: type: with pkgs.lib; |
|
40 | 40 | let |
|
41 | 41 | ext = last (splitString "." path); |
|
42 | 42 | in |
|
43 | 43 | !elem (basename path) [ |
|
44 | 44 | ".git" ".hg" "__pycache__" ".eggs" "node_modules" |
|
45 | 45 | "build" "data" "tmp"] && |
|
46 | 46 | !elem ext ["egg-info" "pyc"] && |
|
47 | 47 | !startsWith "result" path; |
|
48 | 48 | |
|
49 | sources = pkgs.config.rc.sources or {}; | |
|
49 | 50 | rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.; |
|
50 | 51 | |
|
51 | 52 | # Load the generated node packages |
|
52 | 53 | nodePackages = pkgs.callPackage "${pkgs.path}/pkgs/top-level/node-packages.nix" rec { |
|
53 | 54 | self = nodePackages; |
|
54 | 55 | generated = pkgs.callPackage ./pkgs/node-packages.nix { inherit self; }; |
|
55 | 56 | }; |
|
56 | 57 | |
|
57 | 58 | # TODO: Should be taken automatically out of the generated packages. |
|
58 | 59 | # apps.nix has one solution for this, although I'd prefer to have the deps |
|
59 | 60 | # from package.json mapped in here. |
|
60 | 61 | nodeDependencies = with nodePackages; [ |
|
61 | 62 | grunt |
|
62 | 63 | grunt-contrib-concat |
|
63 | 64 | grunt-contrib-jshint |
|
64 | 65 | grunt-contrib-less |
|
65 | 66 | grunt-contrib-watch |
|
66 | 67 | jshint |
|
67 | 68 | ]; |
|
68 | 69 | |
|
69 | 70 | pythonGeneratedPackages = self: basePythonPackages.override (a: { |
|
70 | 71 | inherit self; |
|
71 | 72 | }) |
|
72 | 73 | // (scopedImport { |
|
73 | 74 | self = self; |
|
74 | 75 | super = basePythonPackages; |
|
75 | 76 | inherit pkgs; |
|
76 | 77 | inherit (pkgs) fetchurl fetchgit; |
|
77 | 78 | } ./pkgs/python-packages.nix); |
|
78 | 79 | |
|
79 | 80 | pythonOverrides = import ./pkgs/python-packages-overrides.nix { |
|
80 | 81 | inherit |
|
81 | 82 | basePythonPackages |
|
82 | 83 | pkgs; |
|
83 | 84 | }; |
|
84 | 85 | |
|
85 | 86 | pythonLocalOverrides = self: super: { |
|
86 | 87 | rhodecode-enterprise-ce = |
|
87 | 88 | let |
|
88 | 89 | version = builtins.readFile ./rhodecode/VERSION; |
|
89 | 90 | linkNodeModules = '' |
|
90 | 91 | echo "Link node packages" |
|
91 | 92 | # TODO: check if this adds stuff as a dependency, closure size |
|
92 | 93 | rm -fr node_modules |
|
93 | 94 | mkdir -p node_modules |
|
94 | 95 | ${pkgs.lib.concatMapStrings (dep: '' |
|
95 | 96 | ln -sfv ${dep}/lib/node_modules/${dep.pkgName} node_modules/ |
|
96 | 97 | '') nodeDependencies} |
|
97 | 98 | echo "DONE: Link node packages" |
|
98 | 99 | ''; |
|
99 | 100 | in super.rhodecode-enterprise-ce.override (attrs: { |
|
100 | 101 | |
|
101 | inherit |

102 | inherit |
|
103 | doCheck | |
|
104 | version; | |
|
102 | 105 | name = "rhodecode-enterprise-ce-${version}"; |
|
103 | version = version; | |
|
106 | releaseName = "RhodeCodeEnterpriseCE-${version}"; | |
|
104 | 107 | src = rhodecode-enterprise-ce-src; |
|
105 | 108 | |
|
106 | 109 | buildInputs = |
|
107 | 110 | attrs.buildInputs ++ |
|
108 | 111 | (with self; [ |
|
109 | 112 | pkgs.nodePackages.grunt-cli |
|
110 | 113 | pkgs.subversion |
|
111 | 114 | pytest-catchlog |
|
112 | r |

115 | rhodecode-testdata |
|
113 | 116 | ]); |
|
114 | 117 | |
|
115 | 118 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ (with self; [ |
|
116 | 119 | rhodecode-tools |
|
117 | 120 | ]); |
|
118 | 121 | |
|
119 | 122 | # TODO: johbo: Make a nicer way to expose the parts. Maybe |
|
120 | 123 | # pkgs/default.nix? |
|
121 | 124 | passthru = { |
|
122 | 125 | inherit |
|
123 | 126 | pythonLocalOverrides |
|
124 | 127 | myPythonPackagesUnfix; |
|
125 | 128 | pythonPackages = self; |
|
126 | 129 | }; |
|
127 | 130 | |
|
128 | 131 | LC_ALL = "en_US.UTF-8"; |
|
129 | 132 | LOCALE_ARCHIVE = |
|
130 | 133 | if pkgs.stdenv ? glibc |
|
131 | 134 | then "${pkgs.glibcLocales}/lib/locale/locale-archive" |
|
132 | 135 | else ""; |
|
133 | 136 | |
|
134 | 137 | # Somewhat snappier setup of the development environment |
|
135 | 138 | # TODO: move into shell.nix |
|
136 | 139 | # TODO: think of supporting a stable path again, so that multiple shells |
|
137 | 140 | # can share it. |
|
138 | 141 | shellHook = '' |
|
139 | 142 | tmp_path=$(mktemp -d) |
|
140 | 143 | export PATH="$tmp_path/bin:$PATH" |
|
141 | 144 | export PYTHONPATH="$tmp_path/${self.python.sitePackages}:$PYTHONPATH" |
|
142 | 145 | mkdir -p $tmp_path/${self.python.sitePackages} |
|
143 | 146 | python setup.py develop --prefix $tmp_path --allow-hosts "" |
|
144 | 147 | '' + linkNodeModules; |
|
145 | 148 | |
|
146 | 149 | preCheck = '' |
|
147 | 150 | export PATH="$out/bin:$PATH" |
|
148 | 151 | ''; |
|
149 | 152 | |
|
150 | 153 | postCheck = '' |
|
151 | 154 | rm -rf $out/lib/${self.python.libPrefix}/site-packages/pytest_pylons |
|
152 | 155 | rm -rf $out/lib/${self.python.libPrefix}/site-packages/rhodecode/tests |
|
153 | 156 | ''; |
|
154 | 157 | |
|
155 | 158 | preBuild = linkNodeModules + '' |
|
156 | 159 | grunt |
|
157 | 160 | rm -fr node_modules |
|
158 | 161 | ''; |
|
159 | 162 | |
|
160 | 163 | postInstall = '' |
|
161 | 164 | # python based programs need to be wrapped |
|
162 | 165 | ln -s ${self.supervisor}/bin/supervisor* $out/bin/ |
|
163 | 166 | ln -s ${self.gunicorn}/bin/gunicorn $out/bin/ |
|
164 | 167 | ln -s ${self.PasteScript}/bin/paster $out/bin/ |
|
165 | 168 | ln -s ${self.pyramid}/bin/* $out/bin/ #*/ |
|
166 | 169 | |
|
167 | 170 | # rhodecode-tools |
|
168 | 171 | # TODO: johbo: re-think this. Do the tools import anything from enterprise? |
|
169 | 172 | ln -s ${self.rhodecode-tools}/bin/rhodecode-* $out/bin/ |
|
170 | 173 | |
|
171 | 174 | # note that condition should be restricted when adding further tools |
|
172 | 175 | for file in $out/bin/*; do #*/ |
|
173 | 176 | wrapProgram $file \ |
|
174 | 177 | --prefix PYTHONPATH : $PYTHONPATH \ |
|
175 | 178 | --prefix PATH : $PATH \ |
|
176 | 179 | --set PYTHONHASHSEED random |
|
177 | 180 | done |
|
178 | 181 | |
|
179 | 182 | mkdir $out/etc |
|
180 | 183 | cp configs/production.ini $out/etc |
|
181 | 184 | |
|
182 | 185 | echo "Writing meta information for rccontrol to nix-support/rccontrol" |
|
183 | 186 | mkdir -p $out/nix-support/rccontrol |
|
184 | 187 | cp -v rhodecode/VERSION $out/nix-support/rccontrol/version |
|
185 | 188 | echo "DONE: Meta information for rccontrol written" |
|
186 | 189 | |
|
187 | 190 | # TODO: johbo: Make part of ac-tests |
|
188 | 191 | if [ ! -f rhodecode/public/js/scripts.js ]; then |
|
189 | 192 | echo "Missing scripts.js" |
|
190 | 193 | exit 1 |
|
191 | 194 | fi |
|
192 | 195 | if [ ! -f rhodecode/public/css/style.css ]; then |
|
193 | 196 | echo "Missing style.css" |
|
194 | 197 | exit 1 |
|
195 | 198 | fi |
|
196 | 199 | ''; |
|
197 | 200 | |
|
198 | 201 | }); |
|
199 | 202 | |
|
200 | rc_testdata = self.buildPythonPackage rec { | |
|
201 | name = "rc_testdata-0.7.0"; | |
|
202 | src = pkgs.fetchhg { | |
|
203 | url = "https://code.rhodecode.com/upstream/rc_testdata"; | |
|
204 | rev = "v0.7.0"; | |
|
205 | sha256 = "0w3z0zn8lagr707v67lgys23sl6pbi4xg7pfvdbw58h3q384h6rx"; | |
|
206 | }; | |
|
203 | rhodecode-testdata = import "${rhodecode-testdata-src}/default.nix" { | |
|
204 | inherit | |
|
205 | doCheck | |
|
206 | pkgs | |
|
207 | pythonPackages; | |
|
207 | 208 | }; |
|
208 | 209 | |
|
209 | 210 | }; |
|
210 | 211 | |
|
212 | rhodecode-testdata-src = sources.rhodecode-testdata or ( | |
|
213 | pkgs.fetchhg { | |
|
214 | url = "https://code.rhodecode.com/upstream/rc_testdata"; | |
|
215 | rev = "v0.8.0"; | |
|
216 | sha256 = "0hy1ba134rq2f9si85yx7j4qhc9ky0hjzdk553s3q026i7km809m"; | |
|
217 | }); | |
|
218 | ||
|
211 | 219 | # Apply all overrides and fix the final package set |
|
212 | 220 | myPythonPackagesUnfix = |
|
213 | 221 | (extends pythonExternalOverrides |
|
214 | 222 | (extends pythonLocalOverrides |
|
215 | 223 | (extends pythonOverrides |
|
216 | 224 | pythonGeneratedPackages))); |
|
217 | 225 | myPythonPackages = (fix myPythonPackagesUnfix); |
|
218 | 226 | |
|
219 | 227 | in myPythonPackages.rhodecode-enterprise-ce |
@@ -1,136 +1,130 b'' | |||
|
1 | 1 | .. _debug-mode: |
|
2 | 2 | |
|
3 | 3 | Enabling Debug Mode |
|
4 | 4 | ------------------- |
|
5 | 5 | |
|
6 | 6 | To enable debug mode on a |RCE| instance you need to set the debug property |
|
7 | 7 | in the :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file. To |
|
8 | 8 | do this, use the following steps |
|
9 | 9 | |
|
10 | 10 | 1. Open the file and set the ``debug`` line to ``true`` |
|
11 | 11 | 2. Restart your instance using the ``rccontrol restart`` command, |
|
12 | 12 | see the following example: |
|
13 | 13 | |
|
14 | 14 | You can also set the log level; the following are the valid options: |
|
15 | 15 | ``debug``, ``info``, ``warning``, or ``fatal``. |
|
16 | 16 | |
|
17 | 17 | .. code-block:: ini |
|
18 | 18 | |
|
19 | 19 | [DEFAULT] |
|
20 | 20 | debug = true |
|
21 | 21 | pdebug = false |
|
22 | 22 | |
|
23 | 23 | .. code-block:: bash |
|
24 | 24 | |
|
25 | 25 | # Restart your instance |
|
26 | 26 | $ rccontrol restart enterprise-1 |
|
27 | 27 | Instance "enterprise-1" successfully stopped. |
|
28 | 28 | Instance "enterprise-1" successfully started. |
|
29 | 29 | |
|
30 | 30 | Debug and Logging Configuration |
|
31 | 31 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
32 | 32 | |
|
33 | 33 | Further debugging and logging settings can also be set in the |
|
34 | 34 | :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file. |
|
35 | 35 | |
|
36 | 36 | In the logging section, the various packages that run with |RCE| can have |
|
37 | 37 | different debug levels set. If you want to increase the logging level, change the |
|
38 | 38 | ``level = DEBUG`` line to one of the valid options. |
|
39 | 39 | |
|
40 | 40 | You also need to change the log level for handlers. See the example |
|
41 | 41 | ``##handler`` section below. The ``handler`` level takes the same options as |
|
42 | 42 | the ``debug`` level. |
|
43 | 43 | |
|
44 | 44 | .. code-block:: ini |
|
45 | 45 | |
|
46 | 46 | ################################ |
|
47 | 47 | ### LOGGING CONFIGURATION #### |
|
48 | 48 | ################################ |
|
49 | 49 | [loggers] |
|
50 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |

50 | keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates |
|
51 | 51 | |
|
52 | 52 | [handlers] |
|
53 | 53 | keys = console, console_sql, file, file_rotating |
|
54 | 54 | |
|
55 | 55 | [formatters] |
|
56 | 56 | keys = generic, color_formatter, color_formatter_sql |
|
57 | 57 | |
|
58 | 58 | ############# |
|
59 | 59 | ## LOGGERS ## |
|
60 | 60 | ############# |
|
61 | 61 | [logger_root] |
|
62 | 62 | level = NOTSET |
|
63 | 63 | handlers = console |
|
64 | 64 | |
|
65 | 65 | [logger_routes] |
|
66 | 66 | level = DEBUG |
|
67 | 67 | handlers = |
|
68 | 68 | qualname = routes.middleware |
|
69 | 69 | ## "level = DEBUG" logs the route matched and routing variables. |
|
70 | 70 | propagate = 1 |
|
71 | 71 | |
|
72 | 72 | [logger_beaker] |
|
73 | 73 | level = DEBUG |
|
74 | 74 | handlers = |
|
75 | 75 | qualname = beaker.container |
|
76 | 76 | propagate = 1 |
|
77 | 77 | |
|
78 | 78 | [logger_pyro4] |
|
79 | 79 | level = DEBUG |
|
80 | 80 | handlers = |
|
81 | 81 | qualname = Pyro4 |
|
82 | 82 | propagate = 1 |
|
83 | 83 | |
|
84 | 84 | [logger_templates] |
|
85 | 85 | level = INFO |
|
86 | 86 | handlers = |
|
87 | 87 | qualname = pylons.templating |
|
88 | 88 | propagate = 1 |
|
89 | 89 | |
|
90 | 90 | [logger_rhodecode] |
|
91 | 91 | level = DEBUG |
|
92 | 92 | handlers = |
|
93 | 93 | qualname = rhodecode |
|
94 | 94 | propagate = 1 |
|
95 | 95 | |
|
96 | 96 | [logger_sqlalchemy] |
|
97 | 97 | level = INFO |
|
98 | 98 | handlers = console_sql |
|
99 | 99 | qualname = sqlalchemy.engine |
|
100 | 100 | propagate = 0 |
|
101 | 101 | |
|
102 | [logger_whoosh_indexer] | |
|
103 | level = DEBUG | |
|
104 | handlers = | |
|
105 | qualname = whoosh_indexer | |
|
106 | propagate = 1 | |
|
107 | ||
|
108 | 102 | ############## |
|
109 | 103 | ## HANDLERS ## |
|
110 | 104 | ############## |
|
111 | 105 | |
|
112 | 106 | [handler_console] |
|
113 | 107 | class = StreamHandler |
|
114 | 108 | args = (sys.stderr,) |
|
115 | 109 | level = INFO |
|
116 | 110 | formatter = generic |
|
117 | 111 | |
|
118 | 112 | [handler_console_sql] |
|
119 | 113 | class = StreamHandler |
|
120 | 114 | args = (sys.stderr,) |
|
121 | 115 | level = WARN |
|
122 | 116 | formatter = generic |
|
123 | 117 | |
|
124 | 118 | [handler_file] |
|
125 | 119 | class = FileHandler |
|
126 | 120 | args = ('rhodecode.log', 'a',) |
|
127 | 121 | level = INFO |
|
128 | 122 | formatter = generic |
|
129 | 123 | |
|
130 | 124 | [handler_file_rotating] |
|
131 | 125 | class = logging.handlers.TimedRotatingFileHandler |
|
132 | 126 | # 'D', 5 - rotate every 5 days |
|
133 | 127 | # you can also set 'h' or 'midnight' |
|
134 | 128 | args = ('rhodecode.log', 'D', 5, 10,) |
|
135 | 129 | level = INFO |
|
136 | 130 | formatter = generic |
@@ -1,19 +1,20 b'' | |||
|
1 | 1 | .. _contributing: |
|
2 | 2 | |
|
3 | 3 | Contributing to RhodeCode |
|
4 | 4 | ========================= |
|
5 | 5 | |
|
6 | 6 | |
|
7 | 7 | |
|
8 | Welcome to contribution guides and development docs of RhodeCode. | |
|
8 | Welcome to the contribution guides and development docs of RhodeCode. | |
|
9 | 9 | |
|
10 | 10 | |
|
11 | 11 | |
|
12 | 12 | .. toctree:: |
|
13 | 13 | :maxdepth: 1 |
|
14 | 14 | |
|
15 | overview | |
|
15 | 16 | testing/index |
|
16 | 17 | dev-setup |
|
17 | 18 | db-schema |
|
18 | 19 | dev-settings |
|
19 | 20 | api |
@@ -1,52 +1,52 b'' | |||
|
1 | 1 | ======================= |
|
2 | 2 | DB Schema and Migration |
|
3 | 3 | ======================= |
|
4 | 4 | |
|
5 | To create or alter tables in the database it's necessary to change a couple of | |
|
5 | To create or alter tables in the database, it's necessary to change a couple of | |
|
6 | 6 | files, apart from configuring the settings pointing to the latest database |
|
7 | 7 | schema. |
|
8 | 8 | |
|
9 | 9 | |
|
10 | 10 | Database Model and ORM |
|
11 | 11 | ---------------------- |
|
12 | 12 | |
|
13 | 13 | On ``rhodecode.model.db`` you will find the database definition of all tables and |
|
14 | fields. Any fresh install database will be correctly created by the definitions | |
|
15 | here. So, any change to this file |
|
|
14 | fields. Any freshly installed database will be correctly created by the definitions | |
|
15 | here. So, any change to this file will affect the tests without having to change | |
|
16 | 16 | any other file. |
|
17 | 17 | |
|
18 | A second layer are the busin |
|
|
18 | A second layer are the business classes inside ``rhodecode.model``. | |
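As a minimal sketch of what such a declarative definition looks like (the table and column names below are hypothetical, not the actual RhodeCode schema), a fresh database can be created straight from the model definitions:

```python
# Hypothetical sketch of a table declaration in the style of
# rhodecode.model.db -- names are illustrative only.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class UserBookmark(Base):
    __tablename__ = 'user_bookmarks'
    bookmark_id = Column(Integer, primary_key=True)
    title = Column(String(255), nullable=False)

# A fresh install creates every table directly from these definitions:
engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

Session = sessionmaker(bind=engine)
session = Session()
session.add(UserBookmark(title='docs'))
session.commit()
```

Because the tests build their database the same way, a change to the model file is immediately reflected in the test schema.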
|
19 | 19 | |
|
20 | 20 | |
|
21 | 21 | Database Migration |
|
22 | 22 | ------------------ |
|
23 | 23 | |
|
24 | 24 | Three files play a role when creating database migrations: |
|
25 | 25 | |
|
26 | 26 | * Database schema inside ``rhodecode.lib.dbmigrate`` |
|
27 | 27 | * Database version inside ``rhodecode.lib.dbmigrate`` |
|
28 | 28 | * Configuration ``__dbversion__`` at ``rhodecode.__init__`` |
|
29 | 29 | |
|
30 | 30 | |
|
31 | 31 | Schema is a snapshot of the database version BEFORE the migration. So, it's |
|
32 | 32 | the initial state before any changes were added. The name convention is |
|
33 | the latest release version where the snapshot w |
|
|
33 | the latest release version where the snapshot was created, and not the | |
|
34 | 34 | target version of this code. |
|
35 | 35 | |
|
36 | 36 | Version is the method that will define how to UPGRADE/DOWNGRADE the database. |
|
37 | 37 | |
|
38 | 38 | ``rhodecode.__init__`` contains only a variable that defines up to which version of |
|
39 | 39 | the database will be used to upgrade. E.g.: ``__dbversion__ = 45`` |
|
40 | 40 | |
|
41 | 41 | |
|
42 | 42 | For examples on how to create those files, please see the existing code. |
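A version script in the sqlalchemy-migrate style exposes an ``upgrade()`` and a ``downgrade()`` function that move the schema one version forward or back. The sketch below is illustrative only; the ``user_notes`` table is a hypothetical example, not part of the real migrations:

```python
# Hedged sketch of a dbmigrate-style version script. Each script
# defines upgrade() and downgrade(); the migration runner calls them
# in order until the target version is reached.
from sqlalchemy import (MetaData, Table, Column, Integer, String,
                        create_engine, inspect)

metadata = MetaData()

# The (hypothetical) table this version introduces.
user_notes = Table(
    'user_notes', metadata,
    Column('note_id', Integer, primary_key=True),
    Column('text', String(255)),
)

def upgrade(migrate_engine):
    # Move the schema forward one version.
    user_notes.create(bind=migrate_engine)

def downgrade(migrate_engine):
    # Roll the schema back one version.
    user_notes.drop(bind=migrate_engine)

engine = create_engine('sqlite://')
upgrade(engine)
tables = inspect(engine).get_table_names()
```

The real scripts live next to the schema snapshots in ``rhodecode.lib.dbmigrate``.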
|
43 | 43 | |
|
44 | 44 | |
|
45 | 45 | Migration Command |
|
46 | 46 | ^^^^^^^^^^^^^^^^^ |
|
47 | 47 | |
|
48 | After you changed the database ORM and migration files, you can run:: | |
|
48 | After you've changed the database ORM and migration files, you can run:: | |
|
49 | 49 | |
|
50 | 50 | paster upgrade-db <ini-file> |
|
51 | 51 | |
|
52 |
|
|
|
52 | The database will be upgraded up to the version defined in the ``__init__`` file. No newline at end of file |
@@ -1,46 +1,46 b'' | |||
|
1 | 1 | |
|
2 | 2 | ========================== |
|
3 | 3 | Settings for Development |
|
4 | 4 | ========================== |
|
5 | 5 | |
|
6 | 6 | |
|
7 | 7 | We have a few settings which are intended to be used only for development |
|
8 | 8 | purposes. This section contains an overview of them. |
|
9 | 9 | |
|
10 | 10 | |
|
11 | 11 | |
|
12 | 12 | `debug_style` |
|
13 | 13 | ============= |
|
14 | 14 | |
|
15 | 15 | Enables the section "Style" in the application. This section provides an |
|
16 | overview of all components which are found in the frontend |
|
|
16 | overview of all components which are found in the frontend of the | |
|
17 | 17 | application. |
|
18 | 18 | |
|
19 | 19 | |
|
20 | 20 | |
|
21 | 21 | `vcs.start_server` |
|
22 | 22 | ================== |
|
23 | 23 | |
|
24 | 24 | Starts the server as a subprocess while the system comes up. Intended usage is |
|
25 | 25 | to ease development. |
|
26 | 26 | |
|
27 | 27 | |
|
28 | 28 | |
|
29 | 29 | `[logging]` |
|
30 | 30 | =========== |
|
31 | 31 | |
|
32 | Use this to configure loggig to your current needs. The documentation of | |
|
33 | Python's `logging` module explains all details. The following snippets |
|
|
34 | useful for day to day development work. | |
|
32 | Use this to configure logging to your current needs. The documentation of | |
|
33 | Python's `logging` module explains all of the details. The following snippets | |
|
34 | are useful for day to day development work. | |
|
35 | 35 | |
|
36 | 36 | |
|
37 | 37 | Mute SQL output |
|
38 | 38 | --------------- |
|
39 | 39 | |
|
40 | 40 | The SQL log messages come out of the package `sqlalchemy.engine`:: |
|
41 | 41 | |
|
42 | 42 | [logger_sqlalchemy] |
|
43 | 43 | level = WARNING |
|
44 | 44 | handlers = console_sql |
|
45 | 45 | qualname = sqlalchemy.engine |
|
46 | 46 | propagate = 0 |
@@ -1,141 +1,144 b'' | |||
|
1 | .. _dev-setup: | |
|
1 | 2 | |
|
2 | 3 | =================== |
|
3 | 4 | Development setup |
|
4 | 5 | =================== |
|
5 | 6 | |
|
6 | 7 | |
|
7 | 8 | RhodeCode Enterprise runs inside a Nix managed environment. This ensures build |
|
8 | 9 | environment dependencies are correctly declared and installed during setup. |
|
9 | 10 | It also enables atomic upgrades, rollbacks, and multiple instances of RhodeCode |
|
10 | Enterprise for efficient cluster management. | |
|
11 | Enterprise running with isolation. | |
|
11 | 12 | |
|
12 | To set up RhodeCode Enterprise inside the Nix environment use the following steps: | |
|
13 | To set up RhodeCode Enterprise inside the Nix environment, use the following steps: | |
|
13 | 14 | |
|
14 | 15 | |
|
15 | 16 | |
|
16 | 17 | Setup Nix Package Manager |
|
17 | 18 | ------------------------- |
|
18 | 19 | |
|
19 | To install the Nix Package Manager please run:: | |
|
20 | To install the Nix Package Manager, please run:: | |
|
20 | 21 | |
|
21 | 22 | $ curl https://nixos.org/nix/install | sh |
|
22 | 23 | |
|
23 | or go to https://nixos.org/nix/ and follow the |
|
|
24 | Once this is correctly set up on your system you should be able to use the | |
|
24 | or go to https://nixos.org/nix/ and follow the installation instructions. | |
|
25 | Once this is correctly set up on your system, you should be able to use the | |
|
25 | 26 | following commands: |
|
26 | 27 | |
|
27 | 28 | * `nix-env` |
|
28 | 29 | |
|
29 | 30 | * `nix-shell` |
|
30 | 31 | |
|
31 | 32 | |
|
32 | 33 | .. tip:: |
|
33 | 34 | |
|
34 | 35 | Update your channels frequently by running ``nix-channel --update``. |
|
35 | 36 | |
|
36 | 37 | |
|
37 | Switch nix to latest STABLE channel | |
|
38 | ----------------------------------- | |
|
38 | Switch nix to the latest STABLE channel | |
|
39 | --------------------------------------- | |
|
39 | 40 | |
|
40 | 41 | Run:: |
|
41 | 42 | |
|
42 | 43 | nix-channel --add https://nixos.org/channels/nixos-16.03 nixpkgs |
|
43 | 44 | |
|
44 | 45 | Followed by:: |
|
45 | 46 | |
|
46 | 47 | nix-channel --update |
|
47 | 48 | |
|
48 | 49 | |
|
49 | 50 | Clone the required repositories |
|
50 | 51 | ------------------------------- |
|
51 | 52 | |
|
52 | After Nix is set up, clone the RhodeCode Enterprise Community Edition |
|
|
53 | After Nix is set up, clone the RhodeCode Enterprise Community Edition and | |
|
53 | 54 | RhodeCode VCSServer repositories into the same directory. |
|
54 | 55 | To do this, use the following example:: |
|
55 | 56 | |
|
56 | 57 | mkdir rhodecode-develop && cd rhodecode-develop |
|
57 | 58 | hg clone https://code.rhodecode.com/rhodecode-enterprise-ce |
|
58 | 59 | hg clone https://code.rhodecode.com/rhodecode-vcsserver |
|
59 | 60 | |
|
60 | 61 | .. note:: |
|
61 | 62 | |
|
62 | If you cannot clone the repository, please request read permissions |
|
|
63 | If you cannot clone the repository, please request read permissions | |
|
64 | via support@rhodecode.com | |
|
63 | 65 | |
|
64 | 66 | |
|
65 | 67 | |
|
66 | 68 | Enter the Development Shell |
|
67 | 69 | --------------------------- |
|
68 | 70 | |
|
69 | The final step is to start |
|
|
71 | The final step is to start the development shell. To do this, run the | |
|
70 | 72 | following command from inside the cloned repository:: |
|
71 | 73 | |
|
72 | 74 | cd ~/rhodecode-enterprise-ce |
|
73 | nix-shell |
|
|
75 | nix-shell | |
|
74 | 76 | |
|
75 | 77 | .. note:: |
|
76 | 78 | |
|
77 | 79 | On the first run, this will take a while to download and optionally compile |
|
78 | a few things. The |
|
|
80 | a few things. The following runs will be faster. The development shell works | |
|
81 | fine on MacOS and Linux platforms. | |
|
79 | 82 | |
|
80 | 83 | |
|
81 | 84 | |
|
82 | 85 | Creating a Development Configuration |
|
83 | 86 | ------------------------------------ |
|
84 | 87 | |
|
85 | 88 | To create a development environment for RhodeCode Enterprise, |
|
86 | 89 | use the following steps: |
|
87 | 90 | |
|
88 | 91 | 1. Create a copy of `~/rhodecode-enterprise-ce/configs/development.ini` |
|
89 | 92 | 2. Adjust the configuration settings to your needs |
|
90 | 93 | |
|
91 | 94 | .. note:: |
|
92 | 95 | |
|
93 | It is recommended to |
|
|
96 | It is recommended to use the name `dev.ini`. | |
|
94 | 97 | |
|
95 | 98 | |
|
96 | 99 | Setup the Development Database |
|
97 | 100 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
98 | 101 | |
|
99 | To create a development database use the following example. This is a one | |
|
102 | To create a development database, use the following example. This is a one | |
|
100 | 103 | time operation:: |
|
101 | 104 | |
|
102 | 105 | paster setup-rhodecode dev.ini \ |
|
103 | 106 | --user=admin --password=secret \ |
|
104 | 107 | --email=admin@example.com \ |
|
105 | 108 | --repos=~/my_dev_repos |
|
106 | 109 | |
|
107 | 110 | |
|
108 | 111 | Start the Development Server |
|
109 | 112 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
110 | 113 | |
|
111 | 114 | When starting the development server, you should start the vcsserver as a |
|
112 | separate process. To do this use one of the following examples: | |
|
115 | separate process. To do this, use one of the following examples: | |
|
113 | 116 | |
|
114 | 117 | 1. Set the `start.vcs_server` flag in the ``dev.ini`` file to true. For example: |
|
115 | 118 | |
|
116 | 119 | .. code-block:: python |
|
117 | 120 | |
|
118 | 121 | ### VCS CONFIG ### |
|
119 | 122 | ################## |
|
120 | 123 | vcs.start_server = true |
|
121 | 124 | vcs.server = localhost:9900 |
|
122 | 125 | vcs.server.log_level = debug |
|
123 | 126 | |
|
124 | 127 | Then start the server using the following command: ``rcserver dev.ini`` |
|
125 | 128 | |
|
126 | 129 | 2. Start the development server using the following example:: |
|
127 | 130 | |
|
128 | 131 | rcserver --with-vcsserver dev.ini |
|
129 | 132 | |
|
130 | 133 | 3. Start the development server in a different terminal using the following |
|
131 | 134 | example:: |
|
132 | 135 | |
|
133 | 136 | vcsserver |
|
134 | 137 | |
|
135 | 138 | |
|
136 | 139 | Run the Environment Tests |
|
137 | 140 | ^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
138 | 141 | |
|
139 | Please make sure that the test are passing to verify that your environment is | |
|
140 | set up correctly. More details about the tests are described in: | |
|
141 | :file:`/docs/dev/testing`. | |
|
142 | Please make sure that the tests are passing to verify that your environment is | |
|
143 | set up correctly. RhodeCode uses py.test to run tests. | |
|
144 | Please simply run ``make test`` to run the basic test suite. |
@@ -1,28 +1,28 b'' | |||
|
1 | 1 | |
|
2 | 2 | |
|
3 | 3 | ============================ |
|
4 | 4 | Testing and Specifications |
|
5 | 5 | ============================ |
|
6 | 6 | |
|
7 | 7 | |
|
8 | 8 | .. toctree:: |
|
9 | 9 | :maxdepth: 2 |
|
10 | 10 | |
|
11 | 11 | unit-and-functional |
|
12 | 12 | spec-by-example |
|
13 | 13 | naming-conventions |
|
14 | 14 | |
|
15 | 15 | |
|
16 | 16 | |
|
17 | 17 | Overview |
|
18 | 18 | ======== |
|
19 | 19 | |
|
20 | We have a quite |
|
|
20 | We have quite a large test suite inside of :file:`rhodecode/tests`, which is a mix |
|
21 | 21 | of unit tests and functional or integration tests. More details are in |
|
22 | 22 | :ref:`test-unit-and-functional`. |
|
23 | 23 | |
|
24 | 24 | |
|
25 | Apart from that we start to apply "Specification by Example" and maintain |
|
|
26 | collection of such specifications together with an implementation so that it |
|
|
27 | be validated in an automatic way. The files can be found in | |
|
25 | Apart from that, we are starting to apply "Specification by Example" and maintain | |
|
26 | a collection of such specifications together with an implementation so that it | |
|
27 | can be validated in an automatic way. The files can be found in | |
|
28 | 28 | :file:`acceptance_tests`. More details are in :ref:`test-spec-by-example`. |
@@ -1,75 +1,75 b'' | |||
|
1 | 1 | |
|
2 | 2 | .. _test-spec-by-example: |
|
3 | 3 | |
|
4 | 4 | ========================== |
|
5 | 5 | Specification by Example |
|
6 | 6 | ========================== |
|
7 | 7 | |
|
8 | 8 | |
|
9 | 9 | .. Avoid duplicating the quickstart instructions by importing the README |
|
10 | 10 | file. |
|
11 | 11 | |
|
12 | 12 | .. include:: ../../../acceptance_tests/README.rst |
|
13 | 13 | |
|
14 | 14 | |
|
15 | 15 | |
|
16 | 16 | Choices of technology and tools |
|
17 | 17 | =============================== |
|
18 | 18 | |
|
19 | 19 | |
|
20 | 20 | `nix` as runtime environment |
|
21 | 21 | ---------------------------- |
|
22 | 22 | |
|
23 | 23 | We settled on the `nix` tools to provide us with the needed environment for |
|
24 | 24 | running the tests. |
|
25 | 25 | |
|
26 | 26 | |
|
27 | 27 | |
|
28 | 28 | `Gherkin` as specification language |
|
29 | 29 | ------------------------------------ |
|
30 | 30 | |
|
31 | 31 | To specify by example, we settled on Gherkin as the semi-formal specification |
|
32 | 32 | language. |
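A minimal sketch of what such a specification could look like (the feature and steps are illustrative, not taken from the actual :file:`acceptance_tests`):

```gherkin
Feature: Repository browsing
  As a logged-in user I want to browse the files of a repository.

  Scenario: Opening the file view of a repository
    Given a repository "hello-world" exists
    And I am logged in as a regular user
    When I open the files page of "hello-world"
    Then I see the root directory listing
```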
|
33 | 33 | |
|
34 | 34 | |
|
35 | 35 | `py.test` as a runner |
|
36 | 36 | --------------------- |
|
37 | 37 | |
|
38 | 38 | After experimenting with `behave` and `py.test` our choice was `pytest-bdd` |
|
39 | 39 | because it allows us to use our existing knowledge about `py.test` and avoids |
|
40 | 40 | that we have to learn another tool. |
|
41 | 41 | |
|
42 | 42 | |
|
43 | 43 | |
|
44 | 44 | Concepts |
|
45 | 45 | ======== |
|
46 | 46 | |
|
47 | 47 | The logic is structured around the design pattern of "page objects". The |
|
48 | 48 | documentation of `python-selenium` contains a few more details about this |
|
49 | 49 | pattern. |
|
50 | 50 | |
|
51 | 51 | |
|
52 | 52 | |
|
53 | 53 | Page Objects |
|
54 | 54 | ------------ |
|
55 | 55 | |
|
56 | 56 | We introduce an abstraction class for every page which we have to interact with |
|
57 | 57 | in order to validate the specifications. |
|
58 | 58 | |
|
59 | 59 | The implementation for the page objects is inside of the module |
|
60 | 60 | :mod:`page_objects`. The class :class:`page_objects.base.BasePage` should be |
|
61 | 61 | used as a base for all page object implementations. |
|
62 | 62 | |
|
63 | 63 | |
|
64 | 64 | |
|
65 | 65 | Locators |
|
66 | 66 | -------- |
|
67 | 67 | |
|
68 | 68 | The specific information how to locate an element inside of the DOM tree of a |
|
69 | page is kept in a separate class. This class serves mainly as a data container |
|
|
69 | page is kept in a separate class. This class serves mainly as a data container; | |
|
70 | 70 | it shall not contain any logic. |
|
71 | 71 | |
|
72 | 72 | The reason for keeping the locators separate is that we expect a frequent need |
|
73 | for change whenever we work on our templates. In such a case it is more | |
|
74 | efficient to have all locators together and update them there instead of |
|
|
75 | to find |
|
|
73 | for change whenever we work on our templates. In such a case, it is more | |
|
74 | efficient to have all of the locators together and update them there instead of |
|
75 | having to find every locator inside of the logic of a page object. |
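The split between page objects and locator containers can be sketched as follows; every name here is hypothetical, and a small fake driver stands in for selenium so the sketch stays self-contained:

```python
# Illustrative sketch of the page-object / locator split described above.
# Class and locator names are hypothetical, not the real page_objects API.

class LoginPageLocators(object):
    # Pure data container: how to find elements, no logic.
    USERNAME_INPUT = ('id', 'username')
    PASSWORD_INPUT = ('id', 'password')
    SUBMIT_BUTTON = ('css selector', 'input[type=submit]')

class BasePage(object):
    def __init__(self, driver):
        self.driver = driver

class LoginPage(BasePage):
    locators = LoginPageLocators

    def login(self, username, password):
        # The page object holds the interaction logic and refers to
        # elements only through the locator container.
        self.driver.type(self.locators.USERNAME_INPUT, username)
        self.driver.type(self.locators.PASSWORD_INPUT, password)
        self.driver.click(self.locators.SUBMIT_BUTTON)

# Tiny fake driver so the sketch runs without a real browser.
class FakeDriver(object):
    def __init__(self):
        self.actions = []
    def type(self, locator, text):
        self.actions.append(('type', locator, text))
    def click(self, locator):
        self.actions.append(('click', locator))

driver = FakeDriver()
LoginPage(driver).login('admin', 'secret')
```

When a template changes, only the locator container needs updating; the interaction logic in the page object stays untouched.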
@@ -1,61 +1,61 b'' | |||
|
1 | 1 | |
|
2 | 2 | .. _test-unit-and-functional: |
|
3 | 3 | |
|
4 | 4 | =========================== |
|
5 | 5 | Unit and Functional Tests |
|
6 | 6 | =========================== |
|
7 | 7 | |
|
8 | 8 | |
|
9 | 9 | |
|
10 | 10 | py.test based test suite |
|
11 | 11 | ======================== |
|
12 | 12 | |
|
13 | 13 | |
|
14 | 14 | The test suite is in the folder :file:`rhodecode/tests/` and should be run with |
|
15 | 15 | the test runner `py.test` inside of your `nix-shell` environment:: |
|
16 | 16 | |
|
17 | 17 | # In case you need the cythonized version |
|
18 | 18 | CYTHONIZE=1 python setup.py develop --prefix=$tmp_path |
|
19 | 19 | |
|
20 | 20 | py.test rhodecode |
|
21 | 21 | |
|
22 | 22 | |
|
23 | 23 | |
|
24 | 24 | py.test integration |
|
25 | 25 | ------------------- |
|
26 | 26 | |
|
27 | 27 | The integration with the test runner is based on the following three parts: |
|
28 | 28 | |
|
29 | 29 | - `pytest_pylons` is a py.test plugin which does the integration with the |
|
30 | Pylons web framework. It sets up the Pylons environment based on |
|
|
30 | Pylons web framework. It sets up the Pylons environment based on the given ini | |
|
31 | 31 | file. |
|
32 | 32 | |
|
33 | 33 | Tests which depend on the Pylons environment to be set up must request the |
|
34 | 34 | fixture `pylonsapp`. |
|
35 | 35 | |
|
36 | 36 | - :file:`rhodecode/tests/plugin.py` contains the integration of py.test with |
|
37 | 37 | RhodeCode Enterprise itself. |
|
38 | 38 | |
|
39 | 39 | - :file:`conftest.py` plugins are used to provide a special integration for |
|
40 | 40 | certain groups of tests based on the directory location. |
|
41 | 41 | |
|
42 | 42 | |
|
43 | 43 | |
|
44 | 44 | VCS backend selection |
|
45 | 45 | --------------------- |
|
46 | 46 | |
|
47 | 47 | The py.test integration provides a parameter `--backends`. It will skip all |
|
48 | 48 | tests which are marked for other backends. |
|
49 | 49 | |
|
50 | 50 | To run only Subversion tests:: |
|
51 | 51 | |
|
52 | 52 | py.test rhodecode --backends=svn |
|
53 | 53 | |
|
54 | 54 | |
|
55 | 55 | |
|
56 | 56 | Frontend / Styling support |
|
57 | 57 | ========================== |
|
58 | 58 | |
|
59 | 59 | All relevant style components have an example inside of the "Style" section |
|
60 | 60 | within the application. Enable the setting `debug_style` to make this section |
|
61 | 61 | visible in your local instance of the application. |
@@ -1,82 +1,83 b'' | |||
|
1 | 1 | .. _rhodecode-release-notes-ref: |
|
2 | 2 | |
|
3 | 3 | Release Notes |
|
4 | 4 | ============= |
|
5 | 5 | |
|
6 | 6 | |RCE| 4.x Versions |
|
7 | 7 | ------------------ |
|
8 | 8 | |
|
9 | 9 | .. toctree:: |
|
10 | 10 | :maxdepth: 1 |
|
11 | 11 | |
|
12 | release-notes-4.2.0.rst | |
|
12 | 13 | release-notes-4.1.2.rst |
|
13 | 14 | release-notes-4.1.1.rst |
|
14 | 15 | release-notes-4.1.0.rst |
|
15 | 16 | release-notes-4.0.1.rst |
|
16 | 17 | release-notes-4.0.0.rst |
|
17 | 18 | |
|
18 | 19 | |RCE| 3.x Versions |
|
19 | 20 | ------------------ |
|
20 | 21 | |
|
21 | 22 | .. toctree:: |
|
22 | 23 | :maxdepth: 1 |
|
23 | 24 | |
|
24 | 25 | release-notes-3.8.4.rst |
|
25 | 26 | release-notes-3.8.3.rst |
|
26 | 27 | release-notes-3.8.2.rst |
|
27 | 28 | release-notes-3.8.1.rst |
|
28 | 29 | release-notes-3.8.0.rst |
|
29 | 30 | release-notes-3.7.1.rst |
|
30 | 31 | release-notes-3.7.0.rst |
|
31 | 32 | release-notes-3.6.1.rst |
|
32 | 33 | release-notes-3.6.0.rst |
|
33 | 34 | release-notes-3.5.2.rst |
|
34 | 35 | release-notes-3.5.1.rst |
|
35 | 36 | release-notes-3.5.0.rst |
|
36 | 37 | release-notes-3.4.1.rst |
|
37 | 38 | release-notes-3.4.0.rst |
|
38 | 39 | release-notes-3.3.4.rst |
|
39 | 40 | release-notes-3.3.3.rst |
|
40 | 41 | release-notes-3.3.2.rst |
|
41 | 42 | release-notes-3.3.1.rst |
|
42 | 43 | release-notes-3.3.0.rst |
|
43 | 44 | release-notes-3.2.3.rst |
|
44 | 45 | release-notes-3.2.2.rst |
|
45 | 46 | release-notes-3.2.1.rst |
|
46 | 47 | release-notes-3.2.0.rst |
|
47 | 48 | release-notes-3.1.1.rst |
|
48 | 49 | release-notes-3.1.0.rst |
|
49 | 50 | release-notes-3.0.2.rst |
|
50 | 51 | release-notes-3.0.1.rst |
|
51 | 52 | release-notes-3.0.0.rst |
|
52 | 53 | |
|
53 | 54 | |RCE| 2.x Versions |
|
54 | 55 | ------------------ |
|
55 | 56 | |
|
56 | 57 | .. toctree:: |
|
57 | 58 | :maxdepth: 1 |
|
58 | 59 | |
|
59 | 60 | release-notes-2.2.8.rst |
|
60 | 61 | release-notes-2.2.7.rst |
|
61 | 62 | release-notes-2.2.6.rst |
|
62 | 63 | release-notes-2.2.5.rst |
|
63 | 64 | release-notes-2.2.4.rst |
|
64 | 65 | release-notes-2.2.3.rst |
|
65 | 66 | release-notes-2.2.2.rst |
|
66 | 67 | release-notes-2.2.1.rst |
|
67 | 68 | release-notes-2.2.0.rst |
|
68 | 69 | release-notes-2.1.0.rst |
|
69 | 70 | release-notes-2.0.2.rst |
|
70 | 71 | release-notes-2.0.1.rst |
|
71 | 72 | release-notes-2.0.0.rst |
|
72 | 73 | |
|
73 | 74 | |RCE| 1.x Versions |
|
74 | 75 | ------------------ |
|
75 | 76 | |
|
76 | 77 | .. toctree:: |
|
77 | 78 | :maxdepth: 1 |
|
78 | 79 | |
|
79 | 80 | release-notes-1.7.2.rst |
|
80 | 81 | release-notes-1.7.1.rst |
|
81 | 82 | release-notes-1.7.0.rst |
|
82 | 83 | release-notes-1.6.0.rst |
@@ -1,163 +1,160 b'' | |||
|
1 | 1 | # Utility to generate the license information |
|
2 | 2 | # |
|
3 | 3 | # Usage: |
|
4 | 4 | # |
|
5 | 5 | # nix-build -I ~/dev license.nix -A result |
|
6 | 6 | # |
|
7 | 7 | # Afterwards ./result will contain the license information as JSON files. |
|
8 | 8 | # |
|
9 | 9 | # |
|
10 | 10 | # Overview |
|
11 | 11 | # |
|
12 | 12 | # Uses two steps to get the relevant license information: |
|
13 | 13 | # |
|
14 | 14 | # 1. Walk down the derivations based on "buildInputs" and |
|
15 | 15 | # "propagatedBuildInputs". This results in all dependencies based on the nix |
|
16 | 16 | # declarations. |
|
17 | 17 | # |
|
18 | 18 | # 2. Build Enterprise and query nix-store to get a list of runtime |
|
19 | 19 | # dependencies. The results from step 1 are then limited to the ones which |
|
20 | 20 | # are in this list. |
|
21 | 21 | # |
|
22 | 22 | # The result is then available in ./result/license.json. |
|
23 | 23 | # |
|
24 | 24 | |
|
25 | 25 | |
|
26 | 26 | let |
|
27 | 27 | |
|
28 | 28 | nixpkgs = import <nixpkgs> {}; |
|
29 | 29 | |
|
30 | 30 | stdenv = nixpkgs.stdenv; |
|
31 | 31 | |
|
32 | 32 | # Enterprise as simple as possible, goal here is just to identify the runtime |
|
33 | 33 | # dependencies. Ideally we could avoid building Enterprise at all and somehow |
|
34 | 34 | # figure it out without calling into nix-store. |
|
35 | 35 | enterprise = import ./default.nix { |
|
36 | 36 | doCheck = false; |
|
37 | with_vcsserver = false; | |
|
38 | with_pyramid = false; | |
|
39 | cythonize = false; | |
|
40 | 37 | }; |
|
41 | 38 | |
|
42 | 39 | # For a given derivation, return the list of all dependencies |
|
43 | 40 | drvToDependencies = drv: nixpkgs.lib.flatten [ |
|
44 | 41 | drv.nativeBuildInputs or [] |
|
45 | 42 | drv.propagatedNativeBuildInputs or [] |
|
46 | 43 | ]; |
|
47 | 44 | |
|
48 | 45 | # Transform the given derivation into the meta information which we need in |
|
49 | 46 | # the resulting JSON files. |
|
50 | 47 | drvToMeta = drv: { |
|
51 | 48 | name = drv.name or "UNNAMED"; |
|
52 | 49 | license = if drv ? meta.license then drv.meta.license else "UNKNOWN"; |
|
53 | 50 | }; |
|
54 | 51 | |
|
55 | 52 | # Walk the tree of buildInputs and propagatedBuildInputs and return it as a |
|
56 | 53 | # flat list. Duplicates are avoided. |
|
57 | 54 | listDrvDependencies = drv: let |
|
58 | 55 | addElement = element: seen: |
|
59 | 56 | if (builtins.elem element seen) |
|
60 | 57 | then seen |
|
61 | 58 | else let |
|
62 | 59 | newSeen = seen ++ [ element ]; |
|
63 | 60 | newDeps = drvToDependencies element; |
|
64 | 61 | in nixpkgs.lib.fold addElement newSeen newDeps; |
|
65 | 62 | initialElements = drvToDependencies drv; |
|
66 | 63 | in nixpkgs.lib.fold addElement [] initialElements; |
|
67 | 64 | |
|
68 | 65 | # Reads in a file with store paths and returns a list of derivation names. |
|
69 | 66 | # |
|
70 | 67 | # Reads the file, splits the lines, then removes the prefix, so that we |
|
71 | 68 | # end up with a list of derivation names in the end. |
|
72 | 69 | storePathsToDrvNames = srcPath: let |
|
73 | 70 | rawStorePaths = nixpkgs.lib.removeSuffix "\n" ( |
|
74 | 71 | builtins.readFile srcPath); |
|
75 | 72 | storePaths = nixpkgs.lib.splitString "\n" rawStorePaths; |
|
76 | 73 | # TODO: johbo: Would be nice to use some sort of utility here to convert |
|
77 | 74 | # the path to a derivation name. |
|
78 | 75 | storePathPrefix = ( |
|
79 | 76 | builtins.stringLength "/nix/store/zwy7aavnif9ayw30rya1k6xiacafzzl6-"); |
|
80 | 77 | storePathToName = path: |
|
81 | 78 | builtins.substring storePathPrefix (builtins.stringLength path) path; |
|
82 | 79 | in (map storePathToName storePaths); |
|
83 | 80 | |
|
84 | 81 | in rec { |
|
85 | 82 | |
|
86 | 83 | # Build Enterprise and call nix-store to retrieve the runtime |
|
87 | 84 | # dependencies. The result is available in the nix store. |
|
88 | 85 | runtimeDependencies = stdenv.mkDerivation { |
|
89 | 86 | name = "runtime-dependencies"; |
|
90 | 87 | buildInputs = [ |
|
91 | 88 | # Needed to query the store |
|
92 | 89 | nixpkgs.nix |
|
93 | 90 | ]; |
|
94 | 91 | unpackPhase = '' |
|
95 | 92 | echo "Nothing to unpack" |
|
96 | 93 | ''; |
|
97 | 94 | buildPhase = '' |
|
98 | 95 | # Get a list of runtime dependencies |
|
99 | 96 | nix-store -q --references ${enterprise} > nix-store-references |
|
100 | 97 | ''; |
|
101 | 98 | installPhase = '' |
|
102 | 99 | mkdir -p $out |
|
103 | 100 | cp -v nix-store-references $out/ |
|
104 | 101 | ''; |
|
105 | 102 | }; |
|
106 | 103 | |
|
107 | 104 | # Produce the license overview files. |
|
108 | 105 | result = let |
|
109 | 106 | |
|
110 | 107 | # Dependencies according to the nix-store |
|
111 | 108 | runtimeDependencyNames = ( |
|
112 | 109 | storePathsToDrvNames "${runtimeDependencies}/nix-store-references"); |
|
113 | 110 | |
|
114 | 111 | # Dependencies based on buildInputs and propagatedBuildInputs |
|
115 | 112 | enterpriseAllDependencies = listDrvDependencies enterprise; |
|
116 | 113 | enterpriseRuntimeDependencies = let |
|
117 | 114 | elemName = element: element.name or "UNNAMED"; |
|
118 | 115 | isRuntime = element: builtins.elem (elemName element) runtimeDependencyNames; |
|
119 | 116 | in builtins.filter isRuntime enterpriseAllDependencies; |
|
120 | 117 | |
|
121 | 118 | # Extract relevant meta information |
|
122 | 119 | enterpriseAllLicenses = map drvToMeta enterpriseAllDependencies; |
|
123 | 120 | enterpriseRuntimeLicenses = map drvToMeta enterpriseRuntimeDependencies; |
|
124 | 121 | |
|
125 | 122 | in stdenv.mkDerivation { |
|
126 | 123 | |
|
127 | 124 | name = "licenses"; |
|
128 | 125 | |
|
129 | 126 | buildInputs = []; |
|
130 | 127 | |
|
131 | 128 | unpackPhase = '' |
|
132 | 129 | echo "Nothing to unpack" |
|
133 | 130 | ''; |
|
134 | 131 | |
|
135 | 132 | buildPhase = '' |
|
136 | 133 | mkdir build |
|
137 | 134 | |
|
138 | 135 | # Copy list of runtime dependencies for the Python processor |
|
139 | 136 | cp "${runtimeDependencies}/nix-store-references" ./build/nix-store-references |
|
140 | 137 | |
|
141 | 138 | # All licenses which we found by walking buildInputs and |
|
142 | 139 | # propagatedBuildInputs |
|
143 | 140 | cat > build/all-licenses.json <<EOF |
|
144 | 141 | ${builtins.toJSON enterpriseAllLicenses} |
|
145 | 142 | EOF |
|
146 | 143 | |
|
147 | 144 | # License information for our runtime dependencies only. Basically all |
|
148 | 145 | # licenses limited to the items which were also reported by nix-store as |
|
149 | 146 | # a dependency. |
|
150 | 147 | cat > build/licenses.json <<EOF |
|
151 | 148 | ${builtins.toJSON enterpriseRuntimeLicenses} |
|
152 | 149 | EOF |
|
153 | 150 | ''; |
|
154 | 151 | |
|
155 | 152 | installPhase = '' |
|
156 | 153 | mkdir -p $out |
|
157 | 154 | |
|
158 | 155 | # Store it all, that helps when things go wrong |
|
159 | 156 | cp -rv ./build/* $out |
|
160 | 157 | ''; |
|
161 | 158 | }; |
|
162 | 159 | |
|
163 | 160 | } |
@@ -1,165 +1,273 b'' | |||
|
1 | 1 | # Overrides for the generated python-packages.nix |
|
2 | 2 | # |
|
3 | 3 | # This function is intended to be used as an extension to the generated file |
|
4 | 4 | # python-packages.nix. The main objective is to add needed dependencies of C |
|
5 | 5 | # libraries and tweak the build instructions where needed. |
|
6 | 6 | |
|
7 | 7 | { pkgs, basePythonPackages }: |
|
8 | 8 | |
|
9 | 9 | let |
|
10 | 10 | sed = "sed -i"; |
|
11 | localLicenses = { | |
|
12 | repoze = { | |
|
13 | fullName = "Repoze License"; | |
|
14 | url = http://www.repoze.org/LICENSE.txt; | |
|
15 | }; | |
|
16 | }; | |
|
11 | 17 | in |
|
12 | 18 | |
|
13 | 19 | self: super: { |
|
14 | 20 | |
|
21 | appenlight-client = super.appenlight-client.override (attrs: { | |
|
22 | meta = { | |
|
23 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
24 | }; | |
|
25 | }); | |
|
26 | ||
|
27 | future = super.future.override (attrs: { | |
|
28 | meta = { | |
|
29 | license = [ pkgs.lib.licenses.mit ]; | |
|
30 | }; | |
|
31 | }); | |
|
32 | ||
|
15 | 33 | gnureadline = super.gnureadline.override (attrs: { |
|
16 | 34 | buildInputs = attrs.buildInputs ++ [ |
|
17 | 35 | pkgs.ncurses |
|
18 | 36 | ]; |
|
19 | 37 | patchPhase = '' |
|
20 | 38 | substituteInPlace setup.py --replace "/bin/bash" "${pkgs.bash}/bin/bash" |
|
21 | 39 | ''; |
|
22 | 40 | }); |
|
23 | 41 | |
|
24 | 42 | gunicorn = super.gunicorn.override (attrs: { |
|
25 | 43 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
26 | 44 | # johbo: futures is needed as long as we are on Python 2, otherwise |
|
27 | 45 | # gunicorn explodes if used with multiple threads per worker. |
|
28 | 46 | self.futures |
|
29 | 47 | ]; |
|
30 | 48 | }); |
|
31 | 49 | |
|
32 | 50 | ipython = super.ipython.override (attrs: { |
|
33 | 51 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
34 | 52 | self.gnureadline |
|
35 | 53 | ]; |
|
36 | 54 | }); |
|
37 | 55 | |
|
38 | 56 | kombu = super.kombu.override (attrs: { |
|
39 | 57 | # The current version of kombu needs some patching to work with the |
|
40 | 58 | # other libs. Should be removed once we update celery and kombu. |
|
41 | 59 | patches = [ |
|
42 | 60 | ./patch-kombu-py-2-7-11.diff |
|
43 | 61 | ./patch-kombu-msgpack.diff |
|
44 | 62 | ]; |
|
45 | 63 | }); |
|
46 | 64 | |
|
47 | 65 | lxml = super.lxml.override (attrs: { |
|
48 | 66 | buildInputs = with self; [ |
|
49 | 67 | pkgs.libxml2 |
|
50 | 68 | pkgs.libxslt |
|
51 | 69 | ]; |
|
52 | 70 | }); |
|
53 | 71 | |
|
54 | 72 | MySQL-python = super.MySQL-python.override (attrs: { |
|
55 | 73 | buildInputs = attrs.buildInputs ++ [ |
|
56 | 74 | pkgs.openssl |
|
57 | 75 | ]; |
|
58 | 76 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
59 | 77 | pkgs.mysql.lib |
|
60 | 78 | pkgs.zlib |
|
61 | 79 | ]; |
|
62 | 80 | }); |
|
63 | 81 | |
|
64 | 82 | psutil = super.psutil.override (attrs: { |
|
65 | 83 | buildInputs = attrs.buildInputs ++ |
|
66 | 84 | pkgs.lib.optional pkgs.stdenv.isDarwin pkgs.darwin.IOKit; |
|
67 | 85 | }); |
|
68 | 86 | |
|
69 | 87 | psycopg2 = super.psycopg2.override (attrs: { |
|
70 | 88 | buildInputs = attrs.buildInputs ++ |
|
71 | 89 | pkgs.lib.optional pkgs.stdenv.isDarwin pkgs.openssl; |
|
72 | 90 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
73 | 91 | pkgs.postgresql |
|
74 | 92 | ]; |
|
93 | meta = { | |
|
94 | license = pkgs.lib.licenses.lgpl3Plus; | |
|
95 | }; | |
|
75 | 96 | }); |
|
76 | 97 | |
|
77 | 98 | pycurl = super.pycurl.override (attrs: { |
|
78 | 99 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
79 | 100 | pkgs.curl |
|
80 | 101 | pkgs.openssl |
|
81 | 102 | ]; |
|
82 | 103 | preConfigure = '' |
|
83 | 104 | substituteInPlace setup.py --replace '--static-libs' '--libs' |
|
84 | 105 | export PYCURL_SSL_LIBRARY=openssl |
|
85 | 106 | ''; |
|
107 | meta = { | |
|
108 | # TODO: It is LGPL and MIT | |
|
109 | license = pkgs.lib.licenses.mit; | |
|
110 | }; | |
|
86 | 111 | }); |
|
87 | 112 | |
|
88 | 113 | Pylons = super.Pylons.override (attrs: { |
|
89 | 114 | name = "Pylons-1.0.1-patch1"; |
|
90 | 115 | src = pkgs.fetchgit { |
|
91 | 116 | url = "https://code.rhodecode.com/upstream/pylons"; |
|
92 | 117 | rev = "707354ee4261b9c10450404fc9852ccea4fd667d"; |
|
93 | 118 | sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e"; |
|
94 | 119 | }; |
|
95 | 120 | }); |
|
96 | 121 | |
|
97 | 122 | pyramid = super.pyramid.override (attrs: { |
|
98 | 123 | postFixup = '' |
|
99 | 124 | wrapPythonPrograms |
|
100 | 125 | # TODO: johbo: "wrapPython" adds this magic line which |
|
101 | 126 | # confuses pserve. |
|
102 | 127 | ${sed} '/import sys; sys.argv/d' $out/bin/.pserve-wrapped |
|
103 | 128 | ''; |
|
129 | meta = { | |
|
130 | license = localLicenses.repoze; | |
|
131 | }; | |
|
132 | }); | |
|
133 | ||
|
134 | pyramid-debugtoolbar = super.pyramid-debugtoolbar.override (attrs: { | |
|
135 | meta = { | |
|
136 | license = [ pkgs.lib.licenses.bsdOriginal localLicenses.repoze ]; | |
|
137 | }; | |
|
104 | 138 | }); |
|
105 | 139 | |
|
106 | 140 | Pyro4 = super.Pyro4.override (attrs: { |
|
107 | 141 | # TODO: Was not able to generate this version, needs further |
|
108 | 142 | # investigation. |
|
109 | 143 | name = "Pyro4-4.35"; |
|
110 | 144 | src = pkgs.fetchurl { |
|
111 | 145 | url = "https://pypi.python.org/packages/source/P/Pyro4/Pyro4-4.35.src.tar.gz"; |
|
112 | 146 | md5 = "cbe6cb855f086a0f092ca075005855f3"; |
|
113 | 147 | }; |
|
114 | 148 | }); |
|
115 | 149 | |
|
116 | 150 | pysqlite = super.pysqlite.override (attrs: { |
|
117 | 151 | propagatedBuildInputs = [ |
|
118 | 152 | pkgs.sqlite |
|
119 | 153 | ]; |
|
154 | meta = { | |
|
155 | license = [ pkgs.lib.licenses.zlib pkgs.lib.licenses.libpng ]; | |
|
156 | }; | |
|
120 | 157 | }); |
|
121 | 158 | |
|
122 | 159 | pytest-runner = super.pytest-runner.override (attrs: { |
|
123 | 160 | propagatedBuildInputs = [ |
|
124 | 161 | self.setuptools-scm |
|
125 | 162 | ]; |
|
126 | 163 | }); |
|
127 | 164 | |
|
128 | 165 | python-ldap = super.python-ldap.override (attrs: { |
|
129 | 166 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
130 | 167 | pkgs.cyrus_sasl |
|
131 | 168 | pkgs.openldap |
|
132 | 169 | pkgs.openssl |
|
133 | 170 | ]; |
|
134 | 171 | NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl}/include/sasl"; |
|
135 | 172 | }); |
|
136 | 173 | |
|
137 | 174 | python-pam = super.python-pam.override (attrs: |
|
138 | 175 | let |
|
139 | 176 | includeLibPam = pkgs.stdenv.isLinux; |
|
140 | 177 | in { |
|
141 | 178 | # TODO: johbo: Move the option up into the default.nix, we should |
|
142 | 179 | # include python-pam only on supported platforms. |
|
143 | 180 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ |
|
144 | 181 | pkgs.lib.optional includeLibPam [ |
|
145 | 182 | pkgs.pam |
|
146 | 183 | ]; |
|
147 | 184 | # TODO: johbo: Check if this can be avoided, or transform into |
|
148 | 185 | # a real patch |
|
149 | 186 | patchPhase = pkgs.lib.optionals includeLibPam '' |
|
150 | 187 | substituteInPlace pam.py \ |
|
151 | 188 | --replace 'find_library("pam")' '"${pkgs.pam}/lib/libpam.so.0"' |
|
152 | 189 | ''; |
|
153 | 190 | }); |
|
154 | 191 | |
|
155 | 192 | rhodecode-tools = super.rhodecode-tools.override (attrs: { |
|
156 | 193 | patches = [ |
|
157 | 194 | ./patch-rhodecode-tools-setup.diff |
|
158 | 195 | ]; |
|
159 | 196 | }); |
|
160 | 197 | |
|
198 | URLObject = super.URLObject.override (attrs: { | |
|
199 | meta = { | |
|
200 | license = { | |
|
201 | spdxId = "Unlicense"; | |
|
202 | fullName = "The Unlicense"; | |
|
203 | url = http://unlicense.org/; | |
|
204 | }; | |
|
205 | }; | |
|
206 | }); | |
|
207 | ||
|
208 | amqplib = super.amqplib.override (attrs: { | |
|
209 | meta = { | |
|
210 | license = pkgs.lib.licenses.lgpl3; | |
|
211 | }; | |
|
212 | }); | |
|
213 | ||
|
214 | docutils = super.docutils.override (attrs: { | |
|
215 | meta = { | |
|
216 | license = pkgs.lib.licenses.bsd2; | |
|
217 | }; | |
|
218 | }); | |
|
219 | ||
|
220 | colander = super.colander.override (attrs: { | |
|
221 | meta = { | |
|
222 | license = localLicenses.repoze; | |
|
223 | }; | |
|
224 | }); | |
|
225 | ||
|
226 | pyramid-beaker = super.pyramid-beaker.override (attrs: { | |
|
227 | meta = { | |
|
228 | license = localLicenses.repoze; | |
|
229 | }; | |
|
230 | }); | |
|
231 | ||
|
232 | pyramid-mako = super.pyramid-mako.override (attrs: { | |
|
233 | meta = { | |
|
234 | license = localLicenses.repoze; | |
|
235 | }; | |
|
236 | }); | |
|
237 | ||
|
238 | repoze.lru = super.repoze.lru.override (attrs: { | |
|
239 | meta = { | |
|
240 | license = localLicenses.repoze; | |
|
241 | }; | |
|
242 | }); | |
|
243 | ||
|
244 | recaptcha-client = super.recaptcha-client.override (attrs: { | |
|
245 | meta = { | |
|
246 | # TODO: It is MIT/X11 | |
|
247 | license = pkgs.lib.licenses.mit; | |
|
248 | }; | |
|
249 | }); | |
|
250 | ||
|
251 | python-editor = super.python-editor.override (attrs: { | |
|
252 | meta = { | |
|
253 | license = pkgs.lib.licenses.asl20; | |
|
254 | }; | |
|
255 | }); | |
|
256 | ||
|
257 | translationstring = super.translationstring.override (attrs: { | |
|
258 | meta = { | |
|
259 | license = localLicenses.repoze; | |
|
260 | }; | |
|
261 | }); | |
|
262 | ||
|
263 | venusian = super.venusian.override (attrs: { | |
|
264 | meta = { | |
|
265 | license = localLicenses.repoze; | |
|
266 | }; | |
|
267 | }); | |
|
268 | ||
|
161 | 269 | # Avoid that setuptools is replaced, this leads to trouble |
|
162 | 270 | # with buildPythonPackage. |
|
163 | 271 | setuptools = basePythonPackages.setuptools; |
|
164 | 272 | |
|
165 | 273 | } |
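
The overrides above all follow the same overlay shape: take the package from `super`, extend its inputs, and attach `meta.license`. A minimal sketch of that pattern, with a hypothetical package name `example-pkg` and assuming `pkgs` is in scope as in the file above:

```nix
# Hypothetical overlay fragment illustrating the override pattern
# used throughout this diff: extend an existing package's build
# inputs and attach license metadata, keeping everything else.
self: super: {
  example-pkg = super.example-pkg.override (attrs: {
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      pkgs.openssl
    ];
    meta = {
      license = pkgs.lib.licenses.mit;
    };
  });
}
```

Passing a function of `attrs` lets each override append to the original input lists rather than replace them, which is why the diff can add `pkgs.openssl` or a `meta` block without restating the package definition.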
@@ -1,1263 +1,1641 @@
|
1 | 1 | { |
|
2 | 2 | Babel = super.buildPythonPackage { |
|
3 | 3 | name = "Babel-1.3"; |
|
4 | 4 | buildInputs = with self; []; |
|
5 | 5 | doCheck = false; |
|
6 | 6 | propagatedBuildInputs = with self; [pytz]; |
|
7 | 7 | src = fetchurl { |
|
8 | 8 | url = "https://pypi.python.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz"; |
|
9 | 9 | md5 = "5264ceb02717843cbc9ffce8e6e06bdb"; |
|
10 | 10 | }; |
|
11 | meta = { | |
|
12 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
13 | }; | |
|
11 | 14 | }; |
|
12 | 15 | Beaker = super.buildPythonPackage { |
|
13 | 16 | name = "Beaker-1.7.0"; |
|
14 | 17 | buildInputs = with self; []; |
|
15 | 18 | doCheck = false; |
|
16 | 19 | propagatedBuildInputs = with self; []; |
|
17 | 20 | src = fetchurl { |
|
18 | 21 | url = "https://pypi.python.org/packages/97/8e/409d2e7c009b8aa803dc9e6f239f1db7c3cdf578249087a404e7c27a505d/Beaker-1.7.0.tar.gz"; |
|
19 | 22 | md5 = "386be3f7fe427358881eee4622b428b3"; |
|
20 | 23 | }; |
|
24 | meta = { | |
|
25 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
26 | }; | |
|
21 | 27 | }; |
|
22 | 28 | CProfileV = super.buildPythonPackage { |
|
23 | 29 | name = "CProfileV-1.0.6"; |
|
24 | 30 | buildInputs = with self; []; |
|
25 | 31 | doCheck = false; |
|
26 | 32 | propagatedBuildInputs = with self; [bottle]; |
|
27 | 33 | src = fetchurl { |
|
28 | 34 | url = "https://pypi.python.org/packages/eb/df/983a0b6cfd3ac94abf023f5011cb04f33613ace196e33f53c86cf91850d5/CProfileV-1.0.6.tar.gz"; |
|
29 | 35 | md5 = "08c7c242b6e64237bc53c5d13537e03d"; |
|
30 | 36 | }; |
|
37 | meta = { | |
|
38 | license = [ pkgs.lib.licenses.mit ]; | |
|
39 | }; | |
|
31 | 40 | }; |
|
32 | 41 | Fabric = super.buildPythonPackage { |
|
33 | 42 | name = "Fabric-1.10.0"; |
|
34 | 43 | buildInputs = with self; []; |
|
35 | 44 | doCheck = false; |
|
36 | 45 | propagatedBuildInputs = with self; [paramiko]; |
|
37 | 46 | src = fetchurl { |
|
38 | 47 | url = "https://pypi.python.org/packages/e3/5f/b6ebdb5241d5ec9eab582a5c8a01255c1107da396f849e538801d2fe64a5/Fabric-1.10.0.tar.gz"; |
|
39 | 48 | md5 = "2cb96473387f0e7aa035210892352f4a"; |
|
40 | 49 | }; |
|
50 | meta = { | |
|
51 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
52 | }; | |
|
41 | 53 | }; |
|
42 | 54 | FormEncode = super.buildPythonPackage { |
|
43 | 55 | name = "FormEncode-1.2.4"; |
|
44 | 56 | buildInputs = with self; []; |
|
45 | 57 | doCheck = false; |
|
46 | 58 | propagatedBuildInputs = with self; []; |
|
47 | 59 | src = fetchurl { |
|
48 | 60 | url = "https://pypi.python.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz"; |
|
49 | 61 | md5 = "6bc17fb9aed8aea198975e888e2077f4"; |
|
50 | 62 | }; |
|
63 | meta = { | |
|
64 | license = [ pkgs.lib.licenses.psfl ]; | |
|
65 | }; | |
|
51 | 66 | }; |
|
52 | 67 | Jinja2 = super.buildPythonPackage { |
|
53 | 68 | name = "Jinja2-2.7.3"; |
|
54 | 69 | buildInputs = with self; []; |
|
55 | 70 | doCheck = false; |
|
56 | 71 | propagatedBuildInputs = with self; [MarkupSafe]; |
|
57 | 72 | src = fetchurl { |
|
58 | 73 | url = "https://pypi.python.org/packages/b0/73/eab0bca302d6d6a0b5c402f47ad1760dc9cb2dd14bbc1873ad48db258e4d/Jinja2-2.7.3.tar.gz"; |
|
59 | 74 | md5 = "b9dffd2f3b43d673802fe857c8445b1a"; |
|
60 | 75 | }; |
|
76 | meta = { | |
|
77 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
78 | }; | |
|
61 | 79 | }; |
|
62 | 80 | Mako = super.buildPythonPackage { |
|
63 | 81 | name = "Mako-1.0.1"; |
|
64 | 82 | buildInputs = with self; []; |
|
65 | 83 | doCheck = false; |
|
66 | 84 | propagatedBuildInputs = with self; [MarkupSafe]; |
|
67 | 85 | src = fetchurl { |
|
68 | 86 | url = "https://pypi.python.org/packages/8e/a4/aa56533ecaa5f22ca92428f74e074d0c9337282933c722391902c8f9e0f8/Mako-1.0.1.tar.gz"; |
|
69 | 87 | md5 = "9f0aafd177b039ef67b90ea350497a54"; |
|
70 | 88 | }; |
|
89 | meta = { | |
|
90 | license = [ pkgs.lib.licenses.mit ]; | |
|
91 | }; | |
|
71 | 92 | }; |
|
72 | 93 | Markdown = super.buildPythonPackage { |
|
73 | 94 | name = "Markdown-2.6.2"; |
|
74 | 95 | buildInputs = with self; []; |
|
75 | 96 | doCheck = false; |
|
76 | 97 | propagatedBuildInputs = with self; []; |
|
77 | 98 | src = fetchurl { |
|
78 | 99 | url = "https://pypi.python.org/packages/62/8b/83658b5f6c220d5fcde9f9852d46ea54765d734cfbc5a9f4c05bfc36db4d/Markdown-2.6.2.tar.gz"; |
|
79 | 100 | md5 = "256d19afcc564dc4ce4c229bb762f7ae"; |
|
80 | 101 | }; |
|
102 | meta = { | |
|
103 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
104 | }; | |
|
81 | 105 | }; |
|
82 | 106 | MarkupSafe = super.buildPythonPackage { |
|
83 | 107 | name = "MarkupSafe-0.23"; |
|
84 | 108 | buildInputs = with self; []; |
|
85 | 109 | doCheck = false; |
|
86 | 110 | propagatedBuildInputs = with self; []; |
|
87 | 111 | src = fetchurl { |
|
88 | 112 | url = "https://pypi.python.org/packages/c0/41/bae1254e0396c0cc8cf1751cb7d9afc90a602353695af5952530482c963f/MarkupSafe-0.23.tar.gz"; |
|
89 | 113 | md5 = "f5ab3deee4c37cd6a922fb81e730da6e"; |
|
90 | 114 | }; |
|
115 | meta = { | |
|
116 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
117 | }; | |
|
91 | 118 | }; |
|
92 | 119 | MySQL-python = super.buildPythonPackage { |
|
93 | 120 | name = "MySQL-python-1.2.5"; |
|
94 | 121 | buildInputs = with self; []; |
|
95 | 122 | doCheck = false; |
|
96 | 123 | propagatedBuildInputs = with self; []; |
|
97 | 124 | src = fetchurl { |
|
98 | 125 | url = "https://pypi.python.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip"; |
|
99 | 126 | md5 = "654f75b302db6ed8dc5a898c625e030c"; |
|
100 | 127 | }; |
|
128 | meta = { | |
|
129 | license = [ pkgs.lib.licenses.gpl1 ]; | |
|
130 | }; | |
|
101 | 131 | }; |
|
102 | 132 | Paste = super.buildPythonPackage { |
|
103 | 133 | name = "Paste-2.0.2"; |
|
104 | 134 | buildInputs = with self; []; |
|
105 | 135 | doCheck = false; |
|
106 | 136 | propagatedBuildInputs = with self; [six]; |
|
107 | 137 | src = fetchurl { |
|
108 | 138 | url = "https://pypi.python.org/packages/d5/8d/0f8ac40687b97ff3e07ebd1369be20bdb3f93864d2dc3c2ff542edb4ce50/Paste-2.0.2.tar.gz"; |
|
109 | 139 | md5 = "4bfc8a7eaf858f6309d2ac0f40fc951c"; |
|
110 | 140 | }; |
|
141 | meta = { | |
|
142 | license = [ pkgs.lib.licenses.mit ]; | |
|
143 | }; | |
|
111 | 144 | }; |
|
112 | 145 | PasteDeploy = super.buildPythonPackage { |
|
113 | 146 | name = "PasteDeploy-1.5.2"; |
|
114 | 147 | buildInputs = with self; []; |
|
115 | 148 | doCheck = false; |
|
116 | 149 | propagatedBuildInputs = with self; []; |
|
117 | 150 | src = fetchurl { |
|
118 | 151 | url = "https://pypi.python.org/packages/0f/90/8e20cdae206c543ea10793cbf4136eb9a8b3f417e04e40a29d72d9922cbd/PasteDeploy-1.5.2.tar.gz"; |
|
119 | 152 | md5 = "352b7205c78c8de4987578d19431af3b"; |
|
120 | 153 | }; |
|
154 | meta = { | |
|
155 | license = [ pkgs.lib.licenses.mit ]; | |
|
156 | }; | |
|
121 | 157 | }; |
|
122 | 158 | PasteScript = super.buildPythonPackage { |
|
123 | 159 | name = "PasteScript-1.7.5"; |
|
124 | 160 | buildInputs = with self; []; |
|
125 | 161 | doCheck = false; |
|
126 | 162 | propagatedBuildInputs = with self; [Paste PasteDeploy]; |
|
127 | 163 | src = fetchurl { |
|
128 | 164 | url = "https://pypi.python.org/packages/a5/05/fc60efa7c2f17a1dbaeccb2a903a1e90902d92b9d00eebabe3095829d806/PasteScript-1.7.5.tar.gz"; |
|
129 | 165 | md5 = "4c72d78dcb6bb993f30536842c16af4d"; |
|
130 | 166 | }; |
|
167 | meta = { | |
|
168 | license = [ pkgs.lib.licenses.mit ]; | |
|
169 | }; | |
|
131 | 170 | }; |
|
132 | 171 | Pygments = super.buildPythonPackage { |
|
133 | 172 | name = "Pygments-2.0.2"; |
|
134 | 173 | buildInputs = with self; []; |
|
135 | 174 | doCheck = false; |
|
136 | 175 | propagatedBuildInputs = with self; []; |
|
137 | 176 | src = fetchurl { |
|
138 | 177 | url = "https://pypi.python.org/packages/f4/c6/bdbc5a8a112256b2b6136af304dbae93d8b1ef8738ff2d12a51018800e46/Pygments-2.0.2.tar.gz"; |
|
139 | 178 | md5 = "238587a1370d62405edabd0794b3ec4a"; |
|
140 | 179 | }; |
|
180 | meta = { | |
|
181 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
182 | }; | |
|
141 | 183 | }; |
|
142 | 184 | Pylons = super.buildPythonPackage { |
|
143 | 185 | name = "Pylons-1.0.1"; |
|
144 | 186 | buildInputs = with self; []; |
|
145 | 187 | doCheck = false; |
|
146 | 188 | propagatedBuildInputs = with self; [Routes WebHelpers Beaker Paste PasteDeploy PasteScript FormEncode simplejson decorator nose Mako WebError WebTest Tempita MarkupSafe WebOb]; |
|
147 | 189 | src = fetchurl { |
|
148 | 190 | url = "https://pypi.python.org/packages/a2/69/b835a6bad00acbfeed3f33c6e44fa3f936efc998c795bfb15c61a79ecf62/Pylons-1.0.1.tar.gz"; |
|
149 | 191 | md5 = "6cb880d75fa81213192142b07a6e4915"; |
|
150 | 192 | }; |
|
193 | meta = { | |
|
194 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
195 | }; | |
|
151 | 196 | }; |
|
152 | 197 | Pyro4 = super.buildPythonPackage { |
|
153 | 198 | name = "Pyro4-4.41"; |
|
154 | 199 | buildInputs = with self; []; |
|
155 | 200 | doCheck = false; |
|
156 | 201 | propagatedBuildInputs = with self; [serpent]; |
|
157 | 202 | src = fetchurl { |
|
158 | 203 | url = "https://pypi.python.org/packages/56/2b/89b566b4bf3e7f8ba790db2d1223852f8cb454c52cab7693dd41f608ca2a/Pyro4-4.41.tar.gz"; |
|
159 | 204 | md5 = "ed69e9bfafa9c06c049a87cb0c4c2b6c"; |
|
160 | 205 | }; |
|
206 | meta = { | |
|
207 | license = [ pkgs.lib.licenses.mit ]; | |
|
208 | }; | |
|
161 | 209 | }; |
|
162 | 210 | Routes = super.buildPythonPackage { |
|
163 | 211 | name = "Routes-1.13"; |
|
164 | 212 | buildInputs = with self; []; |
|
165 | 213 | doCheck = false; |
|
166 | 214 | propagatedBuildInputs = with self; [repoze.lru]; |
|
167 | 215 | src = fetchurl { |
|
168 | 216 | url = "https://pypi.python.org/packages/88/d3/259c3b3cde8837eb9441ab5f574a660e8a4acea8f54a078441d4d2acac1c/Routes-1.13.tar.gz"; |
|
169 | 217 | md5 = "d527b0ab7dd9172b1275a41f97448783"; |
|
170 | 218 | }; |
|
219 | meta = { | |
|
220 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
221 | }; | |
|
171 | 222 | }; |
|
172 | 223 | SQLAlchemy = super.buildPythonPackage { |
|
173 | 224 | name = "SQLAlchemy-0.9.9"; |
|
174 | 225 | buildInputs = with self; []; |
|
175 | 226 | doCheck = false; |
|
176 | 227 | propagatedBuildInputs = with self; []; |
|
177 | 228 | src = fetchurl { |
|
178 | 229 | url = "https://pypi.python.org/packages/28/f7/1bbfd0d8597e8c358d5e15a166a486ad82fc5579b4e67b6ef7c05b1d182b/SQLAlchemy-0.9.9.tar.gz"; |
|
179 | 230 | md5 = "8a10a9bd13ed3336ef7333ac2cc679ff"; |
|
180 | 231 | }; |
|
232 | meta = { | |
|
233 | license = [ pkgs.lib.licenses.mit ]; | |
|
234 | }; | |
|
181 | 235 | }; |
|
182 | 236 | Sphinx = super.buildPythonPackage { |
|
183 | 237 | name = "Sphinx-1.2.2"; |
|
184 | 238 | buildInputs = with self; []; |
|
185 | 239 | doCheck = false; |
|
186 | 240 | propagatedBuildInputs = with self; [Pygments docutils Jinja2]; |
|
187 | 241 | src = fetchurl { |
|
188 | 242 | url = "https://pypi.python.org/packages/0a/50/34017e6efcd372893a416aba14b84a1a149fc7074537b0e9cb6ca7b7abe9/Sphinx-1.2.2.tar.gz"; |
|
189 | 243 | md5 = "3dc73ccaa8d0bfb2d62fb671b1f7e8a4"; |
|
190 | 244 | }; |
|
245 | meta = { | |
|
246 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
247 | }; | |
|
191 | 248 | }; |
|
192 | 249 | Tempita = super.buildPythonPackage { |
|
193 | 250 | name = "Tempita-0.5.2"; |
|
194 | 251 | buildInputs = with self; []; |
|
195 | 252 | doCheck = false; |
|
196 | 253 | propagatedBuildInputs = with self; []; |
|
197 | 254 | src = fetchurl { |
|
198 | 255 | url = "https://pypi.python.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz"; |
|
199 | 256 | md5 = "4c2f17bb9d481821c41b6fbee904cea1"; |
|
200 | 257 | }; |
|
258 | meta = { | |
|
259 | license = [ pkgs.lib.licenses.mit ]; | |
|
260 | }; | |
|
201 | 261 | }; |
|
202 | 262 | URLObject = super.buildPythonPackage { |
|
203 | 263 | name = "URLObject-2.4.0"; |
|
204 | 264 | buildInputs = with self; []; |
|
205 | 265 | doCheck = false; |
|
206 | 266 | propagatedBuildInputs = with self; []; |
|
207 | 267 | src = fetchurl { |
|
208 | 268 | url = "https://pypi.python.org/packages/cb/b6/e25e58500f9caef85d664bec71ec67c116897bfebf8622c32cb75d1ca199/URLObject-2.4.0.tar.gz"; |
|
209 | 269 | md5 = "2ed819738a9f0a3051f31dc9924e3065"; |
|
210 | 270 | }; |
|
271 | meta = { | |
|
272 | license = [ ]; | |
|
273 | }; | |
|
211 | 274 | }; |
|
212 | 275 | WebError = super.buildPythonPackage { |
|
213 | 276 | name = "WebError-0.10.3"; |
|
214 | 277 | buildInputs = with self; []; |
|
215 | 278 | doCheck = false; |
|
216 | 279 | propagatedBuildInputs = with self; [WebOb Tempita Pygments Paste]; |
|
217 | 280 | src = fetchurl { |
|
218 | 281 | url = "https://pypi.python.org/packages/35/76/e7e5c2ce7e9c7f31b54c1ff295a495886d1279a002557d74dd8957346a79/WebError-0.10.3.tar.gz"; |
|
219 | 282 | md5 = "84b9990b0baae6fd440b1e60cdd06f9a"; |
|
220 | 283 | }; |
|
284 | meta = { | |
|
285 | license = [ pkgs.lib.licenses.mit ]; | |
|
286 | }; | |
|
221 | 287 | }; |
|
222 | 288 | WebHelpers = super.buildPythonPackage { |
|
223 | 289 | name = "WebHelpers-1.3"; |
|
224 | 290 | buildInputs = with self; []; |
|
225 | 291 | doCheck = false; |
|
226 | 292 | propagatedBuildInputs = with self; [MarkupSafe]; |
|
227 | 293 | src = fetchurl { |
|
228 | 294 | url = "https://pypi.python.org/packages/ee/68/4d07672821d514184357f1552f2dad923324f597e722de3b016ca4f7844f/WebHelpers-1.3.tar.gz"; |
|
229 | 295 | md5 = "32749ffadfc40fea51075a7def32588b"; |
|
230 | 296 | }; |
|
297 | meta = { | |
|
298 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
299 | }; | |
|
231 | 300 | }; |
|
232 | 301 | WebHelpers2 = super.buildPythonPackage { |
|
233 | 302 | name = "WebHelpers2-2.0"; |
|
234 | 303 | buildInputs = with self; []; |
|
235 | 304 | doCheck = false; |
|
236 | 305 | propagatedBuildInputs = with self; [MarkupSafe six]; |
|
237 | 306 | src = fetchurl { |
|
238 | 307 | url = "https://pypi.python.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz"; |
|
239 | 308 | md5 = "0f6b68d70c12ee0aed48c00b24da13d3"; |
|
240 | 309 | }; |
|
310 | meta = { | |
|
311 | license = [ pkgs.lib.licenses.mit ]; | |
|
312 | }; | |
|
241 | 313 | }; |
|
242 | 314 | WebOb = super.buildPythonPackage { |
|
243 | 315 | name = "WebOb-1.3.1"; |
|
244 | 316 | buildInputs = with self; []; |
|
245 | 317 | doCheck = false; |
|
246 | 318 | propagatedBuildInputs = with self; []; |
|
247 | 319 | src = fetchurl { |
|
248 | 320 | url = "https://pypi.python.org/packages/16/78/adfc0380b8a0d75b2d543fa7085ba98a573b1ae486d9def88d172b81b9fa/WebOb-1.3.1.tar.gz"; |
|
249 | 321 | md5 = "20918251c5726956ba8fef22d1556177"; |
|
250 | 322 | }; |
|
323 | meta = { | |
|
324 | license = [ pkgs.lib.licenses.mit ]; | |
|
325 | }; | |
|
251 | 326 | }; |
|
252 | 327 | WebTest = super.buildPythonPackage { |
|
253 | 328 | name = "WebTest-1.4.3"; |
|
254 | 329 | buildInputs = with self; []; |
|
255 | 330 | doCheck = false; |
|
256 | 331 | propagatedBuildInputs = with self; [WebOb]; |
|
257 | 332 | src = fetchurl { |
|
258 | 333 | url = "https://pypi.python.org/packages/51/3d/84fd0f628df10b30c7db87895f56d0158e5411206b721ca903cb51bfd948/WebTest-1.4.3.zip"; |
|
259 | 334 | md5 = "631ce728bed92c681a4020a36adbc353"; |
|
260 | 335 | }; |
|
336 | meta = { | |
|
337 | license = [ pkgs.lib.licenses.mit ]; | |
|
338 | }; | |
|
261 | 339 | }; |
|
262 | 340 | Whoosh = super.buildPythonPackage { |
|
263 | 341 | name = "Whoosh-2.7.0"; |
|
264 | 342 | buildInputs = with self; []; |
|
265 | 343 | doCheck = false; |
|
266 | 344 | propagatedBuildInputs = with self; []; |
|
267 | 345 | src = fetchurl { |
|
268 | 346 | url = "https://pypi.python.org/packages/1c/dc/2f0231ff3875ded36df8c1ab851451e51a237dc0e5a86d3d96036158da94/Whoosh-2.7.0.zip"; |
|
269 | 347 | md5 = "7abfd970f16fadc7311960f3fa0bc7a9"; |
|
270 | 348 | }; |
|
349 | meta = { | |
|
350 | license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ]; | |
|
351 | }; | |
|
271 | 352 | }; |
|
272 | 353 | alembic = super.buildPythonPackage { |
|
273 | 354 | name = "alembic-0.8.4"; |
|
274 | 355 | buildInputs = with self; []; |
|
275 | 356 | doCheck = false; |
|
276 | 357 | propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor]; |
|
277 | 358 | src = fetchurl { |
|
278 | 359 | url = "https://pypi.python.org/packages/ca/7e/299b4499b5c75e5a38c5845145ad24755bebfb8eec07a2e1c366b7181eeb/alembic-0.8.4.tar.gz"; |
|
279 | 360 | md5 = "5f95d8ee62b443f9b37eb5bee76c582d"; |
|
280 | 361 | }; |
|
362 | meta = { | |
|
363 | license = [ pkgs.lib.licenses.mit ]; | |
|
364 | }; | |
|
281 | 365 | }; |
|
282 | 366 | amqplib = super.buildPythonPackage { |
|
283 | 367 | name = "amqplib-1.0.2"; |
|
284 | 368 | buildInputs = with self; []; |
|
285 | 369 | doCheck = false; |
|
286 | 370 | propagatedBuildInputs = with self; []; |
|
287 | 371 | src = fetchurl { |
|
288 | 372 | url = "https://pypi.python.org/packages/75/b7/8c2429bf8d92354a0118614f9a4d15e53bc69ebedce534284111de5a0102/amqplib-1.0.2.tgz"; |
|
289 | 373 | md5 = "5c92f17fbedd99b2b4a836d4352d1e2f"; |
|
290 | 374 | }; |
|
375 | meta = { | |
|
376 | license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |
|
377 | }; | |
|
291 | 378 | }; |
|
292 | 379 | anyjson = super.buildPythonPackage { |
|
293 | 380 | name = "anyjson-0.3.3"; |
|
294 | 381 | buildInputs = with self; []; |
|
295 | 382 | doCheck = false; |
|
296 | 383 | propagatedBuildInputs = with self; []; |
|
297 | 384 | src = fetchurl { |
|
298 | 385 | url = "https://pypi.python.org/packages/c3/4d/d4089e1a3dd25b46bebdb55a992b0797cff657b4477bc32ce28038fdecbc/anyjson-0.3.3.tar.gz"; |
|
299 | 386 | md5 = "2ea28d6ec311aeeebaf993cb3008b27c"; |
|
300 | 387 | }; |
|
388 | meta = { | |
|
389 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
390 | }; | |
|
301 | 391 | }; |
|
302 | 392 | appenlight-client = super.buildPythonPackage { |
|
303 | 393 | name = "appenlight-client-0.6.14"; |
|
304 | 394 | buildInputs = with self; []; |
|
305 | 395 | doCheck = false; |
|
306 | 396 | propagatedBuildInputs = with self; [WebOb requests]; |
|
307 | 397 | src = fetchurl { |
|
308 | 398 | url = "https://pypi.python.org/packages/4d/e0/23fee3ebada8143f707e65c06bcb82992040ee64ea8355e044ed55ebf0c1/appenlight_client-0.6.14.tar.gz"; |
|
309 | 399 | md5 = "578c69b09f4356d898fff1199b98a95c"; |
|
310 | 400 | }; |
|
401 | meta = { | |
|
402 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "DFSG approved"; } ]; | |
|
403 | }; | |
|
311 | 404 | }; |
|
312 | 405 | authomatic = super.buildPythonPackage { |
|
313 | 406 | name = "authomatic-0.1.0.post1"; |
|
314 | 407 | buildInputs = with self; []; |
|
315 | 408 | doCheck = false; |
|
316 | 409 | propagatedBuildInputs = with self; []; |
|
317 | 410 | src = fetchurl { |
|
318 | 411 | url = "https://pypi.python.org/packages/08/1a/8a930461e604c2d5a7a871e1ac59fa82ccf994c32e807230c8d2fb07815a/Authomatic-0.1.0.post1.tar.gz"; |
|
319 | 412 | md5 = "be3f3ce08747d776aae6d6cc8dcb49a9"; |
|
320 | 413 | }; |
|
414 | meta = { | |
|
415 | license = [ pkgs.lib.licenses.mit ]; | |
|
416 | }; | |
|
321 | 417 | }; |
|
322 | 418 | backport-ipaddress = super.buildPythonPackage { |
|
323 | 419 | name = "backport-ipaddress-0.1"; |
|
324 | 420 | buildInputs = with self; []; |
|
325 | 421 | doCheck = false; |
|
326 | 422 | propagatedBuildInputs = with self; []; |
|
327 | 423 | src = fetchurl { |
|
328 | 424 | url = "https://pypi.python.org/packages/d3/30/54c6dab05a4dec44db25ff309f1fbb6b7a8bde3f2bade38bb9da67bbab8f/backport_ipaddress-0.1.tar.gz"; |
|
329 | 425 | md5 = "9c1f45f4361f71b124d7293a60006c05"; |
|
330 | 426 | }; |
|
427 | meta = { | |
|
428 | license = [ pkgs.lib.licenses.psfl ]; | |
|
429 | }; | |
|
331 | 430 | }; |
|
332 | 431 | bottle = super.buildPythonPackage { |
|
333 | 432 | name = "bottle-0.12.8"; |
|
334 | 433 | buildInputs = with self; []; |
|
335 | 434 | doCheck = false; |
|
336 | 435 | propagatedBuildInputs = with self; []; |
|
337 | 436 | src = fetchurl { |
|
338 | 437 | url = "https://pypi.python.org/packages/52/df/e4a408f3a7af396d186d4ecd3b389dd764f0f943b4fa8d257bfe7b49d343/bottle-0.12.8.tar.gz"; |
|
339 | 438 | md5 = "13132c0a8f607bf860810a6ee9064c5b"; |
|
340 | 439 | }; |
|
440 | meta = { | |
|
441 | license = [ pkgs.lib.licenses.mit ]; | |
|
442 | }; | |
|
341 | 443 | }; |
|
342 | 444 | bumpversion = super.buildPythonPackage { |
|
343 | 445 | name = "bumpversion-0.5.3"; |
|
344 | 446 | buildInputs = with self; []; |
|
345 | 447 | doCheck = false; |
|
346 | 448 | propagatedBuildInputs = with self; []; |
|
347 | 449 | src = fetchurl { |
|
348 | 450 | url = "https://pypi.python.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz"; |
|
349 | 451 | md5 = "c66a3492eafcf5ad4b024be9fca29820"; |
|
350 | 452 | }; |
|
453 | meta = { | |
|
454 | license = [ pkgs.lib.licenses.mit ]; | |
|
455 | }; | |
|
351 | 456 | }; |
|
352 | 457 | celery = super.buildPythonPackage { |
|
353 | 458 | name = "celery-2.2.10"; |
|
354 | 459 | buildInputs = with self; []; |
|
355 | 460 | doCheck = false; |
|
356 | 461 | propagatedBuildInputs = with self; [python-dateutil anyjson kombu pyparsing]; |
|
357 | 462 | src = fetchurl { |
|
358 | 463 | url = "https://pypi.python.org/packages/b1/64/860fd50e45844c83442e7953effcddeff66b2851d90b2d784f7201c111b8/celery-2.2.10.tar.gz"; |
|
359 | 464 | md5 = "898bc87e54f278055b561316ba73e222"; |
|
360 | 465 | }; |
|
466 | meta = { | |
|
467 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
468 | }; | |
|
361 | 469 | }; |
|
362 | 470 | click = super.buildPythonPackage { |
|
363 | 471 | name = "click-5.1"; |
|
364 | 472 | buildInputs = with self; []; |
|
365 | 473 | doCheck = false; |
|
366 | 474 | propagatedBuildInputs = with self; []; |
|
367 | 475 | src = fetchurl { |
|
368 | 476 | url = "https://pypi.python.org/packages/b7/34/a496632c4fb6c1ee76efedf77bb8d28b29363d839953d95095b12defe791/click-5.1.tar.gz"; |
|
369 | 477 | md5 = "9c5323008cccfe232a8b161fc8196d41"; |
|
370 | 478 | }; |
|
479 | meta = { | |
|
480 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
481 | }; | |
|
371 | 482 | }; |
|
372 | 483 | colander = super.buildPythonPackage { |
|
373 | 484 | name = "colander-1.2"; |
|
374 | 485 | buildInputs = with self; []; |
|
375 | 486 | doCheck = false; |
|
376 | 487 | propagatedBuildInputs = with self; [translationstring iso8601]; |
|
377 | 488 | src = fetchurl { |
|
378 | 489 | url = "https://pypi.python.org/packages/14/23/c9ceba07a6a1dc0eefbb215fc0dc64aabc2b22ee756bc0f0c13278fa0887/colander-1.2.tar.gz"; |
|
379 | 490 | md5 = "83db21b07936a0726e588dae1914b9ed"; |
|
380 | 491 | }; |
|
492 | meta = { | |
|
493 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
494 | }; | |
|
381 | 495 | }; |
|
382 | 496 | configobj = super.buildPythonPackage { |
|
383 | 497 | name = "configobj-5.0.6"; |
|
384 | 498 | buildInputs = with self; []; |
|
385 | 499 | doCheck = false; |
|
386 | 500 | propagatedBuildInputs = with self; [six]; |
|
387 | 501 | src = fetchurl { |
|
388 | 502 | url = "https://pypi.python.org/packages/64/61/079eb60459c44929e684fa7d9e2fdca403f67d64dd9dbac27296be2e0fab/configobj-5.0.6.tar.gz"; |
|
389 | 503 | md5 = "e472a3a1c2a67bb0ec9b5d54c13a47d6"; |
|
390 | 504 | }; |
|
505 | meta = { | |
|
506 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
507 | }; | |
|
391 | 508 | }; |
|
392 | 509 | cov-core = super.buildPythonPackage { |
|
393 | 510 | name = "cov-core-1.15.0"; |
|
394 | 511 | buildInputs = with self; []; |
|
395 | 512 | doCheck = false; |
|
396 | 513 | propagatedBuildInputs = with self; [coverage]; |
|
397 | 514 | src = fetchurl { |
|
398 | 515 | url = "https://pypi.python.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz"; |
|
399 | 516 | md5 = "f519d4cb4c4e52856afb14af52919fe6"; |
|
400 | 517 | }; |
|
518 | meta = { | |
|
519 | license = [ pkgs.lib.licenses.mit ]; | |
|
520 | }; | |
|
401 | 521 | }; |
|
402 | 522 | coverage = super.buildPythonPackage { |
|
403 | 523 | name = "coverage-3.7.1"; |
|
404 | 524 | buildInputs = with self; []; |
|
405 | 525 | doCheck = false; |
|
406 | 526 | propagatedBuildInputs = with self; []; |
|
407 | 527 | src = fetchurl { |
|
408 | 528 | url = "https://pypi.python.org/packages/09/4f/89b06c7fdc09687bca507dc411c342556ef9c5a3b26756137a4878ff19bf/coverage-3.7.1.tar.gz"; |
|
409 | 529 | md5 = "c47b36ceb17eaff3ecfab3bcd347d0df"; |
|
410 | 530 | }; |
|
531 | meta = { | |
|
532 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
533 | }; | |
|
411 | 534 | }; |
|
412 | 535 | cssselect = super.buildPythonPackage { |
|
413 | 536 | name = "cssselect-0.9.1"; |
|
414 | 537 | buildInputs = with self; []; |
|
415 | 538 | doCheck = false; |
|
416 | 539 | propagatedBuildInputs = with self; []; |
|
417 | 540 | src = fetchurl { |
|
418 | 541 | url = "https://pypi.python.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz"; |
|
419 | 542 | md5 = "c74f45966277dc7a0f768b9b0f3522ac"; |
|
420 | 543 | }; |
|
544 | meta = { | |
|
545 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
546 | }; | |
|
421 | 547 | }; |
|
422 | 548 | decorator = super.buildPythonPackage { |
|
423 | 549 | name = "decorator-3.4.2"; |
|
424 | 550 | buildInputs = with self; []; |
|
425 | 551 | doCheck = false; |
|
426 | 552 | propagatedBuildInputs = with self; []; |
|
427 | 553 | src = fetchurl { |
|
428 | 554 | url = "https://pypi.python.org/packages/35/3a/42566eb7a2cbac774399871af04e11d7ae3fc2579e7dae85213b8d1d1c57/decorator-3.4.2.tar.gz"; |
|
429 | 555 | md5 = "9e0536870d2b83ae27d58dbf22582f4d"; |
|
430 | 556 | }; |
|
557 | meta = { | |
|
558 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
559 | }; | |
|
431 | 560 | }; |
|
432 | 561 | docutils = super.buildPythonPackage { |
|
433 | 562 | name = "docutils-0.12"; |
|
434 | 563 | buildInputs = with self; []; |
|
435 | 564 | doCheck = false; |
|
436 | 565 | propagatedBuildInputs = with self; []; |
|
437 | 566 | src = fetchurl { |
|
438 | 567 | url = "https://pypi.python.org/packages/37/38/ceda70135b9144d84884ae2fc5886c6baac4edea39550f28bcd144c1234d/docutils-0.12.tar.gz"; |
|
439 | 568 | md5 = "4622263b62c5c771c03502afa3157768"; |
|
440 | 569 | }; |
|
570 | meta = { | |
|
571 | license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ]; | |
|
572 | }; | |
|
441 | 573 | }; |
|
442 | 574 | dogpile.cache = super.buildPythonPackage { |
|
443 | 575 | name = "dogpile.cache-0.5.7"; |
|
444 | 576 | buildInputs = with self; []; |
|
445 | 577 | doCheck = false; |
|
446 | 578 | propagatedBuildInputs = with self; [dogpile.core]; |
|
447 | 579 | src = fetchurl { |
|
448 | 580 | url = "https://pypi.python.org/packages/07/74/2a83bedf758156d9c95d112691bbad870d3b77ccbcfb781b4ef836ea7d96/dogpile.cache-0.5.7.tar.gz"; |
|
449 | 581 | md5 = "3e58ce41af574aab41d78e9c4190f194"; |
|
450 | 582 | }; |
|
583 | meta = { | |
|
584 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
585 | }; | |
|
451 | 586 | }; |
|
452 | 587 | dogpile.core = super.buildPythonPackage { |
|
453 | 588 | name = "dogpile.core-0.4.1"; |
|
454 | 589 | buildInputs = with self; []; |
|
455 | 590 | doCheck = false; |
|
456 | 591 | propagatedBuildInputs = with self; []; |
|
457 | 592 | src = fetchurl { |
|
458 | 593 | url = "https://pypi.python.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz"; |
|
459 | 594 | md5 = "01cb19f52bba3e95c9b560f39341f045"; |
|
460 | 595 | }; |
|
596 | meta = { | |
|
597 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
598 | }; | |
|
461 | 599 | }; |
|
462 | 600 | dulwich = super.buildPythonPackage { |
|
463 | 601 | name = "dulwich-0.12.0"; |
|
464 | 602 | buildInputs = with self; []; |
|
465 | 603 | doCheck = false; |
|
466 | 604 | propagatedBuildInputs = with self; []; |
|
467 | 605 | src = fetchurl { |
|
468 | 606 | url = "https://pypi.python.org/packages/6f/04/fbe561b6d45c0ec758330d5b7f5ba4b6cb4f1ca1ab49859d2fc16320da75/dulwich-0.12.0.tar.gz"; |
|
469 | 607 | md5 = "f3a8a12bd9f9dd8c233e18f3d49436fa"; |
|
470 | 608 | }; |
|
609 | meta = { | |
|
610 | license = [ pkgs.lib.licenses.gpl2Plus ]; | |
|
611 | }; | |
|
471 | 612 | }; |
|
472 | 613 | ecdsa = super.buildPythonPackage { |
|
473 | 614 | name = "ecdsa-0.11"; |
|
474 | 615 | buildInputs = with self; []; |
|
475 | 616 | doCheck = false; |
|
476 | 617 | propagatedBuildInputs = with self; []; |
|
477 | 618 | src = fetchurl { |
|
478 | 619 | url = "https://pypi.python.org/packages/6c/3f/92fe5dcdcaa7bd117be21e5520c9a54375112b66ec000d209e9e9519fad1/ecdsa-0.11.tar.gz"; |
|
479 | 620 | md5 = "8ef586fe4dbb156697d756900cb41d7c"; |
|
480 | 621 | }; |
|
622 | meta = { | |
|
623 | license = [ pkgs.lib.licenses.mit ]; | |
|
624 | }; | |
|
481 | 625 | }; |
|
482 | 626 | elasticsearch = super.buildPythonPackage { |
|
483 | 627 | name = "elasticsearch-2.3.0"; |
|
484 | 628 | buildInputs = with self; []; |
|
485 | 629 | doCheck = false; |
|
486 | 630 | propagatedBuildInputs = with self; [urllib3]; |
|
487 | 631 | src = fetchurl { |
|
488 | 632 | url = "https://pypi.python.org/packages/10/35/5fd52c5f0b0ee405ed4b5195e8bce44c5e041787680dc7b94b8071cac600/elasticsearch-2.3.0.tar.gz"; |
|
489 | 633 | md5 = "2550f3b51629cf1ef9636608af92c340"; |
|
490 | 634 | }; |
|
635 | meta = { | |
|
636 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
637 | }; | |
|
491 | 638 | }; |
|
492 | 639 | elasticsearch-dsl = super.buildPythonPackage { |
|
493 | 640 | name = "elasticsearch-dsl-2.0.0"; |
|
494 | 641 | buildInputs = with self; []; |
|
495 | 642 | doCheck = false; |
|
496 | 643 | propagatedBuildInputs = with self; [six python-dateutil elasticsearch]; |
|
497 | 644 | src = fetchurl { |
|
498 | 645 | url = "https://pypi.python.org/packages/4e/5d/e788ae8dbe2ff4d13426db0a027533386a5c276c77a2654dc0e2007ce04a/elasticsearch-dsl-2.0.0.tar.gz"; |
|
499 | 646 | md5 = "4cdfec81bb35383dd3b7d02d7dc5ee68"; |
|
500 | 647 | }; |
|
648 | meta = { | |
|
649 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
650 | }; | |
|
501 | 651 | }; |
|
502 | 652 | flake8 = super.buildPythonPackage { |
|
503 | 653 | name = "flake8-2.4.1"; |
|
504 | 654 | buildInputs = with self; []; |
|
505 | 655 | doCheck = false; |
|
506 | 656 | propagatedBuildInputs = with self; [pyflakes pep8 mccabe]; |
|
507 | 657 | src = fetchurl { |
|
508 | 658 | url = "https://pypi.python.org/packages/8f/b5/9a73c66c7dba273bac8758398f060c008a25f3e84531063b42503b5d0a95/flake8-2.4.1.tar.gz"; |
|
509 | 659 | md5 = "ed45d3db81a3b7c88bd63c6e37ca1d65"; |
|
510 | 660 | }; |
|
661 | meta = { | |
|
662 | license = [ pkgs.lib.licenses.mit ]; | |
|
663 | }; | |
|
511 | 664 | }; |
|
512 | 665 | future = super.buildPythonPackage { |
|
513 | 666 | name = "future-0.14.3"; |
|
514 | 667 | buildInputs = with self; []; |
|
515 | 668 | doCheck = false; |
|
516 | 669 | propagatedBuildInputs = with self; []; |
|
517 | 670 | src = fetchurl { |
|
518 | 671 | url = "https://pypi.python.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz"; |
|
519 | 672 | md5 = "e94079b0bd1fc054929e8769fc0f6083"; |
|
520 | 673 | }; |
|
674 | meta = { | |
|
675 | license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ]; | |
|
676 | }; | |
|
521 | 677 | }; |
|
522 | 678 | futures = super.buildPythonPackage { |
|
523 | 679 | name = "futures-3.0.2"; |
|
524 | 680 | buildInputs = with self; []; |
|
525 | 681 | doCheck = false; |
|
526 | 682 | propagatedBuildInputs = with self; []; |
|
527 | 683 | src = fetchurl { |
|
528 | 684 | url = "https://pypi.python.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz"; |
|
529 | 685 | md5 = "42aaf1e4de48d6e871d77dc1f9d96d5a"; |
|
530 | 686 | }; |
|
687 | meta = { | |
|
688 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
689 | }; | |
|
531 | 690 | }; |
|
532 | 691 | gnureadline = super.buildPythonPackage { |
|
533 | 692 | name = "gnureadline-6.3.3"; |
|
534 | 693 | buildInputs = with self; []; |
|
535 | 694 | doCheck = false; |
|
536 | 695 | propagatedBuildInputs = with self; []; |
|
537 | 696 | src = fetchurl { |
|
538 | 697 | url = "https://pypi.python.org/packages/3a/ee/2c3f568b0a74974791ac590ec742ef6133e2fbd287a074ba72a53fa5e97c/gnureadline-6.3.3.tar.gz"; |
|
539 | 698 | md5 = "c4af83c9a3fbeac8f2da9b5a7c60e51c"; |
|
540 | 699 | }; |
|
700 | meta = { | |
|
701 | license = [ pkgs.lib.licenses.gpl1 ]; | |
|
702 | }; | |
|
541 | 703 | }; |
|
542 | 704 | gprof2dot = super.buildPythonPackage { |
|
543 | name = "gprof2dot-2015.12.

705 | name = "gprof2dot-2015.12.1"; |
|
544 | 706 | buildInputs = with self; []; |
|
545 | 707 | doCheck = false; |
|
546 | 708 | propagatedBuildInputs = with self; []; |
|
547 | 709 | src = fetchurl { |
|
548 | 710 | url = "https://pypi.python.org/packages/b9/34/7bf93c1952d40fa5c95ad963f4d8344b61ef58558632402eca18e6c14127/gprof2dot-2015.12.1.tar.gz"; |
|
549 | 711 | md5 = "e23bf4e2f94db032750c193384b4165b"; |
|
550 | 712 | }; |
|
713 | meta = { | |
|
714 | license = [ { fullName = "LGPL"; } ]; | |
|
715 | }; | |
|
551 | 716 | }; |
|
552 | 717 | greenlet = super.buildPythonPackage { |
|
553 | 718 | name = "greenlet-0.4.9"; |
|
554 | 719 | buildInputs = with self; []; |
|
555 | 720 | doCheck = false; |
|
556 | 721 | propagatedBuildInputs = with self; []; |
|
557 | 722 | src = fetchurl { |
|
558 | 723 | url = "https://pypi.python.org/packages/4e/3d/9d421539b74e33608b245092870156b2e171fb49f2b51390aa4641eecb4a/greenlet-0.4.9.zip"; |
|
559 | 724 | md5 = "c6659cdb2a5e591723e629d2eef22e82"; |
|
560 | 725 | }; |
|
726 | meta = { | |
|
727 | license = [ pkgs.lib.licenses.mit ]; | |
|
728 | }; | |
|
561 | 729 | }; |
|
562 | 730 | gunicorn = super.buildPythonPackage { |
|
563 | 731 | name = "gunicorn-19.6.0"; |
|
564 | 732 | buildInputs = with self; []; |
|
565 | 733 | doCheck = false; |
|
566 | 734 | propagatedBuildInputs = with self; []; |
|
567 | 735 | src = fetchurl { |
|
568 | 736 | url = "https://pypi.python.org/packages/84/ce/7ea5396efad1cef682bbc4068e72a0276341d9d9d0f501da609fab9fcb80/gunicorn-19.6.0.tar.gz"; |
|
569 | 737 | md5 = "338e5e8a83ea0f0625f768dba4597530"; |
|
570 | 738 | }; |
|
739 | meta = { | |
|
740 | license = [ pkgs.lib.licenses.mit ]; | |
|
741 | }; | |
|
571 | 742 | }; |
|
572 | 743 | infrae.cache = super.buildPythonPackage { |
|
573 | 744 | name = "infrae.cache-1.0.1"; |
|
574 | 745 | buildInputs = with self; []; |
|
575 | 746 | doCheck = false; |
|
576 | 747 | propagatedBuildInputs = with self; [Beaker repoze.lru]; |
|
577 | 748 | src = fetchurl { |
|
578 | 749 | url = "https://pypi.python.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz"; |
|
579 | 750 | md5 = "b09076a766747e6ed2a755cc62088e32"; |
|
580 | 751 | }; |
|
752 | meta = { | |
|
753 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
754 | }; | |
|
581 | 755 | }; |
|
582 | 756 | invoke = super.buildPythonPackage { |
|
583 | name = "invoke-0.1

757 | name = "invoke-0.13.0"; |
|
584 | 758 | buildInputs = with self; []; |
|
585 | 759 | doCheck = false; |
|
586 | 760 | propagatedBuildInputs = with self; []; |
|
587 | 761 | src = fetchurl { |
|
588 | url = "https://pypi.python.org/packages/d3/bb/36a5558ea19882073def7b0edeef4a0e6282056fed96506dd10b1d532bd4/invoke-0.11.1.tar.gz"; | |
|
589 | md5 = "3d4ecbe26779ceef1046ecf702c9c4a8"; | |
|
762 | url = "https://pypi.python.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz"; | |
|
763 | md5 = "c0d1ed4bfb34eaab551662d8cfee6540"; | |
|
764 | }; | |
|
765 | meta = { | |
|
766 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
590 | 767 | }; |
|
591 | 768 | }; |
|
592 | 769 | ipdb = super.buildPythonPackage { |
|
593 | 770 | name = "ipdb-0.8"; |
|
594 | 771 | buildInputs = with self; []; |
|
595 | 772 | doCheck = false; |
|
596 | 773 | propagatedBuildInputs = with self; [ipython]; |
|
597 | 774 | src = fetchurl { |
|
598 | 775 | url = "https://pypi.python.org/packages/f0/25/d7dd430ced6cd8dc242a933c8682b5dbf32eb4011d82f87e34209e5ec845/ipdb-0.8.zip"; |
|
599 | 776 | md5 = "96dca0712efa01aa5eaf6b22071dd3ed"; |
|
600 | 777 | }; |
|
778 | meta = { | |
|
779 | license = [ pkgs.lib.licenses.gpl1 ]; | |
|
780 | }; | |
|
601 | 781 | }; |
|
602 | 782 | ipython = super.buildPythonPackage { |
|
603 | 783 | name = "ipython-3.1.0"; |
|
604 | 784 | buildInputs = with self; []; |
|
605 | 785 | doCheck = false; |
|
606 | 786 | propagatedBuildInputs = with self; []; |
|
607 | 787 | src = fetchurl { |
|
608 | 788 | url = "https://pypi.python.org/packages/06/91/120c0835254c120af89f066afaabf81289bc2726c1fc3ca0555df6882f58/ipython-3.1.0.tar.gz"; |
|
609 | 789 | md5 = "a749d90c16068687b0ec45a27e72ef8f"; |
|
610 | 790 | }; |
|
791 | meta = { | |
|
792 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
793 | }; | |
|
611 | 794 | }; |
|
612 | 795 | iso8601 = super.buildPythonPackage { |
|
613 | 796 | name = "iso8601-0.1.11"; |
|
614 | 797 | buildInputs = with self; []; |
|
615 | 798 | doCheck = false; |
|
616 | 799 | propagatedBuildInputs = with self; []; |
|
617 | 800 | src = fetchurl { |
|
618 | 801 | url = "https://pypi.python.org/packages/c0/75/c9209ee4d1b5975eb8c2cba4428bde6b61bd55664a98290dd015cdb18e98/iso8601-0.1.11.tar.gz"; |
|
619 | 802 | md5 = "b06d11cd14a64096f907086044f0fe38"; |
|
620 | 803 | }; |
|
804 | meta = { | |
|
805 | license = [ pkgs.lib.licenses.mit ]; | |
|
806 | }; | |
|
621 | 807 | }; |
|
622 | 808 | itsdangerous = super.buildPythonPackage { |
|
623 | 809 | name = "itsdangerous-0.24"; |
|
624 | 810 | buildInputs = with self; []; |
|
625 | 811 | doCheck = false; |
|
626 | 812 | propagatedBuildInputs = with self; []; |
|
627 | 813 | src = fetchurl { |
|
628 | 814 | url = "https://pypi.python.org/packages/dc/b4/a60bcdba945c00f6d608d8975131ab3f25b22f2bcfe1dab221165194b2d4/itsdangerous-0.24.tar.gz"; |
|
629 | 815 | md5 = "a3d55aa79369aef5345c036a8a26307f"; |
|
630 | 816 | }; |
|
817 | meta = { | |
|
818 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
819 | }; | |
|
631 | 820 | }; |
|
632 | 821 | kombu = super.buildPythonPackage { |
|
633 | 822 | name = "kombu-1.5.1"; |
|
634 | 823 | buildInputs = with self; []; |
|
635 | 824 | doCheck = false; |
|
636 | 825 | propagatedBuildInputs = with self; [anyjson amqplib]; |
|
637 | 826 | src = fetchurl { |
|
638 | 827 | url = "https://pypi.python.org/packages/19/53/74bf2a624644b45f0850a638752514fc10a8e1cbd738f10804951a6df3f5/kombu-1.5.1.tar.gz"; |
|
639 | 828 | md5 = "50662f3c7e9395b3d0721fb75d100b63"; |
|
640 | 829 | }; |
|
830 | meta = { | |
|
831 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
832 | }; | |
|
641 | 833 | }; |
|
642 | 834 | lxml = super.buildPythonPackage { |
|
643 | 835 | name = "lxml-3.4.4"; |
|
644 | 836 | buildInputs = with self; []; |
|
645 | 837 | doCheck = false; |
|
646 | 838 | propagatedBuildInputs = with self; []; |
|
647 | 839 | src = fetchurl { |
|
648 | 840 | url = "https://pypi.python.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz"; |
|
649 | 841 | md5 = "a9a65972afc173ec7a39c585f4eea69c"; |
|
650 | 842 | }; |
|
843 | meta = { | |
|
844 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
845 | }; | |
|
651 | 846 | }; |
|
652 | 847 | mccabe = super.buildPythonPackage { |
|
653 | 848 | name = "mccabe-0.3"; |
|
654 | 849 | buildInputs = with self; []; |
|
655 | 850 | doCheck = false; |
|
656 | 851 | propagatedBuildInputs = with self; []; |
|
657 | 852 | src = fetchurl { |
|
658 | 853 | url = "https://pypi.python.org/packages/c9/2e/75231479e11a906b64ac43bad9d0bb534d00080b18bdca8db9da46e1faf7/mccabe-0.3.tar.gz"; |
|
659 | 854 | md5 = "81640948ff226f8c12b3277059489157"; |
|
660 | 855 | }; |
|
856 | meta = { | |
|
857 | license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ]; | |
|
858 | }; | |
|
661 | 859 | }; |
|
662 | 860 | meld3 = super.buildPythonPackage { |
|
663 | 861 | name = "meld3-1.0.2"; |
|
664 | 862 | buildInputs = with self; []; |
|
665 | 863 | doCheck = false; |
|
666 | 864 | propagatedBuildInputs = with self; []; |
|
667 | 865 | src = fetchurl { |
|
668 | 866 | url = "https://pypi.python.org/packages/45/a0/317c6422b26c12fe0161e936fc35f36552069ba8e6f7ecbd99bbffe32a5f/meld3-1.0.2.tar.gz"; |
|
669 | 867 | md5 = "3ccc78cd79cffd63a751ad7684c02c91"; |
|
670 | 868 | }; |
|
869 | meta = { | |
|
870 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
871 | }; | |
|
671 | 872 | }; |
|
672 | 873 | mock = super.buildPythonPackage { |
|
673 | 874 | name = "mock-1.0.1"; |
|
674 | 875 | buildInputs = with self; []; |
|
675 | 876 | doCheck = false; |
|
676 | 877 | propagatedBuildInputs = with self; []; |
|
677 | 878 | src = fetchurl { |
|
678 | 879 | url = "https://pypi.python.org/packages/15/45/30273ee91feb60dabb8fbb2da7868520525f02cf910279b3047182feed80/mock-1.0.1.zip"; |
|
679 | 880 | md5 = "869f08d003c289a97c1a6610faf5e913"; |
|
680 | 881 | }; |
|
882 | meta = { | |
|
883 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
884 | }; | |
|
681 | 885 | }; |
|
682 | 886 | msgpack-python = super.buildPythonPackage { |
|
683 | 887 | name = "msgpack-python-0.4.6"; |
|
684 | 888 | buildInputs = with self; []; |
|
685 | 889 | doCheck = false; |
|
686 | 890 | propagatedBuildInputs = with self; []; |
|
687 | 891 | src = fetchurl { |
|
688 | 892 | url = "https://pypi.python.org/packages/15/ce/ff2840885789ef8035f66cd506ea05bdb228340307d5e71a7b1e3f82224c/msgpack-python-0.4.6.tar.gz"; |
|
689 | 893 | md5 = "8b317669314cf1bc881716cccdaccb30"; |
|
690 | 894 | }; |
|
895 | meta = { | |
|
896 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
897 | }; | |
|
691 | 898 | }; |
|
692 | 899 | nose = super.buildPythonPackage { |
|
693 | 900 | name = "nose-1.3.6"; |
|
694 | 901 | buildInputs = with self; []; |
|
695 | 902 | doCheck = false; |
|
696 | 903 | propagatedBuildInputs = with self; []; |
|
697 | 904 | src = fetchurl { |
|
698 | 905 | url = "https://pypi.python.org/packages/70/c7/469e68148d17a0d3db5ed49150242fd70a74a8147b8f3f8b87776e028d99/nose-1.3.6.tar.gz"; |
|
699 | 906 | md5 = "0ca546d81ca8309080fc80cb389e7a16"; |
|
700 | 907 | }; |
|
908 | meta = { | |
|
909 | license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "GNU LGPL"; } ]; | |
|
910 | }; | |
|
701 | 911 | }; |
|
702 | 912 | objgraph = super.buildPythonPackage { |
|
703 | 913 | name = "objgraph-2.0.0"; |
|
704 | 914 | buildInputs = with self; []; |
|
705 | 915 | doCheck = false; |
|
706 | 916 | propagatedBuildInputs = with self; []; |
|
707 | 917 | src = fetchurl { |
|
708 | 918 | url = "https://pypi.python.org/packages/d7/33/ace750b59247496ed769b170586c5def7202683f3d98e737b75b767ff29e/objgraph-2.0.0.tar.gz"; |
|
709 | 919 | md5 = "25b0d5e5adc74aa63ead15699614159c"; |
|
710 | 920 | }; |
|
921 | meta = { | |
|
922 | license = [ pkgs.lib.licenses.mit ]; | |
|
923 | }; | |
|
711 | 924 | }; |
|
712 | 925 | packaging = super.buildPythonPackage { |
|
713 | 926 | name = "packaging-15.2"; |
|
714 | 927 | buildInputs = with self; []; |
|
715 | 928 | doCheck = false; |
|
716 | 929 | propagatedBuildInputs = with self; []; |
|
717 | 930 | src = fetchurl { |
|
718 | 931 | url = "https://pypi.python.org/packages/24/c4/185da1304f07047dc9e0c46c31db75c0351bd73458ac3efad7da3dbcfbe1/packaging-15.2.tar.gz"; |
|
719 | 932 | md5 = "c16093476f6ced42128bf610e5db3784"; |
|
720 | 933 | }; |
|
934 | meta = { | |
|
935 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
936 | }; | |
|
721 | 937 | }; |
|
722 | 938 | paramiko = super.buildPythonPackage { |
|
723 | 939 | name = "paramiko-1.15.1"; |
|
724 | 940 | buildInputs = with self; []; |
|
725 | 941 | doCheck = false; |
|
726 | 942 | propagatedBuildInputs = with self; [pycrypto ecdsa]; |
|
727 | 943 | src = fetchurl { |
|
728 | 944 | url = "https://pypi.python.org/packages/04/2b/a22d2a560c1951abbbf95a0628e245945565f70dc082d9e784666887222c/paramiko-1.15.1.tar.gz"; |
|
729 | 945 | md5 = "48c274c3f9b1282932567b21f6acf3b5"; |
|
730 | 946 | }; |
|
947 | meta = { | |
|
948 | license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |
|
949 | }; | |
|
731 | 950 | }; |
|
732 | 951 | pep8 = super.buildPythonPackage { |
|
733 | 952 | name = "pep8-1.5.7"; |
|
734 | 953 | buildInputs = with self; []; |
|
735 | 954 | doCheck = false; |
|
736 | 955 | propagatedBuildInputs = with self; []; |
|
737 | 956 | src = fetchurl { |
|
738 | 957 | url = "https://pypi.python.org/packages/8b/de/259f5e735897ada1683489dd514b2a1c91aaa74e5e6b68f80acf128a6368/pep8-1.5.7.tar.gz"; |
|
739 | 958 | md5 = "f6adbdd69365ecca20513c709f9b7c93"; |
|
740 | 959 | }; |
|
960 | meta = { | |
|
961 | license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ]; | |
|
962 | }; | |
|
741 | 963 | }; |
|
742 | 964 | psutil = super.buildPythonPackage { |
|
743 | 965 | name = "psutil-2.2.1"; |
|
744 | 966 | buildInputs = with self; []; |
|
745 | 967 | doCheck = false; |
|
746 | 968 | propagatedBuildInputs = with self; []; |
|
747 | 969 | src = fetchurl { |
|
748 | 970 | url = "https://pypi.python.org/packages/df/47/ee54ef14dd40f8ce831a7581001a5096494dc99fe71586260ca6b531fe86/psutil-2.2.1.tar.gz"; |
|
749 | 971 | md5 = "1a2b58cd9e3a53528bb6148f0c4d5244"; |
|
750 | 972 | }; |
|
973 | meta = { | |
|
974 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
975 | }; | |
|
751 | 976 | }; |
|
752 | 977 | psycopg2 = super.buildPythonPackage { |
|
753 | 978 | name = "psycopg2-2.6"; |
|
754 | 979 | buildInputs = with self; []; |
|
755 | 980 | doCheck = false; |
|
756 | 981 | propagatedBuildInputs = with self; []; |
|
757 | 982 | src = fetchurl { |
|
758 | 983 | url = "https://pypi.python.org/packages/dd/c7/9016ff8ff69da269b1848276eebfb264af5badf6b38caad805426771f04d/psycopg2-2.6.tar.gz"; |
|
759 | 984 | md5 = "fbbb039a8765d561a1c04969bbae7c74"; |
|
760 | 985 | }; |
|
986 | meta = { | |
|
987 | license = [ pkgs.lib.licenses.zpt21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ]; | |
|
988 | }; | |
|
761 | 989 | }; |
|
762 | 990 | py = super.buildPythonPackage { |
|
763 | 991 | name = "py-1.4.29"; |
|
764 | 992 | buildInputs = with self; []; |
|
765 | 993 | doCheck = false; |
|
766 | 994 | propagatedBuildInputs = with self; []; |
|
767 | 995 | src = fetchurl { |
|
768 | 996 | url = "https://pypi.python.org/packages/2a/bc/a1a4a332ac10069b8e5e25136a35e08a03f01fd6ab03d819889d79a1fd65/py-1.4.29.tar.gz"; |
|
769 | 997 | md5 = "c28e0accba523a29b35a48bb703fb96c"; |
|
770 | 998 | }; |
|
999 | meta = { | |
|
1000 | license = [ pkgs.lib.licenses.mit ]; | |
|
1001 | }; | |
|
771 | 1002 | }; |
|
772 | 1003 | py-bcrypt = super.buildPythonPackage { |
|
773 | 1004 | name = "py-bcrypt-0.4"; |
|
774 | 1005 | buildInputs = with self; []; |
|
775 | 1006 | doCheck = false; |
|
776 | 1007 | propagatedBuildInputs = with self; []; |
|
777 | 1008 | src = fetchurl { |
|
778 | 1009 | url = "https://pypi.python.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz"; |
|
779 | 1010 | md5 = "dd8b367d6b716a2ea2e72392525f4e36"; |
|
780 | 1011 | }; |
|
1012 | meta = { | |
|
1013 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
1014 | }; | |
|
781 | 1015 | }; |
|
782 | 1016 | pycrypto = super.buildPythonPackage { |
|
783 | 1017 | name = "pycrypto-2.6.1"; |
|
784 | 1018 | buildInputs = with self; []; |
|
785 | 1019 | doCheck = false; |
|
786 | 1020 | propagatedBuildInputs = with self; []; |
|
787 | 1021 | src = fetchurl { |
|
788 | 1022 | url = "https://pypi.python.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz"; |
|
789 | 1023 | md5 = "55a61a054aa66812daf5161a0d5d7eda"; |
|
790 | 1024 | }; |
|
1025 | meta = { | |
|
1026 | license = [ pkgs.lib.licenses.publicDomain ]; | |
|
1027 | }; | |
|
791 | 1028 | }; |
|
792 | 1029 | pycurl = super.buildPythonPackage { |
|
793 | 1030 | name = "pycurl-7.19.5"; |
|
794 | 1031 | buildInputs = with self; []; |
|
795 | 1032 | doCheck = false; |
|
796 | 1033 | propagatedBuildInputs = with self; []; |
|
797 | 1034 | src = fetchurl { |
|
798 | 1035 | url = "https://pypi.python.org/packages/6c/48/13bad289ef6f4869b1d8fc11ae54de8cfb3cc4a2eb9f7419c506f763be46/pycurl-7.19.5.tar.gz"; |
|
799 | 1036 | md5 = "47b4eac84118e2606658122104e62072"; |
|
800 | 1037 | }; |
|
1038 | meta = { | |
|
1039 | license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |
|
1040 | }; | |
|
801 | 1041 | }; |
|
802 | 1042 | pyflakes = super.buildPythonPackage { |
|
803 | 1043 | name = "pyflakes-0.8.1"; |
|
804 | 1044 | buildInputs = with self; []; |
|
805 | 1045 | doCheck = false; |
|
806 | 1046 | propagatedBuildInputs = with self; []; |
|
807 | 1047 | src = fetchurl { |
|
808 | 1048 | url = "https://pypi.python.org/packages/75/22/a90ec0252f4f87f3ffb6336504de71fe16a49d69c4538dae2f12b9360a38/pyflakes-0.8.1.tar.gz"; |
|
809 | 1049 | md5 = "905fe91ad14b912807e8fdc2ac2e2c23"; |
|
810 | 1050 | }; |
|
1051 | meta = { | |
|
1052 | license = [ pkgs.lib.licenses.mit ]; | |
|
1053 | }; | |
|
811 | 1054 | }; |
|
812 | 1055 | pyparsing = super.buildPythonPackage { |
|
813 | 1056 | name = "pyparsing-1.5.7"; |
|
814 | 1057 | buildInputs = with self; []; |
|
815 | 1058 | doCheck = false; |
|
816 | 1059 | propagatedBuildInputs = with self; []; |
|
817 | 1060 | src = fetchurl { |
|
818 | 1061 | url = "https://pypi.python.org/packages/2e/26/e8fb5b4256a5f5036be7ce115ef8db8d06bc537becfbdc46c6af008314ee/pyparsing-1.5.7.zip"; |
|
819 | 1062 | md5 = "b86854857a368d6ccb4d5b6e76d0637f"; |
|
820 | 1063 | }; |
|
1064 | meta = { | |
|
1065 | license = [ pkgs.lib.licenses.mit ]; | |
|
1066 | }; | |
|
821 | 1067 | }; |
|
822 | 1068 | pyramid = super.buildPythonPackage { |
|
823 | 1069 | name = "pyramid-1.6.1"; |
|
824 | 1070 | buildInputs = with self; []; |
|
825 | 1071 | doCheck = false; |
|
826 | 1072 | propagatedBuildInputs = with self; [setuptools WebOb repoze.lru zope.interface zope.deprecation venusian translationstring PasteDeploy]; |
|
827 | 1073 | src = fetchurl { |
|
828 | 1074 | url = "https://pypi.python.org/packages/30/b3/fcc4a2a4800cbf21989e00454b5828cf1f7fe35c63e0810b350e56d4c475/pyramid-1.6.1.tar.gz"; |
|
829 | 1075 | md5 = "b18688ff3cc33efdbb098a35b45dd122"; |
|
830 | 1076 | }; |
|
1077 | meta = { | |
|
1078 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1079 | }; | |
|
831 | 1080 | }; |
|
832 | 1081 | pyramid-beaker = super.buildPythonPackage { |
|
833 | 1082 | name = "pyramid-beaker-0.8"; |
|
834 | 1083 | buildInputs = with self; []; |
|
835 | 1084 | doCheck = false; |
|
836 | 1085 | propagatedBuildInputs = with self; [pyramid Beaker]; |
|
837 | 1086 | src = fetchurl { |
|
838 | 1087 | url = "https://pypi.python.org/packages/d9/6e/b85426e00fd3d57f4545f74e1c3828552d8700f13ededeef9233f7bca8be/pyramid_beaker-0.8.tar.gz"; |
|
839 | 1088 | md5 = "22f14be31b06549f80890e2c63a93834"; |
|
840 | 1089 | }; |
|
1090 | meta = { | |
|
1091 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1092 | }; | |
|
841 | 1093 | }; |
|
842 | 1094 | pyramid-debugtoolbar = super.buildPythonPackage { |
|
843 | 1095 | name = "pyramid-debugtoolbar-2.4.2"; |
|
844 | 1096 | buildInputs = with self; []; |
|
845 | 1097 | doCheck = false; |
|
846 | 1098 | propagatedBuildInputs = with self; [pyramid pyramid-mako repoze.lru Pygments]; |
|
847 | 1099 | src = fetchurl { |
|
848 | 1100 | url = "https://pypi.python.org/packages/89/00/ed5426ee41ed747ba3ffd30e8230841a6878286ea67d480b1444d24f06a2/pyramid_debugtoolbar-2.4.2.tar.gz"; |
|
849 | 1101 | md5 = "073ea67086cc4bd5decc3a000853642d"; |
|
850 | 1102 | }; |
|
1103 | meta = { | |
|
1104 | license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ]; | |
|
1105 | }; | |
|
851 | 1106 | }; |
|
852 | 1107 | pyramid-jinja2 = super.buildPythonPackage { |
|
853 | 1108 | name = "pyramid-jinja2-2.5"; |
|
854 | 1109 | buildInputs = with self; []; |
|
855 | 1110 | doCheck = false; |
|
856 | 1111 | propagatedBuildInputs = with self; [pyramid zope.deprecation Jinja2 MarkupSafe]; |
|
857 | 1112 | src = fetchurl { |
|
858 | 1113 | url = "https://pypi.python.org/packages/a1/80/595e26ffab7deba7208676b6936b7e5a721875710f982e59899013cae1ed/pyramid_jinja2-2.5.tar.gz"; |
|
859 | 1114 | md5 = "07cb6547204ac5e6f0b22a954ccee928"; |
|
860 | 1115 | }; |
|
1116 | meta = { | |
|
1117 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1118 | }; | |
|
861 | 1119 | }; |
|
862 | 1120 | pyramid-mako = super.buildPythonPackage { |
|
863 | 1121 | name = "pyramid-mako-1.0.2"; |
|
864 | 1122 | buildInputs = with self; []; |
|
865 | 1123 | doCheck = false; |
|
866 | 1124 | propagatedBuildInputs = with self; [pyramid Mako]; |
|
867 | 1125 | src = fetchurl { |
|
868 | 1126 | url = "https://pypi.python.org/packages/f1/92/7e69bcf09676d286a71cb3bbb887b16595b96f9ba7adbdc239ffdd4b1eb9/pyramid_mako-1.0.2.tar.gz"; |
|
869 | 1127 | md5 = "ee25343a97eb76bd90abdc2a774eb48a"; |
|
870 | 1128 | }; |
|
1129 | meta = { | |
|
1130 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1131 | }; | |
|
871 | 1132 | }; |
|
872 | 1133 | pysqlite = super.buildPythonPackage { |
|
873 | 1134 | name = "pysqlite-2.6.3"; |
|
874 | 1135 | buildInputs = with self; []; |
|
875 | 1136 | doCheck = false; |
|
876 | 1137 | propagatedBuildInputs = with self; []; |
|
877 | 1138 | src = fetchurl { |
|
878 | 1139 | url = "https://pypi.python.org/packages/5c/a6/1c429cd4c8069cf4bfbd0eb4d592b3f4042155a8202df83d7e9b93aa3dc2/pysqlite-2.6.3.tar.gz"; |
|
879 | 1140 | md5 = "7ff1cedee74646b50117acff87aa1cfa"; |
|
880 | 1141 | }; |
|
1142 | meta = { | |
|
1143 | license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ]; | |
|
1144 | }; | |
|
881 | 1145 | }; |
|
882 | 1146 | pytest = super.buildPythonPackage { |
|
883 | 1147 | name = "pytest-2.8.5"; |
|
884 | 1148 | buildInputs = with self; []; |
|
885 | 1149 | doCheck = false; |
|
886 | 1150 | propagatedBuildInputs = with self; [py]; |
|
887 | 1151 | src = fetchurl { |
|
888 | 1152 | url = "https://pypi.python.org/packages/b1/3d/d7ea9b0c51e0cacded856e49859f0a13452747491e842c236bbab3714afe/pytest-2.8.5.zip"; |
|
889 | 1153 | md5 = "8493b06f700862f1294298d6c1b715a9"; |
|
890 | 1154 | }; |
|
1155 | meta = { | |
|
1156 | license = [ pkgs.lib.licenses.mit ]; | |
|
1157 | }; | |
|
891 | 1158 | }; |
|
892 | 1159 | pytest-catchlog = super.buildPythonPackage { |
|
893 | 1160 | name = "pytest-catchlog-1.2.2"; |
|
894 | 1161 | buildInputs = with self; []; |
|
895 | 1162 | doCheck = false; |
|
896 | 1163 | propagatedBuildInputs = with self; [py pytest]; |
|
897 | 1164 | src = fetchurl { |
|
898 | 1165 | url = "https://pypi.python.org/packages/f2/2b/2faccdb1a978fab9dd0bf31cca9f6847fbe9184a0bdcc3011ac41dd44191/pytest-catchlog-1.2.2.zip"; |
|
899 | 1166 | md5 = "09d890c54c7456c818102b7ff8c182c8"; |
|
900 | 1167 | }; |
|
1168 | meta = { | |
|
1169 | license = [ pkgs.lib.licenses.mit ]; | |
|
1170 | }; | |
|
901 | 1171 | }; |
|
902 | 1172 | pytest-cov = super.buildPythonPackage { |
|
903 | 1173 | name = "pytest-cov-1.8.1"; |
|
904 | 1174 | buildInputs = with self; []; |
|
905 | 1175 | doCheck = false; |
|
906 | 1176 | propagatedBuildInputs = with self; [py pytest coverage cov-core]; |
|
907 | 1177 | src = fetchurl { |
|
908 | 1178 | url = "https://pypi.python.org/packages/11/4b/b04646e97f1721878eb21e9f779102d84dd044d324382263b1770a3e4838/pytest-cov-1.8.1.tar.gz"; |
|
909 | 1179 | md5 = "76c778afa2494088270348be42d759fc"; |
|
910 | 1180 | }; |
|
1181 | meta = { | |
|
1182 | license = [ pkgs.lib.licenses.mit ]; | |
|
1183 | }; | |
|
911 | 1184 | }; |
|
912 | 1185 | pytest-profiling = super.buildPythonPackage { |
|
913 | 1186 | name = "pytest-profiling-1.0.1"; |
|
914 | 1187 | buildInputs = with self; []; |
|
915 | 1188 | doCheck = false; |
|
916 | 1189 | propagatedBuildInputs = with self; [six pytest gprof2dot]; |
|
917 | 1190 | src = fetchurl { |
|
918 | 1191 | url = "https://pypi.python.org/packages/d8/67/8ffab73406e22870e07fa4dc8dce1d7689b26dba8efd00161c9b6fc01ec0/pytest-profiling-1.0.1.tar.gz"; |
|
919 | 1192 | md5 = "354404eb5b3fd4dc5eb7fffbb3d9b68b"; |
|
920 | 1193 | }; |
|
1194 | meta = { | |
|
1195 | license = [ pkgs.lib.licenses.mit ]; | |
|
1196 | }; | |
|
921 | 1197 | }; |
|
922 | 1198 | pytest-runner = super.buildPythonPackage { |
|
923 | 1199 | name = "pytest-runner-2.7.1"; |
|
924 | 1200 | buildInputs = with self; []; |
|
925 | 1201 | doCheck = false; |
|
926 | 1202 | propagatedBuildInputs = with self; []; |
|
927 | 1203 | src = fetchurl { |
|
928 | 1204 | url = "https://pypi.python.org/packages/99/6b/c4ff4418d3424d4475b7af60724fd4a5cdd91ed8e489dc9443281f0052bc/pytest-runner-2.7.1.tar.gz"; |
|
929 | 1205 | md5 = "e56f0bc8d79a6bd91772b44ef4215c7e"; |
|
930 | 1206 | }; |
|
1207 | meta = { | |
|
1208 | license = [ pkgs.lib.licenses.mit ]; | |
|
1209 | }; | |
|
931 | 1210 | }; |
|
932 | 1211 | pytest-timeout = super.buildPythonPackage { |
|
933 | 1212 | name = "pytest-timeout-0.4"; |
|
934 | 1213 | buildInputs = with self; []; |
|
935 | 1214 | doCheck = false; |
|
936 | 1215 | propagatedBuildInputs = with self; [pytest]; |
|
937 | 1216 | src = fetchurl { |
|
938 | 1217 | url = "https://pypi.python.org/packages/24/48/5f6bd4b8026a26e1dd427243d560a29a0f1b24a5c7cffca4bf049a7bb65b/pytest-timeout-0.4.tar.gz"; |
|
939 | 1218 | md5 = "03b28aff69cbbfb959ed35ade5fde262"; |
|
940 | 1219 | }; |
|
1220 | meta = { | |
|
1221 | license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ]; | |
|
1222 | }; | |
|
941 | 1223 | }; |
|
942 | 1224 | python-dateutil = super.buildPythonPackage { |
|
943 | 1225 | name = "python-dateutil-1.5"; |
|
944 | 1226 | buildInputs = with self; []; |
|
945 | 1227 | doCheck = false; |
|
946 | 1228 | propagatedBuildInputs = with self; []; |
|
947 | 1229 | src = fetchurl { |
|
948 | 1230 | url = "https://pypi.python.org/packages/b4/7c/df59c89a753eb33c7c44e1dd42de0e9bc2ccdd5a4d576e0bfad97cc280cb/python-dateutil-1.5.tar.gz"; |
|
949 | 1231 | md5 = "0dcb1de5e5cad69490a3b6ab63f0cfa5"; |
|
950 | 1232 | }; |
|
1233 | meta = { | |
|
1234 | license = [ pkgs.lib.licenses.psfl ]; | |
|
1235 | }; | |
|
951 | 1236 | }; |
|
952 | 1237 | python-editor = super.buildPythonPackage { |
|
953 | 1238 | name = "python-editor-1.0.1"; |
|
954 | 1239 | buildInputs = with self; []; |
|
955 | 1240 | doCheck = false; |
|
956 | 1241 | propagatedBuildInputs = with self; []; |
|
957 | 1242 | src = fetchurl { |
|
958 | 1243 | url = "https://pypi.python.org/packages/2b/c0/df7b87d5cf016f82eab3b05cd35f53287c1178ad8c42bfb6fa61b89b22f6/python-editor-1.0.1.tar.gz"; |
|
959 | 1244 | md5 = "e1fa63535b40e022fa4fd646fd8b511a"; |
|
960 | 1245 | }; |
|
1246 | meta = { | |
|
1247 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
1248 | }; | |
|
961 | 1249 | }; |
|
962 | 1250 | python-ldap = super.buildPythonPackage { |
|
963 | 1251 | name = "python-ldap-2.4.19"; |
|
964 | 1252 | buildInputs = with self; []; |
|
965 | 1253 | doCheck = false; |
|
966 | 1254 | propagatedBuildInputs = with self; [setuptools]; |
|
967 | 1255 | src = fetchurl { |
|
968 | 1256 | url = "https://pypi.python.org/packages/42/81/1b64838c82e64f14d4e246ff00b52e650a35c012551b891ada2b85d40737/python-ldap-2.4.19.tar.gz"; |
|
969 | 1257 | md5 = "b941bf31d09739492aa19ef679e94ae3"; |
|
970 | 1258 | }; |
|
1259 | meta = { | |
|
1260 | license = [ pkgs.lib.licenses.psfl ]; | |
|
1261 | }; | |
|
971 | 1262 | }; |
|
972 | 1263 | python-memcached = super.buildPythonPackage { |
|
973 | 1264 | name = "python-memcached-1.57"; |
|
974 | 1265 | buildInputs = with self; []; |
|
975 | 1266 | doCheck = false; |
|
976 | 1267 | propagatedBuildInputs = with self; [six]; |
|
977 | 1268 | src = fetchurl { |
|
978 | 1269 | url = "https://pypi.python.org/packages/52/9d/eebc0dcbc5c7c66840ad207dfc1baa376dadb74912484bff73819cce01e6/python-memcached-1.57.tar.gz"; |
|
979 | 1270 | md5 = "de21f64b42b2d961f3d4ad7beb5468a1"; |
|
980 | 1271 | }; |
|
1272 | meta = { | |
|
1273 | license = [ pkgs.lib.licenses.psfl ]; | |
|
1274 | }; | |
|
981 | 1275 | }; |
|
982 | 1276 | python-pam = super.buildPythonPackage { |
|
983 | 1277 | name = "python-pam-1.8.2"; |
|
984 | 1278 | buildInputs = with self; []; |
|
985 | 1279 | doCheck = false; |
|
986 | 1280 | propagatedBuildInputs = with self; []; |
|
987 | 1281 | src = fetchurl { |
|
988 | 1282 | url = "https://pypi.python.org/packages/de/8c/f8f5d38b4f26893af267ea0b39023d4951705ab0413a39e0cf7cf4900505/python-pam-1.8.2.tar.gz"; |
|
989 | 1283 | md5 = "db71b6b999246fb05d78ecfbe166629d"; |
|
990 | 1284 | }; |
|
1285 | meta = { | |
|
1286 | license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ]; | |
|
1287 | }; | |
|
991 | 1288 | }; |
|
992 | 1289 | pytz = super.buildPythonPackage { |
|
993 | 1290 | name = "pytz-2015.4"; |
|
994 | 1291 | buildInputs = with self; []; |
|
995 | 1292 | doCheck = false; |
|
996 | 1293 | propagatedBuildInputs = with self; []; |
|
997 | 1294 | src = fetchurl { |
|
998 | 1295 | url = "https://pypi.python.org/packages/7e/1a/f43b5c92df7b156822030fed151327ea096bcf417e45acc23bd1df43472f/pytz-2015.4.zip"; |
|
999 | 1296 | md5 = "233f2a2b370d03f9b5911700cc9ebf3c"; |
|
1000 | 1297 | }; |
|
1298 | meta = { | |
|
1299 | license = [ pkgs.lib.licenses.mit ]; | |
|
1300 | }; | |
|
1001 | 1301 | }; |
|
1002 | 1302 | pyzmq = super.buildPythonPackage { |
|
1003 | 1303 | name = "pyzmq-14.6.0"; |
|
1004 | 1304 | buildInputs = with self; []; |
|
1005 | 1305 | doCheck = false; |
|
1006 | 1306 | propagatedBuildInputs = with self; []; |
|
1007 | 1307 | src = fetchurl { |
|
1008 | 1308 | url = "https://pypi.python.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz"; |
|
1009 | 1309 | md5 = "395b5de95a931afa5b14c9349a5b8024"; |
|
1010 | 1310 | }; |
|
1311 | meta = { | |
|
1312 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; | |
|
1313 | }; | |
|
1011 | 1314 | }; |
|
1012 | 1315 | recaptcha-client = super.buildPythonPackage { |
|
1013 | 1316 | name = "recaptcha-client-1.0.6"; |
|
1014 | 1317 | buildInputs = with self; []; |
|
1015 | 1318 | doCheck = false; |
|
1016 | 1319 | propagatedBuildInputs = with self; []; |
|
1017 | 1320 | src = fetchurl { |
|
1018 | 1321 | url = "https://pypi.python.org/packages/0a/ea/5f2fbbfd894bdac1c68ef8d92019066cfcf9fbff5fe3d728d2b5c25c8db4/recaptcha-client-1.0.6.tar.gz"; |
|
1019 | 1322 | md5 = "74228180f7e1fb76c4d7089160b0d919"; |
|
1020 | 1323 | }; |
|
1324 | meta = { | |
|
1325 | license = [ { fullName = "MIT/X11"; } ]; | |
|
1326 | }; | |
|
1021 | 1327 | }; |
|
1022 | 1328 | repoze.lru = super.buildPythonPackage { |
|
1023 | 1329 | name = "repoze.lru-0.6"; |
|
1024 | 1330 | buildInputs = with self; []; |
|
1025 | 1331 | doCheck = false; |
|
1026 | 1332 | propagatedBuildInputs = with self; []; |
|
1027 | 1333 | src = fetchurl { |
|
1028 | 1334 | url = "https://pypi.python.org/packages/6e/1e/aa15cc90217e086dc8769872c8778b409812ff036bf021b15795638939e4/repoze.lru-0.6.tar.gz"; |
|
1029 | 1335 | md5 = "2c3b64b17a8e18b405f55d46173e14dd"; |
|
1030 | 1336 | }; |
|
1337 | meta = { | |
|
1338 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1339 | }; | |
|
1031 | 1340 | }; |
|
1032 | 1341 | requests = super.buildPythonPackage { |
|
1033 | 1342 | name = "requests-2.9.1"; |
|
1034 | 1343 | buildInputs = with self; []; |
|
1035 | 1344 | doCheck = false; |
|
1036 | 1345 | propagatedBuildInputs = with self; []; |
|
1037 | 1346 | src = fetchurl { |
|
1038 | 1347 | url = "https://pypi.python.org/packages/f9/6d/07c44fb1ebe04d069459a189e7dab9e4abfe9432adcd4477367c25332748/requests-2.9.1.tar.gz"; |
|
1039 | 1348 | md5 = "0b7f480d19012ec52bab78292efd976d"; |
|
1040 | 1349 | }; |
|
1350 | meta = { | |
|
1351 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
1352 | }; | |
|
1041 | 1353 | }; |
|
1042 | 1354 | rhodecode-enterprise-ce = super.buildPythonPackage { |
|
1043 |      | name = "rhodecode-enterprise-ce-4.

     | 1355 | name = "rhodecode-enterprise-ce-4.2.0";
|
1044 | 1356 | buildInputs = with self; [WebTest configobj cssselect flake8 lxml mock pytest pytest-cov pytest-runner]; |
|
1045 | 1357 | doCheck = true; |
|
1046 | 1358 | propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery colander decorator docutils gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson waitress zope.cachedescriptors psutil py-bcrypt]; |
|
1047 | 1359 | src = ./.; |
|
1360 | meta = { | |
|
1361 | license = [ { fullName = "AGPLv3, and Commercial License"; } ]; | |
|
1362 | }; | |
|
1048 | 1363 | }; |
|
1049 | 1364 | rhodecode-tools = super.buildPythonPackage { |
|
1050 | 1365 | name = "rhodecode-tools-0.8.3"; |
|
1051 | 1366 | buildInputs = with self; []; |
|
1052 | 1367 | doCheck = false; |
|
1053 | 1368 | propagatedBuildInputs = with self; [click future six Mako MarkupSafe requests Whoosh elasticsearch elasticsearch-dsl]; |
|
1054 | 1369 | src = fetchurl { |
|
1055 | 1370 | url = "https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.8.3.zip"; |
|
1056 | 1371 | md5 = "9acdfd71b8ddf4056057065f37ab9ccb"; |
|
1057 | 1372 | }; |
|
1373 | meta = { | |
|
1374 | license = [ { fullName = "AGPLv3 and Proprietary"; } ]; | |
|
1375 | }; | |
|
1058 | 1376 | }; |
|
1059 | 1377 | serpent = super.buildPythonPackage { |
|
1060 | 1378 | name = "serpent-1.12"; |
|
1061 | 1379 | buildInputs = with self; []; |
|
1062 | 1380 | doCheck = false; |
|
1063 | 1381 | propagatedBuildInputs = with self; []; |
|
1064 | 1382 | src = fetchurl { |
|
1065 | 1383 | url = "https://pypi.python.org/packages/3b/19/1e0e83b47c09edaef8398655088036e7e67386b5c48770218ebb339fbbd5/serpent-1.12.tar.gz"; |
|
1066 | 1384 | md5 = "05869ac7b062828b34f8f927f0457b65"; |
|
1067 | 1385 | }; |
|
1386 | meta = { | |
|
1387 | license = [ pkgs.lib.licenses.mit ]; | |
|
1388 | }; | |
|
1068 | 1389 | }; |
|
1069 | 1390 | setproctitle = super.buildPythonPackage { |
|
1070 | 1391 | name = "setproctitle-1.1.8"; |
|
1071 | 1392 | buildInputs = with self; []; |
|
1072 | 1393 | doCheck = false; |
|
1073 | 1394 | propagatedBuildInputs = with self; []; |
|
1074 | 1395 | src = fetchurl { |
|
1075 | 1396 | url = "https://pypi.python.org/packages/33/c3/ad367a4f4f1ca90468863ae727ac62f6edb558fc09a003d344a02cfc6ea6/setproctitle-1.1.8.tar.gz"; |
|
1076 | 1397 | md5 = "728f4c8c6031bbe56083a48594027edd"; |
|
1077 | 1398 | }; |
|
1399 | meta = { | |
|
1400 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
1401 | }; | |
|
1078 | 1402 | }; |
|
1079 | 1403 | setuptools = super.buildPythonPackage { |
|
1080 | 1404 | name = "setuptools-20.8.1"; |
|
1081 | 1405 | buildInputs = with self; []; |
|
1082 | 1406 | doCheck = false; |
|
1083 | 1407 | propagatedBuildInputs = with self; []; |
|
1084 | 1408 | src = fetchurl { |
|
1085 | 1409 | url = "https://pypi.python.org/packages/c4/19/c1bdc88b53da654df43770f941079dbab4e4788c2dcb5658fb86259894c7/setuptools-20.8.1.zip"; |
|
1086 | 1410 | md5 = "fe58a5cac0df20bb83942b252a4b0543"; |
|
1087 | 1411 | }; |
|
1412 | meta = { | |
|
1413 | license = [ pkgs.lib.licenses.mit ]; | |
|
1414 | }; | |
|
1088 | 1415 | }; |
|
1089 | 1416 | setuptools-scm = super.buildPythonPackage { |
|
1090 | 1417 | name = "setuptools-scm-1.11.0"; |
|
1091 | 1418 | buildInputs = with self; []; |
|
1092 | 1419 | doCheck = false; |
|
1093 | 1420 | propagatedBuildInputs = with self; []; |
|
1094 | 1421 | src = fetchurl { |
|
1095 | 1422 | url = "https://pypi.python.org/packages/cd/5f/e3a038292358058d83d764a47d09114aa5a8003ed4529518f9e580f1a94f/setuptools_scm-1.11.0.tar.gz"; |
|
1096 | 1423 | md5 = "4c5c896ba52e134bbc3507bac6400087"; |
|
1097 | 1424 | }; |
|
1425 | meta = { | |
|
1426 | license = [ pkgs.lib.licenses.mit ]; | |
|
1427 | }; | |
|
1098 | 1428 | }; |
|
1099 | 1429 | simplejson = super.buildPythonPackage { |
|
1100 | 1430 | name = "simplejson-3.7.2"; |
|
1101 | 1431 | buildInputs = with self; []; |
|
1102 | 1432 | doCheck = false; |
|
1103 | 1433 | propagatedBuildInputs = with self; []; |
|
1104 | 1434 | src = fetchurl { |
|
1105 | 1435 | url = "https://pypi.python.org/packages/6d/89/7f13f099344eea9d6722779a1f165087cb559598107844b1ac5dbd831fb1/simplejson-3.7.2.tar.gz"; |
|
1106 | 1436 | md5 = "a5fc7d05d4cb38492285553def5d4b46"; |
|
1107 | 1437 | }; |
|
1438 | meta = { | |
|
1439 | license = [ pkgs.lib.licenses.mit pkgs.lib.licenses.afl21 ]; | |
|
1440 | }; | |
|
1108 | 1441 | }; |
|
1109 | 1442 | six = super.buildPythonPackage { |
|
1110 | 1443 | name = "six-1.9.0"; |
|
1111 | 1444 | buildInputs = with self; []; |
|
1112 | 1445 | doCheck = false; |
|
1113 | 1446 | propagatedBuildInputs = with self; []; |
|
1114 | 1447 | src = fetchurl { |
|
1115 | 1448 | url = "https://pypi.python.org/packages/16/64/1dc5e5976b17466fd7d712e59cbe9fb1e18bec153109e5ba3ed6c9102f1a/six-1.9.0.tar.gz"; |
|
1116 | 1449 | md5 = "476881ef4012262dfc8adc645ee786c4"; |
|
1117 | 1450 | }; |
|
1451 | meta = { | |
|
1452 | license = [ pkgs.lib.licenses.mit ]; | |
|
1453 | }; | |
|
1118 | 1454 | }; |
|
1119 | 1455 | subprocess32 = super.buildPythonPackage { |
|
1120 | 1456 | name = "subprocess32-3.2.6"; |
|
1121 | 1457 | buildInputs = with self; []; |
|
1122 | 1458 | doCheck = false; |
|
1123 | 1459 | propagatedBuildInputs = with self; []; |
|
1124 | 1460 | src = fetchurl { |
|
1125 | 1461 | url = "https://pypi.python.org/packages/28/8d/33ccbff51053f59ae6c357310cac0e79246bbed1d345ecc6188b176d72c3/subprocess32-3.2.6.tar.gz"; |
|
1126 | 1462 | md5 = "754c5ab9f533e764f931136974b618f1"; |
|
1127 | 1463 | }; |
|
1464 | meta = { | |
|
1465 | license = [ pkgs.lib.licenses.psfl ]; | |
|
1466 | }; | |
|
1128 | 1467 | }; |
|
1129 | 1468 | supervisor = super.buildPythonPackage { |
|
1130 | 1469 | name = "supervisor-3.1.3"; |
|
1131 | 1470 | buildInputs = with self; []; |
|
1132 | 1471 | doCheck = false; |
|
1133 | 1472 | propagatedBuildInputs = with self; [meld3]; |
|
1134 | 1473 | src = fetchurl { |
|
1135 | 1474 | url = "https://pypi.python.org/packages/a6/41/65ad5bd66230b173eb4d0b8810230f3a9c59ef52ae066e540b6b99895db7/supervisor-3.1.3.tar.gz"; |
|
1136 | 1475 | md5 = "aad263c4fbc070de63dd354864d5e552"; |
|
1137 | 1476 | }; |
|
1477 | meta = { | |
|
1478 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1479 | }; | |
|
1138 | 1480 | }; |
|
1139 | 1481 | transifex-client = super.buildPythonPackage { |
|
1140 | 1482 | name = "transifex-client-0.10"; |
|
1141 | 1483 | buildInputs = with self; []; |
|
1142 | 1484 | doCheck = false; |
|
1143 | 1485 | propagatedBuildInputs = with self; []; |
|
1144 | 1486 | src = fetchurl { |
|
1145 | 1487 | url = "https://pypi.python.org/packages/f3/4e/7b925192aee656fb3e04fa6381c8b3dc40198047c3b4a356f6cfd642c809/transifex-client-0.10.tar.gz"; |
|
1146 | 1488 | md5 = "5549538d84b8eede6b254cd81ae024fa"; |
|
1147 | 1489 | }; |
|
1490 | meta = { | |
|
1491 | license = [ pkgs.lib.licenses.gpl2 ]; | |
|
1492 | }; | |
|
1148 | 1493 | }; |
|
1149 | 1494 | translationstring = super.buildPythonPackage { |
|
1150 | 1495 | name = "translationstring-1.3"; |
|
1151 | 1496 | buildInputs = with self; []; |
|
1152 | 1497 | doCheck = false; |
|
1153 | 1498 | propagatedBuildInputs = with self; []; |
|
1154 | 1499 | src = fetchurl { |
|
1155 | 1500 | url = "https://pypi.python.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz"; |
|
1156 | 1501 | md5 = "a4b62e0f3c189c783a1685b3027f7c90"; |
|
1157 | 1502 | }; |
|
1503 | meta = { | |
|
1504 | license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ]; | |
|
1505 | }; | |
|
1158 | 1506 | }; |
|
1159 | 1507 | trollius = super.buildPythonPackage { |
|
1160 | 1508 | name = "trollius-1.0.4"; |
|
1161 | 1509 | buildInputs = with self; []; |
|
1162 | 1510 | doCheck = false; |
|
1163 | 1511 | propagatedBuildInputs = with self; [futures]; |
|
1164 | 1512 | src = fetchurl { |
|
1165 | 1513 | url = "https://pypi.python.org/packages/aa/e6/4141db437f55e6ee7a3fb69663239e3fde7841a811b4bef293145ad6c836/trollius-1.0.4.tar.gz"; |
|
1166 | 1514 | md5 = "3631a464d49d0cbfd30ab2918ef2b783"; |
|
1167 | 1515 | }; |
|
1516 | meta = { | |
|
1517 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
1518 | }; | |
|
1168 | 1519 | }; |
|
1169 | 1520 | uWSGI = super.buildPythonPackage { |
|
1170 | 1521 | name = "uWSGI-2.0.11.2"; |
|
1171 | 1522 | buildInputs = with self; []; |
|
1172 | 1523 | doCheck = false; |
|
1173 | 1524 | propagatedBuildInputs = with self; []; |
|
1174 | 1525 | src = fetchurl { |
|
1175 | 1526 | url = "https://pypi.python.org/packages/9b/78/918db0cfab0546afa580c1e565209c49aaf1476bbfe491314eadbe47c556/uwsgi-2.0.11.2.tar.gz"; |
|
1176 | 1527 | md5 = "1f02dcbee7f6f61de4b1fd68350cf16f"; |
|
1177 | 1528 | }; |
|
1529 | meta = { | |
|
1530 | license = [ pkgs.lib.licenses.gpl2 ]; | |
|
1531 | }; | |
|
1178 | 1532 | }; |
|
1179 | 1533 | urllib3 = super.buildPythonPackage { |
|
1180 | 1534 | name = "urllib3-1.16"; |
|
1181 | 1535 | buildInputs = with self; []; |
|
1182 | 1536 | doCheck = false; |
|
1183 | 1537 | propagatedBuildInputs = with self; []; |
|
1184 | 1538 | src = fetchurl { |
|
1185 | 1539 | url = "https://pypi.python.org/packages/3b/f0/e763169124e3f5db0926bc3dbfcd580a105f9ca44cf5d8e6c7a803c9f6b5/urllib3-1.16.tar.gz"; |
|
1186 | 1540 | md5 = "fcaab1c5385c57deeb7053d3d7d81d59"; |
|
1187 | 1541 | }; |
|
1542 | meta = { | |
|
1543 | license = [ pkgs.lib.licenses.mit ]; | |
|
1544 | }; | |
|
1188 | 1545 | }; |
|
1189 | 1546 | venusian = super.buildPythonPackage { |
|
1190 | 1547 | name = "venusian-1.0"; |
|
1191 | 1548 | buildInputs = with self; []; |
|
1192 | 1549 | doCheck = false; |
|
1193 | 1550 | propagatedBuildInputs = with self; []; |
|
1194 | 1551 | src = fetchurl { |
|
1195 | 1552 | url = "https://pypi.python.org/packages/86/20/1948e0dfc4930ddde3da8c33612f6a5717c0b4bc28f591a5c5cf014dd390/venusian-1.0.tar.gz"; |
|
1196 | 1553 | md5 = "dccf2eafb7113759d60c86faf5538756"; |
|
1197 | 1554 | }; |
|
1555 | meta = { | |
|
1556 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1557 | }; | |
|
1198 | 1558 | }; |
|
1199 | 1559 | waitress = super.buildPythonPackage { |
|
1200 | 1560 | name = "waitress-0.8.9"; |
|
1201 | 1561 | buildInputs = with self; []; |
|
1202 | 1562 | doCheck = false; |
|
1203 | 1563 | propagatedBuildInputs = with self; [setuptools]; |
|
1204 | 1564 | src = fetchurl { |
|
1205 | 1565 | url = "https://pypi.python.org/packages/ee/65/fc9dee74a909a1187ca51e4f15ad9c4d35476e4ab5813f73421505c48053/waitress-0.8.9.tar.gz"; |
|
1206 | 1566 | md5 = "da3f2e62b3676be5dd630703a68e2a04"; |
|
1207 | 1567 | }; |
|
1568 | meta = { | |
|
1569 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
1570 | }; | |
|
1208 | 1571 | }; |
|
1209 | 1572 | wsgiref = super.buildPythonPackage { |
|
1210 | 1573 | name = "wsgiref-0.1.2"; |
|
1211 | 1574 | buildInputs = with self; []; |
|
1212 | 1575 | doCheck = false; |
|
1213 | 1576 | propagatedBuildInputs = with self; []; |
|
1214 | 1577 | src = fetchurl { |
|
1215 | 1578 | url = "https://pypi.python.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip"; |
|
1216 | 1579 | md5 = "29b146e6ebd0f9fb119fe321f7bcf6cb"; |
|
1217 | 1580 | }; |
|
1581 | meta = { | |
|
1582 | license = [ { fullName = "PSF or ZPL"; } ]; | |
|
1583 | }; | |
|
1218 | 1584 | }; |
|
1219 | 1585 | zope.cachedescriptors = super.buildPythonPackage { |
|
1220 | 1586 | name = "zope.cachedescriptors-4.0.0"; |
|
1221 | 1587 | buildInputs = with self; []; |
|
1222 | 1588 | doCheck = false; |
|
1223 | 1589 | propagatedBuildInputs = with self; [setuptools]; |
|
1224 | 1590 | src = fetchurl { |
|
1225 | 1591 | url = "https://pypi.python.org/packages/40/33/694b6644c37f28553f4b9f20b3c3a20fb709a22574dff20b5bdffb09ecd5/zope.cachedescriptors-4.0.0.tar.gz"; |
|
1226 | 1592 | md5 = "8d308de8c936792c8e758058fcb7d0f0"; |
|
1227 | 1593 | }; |
|
1594 | meta = { | |
|
1595 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
1596 | }; | |
|
1228 | 1597 | }; |
|
1229 | 1598 | zope.deprecation = super.buildPythonPackage { |
|
1230 | 1599 | name = "zope.deprecation-4.1.2"; |
|
1231 | 1600 | buildInputs = with self; []; |
|
1232 | 1601 | doCheck = false; |
|
1233 | 1602 | propagatedBuildInputs = with self; [setuptools]; |
|
1234 | 1603 | src = fetchurl { |
|
1235 | 1604 | url = "https://pypi.python.org/packages/c1/d3/3919492d5e57d8dd01b36f30b34fc8404a30577392b1eb817c303499ad20/zope.deprecation-4.1.2.tar.gz"; |
|
1236 | 1605 | md5 = "e9a663ded58f4f9f7881beb56cae2782"; |
|
1237 | 1606 | }; |
|
1607 | meta = { | |
|
1608 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
1609 | }; | |
|
1238 | 1610 | }; |
|
1239 | 1611 | zope.event = super.buildPythonPackage { |
|
1240 | 1612 | name = "zope.event-4.0.3"; |
|
1241 | 1613 | buildInputs = with self; []; |
|
1242 | 1614 | doCheck = false; |
|
1243 | 1615 | propagatedBuildInputs = with self; [setuptools]; |
|
1244 | 1616 | src = fetchurl { |
|
1245 | 1617 | url = "https://pypi.python.org/packages/c1/29/91ba884d7d6d96691df592e9e9c2bfa57a47040ec1ff47eff18c85137152/zope.event-4.0.3.tar.gz"; |
|
1246 | 1618 | md5 = "9a3780916332b18b8b85f522bcc3e249"; |
|
1247 | 1619 | }; |
|
1620 | meta = { | |
|
1621 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
1622 | }; | |
|
1248 | 1623 | }; |
|
1249 | 1624 | zope.interface = super.buildPythonPackage { |
|
1250 | 1625 | name = "zope.interface-4.1.3"; |
|
1251 | 1626 | buildInputs = with self; []; |
|
1252 | 1627 | doCheck = false; |
|
1253 | 1628 | propagatedBuildInputs = with self; [setuptools]; |
|
1254 | 1629 | src = fetchurl { |
|
1255 | 1630 | url = "https://pypi.python.org/packages/9d/81/2509ca3c6f59080123c1a8a97125eb48414022618cec0e64eb1313727bfe/zope.interface-4.1.3.tar.gz"; |
|
1256 | 1631 | md5 = "9ae3d24c0c7415deb249dd1a132f0f79"; |
|
1257 | 1632 | }; |
|
1633 | meta = { | |
|
1634 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
1635 | }; | |
|
1258 | 1636 | }; |
|
1259 | 1637 | |
|
1260 | 1638 | ### Test requirements |
|
1261 | 1639 | |
|
1262 | 1640 | |
|
1263 | 1641 | } |
@@ -1,231 +1,232 @@
|
1 | 1 | # |
|
2 | 2 | # About |
|
3 | 3 | # ===== |
|
4 | 4 | # |
|
5 | 5 | # This file defines jobs for our CI system and the attribute "build" is used |
|
6 | 6 | # as the input for packaging. |
|
7 | 7 | # |
|
8 | 8 | # |
|
9 | 9 | # CI details |
|
10 | 10 | # ========== |
|
11 | 11 | # |
|
12 | 12 | # This file defines an attribute set of derivations. Each of these attributes is |
|
13 | 13 | # then used in our CI system as one job to run. This way we keep the |
|
14 | 14 | # configuration for the CI jobs as well under version control. |
|
15 | 15 | # |
|
16 | 16 | # Run CI jobs locally |
|
17 | 17 | # ------------------- |
|
18 | 18 | # |
|
19 | 19 | # Since it is all based on normal Nix derivations, the jobs can be tested |
|
20 | 20 | # locally with a run of "nix-build" like the following example: |
|
21 | 21 | # |
|
22 | 22 | # nix-build release.nix -A test-api -I vcsserver=~/rhodecode-vcsserver |
|
23 | 23 | # |
|
24 | 24 | # Note: Replace "~/rhodecode-vcsserver" with a path where a clone of the |
|
25 | 25 | # vcsserver resides. |
|
26 | 26 | |
|
27 | 27 | { pkgs ? import <nixpkgs> {} |
|
28 | , doCheck ? true | |
|
28 | 29 | }: |
|
29 | 30 | |
|
30 | 31 | let |
|
31 | 32 | |
|
32 | 33 | inherit (pkgs) |
|
33 | 34 | stdenv |
|
34 | 35 | system; |
|
35 | 36 | |
|
36 | 37 | testing = import <nixpkgs/nixos/lib/testing.nix> { |
|
37 | 38 | inherit system; |
|
38 | 39 | }; |
|
39 | 40 | |
|
40 | 41 | runInMachine = testing.runInMachine; |
|
41 | 42 | |
|
42 | 43 | sphinx = import ./docs/default.nix {}; |
|
43 | 44 | |
|
44 | 45 | mkDocs = kind: stdenv.mkDerivation { |
|
45 | 46 | name = kind; |
|
46 | 47 | srcs = [ |
|
47 | 48 | (./. + (builtins.toPath "/${kind}")) |
|
48 | 49 | (builtins.filterSource |
|
49 | 50 | (path: type: baseNameOf path == "VERSION") |
|
50 | 51 | ./rhodecode) |
|
51 | 52 | ]; |
|
52 | 53 | sourceRoot = kind; |
|
53 | 54 | buildInputs = [ sphinx ]; |
|
54 | 55 | configurePhase = null; |
|
55 | 56 | buildPhase = '' |
|
56 | 57 | make SPHINXBUILD=sphinx-build html |
|
57 | 58 | ''; |
|
58 | 59 | installPhase = '' |
|
59 | 60 | mkdir -p $out |
|
60 | 61 | mv _build/html $out/ |
|
61 | 62 | |
|
62 | 63 | mkdir -p $out/nix-support |
|
63 | 64 | echo "doc manual $out/html index.html" >> \ |
|
64 | 65 | "$out/nix-support/hydra-build-products" |
|
65 | 66 | ''; |
|
66 | 67 | }; |
|
67 | 68 | |
|
68 | 69 | enterprise = import ./default.nix { |
|
69 | 70 | inherit |
|
70 | 71 | pkgs; |
|
71 | 72 | |
|
72 | 73 | # TODO: for quick local testing |
|
73 | 74 | doCheck = false; |
|
74 | 75 | }; |
|
75 | 76 | |
|
76 | 77 | test-cfg = stdenv.mkDerivation { |
|
77 | 78 | name = "test-cfg"; |
|
78 | 79 | unpackPhase = "true"; |
|
79 | 80 | buildInputs = [ |
|
80 | 81 | enterprise.src |
|
81 | 82 | ]; |
|
82 | 83 | installPhase = '' |
|
83 | 84 | mkdir -p $out/etc |
|
84 | 85 | cp ${enterprise.src}/test.ini $out/etc/enterprise.ini |
|
85 | 86 | # TODO: johbo: Needed, so that the login works, this causes |
|
86 | 87 | # probably some side effects |
|
87 | 88 | substituteInPlace $out/etc/enterprise.ini --replace "is_test = True" "" |
|
88 | 89 | |
|
89 | 90 | # Gevent configuration |
|
90 | 91 | cp $out/etc/enterprise.ini $out/etc/enterprise-gevent.ini; |
|
91 | 92 | cat >> $out/etc/enterprise-gevent.ini <<EOF |
|
92 | 93 | |
|
93 | 94 | [server:main] |
|
94 | 95 | use = egg:gunicorn#main |
|
95 | 96 | worker_class = gevent |
|
96 | 97 | EOF |
|
97 | 98 | |
|
98 | 99 | cp ${enterprise.src}/vcsserver/test.ini $out/etc/vcsserver.ini |
|
99 | 100 | ''; |
|
100 | 101 | }; |
|
101 | 102 | |
|
102 | 103 | ac-test-drv = import ./acceptance_tests { |
|
103 | 104 | withExternals = false; |
|
104 | 105 | }; |
|
105 | 106 | |
|
106 | 107 | # TODO: johbo: Currently abusing buildPythonPackage to make the |
|
107 | 108 | # needed environment for the ac-test tools. |
|
108 | 109 | mkAcTests = { |
|
109 | 110 | # Path to an INI file which will be used to run Enterprise. |
|
110 | 111 | # |
|
111 | 112 | # Intended usage is to provide different configuration files to |
|
112 | 113 | # run the tests against a different configuration. |
|
113 | 114 | enterpriseCfg ? "${test-cfg}/etc/enterprise.ini" |
|
114 | 115 | |
|
115 | 116 | # Path to an INI file which will be used to run the VCSServer. |
|
116 | 117 | , vcsserverCfg ? "${test-cfg}/etc/vcsserver.ini" |
|
117 | 118 | }: pkgs.pythonPackages.buildPythonPackage { |
|
118 | 119 | name = "enterprise-ac-tests"; |
|
119 | 120 | src = ./acceptance_tests; |
|
120 | 121 | |
|
121 | 122 | buildInputs = with pkgs; [ |
|
122 | 123 | curl |
|
123 | 124 | enterprise |
|
124 | 125 | ac-test-drv |
|
125 | 126 | ]; |
|
126 | 127 | |
|
127 | 128 | buildPhase = '' |
|
128 | 129 | cp ${enterpriseCfg} enterprise.ini |
|
129 | 130 | |
|
130 | 131 | echo "Creating a fake home directory" |
|
131 | 132 | mkdir fake-home |
|
132 | 133 | export HOME=$PWD/fake-home |
|
133 | 134 | |
|
134 | 135 | echo "Creating a repository directory" |
|
135 | 136 | mkdir repos |
|
136 | 137 | |
|
137 | 138 | echo "Preparing the database" |
|
138 | 139 | paster setup-rhodecode \ |
|
139 | 140 | --user=admin \ |
|
140 | 141 | --email=admin@example.com \ |
|
141 | 142 | --password=secret \ |
|
142 | 143 | --api-key=9999999999999999999999999999999999999999 \ |
|
143 | 144 | --force-yes \ |
|
144 | 145 | --repos=$PWD/repos \ |
|
145 | 146 | enterprise.ini > /dev/null |
|
146 | 147 | |
|
147 | 148 | echo "Starting rcserver" |
|
148 | 149 | vcsserver --config ${vcsserverCfg} >vcsserver.log 2>&1 & |
|
149 | 150 | rcserver enterprise.ini >rcserver.log 2>&1 & |
|
150 | 151 | |
|
151 | 152 | while ! curl -f -s http://localhost:5000 > /dev/null |
|
152 | 153 | do |
|
153 | 154 | echo "Waiting for server to be ready..." |
|
154 | 155 | sleep 3 |
|
155 | 156 | done |
|
156 | 157 | echo "Webserver is ready." |
|
157 | 158 | |
|
158 | 159 | echo "Starting the test run" |
|
159 | 160 | py.test -c example.ini -vs --maxfail=5 tests |
|
160 | 161 | |
|
161 | 162 | echo "Kill rcserver" |
|
162 | 163 | kill %2 |
|
163 | 164 | kill %1 |
|
164 | 165 | ''; |
|
165 | 166 | |
|
166 | 167 | # TODO: johbo: Use the install phase again once the normal mkDerivation |
|
167 | 168 | # can be used again. |
|
168 | 169 | postInstall = '' |
|
169 | 170 | mkdir -p $out |
|
170 | 171 | cp enterprise.ini $out |
|
171 | 172 | cp ${vcsserverCfg} $out/vcsserver.ini |
|
172 | 173 | cp rcserver.log $out |
|
173 | 174 | cp vcsserver.log $out |
|
174 | 175 | |
|
175 | 176 | mkdir -p $out/nix-support |
|
176 | 177 | echo "report config $out enterprise.ini" >> $out/nix-support/hydra-build-products |
|
177 | 178 | echo "report config $out vcsserver.ini" >> $out/nix-support/hydra-build-products |
|
178 | 179 | echo "report rcserver $out rcserver.log" >> $out/nix-support/hydra-build-products |
|
179 | 180 | echo "report vcsserver $out vcsserver.log" >> $out/nix-support/hydra-build-products |
|
180 | 181 | ''; |
|
181 | 182 | }; |
|
182 | 183 | |
|
183 | 184 | vcsserver = import <vcsserver> { |
|
184 | 185 | inherit pkgs; |
|
185 | 186 | |
|
186 | 187 | # TODO: johbo: Think of a more elegant solution to this problem |
|
187 | 188 | pythonExternalOverrides = self: super: (enterprise.myPythonPackagesUnfix self); |
|
188 | 189 | }; |
|
189 | 190 | |
|
190 | 191 | runTests = optionString: (enterprise.override (attrs: { |
|
191 | 192 | doCheck = true; |
|
192 | 193 | name = "test-run"; |
|
193 | 194 | buildInputs = attrs.buildInputs ++ [ |
|
194 | 195 | vcsserver |
|
195 | 196 | ]; |
|
196 | 197 | checkPhase = '' |
|
197 | 198 | py.test ${optionString} -vv -ra |
|
198 | 199 | ''; |
|
199 | 200 | buildPhase = attrs.shellHook; |
|
200 | 201 | installPhase = '' |
|
201 | 202 | echo "Intentionally not installing anything" |
|
202 | 203 | ''; |
|
203 | 204 | meta.description = "Enterprise test run ${optionString}"; |
|
204 | 205 | })); |
|
205 | 206 | |
|
206 | 207 | jobs = { |
|
207 | 208 | |
|
208 | 209 | build = enterprise; |
|
209 | 210 | |
|
210 | 211 | # johbo: Currently this is simply running the tests against the sources. Nicer |
|
211 | 212 | # would be to run xdist and against the installed application, so that we also |
|
212 | 213 | # cover the impact of installing the application. |
|
213 | 214 | test-api = runTests "rhodecode/api"; |
|
214 | 215 | test-functional = runTests "rhodecode/tests/functional"; |
|
215 | 216 | test-rest = runTests "rhodecode/tests --ignore=rhodecode/tests/functional"; |
|
216 | 217 | test-full = runTests "rhodecode"; |
|
217 | 218 | |
|
218 | 219 | docs = mkDocs "docs"; |
|
219 | 220 | |
|
220 | 221 | aggregate = pkgs.releaseTools.aggregate { |
|
221 | 222 | name = "aggregated-jobs"; |
|
222 | 223 | constituents = [ |
|
223 | 224 | jobs.build |
|
224 | 225 | jobs.test-api |
|
225 | 226 | jobs.test-rest |
|
226 | 227 | jobs.docs |
|
227 | 228 | ]; |
|
228 | 229 | }; |
|
229 | 230 | }; |
|
230 | 231 | |
|
231 | 232 | in jobs |
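The release.nix diff above adds a `doCheck ? true` argument to the file's input set. Assuming the standard `nix-build` flags (`-A`, `-I`, `--arg`) are used as the file's own comments describe, local runs might look like:

```
# Run one CI job locally, pointing at a local vcsserver clone
# (the path is the example from the file's comments):
nix-build release.nix -A test-api -I vcsserver=~/rhodecode-vcsserver

# Build the packaging input while skipping the test suite,
# via the newly added doCheck argument:
nix-build release.nix -A build --arg doCheck false
```

`--arg doCheck false` passes a Nix boolean, overriding the `doCheck ? true` default declared in the argument set.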
@@ -1,151 +1,150 @@
|
1 | 1 | Babel==1.3 |
|
2 | 2 | Beaker==1.7.0 |
|
3 | 3 | CProfileV==1.0.6 |
|
4 | 4 | Fabric==1.10.0 |
|
5 | 5 | FormEncode==1.2.4 |
|
6 | 6 | Jinja2==2.7.3 |
|
7 | 7 | Mako==1.0.1 |
|
8 | 8 | Markdown==2.6.2 |
|
9 | 9 | MarkupSafe==0.23 |
|
10 | 10 | MySQL-python==1.2.5 |
|
11 | 11 | Paste==2.0.2 |
|
12 | 12 | PasteDeploy==1.5.2 |
|
13 | 13 | PasteScript==1.7.5 |
|
14 | 14 | Pygments==2.0.2 |
|
15 | 15 | |
|
16 | 16 | # TODO: This version is not available on PyPI |
|
17 | 17 | # Pylons==1.0.2.dev20160108 |
|
18 | 18 | Pylons==1.0.1 |
|
19 | 19 | |
|
20 | 20 | # TODO: This version is not available, but newer ones are |
|
21 | 21 | # Pyro4==4.35 |
|
22 | 22 | Pyro4==4.41 |
|
23 | 23 | |
|
24 | 24 | # TODO: This should probably not be in here |
|
25 | 25 | # -e hg+https://johbo@code.rhodecode.com/johbo/rhodecode-fork@3a454bd1f17c0b2b2a951cf2b111e0320d7942a9#egg=RhodeCodeEnterprise-dev |
|
26 | 26 | |
|
27 | 27 | # TODO: This is not really a dependency, we should add it only |
|
28 | 28 | # into the development environment, since there it is useful. |
|
29 | 29 | # RhodeCodeVCSServer==3.9.0 |
|
30 | 30 | |
|
31 | 31 | Routes==1.13 |
|
32 | 32 | SQLAlchemy==0.9.9 |
|
33 | 33 | Sphinx==1.2.2 |
|
34 | 34 | Tempita==0.5.2 |
|
35 | 35 | URLObject==2.4.0 |
|
36 | 36 | WebError==0.10.3 |
|
37 | 37 | |
|
38 | 38 | # TODO: This is modified by us, needs a better integration. For now |
|
39 | 39 | # using the latest version before. |
|
40 | 40 | # WebHelpers==1.3.dev20150807 |
|
41 | 41 | WebHelpers==1.3 |
|
42 | 42 | |
|
43 | 43 | WebHelpers2==2.0 |
|
44 | 44 | WebOb==1.3.1 |
|
45 | 45 | WebTest==1.4.3 |
|
46 | 46 | Whoosh==2.7.0 |
|
47 | 47 | alembic==0.8.4 |
|
48 | 48 | amqplib==1.0.2 |
|
49 | 49 | anyjson==0.3.3 |
|
50 | 50 | appenlight-client==0.6.14 |
|
51 | 51 | authomatic==0.1.0.post1; |
|
52 | 52 | backport-ipaddress==0.1 |
|
53 | 53 | bottle==0.12.8 |
|
54 | 54 | bumpversion==0.5.3 |
|
55 | 55 | celery==2.2.10 |
|
56 | 56 | click==5.1 |
|
57 | 57 | colander==1.2 |
|
58 | 58 | configobj==5.0.6 |
|
59 | 59 | cov-core==1.15.0 |
|
60 | 60 | coverage==3.7.1 |
|
61 | 61 | cssselect==0.9.1 |
|
62 | 62 | decorator==3.4.2 |
|
63 | 63 | docutils==0.12 |
|
64 | 64 | dogpile.cache==0.5.7 |
|
65 | 65 | dogpile.core==0.4.1 |
|
66 | 66 | dulwich==0.12.0 |
|
67 | 67 | ecdsa==0.11 |
|
68 | 68 | flake8==2.4.1 |
|
69 | 69 | future==0.14.3 |
|
70 | 70 | futures==3.0.2 |
|
71 | 71 | gprof2dot==2015.12.1 |
|
72 | greenlet==0.4.9 | |
|
73 | 72 | gunicorn==19.6.0 |
|
74 | 73 | |
|
75 | 74 | # TODO: Needs subvertpy and blows up without Subversion headers, |
|
76 | 75 | # actually we should not need this for Enterprise at all. |
|
77 | 76 | # hgsubversion==1.8.2 |
|
78 | 77 | |
|
79 | 78 | gnureadline==6.3.3 |
|
80 | 79 | infrae.cache==1.0.1 |
|
81 | invoke==0.1 |
|
80 | invoke==0.13.0 | |
|
82 | 81 | ipdb==0.8 |
|
83 | 82 | ipython==3.1.0 |
|
84 | 83 | iso8601==0.1.11 |
|
85 | 84 | itsdangerous==0.24 |
|
86 | 85 | kombu==1.5.1 |
|
87 | 86 | lxml==3.4.4 |
|
88 | 87 | mccabe==0.3 |
|
89 | 88 | meld3==1.0.2 |
|
90 | 89 | mock==1.0.1 |
|
91 | 90 | msgpack-python==0.4.6 |
|
92 | 91 | nose==1.3.6 |
|
93 | 92 | objgraph==2.0.0 |
|
94 | 93 | packaging==15.2 |
|
95 | 94 | paramiko==1.15.1 |
|
96 | 95 | pep8==1.5.7 |
|
97 | 96 | psutil==2.2.1 |
|
98 | 97 | psycopg2==2.6 |
|
99 | 98 | py==1.4.29 |
|
100 | 99 | py-bcrypt==0.4 |
|
101 | 100 | pycrypto==2.6.1 |
|
102 | 101 | pycurl==7.19.5 |
|
103 | 102 | pyflakes==0.8.1 |
|
104 | 103 | pyparsing==1.5.7 |
|
105 | 104 | pyramid==1.6.1 |
|
106 | 105 | pyramid-beaker==0.8 |
|
107 | 106 | pyramid-debugtoolbar==2.4.2 |
|
108 | 107 | pyramid-jinja2==2.5 |
|
109 | 108 | pyramid-mako==1.0.2 |
|
110 | 109 | pysqlite==2.6.3 |
|
111 | 110 | pytest==2.8.5 |
|
112 | 111 | pytest-runner==2.7.1 |
|
113 | 112 | pytest-catchlog==1.2.2 |
|
114 | 113 | pytest-cov==1.8.1 |
|
115 | 114 | pytest-profiling==1.0.1 |
|
116 | 115 | pytest-timeout==0.4 |
|
117 | 116 | python-dateutil==1.5 |
|
118 | 117 | python-ldap==2.4.19 |
|
119 | 118 | python-memcached==1.57 |
|
120 | 119 | python-pam==1.8.2 |
|
121 | 120 | pytz==2015.4 |
|
122 | 121 | pyzmq==14.6.0 |
|
123 | 122 | |
|
124 | 123 | # TODO: This is not available in public |
|
125 | 124 | # rc-testdata==0.2.0 |
|
126 | 125 | |
|
127 | 126 | https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.8.3.zip#md5=9acdfd71b8ddf4056057065f37ab9ccb |
|
128 | 127 | |
|
129 | 128 | |
|
130 | 129 | recaptcha-client==1.0.6 |
|
131 | 130 | repoze.lru==0.6 |
|
132 | 131 | requests==2.9.1 |
|
133 | 132 | serpent==1.12 |
|
134 | 133 | setproctitle==1.1.8 |
|
135 | 134 | setuptools==20.8.1 |
|
136 | 135 | setuptools-scm==1.11.0 |
|
137 | 136 | simplejson==3.7.2 |
|
138 | 137 | six==1.9.0 |
|
139 | 138 | subprocess32==3.2.6 |
|
140 | 139 | supervisor==3.1.3 |
|
141 | 140 | transifex-client==0.10 |
|
142 | 141 | translationstring==1.3 |
|
143 | 142 | trollius==1.0.4 |
|
144 | 143 | uWSGI==2.0.11.2 |
|
145 | 144 | venusian==1.0 |
|
146 | 145 | waitress==0.8.9 |
|
147 | 146 | wsgiref==0.1.2 |
|
148 | 147 | zope.cachedescriptors==4.0.0 |
|
149 | 148 | zope.deprecation==4.1.2 |
|
150 | 149 | zope.event==4.0.3 |
|
151 | 150 | zope.interface==4.1.3 |
@@ -1,609 +1,615 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | Authentication modules |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | import colander | |
|
25 | 26 | import logging |
|
26 | 27 | import time |
|
27 | 28 | import traceback |
|
28 | 29 | import warnings |
|
29 | 30 | |
|
30 | 31 | from pyramid.threadlocal import get_current_registry |
|
31 | 32 | from sqlalchemy.ext.hybrid import hybrid_property |
|
32 | 33 | |
|
33 | 34 | from rhodecode.authentication.interface import IAuthnPluginRegistry |
|
34 | 35 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase |
|
35 | 36 | from rhodecode.lib import caches |
|
36 | 37 | from rhodecode.lib.auth import PasswordGenerator, _RhodeCodeCryptoBCrypt |
|
37 | 38 | from rhodecode.lib.utils2 import md5_safe, safe_int |
|
38 | 39 | from rhodecode.lib.utils2 import safe_str |
|
39 | 40 | from rhodecode.model.db import User |
|
40 | 41 | from rhodecode.model.meta import Session |
|
41 | 42 | from rhodecode.model.settings import SettingsModel |
|
42 | 43 | from rhodecode.model.user import UserModel |
|
43 | 44 | from rhodecode.model.user_group import UserGroupModel |
|
44 | 45 | |
|
45 | 46 | |
|
46 | 47 | log = logging.getLogger(__name__) |
|
47 | 48 | |
|
48 | 49 | # auth types that authenticate() function can receive |
|
49 | 50 | VCS_TYPE = 'vcs' |
|
50 | 51 | HTTP_TYPE = 'http' |
|
51 | 52 | |
|
52 | 53 | |
|
53 | 54 | class LazyFormencode(object): |
|
54 | 55 | def __init__(self, formencode_obj, *args, **kwargs): |
|
55 | 56 | self.formencode_obj = formencode_obj |
|
56 | 57 | self.args = args |
|
57 | 58 | self.kwargs = kwargs |
|
58 | 59 | |
|
59 | 60 | def __call__(self, *args, **kwargs): |
|
60 | 61 | from inspect import isfunction |
|
61 | 62 | formencode_obj = self.formencode_obj |
|
62 | 63 | if isfunction(formencode_obj): |
|
63 | 64 | # case we wrap validators into functions |
|
64 | 65 | formencode_obj = self.formencode_obj(*args, **kwargs) |
|
65 | 66 | return formencode_obj(*self.args, **self.kwargs) |
|
66 | 67 | |
|
67 | 68 | |
|
68 | 69 | class RhodeCodeAuthPluginBase(object): |
|
69 | 70 | # cache the authentication request for N amount of seconds. Some kind |
|
70 | 71 | # of authentication methods are very heavy and it's very efficient to cache |
|
71 | 72 | # the result of a call. If it's set to None (default) cache is off |
|
72 | 73 | AUTH_CACHE_TTL = None |
|
73 | 74 | AUTH_CACHE = {} |
|
74 | 75 | |
|
75 | 76 | auth_func_attrs = { |
|
76 | 77 | "username": "unique username", |
|
77 | 78 | "firstname": "first name", |
|
78 | 79 | "lastname": "last name", |
|
79 | 80 | "email": "email address", |
|
80 | 81 | "groups": '["list", "of", "groups"]', |
|
81 | 82 | "extern_name": "name in external source of record", |
|
82 | 83 | "extern_type": "type of external source of record", |
|
83 | 84 | "admin": 'True|False defines if user should be RhodeCode super admin', |
|
84 | 85 | "active": |
|
85 | 86 | 'True|False defines active state of user internally for RhodeCode', |
|
86 | 87 | "active_from_extern": |
|
87 | 88 | "True|False\None, active state from the external auth, " |
|
88 | 89 | "None means use definition from RhodeCode extern_type active value" |
|
89 | 90 | } |
|
90 | 91 | # set on authenticate() method and via set_auth_type func. |
|
91 | 92 | auth_type = None |
|
92 | 93 | |
|
93 | 94 | # List of setting names to store encrypted. Plugins may override this list |
|
94 | 95 | # to store settings encrypted. |
|
95 | 96 | _settings_encrypted = [] |
|
96 | 97 | |
|
97 | 98 | # Mapping of python to DB settings model types. Plugins may override or |
|
98 | 99 | # extend this mapping. |
|
99 | 100 | _settings_type_map = { |
|
100 | str: 'str', | |
|
101 | unicode: 'unicode', |
|
102 | int: 'int', |
|
103 | bool: 'bool', |
|
104 | list: 'list', | |
|
101 | colander.String: 'unicode', | |
|
102 | colander.Integer: 'int', | |
|
103 | colander.Boolean: 'bool', | |
|
104 | colander.List: 'list', | |
|
105 | 105 | } |
|
106 | 106 | |
|
107 | 107 | def __init__(self, plugin_id): |
|
108 | 108 | self._plugin_id = plugin_id |
|
109 | 109 | |
|
110 | def __str__(self): | |
|
111 | return self.get_id() | |
|
112 | ||
|
110 | 113 | def _get_setting_full_name(self, name): |
|
111 | 114 | """ |
|
112 | 115 | Return the full setting name used for storing values in the database. |
|
113 | 116 | """ |
|
114 | 117 | # TODO: johbo: Using the name here is problematic. It would be good to |
|
115 | 118 | # introduce either new models in the database to hold Plugin and |
|
116 | 119 | # PluginSetting or to use the plugin id here. |
|
117 | 120 | return 'auth_{}_{}'.format(self.name, name) |
|
118 | 121 | |
|
119 | def _get_setting_type(self, name, value): |
|
122 | def _get_setting_type(self, name): | |
|
123 | """ | |
|
124 | Return the type of a setting. This type is defined by the SettingsModel | |
|
125 | and determines how the setting is stored in DB. Optionally the suffix | |
|
126 | `.encrypted` is appended to instruct SettingsModel to store it | |
|
127 | encrypted. | |
|
120 | 128 | |
|
121 | Get the type as used by the SettingsModel accordingly to type of passed | |
|
122 | value. Optionally the suffix `.encrypted` is appended to instruct | |
|
123 | SettingsModel to store it encrypted. | |
|
124 | """ | |
|
125 | type_ = self._settings_type_map.get(type(value), 'unicode') | |
|
129 | schema_node = self.get_settings_schema().get(name) | |
|
130 | db_type = self._settings_type_map.get( | |
|
131 | type(schema_node.typ), 'unicode') | |
|
126 | 132 | if name in self._settings_encrypted: |
|
127 | type_ = '{}.encrypted'.format(type_) |
|
128 | return type_ |
|
133 | db_type = '{}.encrypted'.format(db_type) | |
|
134 | return db_type | |
|
129 | 135 | |
|
130 | 136 | def is_enabled(self): |
|
131 | 137 | """ |
|
132 | 138 | Returns true if this plugin is enabled. An enabled plugin can be |
|
133 | 139 | configured in the admin interface but it is not consulted during |
|
134 | 140 | authentication. |
|
135 | 141 | """ |
|
136 | 142 | auth_plugins = SettingsModel().get_auth_plugins() |
|
137 | 143 | return self.get_id() in auth_plugins |
|
138 | 144 | |
|
139 | 145 | def is_active(self): |
|
140 | 146 | """ |
|
141 | 147 | Returns true if the plugin is activated. An activated plugin is |
|
142 | 148 | consulted during authentication, assumed it is also enabled. |
|
143 | 149 | """ |
|
144 | 150 | return self.get_setting_by_name('enabled') |
|
145 | 151 | |
|
146 | 152 | def get_id(self): |
|
147 | 153 | """ |
|
148 | 154 | Returns the plugin id. |
|
149 | 155 | """ |
|
150 | 156 | return self._plugin_id |
|
151 | 157 | |
|
152 | 158 | def get_display_name(self): |
|
153 | 159 | """ |
|
154 | 160 | Returns a translation string for displaying purposes. |
|
155 | 161 | """ |
|
156 | 162 | raise NotImplementedError('Not implemented in base class') |
|
157 | 163 | |
|
158 | 164 | def get_settings_schema(self): |
|
159 | 165 | """ |
|
160 | 166 | Returns a colander schema, representing the plugin settings. |
|
161 | 167 | """ |
|
162 | 168 | return AuthnPluginSettingsSchemaBase() |
|
163 | 169 | |
|
164 | def get_setting_by_name(self, name): | |
|
170 | def get_setting_by_name(self, name, default=None): | |
|
165 | 171 | """ |
|
166 | 172 | Returns a plugin setting by name. |
|
167 | 173 | """ |
|
168 | 174 | full_name = self._get_setting_full_name(name) |
|
169 | 175 | db_setting = SettingsModel().get_setting_by_name(full_name) |
|
170 | return db_setting.app_settings_value if db_setting else None |
|
176 | return db_setting.app_settings_value if db_setting else default | |
|
171 | 177 | |
|
172 | 178 | def create_or_update_setting(self, name, value): |
|
173 | 179 | """ |
|
174 | 180 | Create or update a setting for this plugin in the persistent storage. |
|
175 | 181 | """ |
|
176 | 182 | full_name = self._get_setting_full_name(name) |
|
177 | type_ = self._get_setting_type(name, value) |
|
183 | type_ = self._get_setting_type(name) | |
|
178 | 184 | db_setting = SettingsModel().create_or_update_setting( |
|
179 | 185 | full_name, value, type_) |
|
180 | 186 | return db_setting.app_settings_value |
|
181 | 187 | |
|
182 | 188 | def get_settings(self): |
|
183 | 189 | """ |
|
184 | 190 | Returns the plugin settings as dictionary. |
|
185 | 191 | """ |
|
186 | 192 | settings = {} |
|
187 | 193 | for node in self.get_settings_schema(): |
|
188 | 194 | settings[node.name] = self.get_setting_by_name(node.name) |
|
189 | 195 | return settings |
|
190 | 196 | |
|
191 | 197 | @property |
|
192 | 198 | def validators(self): |
|
193 | 199 | """ |
|
194 | 200 | Exposes RhodeCode validators modules |
|
195 | 201 | """ |
|
196 | 202 | # this is a hack to overcome issues with pylons threadlocals and |
|
197 | 203 | # translator object _() not beein registered properly. |
|
198 | 204 | class LazyCaller(object): |
|
199 | 205 | def __init__(self, name): |
|
200 | 206 | self.validator_name = name |
|
201 | 207 | |
|
202 | 208 | def __call__(self, *args, **kwargs): |
|
203 | 209 | from rhodecode.model import validators as v |
|
204 | 210 | obj = getattr(v, self.validator_name) |
|
205 | 211 | # log.debug('Initializing lazy formencode object: %s', obj) |
|
206 | 212 | return LazyFormencode(obj, *args, **kwargs) |
|
207 | 213 | |
|
208 | 214 | class ProxyGet(object): |
|
209 | 215 | def __getattribute__(self, name): |
|
210 | 216 | return LazyCaller(name) |
|
211 | 217 | |
|
212 | 218 | return ProxyGet() |
|
213 | 219 | |
|
214 | 220 | @hybrid_property |
|
215 | 221 | def name(self): |
|
216 | 222 | """ |
|
217 | 223 | Returns the name of this authentication plugin. |
|
218 | 224 | |
|
219 | 225 | :returns: string |
|
220 | 226 | """ |
|
221 | 227 | raise NotImplementedError("Not implemented in base class") |
|
222 | 228 | |
|
223 | 229 | @property |
|
224 | 230 | def is_headers_auth(self): |
|
225 | 231 | """ |
|
226 | 232 | Returns True if this authentication plugin uses HTTP headers as |
|
227 | 233 | authentication method. |
|
228 | 234 | """ |
|
229 | 235 | return False |
|
230 | 236 | |
|
231 | 237 | @hybrid_property |
|
232 | 238 | def is_container_auth(self): |
|
233 | 239 | """ |
|
234 | 240 | Deprecated method that indicates if this authentication plugin uses |
|
235 | 241 | HTTP headers as authentication method. |
|
236 | 242 | """ |
|
237 | 243 | warnings.warn( |
|
238 | 244 | 'Use is_headers_auth instead.', category=DeprecationWarning) |
|
239 | 245 | return self.is_headers_auth |
|
240 | 246 | |
|
241 | 247 | @hybrid_property |
|
242 | 248 | def allows_creating_users(self): |
|
243 | 249 | """ |
|
244 | 250 | Defines if Plugin allows users to be created on-the-fly when |
|
245 | 251 | authentication is called. Controls how external plugins should behave |
|
246 | 252 | in terms if they are allowed to create new users, or not. Base plugins |
|
247 | 253 | should not be allowed to, but External ones should be ! |
|
248 | 254 | |
|
249 | 255 | :return: bool |
|
250 | 256 | """ |
|
251 | 257 | return False |
|
252 | 258 | |
|
253 | 259 | def set_auth_type(self, auth_type): |
|
254 | 260 | self.auth_type = auth_type |
|
255 | 261 | |
|
256 | 262 | def allows_authentication_from( |
|
257 | 263 | self, user, allows_non_existing_user=True, |
|
258 | 264 | allowed_auth_plugins=None, allowed_auth_sources=None): |
|
259 | 265 | """ |
|
260 | 266 | Checks if this authentication module should accept a request for |
|
261 | 267 | the current user. |
|
262 | 268 | |
|
263 | 269 | :param user: user object fetched using plugin's get_user() method. |
|
264 | 270 | :param allows_non_existing_user: if True, don't allow the |
|
265 | 271 | user to be empty, meaning not existing in our database |
|
266 | 272 | :param allowed_auth_plugins: if provided, users extern_type will be |
|
267 | 273 | checked against a list of provided extern types, which are plugin |
|
268 | 274 | auth_names in the end |
|
269 | 275 | :param allowed_auth_sources: authentication type allowed, |
|
270 | 276 | `http` or `vcs` default is both. |
|
271 | 277 | defines if plugin will accept only http authentication vcs |
|
272 | 278 | authentication(git/hg) or both |
|
273 | 279 | :returns: boolean |
|
274 | 280 | """ |
|
275 | 281 | if not user and not allows_non_existing_user: |
|
276 | 282 | log.debug('User is empty but plugin does not allow empty users,' |
|
277 | 283 | 'not allowed to authenticate') |
|
278 | 284 | return False |
|
279 | 285 | |
|
280 | 286 | expected_auth_plugins = allowed_auth_plugins or [self.name] |
|
281 | 287 | if user and (user.extern_type and |
|
282 | 288 | user.extern_type not in expected_auth_plugins): |
|
283 | 289 | log.debug( |
|
284 | 290 | 'User `%s` is bound to `%s` auth type. Plugin allows only ' |
|
285 | 291 | '%s, skipping', user, user.extern_type, expected_auth_plugins) |
|
286 | 292 | |
|
287 | 293 | return False |
|
288 | 294 | |
|
289 | 295 | # by default accept both |
|
290 | 296 | expected_auth_from = allowed_auth_sources or [HTTP_TYPE, VCS_TYPE] |
|
291 | 297 | if self.auth_type not in expected_auth_from: |
|
292 | 298 | log.debug('Current auth source is %s but plugin only allows %s', |
|
293 | 299 | self.auth_type, expected_auth_from) |
|
294 | 300 | return False |
|
295 | 301 | |
|
296 | 302 | return True |
|
297 | 303 | |
|
298 | 304 | def get_user(self, username=None, **kwargs): |
|
299 | 305 | """ |
|
300 | 306 | Helper method for user fetching in plugins, by default it's using |
|
301 | 307 | simple fetch by username, but this method can be custimized in plugins |
|
302 | 308 | eg. headers auth plugin to fetch user by environ params |
|
303 | 309 | |
|
304 | 310 | :param username: username if given to fetch from database |
|
305 | 311 | :param kwargs: extra arguments needed for user fetching. |
|
306 | 312 | """ |
|
307 | 313 | user = None |
|
308 | 314 | log.debug( |
|
309 | 315 | 'Trying to fetch user `%s` from RhodeCode database', username) |
|
310 | 316 | if username: |
|
311 | 317 | user = User.get_by_username(username) |
|
312 | 318 | if not user: |
|
313 | 319 | log.debug('User not found, fallback to fetch user in ' |
|
314 | 320 | 'case insensitive mode') |
|
315 | 321 | user = User.get_by_username(username, case_insensitive=True) |
|
316 | 322 | else: |
|
317 | 323 | log.debug('provided username:`%s` is empty skipping...', username) |
|
318 | 324 | if not user: |
|
319 | 325 | log.debug('User `%s` not found in database', username) |
|
320 | 326 | return user |
|
321 | 327 | |
|
322 | 328 | def user_activation_state(self): |
|
323 | 329 | """ |
|
324 | 330 | Defines user activation state when creating new users |
|
325 | 331 | |
|
326 | 332 | :returns: boolean |
|
327 | 333 | """ |
|
328 | 334 | raise NotImplementedError("Not implemented in base class") |
|
329 | 335 | |
|
330 | 336 | def auth(self, userobj, username, passwd, settings, **kwargs): |
|
331 | 337 | """ |
|
332 | 338 | Given a user object (which may be null), username, a plaintext |
|
333 | 339 | password, and a settings object (containing all the keys needed as |
|
334 | 340 | listed in settings()), authenticate this user's login attempt. |
|
335 | 341 | |
|
336 | 342 | Return None on failure. On success, return a dictionary of the form: |
|
337 | 343 | |
|
338 | 344 | see: RhodeCodeAuthPluginBase.auth_func_attrs |
|
339 | 345 | This is later validated for correctness |
|
340 | 346 | """ |
|
341 | 347 | raise NotImplementedError("not implemented in base class") |
|
342 | 348 | |
|
343 | 349 | def _authenticate(self, userobj, username, passwd, settings, **kwargs): |
|
344 | 350 | """ |
|
345 | 351 | Wrapper to call self.auth() that validates call on it |
|
346 | 352 | |
|
347 | 353 | :param userobj: userobj |
|
348 | 354 | :param username: username |
|
349 | 355 | :param passwd: plaintext password |
|
350 | 356 | :param settings: plugin settings |
|
351 | 357 | """ |
|
352 | 358 | auth = self.auth(userobj, username, passwd, settings, **kwargs) |
|
353 | 359 | if auth: |
|
354 | 360 | # check if hash should be migrated ? |
|
355 | 361 | new_hash = auth.get('_hash_migrate') |
|
356 | 362 | if new_hash: |
|
357 | 363 | self._migrate_hash_to_bcrypt(username, passwd, new_hash) |
|
358 | 364 | return self._validate_auth_return(auth) |
|
359 | 365 | return auth |
|
360 | 366 | |
|
361 | 367 | def _migrate_hash_to_bcrypt(self, username, password, new_hash): |
|
362 | 368 | new_hash_cypher = _RhodeCodeCryptoBCrypt() |
|
363 | 369 | # extra checks, so make sure new hash is correct. |
|
364 | 370 | password_encoded = safe_str(password) |
|
365 | 371 | if new_hash and new_hash_cypher.hash_check( |
|
366 | 372 | password_encoded, new_hash): |
|
367 | 373 | cur_user = User.get_by_username(username) |
|
368 | 374 | cur_user.password = new_hash |
|
369 | 375 | Session().add(cur_user) |
|
370 | 376 | Session().flush() |
|
371 | 377 | log.info('Migrated user %s hash to bcrypt', cur_user) |
|
372 | 378 | |
|
373 | 379 | def _validate_auth_return(self, ret): |
|
374 | 380 | if not isinstance(ret, dict): |
|
375 | 381 | raise Exception('returned value from auth must be a dict') |
|
376 | 382 | for k in self.auth_func_attrs: |
|
377 | 383 | if k not in ret: |
|
378 | 384 | raise Exception('Missing %s attribute from returned data' % k) |
|
379 | 385 | return ret |
|
380 | 386 | |
|
381 | 387 | |
|
382 | 388 | class RhodeCodeExternalAuthPlugin(RhodeCodeAuthPluginBase): |
|
383 | 389 | |
|
384 | 390 | @hybrid_property |
|
385 | 391 | def allows_creating_users(self): |
|
386 | 392 | return True |
|
387 | 393 | |
|
388 | 394 | def use_fake_password(self): |
|
389 | 395 | """ |
|
390 | 396 | Return a boolean that indicates whether or not we should set the user's |
|
391 | 397 | password to a random value when it is authenticated by this plugin. |
|
392 | 398 | If your plugin provides authentication, then you will generally |
|
393 | 399 | want this. |
|
394 | 400 | |
|
395 | 401 | :returns: boolean |
|
396 | 402 | """ |
|
397 | 403 | raise NotImplementedError("Not implemented in base class") |
|
398 | 404 | |
|
399 | 405 | def _authenticate(self, userobj, username, passwd, settings, **kwargs): |
|
400 | 406 | # at this point _authenticate calls plugin's `auth()` function |
|
401 | 407 | auth = super(RhodeCodeExternalAuthPlugin, self)._authenticate( |
|
402 | 408 | userobj, username, passwd, settings, **kwargs) |
|
403 | 409 | if auth: |
|
404 | 410 | # maybe plugin will clean the username ? |
|
405 | 411 | # we should use the return value |
|
406 | 412 | username = auth['username'] |
|
407 | 413 | |
|
408 | 414 | # if external source tells us that user is not active, we should |
|
409 | 415 | # skip rest of the process. This can prevent from creating users in |
|
410 | 416 | # RhodeCode when using external authentication, but if it's |
|
411 | 417 | # inactive user we shouldn't create that user anyway |
|
412 | 418 | if auth['active_from_extern'] is False: |
|
413 | 419 | log.warning( |
|
414 | 420 | "User %s authenticated against %s, but is inactive", |
|
415 | 421 | username, self.__module__) |
|
416 | 422 | return None |
|
417 | 423 | |
|
418 | 424 | cur_user = User.get_by_username(username, case_insensitive=True) |
|
419 | 425 | is_user_existing = cur_user is not None |
|
420 | 426 | |
|
421 | 427 | if is_user_existing: |
|
422 | 428 | log.debug('Syncing user `%s` from ' |
|
423 | 429 | '`%s` plugin', username, self.name) |
|
424 | 430 | else: |
|
425 | 431 | log.debug('Creating non existing user `%s` from ' |
|
426 | 432 | '`%s` plugin', username, self.name) |
|
427 | 433 | |
|
428 | 434 | if self.allows_creating_users: |
|
429 | 435 | log.debug('Plugin `%s` allows to ' |
|
430 | 436 | 'create new users', self.name) |
|
431 | 437 | else: |
|
432 | 438 | log.debug('Plugin `%s` does not allow to ' |
|
433 | 439 | 'create new users', self.name) |
|
434 | 440 | |
|
435 | 441 | user_parameters = { |
|
436 | 442 | 'username': username, |
|
437 | 443 | 'email': auth["email"], |
|
438 | 444 | 'firstname': auth["firstname"], |
|
439 | 445 | 'lastname': auth["lastname"], |
|
440 | 446 | 'active': auth["active"], |
|
441 | 447 | 'admin': auth["admin"], |
|
442 | 448 | 'extern_name': auth["extern_name"], |
|
443 | 449 | 'extern_type': self.name, |
|
444 | 450 | 'plugin': self, |
|
445 | 451 | 'allow_to_create_user': self.allows_creating_users, |
|
446 | 452 | } |
|
447 | 453 | |
|
448 | 454 | if not is_user_existing: |
|
449 | 455 | if self.use_fake_password(): |
|
450 | 456 | # Randomize the PW because we don't need it, but don't want |
|
451 | 457 | # them blank either |
|
452 | 458 | passwd = PasswordGenerator().gen_password(length=16) |
|
453 | 459 | user_parameters['password'] = passwd |
|
454 | 460 | else: |
|
455 | 461 | # Since the password is required by create_or_update method of |
|
456 | 462 | # UserModel, we need to set it explicitly. |
|
457 | 463 | # The create_or_update method is smart and recognises the |
|
458 | 464 | # password hashes as well. |
|
459 | 465 | user_parameters['password'] = cur_user.password |
|
460 | 466 | |
|
461 | 467 | # we either create or update users, we also pass the flag |
|
462 | 468 | # that controls if this method can actually do that. |
|
463 | 469 | # raises NotAllowedToCreateUserError if it cannot, and we try to. |
|
464 | 470 | user = UserModel().create_or_update(**user_parameters) |
|
465 | 471 | Session().flush() |
|
466 | 472 | # enforce user is just in given groups, all of them has to be ones |
|
467 | 473 | # created from plugins. We store this info in _group_data JSON |
|
468 | 474 | # field |
|
469 | 475 | try: |
|
470 | 476 | groups = auth['groups'] or [] |
|
471 | 477 | UserGroupModel().enforce_groups(user, groups, self.name) |
|
472 | 478 | except Exception: |
|
473 | 479 | # for any reason group syncing fails, we should |
|
474 | 480 | # proceed with login |
|
475 | 481 | log.error(traceback.format_exc()) |
|
476 | 482 | Session().commit() |
|
477 | 483 | return auth |
|
478 | 484 | |
|
479 | 485 | |
|
480 | 486 | def loadplugin(plugin_id): |
|
481 | 487 | """ |
|
482 | 488 | Loads and returns an instantiated authentication plugin. |
|
483 | 489 | Returns the RhodeCodeAuthPluginBase subclass on success, |
|
484 | 490 | or None on failure. |
|
485 | 491 | """ |
|
486 | 492 | # TODO: Disusing pyramids thread locals to retrieve the registry. |
|
487 | 493 | authn_registry = get_current_registry().getUtility(IAuthnPluginRegistry) |
|
488 | 494 | plugin = authn_registry.get_plugin(plugin_id) |
|
489 | 495 | if plugin is None: |
|
490 | 496 | log.error('Authentication plugin not found: "%s"', plugin_id) |
|
491 | 497 | return plugin |
|
492 | 498 | |
|
493 | 499 | |
|
494 | 500 | def get_auth_cache_manager(custom_ttl=None): |
|
495 | 501 | return caches.get_cache_manager( |
|
496 | 502 | 'auth_plugins', 'rhodecode.authentication', custom_ttl) |
|
497 | 503 | |
|
498 | 504 | |
|
499 | 505 | def authenticate(username, password, environ=None, auth_type=None, |
|
500 | 506 | skip_missing=False): |
|
501 | 507 | """ |
|
502 | 508 | Authentication function used for access control, |
|
503 | 509 | It tries to authenticate based on enabled authentication modules. |
|
504 | 510 | |
|
505 | 511 | :param username: username can be empty for headers auth |
|
506 | 512 | :param password: password can be empty for headers auth |
|
507 | 513 | :param environ: environ headers passed for headers auth |
|
508 | 514 | :param auth_type: type of authentication, either `HTTP_TYPE` or `VCS_TYPE` |
|
509 | 515 | :param skip_missing: ignores plugins that are in db but not in environment |
|
510 | 516 | :returns: None if auth failed, plugin_user dict if auth is correct |
|
511 | 517 | """ |
|
512 | 518 | if not auth_type or auth_type not in [HTTP_TYPE, VCS_TYPE]: |
|
513 | 519 | raise ValueError('auth type must be on of http, vcs got "%s" instead' |
|
514 | 520 | % auth_type) |
|
515 | 521 | headers_only = environ and not (username and password) |
|
516 | 522 | |
|
517 | 523 | authn_registry = get_current_registry().getUtility(IAuthnPluginRegistry) |
|
518 | 524 | for plugin in authn_registry.get_plugins_for_authentication(): |
|
519 | 525 | plugin.set_auth_type(auth_type) |
|
520 | 526 | user = plugin.get_user(username) |
|
521 | 527 | display_user = user.username if user else username |
|
522 | 528 | |
|
523 | 529 | if headers_only and not plugin.is_headers_auth: |
|
524 | 530 | log.debug('Auth type is for headers only and plugin `%s` is not ' |
|
525 | 531 | 'headers plugin, skipping...', plugin.get_id()) |
|
526 | 532 | continue |
|
527 | 533 | |
|
528 | 534 | # load plugin settings from RhodeCode database |
|
529 | 535 | plugin_settings = plugin.get_settings() |
|
530 | 536 | log.debug('Plugin settings:%s', plugin_settings) |
|
531 | 537 | |
|
532 | 538 | log.debug('Trying authentication using ** %s **', plugin.get_id()) |
|
533 | 539 | # use plugin's method of user extraction. |
|
534 | 540 | user = plugin.get_user(username, environ=environ, |
|
535 | 541 | settings=plugin_settings) |
|
536 | 542 | display_user = user.username if user else username |
|
537 | 543 | log.debug( |
|
538 | 544 | 'Plugin %s extracted user is `%s`', plugin.get_id(), display_user) |
|
539 | 545 | |
|
540 | 546 | if not plugin.allows_authentication_from(user): |
|
541 | 547 | log.debug('Plugin %s does not accept user `%s` for authentication', |
|
542 | 548 | plugin.get_id(), display_user) |
|
543 | 549 | continue |
|
544 | 550 | else: |
|
545 | 551 | log.debug('Plugin %s accepted user `%s` for authentication', |
|
546 | 552 | plugin.get_id(), display_user) |
|
547 | 553 | |
|
548 | 554 | log.info('Authenticating user `%s` using %s plugin', |
|
549 | 555 | display_user, plugin.get_id()) |
|
550 | 556 | |
|
551 | 557 | _cache_ttl = 0 |
|
552 | 558 | |
|
553 | 559 | if isinstance(plugin.AUTH_CACHE_TTL, (int, long)): |
|
554 | 560 | # plugin cache set inside is more important than the settings value |
|
555 | 561 | _cache_ttl = plugin.AUTH_CACHE_TTL |
|
556 | 562 | elif plugin_settings.get('auth_cache_ttl'): |
|
557 | 563 | _cache_ttl = safe_int(plugin_settings.get('auth_cache_ttl'), 0) |
|
558 | 564 | |
|
559 | 565 | plugin_cache_active = bool(_cache_ttl and _cache_ttl > 0) |
|
560 | 566 | |
|
561 | 567 | # get instance of cache manager configured for a namespace |
|
562 | 568 | cache_manager = get_auth_cache_manager(custom_ttl=_cache_ttl) |
|
563 | 569 | |
|
564 | 570 | log.debug('Cache for plugin `%s` active: %s', plugin.get_id(), |
|
565 | 571 | plugin_cache_active) |
|
566 | 572 | |
|
567 | 573 | # for environ based password can be empty, but then the validation is |
|
568 | 574 | # on the server that fills in the env data needed for authentication |
|
569 | 575 | _password_hash = md5_safe(plugin.name + username + (password or '')) |
|
570 | 576 | |
|
571 | 577 | # _authenticate is a wrapper for .auth() method of plugin. |
|
572 | 578 | # it checks if .auth() sends proper data. |
|
573 | 579 | # For RhodeCodeExternalAuthPlugin it also maps users to |
|
574 | 580 | # Database and maps the attributes returned from .auth() |
|
575 | 581 | # to RhodeCode database. If this function returns data |
|
576 | 582 | # then auth is correct. |
|
577 | 583 | start = time.time() |
|
578 | 584 | log.debug('Running plugin `%s` _authenticate method', |
|
579 | 585 | plugin.get_id()) |
|
580 | 586 | |
|
581 | 587 | def auth_func(): |
|
582 | 588 | """ |
|
583 | 589 | This function is used internally in Cache of Beaker to calculate |
|
584 | 590 | Results |
|
585 | 591 | """ |
|
586 | 592 | return plugin._authenticate( |
|
587 | 593 | user, username, password, plugin_settings, |
|
588 | 594 | environ=environ or {}) |
|
589 | 595 | |
|
590 | 596 | if plugin_cache_active: |
|
591 | 597 | plugin_user = cache_manager.get( |
|
592 | 598 | _password_hash, createfunc=auth_func) |
|
593 | 599 | else: |
|
594 | 600 | plugin_user = auth_func() |
|
595 | 601 | |
|
596 | 602 | auth_time = time.time() - start |
|
597 | 603 | log.debug('Authentication for plugin `%s` completed in %.3fs, ' |
|
598 | 604 | 'expiration time of fetched cache %.1fs.', |
|
599 | 605 | plugin.get_id(), auth_time, _cache_ttl) |
|
600 | 606 | |
|
601 | 607 | log.debug('PLUGIN USER DATA: %s', plugin_user) |
|
602 | 608 | |
|
603 | 609 | if plugin_user: |
|
604 | 610 | log.debug('Plugin returned proper authentication data') |
|
605 | 611 | return plugin_user |
|
606 | 612 | # we failed to Auth because .auth() method didn't return proper user |
|
607 | 613 | log.debug("User `%s` failed to authenticate against %s", |
|
608 | 614 | display_user, plugin.get_id()) |
|
609 | 615 | return None |
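The hunk above ends the cached-authentication path: the credentials are hashed into `_password_hash`, and the result is fetched through the cache's `createfunc` hook so `plugin._authenticate` only runs on a miss or after the TTL expires. A stdlib-only sketch of that pattern (the Beaker backend is replaced by a plain dict; `_CACHE`, `cache_get`, and the sample plugin id are illustrative, not RhodeCode APIs):

```python
import hashlib
import time

# Illustrative stand-in for the Beaker-backed auth cache used above.
_CACHE = {}

def auth_cache_key(plugin_name, username, password):
    # Mirrors _password_hash above: environ-based auth may pass an
    # empty password, hence the `or ''` fallback.
    raw = plugin_name + username + (password or '')
    return hashlib.md5(raw.encode('utf-8')).hexdigest()

def cache_get(key, createfunc, ttl):
    # Return the cached value if younger than `ttl` seconds,
    # otherwise recompute it via `createfunc` (Beaker's contract).
    now = time.time()
    entry = _CACHE.get(key)
    if entry is not None and now - entry[1] < ttl:
        return entry[0]
    value = createfunc()
    _CACHE[key] = (value, now)
    return value

calls = []

def make_user():
    # Stands in for plugin._authenticate(); counts real auth attempts.
    calls.append(1)
    return {'username': 'bob'}

key = auth_cache_key('egg:example#auth_plugin', 'bob', None)
user1 = cache_get(key, make_user, ttl=30)
user2 = cache_get(key, make_user, ttl=30)
```

Because the password is part of the key, a changed password produces a different key, misses the cache, and forces a fresh `_authenticate` call.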
@@ -1,188 +1,192 @@ |
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2012-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | import colander |
|
22 | 22 | import formencode.htmlfill |
|
23 | 23 | import logging |
|
24 | 24 | |
|
25 | 25 | from pyramid.httpexceptions import HTTPFound |
|
26 | 26 | from pyramid.renderers import render |
|
27 | 27 | from pyramid.response import Response |
|
28 | 28 | |
|
29 | 29 | from rhodecode.authentication.base import get_auth_cache_manager |
|
30 | 30 | from rhodecode.authentication.interface import IAuthnPluginRegistry |
|
31 | 31 | from rhodecode.lib import auth |
|
32 | 32 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator |
|
33 | 33 | from rhodecode.model.forms import AuthSettingsForm |
|
34 | 34 | from rhodecode.model.meta import Session |
|
35 | 35 | from rhodecode.model.settings import SettingsModel |
|
36 | 36 | from rhodecode.translation import _ |
|
37 | 37 | |
|
38 | 38 | log = logging.getLogger(__name__) |
|
39 | 39 | |
|
40 | 40 | |
|
41 | 41 | class AuthnPluginViewBase(object): |
|
42 | 42 | |
|
43 | 43 | def __init__(self, context, request): |
|
44 | 44 | self.request = request |
|
45 | 45 | self.context = context |
|
46 | 46 | self.plugin = context.plugin |
|
47 | 47 | self._rhodecode_user = request.user |
|
48 | 48 | |
|
49 | 49 | @LoginRequired() |
|
50 | 50 | @HasPermissionAllDecorator('hg.admin') |
|
51 | 51 | def settings_get(self, defaults=None, errors=None): |
|
52 | 52 | """ |
|
53 | 53 | View that displays the plugin settings as a form. |
|
54 | 54 | """ |
|
55 | 55 | defaults = defaults or {} |
|
56 | 56 | errors = errors or {} |
|
57 | 57 | schema = self.plugin.get_settings_schema() |
|
58 | 58 | |
|
59 | # |
|
59 | # Compute default values for the form. Priority is: | |
|
60 | # 1. Passed to this method 2. DB value 3. Schema default | |
|
60 | 61 | for node in schema: |
|
61 | db_value = self.plugin.get_setting_by_name(node.name) | |
|
62 | defaults.setdefault(node.name, db_value) | |
|
62 | if node.name not in defaults: | |
|
63 | defaults[node.name] = self.plugin.get_setting_by_name( | |
|
64 | node.name, node.default) | |
|
63 | 65 | |
|
64 | 66 | template_context = { |
|
65 | 67 | 'defaults': defaults, |
|
66 | 68 | 'errors': errors, |
|
67 | 69 | 'plugin': self.context.plugin, |
|
68 | 70 | 'resource': self.context, |
|
69 | 71 | } |
|
70 | 72 | |
|
71 | 73 | return template_context |
|
72 | 74 | |
|
73 | 75 | @LoginRequired() |
|
74 | 76 | @HasPermissionAllDecorator('hg.admin') |
|
75 | 77 | @auth.CSRFRequired() |
|
76 | 78 | def settings_post(self): |
|
77 | 79 | """ |
|
78 | 80 | View that validates and stores the plugin settings. |
|
79 | 81 | """ |
|
80 | 82 | schema = self.plugin.get_settings_schema() |
|
83 | data = self.request.params | |
|
84 | ||
|
81 | 85 | try: |
|
82 | valid_data = schema.deserialize( |
|
86 | valid_data = schema.deserialize(data) | |
|
83 | 87 | except colander.Invalid, e: |
|
84 | 88 | # Display error message and display form again. |
|
85 | 89 | self.request.session.flash( |
|
86 | 90 | _('Errors exist when saving plugin settings. ' |
|
87 | 91 | 'Please check the form inputs.'), |
|
88 | 92 | queue='error') |
|
89 | defaults = schema.flatten(self.request.params) | |
|
93 | defaults = {key: data[key] for key in data if key in schema} | |
|
90 | 94 | return self.settings_get(errors=e.asdict(), defaults=defaults) |
|
91 | 95 | |
|
92 | 96 | # Store validated data. |
|
93 | 97 | for name, value in valid_data.items(): |
|
94 | 98 | self.plugin.create_or_update_setting(name, value) |
|
95 | 99 | Session.commit() |
|
96 | 100 | |
|
97 | 101 | # Display success message and redirect. |
|
98 | 102 | self.request.session.flash( |
|
99 | 103 | _('Auth settings updated successfully.'), |
|
100 | 104 | queue='success') |
|
101 | 105 | redirect_to = self.request.resource_path( |
|
102 | 106 | self.context, route_name='auth_home') |
|
103 | 107 | return HTTPFound(redirect_to) |
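Two behaviours change in the `settings_get`/`settings_post` hunks above: form defaults now resolve as passed-in value, then stored DB value, then schema default; and on a validation error only posted keys that are actual schema nodes are echoed back (replacing `schema.flatten`). A stdlib sketch of both rules, with the colander schema reduced to a name-to-default dict (`schema_nodes`, `db`, and the sample fields are illustrative):

```python
# Illustrative stand-ins for the plugin's settings schema and DB values.
schema_nodes = {'cache_ttl': 0, 'enabled': False}  # node name -> schema default
db = {'cache_ttl': 30}

def form_defaults(passed):
    # Priority: 1. passed to the view  2. DB value  3. schema default.
    defaults = dict(passed)
    for name, schema_default in schema_nodes.items():
        if name not in defaults:
            defaults[name] = db.get(name, schema_default)
    return defaults

# On a validation error, only posted keys that are schema nodes are re-shown.
posted = {'cache_ttl': 'not-a-number', 'csrf_token': 'x'}
echoed = {key: posted[key] for key in posted if key in schema_nodes}
```

The filter keeps extraneous request parameters (like the CSRF token) out of the re-rendered form defaults.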
|
104 | 108 | |
|
105 | 109 | |
|
106 | 110 | # TODO: Ongoing migration in these views. |
|
107 | 111 | # - Maybe we should also use a colander schema for these views. |
|
108 | 112 | class AuthSettingsView(object): |
|
109 | 113 | def __init__(self, context, request): |
|
110 | 114 | self.context = context |
|
111 | 115 | self.request = request |
|
112 | 116 | |
|
113 | 117 | # TODO: Move this into a utility function. It is needed in all view |
|
114 | 118 | # classes during migration. Maybe a mixin? |
|
115 | 119 | |
|
116 | 120 | # Some of the decorators rely on this attribute to be present on the |
|
117 | 121 | # class of the decorated method. |
|
118 | 122 | self._rhodecode_user = request.user |
|
119 | 123 | |
|
120 | 124 | @LoginRequired() |
|
121 | 125 | @HasPermissionAllDecorator('hg.admin') |
|
122 | 126 | def index(self, defaults=None, errors=None, prefix_error=False): |
|
123 | 127 | defaults = defaults or {} |
|
124 | 128 | authn_registry = self.request.registry.getUtility(IAuthnPluginRegistry) |
|
125 | 129 | enabled_plugins = SettingsModel().get_auth_plugins() |
|
126 | 130 | |
|
127 | 131 | # Create template context and render it. |
|
128 | 132 | template_context = { |
|
129 | 133 | 'resource': self.context, |
|
130 | 134 | 'available_plugins': authn_registry.get_plugins(), |
|
131 | 135 | 'enabled_plugins': enabled_plugins, |
|
132 | 136 | } |
|
133 | 137 | html = render('rhodecode:templates/admin/auth/auth_settings.html', |
|
134 | 138 | template_context, |
|
135 | 139 | request=self.request) |
|
136 | 140 | |
|
137 | 141 | # Create form default values and fill the form. |
|
138 | 142 | form_defaults = { |
|
139 | 143 | 'auth_plugins': ','.join(enabled_plugins) |
|
140 | 144 | } |
|
141 | 145 | form_defaults.update(defaults) |
|
142 | 146 | html = formencode.htmlfill.render( |
|
143 | 147 | html, |
|
144 | 148 | defaults=form_defaults, |
|
145 | 149 | errors=errors, |
|
146 | 150 | prefix_error=prefix_error, |
|
147 | 151 | encoding="UTF-8", |
|
148 | 152 | force_defaults=False) |
|
149 | 153 | |
|
150 | 154 | return Response(html) |
|
151 | 155 | |
|
152 | 156 | @LoginRequired() |
|
153 | 157 | @HasPermissionAllDecorator('hg.admin') |
|
154 | 158 | @auth.CSRFRequired() |
|
155 | 159 | def auth_settings(self): |
|
156 | 160 | try: |
|
157 | 161 | form = AuthSettingsForm()() |
|
158 | 162 | form_result = form.to_python(self.request.params) |
|
159 | 163 | plugins = ','.join(form_result['auth_plugins']) |
|
160 | 164 | setting = SettingsModel().create_or_update_setting( |
|
161 | 165 | 'auth_plugins', plugins) |
|
162 | 166 | Session().add(setting) |
|
163 | 167 | Session().commit() |
|
164 | 168 | |
|
165 | 169 | cache_manager = get_auth_cache_manager() |
|
166 | 170 | cache_manager.clear() |
|
167 | 171 | self.request.session.flash( |
|
168 | 172 | _('Auth settings updated successfully.'), |
|
169 | 173 | queue='success') |
|
170 | 174 | except formencode.Invalid as errors: |
|
171 | 175 | e = errors.error_dict or {} |
|
172 | 176 | self.request.session.flash( |
|
173 | 177 | _('Errors exist when saving plugin setting. ' |
|
174 | 178 | 'Please check the form inputs.'), |
|
175 | 179 | queue='error') |
|
176 | 180 | return self.index( |
|
177 | 181 | defaults=errors.value, |
|
178 | 182 | errors=e, |
|
179 | 183 | prefix_error=False) |
|
180 | 184 | except Exception: |
|
181 | 185 | log.exception('Exception in auth_settings') |
|
182 | 186 | self.request.session.flash( |
|
183 | 187 | _('Error occurred during update of auth settings.'), |
|
184 | 188 | queue='error') |
|
185 | 189 | |
|
186 | 190 | redirect_to = self.request.resource_path( |
|
187 | 191 | self.context, route_name='auth_home') |
|
188 | 192 | return HTTPFound(redirect_to) |
@@ -1,192 +1,192 @@ |
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | Pylons environment configuration |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | 25 | import os |
|
26 | 26 | import logging |
|
27 | 27 | import rhodecode |
|
28 | 28 | import platform |
|
29 | 29 | import re |
|
30 | 30 | import io |
|
31 | 31 | |
|
32 | 32 | from mako.lookup import TemplateLookup |
|
33 | 33 | from pylons.configuration import PylonsConfig |
|
34 | 34 | from pylons.error import handle_mako_error |
|
35 | 35 | from pyramid.settings import asbool |
|
36 | 36 | |
|
37 | 37 | # don't remove this import it does magic for celery |
|
38 | 38 | from rhodecode.lib import celerypylons # noqa |
|
39 | 39 | |
|
40 | 40 | import rhodecode.lib.app_globals as app_globals |
|
41 | 41 | |
|
42 | 42 | from rhodecode.config import utils |
|
43 | 43 | from rhodecode.config.routing import make_map |
|
44 | 44 | from rhodecode.config.jsroutes import generate_jsroutes_content |
|
45 | 45 | |
|
46 | 46 | from rhodecode.lib import helpers |
|
47 | 47 | from rhodecode.lib.auth import set_available_permissions |
|
48 | 48 | from rhodecode.lib.utils import ( |
|
49 | 49 | repo2db_mapper, make_db_config, set_rhodecode_config, |
|
50 | 50 | load_rcextensions) |
|
51 | 51 | from rhodecode.lib.utils2 import str2bool, aslist |
|
52 | 52 | from rhodecode.lib.vcs import connect_vcs, start_vcs_server |
|
53 | 53 | from rhodecode.model.scm import ScmModel |
|
54 | 54 | |
|
55 | 55 | log = logging.getLogger(__name__) |
|
56 | 56 | |
|
57 | 57 | def load_environment(global_conf, app_conf, initial=False, |
|
58 | 58 | test_env=None, test_index=None): |
|
59 | 59 | """ |
|
60 | 60 | Configure the Pylons environment via the ``pylons.config`` |
|
61 | 61 | object |
|
62 | 62 | """ |
|
63 | 63 | config = PylonsConfig() |
|
64 | 64 | |
|
65 | 65 | |
|
66 | 66 | # Pylons paths |
|
67 | 67 | root = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) |
|
68 | 68 | paths = { |
|
69 | 69 | 'root': root, |
|
70 | 70 | 'controllers': os.path.join(root, 'controllers'), |
|
71 | 71 | 'static_files': os.path.join(root, 'public'), |
|
72 | 72 | 'templates': [os.path.join(root, 'templates')], |
|
73 | 73 | } |
|
74 | 74 | |
|
75 | 75 | # Initialize config with the basic options |
|
76 | 76 | config.init_app(global_conf, app_conf, package='rhodecode', paths=paths) |
|
77 | 77 | |
|
78 | 78 | # store some globals into rhodecode |
|
79 | 79 | rhodecode.CELERY_ENABLED = str2bool(config['app_conf'].get('use_celery')) |
|
80 | 80 | rhodecode.CELERY_EAGER = str2bool( |
|
81 | 81 | config['app_conf'].get('celery.always.eager')) |
|
82 | 82 | |
|
83 | 83 | config['routes.map'] = make_map(config) |
|
84 | 84 | |
|
85 | if asbool(config['debug']): | |
|
85 | if asbool(config.get('generate_js_files', 'false')): | |
|
86 | 86 | jsroutes = config['routes.map'].jsroutes() |
|
87 | 87 | jsroutes_file_content = generate_jsroutes_content(jsroutes) |
|
88 | 88 | jsroutes_file_path = os.path.join( |
|
89 | 89 | paths['static_files'], 'js', 'rhodecode', 'routes.js') |
|
90 | 90 | |
|
91 | 91 | with io.open(jsroutes_file_path, 'w', encoding='utf-8') as f: |
|
92 | 92 | f.write(jsroutes_file_content) |
|
93 | 93 | |
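The hunk above gates routes.js generation on a new `generate_js_files` setting instead of `debug`, read through pyramid's `asbool`. A simplified sketch of that string-to-bool coercion (ini values arrive as strings; the truthy set below follows pyramid's, but treat the helper as an approximation, not the library code):

```python
def asbool(value):
    # Simplified version of pyramid.settings.asbool: booleans pass
    # through, otherwise common truthy strings are accepted, any case.
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('t', 'true', 'y', 'yes', 'on', '1')

config = {}  # illustrative config mapping without the setting present
generate = asbool(config.get('generate_js_files', 'false'))
```

With the `'false'` default, deployments no longer rewrite routes.js on every start; only setups that opt in regenerate it.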
|
94 | 94 | config['pylons.app_globals'] = app_globals.Globals(config) |
|
95 | 95 | config['pylons.h'] = helpers |
|
96 | 96 | rhodecode.CONFIG = config |
|
97 | 97 | |
|
98 | 98 | load_rcextensions(root_path=config['here']) |
|
99 | 99 | |
|
100 | 100 | # Setup cache object as early as possible |
|
101 | 101 | import pylons |
|
102 | 102 | pylons.cache._push_object(config['pylons.app_globals'].cache) |
|
103 | 103 | |
|
104 | 104 | # Create the Mako TemplateLookup, with the default auto-escaping |
|
105 | 105 | config['pylons.app_globals'].mako_lookup = TemplateLookup( |
|
106 | 106 | directories=paths['templates'], |
|
107 | 107 | error_handler=handle_mako_error, |
|
108 | 108 | module_directory=os.path.join(app_conf['cache_dir'], 'templates'), |
|
109 | 109 | input_encoding='utf-8', default_filters=['escape'], |
|
110 | 110 | imports=['from webhelpers.html import escape']) |
|
111 | 111 | |
|
112 | 112 | # sets the c attribute access when don't existing attribute are accessed |
|
113 | 113 | config['pylons.strict_tmpl_context'] = True |
|
114 | 114 | |
|
115 | 115 | # Limit backends to "vcs.backends" from configuration |
|
116 | 116 | backends = config['vcs.backends'] = aslist( |
|
117 | 117 | config.get('vcs.backends', 'hg,git'), sep=',') |
|
118 | 118 | for alias in rhodecode.BACKENDS.keys(): |
|
119 | 119 | if alias not in backends: |
|
120 | 120 | del rhodecode.BACKENDS[alias] |
|
121 | 121 | log.info("Enabled backends: %s", backends) |
|
122 | 122 | |
|
123 | 123 | # initialize vcs client and optionally run the server if enabled |
|
124 | 124 | vcs_server_uri = config.get('vcs.server', '') |
|
125 | 125 | vcs_server_enabled = str2bool(config.get('vcs.server.enable', 'true')) |
|
126 | 126 | start_server = ( |
|
127 | 127 | str2bool(config.get('vcs.start_server', 'false')) and |
|
128 | 128 | not int(os.environ.get('RC_VCSSERVER_TEST_DISABLE', '0'))) |
|
129 | 129 | if vcs_server_enabled and start_server: |
|
130 | 130 | log.info("Starting vcsserver") |
|
131 | 131 | start_vcs_server(server_and_port=vcs_server_uri, |
|
132 | 132 | protocol=utils.get_vcs_server_protocol(config), |
|
133 | 133 | log_level=config['vcs.server.log_level']) |
|
134 | 134 | |
|
135 | 135 | set_available_permissions(config) |
|
136 | 136 | db_cfg = make_db_config(clear_session=True) |
|
137 | 137 | |
|
138 | 138 | repos_path = list(db_cfg.items('paths'))[0][1] |
|
139 | 139 | config['base_path'] = repos_path |
|
140 | 140 | |
|
141 | 141 | config['vcs.hooks.direct_calls'] = _use_direct_hook_calls(config) |
|
142 | 142 | config['vcs.hooks.protocol'] = _get_vcs_hooks_protocol(config) |
|
143 | 143 | |
|
144 | 144 | # store db config also in main global CONFIG |
|
145 | 145 | set_rhodecode_config(config) |
|
146 | 146 | |
|
147 | 147 | # configure instance id |
|
148 | 148 | utils.set_instance_id(config) |
|
149 | 149 | |
|
150 | 150 | # CONFIGURATION OPTIONS HERE (note: all config options will override |
|
151 | 151 | # any Pylons config options) |
|
152 | 152 | |
|
153 | 153 | # store config reference into our module to skip import magic of pylons |
|
154 | 154 | rhodecode.CONFIG.update(config) |
|
155 | 155 | |
|
156 | 156 | utils.configure_pyro4(config) |
|
157 | 157 | utils.configure_vcs(config) |
|
158 | 158 | if vcs_server_enabled: |
|
159 | 159 | connect_vcs(vcs_server_uri, utils.get_vcs_server_protocol(config)) |
|
160 | 160 | |
|
161 | 161 | import_on_startup = str2bool(config.get('startup.import_repos', False)) |
|
162 | 162 | if vcs_server_enabled and import_on_startup: |
|
163 | 163 | repo2db_mapper(ScmModel().repo_scan(repos_path), remove_obsolete=False) |
|
164 | 164 | return config |
|
165 | 165 | |
|
166 | 166 | |
|
167 | 167 | def _use_direct_hook_calls(config): |
|
168 | 168 | default_direct_hook_calls = 'false' |
|
169 | 169 | direct_hook_calls = str2bool( |
|
170 | 170 | config.get('vcs.hooks.direct_calls', default_direct_hook_calls)) |
|
171 | 171 | return direct_hook_calls |
|
172 | 172 | |
|
173 | 173 | |
|
174 | 174 | def _get_vcs_hooks_protocol(config): |
|
175 | 175 | protocol = config.get('vcs.hooks.protocol', 'pyro4').lower() |
|
176 | 176 | return protocol |
|
177 | 177 | |
|
178 | 178 | |
|
179 | 179 | def load_pyramid_environment(global_config, settings): |
|
180 | 180 | # Some parts of the code expect a merge of global and app settings. |
|
181 | 181 | settings_merged = global_config.copy() |
|
182 | 182 | settings_merged.update(settings) |
|
183 | 183 | |
|
184 | 184 | # If this is a test run we prepare the test environment like |
|
185 | 185 | # creating a test database, test search index and test repositories. |
|
186 | 186 | # This has to be done before the database connection is initialized. |
|
187 | 187 | if settings['is_test']: |
|
188 | 188 | rhodecode.is_test = True |
|
189 | 189 | utils.initialize_test_environment(settings_merged) |
|
190 | 190 | |
|
191 | 191 | # Initialize the database connection. |
|
192 | 192 | utils.initialize_database(settings_merged) |
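`load_environment` above trims `rhodecode.BACKENDS` down to the comma-separated `vcs.backends` setting. A stdlib sketch of that filtering, with `aslist` reimplemented minimally and illustrative BACKENDS contents:

```python
def aslist(value, sep=','):
    # Minimal stand-in for rhodecode.lib.utils2.aslist: split and strip.
    return [part.strip() for part in value.split(sep) if part.strip()]

BACKENDS = {'hg': 'Mercurial', 'git': 'Git', 'svn': 'Subversion'}

enabled = aslist('hg, git')
# iterate over a copy, since entries are deleted during the loop
for alias in list(BACKENDS):
    if alias not in enabled:
        del BACKENDS[alias]
```

Any backend alias missing from the setting is removed, so only the configured VCS types stay registered.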
@@ -1,42 +1,43 @@ |
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | def generate_jsroutes_content(jsroutes): |
|
22 | 22 | statements = [] |
|
23 | 23 | for url_name, url, fields in jsroutes: |
|
24 | 24 | statements.append( |
|
25 | 25 | "pyroutes.register('%s', '%s', %s);" % (url_name, url, fields)) |
|
26 | 26 | return u''' |
|
27 | 27 | /****************************************************************************** |
|
28 | 28 | * * |
|
29 | 29 | * DO NOT CHANGE THIS FILE MANUALLY * |
|
30 | 30 | * * |
|
31 | 31 | * * |
|
32 | * This file is automatically generated when the app starts up |
|
32 | * This file is automatically generated when the app starts up with * | |
|
33 | * generate_js_files = true * | |
|
33 | 34 | * * |
|
34 | 35 | * To add a route here pass jsroute=True to the route definition in the app * |
|
35 | 36 | * * |
|
36 | 37 | ******************************************************************************/ |
|
37 | 38 | function registerRCRoutes() { |
|
38 | 39 | // routes registration |
|
39 | 40 | %s |
|
40 | 41 | } |
|
41 | 42 | ''' % '\n '.join(statements) |
|
42 | 43 |
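`generate_jsroutes_content` above emits one `pyroutes.register` statement per route tuple inside a fixed JavaScript template. The per-route statement format can be exercised on its own (the sample route tuple is illustrative):

```python
def registration_statements(jsroutes):
    # Same per-route format string as generate_jsroutes_content above:
    # each tuple is (url_name, url pattern, list of dynamic fields).
    statements = []
    for url_name, url, fields in jsroutes:
        statements.append(
            "pyroutes.register('%s', '%s', %s);" % (url_name, url, fields))
    return statements

stmts = registration_statements(
    [('user_profile', '/_profiles/%(username)s', ['username'])])
```

The fields list is rendered via `%s`, so it lands in the JS source in Python list-literal form, which happens to be valid as a JavaScript array of strings here.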
@@ -1,217 +1,256 @@ |
|
1 | 1 | { |
|
2 | "cyrus-sasl-2.1.26": { | |
|
3 | "cyrus": "http://cyrusimap.web.cmu.edu/mediawiki/index.php/Downloads#Licensing" | |
|
2 | "nodejs-4.3.1": { | |
|
3 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
4 | 4 | }, |
|
5 |
" |
|
|
6 |
" |
|
|
5 | "postgresql-9.5.1": { | |
|
6 | "PostgreSQL License": "http://spdx.org/licenses/PostgreSQL" | |
|
7 | 7 | }, |
|
8 |
" |
|
|
9 | "OpenSSL": "http://spdx.org/licenses/OpenSSL" | |
|
10 | }, | |
|
11 | "python-2.7.10": { | |
|
12 | "Python-2.0": "http://spdx.org/licenses/Python-2.0" | |
|
8 | "python-2.7.11": { | |
|
9 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0" | |
|
13 | 10 | }, |
|
14 | 11 | "python2.7-Babel-1.3": { |
|
15 |
"BSD |
|
|
12 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
16 | 13 | }, |
|
17 | 14 | "python2.7-Beaker-1.7.0": { |
|
18 |
"BSD |
|
|
15 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
19 | 16 | }, |
|
20 | 17 | "python2.7-FormEncode-1.2.4": { |
|
21 |
"Python |
|
|
18 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0" | |
|
22 | 19 | }, |
|
23 | 20 | "python2.7-Mako-1.0.1": { |
|
24 | "MIT": "http://spdx.org/licenses/MIT" | |
|
21 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
25 | 22 | }, |
|
26 | 23 | "python2.7-Markdown-2.6.2": { |
|
27 |
"BSD |
|
|
24 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
28 | 25 | }, |
|
29 | 26 | "python2.7-MarkupSafe-0.23": { |
|
30 |
"BSD |
|
|
27 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
31 | 28 | }, |
|
32 | 29 | "python2.7-Paste-2.0.2": { |
|
33 | "MIT": "http://spdx.org/licenses/MIT" | |
|
30 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
34 | 31 | }, |
|
35 | 32 | "python2.7-PasteDeploy-1.5.2": { |
|
36 | "MIT": "http://spdx.org/licenses/MIT" | |
|
33 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
37 | 34 | }, |
|
38 | 35 | "python2.7-PasteScript-1.7.5": { |
|
39 | "MIT": "http://spdx.org/licenses/MIT" | |
|
36 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
40 | 37 | }, |
|
41 | 38 | "python2.7-Pygments-2.0.2": { |
|
42 |
"BSD |
|
|
39 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
43 | 40 | }, |
|
44 |
"python2.7-Pylons-1.0. |
|
|
45 |
"BSD |
|
|
41 | "python2.7-Pylons-1.0.1-patch1": { | |
|
42 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
46 | 43 | }, |
|
47 | 44 | "python2.7-Pyro4-4.35": { |
|
48 | "MIT": "http://spdx.org/licenses/MIT" | |
|
45 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
49 | 46 | }, |
|
50 | 47 | "python2.7-Routes-1.13": { |
|
51 | "MIT": "http://spdx.org/licenses/MIT" | |
|
48 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
52 | 49 | }, |
|
53 | 50 | "python2.7-SQLAlchemy-0.9.9": { |
|
54 | "MIT": "http://spdx.org/licenses/MIT" | |
|
51 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
55 | 52 | }, |
|
56 | 53 | "python2.7-Tempita-0.5.2": { |
|
57 | "MIT": "http://spdx.org/licenses/MIT" | |
|
54 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
58 | 55 | }, |
|
59 | 56 | "python2.7-URLObject-2.4.0": { |
|
60 |
"Unlicense": "http:// |
|
|
57 | "The Unlicense": "http://unlicense.org/" | |
|
61 | 58 | }, |
|
62 | 59 | "python2.7-WebError-0.10.3": { |
|
63 | "MIT": "http://spdx.org/licenses/MIT" | |
|
60 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
64 | 61 | }, |
|
65 |
"python2.7-WebHelpers-1.3 |
|
|
66 |
"BSD |
|
|
62 | "python2.7-WebHelpers-1.3": { | |
|
63 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
67 | 64 | }, |
|
68 | 65 | "python2.7-WebHelpers2-2.0": { |
|
69 |
" |
|
|
66 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
70 | 67 | }, |
|
71 | 68 | "python2.7-WebOb-1.3.1": { |
|
72 | "MIT": "http://spdx.org/licenses/MIT" | |
|
69 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
73 | 70 | }, |
|
74 |
"python2.7-Whoosh-2.7.0 |
|
|
75 |
"BSD |
|
|
71 | "python2.7-Whoosh-2.7.0": { | |
|
72 | "BSD 2-clause \"Simplified\" License": "http://spdx.org/licenses/BSD-2-Clause", | |
|
73 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
76 | 74 | }, |
|
77 | 75 | "python2.7-alembic-0.8.4": { |
|
78 | "MIT": "http://spdx.org/licenses/MIT" | |
|
76 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
79 | 77 | }, |
|
80 | 78 | "python2.7-amqplib-1.0.2": { |
|
81 |
" |
|
|
79 | "GNU Lesser General Public License v3.0 only": "http://spdx.org/licenses/LGPL-3.0" | |
|
82 | 80 | }, |
|
83 | 81 | "python2.7-anyjson-0.3.3": { |
|
84 |
"BSD |
|
|
82 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
83 | }, | |
|
84 | "python2.7-appenlight-client-0.6.14": { | |
|
85 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
85 | 86 | }, |
|
86 |
"python2.7-a |
|
|
87 |
" |
|
|
87 | "python2.7-authomatic-0.1.0.post1": { | |
|
88 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
88 | 89 | }, |
|
89 |
"python2.7-backport |
|
|
90 |
"Python |
|
|
90 | "python2.7-backport-ipaddress-0.1": { | |
|
91 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0" | |
|
91 | 92 | }, |
|
92 | 93 | "python2.7-celery-2.2.10": { |
|
93 |
"BSD |
|
|
94 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
94 | 95 | }, |
|
95 |
"python2.7-click- |
|
|
96 |
"BSD |
|
|
96 | "python2.7-click-5.1": { | |
|
97 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
98 | }, | |
|
99 | "python2.7-colander-1.2": { | |
|
100 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
97 | 101 | }, |
|
98 | 102 | "python2.7-configobj-5.0.6": { |
|
99 |
"BSD |
|
|
103 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
100 | 104 | }, |
|
101 | 105 | "python2.7-cssselect-0.9.1": { |
|
102 |
"BSD |
|
|
106 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
103 | 107 | }, |
|
104 | 108 | "python2.7-decorator-3.4.2": { |
|
105 |
"BSD |
|
|
109 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
106 | 110 | }, |
|
107 | 111 | "python2.7-docutils-0.12": { |
|
108 |
"BSD |
|
|
112 | "BSD 2-clause \"Simplified\" License": "http://spdx.org/licenses/BSD-2-Clause" | |
|
113 | }, | |
|
114 | "python2.7-elasticsearch-2.3.0": { | |
|
115 | "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0" | |
|
116 | }, | |
|
117 | "python2.7-elasticsearch-dsl-2.0.0": { | |
|
118 | "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0" | |
|
109 | 119 | }, |
|
110 | 120 | "python2.7-future-0.14.3": { |
|
111 | "MIT": "http://spdx.org/licenses/MIT" | |
|
121 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
112 | 122 | }, |
|
113 | 123 | "python2.7-futures-3.0.2": { |
|
114 |
"BSD |
|
|
124 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
125 | }, | |
|
126 | "python2.7-gnureadline-6.3.3": { | |
|
127 | "GNU General Public License v1.0 only": "http://spdx.org/licenses/GPL-1.0" | |
|
115 | 128 | }, |
|
116 |
"python2.7-g |
|
|
117 | "MIT": "http://spdx.org/licenses/MIT" | |
|
129 | "python2.7-gunicorn-19.6.0": { | |
|
130 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
118 | 131 | }, |
|
119 |
"python2.7- |
|
|
120 |
" |
|
|
132 | "python2.7-infrae.cache-1.0.1": { | |
|
133 | "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1" | |
|
121 | 134 | }, |
|
122 | 135 | "python2.7-ipython-3.1.0": { |
|
123 |
"BSD |
|
|
124 | }, | |
|
125 | "python2.7-kombu-1.5.1-patch1": { | |
|
126 | "BSD-3-Clause": "http://spdx.org/licenses/BSD-3-Clause" | |
|
136 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
127 | 137 | }, |
|
128 |
"python2.7- |
|
|
129 | "expat": "http://directory.fsf.org/wiki/License:Expat" | |
|
138 | "python2.7-iso8601-0.1.11": { | |
|
139 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
130 | 140 | }, |
|
131 |
"python2.7- |
|
|
132 | "repoze": "http://repoze.org/license.html" | |
|
141 | "python2.7-kombu-1.5.1": { | |
|
142 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
133 | 143 | }, |
|
134 | 144 | "python2.7-msgpack-python-0.4.6": { |
|
135 |
"Apache |
|
|
136 | }, | |
|
137 | "python2.7-objgraph-2.0.0": { | |
|
138 | "MIT": "http://spdx.org/licenses/MIT" | |
|
145 | "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0" | |
|
139 | 146 | }, |
|
140 | 147 | "python2.7-packaging-15.2": { |
|
141 |
"Apache |
|
|
148 | "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0" | |
|
142 | 149 | }, |
|
143 | 150 | "python2.7-psutil-2.2.1": { |
|
144 |
"BSD |
|
|
151 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
145 | 152 | }, |
|
146 | 153 | "python2.7-psycopg2-2.6": { |
|
147 |
" |
|
|
154 | "GNU Lesser General Public License v3.0 or later": "http://spdx.org/licenses/LGPL-3.0+" | |
|
148 | 155 | }, |
|
149 | 156 | "python2.7-py-1.4.29": { |
|
150 | "MIT": "http://spdx.org/licenses/MIT" | |
|
157 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
151 | 158 | }, |
|
152 | 159 | "python2.7-py-bcrypt-0.4": { |
|
153 |
"BSD |
|
|
160 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause" | |
|
154 | 161 | }, |
|
155 | 162 | "python2.7-pycrypto-2.6.1": { |
|
156 |
" |
|
|
163 | "Public Domain": null | |
|
164 | }, | |
|
165 | "python2.7-pycurl-7.19.5": { | |
|
166 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
157 | 167 | }, |
|
158 | 168 | "python2.7-pyparsing-1.5.7": { |
|
159 | "MIT": "http://spdx.org/licenses/MIT" | |
|
169 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
170 | }, | |
|
171 | "python2.7-pyramid-1.6.1": { | |
|
172 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
173 | }, | |
|
174 | "python2.7-pyramid-beaker-0.8": { | |
|
175 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
176 | }, | |
|
177 | "python2.7-pyramid-debugtoolbar-2.4.2": { | |
|
178 | "BSD 4-clause \"Original\" or \"Old\" License": "http://spdx.org/licenses/BSD-4-Clause", | |
|
179 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
180 | }, | |
|
181 | "python2.7-pyramid-mako-1.0.2": { | |
|
182 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
160 | 183 | }, |
|
161 | 184 | "python2.7-pysqlite-2.6.3": { |
|
162 | " |

163 | " |

185 | "libpng License": "http://spdx.org/licenses/Libpng", | |
|
186 | "zlib License": "http://spdx.org/licenses/Zlib" | |
|
164 | 187 | }, |
|
165 | 188 | "python2.7-pytest-2.8.5": { |
|
166 | "MIT": "http://spdx.org/licenses/MIT" | |
|
189 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
190 | }, | |
|
191 | "python2.7-pytest-runner-2.7.1": { | |
|
192 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
167 | 193 | }, |
|
168 | 194 | "python2.7-python-dateutil-1.5": { |
|
169 | "BSD-2-Clause": "http://spdx.org/licenses/BSD-2-Clause" | |
|
195 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0" | |
|
196 | }, | |
|
197 | "python2.7-python-editor-1.0.1": { | |
|
198 | "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0" | |
|
170 | 199 | }, |
|
171 | 200 | "python2.7-python-ldap-2.4.19": { |
|
172 | "Python |

201 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0" | |
|
202 | }, | |
|
203 | "python2.7-python-memcached-1.57": { | |
|
204 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0" | |
|
173 | 205 | }, |
|
174 | 206 | "python2.7-pytz-2015.4": { |
|
175 | "MIT": "http://spdx.org/licenses/MIT" | |
|
207 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
176 | 208 | }, |
|
177 | 209 | "python2.7-recaptcha-client-1.0.6": { |
|
178 | "MIT": "http://spdx.org/licenses/MIT" | |
|
210 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
179 | 211 | }, |
|
180 | 212 | "python2.7-repoze.lru-0.6": { |
|
181 | " |

213 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
182 | 214 | }, |
|
183 | "python2.7-requests-2. |

184 | "A |

215 | "python2.7-requests-2.9.1": { | |
|
216 | "Apache License 2.0": "http://spdx.org/licenses/Apache-2.0" | |
|
185 | 217 | }, |
|
186 | "python2.7-serpent-1.1 |

187 | "MIT": "http://spdx.org/licenses/MIT" | |
|
218 | "python2.7-serpent-1.12": { | |
|
219 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
188 | 220 | }, |
|
189 | "python2.7-set |

190 | "BSD-2-Clause": "http://spdx.org/licenses/BSD-2-Clause" | |
|
221 | "python2.7-setuptools-19.4": { | |
|
222 | "Python Software Foundation License version 2": "http://spdx.org/licenses/Python-2.0", | |
|
223 | "Zope Public License 2.0": "http://spdx.org/licenses/ZPL-2.0" | |
|
191 | 224 | }, |
|
192 | "python2.7-setuptools- |

193 | "PSF": null, | |
|
194 | "ZPL": null | |
|
225 | "python2.7-setuptools-scm-1.11.0": { | |
|
226 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
195 | 227 | }, |
|
196 | 228 | "python2.7-simplejson-3.7.2": { |
|
197 | " |

229 | "Academic Free License": "http://spdx.org/licenses/AFL-2.1", | |
|
230 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
198 | 231 | }, |
|
199 | 232 | "python2.7-six-1.9.0": { |
|
200 | "MIT": "http://spdx.org/licenses/MIT" | |
|
233 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
201 | 234 | }, |
|
202 | "python2.7-subprocess32-3.2.6": { | |
|
203 | "Python-2.0": "http://spdx.org/licenses/Python-2.0" | |
|
235 | "python2.7-translationstring-1.3": { | |
|
236 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
204 | 237 | }, |
|
205 | "python2.7- |

206 | " |

238 | "python2.7-urllib3-1.16": { | |
|
239 | "MIT License": "http://spdx.org/licenses/MIT" | |
|
207 | 240 | }, |
|
208 | "python2.7- |

209 | "APSL-2.0": "http://spdx.org/licenses/APSL-2.0" | |
|
241 | "python2.7-venusian-1.0": { | |
|
242 | "Repoze License": "http://www.repoze.org/LICENSE.txt" | |
|
210 | 243 | }, |
|
211 | 244 | "python2.7-waitress-0.8.9": { |
|
212 | "Z |

245 | "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1" | |
|
213 | 246 | }, |
|
214 | 247 | "python2.7-zope.cachedescriptors-4.0.0": { |
|
215 | "Z |

248 | "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1" | |
|
249 | }, | |
|
250 | "python2.7-zope.deprecation-4.1.2": { | |
|
251 | "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1" | |
|
252 | }, | |
|
253 | "python2.7-zope.interface-4.1.3": { | |
|
254 | "Zope Public License 2.1": "http://spdx.org/licenses/ZPL-2.1" | |
|
216 | 255 | } |
|
217 | } |

256 | } No newline at end of file |
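The hunks above rename the keys of this per-package license map from short identifiers (e.g. `"MIT"`, `"PSF"`) to full license names (e.g. `"MIT License"`, `"Python Software Foundation License version 2"`), keeping spdx.org URLs as values and `null` where no SPDX entry applies (e.g. `"Public Domain"`). As a minimal sketch (not part of the changeset; the inline JSON and function name are illustrative), this is how such a map can be consumed to flag packages lacking an SPDX URL:

```python
import json

# Illustrative two-entry map shaped like the JSON file in the diff above.
LICENSES_JSON = """
{
  "python2.7-pycrypto-2.6.1": {"Public Domain": null},
  "python2.7-pytz-2015.4": {"MIT License": "http://spdx.org/licenses/MIT"}
}
"""

def packages_without_spdx(raw):
    """Return package names where any license maps to null (no SPDX URL)."""
    data = json.loads(raw)
    return sorted(pkg for pkg, licenses in data.items()
                  if any(url is None for url in licenses.values()))

print(packages_without_spdx(LICENSES_JSON))  # -> ['python2.7-pycrypto-2.6.1']
```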
@@ -1,316 +1,387 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | Pylons middleware initialization |
|
23 | 23 | """ |
|
24 | 24 | import logging |
|
25 | 25 | |
|
26 | 26 | from paste.registry import RegistryManager |
|
27 | 27 | from paste.gzipper import make_gzip_middleware |
|
28 | from pylons.middleware import ErrorHandler, StatusCodeRedirect | |
|
29 | 28 | from pylons.wsgiapp import PylonsApp |
|
30 | 29 | from pyramid.authorization import ACLAuthorizationPolicy |
|
31 | 30 | from pyramid.config import Configurator |
|
32 | 31 | from pyramid.static import static_view |
|
33 | 32 | from pyramid.settings import asbool, aslist |
|
34 | 33 | from pyramid.wsgi import wsgiapp |
|
34 | from pyramid.httpexceptions import HTTPError, HTTPInternalServerError | |
|
35 | import pyramid.httpexceptions as httpexceptions | |
|
36 | from pyramid.renderers import render_to_response, render | |
|
35 | 37 | from routes.middleware import RoutesMiddleware |
|
36 | 38 | import routes.util |
|
37 | 39 | |
|
38 | 40 | import rhodecode |
|
39 | 41 | from rhodecode.config import patches |
|
40 | 42 | from rhodecode.config.environment import ( |
|
41 | 43 | load_environment, load_pyramid_environment) |
|
42 | 44 | from rhodecode.lib.middleware import csrf |
|
43 | 45 | from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled |
|
44 | 46 | from rhodecode.lib.middleware.disable_vcs import DisableVCSPagesWrapper |
|
45 | 47 | from rhodecode.lib.middleware.https_fixup import HttpsFixup |
|
46 | 48 | from rhodecode.lib.middleware.vcs import VCSMiddleware |
|
47 | 49 | from rhodecode.lib.plugins.utils import register_rhodecode_plugin |
|
48 | 50 | |
|
49 | 51 | |
|
50 | 52 | log = logging.getLogger(__name__) |
|
51 | 53 | |
|
52 | 54 | |
|
53 | 55 | def make_app(global_conf, full_stack=True, static_files=True, **app_conf): |
|
54 | 56 | """Create a Pylons WSGI application and return it |
|
55 | 57 | |
|
56 | 58 | ``global_conf`` |
|
57 | 59 | The inherited configuration for this application. Normally from |
|
58 | 60 | the [DEFAULT] section of the Paste ini file. |
|
59 | 61 | |
|
60 | 62 | ``full_stack`` |
|
61 | 63 | Whether or not this application provides a full WSGI stack (by |
|
62 | 64 | default, meaning it handles its own exceptions and errors). |
|
63 | 65 | Disable full_stack when this application is "managed" by |
|
64 | 66 | another WSGI middleware. |
|
65 | 67 | |
|
66 | 68 | ``app_conf`` |
|
67 | 69 | The application's local configuration. Normally specified in |
|
68 | 70 | the [app:<name>] section of the Paste ini file (where <name> |
|
69 | 71 | defaults to main). |
|
70 | 72 | |
|
71 | 73 | """ |
|
72 | 74 | # Apply compatibility patches |
|
73 | 75 | patches.kombu_1_5_1_python_2_7_11() |
|
74 | 76 | patches.inspect_getargspec() |
|
75 | 77 | |
|
76 | 78 | # Configure the Pylons environment |
|
77 | 79 | config = load_environment(global_conf, app_conf) |
|
78 | 80 | |
|
79 | 81 | # The Pylons WSGI app |
|
80 | 82 | app = PylonsApp(config=config) |
|
81 | 83 | if rhodecode.is_test: |
|
82 | 84 | app = csrf.CSRFDetector(app) |
|
83 | 85 | |
|
84 | 86 | expected_origin = config.get('expected_origin') |
|
85 | 87 | if expected_origin: |
|
86 | 88 | # The API can be accessed from other Origins. |
|
87 | 89 | app = csrf.OriginChecker(app, expected_origin, |
|
88 | 90 | skip_urls=[routes.util.url_for('api')]) |
|
89 | 91 | |
|
90 | # Add RoutesMiddleware. Currently we have two instances in the stack. This | |
|
91 | # is the lower one to make the StatusCodeRedirect middleware happy. | |
|
92 | # TODO: johbo: This is not optimal, search for a better solution. | |
|
93 | app = RoutesMiddleware(app, config['routes.map']) | |
|
94 | ||
|
95 | # CUSTOM MIDDLEWARE HERE (filtered by error handling middlewares) | |
|
96 | if asbool(config['pdebug']): | |
|
97 | from rhodecode.lib.profiler import ProfilingMiddleware | |
|
98 | app = ProfilingMiddleware(app) | |
|
99 | ||
|
100 | # Protect from VCS Server error related pages when server is not available | |
|
101 | vcs_server_enabled = asbool(config.get('vcs.server.enable', 'true')) | |
|
102 | if not vcs_server_enabled: | |
|
103 | app = DisableVCSPagesWrapper(app) | |
|
104 | 92 | |
|
105 | 93 | if asbool(full_stack): |
|
106 | 94 | |
|
107 | 95 | # Appenlight monitoring and error handler |
|
108 | 96 | app, appenlight_client = wrap_in_appenlight_if_enabled(app, config) |
|
109 | 97 | |
|
110 | # Handle Python exceptions | |
|
111 | app = ErrorHandler(app, global_conf, **config['pylons.errorware']) | |
|
112 | ||
|
113 | 98 | # we want our low level middleware to get to the request ASAP. We don't |
|
114 | 99 | # need any pylons stack middleware in them |
|
115 | 100 | app = VCSMiddleware(app, config, appenlight_client) |
|
116 | # Display error documents for 401, 403, 404 status codes (and | |
|
117 | # 500 when debug is disabled) | |
|
118 | if asbool(config['debug']): | |
|
119 | app = StatusCodeRedirect(app) | |
|
120 | else: | |
|
121 | app = StatusCodeRedirect(app, [400, 401, 403, 404, 500]) | |
|
122 | 101 | |
|
123 | 102 | # Establish the Registry for this application |
|
124 | 103 | app = RegistryManager(app) |
|
125 | 104 | |
|
126 | 105 | app.config = config |
|
127 | 106 | |
|
128 | 107 | return app |
|
129 | 108 | |
|
130 | 109 | |
|
131 | 110 | def make_pyramid_app(global_config, **settings): |
|
132 | 111 | """ |
|
133 | 112 | Constructs the WSGI application based on Pyramid and wraps the Pylons based |
|
134 | 113 | application. |
|
135 | 114 | |
|
136 | 115 | Specials: |
|
137 | 116 | |
|
138 | 117 | * We migrate from Pylons to Pyramid. While doing this, we keep both |
|
139 | 118 | frameworks functional. This involves moving some WSGI middlewares around |
|
140 | 119 | and providing access to some data internals, so that the old code is |
|
141 | 120 | still functional. |
|
142 | 121 | |
|
143 | 122 | * The application can also be integrated like a plugin via the call to |
|
144 | 123 | `includeme`. This is accompanied with the other utility functions which |
|
145 | 124 | are called. Changing this should be done with great care to not break |
|
146 | 125 | cases when these fragments are assembled from another place. |
|
147 | 126 | |
|
148 | 127 | """ |
|
149 | 128 | # The edition string should be available in pylons too, so we add it here |
|
150 | 129 | # before copying the settings. |
|
151 | 130 | settings.setdefault('rhodecode.edition', 'Community Edition') |
|
152 | 131 | |
|
153 | 132 | # As long as our Pylons application does expect "unprepared" settings, make |
|
154 | 133 | # sure that we keep an unmodified copy. This avoids unintentional change of |
|
155 | 134 | # behavior in the old application. |
|
156 | 135 | settings_pylons = settings.copy() |
|
157 | 136 | |
|
158 | 137 | sanitize_settings_and_apply_defaults(settings) |
|
159 | 138 | config = Configurator(settings=settings) |
|
160 | 139 | add_pylons_compat_data(config.registry, global_config, settings_pylons) |
|
161 | 140 | |
|
162 | 141 | load_pyramid_environment(global_config, settings) |
|
163 | 142 | |
|
164 | 143 | includeme(config) |
|
165 | 144 | includeme_last(config) |
|
166 | 145 | pyramid_app = config.make_wsgi_app() |
|
167 | 146 | pyramid_app = wrap_app_in_wsgi_middlewares(pyramid_app, config) |
|
168 | 147 | return pyramid_app |
|
169 | 148 | |
|
170 | 149 | |
|
171 | 150 | def add_pylons_compat_data(registry, global_config, settings): |
|
172 | 151 | """ |
|
173 | 152 | Attach data to the registry to support the Pylons integration. |
|
174 | 153 | """ |
|
175 | 154 | registry._pylons_compat_global_config = global_config |
|
176 | 155 | registry._pylons_compat_settings = settings |
|
177 | 156 | |
|
178 | 157 | |
|
158 | def webob_to_pyramid_http_response(webob_response): | |
|
159 | ResponseClass = httpexceptions.status_map[webob_response.status_int] | |
|
160 | pyramid_response = ResponseClass(webob_response.status) | |
|
161 | pyramid_response.status = webob_response.status | |
|
162 | pyramid_response.headers.update(webob_response.headers) | |
|
163 | if pyramid_response.headers['content-type'] == 'text/html': | |
|
164 | pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8' | |
|
165 | return pyramid_response | |
|
166 | ||
|
167 | ||
|
168 | def error_handler(exception, request): | |
|
169 | # TODO: dan: replace the old pylons error controller with this | |
|
170 | from rhodecode.model.settings import SettingsModel | |
|
171 | from rhodecode.lib.utils2 import AttributeDict | |
|
172 | ||
|
173 | try: | |
|
174 | rc_config = SettingsModel().get_all_settings() | |
|
175 | except Exception: | |
|
176 | log.exception('failed to fetch settings') | |
|
177 | rc_config = {} | |
|
178 | ||
|
179 | base_response = HTTPInternalServerError() | |
|
180 | # prefer original exception for the response since it may have headers set | |
|
181 | if isinstance(exception, HTTPError): | |
|
182 | base_response = exception | |
|
183 | ||
|
184 | c = AttributeDict() | |
|
185 | c.error_message = base_response.status | |
|
186 | c.error_explanation = base_response.explanation or str(base_response) | |
|
187 | c.visual = AttributeDict() | |
|
188 | ||
|
189 | c.visual.rhodecode_support_url = ( | |
|
190 | request.registry.settings.get('rhodecode_support_url') or | |
|
191 | request.route_url('rhodecode_support') | |
|
192 | ) | |
|
193 | c.redirect_time = 0 | |
|
194 | c.rhodecode_name = rc_config.get('rhodecode_title', '') | |
|
195 | if not c.rhodecode_name: | |
|
196 | c.rhodecode_name = 'Rhodecode' | |
|
197 | ||
|
198 | response = render_to_response( | |
|
199 | '/errors/error_document.html', {'c': c}, request=request, | |
|
200 | response=base_response) | |
|
201 | ||
|
202 | return response | |
|
203 | ||
|
204 | ||
|
179 | 205 | def includeme(config): |
|
180 | 206 | settings = config.registry.settings |
|
181 | 207 | |
|
208 | if asbool(settings.get('appenlight', 'false')): | |
|
209 | config.include('appenlight_client.ext.pyramid_tween') | |
|
210 | ||
|
182 | 211 | # Includes which are required. The application would fail without them. |
|
183 | 212 | config.include('pyramid_mako') |
|
184 | 213 | config.include('pyramid_beaker') |
|
214 | config.include('rhodecode.admin') | |
|
185 | 215 | config.include('rhodecode.authentication') |
|
186 | 216 | config.include('rhodecode.login') |
|
187 | 217 | config.include('rhodecode.tweens') |
|
188 | 218 | config.include('rhodecode.api') |
|
219 | config.add_route( | |
|
220 | 'rhodecode_support', 'https://rhodecode.com/help/', static=True) | |
|
189 | 221 | |
|
190 | 222 | # Set the authorization policy. |
|
191 | 223 | authz_policy = ACLAuthorizationPolicy() |
|
192 | 224 | config.set_authorization_policy(authz_policy) |
|
193 | 225 | |
|
194 | 226 | # Set the default renderer for HTML templates to mako. |
|
195 | 227 | config.add_mako_renderer('.html') |
|
196 | 228 | |
|
197 | 229 | # plugin information |
|
198 | 230 | config.registry.rhodecode_plugins = {} |
|
199 | 231 | |
|
200 | 232 | config.add_directive( |
|
201 | 233 | 'register_rhodecode_plugin', register_rhodecode_plugin) |
|
202 | 234 | # include RhodeCode plugins |
|
203 | 235 | includes = aslist(settings.get('rhodecode.includes', [])) |
|
204 | 236 | for inc in includes: |
|
205 | 237 | config.include(inc) |
|
206 | 238 | |
|
239 | pylons_app = make_app( | |
|
240 | config.registry._pylons_compat_global_config, | |
|
241 | **config.registry._pylons_compat_settings) | |
|
242 | config.registry._pylons_compat_config = pylons_app.config | |
|
243 | ||
|
244 | pylons_app_as_view = wsgiapp(pylons_app) | |
|
245 | ||
|
246 | # Protect from VCS Server error related pages when server is not available | |
|
247 | vcs_server_enabled = asbool(settings.get('vcs.server.enable', 'true')) | |
|
248 | if not vcs_server_enabled: | |
|
249 | pylons_app_as_view = DisableVCSPagesWrapper(pylons_app_as_view) | |
|
250 | ||
|
251 | ||
|
252 | def pylons_app_with_error_handler(context, request): | |
|
253 | """ | |
|
254 | Handle exceptions from rc pylons app: | |
|
255 | ||
|
256 | - old webob type exceptions get converted to pyramid exceptions | |
|
257 | - pyramid exceptions are passed to the error handler view | |
|
258 | """ | |
|
259 | try: | |
|
260 | response = pylons_app_as_view(context, request) | |
|
261 | if 400 <= response.status_int <= 599: # webob type error responses | |
|
262 | return error_handler( | |
|
263 | webob_to_pyramid_http_response(response), request) | |
|
264 | except HTTPError as e: # pyramid type exceptions | |
|
265 | return error_handler(e, request) | |
|
266 | except Exception: | |
|
267 | if settings.get('debugtoolbar.enabled', False): | |
|
268 | raise | |
|
269 | return error_handler(HTTPInternalServerError(), request) | |
|
270 | return response | |
|
271 | ||
|
207 | 272 | # This is the glue which allows us to migrate in chunks. By registering the |
|
208 | 273 | # pylons based application as the "Not Found" view in Pyramid, we will |
|
209 | 274 | # fallback to the old application each time the new one does not yet know |
|
210 | 275 | # how to handle a request. |
|
211 | pylons_app = make_app( | |
|
212 | config.registry._pylons_compat_global_config, | |
|
213 | **config.registry._pylons_compat_settings) | |
|
214 | config.registry._pylons_compat_config = pylons_app.config | |
|
215 | pylons_app_as_view = wsgiapp(pylons_app) | |
|
216 | config.add_notfound_view(pylons_app_as_view) | |
|
276 | config.add_notfound_view(pylons_app_with_error_handler) | |
|
277 | ||
|
278 | if settings.get('debugtoolbar.enabled', False): | |
|
279 | # if toolbar, then only http type exceptions get caught and rendered | |
|
280 | ExcClass = HTTPError | |
|
281 | else: | |
|
282 | # if no toolbar, then any exception gets caught and rendered | |
|
283 | ExcClass = Exception | |
|
284 | config.add_view(error_handler, context=ExcClass) | |
|
217 | 285 | |
|
218 | 286 | |
|
219 | 287 | def includeme_last(config): |
|
220 | 288 | """ |
|
221 | 289 | The static file catchall needs to be last in the view configuration. |
|
222 | 290 | """ |
|
223 | 291 | settings = config.registry.settings |
|
224 | 292 | |
|
225 | 293 | # Note: johbo: I would prefer to register a prefix for static files at some |
|
226 | 294 | # point, e.g. move them under '_static/'. This would fully avoid that we |
|
227 | 295 | # can have name clashes with a repository name. Imaging someone calling his |
|
228 | 296 | # repo "css" ;-) Also having an external web server to serve out the static |
|
229 | 297 | # files seems to be easier to set up if they have a common prefix. |
|
230 | 298 | # |
|
231 | 299 | # Example: config.add_static_view('_static', path='rhodecode:public') |
|
232 | 300 | # |
|
233 | 301 | # It might be an option to register both paths for a while and then migrate |
|
234 | 302 | # over to the new location. |
|
235 | 303 | |
|
236 | 304 | # Serving static files with a catchall. |
|
237 | 305 | if settings['static_files']: |
|
238 | 306 | config.add_route('catchall_static', '/*subpath') |
|
239 | 307 | config.add_view( |
|
240 | 308 | static_view('rhodecode:public'), route_name='catchall_static') |
|
241 | 309 | |
|
242 | 310 | |
|
243 | 311 | def wrap_app_in_wsgi_middlewares(pyramid_app, config): |
|
244 | 312 | """ |
|
245 | 313 | Apply outer WSGI middlewares around the application. |
|
246 | 314 | |
|
247 | 315 | Part of this has been moved up from the Pylons layer, so that the |
|
248 | 316 | data is also available if old Pylons code is hit through an already ported |
|
249 | 317 | view. |
|
250 | 318 | """ |
|
251 | 319 | settings = config.registry.settings |
|
252 | 320 | |
|
253 | 321 | # enable https redirects based on HTTP_X_URL_SCHEME set by proxy |
|
254 | 322 | pyramid_app = HttpsFixup(pyramid_app, settings) |
|
255 | 323 | |
|
256 | # Add RoutesMiddleware. Currently we have two instances in the stack. This | |
|
257 | # is the upper one to support the pylons compatibility tween during | |
|
324 | # Add RoutesMiddleware to support the pylons compatibility tween during | |
|
258 | 325 | |
|
259 | 326 | # migration to pyramid. |
|
260 | 327 | pyramid_app = RoutesMiddleware( |
|
261 | 328 | pyramid_app, config.registry._pylons_compat_config['routes.map']) |
|
262 | 329 | |
|
330 | if asbool(settings.get('appenlight', 'false')): | |
|
331 | pyramid_app, _ = wrap_in_appenlight_if_enabled( | |
|
332 | pyramid_app, config.registry._pylons_compat_config) | |
|
333 | ||
|
263 | 334 | # TODO: johbo: Don't really see why we enable the gzip middleware when |
|
264 | 335 | # serving static files, might be something that should have its own setting |
|
265 | 336 | # as well? |
|
266 | 337 | if settings['static_files']: |
|
267 | 338 | pyramid_app = make_gzip_middleware( |
|
268 | 339 | pyramid_app, settings, compress_level=1) |
|
269 | 340 | |
|
270 | 341 | return pyramid_app |
|
271 | 342 | |
|
272 | 343 | |
|
273 | 344 | def sanitize_settings_and_apply_defaults(settings): |
|
274 | 345 | """ |
|
275 | 346 | Applies settings defaults and does all type conversion. |
|
276 | 347 | |
|
277 | 348 | We would move all settings parsing and preparation into this place, so that |
|
278 | 349 | we have only one place left which deals with this part. The remaining parts |
|
279 | 350 | of the application would start to rely fully on well prepared settings. |
|
280 | 351 | |
|
281 | 352 | This piece would later be split up per topic to avoid a big fat monster |
|
282 | 353 | function. |
|
283 | 354 | """ |
|
284 | 355 | |
|
285 | 356 | # Pyramid's mako renderer has to search in the templates folder so that the |
|
286 | 357 | # old templates still work. Ported and new templates are expected to use |
|
287 | 358 | # real asset specifications for the includes. |
|
288 | 359 | mako_directories = settings.setdefault('mako.directories', [ |
|
289 | 360 | # Base templates of the original Pylons application |
|
290 | 361 | 'rhodecode:templates', |
|
291 | 362 | ]) |
|
292 | 363 | log.debug( |
|
293 | 364 | "Using the following Mako template directories: %s", |
|
294 | 365 | mako_directories) |
|
295 | 366 | |
|
296 | 367 | # Default includes, possible to change as a user |
|
297 | 368 | pyramid_includes = settings.setdefault('pyramid.includes', [ |
|
298 | 369 | 'rhodecode.lib.middleware.request_wrapper', |
|
299 | 370 | ]) |
|
300 | 371 | log.debug( |
|
301 | 372 | "Using the following pyramid.includes: %s", |
|
302 | 373 | pyramid_includes) |
|
303 | 374 | |
|
304 | 375 | # TODO: johbo: Re-think this, usually the call to config.include |
|
305 | 376 | # should allow to pass in a prefix. |
|
306 | 377 | settings.setdefault('rhodecode.api.url', '/_admin/api') |
|
307 | 378 | |
|
308 | 379 | _bool_setting(settings, 'vcs.server.enable', 'true') |
|
309 | 380 | _bool_setting(settings, 'static_files', 'true') |
|
310 | 381 | _bool_setting(settings, 'is_test', 'false') |
|
311 | 382 | |
|
312 | 383 | return settings |
|
313 | 384 | |
|
314 | 385 | |
|
315 | 386 | def _bool_setting(settings, name, default): |
|
316 | 387 | settings[name] = asbool(settings.get(name, default)) |
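The key change in this file is `pylons_app_with_error_handler`: the old Pylons `ErrorHandler`/`StatusCodeRedirect` middlewares are dropped, and instead the wrapped Pylons app's 4xx/5xx responses and raised HTTP-style exceptions are all funneled into one Pyramid `error_handler` view. A standalone sketch of that wrapping shape (plain Python, no Pyramid imports; the `Fake*` classes and names are hypothetical stand-ins, not RhodeCode API):

```python
# Stand-ins for a WSGI-ish response and an HTTP-style exception.
class FakeResponse(object):
    def __init__(self, status_int):
        self.status_int = status_int

class FakeHTTPError(Exception):
    status_int = 500

def make_error_handling_view(app, error_handler):
    """Wrap `app` so error responses and exceptions hit one handler."""
    def view(context, request):
        try:
            response = app(context, request)
            if 400 <= response.status_int <= 599:   # error-ish response
                return error_handler(response, request)
        except FakeHTTPError as e:                  # HTTP-type exceptions
            return error_handler(e, request)
        except Exception:                           # anything else -> generic 500
            return error_handler(FakeResponse(500), request)
        return response
    return view

handler = lambda exc, request: 'error page for %s' % exc.status_int
ok_app = lambda ctx, req: FakeResponse(200)
bad_app = lambda ctx, req: FakeResponse(404)

view = make_error_handling_view(bad_app, handler)
print(view(None, None))  # -> error page for 404
```

In the real changeset the same three branches appear, with `webob_to_pyramid_http_response` converting the Pylons response before it reaches the handler, and the `debugtoolbar.enabled` setting deciding whether non-HTTP exceptions re-raise or render the error page.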
@@ -1,1149 +1,1141 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | Routes configuration |
|
23 | 23 | |
|
24 | 24 | The more specific and detailed routes should be defined first so they |
|
25 | 25 | may take precedent over the more generic routes. For more information |
|
26 | 26 | refer to the routes manual at http://routes.groovie.org/docs/ |
|
27 | 27 | |
|
28 | 28 | IMPORTANT: if you change any routing here, make sure to take a look at lib/base.py |
|
29 | 29 | and _route_name variable which uses some of stored naming here to do redirects. |
|
30 | 30 | """ |
|
31 | 31 | import os |
|
32 | 32 | import re |
|
33 | 33 | from routes import Mapper |
|
34 | 34 | |
|
35 | 35 | from rhodecode.config import routing_links |
|
36 | 36 | |
|
37 | 37 | # prefix for non repository related links needs to be prefixed with `/` |
|
38 | 38 | ADMIN_PREFIX = '/_admin' |
|
39 | 39 | |
|
40 | 40 | # Default requirements for URL parts |
|
41 | 41 | URL_NAME_REQUIREMENTS = { |
|
42 | 42 | # group name can have a slash in them, but they must not end with a slash |
|
43 | 43 | 'group_name': r'.*?[^/]', |
|
44 | 44 | # repo names can have a slash in them, but they must not end with a slash |
|
45 | 45 | 'repo_name': r'.*?[^/]', |
|
46 | 46 | # file path eats up everything at the end |
|
47 | 47 | 'f_path': r'.*', |
|
48 | 48 | # reference types |
|
49 | 49 | 'source_ref_type': '(branch|book|tag|rev|\%\(source_ref_type\)s)', |
|
50 | 50 | 'target_ref_type': '(branch|book|tag|rev|\%\(target_ref_type\)s)', |
|
51 | 51 | } |
|
52 | 52 | |
|
53 | 53 | |
|
54 | 54 | class JSRoutesMapper(Mapper): |
|
55 | 55 | """ |
|
56 | 56 | Wrapper for routes.Mapper to make pyroutes compatible url definitions |
|
57 | 57 | """ |
|
58 | 58 | _named_route_regex = re.compile(r'^[a-z-_0-9A-Z]+$') |
|
59 | 59 | _argument_prog = re.compile('\{(.*?)\}|:\((.*)\)') |
|
60 | 60 | def __init__(self, *args, **kw): |
|
61 | 61 | super(JSRoutesMapper, self).__init__(*args, **kw) |
|
62 | 62 | self._jsroutes = [] |
|
63 | 63 | |
|
64 | 64 | def connect(self, *args, **kw): |
|
65 | 65 | """ |
|
66 | 66 | Wrapper for connect to take an extra argument jsroute=True |
|
67 | 67 | |
|
68 | 68 | :param jsroute: boolean, if True will add the route to the pyroutes list |
|
69 | 69 | """ |
|
70 | 70 | if kw.pop('jsroute', False): |
|
71 | 71 | if not self._named_route_regex.match(args[0]): |
|
72 | 72 | raise Exception('only named routes can be added to pyroutes') |
|
73 | 73 | self._jsroutes.append(args[0]) |
|
74 | 74 | |
|
75 | 75 | super(JSRoutesMapper, self).connect(*args, **kw) |
|
76 | 76 | |
|
77 | 77 | def _extract_route_information(self, route): |
|
78 | 78 | """ |
|
79 | 79 | Convert a route into tuple(name, path, args), eg: |
|
80 | 80 | ('user_profile', '/profile/%(username)s', ['username']) |
|
81 | 81 | """ |
|
82 | 82 | routepath = route.routepath |
|
83 | 83 | def replace(matchobj): |
|
84 | 84 | if matchobj.group(1): |
|
85 | 85 | return "%%(%s)s" % matchobj.group(1).split(':')[0] |
|
86 | 86 | else: |
|
87 | 87 | return "%%(%s)s" % matchobj.group(2) |
|
88 | 88 | |
|
89 | 89 | routepath = self._argument_prog.sub(replace, routepath) |
|
90 | 90 | return ( |
|
91 | 91 | route.name, |
|
92 | 92 | routepath, |
|
93 | 93 | [(arg[0].split(':')[0] if arg[0] != '' else arg[1]) |
|
94 | 94 | for arg in self._argument_prog.findall(route.routepath)] |
|
95 | 95 | ) |
|
96 | 96 | |
|
97 | 97 | def jsroutes(self): |
|
98 | 98 | """ |
|
99 | 99 | Return a list of pyroutes.js compatible routes |
|
100 | 100 | """ |
|
101 | 101 | for route_name in self._jsroutes: |
|
102 | 102 | yield self._extract_route_information(self._routenames[route_name]) |
|
103 | 103 | |
|
104 | 104 | |
|
105 | 105 | def make_map(config): |
|
106 | 106 | """Create, configure and return the routes Mapper""" |
|
107 | 107 | rmap = JSRoutesMapper(directory=config['pylons.paths']['controllers'], |
|
108 | 108 | always_scan=config['debug']) |
|
109 | 109 | rmap.minimization = False |
|
110 | 110 | rmap.explicit = False |
|
111 | 111 | |
|
112 | 112 | from rhodecode.lib.utils2 import str2bool |
|
113 | 113 | from rhodecode.model import repo, repo_group |
|
114 | 114 | |
|
115 | 115 | def check_repo(environ, match_dict): |
|
116 | 116 | """ |
|
117 | 117 | check for valid repository for proper 404 handling |
|
118 | 118 | |
|
119 | 119 | :param environ: |
|
120 | 120 | :param match_dict: |
|
121 | 121 | """ |
|
122 | 122 | repo_name = match_dict.get('repo_name') |
|
123 | 123 | |
|
124 | 124 | if match_dict.get('f_path'): |
|
125 | 125 | # fix for multiple initial slashes that causes errors |
|
126 | 126 | match_dict['f_path'] = match_dict['f_path'].lstrip('/') |
|
127 | 127 | repo_model = repo.RepoModel() |
|
128 | 128 | by_name_match = repo_model.get_by_repo_name(repo_name) |
|
129 | 129 | # if we match quickly from database, short circuit the operation, |
|
130 | 130 | # and validate repo based on the type. |
|
131 | 131 | if by_name_match: |
|
132 | 132 | return True |
|
133 | 133 | |
|
134 | 134 | by_id_match = repo_model.get_repo_by_id(repo_name) |
|
135 | 135 | if by_id_match: |
|
136 | 136 | repo_name = by_id_match.repo_name |
|
137 | 137 | match_dict['repo_name'] = repo_name |
|
138 | 138 | return True |
|
139 | 139 | |
|
140 | 140 | return False |
|
141 | 141 | |
|
142 | 142 | def check_group(environ, match_dict): |
|
143 | 143 | """ |
|
144 | 144 | check for valid repository group path for proper 404 handling |
|
145 | 145 | |
|
146 | 146 | :param environ: |
|
147 | 147 | :param match_dict: |
|
148 | 148 | """ |
|
149 | 149 | repo_group_name = match_dict.get('group_name') |
|
150 | 150 | repo_group_model = repo_group.RepoGroupModel() |
|
151 | 151 | by_name_match = repo_group_model.get_by_group_name(repo_group_name) |
|
152 | 152 | if by_name_match: |
|
153 | 153 | return True |
|
154 | 154 | |
|
155 | 155 | return False |
|
156 | 156 | |
|
157 | 157 | def check_user_group(environ, match_dict): |
|
158 | 158 | """ |
|
159 | 159 | check for valid user group for proper 404 handling |
|
160 | 160 | |
|
161 | 161 | :param environ: |
|
162 | 162 | :param match_dict: |
|
163 | 163 | """ |
|
164 | 164 | return True |
|
165 | 165 | |
|
166 | 166 | def check_int(environ, match_dict): |
|
167 | 167 | return match_dict.get('id').isdigit() |
|

    # The ErrorController route (handles 404/500 error pages); it should
    # likely stay at the top, ensuring it can always be resolved
    rmap.connect('/error/{action}', controller='error')
    rmap.connect('/error/{action}/{id}', controller='error')

    #==========================================================================
    # CUSTOM ROUTES HERE
    #==========================================================================

    # MAIN PAGE
    rmap.connect('home', '/', controller='home', action='index', jsroute=True)
    rmap.connect('goto_switcher_data', '/_goto_data', controller='home',
                 action='goto_switcher_data')
    rmap.connect('repo_list_data', '/_repos', controller='home',
                 action='repo_list_data')

    rmap.connect('user_autocomplete_data', '/_users', controller='home',
                 action='user_autocomplete_data', jsroute=True)
    rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home',
                 action='user_group_autocomplete_data')

    rmap.connect(
        'user_profile', '/_profiles/{username}', controller='users',
        action='user_profile')

    # TODO: johbo: Static links, to be replaced by our redirection mechanism
    rmap.connect('rst_help',
                 'http://docutils.sourceforge.net/docs/user/rst/quickref.html',
                 _static=True)
    rmap.connect('markdown_help',
                 'http://daringfireball.net/projects/markdown/syntax',
                 _static=True)
    rmap.connect('rhodecode_official', 'https://rhodecode.com', _static=True)
    rmap.connect('rhodecode_support', 'https://rhodecode.com/help/', _static=True)
    rmap.connect('rhodecode_translations', 'https://rhodecode.com/translate/enterprise', _static=True)
    # TODO: anderson - making this a static link since redirect won't play
    # nice with POST requests
    rmap.connect('enterprise_license_convert_from_old',
                 'https://rhodecode.com/u/license-upgrade',
                 _static=True)

    routing_links.connect_redirection_links(rmap)

    rmap.connect('ping', '%s/ping' % (ADMIN_PREFIX,), controller='home', action='ping')
    rmap.connect('error_test', '%s/error_test' % (ADMIN_PREFIX,), controller='home', action='error_test')

    # ADMIN REPOSITORY ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/repos') as m:
        m.connect('repos', '/repos',
                  action='create', conditions={'method': ['POST']})
        m.connect('repos', '/repos',
                  action='index', conditions={'method': ['GET']})
        m.connect('new_repo', '/create_repository', jsroute=True,
                  action='create_repository', conditions={'method': ['GET']})
        m.connect('/repos/{repo_name}',
                  action='update', conditions={'method': ['PUT'],
                                               'function': check_repo},
                  requirements=URL_NAME_REQUIREMENTS)
        m.connect('delete_repo', '/repos/{repo_name}',
                  action='delete', conditions={'method': ['DELETE']},
                  requirements=URL_NAME_REQUIREMENTS)
        m.connect('repo', '/repos/{repo_name}',
                  action='show', conditions={'method': ['GET'],
                                             'function': check_repo},
                  requirements=URL_NAME_REQUIREMENTS)
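The pattern repeated throughout these submapper blocks is one URL path bound to several actions, disambiguated by HTTP method via `conditions={'method': [...]}`. A minimal, hypothetical dispatcher (standing in for the Routes library, not reproducing its API) illustrates the idea:

```python
# Toy stand-in for Routes' method-conditioned matching: the same path
# maps to different actions depending on the HTTP verb.
ROUTES = [
    ('/repos', 'POST', 'create'),
    ('/repos', 'GET', 'index'),
    ('/repos/{repo_name}', 'DELETE', 'delete'),
]

def dispatch(path, method):
    for route_path, route_method, action in ROUTES:
        if route_path == path and route_method == method:
            return action
    return None  # no route matched; Routes would fall through / 404

assert dispatch('/repos', 'GET') == 'index'
assert dispatch('/repos', 'POST') == 'create'
assert dispatch('/repos', 'PUT') is None
```

This is why the same route name (e.g. `'repos'`) can appear twice above without conflict: the method condition keeps the matches disjoint.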
|

    # ADMIN REPOSITORY GROUPS ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/repo_groups') as m:
        m.connect('repo_groups', '/repo_groups',
                  action='create', conditions={'method': ['POST']})
        m.connect('repo_groups', '/repo_groups',
                  action='index', conditions={'method': ['GET']})
        m.connect('new_repo_group', '/repo_groups/new',
                  action='new', conditions={'method': ['GET']})
        m.connect('update_repo_group', '/repo_groups/{group_name}',
                  action='update', conditions={'method': ['PUT'],
                                               'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)

        # EXTRAS REPO GROUP ROUTES
        m.connect('edit_repo_group', '/repo_groups/{group_name}/edit',
                  action='edit',
                  conditions={'method': ['GET'], 'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)
        m.connect('edit_repo_group', '/repo_groups/{group_name}/edit',
                  action='edit',
                  conditions={'method': ['PUT'], 'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)

        m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced',
                  action='edit_repo_group_advanced',
                  conditions={'method': ['GET'], 'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)
        m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced',
                  action='edit_repo_group_advanced',
                  conditions={'method': ['PUT'], 'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)

        m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions',
                  action='edit_repo_group_perms',
                  conditions={'method': ['GET'], 'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)
        m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions',
                  action='update_perms',
                  conditions={'method': ['PUT'], 'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)

        m.connect('delete_repo_group', '/repo_groups/{group_name}',
                  action='delete', conditions={'method': ['DELETE'],
                                               'function': check_group},
                  requirements=URL_NAME_REQUIREMENTS)
|

    # ADMIN USER ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/users') as m:
        m.connect('users', '/users',
                  action='create', conditions={'method': ['POST']})
        m.connect('users', '/users',
                  action='index', conditions={'method': ['GET']})
        m.connect('new_user', '/users/new',
                  action='new', conditions={'method': ['GET']})
        m.connect('update_user', '/users/{user_id}',
                  action='update', conditions={'method': ['PUT']})
        m.connect('delete_user', '/users/{user_id}',
                  action='delete', conditions={'method': ['DELETE']})
        m.connect('edit_user', '/users/{user_id}/edit',
                  action='edit', conditions={'method': ['GET']})
        m.connect('user', '/users/{user_id}',
                  action='show', conditions={'method': ['GET']})
        m.connect('force_password_reset_user', '/users/{user_id}/password_reset',
                  action='reset_password', conditions={'method': ['POST']})
        m.connect('create_personal_repo_group', '/users/{user_id}/create_repo_group',
                  action='create_personal_repo_group', conditions={'method': ['POST']})

        # EXTRAS USER ROUTES
        m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced',
                  action='edit_advanced', conditions={'method': ['GET']})
        m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced',
                  action='update_advanced', conditions={'method': ['PUT']})

        m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens',
                  action='edit_auth_tokens', conditions={'method': ['GET']})
        m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens',
                  action='add_auth_token', conditions={'method': ['PUT']})
        m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens',
                  action='delete_auth_token', conditions={'method': ['DELETE']})

        m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions',
                  action='edit_global_perms', conditions={'method': ['GET']})
        m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions',
                  action='update_global_perms', conditions={'method': ['PUT']})

        m.connect('edit_user_perms_summary', '/users/{user_id}/edit/permissions_summary',
                  action='edit_perms_summary', conditions={'method': ['GET']})

        m.connect('edit_user_emails', '/users/{user_id}/edit/emails',
                  action='edit_emails', conditions={'method': ['GET']})
        m.connect('edit_user_emails', '/users/{user_id}/edit/emails',
                  action='add_email', conditions={'method': ['PUT']})
        m.connect('edit_user_emails', '/users/{user_id}/edit/emails',
                  action='delete_email', conditions={'method': ['DELETE']})

        m.connect('edit_user_ips', '/users/{user_id}/edit/ips',
                  action='edit_ips', conditions={'method': ['GET']})
        m.connect('edit_user_ips', '/users/{user_id}/edit/ips',
                  action='add_ip', conditions={'method': ['PUT']})
        m.connect('edit_user_ips', '/users/{user_id}/edit/ips',
                  action='delete_ip', conditions={'method': ['DELETE']})
|

    # ADMIN USER GROUPS REST ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/user_groups') as m:
        m.connect('users_groups', '/user_groups',
                  action='create', conditions={'method': ['POST']})
        m.connect('users_groups', '/user_groups',
                  action='index', conditions={'method': ['GET']})
        m.connect('new_users_group', '/user_groups/new',
                  action='new', conditions={'method': ['GET']})
        m.connect('update_users_group', '/user_groups/{user_group_id}',
                  action='update', conditions={'method': ['PUT']})
        m.connect('delete_users_group', '/user_groups/{user_group_id}',
                  action='delete', conditions={'method': ['DELETE']})
        m.connect('edit_users_group', '/user_groups/{user_group_id}/edit',
                  action='edit', conditions={'method': ['GET']},
                  function=check_user_group)

        # EXTRAS USER GROUP ROUTES
        m.connect('edit_user_group_global_perms',
                  '/user_groups/{user_group_id}/edit/global_permissions',
                  action='edit_global_perms', conditions={'method': ['GET']})
        m.connect('edit_user_group_global_perms',
                  '/user_groups/{user_group_id}/edit/global_permissions',
                  action='update_global_perms', conditions={'method': ['PUT']})
        m.connect('edit_user_group_perms_summary',
                  '/user_groups/{user_group_id}/edit/permissions_summary',
                  action='edit_perms_summary', conditions={'method': ['GET']})

        m.connect('edit_user_group_perms',
                  '/user_groups/{user_group_id}/edit/permissions',
                  action='edit_perms', conditions={'method': ['GET']})
        m.connect('edit_user_group_perms',
                  '/user_groups/{user_group_id}/edit/permissions',
                  action='update_perms', conditions={'method': ['PUT']})

        m.connect('edit_user_group_advanced',
                  '/user_groups/{user_group_id}/edit/advanced',
                  action='edit_advanced', conditions={'method': ['GET']})

        m.connect('edit_user_group_members',
                  '/user_groups/{user_group_id}/edit/members', jsroute=True,
                  action='edit_members', conditions={'method': ['GET']})
|

    # ADMIN PERMISSIONS ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/permissions') as m:
        m.connect('admin_permissions_application', '/permissions/application',
                  action='permission_application_update', conditions={'method': ['POST']})
        m.connect('admin_permissions_application', '/permissions/application',
                  action='permission_application', conditions={'method': ['GET']})

        m.connect('admin_permissions_global', '/permissions/global',
                  action='permission_global_update', conditions={'method': ['POST']})
        m.connect('admin_permissions_global', '/permissions/global',
                  action='permission_global', conditions={'method': ['GET']})

        m.connect('admin_permissions_object', '/permissions/object',
                  action='permission_objects_update', conditions={'method': ['POST']})
        m.connect('admin_permissions_object', '/permissions/object',
                  action='permission_objects', conditions={'method': ['GET']})

        m.connect('admin_permissions_ips', '/permissions/ips',
                  action='permission_ips', conditions={'method': ['POST']})
        m.connect('admin_permissions_ips', '/permissions/ips',
                  action='permission_ips', conditions={'method': ['GET']})

        m.connect('admin_permissions_overview', '/permissions/overview',
                  action='permission_perms', conditions={'method': ['GET']})
|

    # ADMIN DEFAULTS REST ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/defaults') as m:
        m.connect('admin_defaults_repositories', '/defaults/repositories',
                  action='update_repository_defaults', conditions={'method': ['POST']})
        m.connect('admin_defaults_repositories', '/defaults/repositories',
                  action='index', conditions={'method': ['GET']})

    # ADMIN DEBUG STYLE ROUTES
    if str2bool(config.get('debug_style')):
        with rmap.submapper(path_prefix=ADMIN_PREFIX + '/debug_style',
                            controller='debug_style') as m:
            m.connect('debug_style_home', '',
                      action='index', conditions={'method': ['GET']})
            m.connect('debug_style_template', '/t/{t_path}',
                      action='template', conditions={'method': ['GET']})
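The debug-style block above is gated on `str2bool(config.get('debug_style'))`, so the routes exist only when the ini option is set to a truthy string. A plausible sketch of such a helper, assuming the usual truthy spellings (the real `str2bool` ships with RhodeCode's utilities and may accept a different set):

```python
# Hedged sketch of a str2bool helper consistent with the config gate above;
# the accepted spellings here are an assumption, not the library's exact set.
def str2bool(value):
    return str(value).strip().lower() in ('true', 'yes', 'on', '1', 't', 'y')

assert str2bool('true') and str2bool('1')
assert not str2bool('false') and not str2bool(None)  # missing option stays off
```

Gating on the parsed boolean means an absent `debug_style` key (where `config.get` returns `None`) leaves the debug routes unregistered.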
|

    # ADMIN SETTINGS ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/settings') as m:

        # default
        m.connect('admin_settings', '/settings',
                  action='settings_global_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings', '/settings',
                  action='settings_global', conditions={'method': ['GET']})

        m.connect('admin_settings_vcs', '/settings/vcs',
                  action='settings_vcs_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_vcs', '/settings/vcs',
                  action='settings_vcs',
                  conditions={'method': ['GET']})
        m.connect('admin_settings_vcs', '/settings/vcs',
                  action='delete_svn_pattern',
                  conditions={'method': ['DELETE']})

        m.connect('admin_settings_mapping', '/settings/mapping',
                  action='settings_mapping_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_mapping', '/settings/mapping',
                  action='settings_mapping', conditions={'method': ['GET']})

        m.connect('admin_settings_global', '/settings/global',
                  action='settings_global_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_global', '/settings/global',
                  action='settings_global', conditions={'method': ['GET']})

        m.connect('admin_settings_visual', '/settings/visual',
                  action='settings_visual_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_visual', '/settings/visual',
                  action='settings_visual', conditions={'method': ['GET']})

        m.connect('admin_settings_issuetracker',
                  '/settings/issue-tracker', action='settings_issuetracker',
                  conditions={'method': ['GET']})
        m.connect('admin_settings_issuetracker_save',
                  '/settings/issue-tracker/save',
                  action='settings_issuetracker_save',
                  conditions={'method': ['POST']})
        m.connect('admin_issuetracker_test', '/settings/issue-tracker/test',
                  action='settings_issuetracker_test',
                  conditions={'method': ['POST']})
        m.connect('admin_issuetracker_delete',
                  '/settings/issue-tracker/delete',
                  action='settings_issuetracker_delete',
                  conditions={'method': ['DELETE']})

        m.connect('admin_settings_email', '/settings/email',
                  action='settings_email_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_email', '/settings/email',
                  action='settings_email', conditions={'method': ['GET']})

        m.connect('admin_settings_hooks', '/settings/hooks',
                  action='settings_hooks_update',
                  conditions={'method': ['POST', 'DELETE']})
        m.connect('admin_settings_hooks', '/settings/hooks',
                  action='settings_hooks', conditions={'method': ['GET']})

        m.connect('admin_settings_search', '/settings/search',
                  action='settings_search', conditions={'method': ['GET']})

        m.connect('admin_settings_system', '/settings/system',
                  action='settings_system', conditions={'method': ['GET']})

        m.connect('admin_settings_system_update', '/settings/system/updates',
                  action='settings_system_update', conditions={'method': ['GET']})

        m.connect('admin_settings_supervisor', '/settings/supervisor',
                  action='settings_supervisor', conditions={'method': ['GET']})
        m.connect('admin_settings_supervisor_log', '/settings/supervisor/{procid}/log',
                  action='settings_supervisor_log', conditions={'method': ['GET']})

        m.connect('admin_settings_labs', '/settings/labs',
                  action='settings_labs_update',
                  conditions={'method': ['POST']})
        m.connect('admin_settings_labs', '/settings/labs',
                  action='settings_labs', conditions={'method': ['GET']})

        m.connect('admin_settings_open_source', '/settings/open_source',
                  action='settings_open_source',
                  conditions={'method': ['GET']})
|
    # ADMIN MY ACCOUNT
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/my_account') as m:

        m.connect('my_account', '/my_account',
                  action='my_account', conditions={'method': ['GET']})
        m.connect('my_account_edit', '/my_account/edit',
                  action='my_account_edit', conditions={'method': ['GET']})
        m.connect('my_account', '/my_account',
                  action='my_account_update', conditions={'method': ['POST']})

        m.connect('my_account_password', '/my_account/password',
                  action='my_account_password', conditions={'method': ['GET']})
        m.connect('my_account_password', '/my_account/password',
                  action='my_account_password_update', conditions={'method': ['POST']})

        m.connect('my_account_repos', '/my_account/repos',
                  action='my_account_repos', conditions={'method': ['GET']})

        m.connect('my_account_watched', '/my_account/watched',
                  action='my_account_watched', conditions={'method': ['GET']})

        m.connect('my_account_pullrequests', '/my_account/pull_requests',
                  action='my_account_pullrequests', conditions={'method': ['GET']})

        m.connect('my_account_perms', '/my_account/perms',
                  action='my_account_perms', conditions={'method': ['GET']})

        m.connect('my_account_emails', '/my_account/emails',
                  action='my_account_emails', conditions={'method': ['GET']})
        m.connect('my_account_emails', '/my_account/emails',
                  action='my_account_emails_add', conditions={'method': ['POST']})
        m.connect('my_account_emails', '/my_account/emails',
                  action='my_account_emails_delete', conditions={'method': ['DELETE']})

        m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
                  action='my_account_auth_tokens', conditions={'method': ['GET']})
        m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
                  action='my_account_auth_tokens_add', conditions={'method': ['POST']})
        m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
                  action='my_account_auth_tokens_delete', conditions={'method': ['DELETE']})
|

    # NOTIFICATION REST ROUTES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/notifications') as m:
        m.connect('notifications', '/notifications',
                  action='index', conditions={'method': ['GET']})
        m.connect('notifications_mark_all_read', '/notifications/mark_all_read',
                  action='mark_all_read', conditions={'method': ['POST']})

        m.connect('/notifications/{notification_id}',
                  action='update', conditions={'method': ['PUT']})
        m.connect('/notifications/{notification_id}',
                  action='delete', conditions={'method': ['DELETE']})
        m.connect('notification', '/notifications/{notification_id}',
                  action='show', conditions={'method': ['GET']})
|

    # ADMIN GIST
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/gists') as m:
        m.connect('gists', '/gists',
                  action='create', conditions={'method': ['POST']})
        m.connect('gists', '/gists', jsroute=True,
                  action='index', conditions={'method': ['GET']})
        m.connect('new_gist', '/gists/new', jsroute=True,
                  action='new', conditions={'method': ['GET']})

        m.connect('/gists/{gist_id}',
                  action='delete', conditions={'method': ['DELETE']})
        m.connect('edit_gist', '/gists/{gist_id}/edit',
                  action='edit_form', conditions={'method': ['GET']})
        m.connect('edit_gist', '/gists/{gist_id}/edit',
                  action='edit', conditions={'method': ['POST']})
        m.connect(
            'edit_gist_check_revision', '/gists/{gist_id}/edit/check_revision',
            action='check_revision', conditions={'method': ['GET']})

        m.connect('gist', '/gists/{gist_id}',
                  action='show', conditions={'method': ['GET']})
        m.connect('gist_rev', '/gists/{gist_id}/{revision}',
                  revision='tip',
                  action='show', conditions={'method': ['GET']})
        m.connect('formatted_gist', '/gists/{gist_id}/{revision}/{format}',
                  revision='tip',
                  action='show', conditions={'method': ['GET']})
        m.connect('formatted_gist_file', '/gists/{gist_id}/{revision}/{format}/{f_path}',
                  revision='tip',
                  action='show', conditions={'method': ['GET']},
                  requirements=URL_NAME_REQUIREMENTS)
|

    # ADMIN MAIN PAGES
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
                        controller='admin/admin') as m:
        m.connect('admin_home', '', action='index')
        m.connect('admin_add_repo', '/add_repo/{new_repo:[a-z0-9\. _-]*}',
                  action='add_repo')
        m.connect(
            'pull_requests_global_0', '/pull_requests/{pull_request_id:[0-9]+}',
            action='pull_requests')
        m.connect(
            'pull_requests_global', '/pull-requests/{pull_request_id:[0-9]+}',
            action='pull_requests')
|


    # USER JOURNAL
    rmap.connect('journal', '%s/journal' % (ADMIN_PREFIX,),
                 controller='journal', action='index')
    rmap.connect('journal_rss', '%s/journal/rss' % (ADMIN_PREFIX,),
                 controller='journal', action='journal_rss')
    rmap.connect('journal_atom', '%s/journal/atom' % (ADMIN_PREFIX,),
                 controller='journal', action='journal_atom')

    rmap.connect('public_journal', '%s/public_journal' % (ADMIN_PREFIX,),
                 controller='journal', action='public_journal')

    rmap.connect('public_journal_rss', '%s/public_journal/rss' % (ADMIN_PREFIX,),
                 controller='journal', action='public_journal_rss')

    rmap.connect('public_journal_rss_old', '%s/public_journal_rss' % (ADMIN_PREFIX,),
                 controller='journal', action='public_journal_rss')

    rmap.connect('public_journal_atom',
                 '%s/public_journal/atom' % (ADMIN_PREFIX,), controller='journal',
                 action='public_journal_atom')

    rmap.connect('public_journal_atom_old',
                 '%s/public_journal_atom' % (ADMIN_PREFIX,), controller='journal',
                 action='public_journal_atom')

    rmap.connect('toggle_following', '%s/toggle_following' % (ADMIN_PREFIX,),
                 controller='journal', action='toggle_following', jsroute=True,
                 conditions={'method': ['POST']})
|

    # FULL TEXT SEARCH
    rmap.connect('search', '%s/search' % (ADMIN_PREFIX,),
                 controller='search')
    rmap.connect('search_repo_home', '/{repo_name}/search',
                 controller='search',
                 action='index',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    # FEEDS
    rmap.connect('rss_feed_home', '/{repo_name}/feed/rss',
                 controller='feed', action='rss',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('atom_feed_home', '/{repo_name}/feed/atom',
                 controller='feed', action='atom',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
|

    #==========================================================================
    # REPOSITORY ROUTES
    #==========================================================================

    rmap.connect('repo_creating_home', '/{repo_name}/repo_creating',
                 controller='admin/repos', action='repo_creating',
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('repo_check_home', '/{repo_name}/crepo_check',
                 controller='admin/repos', action='repo_check',
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('repo_stats', '/{repo_name}/repo_stats/{commit_id}',
                 controller='summary', action='repo_stats',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS, jsroute=True)

    rmap.connect('repo_refs_data', '/{repo_name}/refs-data',
                 controller='summary', action='repo_refs_data', jsroute=True,
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('repo_refs_changelog_data', '/{repo_name}/refs-data-changelog',
                 controller='summary', action='repo_refs_changelog_data',
                 requirements=URL_NAME_REQUIREMENTS, jsroute=True)

    rmap.connect('changeset_home', '/{repo_name}/changeset/{revision}',
                 controller='changeset', revision='tip', jsroute=True,
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('changeset_children', '/{repo_name}/changeset_children/{revision}',
                 controller='changeset', revision='tip', action='changeset_children',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('changeset_parents', '/{repo_name}/changeset_parents/{revision}',
                 controller='changeset', revision='tip', action='changeset_parents',
                 conditions={'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    # repo edit options
    rmap.connect('edit_repo', '/{repo_name}/settings', jsroute=True,
                 controller='admin/repos', action='edit',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_perms', '/{repo_name}/settings/permissions',
                 jsroute=True,
                 controller='admin/repos', action='edit_permissions',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('edit_repo_perms_update', '/{repo_name}/settings/permissions',
                 controller='admin/repos', action='edit_permissions_update',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_fields', '/{repo_name}/settings/fields',
                 controller='admin/repos', action='edit_fields',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('create_repo_fields', '/{repo_name}/settings/fields/new',
                 controller='admin/repos', action='create_repo_field',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('delete_repo_fields', '/{repo_name}/settings/fields/{field_id}',
                 controller='admin/repos', action='delete_repo_field',
                 conditions={'method': ['DELETE'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced', '/{repo_name}/settings/advanced',
                 controller='admin/repos', action='edit_advanced',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced_locking', '/{repo_name}/settings/advanced/locking',
                 controller='admin/repos', action='edit_advanced_locking',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)
    rmap.connect('toggle_locking', '/{repo_name}/settings/advanced/locking_toggle',
                 controller='admin/repos', action='toggle_locking',
                 conditions={'method': ['GET'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced_journal', '/{repo_name}/settings/advanced/journal',
                 controller='admin/repos', action='edit_advanced_journal',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_advanced_fork', '/{repo_name}/settings/advanced/fork',
                 controller='admin/repos', action='edit_advanced_fork',
                 conditions={'method': ['PUT'], 'function': check_repo},
                 requirements=URL_NAME_REQUIREMENTS)

    rmap.connect('edit_repo_caches', '/{repo_name}/settings/caches',
|
759 | 751 | controller='admin/repos', action='edit_caches_form', |
|
760 | 752 | conditions={'method': ['GET'], 'function': check_repo}, |
|
761 | 753 | requirements=URL_NAME_REQUIREMENTS) |
|
762 | 754 | rmap.connect('edit_repo_caches', '/{repo_name}/settings/caches', |
|
763 | 755 | controller='admin/repos', action='edit_caches', |
|
764 | 756 | conditions={'method': ['PUT'], 'function': check_repo}, |
|
765 | 757 | requirements=URL_NAME_REQUIREMENTS) |
|
766 | 758 | |
|
767 | 759 | rmap.connect('edit_repo_remote', '/{repo_name}/settings/remote', |
|
768 | 760 | controller='admin/repos', action='edit_remote_form', |
|
769 | 761 | conditions={'method': ['GET'], 'function': check_repo}, |
|
770 | 762 | requirements=URL_NAME_REQUIREMENTS) |
|
771 | 763 | rmap.connect('edit_repo_remote', '/{repo_name}/settings/remote', |
|
772 | 764 | controller='admin/repos', action='edit_remote', |
|
773 | 765 | conditions={'method': ['PUT'], 'function': check_repo}, |
|
774 | 766 | requirements=URL_NAME_REQUIREMENTS) |
|
775 | 767 | |
|
776 | 768 | rmap.connect('edit_repo_statistics', '/{repo_name}/settings/statistics', |
|
777 | 769 | controller='admin/repos', action='edit_statistics_form', |
|
778 | 770 | conditions={'method': ['GET'], 'function': check_repo}, |
|
779 | 771 | requirements=URL_NAME_REQUIREMENTS) |
|
780 | 772 | rmap.connect('edit_repo_statistics', '/{repo_name}/settings/statistics', |
|
781 | 773 | controller='admin/repos', action='edit_statistics', |
|
782 | 774 | conditions={'method': ['PUT'], 'function': check_repo}, |
|
783 | 775 | requirements=URL_NAME_REQUIREMENTS) |
|
784 | 776 | rmap.connect('repo_settings_issuetracker', |
|
785 | 777 | '/{repo_name}/settings/issue-tracker', |
|
786 | 778 | controller='admin/repos', action='repo_issuetracker', |
|
787 | 779 | conditions={'method': ['GET'], 'function': check_repo}, |
|
788 | 780 | requirements=URL_NAME_REQUIREMENTS) |
|
789 | 781 | rmap.connect('repo_issuetracker_test', |
|
790 | 782 | '/{repo_name}/settings/issue-tracker/test', |
|
791 | 783 | controller='admin/repos', action='repo_issuetracker_test', |
|
792 | 784 | conditions={'method': ['POST'], 'function': check_repo}, |
|
793 | 785 | requirements=URL_NAME_REQUIREMENTS) |
|
794 | 786 | rmap.connect('repo_issuetracker_delete', |
|
795 | 787 | '/{repo_name}/settings/issue-tracker/delete', |
|
796 | 788 | controller='admin/repos', action='repo_issuetracker_delete', |
|
797 | 789 | conditions={'method': ['DELETE'], 'function': check_repo}, |
|
798 | 790 | requirements=URL_NAME_REQUIREMENTS) |
|
799 | 791 | rmap.connect('repo_issuetracker_save', |
|
800 | 792 | '/{repo_name}/settings/issue-tracker/save', |
|
801 | 793 | controller='admin/repos', action='repo_issuetracker_save', |
|
802 | 794 | conditions={'method': ['POST'], 'function': check_repo}, |
|
803 | 795 | requirements=URL_NAME_REQUIREMENTS) |
|
804 | 796 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', |
|
805 | 797 | controller='admin/repos', action='repo_settings_vcs_update', |
|
806 | 798 | conditions={'method': ['POST'], 'function': check_repo}, |
|
807 | 799 | requirements=URL_NAME_REQUIREMENTS) |
|
808 | 800 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', |
|
809 | 801 | controller='admin/repos', action='repo_settings_vcs', |
|
810 | 802 | conditions={'method': ['GET'], 'function': check_repo}, |
|
811 | 803 | requirements=URL_NAME_REQUIREMENTS) |
|
812 | 804 | rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs', |
|
813 | 805 | controller='admin/repos', action='repo_delete_svn_pattern', |
|
814 | 806 | conditions={'method': ['DELETE'], 'function': check_repo}, |
|
815 | 807 | requirements=URL_NAME_REQUIREMENTS) |
|
816 | 808 | |
|
817 | 809 | # still-working URL kept for backward compatibility |
|
818 | 810 | rmap.connect('raw_changeset_home_depraced', |
|
819 | 811 | '/{repo_name}/raw-changeset/{revision}', |
|
820 | 812 | controller='changeset', action='changeset_raw', |
|
821 | 813 | revision='tip', conditions={'function': check_repo}, |
|
822 | 814 | requirements=URL_NAME_REQUIREMENTS) |
|
823 | 815 | |
|
824 | 816 | # new URLs |
|
825 | 817 | rmap.connect('changeset_raw_home', |
|
826 | 818 | '/{repo_name}/changeset-diff/{revision}', |
|
827 | 819 | controller='changeset', action='changeset_raw', |
|
828 | 820 | revision='tip', conditions={'function': check_repo}, |
|
829 | 821 | requirements=URL_NAME_REQUIREMENTS) |
|
830 | 822 | |
|
831 | 823 | rmap.connect('changeset_patch_home', |
|
832 | 824 | '/{repo_name}/changeset-patch/{revision}', |
|
833 | 825 | controller='changeset', action='changeset_patch', |
|
834 | 826 | revision='tip', conditions={'function': check_repo}, |
|
835 | 827 | requirements=URL_NAME_REQUIREMENTS) |
|
836 | 828 | |
|
837 | 829 | rmap.connect('changeset_download_home', |
|
838 | 830 | '/{repo_name}/changeset-download/{revision}', |
|
839 | 831 | controller='changeset', action='changeset_download', |
|
840 | 832 | revision='tip', conditions={'function': check_repo}, |
|
841 | 833 | requirements=URL_NAME_REQUIREMENTS) |
|
842 | 834 | |
|
843 | 835 | rmap.connect('changeset_comment', |
|
844 | 836 | '/{repo_name}/changeset/{revision}/comment', jsroute=True, |
|
845 | 837 | controller='changeset', revision='tip', action='comment', |
|
846 | 838 | conditions={'function': check_repo}, |
|
847 | 839 | requirements=URL_NAME_REQUIREMENTS) |
|
848 | 840 | |
|
849 | 841 | rmap.connect('changeset_comment_preview', |
|
850 | 842 | '/{repo_name}/changeset/comment/preview', jsroute=True, |
|
851 | 843 | controller='changeset', action='preview_comment', |
|
852 | 844 | conditions={'function': check_repo, 'method': ['POST']}, |
|
853 | 845 | requirements=URL_NAME_REQUIREMENTS) |
|
854 | 846 | |
|
855 | 847 | rmap.connect('changeset_comment_delete', |
|
856 | 848 | '/{repo_name}/changeset/comment/{comment_id}/delete', |
|
857 | 849 | controller='changeset', action='delete_comment', |
|
858 | 850 | conditions={'function': check_repo, 'method': ['DELETE']}, |
|
859 | 851 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
860 | 852 | |
|
861 | 853 | rmap.connect('changeset_info', '/changeset_info/{repo_name}/{revision}', |
|
862 | 854 | controller='changeset', action='changeset_info', |
|
863 | 855 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
864 | 856 | |
|
865 | 857 | rmap.connect('compare_home', |
|
866 | 858 | '/{repo_name}/compare', |
|
867 | 859 | controller='compare', action='index', |
|
868 | 860 | conditions={'function': check_repo}, |
|
869 | 861 | requirements=URL_NAME_REQUIREMENTS) |
|
870 | 862 | |
|
871 | 863 | rmap.connect('compare_url', |
|
872 | 864 | '/{repo_name}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', |
|
873 | 865 | controller='compare', action='compare', |
|
874 | 866 | conditions={'function': check_repo}, |
|
875 | 867 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
876 | 868 | |
|
877 | 869 | rmap.connect('pullrequest_home', |
|
878 | 870 | '/{repo_name}/pull-request/new', controller='pullrequests', |
|
879 | 871 | action='index', conditions={'function': check_repo, |
|
880 | 872 | 'method': ['GET']}, |
|
881 | 873 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
882 | 874 | |
|
883 | 875 | rmap.connect('pullrequest', |
|
884 | 876 | '/{repo_name}/pull-request/new', controller='pullrequests', |
|
885 | 877 | action='create', conditions={'function': check_repo, |
|
886 | 878 | 'method': ['POST']}, |
|
887 | 879 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
888 | 880 | |
|
889 | 881 | rmap.connect('pullrequest_repo_refs', |
|
890 | 882 | '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}', |
|
891 | 883 | controller='pullrequests', |
|
892 | 884 | action='get_repo_refs', |
|
893 | 885 | conditions={'function': check_repo, 'method': ['GET']}, |
|
894 | 886 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
895 | 887 | |
|
896 | 888 | rmap.connect('pullrequest_repo_destinations', |
|
897 | 889 | '/{repo_name}/pull-request/repo-destinations', |
|
898 | 890 | controller='pullrequests', |
|
899 | 891 | action='get_repo_destinations', |
|
900 | 892 | conditions={'function': check_repo, 'method': ['GET']}, |
|
901 | 893 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
902 | 894 | |
|
903 | 895 | rmap.connect('pullrequest_show', |
|
904 | 896 | '/{repo_name}/pull-request/{pull_request_id}', |
|
905 | 897 | controller='pullrequests', |
|
906 | 898 | action='show', conditions={'function': check_repo, |
|
907 | 899 | 'method': ['GET']}, |
|
908 | 900 | requirements=URL_NAME_REQUIREMENTS) |
|
909 | 901 | |
|
910 | 902 | rmap.connect('pullrequest_update', |
|
911 | 903 | '/{repo_name}/pull-request/{pull_request_id}', |
|
912 | 904 | controller='pullrequests', |
|
913 | 905 | action='update', conditions={'function': check_repo, |
|
914 | 906 | 'method': ['PUT']}, |
|
915 | 907 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
916 | 908 | |
|
917 | 909 | rmap.connect('pullrequest_merge', |
|
918 | 910 | '/{repo_name}/pull-request/{pull_request_id}', |
|
919 | 911 | controller='pullrequests', |
|
920 | 912 | action='merge', conditions={'function': check_repo, |
|
921 | 913 | 'method': ['POST']}, |
|
922 | 914 | requirements=URL_NAME_REQUIREMENTS) |
|
923 | 915 | |
|
924 | 916 | rmap.connect('pullrequest_delete', |
|
925 | 917 | '/{repo_name}/pull-request/{pull_request_id}', |
|
926 | 918 | controller='pullrequests', |
|
927 | 919 | action='delete', conditions={'function': check_repo, |
|
928 | 920 | 'method': ['DELETE']}, |
|
929 | 921 | requirements=URL_NAME_REQUIREMENTS) |
|
930 | 922 | |
|
931 | 923 | rmap.connect('pullrequest_show_all', |
|
932 | 924 | '/{repo_name}/pull-request', |
|
933 | 925 | controller='pullrequests', |
|
934 | 926 | action='show_all', conditions={'function': check_repo, |
|
935 | 927 | 'method': ['GET']}, |
|
936 | 928 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
937 | 929 | |
|
938 | 930 | rmap.connect('pullrequest_comment', |
|
939 | 931 | '/{repo_name}/pull-request-comment/{pull_request_id}', |
|
940 | 932 | controller='pullrequests', |
|
941 | 933 | action='comment', conditions={'function': check_repo, |
|
942 | 934 | 'method': ['POST']}, |
|
943 | 935 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
944 | 936 | |
|
945 | 937 | rmap.connect('pullrequest_comment_delete', |
|
946 | 938 | '/{repo_name}/pull-request-comment/{comment_id}/delete', |
|
947 | 939 | controller='pullrequests', action='delete_comment', |
|
948 | 940 | conditions={'function': check_repo, 'method': ['DELETE']}, |
|
949 | 941 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
950 | 942 | |
|
951 | 943 | rmap.connect('summary_home_explicit', '/{repo_name}/summary', |
|
952 | 944 | controller='summary', conditions={'function': check_repo}, |
|
953 | 945 | requirements=URL_NAME_REQUIREMENTS) |
|
954 | 946 | |
|
955 | 947 | rmap.connect('branches_home', '/{repo_name}/branches', |
|
956 | 948 | controller='branches', conditions={'function': check_repo}, |
|
957 | 949 | requirements=URL_NAME_REQUIREMENTS) |
|
958 | 950 | |
|
959 | 951 | rmap.connect('tags_home', '/{repo_name}/tags', |
|
960 | 952 | controller='tags', conditions={'function': check_repo}, |
|
961 | 953 | requirements=URL_NAME_REQUIREMENTS) |
|
962 | 954 | |
|
963 | 955 | rmap.connect('bookmarks_home', '/{repo_name}/bookmarks', |
|
964 | 956 | controller='bookmarks', conditions={'function': check_repo}, |
|
965 | 957 | requirements=URL_NAME_REQUIREMENTS) |
|
966 | 958 | |
|
967 | 959 | rmap.connect('changelog_home', '/{repo_name}/changelog', jsroute=True, |
|
968 | 960 | controller='changelog', conditions={'function': check_repo}, |
|
969 | 961 | requirements=URL_NAME_REQUIREMENTS) |
|
970 | 962 | |
|
971 | 963 | rmap.connect('changelog_summary_home', '/{repo_name}/changelog_summary', |
|
972 | 964 | controller='changelog', action='changelog_summary', |
|
973 | 965 | conditions={'function': check_repo}, |
|
974 | 966 | requirements=URL_NAME_REQUIREMENTS) |
|
975 | 967 | |
|
976 | 968 | rmap.connect('changelog_file_home', |
|
977 | 969 | '/{repo_name}/changelog/{revision}/{f_path}', |
|
978 | 970 | controller='changelog', f_path=None, |
|
979 | 971 | conditions={'function': check_repo}, |
|
980 | 972 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
981 | 973 | |
|
982 | 974 | rmap.connect('changelog_details', '/{repo_name}/changelog_details/{cs}', |
|
983 | 975 | controller='changelog', action='changelog_details', |
|
984 | 976 | conditions={'function': check_repo}, |
|
985 | 977 | requirements=URL_NAME_REQUIREMENTS) |
|
986 | 978 | |
|
987 | 979 | rmap.connect('files_home', '/{repo_name}/files/{revision}/{f_path}', |
|
988 | 980 | controller='files', revision='tip', f_path='', |
|
989 | 981 | conditions={'function': check_repo}, |
|
990 | 982 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
991 | 983 | |
|
992 | 984 | rmap.connect('files_home_simple_catchrev', |
|
993 | 985 | '/{repo_name}/files/{revision}', |
|
994 | 986 | controller='files', revision='tip', f_path='', |
|
995 | 987 | conditions={'function': check_repo}, |
|
996 | 988 | requirements=URL_NAME_REQUIREMENTS) |
|
997 | 989 | |
|
998 | 990 | rmap.connect('files_home_simple_catchall', |
|
999 | 991 | '/{repo_name}/files', |
|
1000 | 992 | controller='files', revision='tip', f_path='', |
|
1001 | 993 | conditions={'function': check_repo}, |
|
1002 | 994 | requirements=URL_NAME_REQUIREMENTS) |
|
1003 | 995 | |
|
1004 | 996 | rmap.connect('files_history_home', |
|
1005 | 997 | '/{repo_name}/history/{revision}/{f_path}', |
|
1006 | 998 | controller='files', action='history', revision='tip', f_path='', |
|
1007 | 999 | conditions={'function': check_repo}, |
|
1008 | 1000 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1009 | 1001 | |
|
1010 | 1002 | rmap.connect('files_authors_home', |
|
1011 | 1003 | '/{repo_name}/authors/{revision}/{f_path}', |
|
1012 | 1004 | controller='files', action='authors', revision='tip', f_path='', |
|
1013 | 1005 | conditions={'function': check_repo}, |
|
1014 | 1006 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1015 | 1007 | |
|
1016 | 1008 | rmap.connect('files_diff_home', '/{repo_name}/diff/{f_path}', |
|
1017 | 1009 | controller='files', action='diff', f_path='', |
|
1018 | 1010 | conditions={'function': check_repo}, |
|
1019 | 1011 | requirements=URL_NAME_REQUIREMENTS) |
|
1020 | 1012 | |
|
1021 | 1013 | rmap.connect('files_diff_2way_home', |
|
1022 | 1014 | '/{repo_name}/diff-2way/{f_path}', |
|
1023 | 1015 | controller='files', action='diff_2way', f_path='', |
|
1024 | 1016 | conditions={'function': check_repo}, |
|
1025 | 1017 | requirements=URL_NAME_REQUIREMENTS) |
|
1026 | 1018 | |
|
1027 | 1019 | rmap.connect('files_rawfile_home', |
|
1028 | 1020 | '/{repo_name}/rawfile/{revision}/{f_path}', |
|
1029 | 1021 | controller='files', action='rawfile', revision='tip', |
|
1030 | 1022 | f_path='', conditions={'function': check_repo}, |
|
1031 | 1023 | requirements=URL_NAME_REQUIREMENTS) |
|
1032 | 1024 | |
|
1033 | 1025 | rmap.connect('files_raw_home', |
|
1034 | 1026 | '/{repo_name}/raw/{revision}/{f_path}', |
|
1035 | 1027 | controller='files', action='raw', revision='tip', f_path='', |
|
1036 | 1028 | conditions={'function': check_repo}, |
|
1037 | 1029 | requirements=URL_NAME_REQUIREMENTS) |
|
1038 | 1030 | |
|
1039 | 1031 | rmap.connect('files_render_home', |
|
1040 | 1032 | '/{repo_name}/render/{revision}/{f_path}', |
|
1041 | 1033 | controller='files', action='index', revision='tip', f_path='', |
|
1042 | 1034 | rendered=True, conditions={'function': check_repo}, |
|
1043 | 1035 | requirements=URL_NAME_REQUIREMENTS) |
|
1044 | 1036 | |
|
1045 | 1037 | rmap.connect('files_annotate_home', |
|
1046 | 1038 | '/{repo_name}/annotate/{revision}/{f_path}', |
|
1047 | 1039 | controller='files', action='index', revision='tip', |
|
1048 | 1040 | f_path='', annotate=True, conditions={'function': check_repo}, |
|
1049 | 1041 | requirements=URL_NAME_REQUIREMENTS) |
|
1050 | 1042 | |
|
1051 | 1043 | rmap.connect('files_edit', |
|
1052 | 1044 | '/{repo_name}/edit/{revision}/{f_path}', |
|
1053 | 1045 | controller='files', action='edit', revision='tip', |
|
1054 | 1046 | f_path='', |
|
1055 | 1047 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1056 | 1048 | requirements=URL_NAME_REQUIREMENTS) |
|
1057 | 1049 | |
|
1058 | 1050 | rmap.connect('files_edit_home', |
|
1059 | 1051 | '/{repo_name}/edit/{revision}/{f_path}', |
|
1060 | 1052 | controller='files', action='edit_home', revision='tip', |
|
1061 | 1053 | f_path='', conditions={'function': check_repo}, |
|
1062 | 1054 | requirements=URL_NAME_REQUIREMENTS) |
|
1063 | 1055 | |
|
1064 | 1056 | rmap.connect('files_add', |
|
1065 | 1057 | '/{repo_name}/add/{revision}/{f_path}', |
|
1066 | 1058 | controller='files', action='add', revision='tip', |
|
1067 | 1059 | f_path='', |
|
1068 | 1060 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1069 | 1061 | requirements=URL_NAME_REQUIREMENTS) |
|
1070 | 1062 | |
|
1071 | 1063 | rmap.connect('files_add_home', |
|
1072 | 1064 | '/{repo_name}/add/{revision}/{f_path}', |
|
1073 | 1065 | controller='files', action='add_home', revision='tip', |
|
1074 | 1066 | f_path='', conditions={'function': check_repo}, |
|
1075 | 1067 | requirements=URL_NAME_REQUIREMENTS) |
|
1076 | 1068 | |
|
1077 | 1069 | rmap.connect('files_delete', |
|
1078 | 1070 | '/{repo_name}/delete/{revision}/{f_path}', |
|
1079 | 1071 | controller='files', action='delete', revision='tip', |
|
1080 | 1072 | f_path='', |
|
1081 | 1073 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1082 | 1074 | requirements=URL_NAME_REQUIREMENTS) |
|
1083 | 1075 | |
|
1084 | 1076 | rmap.connect('files_delete_home', |
|
1085 | 1077 | '/{repo_name}/delete/{revision}/{f_path}', |
|
1086 | 1078 | controller='files', action='delete_home', revision='tip', |
|
1087 | 1079 | f_path='', conditions={'function': check_repo}, |
|
1088 | 1080 | requirements=URL_NAME_REQUIREMENTS) |
|
1089 | 1081 | |
|
1090 | 1082 | rmap.connect('files_archive_home', '/{repo_name}/archive/{fname}', |
|
1091 | 1083 | controller='files', action='archivefile', |
|
1092 | 1084 | conditions={'function': check_repo}, |
|
1093 | 1085 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1094 | 1086 | |
|
1095 | 1087 | rmap.connect('files_nodelist_home', |
|
1096 | 1088 | '/{repo_name}/nodelist/{revision}/{f_path}', |
|
1097 | 1089 | controller='files', action='nodelist', |
|
1098 | 1090 | conditions={'function': check_repo}, |
|
1099 | 1091 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1100 | 1092 | |
|
1101 | 1093 | rmap.connect('files_metadata_list_home', |
|
1102 | 1094 | '/{repo_name}/metadata_list/{revision}/{f_path}', |
|
1103 | 1095 | controller='files', action='metadata_list', |
|
1104 | 1096 | conditions={'function': check_repo}, |
|
1105 | 1097 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
1106 | 1098 | |
|
1107 | 1099 | rmap.connect('repo_fork_create_home', '/{repo_name}/fork', |
|
1108 | 1100 | controller='forks', action='fork_create', |
|
1109 | 1101 | conditions={'function': check_repo, 'method': ['POST']}, |
|
1110 | 1102 | requirements=URL_NAME_REQUIREMENTS) |
|
1111 | 1103 | |
|
1112 | 1104 | rmap.connect('repo_fork_home', '/{repo_name}/fork', |
|
1113 | 1105 | controller='forks', action='fork', |
|
1114 | 1106 | conditions={'function': check_repo}, |
|
1115 | 1107 | requirements=URL_NAME_REQUIREMENTS) |
|
1116 | 1108 | |
|
1117 | 1109 | rmap.connect('repo_forks_home', '/{repo_name}/forks', |
|
1118 | 1110 | controller='forks', action='forks', |
|
1119 | 1111 | conditions={'function': check_repo}, |
|
1120 | 1112 | requirements=URL_NAME_REQUIREMENTS) |
|
1121 | 1113 | |
|
1122 | 1114 | rmap.connect('repo_followers_home', '/{repo_name}/followers', |
|
1123 | 1115 | controller='followers', action='followers', |
|
1124 | 1116 | conditions={'function': check_repo}, |
|
1125 | 1117 | requirements=URL_NAME_REQUIREMENTS) |
|
1126 | 1118 | |
|
1127 | 1119 | # must be here for proper group/repo catching pattern |
|
1128 | 1120 | _connect_with_slash( |
|
1129 | 1121 | rmap, 'repo_group_home', '/{group_name}', |
|
1130 | 1122 | controller='home', action='index_repo_group', |
|
1131 | 1123 | conditions={'function': check_group}, |
|
1132 | 1124 | requirements=URL_NAME_REQUIREMENTS) |
|
1133 | 1125 | |
|
1134 | 1126 | # catch all, at the end |
|
1135 | 1127 | _connect_with_slash( |
|
1136 | 1128 | rmap, 'summary_home', '/{repo_name}', jsroute=True, |
|
1137 | 1129 | controller='summary', action='index', |
|
1138 | 1130 | conditions={'function': check_repo}, |
|
1139 | 1131 | requirements=URL_NAME_REQUIREMENTS) |
|
1140 | 1132 | |
|
1141 | 1133 | return rmap |
|
1142 | 1134 | |
|
1143 | 1135 | |
|
1144 | 1136 | def _connect_with_slash(mapper, name, path, *args, **kwargs): |
|
1145 | 1137 | """ |
|
1146 | 1138 | Connect a route with an optional trailing slash in `path`. |
|
1147 | 1139 | """ |
|
1148 | 1140 | mapper.connect(name + '_slash', path + '/', *args, **kwargs) |
|
1149 | 1141 | mapper.connect(name, path, *args, **kwargs) |
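The `_connect_with_slash` helper above can be exercised in isolation. The sketch below uses a minimal stand-in for the Routes mapper (an assumption for self-containment — the real object is a `routes.Mapper`) to show that each call registers two routes, with the trailing-slash variant first so it is matched before the bare path:

```python
# Minimal stand-in for routes.Mapper: records (name, path) pairs
# in registration order instead of doing real URL dispatch.
class FakeMapper:
    def __init__(self):
        self.routes = []

    def connect(self, name, path, *args, **kwargs):
        self.routes.append((name, path))


def _connect_with_slash(mapper, name, path, *args, **kwargs):
    """Connect a route with an optional trailing slash in `path`."""
    mapper.connect(name + '_slash', path + '/', *args, **kwargs)
    mapper.connect(name, path, *args, **kwargs)


rmap = FakeMapper()
_connect_with_slash(rmap, 'summary_home', '/{repo_name}')
print(rmap.routes)
# [('summary_home_slash', '/{repo_name}/'), ('summary_home', '/{repo_name}')]
```

Registering the slash variant under a distinct name (`summary_home_slash`) keeps URL generation for the canonical name unambiguous while still accepting both forms.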
@@ -1,99 +1,103 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | import os |
|
22 | 22 | import shlex |
|
23 | 23 | import Pyro4 |
|
24 | 24 | import platform |
|
25 | 25 | |
|
26 | 26 | from rhodecode.model import init_model |
|
27 | 27 | |
|
28 | 28 | |
|
29 | 29 | def configure_pyro4(config): |
|
30 | 30 | """ |
|
31 | 31 | Configure Pyro4 based on `config`. |
|
32 | 32 | |
|
33 | 33 | This will mainly set the different configuration parameters of the Pyro4 |
|
34 | 34 | library based on the settings in our INI files. The Pyro4 documentation |
|
35 | 35 | lists more details about the specific settings and their meaning. |
|
36 | 36 | """ |
|
37 | 37 | Pyro4.config.COMMTIMEOUT = float(config['vcs.connection_timeout']) |
|
38 | 38 | Pyro4.config.SERIALIZER = 'pickle' |
|
39 | 39 | Pyro4.config.SERIALIZERS_ACCEPTED.add('pickle') |
|
40 | 40 | |
|
41 | 41 | # Note: We need server configuration in the WSGI processes |
|
42 | 42 | # because we provide a callback server in certain vcs operations. |
|
43 | 43 | Pyro4.config.SERVERTYPE = "multiplex" |
|
44 | 44 | Pyro4.config.POLLTIMEOUT = 0.01 |
|
45 | 45 | |
|
46 | 46 | |
|
47 | 47 | def configure_vcs(config): |
|
48 | 48 | """ |
|
49 | 49 | Patch VCS config with some RhodeCode specific stuff |
|
50 | 50 | """ |
|
51 | 51 | from rhodecode.lib.vcs import conf |
|
52 | 52 | from rhodecode.lib.utils2 import aslist |
|
53 | 53 | conf.settings.BACKENDS = { |
|
54 | 54 | 'hg': 'rhodecode.lib.vcs.backends.hg.MercurialRepository', |
|
55 | 55 | 'git': 'rhodecode.lib.vcs.backends.git.GitRepository', |
|
56 | 56 | 'svn': 'rhodecode.lib.vcs.backends.svn.SubversionRepository', |
|
57 | 57 | } |
|
58 | 58 | |
|
59 | 59 | conf.settings.HG_USE_REBASE_FOR_MERGING = config.get( |
|
60 | 60 | 'rhodecode_hg_use_rebase_for_merging', False) |
|
61 | 61 | conf.settings.GIT_REV_FILTER = shlex.split( |
|
62 | 62 | config.get('git_rev_filter', '--all').strip()) |
|
63 | 63 | conf.settings.DEFAULT_ENCODINGS = aslist(config.get('default_encoding', |
|
64 | 64 | 'UTF-8'), sep=',') |
|
65 | 65 | conf.settings.ALIASES[:] = config.get('vcs.backends') |
|
66 | 66 | conf.settings.SVN_COMPATIBLE_VERSION = config.get( |
|
67 | 67 | 'vcs.svn.compatible_version') |
|
68 | 68 | |
|
69 | 69 | |
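`configure_vcs` above turns flat INI strings into structured settings. A small sketch of the two parsing steps, with a simplified stand-in for RhodeCode's `aslist` helper (its exact semantics are an assumption here; only `shlex` is from the standard library):

```python
import shlex

# Simplified stand-in for rhodecode.lib.utils2.aslist: split a
# separator-delimited INI value into a list of stripped items.
def aslist(value, sep=','):
    return [part.strip() for part in value.split(sep) if part.strip()]

config = {}  # would normally come from the parsed .ini file

# shlex.split honours shell-style quoting in the rev filter string
git_rev_filter = shlex.split(config.get('git_rev_filter', '--all').strip())
default_encodings = aslist(config.get('default_encoding', 'UTF-8'), sep=',')

print(git_rev_filter)      # ['--all']
print(default_encodings)   # ['UTF-8']
```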
|
70 | 70 | def initialize_database(config): |
|
71 | from rhodecode.lib.utils2 import engine_from_config | |
|
71 | from rhodecode.lib.utils2 import engine_from_config, get_encryption_key | |
|
72 | 72 | engine = engine_from_config(config, 'sqlalchemy.db1.') |
|
73 | init_model(engine, encryption_key=config | |
|
73 | init_model(engine, encryption_key=get_encryption_key(config)) | |
|
74 | 74 | |
|
75 | 75 | |
|
76 | 76 | def initialize_test_environment(settings, test_env=None): |
|
77 | 77 | if test_env is None: |
|
78 | 78 | test_env = not int(os.environ.get('RC_NO_TMP_PATH', 0)) |
|
79 | 79 | |
|
80 | from rhodecode.lib.utils import | |
|
80 | from rhodecode.lib.utils import ( | |
|
81 | create_test_directory, create_test_database, create_test_repositories, | |
|
82 | create_test_index) | |
|
81 | 83 | from rhodecode.tests import TESTS_TMP_PATH |
|
82 | 84 | # test repos |
|
83 | 85 | if test_env: |
|
84 | create_test_ | |

85 | create_test_ | |
|
86 | create_test_directory(TESTS_TMP_PATH) | |
|
87 | create_test_database(TESTS_TMP_PATH, settings) | |
|
88 | create_test_repositories(TESTS_TMP_PATH, settings) | |
|
89 | create_test_index(TESTS_TMP_PATH, settings) | |
|
86 | 90 | |
|
87 | 91 | |
|
88 | 92 | def get_vcs_server_protocol(config): |
|
89 | 93 | protocol = config.get('vcs.server.protocol', 'pyro4') |
|
90 | 94 | return protocol |
|
91 | 95 | |
|
92 | 96 | |
|
93 | 97 | def set_instance_id(config): |
|
94 | 98 | """ Sets a dynamic generated config['instance_id'] if missing or '*' """ |
|
95 | 99 | |
|
96 | 100 | config['instance_id'] = config.get('instance_id') or '' |
|
97 | 101 | if config['instance_id'] == '*' or not config['instance_id']: |
|
98 | 102 | _platform_id = platform.uname()[1] or 'instance' |
|
99 | 103 | config['instance_id'] = '%s-%s' % (_platform_id, os.getpid()) |
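`set_instance_id` above is self-contained enough to run directly. It is reproduced here with hypothetical config dicts to show both the `'*'` expansion into a hostname-pid identifier and the pass-through of an explicit id:

```python
import os
import platform

def set_instance_id(config):
    """ Sets a dynamically generated config['instance_id'] if missing or '*' """
    config['instance_id'] = config.get('instance_id') or ''
    if config['instance_id'] == '*' or not config['instance_id']:
        # fall back to 'instance' when the hostname is empty
        _platform_id = platform.uname()[1] or 'instance'
        config['instance_id'] = '%s-%s' % (_platform_id, os.getpid())

cfg = {'instance_id': '*'}
set_instance_id(cfg)
print(cfg['instance_id'])   # e.g. 'myhost-12345' (hostname-pid, varies)

cfg2 = {'instance_id': 'node-a'}
set_instance_id(cfg2)
print(cfg2['instance_id'])  # 'node-a' (explicit ids are kept as-is)
```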
@@ -1,407 +1,406 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | |
|
22 | 22 | """ |
|
23 | 23 | Repository groups controller for RhodeCode |
|
24 | 24 | """ |
|
25 | 25 | |
|
26 | 26 | import logging |
|
27 | 27 | import formencode |
|
28 | 28 | |
|
29 | 29 | from formencode import htmlfill |
|
30 | 30 | |
|
31 | 31 | from pylons import request, tmpl_context as c, url |
|
32 | 32 | from pylons.controllers.util import abort, redirect |
|
33 | 33 | from pylons.i18n.translation import _, ungettext |
|
34 | 34 | |
|
35 | 35 | from rhodecode.lib import auth |
|
36 | 36 | from rhodecode.lib import helpers as h |
|
37 | 37 | from rhodecode.lib.ext_json import json |
|
38 | 38 | from rhodecode.lib.auth import ( |
|
39 | 39 | LoginRequired, NotAnonymous, HasPermissionAll, |
|
40 | 40 | HasRepoGroupPermissionAll, HasRepoGroupPermissionAnyDecorator) |
|
41 | 41 | from rhodecode.lib.base import BaseController, render |
|
42 | 42 | from rhodecode.model.db import RepoGroup, User |
|
43 | 43 | from rhodecode.model.scm import RepoGroupList |
|
44 | 44 | from rhodecode.model.repo_group import RepoGroupModel |
|
45 | 45 | from rhodecode.model.forms import RepoGroupForm, RepoGroupPermsForm |
|
46 | 46 | from rhodecode.model.meta import Session |
|
47 | 47 | from rhodecode.lib.utils2 import safe_int |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | log = logging.getLogger(__name__) |
|
51 | 51 | |
|
52 | 52 | |
|
53 | 53 | class RepoGroupsController(BaseController): |
|
54 | 54 | """REST Controller styled on the Atom Publishing Protocol""" |
|
55 | 55 | |
|
56 | 56 | @LoginRequired() |
|
57 | 57 | def __before__(self): |
|
58 | 58 | super(RepoGroupsController, self).__before__() |
|
59 | 59 | |
|
60 | 60 | def __load_defaults(self, allow_empty_group=False, repo_group=None): |
|
61 | 61 | if self._can_create_repo_group(): |
|
62 | 62 | # we're global admin, we're ok and we can create TOP level groups |
|
63 | 63 | allow_empty_group = True |
|
64 | 64 | |
|
65 | 65 | # override the choices for this form, we need to filter choices |
|
66 | 66 | # and display only those we have ADMIN right |
|
67 | 67 | groups_with_admin_rights = RepoGroupList( |
|
68 | 68 | RepoGroup.query().all(), |
|
69 | 69 | perm_set=['group.admin']) |
|
70 | 70 | c.repo_groups = RepoGroup.groups_choices( |
|
71 | 71 | groups=groups_with_admin_rights, |
|
72 | 72 | show_empty_group=allow_empty_group) |
|
73 | 73 | |
|
74 | 74 | if repo_group: |
|
75 | 75 | # exclude filtered ids |
|
76 | 76 | exclude_group_ids = [repo_group.group_id] |
|
77 | 77 | c.repo_groups = filter(lambda x: x[0] not in exclude_group_ids, |
|
78 | 78 | c.repo_groups) |
|
79 | 79 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
80 | 80 | parent_group = repo_group.parent_group |
|
81 | 81 | |
|
82 | 82 | add_parent_group = (parent_group and ( |
|
83 | 83 | unicode(parent_group.group_id) not in c.repo_groups_choices)) |
|
84 | 84 | if add_parent_group: |
|
85 | 85 | c.repo_groups_choices.append(unicode(parent_group.group_id)) |
|
86 | 86 | c.repo_groups.append(RepoGroup._generate_choice(parent_group)) |
|
87 | 87 | |
|
88 | 88 | def __load_data(self, group_id): |
|
89 | 89 | """ |
|
90 | 90 | Load defaults settings for edit, and update |
|
91 | 91 | |
|
92 | 92 | :param group_id: |
|
93 | 93 | """ |
|
94 | 94 | repo_group = RepoGroup.get_or_404(group_id) |
|
95 | 95 | data = repo_group.get_dict() |
|
96 | 96 | data['group_name'] = repo_group.name |
|
97 | 97 | |
|
98 | 98 | # fill owner |
|
99 | 99 | if repo_group.user: |
|
100 | 100 | data.update({'user': repo_group.user.username}) |
|
101 | 101 | else: |
|
102 | replacement_user = User.get_first_admin().username | |
|
102 | replacement_user = User.get_first_super_admin().username | |
|
103 | 103 | data.update({'user': replacement_user}) |
|
104 | 104 | |
|
105 | 105 | # fill repository group users |
|
106 | 106 | for p in repo_group.repo_group_to_perm: |
|
107 | 107 | data.update({ |
|
108 | 108 | 'u_perm_%s' % p.user.user_id: p.permission.permission_name}) |
|
109 | 109 | |
|
110 | 110 | # fill repository group user groups |
|
111 | 111 | for p in repo_group.users_group_to_perm: |
|
112 | 112 | data.update({ |
|
113 | 113 | 'g_perm_%s' % p.users_group.users_group_id: |
|
114 | 114 | p.permission.permission_name}) |
|
115 | 115 | # html and form expects -1 as empty parent group |
|
116 | 116 | data['group_parent_id'] = data['group_parent_id'] or -1 |
|
117 | 117 | return data |
|
118 | 118 | |
|
119 | 119 | def _revoke_perms_on_yourself(self, form_result): |
|
120 | 120 | _updates = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
121 | 121 | form_result['perm_updates']) |
|
122 | 122 | _additions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
123 | 123 | form_result['perm_additions']) |
|
124 | 124 | _deletions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
125 | 125 | form_result['perm_deletions']) |
|
126 | 126 | admin_perm = 'group.admin' |
|
127 | 127 | if _updates and _updates[0][1] != admin_perm or \ |
|
128 | 128 | _additions and _additions[0][1] != admin_perm or \ |
|
129 | 129 | _deletions and _deletions[0][1] != admin_perm: |
|
130 | 130 | return True |
|
131 | 131 | return False |
|
132 | 132 | |
|
133 | 133 | def _can_create_repo_group(self, parent_group_id=None): |
|
134 | 134 | is_admin = HasPermissionAll('hg.admin')('group create controller') |
|
135 | 135 | create_repo_group = HasPermissionAll( |
|
136 | 136 | 'hg.repogroup.create.true')('group create controller') |
|
137 | 137 | if is_admin or (create_repo_group and not parent_group_id): |
|
138 | 138 | # we're global admin, or we have global repo group create |
|
139 | 139 | # permission |
|
140 | 140 | # we're ok and we can create TOP level groups |
|
141 | 141 | return True |
|
142 | 142 | elif parent_group_id: |
|
143 | 143 | # we check the permission if we can write to parent group |
|
144 | 144 | group = RepoGroup.get(parent_group_id) |
|
145 | 145 | group_name = group.group_name if group else None |
|
146 | 146 | if HasRepoGroupPermissionAll('group.admin')( |
|
147 | 147 | group_name, 'check if user is an admin of group'): |
|
148 | 148 | # we're an admin of passed in group, we're ok. |
|
149 | 149 | return True |
|
150 | 150 | else: |
|
151 | 151 | return False |
|
152 | 152 | return False |
|
153 | 153 | |
|
154 | 154 | @NotAnonymous() |
|
155 | 155 | def index(self): |
|
156 | 156 | """GET /repo_groups: All items in the collection""" |
|
157 | 157 | # url('repo_groups') |
|
158 | 158 | |
|
159 | 159 | repo_group_list = RepoGroup.get_all_repo_groups() |
|
160 | 160 | _perms = ['group.admin'] |
|
161 | 161 | repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms) |
|
162 | 162 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( |
|
163 | 163 | repo_group_list=repo_group_list_acl, admin=True) |
|
164 | 164 | c.data = json.dumps(repo_group_data) |
|
165 | 165 | return render('admin/repo_groups/repo_groups.html') |
|
166 | 166 | |
|
167 | 167 | # perm checks inside |
|
168 | 168 | @NotAnonymous() |
|
169 | 169 | @auth.CSRFRequired() |
|
170 | 170 | def create(self): |
|
171 | 171 | """POST /repo_groups: Create a new item""" |
|
172 | 172 | # url('repo_groups') |
|
173 | 173 | |
|
174 | 174 | parent_group_id = safe_int(request.POST.get('group_parent_id')) |
|
175 | 175 | can_create = self._can_create_repo_group(parent_group_id) |
|
176 | 176 | |
|
177 | 177 | self.__load_defaults() |
|
178 | 178 | # permissions for can create group based on parent_id are checked |
|
179 | 179 | # here in the Form |
|
180 | 180 | available_groups = map(lambda k: unicode(k[0]), c.repo_groups) |
|
181 | 181 | repo_group_form = RepoGroupForm(available_groups=available_groups, |
|
182 | 182 | can_create_in_root=can_create)() |
|
183 | 183 | try: |
|
184 | 184 | owner = c.rhodecode_user |
|
185 | 185 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
186 | 186 | RepoGroupModel().create( |
|
187 | 187 | group_name=form_result['group_name_full'], |
|
188 | 188 | group_description=form_result['group_description'], |
|
189 | 189 | owner=owner.user_id, |
|
190 | 190 | copy_permissions=form_result['group_copy_permissions'] |
|
191 | 191 | ) |
|
192 | 192 | Session().commit() |
|
193 | 193 | _new_group_name = form_result['group_name_full'] |
|
194 | 194 | repo_group_url = h.link_to( |
|
195 | 195 | _new_group_name, |
|
196 | 196 | h.url('repo_group_home', group_name=_new_group_name)) |
|
197 | 197 | h.flash(h.literal(_('Created repository group %s') |
|
198 | 198 | % repo_group_url), category='success') |
|
199 | 199 | # TODO: in future action_logger(, '', '', '', self.sa) |
|
200 | 200 | except formencode.Invalid as errors: |
|
201 | 201 | return htmlfill.render( |
|
202 | 202 | render('admin/repo_groups/repo_group_add.html'), |
|
203 | 203 | defaults=errors.value, |
|
204 | 204 | errors=errors.error_dict or {}, |
|
205 | 205 | prefix_error=False, |
|
206 | 206 | encoding="UTF-8", |
|
207 | 207 | force_defaults=False) |
|
208 | 208 | except Exception: |
|
209 | 209 | log.exception("Exception during creation of repository group") |
|
210 | 210 | h.flash(_('Error occurred during creation of repository group %s') |
|
211 | 211 | % request.POST.get('group_name'), category='error') |
|
212 | 212 | |
|
213 | 213 | # TODO: maybe we should get back to the main view, not the admin one |
|
214 | 214 | return redirect(url('repo_groups', parent_group=parent_group_id)) |
|
215 | 215 | |
|
216 | 216 | # perm checks inside |
|
217 | 217 | @NotAnonymous() |
|
218 | 218 | def new(self): |
|
219 | 219 | """GET /repo_groups/new: Form to create a new item""" |
|
220 | 220 | # url('new_repo_group') |
|
221 | 221 | # perm check for admin, create_group perm or admin of parent_group |
|
222 | 222 | parent_group_id = safe_int(request.GET.get('parent_group')) |
|
223 | 223 | if not self._can_create_repo_group(parent_group_id): |
|
224 | 224 | return abort(403) |
|
225 | 225 | |
|
226 | 226 | self.__load_defaults() |
|
227 | 227 | return render('admin/repo_groups/repo_group_add.html') |
|
228 | 228 | |
|
229 | 229 | @HasRepoGroupPermissionAnyDecorator('group.admin') |
|
230 | 230 | @auth.CSRFRequired() |
|
231 | 231 | def update(self, group_name): |
|
232 | 232 | """PUT /repo_groups/group_name: Update an existing item""" |
|
233 | 233 | # Forms posted to this method should contain a hidden field: |
|
234 | 234 | # <input type="hidden" name="_method" value="PUT" /> |
|
235 | 235 | # Or using helpers: |
|
236 | 236 | # h.form(url('repos_group', group_name=GROUP_NAME), method='put') |
|
237 | 237 | # url('repo_group_home', group_name=GROUP_NAME) |
|
238 | 238 | |
|
239 | 239 | c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
240 | 240 | can_create_in_root = self._can_create_repo_group() |
|
241 | 241 | show_root_location = can_create_in_root |
|
242 | 242 | if not c.repo_group.parent_group: |
|
243 | 243 | # this group doesn't have a parent so we should show empty value |
|
244 | 244 | show_root_location = True |
|
245 | 245 | self.__load_defaults(allow_empty_group=show_root_location, |
|
246 | 246 | repo_group=c.repo_group) |
|
247 | 247 | |
|
248 | 248 | repo_group_form = RepoGroupForm( |
|
249 | edit=True, | |
|
250 | old_data=c.repo_group.get_dict(), | |
|
249 | edit=True, old_data=c.repo_group.get_dict(), | |
|
251 | 250 | available_groups=c.repo_groups_choices, |
|
252 | can_create_in_root=can_create_in_root, | |
|
253 | )() | |
|
251 | can_create_in_root=can_create_in_root, allow_disabled=True)() | |
|
252 | ||
|
254 | 253 | try: |
|
255 | 254 | form_result = repo_group_form.to_python(dict(request.POST)) |
|
256 | 255 | gr_name = form_result['group_name'] |
|
257 | 256 | new_gr = RepoGroupModel().update(group_name, form_result) |
|
258 | 257 | Session().commit() |
|
259 | 258 | h.flash(_('Updated repository group %s') % (gr_name,), |
|
260 | 259 | category='success') |
|
261 | 260 | # we now have new name ! |
|
262 | 261 | group_name = new_gr.group_name |
|
263 | 262 | # TODO: in future action_logger(, '', '', '', self.sa) |
|
264 | 263 | except formencode.Invalid as errors: |
|
265 | 264 | c.active = 'settings' |
|
266 | 265 | return htmlfill.render( |
|
267 | 266 | render('admin/repo_groups/repo_group_edit.html'), |
|
268 | 267 | defaults=errors.value, |
|
269 | 268 | errors=errors.error_dict or {}, |
|
270 | 269 | prefix_error=False, |
|
271 | 270 | encoding="UTF-8", |
|
272 | 271 | force_defaults=False) |
|
273 | 272 | except Exception: |
|
274 | 273 | log.exception("Exception during update or repository group") |
|
275 | 274 | h.flash(_('Error occurred during update of repository group %s') |
|
276 | 275 | % request.POST.get('group_name'), category='error') |
|
277 | 276 | |
|
278 | 277 | return redirect(url('edit_repo_group', group_name=group_name)) |
|
279 | 278 | |
|
280 | 279 | @HasRepoGroupPermissionAnyDecorator('group.admin') |
|
281 | 280 | @auth.CSRFRequired() |
|
282 | 281 | def delete(self, group_name): |
|
283 | 282 | """DELETE /repo_groups/group_name: Delete an existing item""" |
|
284 | 283 | # Forms posted to this method should contain a hidden field: |
|
285 | 284 | # <input type="hidden" name="_method" value="DELETE" /> |
|
286 | 285 | # Or using helpers: |
|
287 | 286 | # h.form(url('repos_group', group_name=GROUP_NAME), method='delete') |
|
288 | 287 | # url('repo_group_home', group_name=GROUP_NAME) |
|
289 | 288 | |
|
290 | 289 | gr = c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
291 | 290 | repos = gr.repositories.all() |
|
292 | 291 | if repos: |
|
293 | 292 | msg = ungettext( |
|
294 | 293 | 'This group contains %(num)d repository and cannot be deleted', |
|
295 | 294 | 'This group contains %(num)d repositories and cannot be' |
|
296 | 295 | ' deleted', |
|
297 | 296 | len(repos)) % {'num': len(repos)} |
|
298 | 297 | h.flash(msg, category='warning') |
|
299 | 298 | return redirect(url('repo_groups')) |
|
300 | 299 | |
|
301 | 300 | children = gr.children.all() |
|
302 | 301 | if children: |
|
303 | 302 | msg = ungettext( |
|
304 | 303 | 'This group contains %(num)d subgroup and cannot be deleted', |
|
305 | 304 | 'This group contains %(num)d subgroups and cannot be deleted', |
|
306 | 305 | len(children)) % {'num': len(children)} |
|
307 | 306 | h.flash(msg, category='warning') |
|
308 | 307 | return redirect(url('repo_groups')) |
|
309 | 308 | |
|
310 | 309 | try: |
|
311 | 310 | RepoGroupModel().delete(group_name) |
|
312 | 311 | Session().commit() |
|
313 | 312 | h.flash(_('Removed repository group %s') % group_name, |
|
314 | 313 | category='success') |
|
315 | 314 | # TODO: in future action_logger(, '', '', '', self.sa) |
|
316 | 315 | except Exception: |
|
317 | 316 | log.exception("Exception during deletion of repository group") |
|
318 | 317 | h.flash(_('Error occurred during deletion of repository group %s') |
|
319 | 318 | % group_name, category='error') |
|
320 | 319 | |
|
321 | 320 | return redirect(url('repo_groups')) |
|
322 | 321 | |
|
323 | 322 | @HasRepoGroupPermissionAnyDecorator('group.admin') |
|
324 | 323 | def edit(self, group_name): |
|
325 | 324 | """GET /repo_groups/group_name/edit: Form to edit an existing item""" |
|
326 | 325 | # url('edit_repo_group', group_name=GROUP_NAME) |
|
327 | 326 | c.active = 'settings' |
|
328 | 327 | |
|
329 | 328 | c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
330 | 329 | # we can only allow moving empty group if it's already a top-level |
|
331 | 330 | # group, ie has no parents, or we're admin |
|
332 | 331 | can_create_in_root = self._can_create_repo_group() |
|
333 | 332 | show_root_location = can_create_in_root |
|
334 | 333 | if not c.repo_group.parent_group: |
|
335 | 334 | # this group doesn't have a parent so we should show empty value |
|
336 | 335 | show_root_location = True |
|
337 | 336 | self.__load_defaults(allow_empty_group=show_root_location, |
|
338 | 337 | repo_group=c.repo_group) |
|
339 | 338 | defaults = self.__load_data(c.repo_group.group_id) |
|
340 | 339 | |
|
341 | 340 | return htmlfill.render( |
|
342 | 341 | render('admin/repo_groups/repo_group_edit.html'), |
|
343 | 342 | defaults=defaults, |
|
344 | 343 | encoding="UTF-8", |
|
345 | 344 | force_defaults=False |
|
346 | 345 | ) |
|
347 | 346 | |
|
348 | 347 | @HasRepoGroupPermissionAnyDecorator('group.admin') |
|
349 | 348 | def edit_repo_group_advanced(self, group_name): |
|
350 | 349 | """GET /repo_groups/group_name/edit: Form to edit an existing item""" |
|
351 | 350 | # url('edit_repo_group', group_name=GROUP_NAME) |
|
352 | 351 | c.active = 'advanced' |
|
353 | 352 | c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
354 | 353 | |
|
355 | 354 | return render('admin/repo_groups/repo_group_edit.html') |
|
356 | 355 | |
|
357 | 356 | @HasRepoGroupPermissionAnyDecorator('group.admin') |
|
358 | 357 | def edit_repo_group_perms(self, group_name): |
|
359 | 358 | """GET /repo_groups/group_name/edit: Form to edit an existing item""" |
|
360 | 359 | # url('edit_repo_group', group_name=GROUP_NAME) |
|
361 | 360 | c.active = 'perms' |
|
362 | 361 | c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
363 | 362 | self.__load_defaults() |
|
364 | 363 | defaults = self.__load_data(c.repo_group.group_id) |
|
365 | 364 | |
|
366 | 365 | return htmlfill.render( |
|
367 | 366 | render('admin/repo_groups/repo_group_edit.html'), |
|
368 | 367 | defaults=defaults, |
|
369 | 368 | encoding="UTF-8", |
|
370 | 369 | force_defaults=False |
|
371 | 370 | ) |
|
372 | 371 | |
|
373 | 372 | @HasRepoGroupPermissionAnyDecorator('group.admin') |
|
374 | 373 | @auth.CSRFRequired() |
|
375 | 374 | def update_perms(self, group_name): |
|
376 | 375 | """ |
|
377 | 376 | Update permissions for given repository group |
|
378 | 377 | |
|
379 | 378 | :param group_name: |
|
380 | 379 | """ |
|
381 | 380 | |
|
382 | 381 | c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
383 | 382 | valid_recursive_choices = ['none', 'repos', 'groups', 'all'] |
|
384 | 383 | form = RepoGroupPermsForm(valid_recursive_choices)().to_python( |
|
385 | 384 | request.POST) |
|
386 | 385 | |
|
387 | 386 | if not c.rhodecode_user.is_admin: |
|
388 | 387 | if self._revoke_perms_on_yourself(form): |
|
389 | 388 | msg = _('Cannot change permission for yourself as admin') |
|
390 | 389 | h.flash(msg, category='warning') |
|
391 | 390 | return redirect( |
|
392 | 391 | url('edit_repo_group_perms', group_name=group_name)) |
|
393 | 392 | |
|
394 | 393 | # iterate over all members(if in recursive mode) of this groups and |
|
395 | 394 | # set the permissions ! |
|
396 | 395 | # this can be potentially heavy operation |
|
397 | 396 | RepoGroupModel().update_permissions( |
|
398 | 397 | c.repo_group, |
|
399 | 398 | form['perm_additions'], form['perm_updates'], |
|
400 | 399 | form['perm_deletions'], form['recursive']) |
|
401 | 400 | |
|
402 | 401 | # TODO: implement this |
|
403 | 402 | # action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', |
|
404 | 403 | # repo_name, self.ip_addr, self.sa) |
|
405 | 404 | Session().commit() |
|
406 | 405 | h.flash(_('Repository Group permissions updated'), category='success') |
|
407 | 406 | return redirect(url('edit_repo_group_perms', group_name=group_name)) |
@@ -1,878 +1,878 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2013-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | |
|
22 | 22 | """ |
|
23 | 23 | Repositories controller for RhodeCode |
|
24 | 24 | """ |
|
25 | 25 | |
|
26 | 26 | import logging |
|
27 | 27 | import traceback |
|
28 | 28 | |
|
29 | 29 | import formencode |
|
30 | 30 | from formencode import htmlfill |
|
31 | 31 | from pylons import request, tmpl_context as c, url |
|
32 | 32 | from pylons.controllers.util import redirect |
|
33 | 33 | from pylons.i18n.translation import _ |
|
34 | 34 | from webob.exc import HTTPForbidden, HTTPNotFound, HTTPBadRequest |
|
35 | 35 | |
|
36 | 36 | from rhodecode.lib import auth, helpers as h |
|
37 | 37 | from rhodecode.lib.auth import ( |
|
38 | 38 | LoginRequired, HasPermissionAllDecorator, |
|
39 | 39 | HasRepoPermissionAllDecorator, NotAnonymous, HasPermissionAny, |
|
40 | 40 | HasRepoGroupPermissionAny, HasRepoPermissionAnyDecorator) |
|
41 | 41 | from rhodecode.lib.base import BaseRepoController, render |
|
42 | 42 | from rhodecode.lib.ext_json import json |
|
43 | 43 | from rhodecode.lib.exceptions import AttachedForksError |
|
44 | 44 | from rhodecode.lib.utils import action_logger, repo_name_slug, jsonify |
|
45 | 45 | from rhodecode.lib.utils2 import safe_int |
|
46 | 46 | from rhodecode.lib.vcs import RepositoryError |
|
47 | 47 | from rhodecode.model.db import ( |
|
48 | 48 | User, Repository, UserFollowing, RepoGroup, RepositoryField) |
|
49 | 49 | from rhodecode.model.forms import ( |
|
50 | 50 | RepoForm, RepoFieldForm, RepoPermsForm, RepoVcsSettingsForm, |
|
51 | 51 | IssueTrackerPatternsForm) |
|
52 | 52 | from rhodecode.model.meta import Session |
|
53 | 53 | from rhodecode.model.repo import RepoModel |
|
54 | 54 | from rhodecode.model.scm import ScmModel, RepoGroupList, RepoList |
|
55 | 55 | from rhodecode.model.settings import ( |
|
56 | 56 | SettingsModel, IssueTrackerSettingsModel, VcsSettingsModel, |
|
57 | 57 | SettingNotFound) |
|
58 | 58 | |
|
59 | 59 | log = logging.getLogger(__name__) |
|
60 | 60 | |
|
61 | 61 | |
|
62 | 62 | class ReposController(BaseRepoController): |
|
63 | 63 | """ |
|
64 | 64 | REST Controller styled on the Atom Publishing Protocol""" |
|
65 | 65 | # To properly map this controller, ensure your config/routing.py |
|
66 | 66 | # file has a resource setup: |
|
67 | 67 | # map.resource('repo', 'repos') |
|
68 | 68 | |
|
69 | 69 | @LoginRequired() |
|
70 | 70 | def __before__(self): |
|
71 | 71 | super(ReposController, self).__before__() |
|
72 | 72 | |
|
73 | 73 | def _load_repo(self, repo_name): |
|
74 | 74 | repo_obj = Repository.get_by_repo_name(repo_name) |
|
75 | 75 | |
|
76 | 76 | if repo_obj is None: |
|
77 | 77 | h.not_mapped_error(repo_name) |
|
78 | 78 | return redirect(url('repos')) |
|
79 | 79 | |
|
80 | 80 | return repo_obj |
|
81 | 81 | |
|
82 | 82 | def __load_defaults(self, repo=None): |
|
83 | 83 | acl_groups = RepoGroupList(RepoGroup.query().all(), |
|
84 | 84 | perm_set=['group.write', 'group.admin']) |
|
85 | 85 | c.repo_groups = RepoGroup.groups_choices(groups=acl_groups) |
|
86 | 86 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
87 | 87 | |
|
88 | 88 | # in case someone no longer have a group.write access to a repository |
|
89 | 89 | # pre fill the list with this entry, we don't care if this is the same |
|
90 | 90 | # but it will allow saving repo data properly. |
|
91 | 91 | |
|
92 | 92 | repo_group = None |
|
93 | 93 | if repo: |
|
94 | 94 | repo_group = repo.group |
|
95 | 95 | if repo_group and unicode(repo_group.group_id) not in c.repo_groups_choices: |
|
96 | 96 | c.repo_groups_choices.append(unicode(repo_group.group_id)) |
|
97 | 97 | c.repo_groups.append(RepoGroup._generate_choice(repo_group)) |
|
98 | 98 | |
|
99 | 99 | choices, c.landing_revs = ScmModel().get_repo_landing_revs() |
|
100 | 100 | c.landing_revs_choices = choices |
|
101 | 101 | |
|
102 | 102 | def __load_data(self, repo_name=None): |
|
103 | 103 | """ |
|
104 | 104 | Load defaults settings for edit, and update |
|
105 | 105 | |
|
106 | 106 | :param repo_name: |
|
107 | 107 | """ |
|
108 | 108 | c.repo_info = self._load_repo(repo_name) |
|
109 | 109 | self.__load_defaults(c.repo_info) |
|
110 | 110 | |
|
111 | 111 | # override defaults for exact repo info here git/hg etc |
|
112 | 112 | if not c.repository_requirements_missing: |
|
113 | 113 | choices, c.landing_revs = ScmModel().get_repo_landing_revs( |
|
114 | 114 | c.repo_info) |
|
115 | 115 | c.landing_revs_choices = choices |
|
116 | 116 | defaults = RepoModel()._get_defaults(repo_name) |
|
117 | 117 | |
|
118 | 118 | return defaults |
|
119 | 119 | |
|
120 | 120 | def _log_creation_exception(self, e, repo_name): |
|
121 | 121 | reason = None |
|
122 | 122 | if len(e.args) == 2: |
|
123 | 123 | reason = e.args[1] |
|
124 | 124 | |
|
125 | 125 | if reason == 'INVALID_CERTIFICATE': |
|
126 | 126 | log.exception( |
|
127 | 127 | 'Exception creating a repository: invalid certificate') |
|
128 | 128 | msg = (_('Error creating repository %s: invalid certificate') |
|
129 | 129 | % repo_name) |
|
130 | 130 | else: |
|
131 | 131 | log.exception("Exception creating a repository") |
|
132 | 132 | msg = (_('Error creating repository %s') |
|
133 | 133 | % repo_name) |
|
134 | 134 | |
|
135 | 135 | return msg |
|
136 | 136 | |
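`_log_creation_exception` above inspects `e.args`: when the exception carries exactly two arguments, the second is treated as a machine-readable reason code. The same extraction, isolated from logging and i18n:

```python
def creation_error_message(exc, repo_name):
    """Mirror the reason extraction above: a two-argument exception may
    carry a reason string as its second argument (simplified, no i18n)."""
    reason = exc.args[1] if len(exc.args) == 2 else None
    if reason == 'INVALID_CERTIFICATE':
        return 'Error creating repository %s: invalid certificate' % repo_name
    return 'Error creating repository %s' % repo_name
```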
|
137 | 137 | @NotAnonymous() |
|
138 | 138 | def index(self, format='html'): |
|
139 | 139 | """GET /repos: All items in the collection""" |
|
140 | 140 | # url('repos') |
|
141 | 141 | |
|
142 | 142 | repo_list = Repository.get_all_repos() |
|
143 | 143 | c.repo_list = RepoList(repo_list, perm_set=['repository.admin']) |
|
144 | 144 | repos_data = RepoModel().get_repos_as_dict( |
|
145 | 145 | repo_list=c.repo_list, admin=True, super_user_actions=True) |
|
146 | 146 | # json used to render the grid |
|
147 | 147 | c.data = json.dumps(repos_data) |
|
148 | 148 | |
|
149 | 149 | return render('admin/repos/repos.html') |
|
150 | 150 | |
|
151 | 151 | # perms check inside |
|
152 | 152 | @NotAnonymous() |
|
153 | 153 | @auth.CSRFRequired() |
|
154 | 154 | def create(self): |
|
155 | 155 | """ |
|
156 | 156 | POST /repos: Create a new item""" |
|
157 | 157 | # url('repos') |
|
158 | 158 | |
|
159 | 159 | self.__load_defaults() |
|
160 | 160 | form_result = {} |
|
161 | 161 | task_id = None |
|
162 | 162 | try: |
|
163 | 163 | # CanWriteToGroup validators checks permissions of this POST |
|
164 | 164 | form_result = RepoForm(repo_groups=c.repo_groups_choices, |
|
165 | 165 | landing_revs=c.landing_revs_choices)()\ |
|
166 | 166 | .to_python(dict(request.POST)) |
|
167 | 167 | |
|
168 | 168 | # create is done sometimes async on celery, db transaction |
|
169 | 169 | # management is handled there. |
|
170 | 170 | task = RepoModel().create(form_result, c.rhodecode_user.user_id) |
|
171 | 171 | from celery.result import BaseAsyncResult |
|
172 | 172 | if isinstance(task, BaseAsyncResult): |
|
173 | 173 | task_id = task.task_id |
|
174 | 174 | except formencode.Invalid as errors: |
|
175 | 175 | c.personal_repo_group = RepoGroup.get_by_group_name( |
|
176 | 176 | c.rhodecode_user.username) |
|
177 | 177 | return htmlfill.render( |
|
178 | 178 | render('admin/repos/repo_add.html'), |
|
179 | 179 | defaults=errors.value, |
|
180 | 180 | errors=errors.error_dict or {}, |
|
181 | 181 | prefix_error=False, |
|
182 | 182 | encoding="UTF-8", |
|
183 | 183 | force_defaults=False) |
|
184 | 184 | |
|
185 | 185 | except Exception as e: |
|
186 | 186 | msg = self._log_creation_exception(e, form_result.get('repo_name')) |
|
187 | 187 | h.flash(msg, category='error') |
|
188 | 188 | return redirect(url('home')) |
|
189 | 189 | |
|
190 | 190 | return redirect(h.url('repo_creating_home', |
|
191 | 191 | repo_name=form_result['repo_name_full'], |
|
192 | 192 | task_id=task_id)) |
|
193 | 193 | |
|
194 | 194 | # perms check inside |
|
195 | 195 | @NotAnonymous() |
|
196 | 196 | def create_repository(self): |
|
197 | 197 | """GET /_admin/create_repository: Form to create a new item""" |
|
198 | 198 | new_repo = request.GET.get('repo', '') |
|
199 | 199 | parent_group = request.GET.get('parent_group') |
|
200 | 200 | if not HasPermissionAny('hg.admin', 'hg.create.repository')(): |
|
201 | 201 | # you're not super admin nor have global create permissions, |
|
202 | 202 | # but maybe you have at least write permission to a parent group ? |
|
203 | 203 | _gr = RepoGroup.get(parent_group) |
|
204 | 204 | gr_name = _gr.group_name if _gr else None |
|
205 | 205 | # create repositories with write permission on group is set to true |
|
206 | 206 | create_on_write = HasPermissionAny('hg.create.write_on_repogroup.true')() |
|
207 | 207 | group_admin = HasRepoGroupPermissionAny('group.admin')(group_name=gr_name) |
|
208 | 208 | group_write = HasRepoGroupPermissionAny('group.write')(group_name=gr_name) |
|
209 | 209 | if not (group_admin or (group_write and create_on_write)): |
|
210 | 210 | raise HTTPForbidden |
|
211 | 211 | |
|
212 | 212 | acl_groups = RepoGroupList(RepoGroup.query().all(), |
|
213 | 213 | perm_set=['group.write', 'group.admin']) |
|
214 | 214 | c.repo_groups = RepoGroup.groups_choices(groups=acl_groups) |
|
215 | 215 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
216 | 216 | choices, c.landing_revs = ScmModel().get_repo_landing_revs() |
|
217 | 217 | c.personal_repo_group = RepoGroup.get_by_group_name(c.rhodecode_user.username) |
|
218 | 218 | c.new_repo = repo_name_slug(new_repo) |
|
219 | 219 | |
|
220 | 220 | ## apply the defaults from defaults page |
|
221 | 221 | defaults = SettingsModel().get_default_repo_settings(strip_prefix=True) |
|
222 | 222 | # set checkbox to autochecked |
|
223 | 223 | defaults['repo_copy_permissions'] = True |
|
224 | 224 | if parent_group: |
|
225 | 225 | defaults.update({'repo_group': parent_group}) |
|
226 | 226 | |
|
227 | 227 | return htmlfill.render( |
|
228 | 228 | render('admin/repos/repo_add.html'), |
|
229 | 229 | defaults=defaults, |
|
230 | 230 | errors={}, |
|
231 | 231 | prefix_error=False, |
|
232 | 232 | encoding="UTF-8", |
|
233 | 233 | force_defaults=False |
|
234 | 234 | ) |
|
235 | 235 | |
|
236 | 236 | @NotAnonymous() |
|
237 | 237 | def repo_creating(self, repo_name): |
|
238 | 238 | c.repo = repo_name |
|
239 | 239 | c.task_id = request.GET.get('task_id') |
|
240 | 240 | if not c.repo: |
|
241 | 241 | raise HTTPNotFound() |
|
242 | 242 | return render('admin/repos/repo_creating.html') |
|
243 | 243 | |
|
244 | 244 | @NotAnonymous() |
|
245 | 245 | @jsonify |
|
246 | 246 | def repo_check(self, repo_name): |
|
247 | 247 | c.repo = repo_name |
|
248 | 248 | task_id = request.GET.get('task_id') |
|
249 | 249 | |
|
250 | 250 | if task_id and task_id not in ['None']: |
|
251 |

251 | import rhodecode |
|
252 | 252 | from celery.result import AsyncResult |
|
253 | if CELERY_ENABLED: | |
|
253 | if rhodecode.CELERY_ENABLED: | |
|
254 | 254 | task = AsyncResult(task_id) |
|
255 | 255 | if task.failed(): |
|
256 | 256 | msg = self._log_creation_exception(task.result, c.repo) |
|
257 | 257 | h.flash(msg, category='error') |
|
258 | 258 | return redirect(url('home'), code=501) |
|
259 | 259 | |
|
260 | 260 | repo = Repository.get_by_repo_name(repo_name) |
|
261 | 261 | if repo and repo.repo_state == Repository.STATE_CREATED: |
|
262 | 262 | if repo.clone_uri: |
|
263 | 263 | clone_uri = repo.clone_uri_hidden |
|
264 | 264 | h.flash(_('Created repository %s from %s') |
|
265 | 265 | % (repo.repo_name, clone_uri), category='success') |
|
266 | 266 | else: |
|
267 | 267 | repo_url = h.link_to(repo.repo_name, |
|
268 | 268 | h.url('summary_home', |
|
269 | 269 | repo_name=repo.repo_name)) |
|
270 | 270 | fork = repo.fork |
|
271 | 271 | if fork: |
|
272 | 272 | fork_name = fork.repo_name |
|
273 | 273 | h.flash(h.literal(_('Forked repository %s as %s') |
|
274 | 274 | % (fork_name, repo_url)), category='success') |
|
275 | 275 | else: |
|
276 | 276 | h.flash(h.literal(_('Created repository %s') % repo_url), |
|
277 | 277 | category='success') |
|
278 | 278 | return {'result': True} |
|
279 | 279 | return {'result': False} |
|
280 | 280 | |
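The `repo_check` endpoint above is polled by the creating page: it reports an error if the celery task failed, returns `{'result': True}` once the repository reached `STATE_CREATED`, and otherwise tells the browser to keep polling. Its decision logic, stripped of the celery and flash-message plumbing:

```python
def repo_check_outcome(task_failed, repo_created):
    """Outcome of one poll of the creation-status endpoint (sketch):
    failure reported, success once created, otherwise keep polling."""
    if task_failed:
        # the real controller flashes the error and redirects with 501
        return {'result': False, 'failed': True}
    if repo_created:
        return {'result': True}
    return {'result': False}
```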
|
281 | 281 | @HasRepoPermissionAllDecorator('repository.admin') |
|
282 | 282 | @auth.CSRFRequired() |
|
283 | 283 | def update(self, repo_name): |
|
284 | 284 | """ |
|
285 | 285 | PUT /repos/repo_name: Update an existing item""" |
|
286 | 286 | # Forms posted to this method should contain a hidden field: |
|
287 | 287 | # <input type="hidden" name="_method" value="PUT" /> |
|
288 | 288 | # Or using helpers: |
|
289 | 289 | # h.form(url('repo', repo_name=ID), |
|
290 | 290 | # method='put') |
|
291 | 291 | # url('repo', repo_name=ID) |
|
292 | 292 | |
|
293 | 293 | self.__load_data(repo_name) |
|
294 | 294 | c.active = 'settings' |
|
295 | 295 | c.repo_fields = RepositoryField.query()\ |
|
296 | 296 | .filter(RepositoryField.repository == c.repo_info).all() |
|
297 | 297 | |
|
298 | 298 | repo_model = RepoModel() |
|
299 | 299 | changed_name = repo_name |
|
300 | 300 | |
|
301 | 301 | # override the choices with extracted revisions ! |
|
302 | 302 | c.personal_repo_group = RepoGroup.get_by_group_name( |
|
303 | 303 | c.rhodecode_user.username) |
|
304 | 304 | repo = Repository.get_by_repo_name(repo_name) |
|
305 | 305 | old_data = { |
|
306 | 306 | 'repo_name': repo_name, |
|
307 | 307 | 'repo_group': repo.group.get_dict() if repo.group else {}, |
|
308 | 308 | 'repo_type': repo.repo_type, |
|
309 | 309 | } |
|
310 | _form = RepoForm( |

311 |

312 |
|
310 | _form = RepoForm( | |
|
311 | edit=True, old_data=old_data, repo_groups=c.repo_groups_choices, | |
|
312 | landing_revs=c.landing_revs_choices, allow_disabled=True)() | |
|
313 | 313 | |
|
314 | 314 | try: |
|
315 | 315 | form_result = _form.to_python(dict(request.POST)) |
|
316 | 316 | repo = repo_model.update(repo_name, **form_result) |
|
317 | 317 | ScmModel().mark_for_invalidation(repo_name) |
|
318 | 318 | h.flash(_('Repository %s updated successfully') % repo_name, |
|
319 | 319 | category='success') |
|
320 | 320 | changed_name = repo.repo_name |
|
321 | 321 | action_logger(c.rhodecode_user, 'admin_updated_repo', |
|
322 | 322 | changed_name, self.ip_addr, self.sa) |
|
323 | 323 | Session().commit() |
|
324 | 324 | except formencode.Invalid as errors: |
|
325 | 325 | defaults = self.__load_data(repo_name) |
|
326 | 326 | defaults.update(errors.value) |
|
327 | 327 | return htmlfill.render( |
|
328 | 328 | render('admin/repos/repo_edit.html'), |
|
329 | 329 | defaults=defaults, |
|
330 | 330 | errors=errors.error_dict or {}, |
|
331 | 331 | prefix_error=False, |
|
332 | 332 | encoding="UTF-8", |
|
333 | 333 | force_defaults=False) |
|
334 | 334 | |
|
335 | 335 | except Exception: |
|
336 | 336 | log.exception("Exception during update of repository") |
|
337 | 337 | h.flash(_('Error occurred during update of repository %s') \ |
|
338 | 338 | % repo_name, category='error') |
|
339 | 339 | return redirect(url('edit_repo', repo_name=changed_name)) |
|
340 | 340 | |
|
341 | 341 | @HasRepoPermissionAllDecorator('repository.admin') |
|
342 | 342 | @auth.CSRFRequired() |
|
343 | 343 | def delete(self, repo_name): |
|
344 | 344 | """ |
|
345 | 345 | DELETE /repos/repo_name: Delete an existing item""" |
|
346 | 346 | # Forms posted to this method should contain a hidden field: |
|
347 | 347 | # <input type="hidden" name="_method" value="DELETE" /> |
|
348 | 348 | # Or using helpers: |
|
349 | 349 | # h.form(url('repo', repo_name=ID), |
|
350 | 350 | # method='delete') |
|
351 | 351 | # url('repo', repo_name=ID) |
|
352 | 352 | |
|
353 | 353 | repo_model = RepoModel() |
|
354 | 354 | repo = repo_model.get_by_repo_name(repo_name) |
|
355 | 355 | if not repo: |
|
356 | 356 | h.not_mapped_error(repo_name) |
|
357 | 357 | return redirect(url('repos')) |
|
358 | 358 | try: |
|
359 | 359 | _forks = repo.forks.count() |
|
360 | 360 | handle_forks = None |
|
361 | 361 | if _forks and request.POST.get('forks'): |
|
362 | 362 | do = request.POST['forks'] |
|
363 | 363 | if do == 'detach_forks': |
|
364 | 364 | handle_forks = 'detach' |
|
365 | 365 | h.flash(_('Detached %s forks') % _forks, category='success') |
|
366 | 366 | elif do == 'delete_forks': |
|
367 | 367 | handle_forks = 'delete' |
|
368 | 368 | h.flash(_('Deleted %s forks') % _forks, category='success') |
|
369 | 369 | repo_model.delete(repo, forks=handle_forks) |
|
370 | 370 | action_logger(c.rhodecode_user, 'admin_deleted_repo', |
|
371 | 371 | repo_name, self.ip_addr, self.sa) |
|
372 | 372 | ScmModel().mark_for_invalidation(repo_name) |
|
373 | 373 | h.flash(_('Deleted repository %s') % repo_name, category='success') |
|
374 | 374 | Session().commit() |
|
375 | 375 | except AttachedForksError: |
|
376 | 376 | h.flash(_('Cannot delete %s it still contains attached forks') |
|
377 | 377 | % repo_name, category='warning') |
|
378 | 378 | |
|
379 | 379 | except Exception: |
|
380 | 380 | log.exception("Exception during deletion of repository") |
|
381 | 381 | h.flash(_('An error occurred during deletion of %s') % repo_name, |
|
382 | 382 | category='error') |
|
383 | 383 | |
|
384 | 384 | return redirect(url('repos')) |
|
385 | 385 | |
|
386 | 386 | @HasPermissionAllDecorator('hg.admin') |
|
387 | 387 | def show(self, repo_name, format='html'): |
|
388 | 388 | """GET /repos/repo_name: Show a specific item""" |
|
389 | 389 | # url('repo', repo_name=ID) |
|
390 | 390 | |
|
391 | 391 | @HasRepoPermissionAllDecorator('repository.admin') |
|
392 | 392 | def edit(self, repo_name): |
|
393 | 393 | """GET /repo_name/settings: Form to edit an existing item""" |
|
394 | 394 | # url('edit_repo', repo_name=ID) |
|
395 | 395 | defaults = self.__load_data(repo_name) |
|
396 | 396 | if 'clone_uri' in defaults: |
|
397 | 397 | del defaults['clone_uri'] |
|
398 | 398 | |
|
399 | 399 | c.repo_fields = RepositoryField.query()\ |
|
400 | 400 | .filter(RepositoryField.repository == c.repo_info).all() |
|
401 | 401 | c.personal_repo_group = RepoGroup.get_by_group_name( |
|
402 | 402 | c.rhodecode_user.username) |
|
403 | 403 | c.active = 'settings' |
|
404 | 404 | return htmlfill.render( |
|
405 | 405 | render('admin/repos/repo_edit.html'), |
|
406 | 406 | defaults=defaults, |
|
407 | 407 | encoding="UTF-8", |
|
408 | 408 | force_defaults=False) |
|
409 | 409 | |
|
410 | 410 | @HasRepoPermissionAllDecorator('repository.admin') |
|
411 | 411 | def edit_permissions(self, repo_name): |
|
412 | 412 | """GET /repo_name/settings: Form to edit an existing item""" |
|
413 | 413 | # url('edit_repo', repo_name=ID) |
|
414 | 414 | c.repo_info = self._load_repo(repo_name) |
|
415 | 415 | c.active = 'permissions' |
|
416 | 416 | defaults = RepoModel()._get_defaults(repo_name) |
|
417 | 417 | |
|
418 | 418 | return htmlfill.render( |
|
419 | 419 | render('admin/repos/repo_edit.html'), |
|
420 | 420 | defaults=defaults, |
|
421 | 421 | encoding="UTF-8", |
|
422 | 422 | force_defaults=False) |
|
423 | 423 | |
|
424 | 424 | @HasRepoPermissionAllDecorator('repository.admin') |
|
425 | 425 | @auth.CSRFRequired() |
|
426 | 426 | def edit_permissions_update(self, repo_name): |
|
427 | 427 | form = RepoPermsForm()().to_python(request.POST) |
|
428 | 428 | RepoModel().update_permissions(repo_name, |
|
429 | 429 | form['perm_additions'], form['perm_updates'], form['perm_deletions']) |
|
430 | 430 | |
|
431 | 431 | #TODO: implement this |
|
432 | 432 | #action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', |
|
433 | 433 | # repo_name, self.ip_addr, self.sa) |
|
434 | 434 | Session().commit() |
|
435 | 435 | h.flash(_('Repository permissions updated'), category='success') |
|
436 | 436 | return redirect(url('edit_repo_perms', repo_name=repo_name)) |
|
437 | 437 | |
|
438 | 438 | @HasRepoPermissionAllDecorator('repository.admin') |
|
439 | 439 | def edit_fields(self, repo_name): |
|
440 | 440 | """GET /repo_name/settings: Form to edit an existing item""" |
|
441 | 441 | # url('edit_repo', repo_name=ID) |
|
442 | 442 | c.repo_info = self._load_repo(repo_name) |
|
443 | 443 | c.repo_fields = RepositoryField.query()\ |
|
444 | 444 | .filter(RepositoryField.repository == c.repo_info).all() |
|
445 | 445 | c.active = 'fields' |
|
446 | 446 | if request.POST: |
|
447 | 447 | |
|
448 | 448 | return redirect(url('repo_edit_fields')) |
|
449 | 449 | return render('admin/repos/repo_edit.html') |
|
450 | 450 | |
|
451 | 451 | @HasRepoPermissionAllDecorator('repository.admin') |
|
452 | 452 | @auth.CSRFRequired() |
|
453 | 453 | def create_repo_field(self, repo_name): |
|
454 | 454 | try: |
|
455 | 455 | form_result = RepoFieldForm()().to_python(dict(request.POST)) |
|
456 | 456 | RepoModel().add_repo_field( |
|
457 | 457 | repo_name, form_result['new_field_key'], |
|
458 | 458 | field_type=form_result['new_field_type'], |
|
459 | 459 | field_value=form_result['new_field_value'], |
|
460 | 460 | field_label=form_result['new_field_label'], |
|
461 | 461 | field_desc=form_result['new_field_desc']) |
|
462 | 462 | |
|
463 | 463 | Session().commit() |
|
464 | 464 | except Exception as e: |
|
465 | 465 | log.exception("Exception creating field") |
|
466 | 466 | msg = _('An error occurred during creation of field') |
|
467 | 467 | if isinstance(e, formencode.Invalid): |
|
468 | 468 | msg += ". " + e.msg |
|
469 | 469 | h.flash(msg, category='error') |
|
470 | 470 | return redirect(url('edit_repo_fields', repo_name=repo_name)) |
|
471 | 471 | |
|
472 | 472 | @HasRepoPermissionAllDecorator('repository.admin') |
|
473 | 473 | @auth.CSRFRequired() |
|
474 | 474 | def delete_repo_field(self, repo_name, field_id): |
|
475 | 475 | field = RepositoryField.get_or_404(field_id) |
|
476 | 476 | try: |
|
477 | 477 | RepoModel().delete_repo_field(repo_name, field.field_key) |
|
478 | 478 | Session().commit() |
|
479 | 479 | except Exception as e: |
|
480 | 480 | log.exception("Exception during removal of field") |
|
481 | 481 | msg = _('An error occurred during removal of field') |
|
482 | 482 | h.flash(msg, category='error') |
|
483 | 483 | return redirect(url('edit_repo_fields', repo_name=repo_name)) |
|
484 | 484 | |
|
485 | 485 | @HasRepoPermissionAllDecorator('repository.admin') |
|
486 | 486 | def edit_advanced(self, repo_name): |
|
487 | 487 | """GET /repo_name/settings: Form to edit an existing item""" |
|
488 | 488 | # url('edit_repo', repo_name=ID) |
|
489 | 489 | c.repo_info = self._load_repo(repo_name) |
|
490 | 490 | c.default_user_id = User.get_default_user().user_id |
|
491 | 491 | c.in_public_journal = UserFollowing.query()\ |
|
492 | 492 | .filter(UserFollowing.user_id == c.default_user_id)\ |
|
493 | 493 | .filter(UserFollowing.follows_repository == c.repo_info).scalar() |
|
494 | 494 | |
|
495 | 495 | c.active = 'advanced' |
|
496 | 496 | c.has_origin_repo_read_perm = False |
|
497 | 497 | if c.repo_info.fork: |
|
498 | 498 | c.has_origin_repo_read_perm = h.HasRepoPermissionAny( |
|
499 | 499 | 'repository.write', 'repository.read', 'repository.admin')( |
|
500 | 500 | c.repo_info.fork.repo_name, 'repo set as fork page') |
|
501 | 501 | |
|
502 | 502 | if request.POST: |
|
503 | 503 | return redirect(url('repo_edit_advanced')) |
|
504 | 504 | return render('admin/repos/repo_edit.html') |
|
505 | 505 | |
|
506 | 506 | @HasRepoPermissionAllDecorator('repository.admin') |
|
507 | 507 | @auth.CSRFRequired() |
|
508 | 508 | def edit_advanced_journal(self, repo_name): |
|
509 | 509 | """ |
|
510 | 510 | Set's this repository to be visible in public journal, |
|
511 | 511 | in other words assing default user to follow this repo |
|
512 | 512 | |
|
513 | 513 | :param repo_name: |
|
514 | 514 | """ |
|
515 | 515 | |
|
516 | 516 | try: |
|
517 | 517 | repo_id = Repository.get_by_repo_name(repo_name).repo_id |
|
518 | 518 | user_id = User.get_default_user().user_id |
|
519 | 519 | self.scm_model.toggle_following_repo(repo_id, user_id) |
|
520 | 520 | h.flash(_('Updated repository visibility in public journal'), |
|
521 | 521 | category='success') |
|
522 | 522 | Session().commit() |
|
523 | 523 | except Exception: |
|
524 | 524 | h.flash(_('An error occurred during setting this' |
|
525 | 525 | ' repository in public journal'), |
|
526 | 526 | category='error') |
|
527 | 527 | |
|
528 | 528 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) |
|
529 | 529 | |
|
530 | 530 | @HasRepoPermissionAllDecorator('repository.admin') |
|
531 | 531 | @auth.CSRFRequired() |
|
532 | 532 | def edit_advanced_fork(self, repo_name): |
|
533 | 533 | """ |
|
534 | 534 | Mark given repository as a fork of another |
|
535 | 535 | |
|
536 | 536 | :param repo_name: |
|
537 | 537 | """ |
|
538 | 538 | |
|
539 | 539 | new_fork_id = request.POST.get('id_fork_of') |
|
540 | 540 | try: |
|
541 | 541 | |
|
542 | 542 | if new_fork_id and not new_fork_id.isdigit(): |
|
543 | 543 | log.error('Given fork id %s is not an INT', new_fork_id) |
|
544 | 544 | |
|
545 | 545 | fork_id = safe_int(new_fork_id) |
|
546 | 546 | repo = ScmModel().mark_as_fork(repo_name, fork_id, |
|
547 | 547 | c.rhodecode_user.username) |
|
548 | 548 | fork = repo.fork.repo_name if repo.fork else _('Nothing') |
|
549 | 549 | Session().commit() |
|
550 | 550 | h.flash(_('Marked repo %s as fork of %s') % (repo_name, fork), |
|
551 | 551 | category='success') |
|
552 | 552 | except RepositoryError as e: |
|
553 | 553 | log.exception("Repository Error occurred") |
|
554 | 554 | h.flash(str(e), category='error') |
|
555 | 555 | except Exception as e: |
|
556 | 556 | log.exception("Exception while editing fork") |
|
557 | 557 | h.flash(_('An error occurred during this operation'), |
|
558 | 558 | category='error') |
|
559 | 559 | |
|
560 | 560 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) |
|
561 | 561 | |
|
562 | 562 | @HasRepoPermissionAllDecorator('repository.admin') |
|
563 | 563 | @auth.CSRFRequired() |
|
564 | 564 | def edit_advanced_locking(self, repo_name): |
|
565 | 565 | """ |
|
566 | 566 | Unlock repository when it is locked ! |
|
567 | 567 | |
|
568 | 568 | :param repo_name: |
|
569 | 569 | """ |
|
570 | 570 | try: |
|
571 | 571 | repo = Repository.get_by_repo_name(repo_name) |
|
572 | 572 | if request.POST.get('set_lock'): |
|
573 | 573 | Repository.lock(repo, c.rhodecode_user.user_id, |
|
574 | 574 | lock_reason=Repository.LOCK_WEB) |
|
575 | 575 | h.flash(_('Locked repository'), category='success') |
|
576 | 576 | elif request.POST.get('set_unlock'): |
|
577 | 577 | Repository.unlock(repo) |
|
578 | 578 | h.flash(_('Unlocked repository'), category='success') |
|
579 | 579 | except Exception as e: |
|
580 | 580 | log.exception("Exception during unlocking") |
|
581 | 581 | h.flash(_('An error occurred during unlocking'), |
|
582 | 582 | category='error') |
|
583 | 583 | return redirect(url('edit_repo_advanced', repo_name=repo_name)) |
|
584 | 584 | |
|
585 | 585 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') |
|
586 | 586 | @auth.CSRFRequired() |
|
587 | 587 | def toggle_locking(self, repo_name): |
|
588 | 588 | """ |
|
589 | 589 | Toggle locking of repository by simple GET call to url |
|
590 | 590 | |
|
591 | 591 | :param repo_name: |
|
592 | 592 | """ |
|
593 | 593 | |
|
594 | 594 | try: |
|
595 | 595 | repo = Repository.get_by_repo_name(repo_name) |
|
596 | 596 | |
|
597 | 597 | if repo.enable_locking: |
|
598 | 598 | if repo.locked[0]: |
|
599 | 599 | Repository.unlock(repo) |
|
600 | 600 | action = _('Unlocked') |
|
601 | 601 | else: |
|
602 | 602 | Repository.lock(repo, c.rhodecode_user.user_id, |
|
603 | 603 | lock_reason=Repository.LOCK_WEB) |
|
604 | 604 | action = _('Locked') |
|
605 | 605 | |
|
606 | 606 | h.flash(_('Repository has been %s') % action, |
|
607 | 607 | category='success') |
|
608 | 608 | except Exception: |
|
609 | 609 | log.exception("Exception during unlocking") |
|
610 | 610 | h.flash(_('An error occurred during unlocking'), |
|
611 | 611 | category='error') |
|
612 | 612 | return redirect(url('summary_home', repo_name=repo_name)) |
|
613 | 613 | |
|
614 | 614 | @HasRepoPermissionAllDecorator('repository.admin') |
|
615 | 615 | @auth.CSRFRequired() |
|
616 | 616 | def edit_caches(self, repo_name): |
|
617 | 617 | """PUT /{repo_name}/settings/caches: invalidate the repo caches.""" |
|
618 | 618 | try: |
|
619 | 619 | ScmModel().mark_for_invalidation(repo_name, delete=True) |
|
620 | 620 | Session().commit() |
|
621 | 621 | h.flash(_('Cache invalidation successful'), |
|
622 | 622 | category='success') |
|
623 | 623 | except Exception: |
|
624 | 624 | log.exception("Exception during cache invalidation") |
|
625 | 625 | h.flash(_('An error occurred during cache invalidation'), |
|
626 | 626 | category='error') |
|
627 | 627 | |
|
628 | 628 | return redirect(url('edit_repo_caches', repo_name=c.repo_name)) |
|
629 | 629 | |
|
630 | 630 | @HasRepoPermissionAllDecorator('repository.admin') |
|
631 | 631 | def edit_caches_form(self, repo_name): |
|
632 | 632 | """GET /repo_name/settings: Form to edit an existing item""" |
|
633 | 633 | # url('edit_repo', repo_name=ID) |
|
634 | 634 | c.repo_info = self._load_repo(repo_name) |
|
635 | 635 | c.active = 'caches' |
|
636 | 636 | |
|
637 | 637 | return render('admin/repos/repo_edit.html') |
|
638 | 638 | |
|
639 | 639 | @HasRepoPermissionAllDecorator('repository.admin') |
|
640 | 640 | @auth.CSRFRequired() |
|
641 | 641 | def edit_remote(self, repo_name): |
|
642 | 642 | """PUT /{repo_name}/settings/remote: edit the repo remote.""" |
|
643 | 643 | try: |
|
644 | 644 | ScmModel().pull_changes(repo_name, c.rhodecode_user.username) |
|
645 | 645 | h.flash(_('Pulled from remote location'), category='success') |
|
646 | 646 | except Exception: |
|
647 | 647 | log.exception("Exception during pull from remote") |
|
648 | 648 | h.flash(_('An error occurred during pull from remote location'), |
|
649 | 649 | category='error') |
|
650 | 650 | return redirect(url('edit_repo_remote', repo_name=c.repo_name)) |
|
651 | 651 | |
|
652 | 652 | @HasRepoPermissionAllDecorator('repository.admin') |
|
653 | 653 | def edit_remote_form(self, repo_name): |
|
654 | 654 | """GET /repo_name/settings: Form to edit an existing item""" |
|
655 | 655 | # url('edit_repo', repo_name=ID) |
|
656 | 656 | c.repo_info = self._load_repo(repo_name) |
|
657 | 657 | c.active = 'remote' |
|
658 | 658 | |
|
659 | 659 | return render('admin/repos/repo_edit.html') |
|
660 | 660 | |
|
661 | 661 | @HasRepoPermissionAllDecorator('repository.admin') |
|
662 | 662 | @auth.CSRFRequired() |
|
663 | 663 | def edit_statistics(self, repo_name): |
|
664 | 664 | """PUT /{repo_name}/settings/statistics: reset the repo statistics.""" |
|
665 | 665 | try: |
|
666 | 666 | RepoModel().delete_stats(repo_name) |
|
667 | 667 | Session().commit() |
|
668 | 668 | except Exception as e: |
|
669 | 669 | log.error(traceback.format_exc()) |
|
670 | 670 | h.flash(_('An error occurred during deletion of repository stats'), |
|
671 | 671 | category='error') |
|
672 | 672 | return redirect(url('edit_repo_statistics', repo_name=c.repo_name)) |
|
673 | 673 | |
|
674 | 674 | @HasRepoPermissionAllDecorator('repository.admin') |
|
675 | 675 | def edit_statistics_form(self, repo_name): |
|
676 | 676 | """GET /repo_name/settings: Form to edit an existing item""" |
|
677 | 677 | # url('edit_repo', repo_name=ID) |
|
678 | 678 | c.repo_info = self._load_repo(repo_name) |
|
679 | 679 | repo = c.repo_info.scm_instance() |
|
680 | 680 | |
|
681 | 681 | if c.repo_info.stats: |
|
682 | 682 | # this is on what revision we ended up so we add +1 for count |
|
683 | 683 | last_rev = c.repo_info.stats.stat_on_revision + 1 |
|
684 | 684 | else: |
|
685 | 685 | last_rev = 0 |
|
686 | 686 | c.stats_revision = last_rev |
|
687 | 687 | |
|
688 | 688 | c.repo_last_rev = repo.count() |
|
689 | 689 | |
|
690 | 690 | if last_rev == 0 or c.repo_last_rev == 0: |
|
691 | 691 | c.stats_percentage = 0 |
|
692 | 692 | else: |
|
693 | 693 | c.stats_percentage = '%.2f' % ((float((last_rev)) / c.repo_last_rev) * 100) |
|
694 | 694 | |
|
695 | 695 | c.active = 'statistics' |
|
696 | 696 | |
|
697 | 697 | return render('admin/repos/repo_edit.html') |
|
698 | 698 | |
|
699 | 699 | @HasRepoPermissionAllDecorator('repository.admin') |
|
700 | 700 | @auth.CSRFRequired() |
|
701 | 701 | def repo_issuetracker_test(self, repo_name): |
|
702 | 702 | if request.is_xhr: |
|
703 | 703 | return h.urlify_commit_message( |
|
704 | 704 | request.POST.get('test_text', ''), |
|
705 | 705 | repo_name) |
|
706 | 706 | else: |
|
707 | 707 | raise HTTPBadRequest() |
|
708 | 708 | |
|
709 | 709 | @HasRepoPermissionAllDecorator('repository.admin') |
|
710 | 710 | @auth.CSRFRequired() |
|
711 | 711 | def repo_issuetracker_delete(self, repo_name): |
|
712 | 712 | uid = request.POST.get('uid') |
|
713 | 713 | repo_settings = IssueTrackerSettingsModel(repo=repo_name) |
|
714 | 714 | try: |
|
715 | 715 | repo_settings.delete_entries(uid) |
|
716 | 716 | except Exception: |
|
717 | 717 | h.flash(_('Error occurred during deleting issue tracker entry'), |
|
718 | 718 | category='error') |
|
719 | 719 | else: |
|
720 | 720 | h.flash(_('Removed issue tracker entry'), category='success') |
|
721 | 721 | return redirect(url('repo_settings_issuetracker', |
|
722 | 722 | repo_name=repo_name)) |
|
723 | 723 | |
|
724 | 724 | def _update_patterns(self, form, repo_settings): |
|
725 | 725 | for uid in form['delete_patterns']: |
|
726 | 726 | repo_settings.delete_entries(uid) |
|
727 | 727 | |
|
728 | 728 | for pattern in form['patterns']: |
|
729 | 729 | for setting, value, type_ in pattern: |
|
730 | 730 | sett = repo_settings.create_or_update_setting( |
|
731 | 731 | setting, value, type_) |
|
732 | 732 | Session().add(sett) |
|
733 | 733 | |
|
734 | 734 | Session().commit() |
|
735 | 735 | |
|
736 | 736 | @HasRepoPermissionAllDecorator('repository.admin') |
|
737 | 737 | @auth.CSRFRequired() |
|
738 | 738 | def repo_issuetracker_save(self, repo_name): |
|
739 | 739 | # Save inheritance |
|
740 | 740 | repo_settings = IssueTrackerSettingsModel(repo=repo_name) |
|
741 | 741 | inherited = (request.POST.get('inherit_global_issuetracker') |
|
742 | 742 | == "inherited") |
|
743 | 743 | repo_settings.inherit_global_settings = inherited |
|
744 | 744 | Session().commit() |
|
745 | 745 | |
|
746 | 746 | form = IssueTrackerPatternsForm()().to_python(request.POST) |
|
747 | 747 | if form: |
|
748 | 748 | self._update_patterns(form, repo_settings) |
|
749 | 749 | |
|
750 | 750 | h.flash(_('Updated issue tracker entries'), category='success') |
|
751 | 751 | return redirect(url('repo_settings_issuetracker', |
|
752 | 752 | repo_name=repo_name)) |
|
753 | 753 | |
|
754 | 754 | @HasRepoPermissionAllDecorator('repository.admin') |
|
755 | 755 | def repo_issuetracker(self, repo_name): |
|
756 | 756 | """GET /admin/settings/issue-tracker: All items in the collection""" |
|
757 | 757 | c.active = 'issuetracker' |
|
758 | 758 | c.data = 'data' |
|
759 | 759 | c.repo_info = self._load_repo(repo_name) |
|
760 | 760 | |
|
761 | 761 | repo = Repository.get_by_repo_name(repo_name) |
|
762 | 762 | c.settings_model = IssueTrackerSettingsModel(repo=repo) |
|
763 | 763 | c.global_patterns = c.settings_model.get_global_settings() |
|
764 | 764 | c.repo_patterns = c.settings_model.get_repo_settings() |
|
765 | 765 | |
|
766 | 766 | return render('admin/repos/repo_edit.html') |
|
767 | 767 | |
|
768 | 768 | @HasRepoPermissionAllDecorator('repository.admin') |
|
769 | 769 | def repo_settings_vcs(self, repo_name): |
|
770 | 770 | """GET /{repo_name}/settings/vcs/: All items in the collection""" |
|
771 | 771 | |
|
772 | 772 | model = VcsSettingsModel(repo=repo_name) |
|
773 | 773 | |
|
774 | 774 | c.active = 'vcs' |
|
775 | 775 | c.global_svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
776 | 776 | c.global_svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
777 | 777 | c.svn_branch_patterns = model.get_repo_svn_branch_patterns() |
|
778 | 778 | c.svn_tag_patterns = model.get_repo_svn_tag_patterns() |
|
779 | 779 | c.repo_info = self._load_repo(repo_name) |
|
780 | 780 | defaults = self._vcs_form_defaults(repo_name) |
|
781 | 781 | c.inherit_global_settings = defaults['inherit_global_settings'] |
|
782 | 782 | |
|
783 | 783 | return htmlfill.render( |
|
784 | 784 | render('admin/repos/repo_edit.html'), |
|
785 | 785 | defaults=defaults, |
|
786 | 786 | encoding="UTF-8", |
|
787 | 787 | force_defaults=False) |
|
788 | 788 | |
|
789 | 789 | @HasRepoPermissionAllDecorator('repository.admin') |
|
790 | 790 | @auth.CSRFRequired() |
|
791 | 791 | def repo_settings_vcs_update(self, repo_name): |
|
792 | 792 | """POST /{repo_name}/settings/vcs/: All items in the collection""" |
|
793 | 793 | c.active = 'vcs' |
|
794 | 794 | |
|
795 | 795 | model = VcsSettingsModel(repo=repo_name) |
|
796 | 796 | c.global_svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
797 | 797 | c.global_svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
798 | 798 | c.svn_branch_patterns = model.get_repo_svn_branch_patterns() |
|
799 | 799 | c.svn_tag_patterns = model.get_repo_svn_tag_patterns() |
|
800 | 800 | c.repo_info = self._load_repo(repo_name) |
|
801 | 801 | defaults = self._vcs_form_defaults(repo_name) |
|
802 | 802 | c.inherit_global_settings = defaults['inherit_global_settings'] |
|
803 | 803 | |
|
804 | 804 | application_form = RepoVcsSettingsForm(repo_name)() |
|
805 | 805 | try: |
|
806 | 806 | form_result = application_form.to_python(dict(request.POST)) |
|
807 | 807 | except formencode.Invalid as errors: |
|
808 | 808 | h.flash( |
|
809 | 809 | _("Some form inputs contain invalid data."), |
|
810 | 810 | category='error') |
|
811 | 811 | return htmlfill.render( |
|
812 | 812 | render('admin/repos/repo_edit.html'), |
|
813 | 813 | defaults=errors.value, |
|
814 | 814 | errors=errors.error_dict or {}, |
|
815 | 815 | prefix_error=False, |
|
816 | 816 | encoding="UTF-8", |
|
817 | 817 | force_defaults=False |
|
818 | 818 | ) |
|
819 | 819 | |
|
820 | 820 | try: |
|
821 | 821 | inherit_global_settings = form_result['inherit_global_settings'] |
|
822 | 822 | model.create_or_update_repo_settings( |
|
823 | 823 | form_result, inherit_global_settings=inherit_global_settings) |
|
824 | 824 | except Exception: |
|
825 | 825 | log.exception("Exception while updating settings") |
|
826 | 826 | h.flash( |
|
827 | 827 | _('Error occurred during updating repository VCS settings'), |
|
828 | 828 | category='error') |
|
829 | 829 | else: |
|
830 | 830 | Session().commit() |
|
831 | 831 | h.flash(_('Updated VCS settings'), category='success') |
|
832 | 832 | return redirect(url('repo_vcs_settings', repo_name=repo_name)) |
|
833 | 833 | |
|
834 | 834 | return htmlfill.render( |
|
835 | 835 | render('admin/repos/repo_edit.html'), |
|
836 | 836 | defaults=self._vcs_form_defaults(repo_name), |
|
837 | 837 | encoding="UTF-8", |
|
838 | 838 | force_defaults=False) |
|
839 | 839 | |
|
840 | 840 | @HasRepoPermissionAllDecorator('repository.admin') |
|
841 | 841 | @auth.CSRFRequired() |
|
842 | 842 | @jsonify |
|
843 | 843 | def repo_delete_svn_pattern(self, repo_name): |
|
844 | 844 | if not request.is_xhr: |
|
845 | 845 | return False |
|
846 | 846 | |
|
847 | 847 | delete_pattern_id = request.POST.get('delete_svn_pattern') |
|
848 | 848 | model = VcsSettingsModel(repo=repo_name) |
|
849 | 849 | try: |
|
850 | 850 | model.delete_repo_svn_pattern(delete_pattern_id) |
|
851 | 851 | except SettingNotFound: |
|
852 | 852 | raise HTTPBadRequest() |
|
853 | 853 | |
|
854 | 854 | Session().commit() |
|
855 | 855 | return True |
|
856 | 856 | |
|
857 | 857 | def _vcs_form_defaults(self, repo_name): |
|
858 | 858 | model = VcsSettingsModel(repo=repo_name) |
|
859 | 859 | global_defaults = model.get_global_settings() |
|
860 | 860 | |
|
861 | 861 | repo_defaults = {} |
|
862 | 862 | repo_defaults.update(global_defaults) |
|
863 | 863 | repo_defaults.update(model.get_repo_settings()) |
|
864 | 864 | |
|
865 | 865 | global_defaults = { |
|
866 | 866 | '{}_inherited'.format(k): global_defaults[k] |
|
867 | 867 | for k in global_defaults} |
|
868 | 868 | |
|
869 | 869 | defaults = { |
|
870 | 870 | 'inherit_global_settings': model.inherit_global_settings |
|
871 | 871 | } |
|
872 | 872 | defaults.update(global_defaults) |
|
873 | 873 | defaults.update(repo_defaults) |
|
874 | 874 | defaults.update({ |
|
875 | 875 | 'new_svn_branch': '', |
|
876 | 876 | 'new_svn_tag': '', |
|
877 | 877 | }) |
|
878 | 878 | return defaults |
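The `_vcs_form_defaults` helper above layers three sources of values before handing them to the form. A standalone sketch of that merge order (the function name and sample keys here are illustrative, not part of the codebase): repo-level settings override global ones, while every global value is also kept under a `<key>_inherited` alias so the template can display both.

```python
# Illustrative re-implementation of the merge order used by _vcs_form_defaults.
def merge_vcs_defaults(global_settings, repo_settings, inherit_global):
    defaults = {'inherit_global_settings': inherit_global}
    # keep each global value under a "<key>_inherited" alias
    defaults.update({'{}_inherited'.format(k): v
                     for k, v in global_settings.items()})
    merged = dict(global_settings)   # start from the global values...
    merged.update(repo_settings)     # ...then let repo-level values win
    defaults.update(merged)
    # blank inputs for the "add new pattern" form fields
    defaults.update({'new_svn_branch': '', 'new_svn_tag': ''})
    return defaults


print(merge_vcs_defaults({'hooks_push': True}, {'hooks_push': False}, False))
```

The `_inherited` aliases are what let the settings page grey out the global value next to an overriding repo value instead of discarding it.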
@@ -1,866 +1,813 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | |
|
22 | 22 | """ |
|
23 | 23 | settings controller for rhodecode admin |
|
24 | 24 | """ |
|
25 | 25 | |
|
26 | 26 | import collections |
|
27 | 27 | import logging |
|
28 | 28 | import urllib2 |
|
29 | 29 | |
|
30 | 30 | import datetime |
|
31 | 31 | import formencode |
|
32 | 32 | from formencode import htmlfill |
|
33 | 33 | import packaging.version |
|
34 | 34 | from pylons import request, tmpl_context as c, url, config |
|
35 | 35 | from pylons.controllers.util import redirect |
|
36 | 36 | from pylons.i18n.translation import _, lazy_ugettext |
|
37 | 37 | from webob.exc import HTTPBadRequest |
|
38 | 38 | |
|
39 | 39 | import rhodecode |
|
40 | from rhodecode.admin.navigation import navigation_list | |
|
40 | 41 | from rhodecode.lib import auth |
|
41 | 42 | from rhodecode.lib import helpers as h |
|
42 | 43 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator |
|
43 | 44 | from rhodecode.lib.base import BaseController, render |
|
44 | 45 | from rhodecode.lib.celerylib import tasks, run_task |
|
45 | 46 | from rhodecode.lib.utils import repo2db_mapper |
|
46 | 47 | from rhodecode.lib.utils2 import ( |
|
47 | 48 | str2bool, safe_unicode, AttributeDict, safe_int) |
|
48 | 49 | from rhodecode.lib.compat import OrderedDict |
|
49 | 50 | from rhodecode.lib.ext_json import json |
|
50 | from rhodecode.lib.utils import jsonify |
|
51 | from rhodecode.lib.utils import jsonify | |
|
51 | 52 | |
|
52 | 53 | from rhodecode.model.db import RhodeCodeUi, Repository |
|
53 | 54 | from rhodecode.model.forms import ApplicationSettingsForm, \ |
|
54 | 55 | ApplicationUiSettingsForm, ApplicationVisualisationForm, \ |
|
55 | 56 | LabsSettingsForm, IssueTrackerPatternsForm |
|
56 | 57 | |
|
57 | 58 | from rhodecode.model.scm import ScmModel |
|
58 | 59 | from rhodecode.model.notification import EmailNotificationModel |
|
59 | 60 | from rhodecode.model.meta import Session |
|
60 | 61 | from rhodecode.model.settings import ( |
|
61 | 62 | IssueTrackerSettingsModel, VcsSettingsModel, SettingNotFound, |
|
62 | 63 | SettingsModel) |
|
64 | ||
|
63 | 65 | from rhodecode.model.supervisor import SupervisorModel, SUPERVISOR_MASTER |
|
64 | from rhodecode.model.user import UserModel | |
|
66 | ||
|
65 | 67 | |
|
66 | 68 | log = logging.getLogger(__name__) |
|
67 | 69 | |
|
68 | 70 | |
|
69 | 71 | class SettingsController(BaseController): |
|
70 | 72 | """REST Controller styled on the Atom Publishing Protocol""" |
|
71 | 73 | # To properly map this controller, ensure your config/routing.py |
|
72 | 74 | # file has a resource setup: |
|
73 | 75 | # map.resource('setting', 'settings', controller='admin/settings', |
|
74 | 76 | # path_prefix='/admin', name_prefix='admin_') |
|
75 | 77 | |
|
76 | 78 | @LoginRequired() |
|
77 | 79 | def __before__(self): |
|
78 | 80 | super(SettingsController, self).__before__() |
|
79 | 81 | c.labs_active = str2bool( |
|
80 | 82 | rhodecode.CONFIG.get('labs_settings_active', 'false')) |
|
81 | c.navlist = navigation |
|
83 | c.navlist = navigation_list(request) | |
|
82 | 84 | |
|
83 | 85 | def _get_hg_ui_settings(self): |
|
84 | 86 | ret = RhodeCodeUi.query().all() |
|
85 | 87 | |
|
86 | 88 | if not ret: |
|
87 | 89 | raise Exception('Could not get application ui settings !') |
|
88 | 90 | settings = {} |
|
89 | 91 | for each in ret: |
|
90 | 92 | k = each.ui_key |
|
91 | 93 | v = each.ui_value |
|
92 | 94 | if k == '/': |
|
93 | 95 | k = 'root_path' |
|
94 | 96 | |
|
95 | 97 | if k in ['push_ssl', 'publish']: |
|
96 | 98 | v = str2bool(v) |
|
97 | 99 | |
|
98 | 100 | if k.find('.') != -1: |
|
99 | 101 | k = k.replace('.', '_') |
|
100 | 102 | |
|
101 | 103 | if each.ui_section in ['hooks', 'extensions']: |
|
102 | 104 | v = each.ui_active |
|
103 | 105 | |
|
104 | 106 | settings[each.ui_section + '_' + k] = v |
|
105 | 107 | return settings |
|
106 | 108 | |
|
107 | 109 | @HasPermissionAllDecorator('hg.admin') |
|
108 | 110 | @auth.CSRFRequired() |
|
109 | 111 | @jsonify |
|
110 | 112 | def delete_svn_pattern(self): |
|
111 | 113 | if not request.is_xhr: |
|
112 | 114 | raise HTTPBadRequest() |
|
113 | 115 | |
|
114 | 116 | delete_pattern_id = request.POST.get('delete_svn_pattern') |
|
115 | 117 | model = VcsSettingsModel() |
|
116 | 118 | try: |
|
117 | 119 | model.delete_global_svn_pattern(delete_pattern_id) |
|
118 | 120 | except SettingNotFound: |
|
119 | 121 | raise HTTPBadRequest() |
|
120 | 122 | |
|
121 | 123 | Session().commit() |
|
122 | 124 | return True |
|
123 | 125 | |
|
124 | 126 | @HasPermissionAllDecorator('hg.admin') |
|
125 | 127 | @auth.CSRFRequired() |
|
126 | 128 | def settings_vcs_update(self): |
|
127 | 129 | """POST /admin/settings: All items in the collection""" |
|
128 | 130 | # url('admin_settings_vcs') |
|
129 | 131 | c.active = 'vcs' |
|
130 | 132 | |
|
131 | 133 | model = VcsSettingsModel() |
|
132 | 134 | c.svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
133 | 135 | c.svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
134 | 136 | |
|
135 | 137 | application_form = ApplicationUiSettingsForm()() |
|
136 | 138 | try: |
|
137 | 139 | form_result = application_form.to_python(dict(request.POST)) |
|
138 | 140 | except formencode.Invalid as errors: |
|
139 | 141 | h.flash( |
|
140 | 142 | _("Some form inputs contain invalid data."), |
|
141 | 143 | category='error') |
|
142 | 144 | return htmlfill.render( |
|
143 | 145 | render('admin/settings/settings.html'), |
|
144 | 146 | defaults=errors.value, |
|
145 | 147 | errors=errors.error_dict or {}, |
|
146 | 148 | prefix_error=False, |
|
147 | 149 | encoding="UTF-8", |
|
148 | 150 | force_defaults=False |
|
149 | 151 | ) |
|
150 | 152 | |
|
151 | 153 | try: |
|
152 | 154 | model.update_global_ssl_setting(form_result['web_push_ssl']) |
|
153 | 155 | if c.visual.allow_repo_location_change: |
|
154 | 156 | model.update_global_path_setting( |
|
155 | 157 | form_result['paths_root_path']) |
|
156 | 158 | model.update_global_hook_settings(form_result) |
|
157 | 159 | model.create_global_svn_settings(form_result) |
|
158 | 160 | model.create_or_update_global_hg_settings(form_result) |
|
159 | 161 | model.create_or_update_global_pr_settings(form_result) |
|
160 | 162 | except Exception: |
|
161 | 163 | log.exception("Exception while updating settings") |
|
162 | 164 | h.flash(_('Error occurred during updating ' |
|
163 | 165 | 'application settings'), category='error') |
|
164 | 166 | else: |
|
165 | 167 | Session().commit() |
|
166 | 168 | h.flash(_('Updated VCS settings'), category='success') |
|
167 | 169 | return redirect(url('admin_settings_vcs')) |
|
168 | 170 | |
|
169 | 171 | return htmlfill.render( |
|
170 | 172 | render('admin/settings/settings.html'), |
|
171 | 173 | defaults=self._form_defaults(), |
|
172 | 174 | encoding="UTF-8", |
|
173 | 175 | force_defaults=False) |
|
174 | 176 | |
|
175 | 177 | @HasPermissionAllDecorator('hg.admin') |
|
176 | 178 | def settings_vcs(self): |
|
177 | 179 | """GET /admin/settings: All items in the collection""" |
|
178 | 180 | # url('admin_settings_vcs') |
|
179 | 181 | c.active = 'vcs' |
|
180 | 182 | model = VcsSettingsModel() |
|
181 | 183 | c.svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
182 | 184 | c.svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
183 | 185 | |
|
184 | 186 | return htmlfill.render( |
|
185 | 187 | render('admin/settings/settings.html'), |
|
186 | 188 | defaults=self._form_defaults(), |
|
187 | 189 | encoding="UTF-8", |
|
188 | 190 | force_defaults=False) |
|
189 | 191 | |
|
190 | 192 | @HasPermissionAllDecorator('hg.admin') |
|
191 | 193 | @auth.CSRFRequired() |
|
192 | 194 | def settings_mapping_update(self): |
|
193 | 195 | """POST /admin/settings/mapping: All items in the collection""" |
|
194 | 196 | # url('admin_settings_mapping') |
|
195 | 197 | c.active = 'mapping' |
|
196 | 198 | rm_obsolete = request.POST.get('destroy', False) |
|
197 | 199 | invalidate_cache = request.POST.get('invalidate', False) |
|
198 | 200 | log.debug( |
|
199 | 201 | 'rescanning repo location with destroy obsolete=%s', rm_obsolete) |
|
200 | 202 | |
|
201 | 203 | if invalidate_cache: |
|
202 | 204 | log.debug('invalidating all repositories cache') |
|
203 | 205 | for repo in Repository.get_all(): |
|
204 | 206 | ScmModel().mark_for_invalidation(repo.repo_name, delete=True) |
|
205 | 207 | |
|
206 | 208 | filesystem_repos = ScmModel().repo_scan() |
|
207 | 209 | added, removed = repo2db_mapper(filesystem_repos, rm_obsolete) |
|
208 | 210 | _repr = lambda l: ', '.join(map(safe_unicode, l)) or '-' |
|
209 | 211 | h.flash(_('Repositories successfully ' |
|
210 | 212 | 'rescanned added: %s ; removed: %s') % |
|
211 | 213 | (_repr(added), _repr(removed)), |
|
212 | 214 | category='success') |
|
213 | 215 | return redirect(url('admin_settings_mapping')) |
|
214 | 216 | |
|
215 | 217 | @HasPermissionAllDecorator('hg.admin') |
|
216 | 218 | def settings_mapping(self): |
|
217 | 219 | """GET /admin/settings/mapping: All items in the collection""" |
|
218 | 220 | # url('admin_settings_mapping') |
|
219 | 221 | c.active = 'mapping' |
|
220 | 222 | |
|
221 | 223 | return htmlfill.render( |
|
222 | 224 | render('admin/settings/settings.html'), |
|
223 | 225 | defaults=self._form_defaults(), |
|
224 | 226 | encoding="UTF-8", |
|
225 | 227 | force_defaults=False) |
|
226 | 228 | |
|
227 | 229 | @HasPermissionAllDecorator('hg.admin') |
|
228 | 230 | @auth.CSRFRequired() |
|
229 | 231 | def settings_global_update(self): |
|
230 | 232 | """POST /admin/settings/global: All items in the collection""" |
|
231 | 233 | # url('admin_settings_global') |
|
232 | 234 | c.active = 'global' |
|
233 | 235 | application_form = ApplicationSettingsForm()() |
|
234 | 236 | try: |
|
235 | 237 | form_result = application_form.to_python(dict(request.POST)) |
|
236 | 238 | except formencode.Invalid as errors: |
|
237 | 239 | return htmlfill.render( |
|
238 | 240 | render('admin/settings/settings.html'), |
|
239 | 241 | defaults=errors.value, |
|
240 | 242 | errors=errors.error_dict or {}, |
|
241 | 243 | prefix_error=False, |
|
242 | 244 | encoding="UTF-8", |
|
243 | 245 | force_defaults=False) |
|
244 | 246 | |
|
245 | 247 | try: |
|
246 | 248 | settings = [ |
|
247 | 249 | ('title', 'rhodecode_title'), |
|
248 | 250 | ('realm', 'rhodecode_realm'), |
|
249 | 251 | ('pre_code', 'rhodecode_pre_code'), |
|
250 | 252 | ('post_code', 'rhodecode_post_code'), |
|
251 | 253 | ('captcha_public_key', 'rhodecode_captcha_public_key'), |
|
252 | 254 | ('captcha_private_key', 'rhodecode_captcha_private_key'), |
|
253 | 255 | ] |
|
254 | 256 | for setting, form_key in settings: |
|
255 | 257 | sett = SettingsModel().create_or_update_setting( |
|
256 | 258 | setting, form_result[form_key]) |
|
257 | 259 | Session().add(sett) |
|
258 | 260 | |
|
259 | 261 | Session().commit() |
|
262 | SettingsModel().invalidate_settings_cache() | |
|
260 | 263 | h.flash(_('Updated application settings'), category='success') |
|
261 | ||
|
262 | 264 | except Exception: |
|
263 | 265 | log.exception("Exception while updating application settings") |
|
264 | 266 | h.flash( |
|
265 | 267 | _('Error occurred during updating application settings'), |
|
266 | 268 | category='error') |
|
267 | 269 | |
|
268 | 270 | return redirect(url('admin_settings_global')) |
|
269 | 271 | |
|
270 | 272 | @HasPermissionAllDecorator('hg.admin') |
|
271 | 273 | def settings_global(self): |
|
272 | 274 | """GET /admin/settings/global: All items in the collection""" |
|
273 | 275 | # url('admin_settings_global') |
|
274 | 276 | c.active = 'global' |
|
275 | 277 | |
|
276 | 278 | return htmlfill.render( |
|
277 | 279 | render('admin/settings/settings.html'), |
|
278 | 280 | defaults=self._form_defaults(), |
|
279 | 281 | encoding="UTF-8", |
|
280 | 282 | force_defaults=False) |
|
281 | 283 | |
|
282 | 284 | @HasPermissionAllDecorator('hg.admin') |
|
283 | 285 | @auth.CSRFRequired() |
|
284 | 286 | def settings_visual_update(self): |
|
285 | 287 | """POST /admin/settings/visual: All items in the collection""" |
|
286 | 288 | # url('admin_settings_visual') |
|
287 | 289 | c.active = 'visual' |
|
288 | 290 | application_form = ApplicationVisualisationForm()() |
|
289 | 291 | try: |
|
290 | 292 | form_result = application_form.to_python(dict(request.POST)) |
|
291 | 293 | except formencode.Invalid as errors: |
|
292 | 294 | return htmlfill.render( |
|
293 | 295 | render('admin/settings/settings.html'), |
|
294 | 296 | defaults=errors.value, |
|
295 | 297 | errors=errors.error_dict or {}, |
|
296 | 298 | prefix_error=False, |
|
297 | 299 | encoding="UTF-8", |
|
298 | 300 | force_defaults=False |
|
299 | 301 | ) |
|
300 | 302 | |
|
301 | 303 | try: |
|
302 | 304 | settings = [ |
|
303 | 305 | ('show_public_icon', 'rhodecode_show_public_icon', 'bool'), |
|
304 | 306 | ('show_private_icon', 'rhodecode_show_private_icon', 'bool'), |
|
305 | 307 | ('stylify_metatags', 'rhodecode_stylify_metatags', 'bool'), |
|
306 | 308 | ('repository_fields', 'rhodecode_repository_fields', 'bool'), |
|
307 | 309 | ('dashboard_items', 'rhodecode_dashboard_items', 'int'), |
|
308 | 310 | ('admin_grid_items', 'rhodecode_admin_grid_items', 'int'), |
|
309 | 311 | ('show_version', 'rhodecode_show_version', 'bool'), |
|
310 | 312 | ('use_gravatar', 'rhodecode_use_gravatar', 'bool'), |
|
311 | 313 | ('markup_renderer', 'rhodecode_markup_renderer', 'unicode'), |
|
312 | 314 | ('gravatar_url', 'rhodecode_gravatar_url', 'unicode'), |
|
313 | 315 | ('clone_uri_tmpl', 'rhodecode_clone_uri_tmpl', 'unicode'), |
|
314 | 316 | ('support_url', 'rhodecode_support_url', 'unicode'), |
|
315 | 317 | ('show_revision_number', 'rhodecode_show_revision_number', 'bool'), |
|
316 | 318 | ('show_sha_length', 'rhodecode_show_sha_length', 'int'), |
|
317 | 319 | ] |
|
318 | 320 | for setting, form_key, type_ in settings: |
|
319 | 321 | sett = SettingsModel().create_or_update_setting( |
|
320 | 322 | setting, form_result[form_key], type_) |
|
321 | 323 | Session().add(sett) |
|
322 | 324 | |
|
323 | 325 | Session().commit() |
|
324 | ||
|
326 | SettingsModel().invalidate_settings_cache() | |
|
325 | 327 | h.flash(_('Updated visualisation settings'), category='success') |
|
326 | 328 | except Exception: |
|
327 | 329 | log.exception("Exception updating visualization settings") |
|
328 | 330 | h.flash(_('Error occurred during updating ' |
|
329 | 331 | 'visualisation settings'), |
|
330 | 332 | category='error') |
|
331 | 333 | |
|
332 | 334 | return redirect(url('admin_settings_visual')) |
|
333 | 335 | |
|
334 | 336 | @HasPermissionAllDecorator('hg.admin') |
|
335 | 337 | def settings_visual(self): |
|
336 | 338 | """GET /admin/settings/visual: All items in the collection""" |
|
337 | 339 | # url('admin_settings_visual') |
|
338 | 340 | c.active = 'visual' |
|
339 | 341 | |
|
340 | 342 | return htmlfill.render( |
|
341 | 343 | render('admin/settings/settings.html'), |
|
342 | 344 | defaults=self._form_defaults(), |
|
343 | 345 | encoding="UTF-8", |
|
344 | 346 | force_defaults=False) |
|
345 | 347 | |
|
346 | 348 | @HasPermissionAllDecorator('hg.admin') |
|
347 | 349 | @auth.CSRFRequired() |
|
348 | 350 | def settings_issuetracker_test(self): |
|
349 | 351 | if request.is_xhr: |
|
350 | 352 | return h.urlify_commit_message( |
|
351 | 353 | request.POST.get('test_text', ''), |
|
352 | 354 | 'repo_group/test_repo1') |
|
353 | 355 | else: |
|
354 | 356 | raise HTTPBadRequest() |
|
355 | 357 | |
|
356 | 358 | @HasPermissionAllDecorator('hg.admin') |
|
357 | 359 | @auth.CSRFRequired() |
|
358 | 360 | def settings_issuetracker_delete(self): |
|
359 | 361 | uid = request.POST.get('uid') |
|
360 | 362 | IssueTrackerSettingsModel().delete_entries(uid) |
|
361 | 363 | h.flash(_('Removed issue tracker entry'), category='success') |
|
362 | 364 | return redirect(url('admin_settings_issuetracker')) |
|
363 | 365 | |
|
364 | 366 | @HasPermissionAllDecorator('hg.admin') |
|
365 | 367 | def settings_issuetracker(self): |
|
366 | 368 | """GET /admin/settings/issue-tracker: All items in the collection""" |
|
367 | 369 | # url('admin_settings_issuetracker') |
|
368 | 370 | c.active = 'issuetracker' |
|
369 | 371 | defaults = SettingsModel().get_all_settings() |
|
370 | 372 | |
|
371 | 373 | entry_key = 'rhodecode_issuetracker_pat_' |
|
372 | 374 | |
|
373 | 375 | c.issuetracker_entries = {} |
|
374 | 376 | for k, v in defaults.items(): |
|
375 | 377 | if k.startswith(entry_key): |
|
376 | 378 | uid = k[len(entry_key):] |
|
377 | 379 | c.issuetracker_entries[uid] = None |
|
378 | 380 | |
|
379 | 381 | for uid in c.issuetracker_entries: |
|
380 | 382 | c.issuetracker_entries[uid] = AttributeDict({ |
|
381 | 383 | 'pat': defaults.get('rhodecode_issuetracker_pat_' + uid), |
|
382 | 384 | 'url': defaults.get('rhodecode_issuetracker_url_' + uid), |
|
383 | 385 | 'pref': defaults.get('rhodecode_issuetracker_pref_' + uid), |
|
384 | 386 | 'desc': defaults.get('rhodecode_issuetracker_desc_' + uid), |
|
385 | 387 | }) |
|
386 | 388 | |
|
387 | 389 | return render('admin/settings/settings.html') |
|
388 | 390 | |
|
389 | 391 | @HasPermissionAllDecorator('hg.admin') |
|
390 | 392 | @auth.CSRFRequired() |
|
391 | 393 | def settings_issuetracker_save(self): |
|
392 | 394 | settings_model = IssueTrackerSettingsModel() |
|
393 | 395 | |
|
394 | 396 | form = IssueTrackerPatternsForm()().to_python(request.POST) |
|
395 | 397 | for uid in form['delete_patterns']: |
|
396 | 398 | settings_model.delete_entries(uid) |
|
397 | 399 | |
|
398 | 400 | for pattern in form['patterns']: |
|
399 | 401 | for setting, value, type_ in pattern: |
|
400 | 402 | sett = settings_model.create_or_update_setting( |
|
401 | 403 | setting, value, type_) |
|
402 | 404 | Session().add(sett) |
|
403 | 405 | |
|
404 | 406 | Session().commit() |
|
405 | 407 | |
|
408 | SettingsModel().invalidate_settings_cache() | |
|
406 | 409 | h.flash(_('Updated issue tracker entries'), category='success') |
|
407 | 410 | return redirect(url('admin_settings_issuetracker')) |
|
408 | 411 | |
|
409 | 412 | @HasPermissionAllDecorator('hg.admin') |
|
410 | 413 | @auth.CSRFRequired() |
|
411 | 414 | def settings_email_update(self): |
|
412 | 415 | """POST /admin/settings/email: All items in the collection""" |
|
413 | 416 | # url('admin_settings_email') |
|
414 | 417 | c.active = 'email' |
|
415 | 418 | |
|
416 | 419 | test_email = request.POST.get('test_email') |
|
417 | 420 | |
|
418 | 421 | if not test_email: |
|
419 | 422 | h.flash(_('Please enter email address'), category='error') |
|
420 | 423 | return redirect(url('admin_settings_email')) |
|
421 | 424 | |
|
422 | 425 | email_kwargs = { |
|
423 | 426 | 'date': datetime.datetime.now(), |
|
424 | 427 | 'user': c.rhodecode_user, |
|
425 | 428 | 'rhodecode_version': c.rhodecode_version |
|
426 | 429 | } |
|
427 | 430 | |
|
428 | 431 | (subject, headers, email_body, |
|
429 | 432 | email_body_plaintext) = EmailNotificationModel().render_email( |
|
430 | 433 | EmailNotificationModel.TYPE_EMAIL_TEST, **email_kwargs) |
|
431 | 434 | |
|
432 | 435 | recipients = [test_email] if test_email else None |
|
433 | 436 | |
|
434 | 437 | run_task(tasks.send_email, recipients, subject, |
|
435 | 438 | email_body_plaintext, email_body) |
|
436 | 439 | |
|
437 | 440 | h.flash(_('Send email task created'), category='success') |
|
438 | 441 | return redirect(url('admin_settings_email')) |
|
439 | 442 | |
|
440 | 443 | @HasPermissionAllDecorator('hg.admin') |
|
441 | 444 | def settings_email(self): |
|
442 | 445 | """GET /admin/settings/email: All items in the collection""" |
|
443 | 446 | # url('admin_settings_email') |
|
444 | 447 | c.active = 'email' |
|
445 | 448 | c.rhodecode_ini = rhodecode.CONFIG |
|
446 | 449 | |
|
447 | 450 | return htmlfill.render( |
|
448 | 451 | render('admin/settings/settings.html'), |
|
449 | 452 | defaults=self._form_defaults(), |
|
450 | 453 | encoding="UTF-8", |
|
451 | 454 | force_defaults=False) |
|
452 | 455 | |
|
453 | 456 | @HasPermissionAllDecorator('hg.admin') |
|
454 | 457 | @auth.CSRFRequired() |
|
455 | 458 | def settings_hooks_update(self): |
|
456 | 459 | """POST or DELETE /admin/settings/hooks: All items in the collection""" |
|
457 | 460 | # url('admin_settings_hooks') |
|
458 | 461 | c.active = 'hooks' |
|
459 | 462 | if c.visual.allow_custom_hooks_settings: |
|
460 | 463 | ui_key = request.POST.get('new_hook_ui_key') |
|
461 | 464 | ui_value = request.POST.get('new_hook_ui_value') |
|
462 | 465 | |
|
463 | 466 | hook_id = request.POST.get('hook_id') |
|
464 | 467 | new_hook = False |
|
465 | 468 | |
|
466 | 469 | model = SettingsModel() |
|
467 | 470 | try: |
|
468 | 471 | if ui_value and ui_key: |
|
469 | 472 | model.create_or_update_hook(ui_key, ui_value) |
|
470 | 473 | h.flash(_('Added new hook'), category='success') |
|
471 | 474 | new_hook = True |
|
472 | 475 | elif hook_id: |
|
473 | 476 | RhodeCodeUi.delete(hook_id) |
|
474 | 477 | Session().commit() |
|
475 | 478 | |
|
476 | 479 | # check for edits |
|
477 | 480 | update = False |
|
478 | 481 | _d = request.POST.dict_of_lists() |
|
479 | 482 | for k, v in zip(_d.get('hook_ui_key', []), |
|
480 | 483 | _d.get('hook_ui_value_new', [])): |
|
481 | 484 | model.create_or_update_hook(k, v) |
|
482 | 485 | update = True |
|
483 | 486 | |
|
484 | 487 | if update and not new_hook: |
|
485 | 488 | h.flash(_('Updated hooks'), category='success') |
|
486 | 489 | Session().commit() |
|
487 | 490 | except Exception: |
|
488 | 491 | log.exception("Exception during hook creation") |
|
489 | 492 | h.flash(_('Error occurred during hook creation'), |
|
490 | 493 | category='error') |
|
491 | 494 | |
|
492 | 495 | return redirect(url('admin_settings_hooks')) |
|
493 | 496 | |
|
494 | 497 | @HasPermissionAllDecorator('hg.admin') |
|
495 | 498 | def settings_hooks(self): |
|
496 | 499 | """GET /admin/settings/hooks: All items in the collection""" |
|
497 | 500 | # url('admin_settings_hooks') |
|
498 | 501 | c.active = 'hooks' |
|
499 | 502 | |
|
500 | 503 | model = SettingsModel() |
|
501 | 504 | c.hooks = model.get_builtin_hooks() |
|
502 | 505 | c.custom_hooks = model.get_custom_hooks() |
|
503 | 506 | |
|
504 | 507 | return htmlfill.render( |
|
505 | 508 | render('admin/settings/settings.html'), |
|
506 | 509 | defaults=self._form_defaults(), |
|
507 | 510 | encoding="UTF-8", |
|
508 | 511 | force_defaults=False) |
|
509 | 512 | |
|
510 | 513 | @HasPermissionAllDecorator('hg.admin') |
|
511 | 514 | def settings_search(self): |
|
512 | 515 | """GET /admin/settings/search: All items in the collection""" |
|
513 | 516 | # url('admin_settings_search') |
|
514 | 517 | c.active = 'search' |
|
515 | 518 | |
|
516 | 519 | from rhodecode.lib.index import searcher_from_config |
|
517 | 520 | searcher = searcher_from_config(config) |
|
518 | 521 | c.statistics = searcher.statistics() |
|
519 | 522 | |
|
520 | 523 | return render('admin/settings/settings.html') |
|
521 | 524 | |
|
522 | 525 | @HasPermissionAllDecorator('hg.admin') |
|
523 | 526 | def settings_system(self): |
|
524 | 527 | """GET /admin/settings/system: All items in the collection""" |
|
525 | 528 | # url('admin_settings_system') |
|
529 | snapshot = str2bool(request.GET.get('snapshot')) | |
|
526 | 530 | c.active = 'system' |
|
527 | 531 | |
|
528 | 532 | defaults = self._form_defaults() |
|
529 | 533 | c.rhodecode_ini = rhodecode.CONFIG |
|
530 | 534 | c.rhodecode_update_url = defaults.get('rhodecode_update_url') |
|
531 | 535 | server_info = ScmModel().get_server_info(request.environ) |
|
532 | 536 | for key, val in server_info.iteritems(): |
|
533 | 537 | setattr(c, key, val) |
|
534 | 538 | |
|
535 | 539 | if c.disk['percent'] > 90: |
|
536 | 540 | h.flash(h.literal(_( |
|
537 | 541 | 'Critical: your disk space is very low <b>%s%%</b> used' % |
|
538 | 542 | c.disk['percent'])), 'error') |
|
539 | 543 | elif c.disk['percent'] > 70: |
|
540 | 544 | h.flash(h.literal(_( |
|
541 | 545 | 'Warning: your disk space is running low <b>%s%%</b> used' % |
|
542 | 546 | c.disk['percent'])), 'warning') |
|
543 | 547 | |
|
544 | 548 | try: |
|
545 | 549 | c.uptime_age = h._age( |
|
546 | 550 | h.time_to_datetime(c.boot_time), False, show_suffix=False) |
|
547 | 551 | except TypeError: |
|
548 | 552 | c.uptime_age = c.boot_time |
|
549 | 553 | |
|
550 | 554 | try: |
|
551 | 555 | c.system_memory = '%s/%s, %s%% (%s%%) used%s' % ( |
|
552 | 556 | h.format_byte_size_binary(c.memory['used']), |
|
553 | 557 | h.format_byte_size_binary(c.memory['total']), |
|
554 | 558 | c.memory['percent2'], |
|
555 | 559 | c.memory['percent'], |
|
556 | 560 | ' %s' % c.memory['error'] if 'error' in c.memory else '') |
|
557 | 561 | except TypeError: |
|
558 | 562 | c.system_memory = 'NOT AVAILABLE' |
|
559 | 563 | |
|
564 | rhodecode_ini_safe = rhodecode.CONFIG.copy() | |
|
565 | blacklist = [ | |
|
566 | 'rhodecode_license_key', | |
|
567 | 'routes.map', | |
|
568 | 'pylons.h', | |
|
569 | 'pylons.app_globals', | |
|
570 | 'pylons.environ_config', | |
|
571 | 'sqlalchemy.db1.url', | |
|
572 | ('app_conf', 'sqlalchemy.db1.url') | |
|
573 | ] | |
|
574 | for k in blacklist: | |
|
575 | if isinstance(k, tuple): | |
|
576 | section, key = k | |
|
577 | if section in rhodecode_ini_safe: | |
|
578 | rhodecode_ini_safe[section].pop(key, None) | |
|
579 | else: | |
|
580 | rhodecode_ini_safe.pop(k, None) | |
|
581 | ||
|
582 | c.rhodecode_ini_safe = rhodecode_ini_safe | |
|
583 | ||
|
584 | # TODO: marcink, figure out how to allow only selected users to do this | |
|
585 | c.allowed_to_snapshot = False | |
|
586 | ||
|
587 | if snapshot: | |
|
588 | if c.allowed_to_snapshot: | |
|
589 | return render('admin/settings/settings_system_snapshot.html') | |
|
590 | else: | |
|
591 | h.flash('You are not allowed to do this', category='warning') | |
|
592 | ||
|
560 | 593 | return htmlfill.render( |
|
561 | 594 | render('admin/settings/settings.html'), |
|
562 | 595 | defaults=defaults, |
|
563 | 596 | encoding="UTF-8", |
|
564 | 597 | force_defaults=False) |
|
565 | 598 | |
|
566 | 599 | @staticmethod |
|
567 | 600 | def get_update_data(update_url): |
|
568 | 601 | """Return the JSON update data.""" |
|
569 | 602 | ver = rhodecode.__version__ |
|
570 | 603 | log.debug('Checking for upgrade on `%s` server', update_url) |
|
571 | 604 | opener = urllib2.build_opener() |
|
572 | 605 | opener.addheaders = [('User-agent', 'RhodeCode-SCM/%s' % ver)] |
|
573 | 606 | response = opener.open(update_url) |
|
574 | 607 | response_data = response.read() |
|
575 | 608 | data = json.loads(response_data) |
|
576 | 609 | |
|
577 | 610 | return data |
|
578 | 611 | |
|
579 | 612 | @HasPermissionAllDecorator('hg.admin') |
|
580 | 613 | def settings_system_update(self): |
|
581 | 614 | """GET /admin/settings/system/updates: All items in the collection""" |
|
582 | 615 | # url('admin_settings_system_update') |
|
583 | 616 | defaults = self._form_defaults() |
|
584 | 617 | update_url = defaults.get('rhodecode_update_url', '') |
|
585 | 618 | |
|
586 | 619 | _err = lambda s: '<div style="color:#ff8888; padding:4px 0px">%s</div>' % (s) |
|
587 | 620 | try: |
|
588 | 621 | data = self.get_update_data(update_url) |
|
589 | 622 | except urllib2.URLError as e: |
|
590 | 623 | log.exception("Exception contacting upgrade server") |
|
591 | 624 | return _err('Failed to contact upgrade server: %r' % e) |
|
592 | 625 | except ValueError as e: |
|
593 | 626 | log.exception("Bad data sent from update server") |
|
594 | 627 | return _err('Bad data sent from update server') |
|
595 | 628 | |
|
596 | 629 | latest = data['versions'][0] |
|
597 | 630 | |
|
598 | 631 | c.update_url = update_url |
|
599 | 632 | c.latest_data = latest |
|
600 | 633 | c.latest_ver = latest['version'] |
|
601 | 634 | c.cur_ver = rhodecode.__version__ |
|
602 | 635 | c.should_upgrade = False |
|
603 | 636 | |
|
604 | 637 | if (packaging.version.Version(c.latest_ver) > |
|
605 | 638 | packaging.version.Version(c.cur_ver)): |
|
606 | 639 | c.should_upgrade = True |
|
607 | 640 | c.important_notices = latest['general'] |
|
608 | 641 | |
|
609 | 642 | return render('admin/settings/settings_system_update.html') |
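The upgrade decision in `settings_system_update` reduces to a version comparison. The sketch below uses a stdlib-only stand-in for `packaging.version.Version` (assumption: plain dotted numeric versions, no pre-release tags):

```python
def parse_version(ver):
    """Tiny stand-in for packaging.version.Version; handles plain
    dotted numeric versions only (no rc/dev suffixes)."""
    return tuple(int(part) for part in ver.split('.'))


def should_upgrade(current, latest):
    """True when the advertised latest version is newer than the
    running one, as the c.should_upgrade flag is computed above."""
    return parse_version(latest) > parse_version(current)
```

Tuple comparison handles multi-digit components correctly, which is why a plain string comparison (`'3.10.0' > '3.9.1'` is False) would not work here.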
|
610 | 643 | |
|
611 | 644 | @HasPermissionAllDecorator('hg.admin') |
|
612 | 645 | def settings_supervisor(self): |
|
613 | 646 | c.rhodecode_ini = rhodecode.CONFIG |
|
614 | 647 | c.active = 'supervisor' |
|
615 | 648 | |
|
616 | 649 | c.supervisor_procs = OrderedDict([ |
|
617 | 650 | (SUPERVISOR_MASTER, {}), |
|
618 | 651 | ]) |
|
619 | 652 | |
|
620 | 653 | c.log_size = 10240 |
|
621 | 654 | supervisor = SupervisorModel() |
|
622 | 655 | |
|
623 | 656 | _connection = supervisor.get_connection( |
|
624 | 657 | c.rhodecode_ini.get('supervisor.uri')) |
|
625 | 658 | c.connection_error = None |
|
626 | 659 | try: |
|
627 | 660 | _connection.supervisor.getAllProcessInfo() |
|
628 | 661 | except Exception as e: |
|
629 | 662 | c.connection_error = str(e) |
|
630 | 663 | log.exception("Exception reading supervisor data") |
|
631 | 664 | return render('admin/settings/settings.html') |
|
632 | 665 | |
|
633 | 666 | groupid = c.rhodecode_ini.get('supervisor.group_id') |
|
634 | 667 | |
|
635 | 668 | # feed our group processes to the main |
|
636 | 669 | for proc in supervisor.get_group_processes(_connection, groupid): |
|
637 | 670 | c.supervisor_procs[proc['name']] = {} |
|
638 | 671 | |
|
639 | 672 | for k in c.supervisor_procs.keys(): |
|
640 | 673 | try: |
|
641 | 674 | # master process info |
|
642 | 675 | if k == SUPERVISOR_MASTER: |
|
643 | 676 | _data = supervisor.get_master_state(_connection) |
|
644 | 677 | _data['name'] = 'supervisor master' |
|
645 | 678 | _data['description'] = 'pid %s, id: %s, ver: %s' % ( |
|
646 | 679 | _data['pid'], _data['id'], _data['ver']) |
|
647 | 680 | c.supervisor_procs[k] = _data |
|
648 | 681 | else: |
|
649 | 682 | procid = groupid + ":" + k |
|
650 | 683 | c.supervisor_procs[k] = supervisor.get_process_info(_connection, procid) |
|
651 | 684 | except Exception as e: |
|
652 | 685 | log.exception("Exception reading supervisor data") |
|
653 | 686 | c.supervisor_procs[k] = {'_rhodecode_error': str(e)} |
|
654 | 687 | |
|
655 | 688 | return render('admin/settings/settings.html') |
|
656 | 689 | |
|
657 | 690 | @HasPermissionAllDecorator('hg.admin') |
|
658 | 691 | def settings_supervisor_log(self, procid): |
|
659 | 692 | import rhodecode |
|
660 | 693 | c.rhodecode_ini = rhodecode.CONFIG |
|
661 | 694 | c.active = 'supervisor_tail' |
|
662 | 695 | |
|
663 | 696 | supervisor = SupervisorModel() |
|
664 | 697 | _connection = supervisor.get_connection(c.rhodecode_ini.get('supervisor.uri')) |
|
665 | 698 | groupid = c.rhodecode_ini.get('supervisor.group_id') |
|
666 | 699 | procid = groupid + ":" + procid if procid != SUPERVISOR_MASTER else procid |
|
667 | 700 | |
|
668 | 701 | c.log_size = 10240 |
|
669 | 702 | offset = abs(safe_int(request.GET.get('offset', c.log_size))) * -1 |
|
670 | 703 | c.log = supervisor.read_process_log(_connection, procid, offset, 0) |
|
671 | 704 | |
|
672 | 705 | return render('admin/settings/settings.html') |
|
673 | 706 | |
|
674 | 707 | @HasPermissionAllDecorator('hg.admin') |
|
675 | 708 | @auth.CSRFRequired() |
|
676 | 709 | def settings_labs_update(self): |
|
677 | 710 | """POST /admin/settings/labs: All items in the collection""" |
|
678 | 711 | # url('admin_settings/labs', method={'POST'}) |
|
679 | 712 | c.active = 'labs' |
|
680 | 713 | |
|
681 | 714 | application_form = LabsSettingsForm()() |
|
682 | 715 | try: |
|
683 | 716 | form_result = application_form.to_python(dict(request.POST)) |
|
684 | 717 | except formencode.Invalid as errors: |
|
685 | 718 | h.flash( |
|
686 | 719 | _('Some form inputs contain invalid data.'), |
|
687 | 720 | category='error') |
|
688 | 721 | return htmlfill.render( |
|
689 | 722 | render('admin/settings/settings.html'), |
|
690 | 723 | defaults=errors.value, |
|
691 | 724 | errors=errors.error_dict or {}, |
|
692 | 725 | prefix_error=False, |
|
693 | 726 | encoding='UTF-8', |
|
694 | 727 | force_defaults=False |
|
695 | 728 | ) |
|
696 | 729 | |
|
697 | 730 | try: |
|
698 | 731 | session = Session() |
|
699 | 732 | for setting in _LAB_SETTINGS: |
|
700 | 733 | setting_name = setting.key[len('rhodecode_'):] |
|
701 | 734 | sett = SettingsModel().create_or_update_setting( |
|
702 | 735 | setting_name, form_result[setting.key], setting.type) |
|
703 | 736 | session.add(sett) |
|
704 | 737 | |
|
705 | 738 | except Exception: |
|
706 | 739 | log.exception('Exception while updating lab settings') |
|
707 | 740 | h.flash(_('Error occurred during updating labs settings'), |
|
708 | 741 | category='error') |
|
709 | 742 | else: |
|
710 | 743 | Session().commit() |
|
744 | SettingsModel().invalidate_settings_cache() | |
|
711 | 745 | h.flash(_('Updated Labs settings'), category='success') |
|
712 | 746 | return redirect(url('admin_settings_labs')) |
|
713 | 747 | |
|
714 | 748 | return htmlfill.render( |
|
715 | 749 | render('admin/settings/settings.html'), |
|
716 | 750 | defaults=self._form_defaults(), |
|
717 | 751 | encoding='UTF-8', |
|
718 | 752 | force_defaults=False) |
|
719 | 753 | |
|
720 | 754 | @HasPermissionAllDecorator('hg.admin') |
|
721 | 755 | def settings_labs(self): |
|
722 | 756 | """GET /admin/settings/labs: All items in the collection""" |
|
723 | 757 | # url('admin_settings_labs') |
|
724 | 758 | if not c.labs_active: |
|
725 | 759 | redirect(url('admin_settings')) |
|
726 | 760 | |
|
727 | 761 | c.active = 'labs' |
|
728 | 762 | c.lab_settings = _LAB_SETTINGS |
|
729 | 763 | |
|
730 | 764 | return htmlfill.render( |
|
731 | 765 | render('admin/settings/settings.html'), |
|
732 | 766 | defaults=self._form_defaults(), |
|
733 | 767 | encoding='UTF-8', |
|
734 | 768 | force_defaults=False) |
|
735 | 769 | |
|
736 | @HasPermissionAllDecorator('hg.admin') | |
|
737 | def settings_open_source(self): | |
|
738 | # url('admin_settings_open_source') | |
|
739 | ||
|
740 | c.active = 'open_source' | |
|
741 | c.opensource_licenses = collections.OrderedDict( | |
|
742 | sorted(read_opensource_licenses().items(), key=lambda t: t[0])) | |
|
743 | ||
|
744 | return htmlfill.render( | |
|
745 | render('admin/settings/settings.html'), | |
|
746 | defaults=self._form_defaults(), | |
|
747 | encoding='UTF-8', | |
|
748 | force_defaults=False) | |
|
749 | ||
|
750 | 770 | def _form_defaults(self): |
|
751 | 771 | defaults = SettingsModel().get_all_settings() |
|
752 | 772 | defaults.update(self._get_hg_ui_settings()) |
|
753 | 773 | defaults.update({ |
|
754 | 774 | 'new_svn_branch': '', |
|
755 | 775 | 'new_svn_tag': '', |
|
756 | 776 | }) |
|
757 | 777 | return defaults |
|
758 | 778 | |
|
759 | 779 | |
|
760 | 780 | # :param key: name of the setting including the 'rhodecode_' prefix |
|
761 | 781 | # :param type: the RhodeCodeSetting type to use. |
|
762 | 782 | # :param group: the i18ned group in which we should display this setting 

763 | 783 | # :param label: the i18ned label we should display for this setting 

764 | 784 | # :param help: the i18ned help we should display for this setting 
|
765 | 785 | LabSetting = collections.namedtuple( |
|
766 | 786 | 'LabSetting', ('key', 'type', 'group', 'label', 'help')) |
|
767 | 787 | |
|
768 | 788 | |
|
769 | 789 | # This list has to be kept in sync with the form |
|
770 | 790 | # rhodecode.model.forms.LabsSettingsForm. |
|
771 | 791 | _LAB_SETTINGS = [ |
|
772 | 792 | LabSetting( |
|
773 | 793 | key='rhodecode_hg_use_rebase_for_merging', |
|
774 | 794 | type='bool', |
|
775 | 795 | group=lazy_ugettext('Mercurial server-side merge'), |
|
776 | 796 | label=lazy_ugettext('Use rebase instead of creating a merge commit when merging via web interface'), |
|
777 | 797 | help='' # Do not translate the empty string! |
|
778 | 798 | ), |
|
779 | 799 | LabSetting( |
|
780 | 800 | key='rhodecode_proxy_subversion_http_requests', |
|
781 | 801 | type='bool', |
|
782 | 802 | group=lazy_ugettext('Subversion HTTP Support'), |
|
783 | 803 | label=lazy_ugettext('Proxy subversion HTTP requests'), |
|
784 | 804 | help='' # Do not translate the empty string! |
|
785 | 805 | ), |
|
786 | 806 | LabSetting( |
|
787 | 807 | key='rhodecode_subversion_http_server_url', |
|
788 | 808 | type='str', |
|
789 | 809 | group=lazy_ugettext('Subversion HTTP Server URL'), |
|
790 | 810 | label='', # Do not translate the empty string! |
|
791 | 811 | help=lazy_ugettext('e.g. http://localhost:8080/') |
|
792 | 812 | ), |
|
793 | 813 | ] |
|
794 | ||
|
795 | ||
|
796 | NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url']) | |
|
797 | ||
|
798 | ||
|
799 | class NavEntry(object): | |
|
800 | ||
|
801 | def __init__(self, key, name, view_name, pyramid=False): | |
|
802 | self.key = key | |
|
803 | self.name = name | |
|
804 | self.view_name = view_name | |
|
805 | self.pyramid = pyramid | |
|
806 | ||
|
807 | def generate_url(self, request): | |
|
808 | if self.pyramid: | |
|
809 | if hasattr(request, 'route_path'): | |
|
810 | return request.route_path(self.view_name) | |
|
811 | else: | |
|
812 | # TODO: johbo: Remove this after migrating to pyramid. | |
|
813 | # We need the pyramid request here to generate URLs to pyramid | |
|
814 | # views from within pylons views. | |
|
815 | from pyramid.threadlocal import get_current_request | |
|
816 | pyramid_request = get_current_request() | |
|
817 | return pyramid_request.route_path(self.view_name) | |
|
818 | else: | |
|
819 | return url(self.view_name) | |
|
820 | ||
|
821 | ||
|
822 | class NavigationRegistry(object): | |
|
823 | ||
|
824 | _base_entries = [ | |
|
825 | NavEntry('global', lazy_ugettext('Global'), 'admin_settings_global'), | |
|
826 | NavEntry('vcs', lazy_ugettext('VCS'), 'admin_settings_vcs'), | |
|
827 | NavEntry('visual', lazy_ugettext('Visual'), 'admin_settings_visual'), | |
|
828 | NavEntry('mapping', lazy_ugettext('Remap and Rescan'), | |
|
829 | 'admin_settings_mapping'), | |
|
830 | NavEntry('issuetracker', lazy_ugettext('Issue Tracker'), | |
|
831 | 'admin_settings_issuetracker'), | |
|
832 | NavEntry('email', lazy_ugettext('Email'), 'admin_settings_email'), | |
|
833 | NavEntry('hooks', lazy_ugettext('Hooks'), 'admin_settings_hooks'), | |
|
834 | NavEntry('search', lazy_ugettext('Full Text Search'), | |
|
835 | 'admin_settings_search'), | |
|
836 | NavEntry('system', lazy_ugettext('System Info'), | |
|
837 | 'admin_settings_system'), | |
|
838 | NavEntry('open_source', lazy_ugettext('Open Source Licenses'), | |
|
839 | 'admin_settings_open_source'), | |
|
840 | # TODO: marcink: we disable supervisor now until the supervisor stats | |
|
841 | # page is fixed in the nix configuration | |
|
842 | # NavEntry('supervisor', lazy_ugettext('Supervisor'), | |
|
843 | # 'admin_settings_supervisor'), | |
|
844 | ] | |
|
845 | ||
|
846 | def __init__(self): | |
|
847 | self._registered_entries = collections.OrderedDict([ | |
|
848 | (item.key, item) for item in self.__class__._base_entries | |
|
849 | ]) | |
|
850 | ||
|
851 | # Add the labs entry when it's activated. | |
|
852 | labs_active = str2bool( | |
|
853 | rhodecode.CONFIG.get('labs_settings_active', 'false')) | |
|
854 | if labs_active: | |
|
855 | self.add_entry( | |
|
856 | NavEntry('labs', lazy_ugettext('Labs'), 'admin_settings_labs')) | |
|
857 | ||
|
858 | def add_entry(self, entry): | |
|
859 | self._registered_entries[entry.key] = entry | |
|
860 | ||
|
861 | def get_navlist(self, request): | |
|
862 | navlist = [NavListEntry(i.key, i.name, i.generate_url(request)) | |
|
863 | for i in self._registered_entries.values()] | |
|
864 | return navlist | |
|
865 | ||
|
866 | navigation = NavigationRegistry() |
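The navigation code above mixes pylons `url()` and Pyramid `route_path` URL generation. Stripped of that framework bridging, the registry pattern reduces to the sketch below: an insertion-ordered dict of keyed entries, with optional entries (like the "labs" page) appended only when a flag is active. The plain callable passed to `get_navlist` stands in for `request.route_path` and is an assumption of this sketch, not the real API.

```python
import collections

NavListEntry = collections.namedtuple('NavListEntry', ['key', 'name', 'url'])


class NavEntry(object):
    def __init__(self, key, name, view_name):
        self.key = key
        self.name = name
        self.view_name = view_name

    def generate_url(self, route_path):
        # route_path is any callable mapping a view name to a URL.
        return route_path(self.view_name)


class NavigationRegistry(object):
    _base_entries = [
        NavEntry('global', 'Global', 'admin_settings_global'),
        NavEntry('vcs', 'VCS', 'admin_settings_vcs'),
    ]

    def __init__(self, labs_active=False):
        # OrderedDict keeps menu order while allowing keyed overrides.
        self._registered_entries = collections.OrderedDict(
            (item.key, item) for item in self._base_entries)
        if labs_active:
            self.add_entry(NavEntry('labs', 'Labs', 'admin_settings_labs'))

    def add_entry(self, entry):
        self._registered_entries[entry.key] = entry

    def get_navlist(self, route_path):
        return [NavListEntry(e.key, e.name, e.generate_url(route_path))
                for e in self._registered_entries.values()]


nav = NavigationRegistry(labs_active=True)
navlist = nav.get_navlist(lambda view_name: '/_admin/settings/' + view_name)
print([entry.key for entry in navlist])
```

Because entries are keyed, `add_entry` also lets a plugin replace an existing menu item by reusing its key, without disturbing the order of the others.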
@@ -1,480 +1,480 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2011-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | User Groups crud controller for pylons |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | 25 | import logging |
|
26 | 26 | import formencode |
|
27 | 27 | |
|
28 | 28 | from formencode import htmlfill |
|
29 | 29 | from pylons import request, tmpl_context as c, url, config |
|
30 | 30 | from pylons.controllers.util import redirect |
|
31 | 31 | from pylons.i18n.translation import _ |
|
32 | 32 | |
|
33 | 33 | from sqlalchemy.orm import joinedload |
|
34 | 34 | |
|
35 | 35 | from rhodecode.lib import auth |
|
36 | 36 | from rhodecode.lib import helpers as h |
|
37 | 37 | from rhodecode.lib.exceptions import UserGroupAssignedException,\ |
|
38 | 38 | RepoGroupAssignmentError |
|
39 | 39 | from rhodecode.lib.utils import jsonify, action_logger |
|
40 | 40 | from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int |
|
41 | 41 | from rhodecode.lib.auth import ( |
|
42 | 42 | LoginRequired, NotAnonymous, HasUserGroupPermissionAnyDecorator, |
|
43 | 43 | HasPermissionAnyDecorator) |
|
44 | 44 | from rhodecode.lib.base import BaseController, render |
|
45 | 45 | from rhodecode.model.permission import PermissionModel |
|
46 | 46 | from rhodecode.model.scm import UserGroupList |
|
47 | 47 | from rhodecode.model.user_group import UserGroupModel |
|
48 | 48 | from rhodecode.model.db import ( |
|
49 | 49 | User, UserGroup, UserGroupRepoToPerm, UserGroupRepoGroupToPerm) |
|
50 | 50 | from rhodecode.model.forms import ( |
|
51 | 51 | UserGroupForm, UserGroupPermsForm, UserIndividualPermissionsForm, |
|
52 | 52 | UserPermissionsForm) |
|
53 | 53 | from rhodecode.model.meta import Session |
|
54 | 54 | from rhodecode.lib.utils import action_logger |
|
55 | 55 | from rhodecode.lib.ext_json import json |
|
56 | 56 | |
|
57 | 57 | log = logging.getLogger(__name__) |
|
58 | 58 | |
|
59 | 59 | |
|
60 | 60 | class UserGroupsController(BaseController): |
|
61 | 61 | """REST Controller styled on the Atom Publishing Protocol""" |
|
62 | 62 | |
|
63 | 63 | @LoginRequired() |
|
64 | 64 | def __before__(self): |
|
65 | 65 | super(UserGroupsController, self).__before__() |
|
66 | 66 | c.available_permissions = config['available_permissions'] |
|
67 | 67 | PermissionModel().set_global_permission_choices(c, translator=_) |
|
68 | 68 | |
|
69 | 69 | def __load_data(self, user_group_id): |
|
70 | 70 | c.group_members_obj = [x.user for x in c.user_group.members] |
|
71 | 71 | c.group_members_obj.sort(key=lambda u: u.username.lower()) |
|
72 | 72 | |
|
73 | 73 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
74 | 74 | |
|
75 | 75 | c.available_members = [(x.user_id, x.username) |
|
76 | 76 | for x in User.query().all()] |
|
77 | 77 | c.available_members.sort(key=lambda u: u[1].lower()) |
|
78 | 78 | |
|
79 | 79 | def __load_defaults(self, user_group_id): |
|
80 | 80 | """ |
|
81 | 81 | Load defaults settings for edit, and update |
|
82 | 82 | |
|
83 | 83 | :param user_group_id: |
|
84 | 84 | """ |
|
85 | 85 | user_group = UserGroup.get_or_404(user_group_id) |
|
86 | 86 | data = user_group.get_dict() |
|
87 | 87 | # fill owner |
|
88 | 88 | if user_group.user: |
|
89 | 89 | data.update({'user': user_group.user.username}) |
|
90 | 90 | else: |
|
91 | replacement_user = User.get_first_admin().username | |
|
91 | replacement_user = User.get_first_super_admin().username | |
|
92 | 92 | data.update({'user': replacement_user}) |
|
93 | 93 | return data |
|
94 | 94 | |
|
95 | 95 | def _revoke_perms_on_yourself(self, form_result): |
|
96 | 96 | _updates = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
97 | 97 | form_result['perm_updates']) |
|
98 | 98 | _additions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
99 | 99 | form_result['perm_additions']) |
|
100 | 100 | _deletions = filter(lambda u: c.rhodecode_user.user_id == int(u[0]), |
|
101 | 101 | form_result['perm_deletions']) |
|
102 | 102 | admin_perm = 'usergroup.admin' |
|
103 | 103 | if _updates and _updates[0][1] != admin_perm or \ |
|
104 | 104 | _additions and _additions[0][1] != admin_perm or \ |
|
105 | 105 | _deletions and _deletions[0][1] != admin_perm: |
|
106 | 106 | return True |
|
107 | 107 | return False |
|
108 | 108 | |
|
109 | 109 | # permission check inside |
|
110 | 110 | @NotAnonymous() |
|
111 | 111 | def index(self): |
|
112 | 112 | """GET /users_groups: All items in the collection""" |
|
113 | 113 | # url('users_groups') |
|
114 | 114 | |
|
115 | 115 | from rhodecode.lib.utils import PartialRenderer |
|
116 | 116 | _render = PartialRenderer('data_table/_dt_elements.html') |
|
117 | 117 | |
|
118 | 118 | def user_group_name(user_group_id, user_group_name): |
|
119 | 119 | return _render("user_group_name", user_group_id, user_group_name) |
|
120 | 120 | |
|
121 | 121 | def user_group_actions(user_group_id, user_group_name): |
|
122 | 122 | return _render("user_group_actions", user_group_id, user_group_name) |
|
123 | 123 | |
|
124 | 124 | ## json generate |
|
125 | 125 | group_iter = UserGroupList(UserGroup.query().all(), |
|
126 | 126 | perm_set=['usergroup.admin']) |
|
127 | 127 | |
|
128 | 128 | user_groups_data = [] |
|
129 | 129 | for user_gr in group_iter: |
|
130 | 130 | user_groups_data.append({ |
|
131 | 131 | "group_name": user_group_name( |
|
132 | 132 | user_gr.users_group_id, h.escape(user_gr.users_group_name)), |
|
133 | 133 | "group_name_raw": user_gr.users_group_name, |
|
134 | 134 | "desc": h.escape(user_gr.user_group_description), |
|
135 | 135 | "members": len(user_gr.members), |
|
136 | 136 | "active": h.bool2icon(user_gr.users_group_active), |
|
137 | 137 | "owner": h.escape(h.link_to_user(user_gr.user.username)), |
|
138 | 138 | "action": user_group_actions( |
|
139 | 139 | user_gr.users_group_id, user_gr.users_group_name) |
|
140 | 140 | }) |
|
141 | 141 | |
|
142 | 142 | c.data = json.dumps(user_groups_data) |
|
143 | 143 | return render('admin/user_groups/user_groups.html') |
|
144 | 144 | |
|
145 | 145 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') |
|
146 | 146 | @auth.CSRFRequired() |
|
147 | 147 | def create(self): |
|
148 | 148 | """POST /users_groups: Create a new item""" |
|
149 | 149 | # url('users_groups') |
|
150 | 150 | |
|
151 | 151 | users_group_form = UserGroupForm()() |
|
152 | 152 | try: |
|
153 | 153 | form_result = users_group_form.to_python(dict(request.POST)) |
|
154 | 154 | user_group = UserGroupModel().create( |
|
155 | 155 | name=form_result['users_group_name'], |
|
156 | 156 | description=form_result['user_group_description'], |
|
157 | 157 | owner=c.rhodecode_user.user_id, |
|
158 | 158 | active=form_result['users_group_active']) |
|
159 | 159 | Session().flush() |
|
160 | 160 | |
|
161 | 161 | user_group_name = form_result['users_group_name'] |
|
162 | 162 | action_logger(c.rhodecode_user, |
|
163 | 163 | 'admin_created_users_group:%s' % user_group_name, |
|
164 | 164 | None, self.ip_addr, self.sa) |
|
165 | 165 | user_group_link = h.link_to(h.escape(user_group_name), |
|
166 | 166 | url('edit_users_group', |
|
167 | 167 | user_group_id=user_group.users_group_id)) |
|
168 | 168 | h.flash(h.literal(_('Created user group %(user_group_link)s') |
|
169 | 169 | % {'user_group_link': user_group_link}), |
|
170 | 170 | category='success') |
|
171 | 171 | Session().commit() |
|
172 | 172 | except formencode.Invalid as errors: |
|
173 | 173 | return htmlfill.render( |
|
174 | 174 | render('admin/user_groups/user_group_add.html'), |
|
175 | 175 | defaults=errors.value, |
|
176 | 176 | errors=errors.error_dict or {}, |
|
177 | 177 | prefix_error=False, |
|
178 | 178 | encoding="UTF-8", |
|
179 | 179 | force_defaults=False) |
|
180 | 180 | except Exception: |
|
181 | 181 | log.exception("Exception creating user group") |
|
182 | 182 | h.flash(_('Error occurred during creation of user group %s') \ |
|
183 | 183 | % request.POST.get('users_group_name'), category='error') |
|
184 | 184 | |
|
185 | 185 | return redirect( |
|
186 | 186 | url('edit_users_group', user_group_id=user_group.users_group_id)) |
|
187 | 187 | |
|
188 | 188 | @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true') |
|
189 | 189 | def new(self): |
|
190 | 190 | """GET /user_groups/new: Form to create a new item""" |
|
191 | 191 | # url('new_users_group') |
|
192 | 192 | return render('admin/user_groups/user_group_add.html') |
|
193 | 193 | |
|
194 | 194 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
195 | 195 | @auth.CSRFRequired() |
|
196 | 196 | def update(self, user_group_id): |
|
197 | 197 | """PUT /user_groups/user_group_id: Update an existing item""" |
|
198 | 198 | # Forms posted to this method should contain a hidden field: |
|
199 | 199 | # <input type="hidden" name="_method" value="PUT" /> |
|
200 | 200 | # Or using helpers: |
|
201 | 201 | # h.form(url('users_group', user_group_id=ID), |
|
202 | 202 | # method='put') |
|
203 | 203 | # url('users_group', user_group_id=ID) |
|
204 | 204 | |
|
205 | 205 | user_group_id = safe_int(user_group_id) |
|
206 | 206 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
207 | 207 | c.active = 'settings' |
|
208 | 208 | self.__load_data(user_group_id) |
|
209 | 209 | |
|
210 | 210 | available_members = [safe_unicode(x[0]) for x in c.available_members] |
|
211 | 211 | |
|
212 | users_group_form = UserGroupForm( |
|
|
213 |
|
|
|
214 |
|
|
|
212 | users_group_form = UserGroupForm( | |
|
213 | edit=True, old_data=c.user_group.get_dict(), | |
|
214 | available_members=available_members, allow_disabled=True)() | |
|
215 | 215 | |
|
216 | 216 | try: |
|
217 | 217 | form_result = users_group_form.to_python(request.POST) |
|
218 | 218 | UserGroupModel().update(c.user_group, form_result) |
|
219 | 219 | gr = form_result['users_group_name'] |
|
220 | 220 | action_logger(c.rhodecode_user, |
|
221 | 221 | 'admin_updated_users_group:%s' % gr, |
|
222 | 222 | None, self.ip_addr, self.sa) |
|
223 | 223 | h.flash(_('Updated user group %s') % gr, category='success') |
|
224 | 224 | Session().commit() |
|
225 | 225 | except formencode.Invalid as errors: |
|
226 | 226 | defaults = errors.value |
|
227 | 227 | e = errors.error_dict or {} |
|
228 | 228 | |
|
229 | 229 | return htmlfill.render( |
|
230 | 230 | render('admin/user_groups/user_group_edit.html'), |
|
231 | 231 | defaults=defaults, |
|
232 | 232 | errors=e, |
|
233 | 233 | prefix_error=False, |
|
234 | 234 | encoding="UTF-8", |
|
235 | 235 | force_defaults=False) |
|
236 | 236 | except Exception: |
|
237 | 237 | log.exception("Exception during update of user group") |
|
238 | 238 | h.flash(_('Error occurred during update of user group %s') |
|
239 | 239 | % request.POST.get('users_group_name'), category='error') |
|
240 | 240 | |
|
241 | 241 | return redirect(url('edit_users_group', user_group_id=user_group_id)) |
|
242 | 242 | |
|
243 | 243 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
244 | 244 | @auth.CSRFRequired() |
|
245 | 245 | def delete(self, user_group_id): |
|
246 | 246 | """DELETE /user_groups/user_group_id: Delete an existing item""" |
|
247 | 247 | # Forms posted to this method should contain a hidden field: |
|
248 | 248 | # <input type="hidden" name="_method" value="DELETE" /> |
|
249 | 249 | # Or using helpers: |
|
250 | 250 | # h.form(url('users_group', user_group_id=ID), |
|
251 | 251 | # method='delete') |
|
252 | 252 | # url('users_group', user_group_id=ID) |
|
253 | 253 | user_group_id = safe_int(user_group_id) |
|
254 | 254 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
255 | 255 | force = str2bool(request.POST.get('force')) |
|
256 | 256 | |
|
257 | 257 | try: |
|
258 | 258 | UserGroupModel().delete(c.user_group, force=force) |
|
259 | 259 | Session().commit() |
|
260 | 260 | h.flash(_('Successfully deleted user group'), category='success') |
|
261 | 261 | except UserGroupAssignedException as e: |
|
262 | 262 | h.flash(str(e), category='error') |
|
263 | 263 | except Exception: |
|
264 | 264 | log.exception("Exception during deletion of user group") |
|
265 | 265 | h.flash(_('An error occurred during deletion of user group'), |
|
266 | 266 | category='error') |
|
267 | 267 | return redirect(url('users_groups')) |
|
268 | 268 | |
|
269 | 269 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
270 | 270 | def edit(self, user_group_id): |
|
271 | 271 | """GET /user_groups/user_group_id/edit: Form to edit an existing item""" |
|
272 | 272 | # url('edit_users_group', user_group_id=ID) |
|
273 | 273 | |
|
274 | 274 | user_group_id = safe_int(user_group_id) |
|
275 | 275 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
276 | 276 | c.active = 'settings' |
|
277 | 277 | self.__load_data(user_group_id) |
|
278 | 278 | |
|
279 | 279 | defaults = self.__load_defaults(user_group_id) |
|
280 | 280 | |
|
281 | 281 | return htmlfill.render( |
|
282 | 282 | render('admin/user_groups/user_group_edit.html'), |
|
283 | 283 | defaults=defaults, |
|
284 | 284 | encoding="UTF-8", |
|
285 | 285 | force_defaults=False |
|
286 | 286 | ) |
|
287 | 287 | |
|
288 | 288 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
289 | 289 | def edit_perms(self, user_group_id): |
|
290 | 290 | user_group_id = safe_int(user_group_id) |
|
291 | 291 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
292 | 292 | c.active = 'perms' |
|
293 | 293 | |
|
294 | 294 | defaults = {} |
|
295 | 295 | # fill user group users |
|
296 | 296 | for p in c.user_group.user_user_group_to_perm: |
|
297 | 297 | defaults.update({'u_perm_%s' % p.user.user_id: |
|
298 | 298 | p.permission.permission_name}) |
|
299 | 299 | |
|
300 | 300 | for p in c.user_group.user_group_user_group_to_perm: |
|
301 | 301 | defaults.update({'g_perm_%s' % p.user_group.users_group_id: |
|
302 | 302 | p.permission.permission_name}) |
|
303 | 303 | |
|
304 | 304 | return htmlfill.render( |
|
305 | 305 | render('admin/user_groups/user_group_edit.html'), |
|
306 | 306 | defaults=defaults, |
|
307 | 307 | encoding="UTF-8", |
|
308 | 308 | force_defaults=False |
|
309 | 309 | ) |
|
310 | 310 | |
|
311 | 311 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
312 | 312 | @auth.CSRFRequired() |
|
313 | 313 | def update_perms(self, user_group_id): |
|
314 | 314 | """ |
|
315 | 315 | grant permission for given usergroup |
|
316 | 316 | |
|
317 | 317 | :param user_group_id: |
|
318 | 318 | """ |
|
319 | 319 | user_group_id = safe_int(user_group_id) |
|
320 | 320 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
321 | 321 | form = UserGroupPermsForm()().to_python(request.POST) |
|
322 | 322 | |
|
323 | 323 | if not c.rhodecode_user.is_admin: |
|
324 | 324 | if self._revoke_perms_on_yourself(form): |
|
325 | 325 | msg = _('Cannot change permission for yourself as admin') |
|
326 | 326 | h.flash(msg, category='warning') |
|
327 | 327 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) |
|
328 | 328 | |
|
329 | 329 | try: |
|
330 | 330 | UserGroupModel().update_permissions(user_group_id, |
|
331 | 331 | form['perm_additions'], form['perm_updates'], form['perm_deletions']) |
|
332 | 332 | except RepoGroupAssignmentError: |
|
333 | 333 | h.flash(_('Target group cannot be the same'), category='error') |
|
334 | 334 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) |
|
335 | 335 | #TODO: implement this |
|
336 | 336 | #action_logger(c.rhodecode_user, 'admin_changed_repo_permissions', |
|
337 | 337 | # repo_name, self.ip_addr, self.sa) |
|
338 | 338 | Session().commit() |
|
339 | 339 | h.flash(_('User Group permissions updated'), category='success') |
|
340 | 340 | return redirect(url('edit_user_group_perms', user_group_id=user_group_id)) |
|
341 | 341 | |
|
342 | 342 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
343 | 343 | def edit_perms_summary(self, user_group_id): |
|
344 | 344 | user_group_id = safe_int(user_group_id) |
|
345 | 345 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
346 | 346 | c.active = 'perms_summary' |
|
347 | 347 | permissions = { |
|
348 | 348 | 'repositories': {}, |
|
349 | 349 | 'repositories_groups': {}, |
|
350 | 350 | } |
|
351 | 351 | ugroup_repo_perms = UserGroupRepoToPerm.query()\ |
|
352 | 352 | .options(joinedload(UserGroupRepoToPerm.permission))\ |
|
353 | 353 | .options(joinedload(UserGroupRepoToPerm.repository))\ |
|
354 | 354 | .filter(UserGroupRepoToPerm.users_group_id == user_group_id)\ |
|
355 | 355 | .all() |
|
356 | 356 | |
|
357 | 357 | for gr in ugroup_repo_perms: |
|
358 | 358 | permissions['repositories'][gr.repository.repo_name] \ |
|
359 | 359 | = gr.permission.permission_name |
|
360 | 360 | |
|
361 | 361 | ugroup_group_perms = UserGroupRepoGroupToPerm.query()\ |
|
362 | 362 | .options(joinedload(UserGroupRepoGroupToPerm.permission))\ |
|
363 | 363 | .options(joinedload(UserGroupRepoGroupToPerm.group))\ |
|
364 | 364 | .filter(UserGroupRepoGroupToPerm.users_group_id == user_group_id)\ |
|
365 | 365 | .all() |
|
366 | 366 | |
|
367 | 367 | for gr in ugroup_group_perms: |
|
368 | 368 | permissions['repositories_groups'][gr.group.group_name] \ |
|
369 | 369 | = gr.permission.permission_name |
|
370 | 370 | c.permissions = permissions |
|
371 | 371 | return render('admin/user_groups/user_group_edit.html') |
|
372 | 372 | |
|
373 | 373 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
374 | 374 | def edit_global_perms(self, user_group_id): |
|
375 | 375 | user_group_id = safe_int(user_group_id) |
|
376 | 376 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
377 | 377 | c.active = 'global_perms' |
|
378 | 378 | |
|
379 | 379 | c.default_user = User.get_default_user() |
|
380 | 380 | defaults = c.user_group.get_dict() |
|
381 | 381 | defaults.update(c.default_user.get_default_perms(suffix='_inherited')) |
|
382 | 382 | defaults.update(c.user_group.get_default_perms()) |
|
383 | 383 | |
|
384 | 384 | return htmlfill.render( |
|
385 | 385 | render('admin/user_groups/user_group_edit.html'), |
|
386 | 386 | defaults=defaults, |
|
387 | 387 | encoding="UTF-8", |
|
388 | 388 | force_defaults=False |
|
389 | 389 | ) |
|
390 | 390 | |
|
391 | 391 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
392 | 392 | @auth.CSRFRequired() |
|
393 | 393 | def update_global_perms(self, user_group_id): |
|
394 | 394 | """PUT /users_perm/user_group_id: Update an existing item""" |
|
395 | 395 | # url('users_group_perm', user_group_id=ID, method='put') |
|
396 | 396 | user_group_id = safe_int(user_group_id) |
|
397 | 397 | user_group = UserGroup.get_or_404(user_group_id) |
|
398 | 398 | c.active = 'global_perms' |
|
399 | 399 | |
|
400 | 400 | try: |
|
401 | 401 | # first stage that verifies the checkbox |
|
402 | 402 | _form = UserIndividualPermissionsForm() |
|
403 | 403 | form_result = _form.to_python(dict(request.POST)) |
|
404 | 404 | inherit_perms = form_result['inherit_default_permissions'] |
|
405 | 405 | user_group.inherit_default_permissions = inherit_perms |
|
406 | 406 | Session().add(user_group) |
|
407 | 407 | |
|
408 | 408 | if not inherit_perms: |
|
409 | 409 | # only update the individual ones if we uncheck the flag |
|
410 | 410 | _form = UserPermissionsForm( |
|
411 | 411 | [x[0] for x in c.repo_create_choices], |
|
412 | 412 | [x[0] for x in c.repo_create_on_write_choices], |
|
413 | 413 | [x[0] for x in c.repo_group_create_choices], |
|
414 | 414 | [x[0] for x in c.user_group_create_choices], |
|
415 | 415 | [x[0] for x in c.fork_choices], |
|
416 | 416 | [x[0] for x in c.inherit_default_permission_choices])() |
|
417 | 417 | |
|
418 | 418 | form_result = _form.to_python(dict(request.POST)) |
|
419 | 419 | form_result.update({'perm_user_group_id': user_group.users_group_id}) |
|
420 | 420 | |
|
421 | 421 | PermissionModel().update_user_group_permissions(form_result) |
|
422 | 422 | |
|
423 | 423 | Session().commit() |
|
424 | 424 | h.flash(_('User Group global permissions updated successfully'), |
|
425 | 425 | category='success') |
|
426 | 426 | |
|
427 | 427 | except formencode.Invalid as errors: |
|
428 | 428 | defaults = errors.value |
|
429 | 429 | c.user_group = user_group |
|
430 | 430 | return htmlfill.render( |
|
431 | 431 | render('admin/user_groups/user_group_edit.html'), |
|
432 | 432 | defaults=defaults, |
|
433 | 433 | errors=errors.error_dict or {}, |
|
434 | 434 | prefix_error=False, |
|
435 | 435 | encoding="UTF-8", |
|
436 | 436 | force_defaults=False) |
|
437 | 437 | |
|
438 | 438 | except Exception: |
|
439 | 439 | log.exception("Exception during permissions saving") |
|
440 | 440 | h.flash(_('An error occurred during permissions saving'), |
|
441 | 441 | category='error') |
|
442 | 442 | |
|
443 | 443 | return redirect(url('edit_user_group_global_perms', user_group_id=user_group_id)) |
|
444 | 444 | |
|
445 | 445 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
446 | 446 | def edit_advanced(self, user_group_id): |
|
447 | 447 | user_group_id = safe_int(user_group_id) |
|
448 | 448 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
449 | 449 | c.active = 'advanced' |
|
450 | 450 | c.group_members_obj = sorted( |
|
451 | 451 | (x.user for x in c.user_group.members), |
|
452 | 452 | key=lambda u: u.username.lower()) |
|
453 | 453 | |
|
454 | 454 | c.group_to_repos = sorted( |
|
455 | 455 | (x.repository for x in c.user_group.users_group_repo_to_perm), |
|
456 | 456 | key=lambda u: u.repo_name.lower()) |
|
457 | 457 | |
|
458 | 458 | c.group_to_repo_groups = sorted( |
|
459 | 459 | (x.group for x in c.user_group.users_group_repo_group_to_perm), |
|
460 | 460 | key=lambda u: u.group_name.lower()) |
|
461 | 461 | |
|
462 | 462 | return render('admin/user_groups/user_group_edit.html') |
|
463 | 463 | |
|
464 | 464 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
465 | 465 | def edit_members(self, user_group_id): |
|
466 | 466 | user_group_id = safe_int(user_group_id) |
|
467 | 467 | c.user_group = UserGroup.get_or_404(user_group_id) |
|
468 | 468 | c.active = 'members' |
|
469 | 469 | c.group_members_obj = sorted((x.user for x in c.user_group.members), |
|
470 | 470 | key=lambda u: u.username.lower()) |
|
471 | 471 | |
|
472 | 472 | group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
473 | 473 | |
|
474 | 474 | if request.is_xhr: |
|
475 | 475 | return jsonify(lambda *a, **k: { |
|
476 | 476 | 'members': group_members |
|
477 | 477 | }) |
|
478 | 478 | |
|
479 | 479 | c.group_members = group_members |
|
480 | 480 | return render('admin/user_groups/user_group_edit.html') |
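The `_revoke_perms_on_yourself` guard in the controller above relies on Python binding `and` tighter than `or` in its chained condition. A standalone sketch of that check (function and form-dict shapes simplified from the controller; the three-tuple `(user_id, permission, type)` rows are assumed from how the form results are indexed):

```python
# Self-lockout guard: flag any change to your own row that is not the
# 'usergroup.admin' permission, so an admin cannot strip their own access.
ADMIN_PERM = 'usergroup.admin'

def revokes_perms_on_yourself(own_user_id, form_result):
    def mine(changes):
        # keep only rows targeting the current user (ids arrive as strings)
        return [c for c in changes if int(c[0]) == own_user_id]

    updates = mine(form_result.get('perm_updates', []))
    additions = mine(form_result.get('perm_additions', []))
    deletions = mine(form_result.get('perm_deletions', []))
    # `and` binds tighter than `or`, mirroring the controller's condition:
    # (updates and not-admin) or (additions and not-admin) or (deletions ...)
    return bool(
        updates and updates[0][1] != ADMIN_PERM or
        additions and additions[0][1] != ADMIN_PERM or
        deletions and deletions[0][1] != ADMIN_PERM)

form = {'perm_updates': [('7', 'usergroup.read', 'user')],
        'perm_additions': [], 'perm_deletions': []}
print(revokes_perms_on_yourself(7, form))
```

Note the sketch, like the original, only inspects the first matching row of each change list; that is the behavior being illustrated, not a recommendation.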
@@ -1,717 +1,719 b'' | |||
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | Users crud controller for pylons |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | 25 | import logging |
|
26 | 26 | import formencode |
|
27 | 27 | |
|
28 | 28 | from formencode import htmlfill |
|
29 | 29 | from pylons import request, tmpl_context as c, url, config |
|
30 | 30 | from pylons.controllers.util import redirect |
|
31 | 31 | from pylons.i18n.translation import _ |
|
32 | 32 | |
|
33 | 33 | from rhodecode.authentication.plugins import auth_rhodecode |
|
34 | 34 | from rhodecode.lib.exceptions import ( |
|
35 | 35 | DefaultUserException, UserOwnsReposException, UserOwnsRepoGroupsException, |
|
36 | 36 | UserOwnsUserGroupsException, UserCreationError) |
|
37 | 37 | from rhodecode.lib import helpers as h |
|
38 | 38 | from rhodecode.lib import auth |
|
39 | 39 | from rhodecode.lib.auth import ( |
|
40 | 40 | LoginRequired, HasPermissionAllDecorator, AuthUser, generate_auth_token) |
|
41 | 41 | from rhodecode.lib.base import BaseController, render |
|
42 | 42 | from rhodecode.model.auth_token import AuthTokenModel |
|
43 | 43 | |
|
44 | 44 | from rhodecode.model.db import ( |
|
45 | 45 | PullRequestReviewers, User, UserEmailMap, UserIpMap, RepoGroup) |
|
46 | 46 | from rhodecode.model.forms import ( |
|
47 | 47 | UserForm, UserPermissionsForm, UserIndividualPermissionsForm) |
|
48 | 48 | from rhodecode.model.user import UserModel |
|
49 | 49 | from rhodecode.model.meta import Session |
|
50 | 50 | from rhodecode.model.permission import PermissionModel |
|
51 | 51 | from rhodecode.lib.utils import action_logger |
|
52 | 52 | from rhodecode.lib.ext_json import json |
|
53 | 53 | from rhodecode.lib.utils2 import datetime_to_time, safe_int |
|
54 | 54 | |
|
55 | 55 | log = logging.getLogger(__name__) |
|
56 | 56 | |
|
57 | 57 | |
|
58 | 58 | class UsersController(BaseController): |
|
59 | 59 | """REST Controller styled on the Atom Publishing Protocol""" |
|
60 | 60 | |
|
61 | 61 | @LoginRequired() |
|
62 | 62 | def __before__(self): |
|
63 | 63 | super(UsersController, self).__before__() |
|
64 | 64 | c.available_permissions = config['available_permissions'] |
|
65 | 65 | c.allowed_languages = [ |
|
66 | 66 | ('en', 'English (en)'), |
|
67 | 67 | ('de', 'German (de)'), |
|
68 | 68 | ('fr', 'French (fr)'), |
|
69 | 69 | ('it', 'Italian (it)'), |
|
70 | 70 | ('ja', 'Japanese (ja)'), |
|
71 | 71 | ('pl', 'Polish (pl)'), |
|
72 | 72 | ('pt', 'Portuguese (pt)'), |
|
73 | 73 | ('ru', 'Russian (ru)'), |
|
74 | 74 | ('zh', 'Chinese (zh)'), |
|
75 | 75 | ] |
|
76 | 76 | PermissionModel().set_global_permission_choices(c, translator=_) |
|
77 | 77 | |
|
78 | 78 | @HasPermissionAllDecorator('hg.admin') |
|
79 | 79 | def index(self): |
|
80 | 80 | """GET /users: All items in the collection""" |
|
81 | 81 | # url('users') |
|
82 | 82 | |
|
83 | 83 | from rhodecode.lib.utils import PartialRenderer |
|
84 | 84 | _render = PartialRenderer('data_table/_dt_elements.html') |
|
85 | 85 | |
|
86 | 86 | def grav_tmpl(user_email, size): |
|
87 | 87 | return _render("user_gravatar", user_email, size) |
|
88 | 88 | |
|
89 | 89 | def username(user_id, username): |
|
90 | 90 | return _render("user_name", user_id, username) |
|
91 | 91 | |
|
92 | 92 | def user_actions(user_id, username): |
|
93 | 93 | return _render("user_actions", user_id, username) |
|
94 | 94 | |
|
95 | 95 | # json generate |
|
96 | 96 | c.users_list = User.query()\ |
|
97 | 97 | .filter(User.username != User.DEFAULT_USER) \ |
|
98 | 98 | .all() |
|
99 | 99 | |
|
100 | 100 | users_data = [] |
|
101 | 101 | for user in c.users_list: |
|
102 | 102 | users_data.append({ |
|
103 | 103 | "gravatar": grav_tmpl(user.email, 20), |
|
104 | 104 | "username": h.link_to( |
|
105 | 105 | user.username, h.url('user_profile', username=user.username)), |
|
106 | 106 | "username_raw": user.username, |
|
107 | 107 | "email": user.email, |
|
108 | 108 | "first_name": h.escape(user.name), |
|
109 | 109 | "last_name": h.escape(user.lastname), |
|
110 | 110 | "last_login": h.format_date(user.last_login), |
|
111 | 111 | "last_login_raw": datetime_to_time(user.last_login), |
|
112 | 112 | "last_activity": h.format_date( |
|
113 | 113 | h.time_to_datetime(user.user_data.get('last_activity', 0))), |
|
114 | 114 | "last_activity_raw": user.user_data.get('last_activity', 0), |
|
115 | 115 | "active": h.bool2icon(user.active), |
|
116 | 116 | "active_raw": user.active, |
|
117 | 117 | "admin": h.bool2icon(user.admin), |
|
118 | 118 | "admin_raw": user.admin, |
|
119 | 119 | "extern_type": user.extern_type, |
|
120 | 120 | "extern_name": user.extern_name, |
|
121 | 121 | "action": user_actions(user.user_id, user.username), |
|
122 | 122 | }) |
|
123 | 123 | |
|
124 | 124 | |
|
125 | 125 | c.data = json.dumps(users_data) |
|
126 | 126 | return render('admin/users/users.html') |
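A note on the `index()` hunk above: each user is flattened into a plain dict before `json.dumps`, pairing rendered display values with `*_raw` twins (e.g. `last_login` / `last_login_raw`) so the grid can sort on raw data while showing formatted text. A stripped-down sketch of that row shape with stand-in data (the helper renderers from the diff are replaced by plain strings here; field names follow the diff, everything else is illustrative):

```python
import json
from datetime import datetime


def user_row(user):
    """Build one grid row: display values plus *_raw twins for sorting."""
    last_login = user.get('last_login')
    return {
        'username': user['username'],
        'email': user['email'],
        # formatted string for display, numeric timestamp for sorting
        'last_login': last_login.strftime('%Y-%m-%d') if last_login else '',
        'last_login_raw': last_login.timestamp() if last_login else 0,
        'active': 'yes' if user['active'] else 'no',
        'active_raw': user['active'],
    }


users = [{'username': 'demo', 'email': 'demo@example.com',
          'active': True, 'last_login': datetime(2016, 1, 2)}]
data = json.dumps([user_row(u) for u in users])
```

The same display/raw pairing appears for `active`/`active_raw` and `admin`/`admin_raw` in the diff.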
|
127 | 127 | |
|
128 | 128 | @HasPermissionAllDecorator('hg.admin') |
|
129 | 129 | @auth.CSRFRequired() |
|
130 | 130 | def create(self): |
|
131 | 131 | """POST /users: Create a new item""" |
|
132 | 132 | # url('users') |
|
133 | 133 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name |
|
134 | 134 | user_model = UserModel() |
|
135 | 135 | user_form = UserForm()() |
|
136 | 136 | try: |
|
137 | 137 | form_result = user_form.to_python(dict(request.POST)) |
|
138 | 138 | user = user_model.create(form_result) |
|
139 | 139 | Session().flush() |
|
140 | 140 | username = form_result['username'] |
|
141 | 141 | action_logger(c.rhodecode_user, 'admin_created_user:%s' % username, |
|
142 | 142 | None, self.ip_addr, self.sa) |
|
143 | 143 | |
|
144 | 144 | user_link = h.link_to(h.escape(username), |
|
145 | 145 | url('edit_user', |
|
146 | 146 | user_id=user.user_id)) |
|
147 | 147 | h.flash(h.literal(_('Created user %(user_link)s') |
|
148 | 148 | % {'user_link': user_link}), category='success') |
|
149 | 149 | Session().commit() |
|
150 | 150 | except formencode.Invalid as errors: |
|
151 | 151 | return htmlfill.render( |
|
152 | 152 | render('admin/users/user_add.html'), |
|
153 | 153 | defaults=errors.value, |
|
154 | 154 | errors=errors.error_dict or {}, |
|
155 | 155 | prefix_error=False, |
|
156 | 156 | encoding="UTF-8", |
|
157 | 157 | force_defaults=False) |
|
158 | 158 | except UserCreationError as e: |
|
159 | 159 | h.flash(e, 'error') |
|
160 | 160 | except Exception: |
|
161 | 161 | log.exception("Exception creation of user") |
|
162 | 162 | h.flash(_('Error occurred during creation of user %s') |
|
163 | 163 | % request.POST.get('username'), category='error') |
|
164 | 164 | return redirect(url('users')) |
|
165 | 165 | |
|
166 | 166 | @HasPermissionAllDecorator('hg.admin') |
|
167 | 167 | def new(self): |
|
168 | 168 | """GET /users/new: Form to create a new item""" |
|
169 | 169 | # url('new_user') |
|
170 | 170 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name |
|
171 | 171 | return render('admin/users/user_add.html') |
|
172 | 172 | |
|
173 | 173 | @HasPermissionAllDecorator('hg.admin') |
|
174 | 174 | @auth.CSRFRequired() |
|
175 | 175 | def update(self, user_id): |
|
176 | 176 | """PUT /users/user_id: Update an existing item""" |
|
177 | 177 | # Forms posted to this method should contain a hidden field: |
|
178 | 178 | # <input type="hidden" name="_method" value="PUT" /> |
|
179 | 179 | # Or using helpers: |
|
180 | 180 | # h.form(url('update_user', user_id=ID), |
|
181 | 181 | # method='put') |
|
182 | 182 | # url('user', user_id=ID) |
|
183 | 183 | user_id = safe_int(user_id) |
|
184 | 184 | c.user = User.get_or_404(user_id) |
|
185 | 185 | c.active = 'profile' |
|
186 | 186 | c.extern_type = c.user.extern_type |
|
187 | 187 | c.extern_name = c.user.extern_name |
|
188 | 188 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) |
|
189 | 189 | available_languages = [x[0] for x in c.allowed_languages] |
|
190 | 190 | _form = UserForm(edit=True, available_languages=available_languages, |
|
191 | 191 | old_data={'user_id': user_id, |
|
192 | 192 | 'email': c.user.email})() |
|
193 | 193 | form_result = {} |
|
194 | 194 | try: |
|
195 | 195 | form_result = _form.to_python(dict(request.POST)) |
|
196 | 196 | skip_attrs = ['extern_type', 'extern_name'] |
|
197 | 197 | # TODO: plugin should define if username can be updated |
|
198 | 198 | if c.extern_type != "rhodecode": |
|
199 | 199 | # forbid updating username for external accounts |
|
200 | 200 | skip_attrs.append('username') |
|
201 | 201 | |
|
202 | 202 | UserModel().update_user(user_id, skip_attrs=skip_attrs, **form_result) |
|
203 | 203 | usr = form_result['username'] |
|
204 | 204 | action_logger(c.rhodecode_user, 'admin_updated_user:%s' % usr, |
|
205 | 205 | None, self.ip_addr, self.sa) |
|
206 | 206 | h.flash(_('User updated successfully'), category='success') |
|
207 | 207 | Session().commit() |
|
208 | 208 | except formencode.Invalid as errors: |
|
209 | 209 | defaults = errors.value |
|
210 | 210 | e = errors.error_dict or {} |
|
211 | 211 | |
|
212 | 212 | return htmlfill.render( |
|
213 | 213 | render('admin/users/user_edit.html'), |
|
214 | 214 | defaults=defaults, |
|
215 | 215 | errors=e, |
|
216 | 216 | prefix_error=False, |
|
217 | 217 | encoding="UTF-8", |
|
218 | 218 | force_defaults=False) |
|
    | 219 |         except UserCreationError as e: |

    | 220 |             h.flash(e, 'error') |
|
219 | 221 | except Exception: |
|
220 | 222 | log.exception("Exception updating user") |
|
221 | 223 | h.flash(_('Error occurred during update of user %s') |
|
222 | 224 | % form_result.get('username'), category='error') |
|
223 | 225 | return redirect(url('edit_user', user_id=user_id)) |
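The comment block at the top of `update()` documents the Pylons/Routes method-override convention: HTML forms can only POST, so a hidden `_method` field (or the `h.form(..., method='put')` helper) tells the framework to dispatch the request as PUT. A minimal framework-independent sketch of that dispatch rule (function name is ours, for illustration only):

```python
def effective_method(request_method, form):
    """Return the HTTP method to dispatch on, honouring the hidden
    '_method' override field sent by HTML forms (case-insensitive)."""
    override = form.get('_method', '').upper()
    # only POSTs may be overridden, and only to PUT or DELETE
    if request_method == 'POST' and override in ('PUT', 'DELETE'):
        return override
    return request_method
```

This is why `delete()` further down carries the analogous `_method=DELETE` comment.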
|
224 | 226 | |
|
225 | 227 | @HasPermissionAllDecorator('hg.admin') |
|
226 | 228 | @auth.CSRFRequired() |
|
227 | 229 | def delete(self, user_id): |
|
228 | 230 | """DELETE /users/user_id: Delete an existing item""" |
|
229 | 231 | # Forms posted to this method should contain a hidden field: |
|
230 | 232 | # <input type="hidden" name="_method" value="DELETE" /> |
|
231 | 233 | # Or using helpers: |
|
232 | 234 | # h.form(url('delete_user', user_id=ID), |
|
233 | 235 | # method='delete') |
|
234 | 236 | # url('user', user_id=ID) |
|
235 | 237 | user_id = safe_int(user_id) |
|
236 | 238 | c.user = User.get_or_404(user_id) |
|
237 | 239 | |
|
238 | 240 | _repos = c.user.repositories |
|
239 | 241 | _repo_groups = c.user.repository_groups |
|
240 | 242 | _user_groups = c.user.user_groups |
|
241 | 243 | |
|
242 | 244 | handle_repos = None |
|
243 | 245 | handle_repo_groups = None |
|
244 | 246 | handle_user_groups = None |
|
245 | 247 | # dummy call for flash of handle |
|
246 | 248 | set_handle_flash_repos = lambda: None |
|
247 | 249 | set_handle_flash_repo_groups = lambda: None |
|
248 | 250 | set_handle_flash_user_groups = lambda: None |
|
249 | 251 | |
|
250 | 252 | if _repos and request.POST.get('user_repos'): |
|
251 | 253 | do = request.POST['user_repos'] |
|
252 | 254 | if do == 'detach': |
|
253 | 255 | handle_repos = 'detach' |
|
254 | 256 | set_handle_flash_repos = lambda: h.flash( |
|
255 | 257 | _('Detached %s repositories') % len(_repos), |
|
256 | 258 | category='success') |
|
257 | 259 | elif do == 'delete': |
|
258 | 260 | handle_repos = 'delete' |
|
259 | 261 | set_handle_flash_repos = lambda: h.flash( |
|
260 | 262 | _('Deleted %s repositories') % len(_repos), |
|
261 | 263 | category='success') |
|
262 | 264 | |
|
263 | 265 | if _repo_groups and request.POST.get('user_repo_groups'): |
|
264 | 266 | do = request.POST['user_repo_groups'] |
|
265 | 267 | if do == 'detach': |
|
266 | 268 | handle_repo_groups = 'detach' |
|
267 | 269 | set_handle_flash_repo_groups = lambda: h.flash( |
|
268 | 270 | _('Detached %s repository groups') % len(_repo_groups), |
|
269 | 271 | category='success') |
|
270 | 272 | elif do == 'delete': |
|
271 | 273 | handle_repo_groups = 'delete' |
|
272 | 274 | set_handle_flash_repo_groups = lambda: h.flash( |
|
273 | 275 | _('Deleted %s repository groups') % len(_repo_groups), |
|
274 | 276 | category='success') |
|
275 | 277 | |
|
276 | 278 | if _user_groups and request.POST.get('user_user_groups'): |
|
277 | 279 | do = request.POST['user_user_groups'] |
|
278 | 280 | if do == 'detach': |
|
279 | 281 | handle_user_groups = 'detach' |
|
280 | 282 | set_handle_flash_user_groups = lambda: h.flash( |
|
281 | 283 | _('Detached %s user groups') % len(_user_groups), |
|
282 | 284 | category='success') |
|
283 | 285 | elif do == 'delete': |
|
284 | 286 | handle_user_groups = 'delete' |
|
285 | 287 | set_handle_flash_user_groups = lambda: h.flash( |
|
286 | 288 | _('Deleted %s user groups') % len(_user_groups), |
|
287 | 289 | category='success') |
|
288 | 290 | |
|
289 | 291 | try: |
|
290 | 292 | UserModel().delete(c.user, handle_repos=handle_repos, |
|
291 | 293 | handle_repo_groups=handle_repo_groups, |
|
292 | 294 | handle_user_groups=handle_user_groups) |
|
293 | 295 | Session().commit() |
|
294 | 296 | set_handle_flash_repos() |
|
295 | 297 | set_handle_flash_repo_groups() |
|
296 | 298 | set_handle_flash_user_groups() |
|
297 | 299 | h.flash(_('Successfully deleted user'), category='success') |
|
298 | 300 | except (UserOwnsReposException, UserOwnsRepoGroupsException, |
|
299 | 301 | UserOwnsUserGroupsException, DefaultUserException) as e: |
|
300 | 302 | h.flash(e, category='warning') |
|
301 | 303 | except Exception: |
|
302 | 304 | log.exception("Exception during deletion of user") |
|
303 | 305 | h.flash(_('An error occurred during deletion of user'), |
|
304 | 306 | category='error') |
|
305 | 307 | return redirect(url('users')) |
|
306 | 308 | |
|
307 | 309 | @HasPermissionAllDecorator('hg.admin') |
|
308 | 310 | @auth.CSRFRequired() |
|
309 | 311 | def reset_password(self, user_id): |
|
310 | 312 | """ |
|
311 | 313 | toggle reset password flag for this user |
|
312 | 314 | |
|
313 | 315 | :param user_id: |
|
314 | 316 | """ |
|
315 | 317 | user_id = safe_int(user_id) |
|
316 | 318 | c.user = User.get_or_404(user_id) |
|
317 | 319 | try: |
|
318 | 320 | old_value = c.user.user_data.get('force_password_change') |
|
319 | 321 | c.user.update_userdata(force_password_change=not old_value) |
|
320 | 322 | Session().commit() |
|
321 | 323 | if old_value: |
|
322 | 324 | msg = _('Force password change disabled for user') |
|
323 | 325 | else: |
|
324 | 326 | msg = _('Force password change enabled for user') |
|
325 | 327 | h.flash(msg, category='success') |
|
326 | 328 | except Exception: |
|
327 | 329 | log.exception("Exception during password reset for user") |
|
328 | 330 | h.flash(_('An error occurred during password reset for user'), |
|
329 | 331 | category='error') |
|
330 | 332 | |
|
331 | 333 | return redirect(url('edit_user_advanced', user_id=user_id)) |
|
332 | 334 | |
|
333 | 335 | @HasPermissionAllDecorator('hg.admin') |
|
334 | 336 | @auth.CSRFRequired() |
|
335 | 337 | def create_personal_repo_group(self, user_id): |
|
336 | 338 | """ |
|
337 | 339 | Create personal repository group for this user |
|
338 | 340 | |
|
339 | 341 | :param user_id: |
|
340 | 342 | """ |
|
341 | 343 | from rhodecode.model.repo_group import RepoGroupModel |
|
342 | 344 | |
|
343 | 345 | user_id = safe_int(user_id) |
|
344 | 346 | c.user = User.get_or_404(user_id) |
|
345 | 347 | |
|
346 | 348 | try: |
|
347 | 349 | desc = RepoGroupModel.PERSONAL_GROUP_DESC % { |
|
348 | 350 | 'username': c.user.username} |
|
349 | 351 | if not RepoGroup.get_by_group_name(c.user.username): |
|
350 | 352 | RepoGroupModel().create(group_name=c.user.username, |
|
351 | 353 | group_description=desc, |
|
352 | 354 | owner=c.user.username) |
|
353 | 355 | |
|
354 | 356 | msg = _('Created repository group `%s`' % (c.user.username,)) |
|
355 | 357 | h.flash(msg, category='success') |
|
356 | 358 | except Exception: |
|
357 | 359 | log.exception("Exception during repository group creation") |
|
358 | 360 | msg = _( |
|
359 | 361 | 'An error occurred during repository group creation for user') |
|
360 | 362 | h.flash(msg, category='error') |
|
361 | 363 | |
|
362 | 364 | return redirect(url('edit_user_advanced', user_id=user_id)) |
|
363 | 365 | |
|
364 | 366 | @HasPermissionAllDecorator('hg.admin') |
|
365 | 367 | def show(self, user_id): |
|
366 | 368 | """GET /users/user_id: Show a specific item""" |
|
367 | 369 | # url('user', user_id=ID) |
|
368 | 370 | User.get_or_404(-1) |
|
369 | 371 | |
|
370 | 372 | @HasPermissionAllDecorator('hg.admin') |
|
371 | 373 | def edit(self, user_id): |
|
372 | 374 | """GET /users/user_id/edit: Form to edit an existing item""" |
|
373 | 375 | # url('edit_user', user_id=ID) |
|
374 | 376 | user_id = safe_int(user_id) |
|
375 | 377 | c.user = User.get_or_404(user_id) |
|
376 | 378 | if c.user.username == User.DEFAULT_USER: |
|
377 | 379 | h.flash(_("You can't edit this user"), category='warning') |
|
378 | 380 | return redirect(url('users')) |
|
379 | 381 | |
|
380 | 382 | c.active = 'profile' |
|
381 | 383 | c.extern_type = c.user.extern_type |
|
382 | 384 | c.extern_name = c.user.extern_name |
|
383 | 385 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) |
|
384 | 386 | |
|
385 | 387 | defaults = c.user.get_dict() |
|
386 | 388 | defaults.update({'language': c.user.user_data.get('language')}) |
|
387 | 389 | return htmlfill.render( |
|
388 | 390 | render('admin/users/user_edit.html'), |
|
389 | 391 | defaults=defaults, |
|
390 | 392 | encoding="UTF-8", |
|
391 | 393 | force_defaults=False) |
|
392 | 394 | |
|
393 | 395 | @HasPermissionAllDecorator('hg.admin') |
|
394 | 396 | def edit_advanced(self, user_id): |
|
395 | 397 | user_id = safe_int(user_id) |
|
396 | 398 | user = c.user = User.get_or_404(user_id) |
|
397 | 399 | if user.username == User.DEFAULT_USER: |
|
398 | 400 | h.flash(_("You can't edit this user"), category='warning') |
|
399 | 401 | return redirect(url('users')) |
|
400 | 402 | |
|
401 | 403 | c.active = 'advanced' |
|
402 | 404 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) |
|
403 | 405 | c.personal_repo_group = RepoGroup.get_by_group_name(user.username) |
|
404 |     |         c.first_admin = User.get_first_admin() |

    | 406 |         c.first_admin = User.get_first_super_admin() |
|
405 | 407 | defaults = user.get_dict() |
|
406 | 408 | |
|
407 | 409 | # Interim workaround if the user participated on any pull requests as a |
|
408 | 410 | # reviewer. |
|
409 | 411 | has_review = bool(PullRequestReviewers.query().filter( |
|
410 | 412 | PullRequestReviewers.user_id == user_id).first()) |
|
411 | 413 | c.can_delete_user = not has_review |
|
412 | 414 | c.can_delete_user_message = _( |
|
413 | 415 | 'The user participates as reviewer in pull requests and ' |
|
414 | 416 | 'cannot be deleted. You can set the user to ' |
|
415 | 417 | '"inactive" instead of deleting it.') if has_review else '' |
|
416 | 418 | |
|
417 | 419 | return htmlfill.render( |
|
418 | 420 | render('admin/users/user_edit.html'), |
|
419 | 421 | defaults=defaults, |
|
420 | 422 | encoding="UTF-8", |
|
421 | 423 | force_defaults=False) |
|
422 | 424 | |
|
423 | 425 | @HasPermissionAllDecorator('hg.admin') |
|
424 | 426 | def edit_auth_tokens(self, user_id): |
|
425 | 427 | user_id = safe_int(user_id) |
|
426 | 428 | c.user = User.get_or_404(user_id) |
|
427 | 429 | if c.user.username == User.DEFAULT_USER: |
|
428 | 430 | h.flash(_("You can't edit this user"), category='warning') |
|
429 | 431 | return redirect(url('users')) |
|
430 | 432 | |
|
431 | 433 | c.active = 'auth_tokens' |
|
432 | 434 | show_expired = True |
|
433 | 435 | c.lifetime_values = [ |
|
434 | 436 | (str(-1), _('forever')), |
|
435 | 437 | (str(5), _('5 minutes')), |
|
436 | 438 | (str(60), _('1 hour')), |
|
437 | 439 | (str(60 * 24), _('1 day')), |
|
438 | 440 | (str(60 * 24 * 30), _('1 month')), |
|
439 | 441 | ] |
|
440 | 442 | c.lifetime_options = [(c.lifetime_values, _("Lifetime"))] |
|
441 | 443 | c.role_values = [(x, AuthTokenModel.cls._get_role_name(x)) |
|
442 | 444 | for x in AuthTokenModel.cls.ROLES] |
|
443 | 445 | c.role_options = [(c.role_values, _("Role"))] |
|
444 | 446 | c.user_auth_tokens = AuthTokenModel().get_auth_tokens( |
|
445 | 447 | c.user.user_id, show_expired=show_expired) |
|
446 | 448 | defaults = c.user.get_dict() |
|
447 | 449 | return htmlfill.render( |
|
448 | 450 | render('admin/users/user_edit.html'), |
|
449 | 451 | defaults=defaults, |
|
450 | 452 | encoding="UTF-8", |
|
451 | 453 | force_defaults=False) |
|
452 | 454 | |
|
453 | 455 | @HasPermissionAllDecorator('hg.admin') |
|
454 | 456 | @auth.CSRFRequired() |
|
455 | 457 | def add_auth_token(self, user_id): |
|
456 | 458 | user_id = safe_int(user_id) |
|
457 | 459 | c.user = User.get_or_404(user_id) |
|
458 | 460 | if c.user.username == User.DEFAULT_USER: |
|
459 | 461 | h.flash(_("You can't edit this user"), category='warning') |
|
460 | 462 | return redirect(url('users')) |
|
461 | 463 | |
|
462 | 464 | lifetime = safe_int(request.POST.get('lifetime'), -1) |
|
463 | 465 | description = request.POST.get('description') |
|
464 | 466 | role = request.POST.get('role') |
|
465 | 467 | AuthTokenModel().create(c.user.user_id, description, lifetime, role) |
|
466 | 468 | Session().commit() |
|
467 | 469 | h.flash(_("Auth token successfully created"), category='success') |
|
468 | 470 | return redirect(url('edit_user_auth_tokens', user_id=c.user.user_id)) |
|
469 | 471 | |
|
470 | 472 | @HasPermissionAllDecorator('hg.admin') |
|
471 | 473 | @auth.CSRFRequired() |
|
472 | 474 | def delete_auth_token(self, user_id): |
|
473 | 475 | user_id = safe_int(user_id) |
|
474 | 476 | c.user = User.get_or_404(user_id) |
|
475 | 477 | if c.user.username == User.DEFAULT_USER: |
|
476 | 478 | h.flash(_("You can't edit this user"), category='warning') |
|
477 | 479 | return redirect(url('users')) |
|
478 | 480 | |
|
479 | 481 | auth_token = request.POST.get('del_auth_token') |
|
480 | 482 | if request.POST.get('del_auth_token_builtin'): |
|
481 | 483 | user = User.get(c.user.user_id) |
|
482 | 484 | if user: |
|
483 | 485 | user.api_key = generate_auth_token(user.username) |
|
484 | 486 | Session().add(user) |
|
485 | 487 | Session().commit() |
|
486 | 488 | h.flash(_("Auth token successfully reset"), category='success') |
|
487 | 489 | elif auth_token: |
|
488 | 490 | AuthTokenModel().delete(auth_token, c.user.user_id) |
|
489 | 491 | Session().commit() |
|
490 | 492 | h.flash(_("Auth token successfully deleted"), category='success') |
|
491 | 493 | |
|
492 | 494 | return redirect(url('edit_user_auth_tokens', user_id=c.user.user_id)) |
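The `edit_auth_tokens` hunk above builds its lifetime choices in minutes, with `-1` meaning "forever". A small sketch of how such a lifetime value maps to an expiry timestamp (helper name and exact semantics are ours; RhodeCode's `AuthTokenModel` may do this differently):

```python
import time


def expiry_from_lifetime(lifetime_minutes, now=None):
    """Return the unix timestamp at which a token expires, or None
    for a non-expiring token (lifetime of -1, the 'forever' choice)."""
    if lifetime_minutes == -1:
        return None
    now = time.time() if now is None else now
    return now + lifetime_minutes * 60
```

Under this scheme the `(str(60 * 24), _('1 day'))` choice simply becomes `now + 86400`.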
|
493 | 495 | |
|
494 | 496 | @HasPermissionAllDecorator('hg.admin') |
|
495 | 497 | def edit_global_perms(self, user_id): |
|
496 | 498 | user_id = safe_int(user_id) |
|
497 | 499 | c.user = User.get_or_404(user_id) |
|
498 | 500 | if c.user.username == User.DEFAULT_USER: |
|
499 | 501 | h.flash(_("You can't edit this user"), category='warning') |
|
500 | 502 | return redirect(url('users')) |
|
501 | 503 | |
|
502 | 504 | c.active = 'global_perms' |
|
503 | 505 | |
|
504 | 506 | c.default_user = User.get_default_user() |
|
505 | 507 | defaults = c.user.get_dict() |
|
506 | 508 | defaults.update(c.default_user.get_default_perms(suffix='_inherited')) |
|
507 | 509 | defaults.update(c.default_user.get_default_perms()) |
|
508 | 510 | defaults.update(c.user.get_default_perms()) |
|
509 | 511 | |
|
510 | 512 | return htmlfill.render( |
|
511 | 513 | render('admin/users/user_edit.html'), |
|
512 | 514 | defaults=defaults, |
|
513 | 515 | encoding="UTF-8", |
|
514 | 516 | force_defaults=False) |
|
515 | 517 | |
|
516 | 518 | @HasPermissionAllDecorator('hg.admin') |
|
517 | 519 | @auth.CSRFRequired() |
|
518 | 520 | def update_global_perms(self, user_id): |
|
519 | 521 | """PUT /users_perm/user_id: Update an existing item""" |
|
520 | 522 | # url('user_perm', user_id=ID, method='put') |
|
521 | 523 | user_id = safe_int(user_id) |
|
522 | 524 | user = User.get_or_404(user_id) |
|
523 | 525 | c.active = 'global_perms' |
|
524 | 526 | try: |
|
525 | 527 | # first stage that verifies the checkbox |
|
526 | 528 | _form = UserIndividualPermissionsForm() |
|
527 | 529 | form_result = _form.to_python(dict(request.POST)) |
|
528 | 530 | inherit_perms = form_result['inherit_default_permissions'] |
|
529 | 531 | user.inherit_default_permissions = inherit_perms |
|
530 | 532 | Session().add(user) |
|
531 | 533 | |
|
532 | 534 | if not inherit_perms: |
|
533 | 535 | # only update the individual ones if we un check the flag |
|
534 | 536 | _form = UserPermissionsForm( |
|
535 | 537 | [x[0] for x in c.repo_create_choices], |
|
536 | 538 | [x[0] for x in c.repo_create_on_write_choices], |
|
537 | 539 | [x[0] for x in c.repo_group_create_choices], |
|
538 | 540 | [x[0] for x in c.user_group_create_choices], |
|
539 | 541 | [x[0] for x in c.fork_choices], |
|
540 | 542 | [x[0] for x in c.inherit_default_permission_choices])() |
|
541 | 543 | |
|
542 | 544 | form_result = _form.to_python(dict(request.POST)) |
|
543 | 545 | form_result.update({'perm_user_id': user.user_id}) |
|
544 | 546 | |
|
545 | 547 | PermissionModel().update_user_permissions(form_result) |
|
546 | 548 | |
|
547 | 549 | Session().commit() |
|
548 | 550 | h.flash(_('User global permissions updated successfully'), |
|
549 | 551 | category='success') |
|
550 | 552 | |
|
551 | 553 | Session().commit() |
|
552 | 554 | except formencode.Invalid as errors: |
|
553 | 555 | defaults = errors.value |
|
554 | 556 | c.user = user |
|
555 | 557 | return htmlfill.render( |
|
556 | 558 | render('admin/users/user_edit.html'), |
|
557 | 559 | defaults=defaults, |
|
558 | 560 | errors=errors.error_dict or {}, |
|
559 | 561 | prefix_error=False, |
|
560 | 562 | encoding="UTF-8", |
|
561 | 563 | force_defaults=False) |
|
562 | 564 | except Exception: |
|
563 | 565 | log.exception("Exception during permissions saving") |
|
564 | 566 | h.flash(_('An error occurred during permissions saving'), |
|
565 | 567 | category='error') |
|
566 | 568 | return redirect(url('edit_user_global_perms', user_id=user_id)) |
|
567 | 569 | |
|
568 | 570 | @HasPermissionAllDecorator('hg.admin') |
|
569 | 571 | def edit_perms_summary(self, user_id): |
|
570 | 572 | user_id = safe_int(user_id) |
|
571 | 573 | c.user = User.get_or_404(user_id) |
|
572 | 574 | if c.user.username == User.DEFAULT_USER: |
|
573 | 575 | h.flash(_("You can't edit this user"), category='warning') |
|
574 | 576 | return redirect(url('users')) |
|
575 | 577 | |
|
576 | 578 | c.active = 'perms_summary' |
|
577 | 579 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) |
|
578 | 580 | |
|
579 | 581 | return render('admin/users/user_edit.html') |
|
580 | 582 | |
|
581 | 583 | @HasPermissionAllDecorator('hg.admin') |
|
582 | 584 | def edit_emails(self, user_id): |
|
583 | 585 | user_id = safe_int(user_id) |
|
584 | 586 | c.user = User.get_or_404(user_id) |
|
585 | 587 | if c.user.username == User.DEFAULT_USER: |
|
586 | 588 | h.flash(_("You can't edit this user"), category='warning') |
|
587 | 589 | return redirect(url('users')) |
|
588 | 590 | |
|
589 | 591 | c.active = 'emails' |
|
590 | 592 | c.user_email_map = UserEmailMap.query() \ |
|
591 | 593 | .filter(UserEmailMap.user == c.user).all() |
|
592 | 594 | |
|
593 | 595 | defaults = c.user.get_dict() |
|
594 | 596 | return htmlfill.render( |
|
595 | 597 | render('admin/users/user_edit.html'), |
|
596 | 598 | defaults=defaults, |
|
597 | 599 | encoding="UTF-8", |
|
598 | 600 | force_defaults=False) |
|
599 | 601 | |
|
600 | 602 | @HasPermissionAllDecorator('hg.admin') |
|
601 | 603 | @auth.CSRFRequired() |
|
602 | 604 | def add_email(self, user_id): |
|
603 | 605 | """POST /user_emails:Add an existing item""" |
|
604 | 606 | # url('user_emails', user_id=ID, method='put') |
|
605 | 607 | user_id = safe_int(user_id) |
|
606 | 608 | c.user = User.get_or_404(user_id) |
|
607 | 609 | |
|
608 | 610 | email = request.POST.get('new_email') |
|
609 | 611 | user_model = UserModel() |
|
610 | 612 | |
|
611 | 613 | try: |
|
612 | 614 | user_model.add_extra_email(user_id, email) |
|
613 | 615 | Session().commit() |
|
614 | 616 | h.flash(_("Added new email address `%s` for user account") % email, |
|
615 | 617 | category='success') |
|
616 | 618 | except formencode.Invalid as error: |
|
617 | 619 | msg = error.error_dict['email'] |
|
618 | 620 | h.flash(msg, category='error') |
|
619 | 621 | except Exception: |
|
620 | 622 | log.exception("Exception during email saving") |
|
621 | 623 | h.flash(_('An error occurred during email saving'), |
|
622 | 624 | category='error') |
|
623 | 625 | return redirect(url('edit_user_emails', user_id=user_id)) |
|
624 | 626 | |
|
625 | 627 | @HasPermissionAllDecorator('hg.admin') |
|
626 | 628 | @auth.CSRFRequired() |
|
627 | 629 | def delete_email(self, user_id): |
|
628 | 630 | """DELETE /user_emails_delete/user_id: Delete an existing item""" |
|
629 | 631 | # url('user_emails_delete', user_id=ID, method='delete') |
|
630 | 632 | user_id = safe_int(user_id) |
|
631 | 633 | c.user = User.get_or_404(user_id) |
|
632 | 634 | email_id = request.POST.get('del_email_id') |
|
633 | 635 | user_model = UserModel() |
|
634 | 636 | user_model.delete_extra_email(user_id, email_id) |
|
635 | 637 | Session().commit() |
|
636 | 638 | h.flash(_("Removed email address from user account"), category='success') |
|
637 | 639 | return redirect(url('edit_user_emails', user_id=user_id)) |
|
638 | 640 | |
|
639 | 641 | @HasPermissionAllDecorator('hg.admin') |
|
640 | 642 | def edit_ips(self, user_id): |
|
641 | 643 | user_id = safe_int(user_id) |
|
642 | 644 | c.user = User.get_or_404(user_id) |
|
643 | 645 | if c.user.username == User.DEFAULT_USER: |
|
644 | 646 | h.flash(_("You can't edit this user"), category='warning') |
|
645 | 647 | return redirect(url('users')) |
|
646 | 648 | |
|
647 | 649 | c.active = 'ips' |
|
648 | 650 | c.user_ip_map = UserIpMap.query() \ |
|
649 | 651 | .filter(UserIpMap.user == c.user).all() |
|
650 | 652 | |
|
651 | 653 | c.inherit_default_ips = c.user.inherit_default_permissions |
|
652 | 654 | c.default_user_ip_map = UserIpMap.query() \ |
|
653 | 655 | .filter(UserIpMap.user == User.get_default_user()).all() |
|
654 | 656 | |
|
655 | 657 | defaults = c.user.get_dict() |
|
656 | 658 | return htmlfill.render( |
|
657 | 659 | render('admin/users/user_edit.html'), |
|
658 | 660 | defaults=defaults, |
|
659 | 661 | encoding="UTF-8", |
|
660 | 662 | force_defaults=False) |
|
661 | 663 | |
|
662 | 664 | @HasPermissionAllDecorator('hg.admin') |
|
663 | 665 | @auth.CSRFRequired() |
|
664 | 666 | def add_ip(self, user_id): |
|
665 | 667 | """POST /user_ips:Add an existing item""" |
|
666 | 668 | # url('user_ips', user_id=ID, method='put') |
|
667 | 669 | |
|
668 | 670 | user_id = safe_int(user_id) |
|
669 | 671 | c.user = User.get_or_404(user_id) |
|
670 | 672 | user_model = UserModel() |
|
671 | 673 | try: |
|
672 | 674 | ip_list = user_model.parse_ip_range(request.POST.get('new_ip')) |
|
673 | 675 | except Exception as e: |
|
674 | 676 | ip_list = [] |
|
675 | 677 | log.exception("Exception during ip saving") |
|
676 | 678 | h.flash(_('An error occurred during ip saving:%s' % (e,)), |
|
677 | 679 | category='error') |
|
678 | 680 | |
|
679 | 681 | desc = request.POST.get('description') |
|
680 | 682 | added = [] |
|
681 | 683 | for ip in ip_list: |
|
682 | 684 | try: |
|
683 | 685 | user_model.add_extra_ip(user_id, ip, desc) |
|
684 | 686 | Session().commit() |
|
685 | 687 | added.append(ip) |
|
686 | 688 | except formencode.Invalid as error: |
|
687 | 689 | msg = error.error_dict['ip'] |
|
688 | 690 | h.flash(msg, category='error') |
|
689 | 691 | except Exception: |
|
690 | 692 | log.exception("Exception during ip saving") |
|
691 | 693 | h.flash(_('An error occurred during ip saving'), |
|
692 | 694 | category='error') |
|
693 | 695 | if added: |
|
694 | 696 | h.flash( |
|
695 | 697 | _("Added ips %s to user whitelist") % (', '.join(ip_list), ), |
|
696 | 698 | category='success') |
|
697 | 699 | if 'default_user' in request.POST: |
|
698 | 700 | return redirect(url('admin_permissions_ips')) |
|
699 | 701 | return redirect(url('edit_user_ips', user_id=user_id)) |
|
700 | 702 | |
|
701 | 703 | @HasPermissionAllDecorator('hg.admin') |
|
702 | 704 | @auth.CSRFRequired() |
|
703 | 705 | def delete_ip(self, user_id): |
|
704 | 706 | """DELETE /user_ips_delete/user_id: Delete an existing item""" |
|
705 | 707 | # url('user_ips_delete', user_id=ID, method='delete') |
|
706 | 708 | user_id = safe_int(user_id) |
|
707 | 709 | c.user = User.get_or_404(user_id) |
|
708 | 710 | |
|
709 | 711 | ip_id = request.POST.get('del_ip_id') |
|
710 | 712 | user_model = UserModel() |
|
711 | 713 | user_model.delete_extra_ip(user_id, ip_id) |
|
712 | 714 | Session().commit() |
|
713 | 715 | h.flash(_("Removed ip address from user whitelist"), category='success') |
|
714 | 716 | |
|
715 | 717 | if 'default_user' in request.POST: |
|
716 | 718 | return redirect(url('admin_permissions_ips')) |
|
717 | 719 | return redirect(url('edit_user_ips', user_id=user_id)) |
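That ends the users controller. One pattern worth noting from its `add_ip` handler: `UserModel.parse_ip_range` expands the posted value into a list before each entry is saved individually. A minimal sketch of that kind of expansion using only the standard library (this is an illustration of the idea, not RhodeCode's shipped implementation):

```python
import ipaddress


def parse_ip_range(value):
    """Expand a comma-separated string of IPs and CIDR ranges into
    a list of ipaddress network objects (hypothetical sketch)."""
    networks = []
    for part in (p.strip() for p in value.split(',')):
        if not part:
            continue
        # strict=False tolerates host bits set, e.g. '10.0.0.1/24'
        networks.append(ipaddress.ip_network(part, strict=False))
    return networks
```

Parsing up front is what lets the handler flash one per-entry error yet still report the successfully added addresses in a single summary message.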
@@ -1,277 +1,289 @@
|
1 | 1 | # -*- coding: utf-8 -*- |
|
2 | 2 | |
|
3 | 3 | # Copyright (C) 2010-2016 RhodeCode GmbH |
|
4 | 4 | # |
|
5 | 5 | # This program is free software: you can redistribute it and/or modify |
|
6 | 6 | # it under the terms of the GNU Affero General Public License, version 3 |
|
7 | 7 | # (only), as published by the Free Software Foundation. |
|
8 | 8 | # |
|
9 | 9 | # This program is distributed in the hope that it will be useful, |
|
10 | 10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
11 | 11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
12 | 12 | # GNU General Public License for more details. |
|
13 | 13 | # |
|
14 | 14 | # You should have received a copy of the GNU Affero General Public License |
|
15 | 15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
16 | 16 | # |
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | """ |
|
22 | 22 | Home controller for RhodeCode Enterprise |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | 25 | import logging |
|
26 | 26 | import time |
|
27 | 27 | import re |
|
28 | 28 | |
|
29 | 29 | from pylons import tmpl_context as c, request, url, config |
|
30 | 30 | from pylons.i18n.translation import _ |
|
31 | 31 | from sqlalchemy.sql import func |
|
32 | 32 | |
|
33 | 33 | from rhodecode.lib.auth import ( |
|
34 | 34 | LoginRequired, HasPermissionAllDecorator, AuthUser, |
|
35 | 35 | HasRepoGroupPermissionAnyDecorator, XHRRequired) |
|
36 | 36 | from rhodecode.lib.base import BaseController, render |
|
37 | 37 | from rhodecode.lib.index import searcher_from_config |
|
38 | 38 | from rhodecode.lib.ext_json import json |
|
39 | 39 | from rhodecode.lib.utils import jsonify |
|
40 |     | from rhodecode.lib.utils2 import safe_unicode |

    | 40 | from rhodecode.lib.utils2 import safe_unicode, str2bool |
|
41 | 41 | from rhodecode.model.db import Repository, RepoGroup |
|
42 | 42 | from rhodecode.model.repo import RepoModel |
|
43 | 43 | from rhodecode.model.repo_group import RepoGroupModel |
|
44 | 44 | from rhodecode.model.scm import RepoList, RepoGroupList |
|
45 | 45 | |
|
46 | 46 | |
|
47 | 47 | log = logging.getLogger(__name__) |
|
48 | 48 | |
|
49 | 49 | |
|
50 | 50 | class HomeController(BaseController): |
|
51 | 51 | def __before__(self): |
|
52 | 52 | super(HomeController, self).__before__() |
|
53 | 53 | |
|
54 | 54 | def ping(self): |
|
55 | 55 | """ |
|
56 | 56 | Ping, doesn't require login, good for checking out the platform |
|
57 | 57 | """ |
|
58 | 58 | instance_id = getattr(c, 'rhodecode_instanceid', '') |
|
59 | 59 | return 'pong[%s] => %s' % (instance_id, self.ip_addr,) |
|
60 | 60 | |
|
61 | 61 | @LoginRequired() |
|
62 | 62 | @HasPermissionAllDecorator('hg.admin') |
|
63 | 63 | def error_test(self): |
|
64 | 64 | """ |
|
65 | 65 | Test exception handling and emails on errors |
|
66 | 66 | """ |
|
67 | 67 | class TestException(Exception): |
|
68 | 68 | pass |
|
69 | 69 | |
|
70 | 70 | msg = ('RhodeCode Enterprise %s test exception. Generation time: %s' |
|
71 | 71 | % (c.rhodecode_name, time.time())) |
|
72 | 72 | raise TestException(msg) |
|
73 | 73 | |
|
74 | 74 | def _get_groups_and_repos(self, repo_group_id=None): |
|
75 | 75 | # repo groups |
|
76 | 76 | repo_group_list = RepoGroup.get_all_repo_groups(group_id=repo_group_id) |
|
77 | 77 | _perms = ['group.read', 'group.write', 'group.admin'] |
|
78 | 78 | repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms) |
|
79 | 79 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( |
|
80 | 80 | repo_group_list=repo_group_list_acl, admin=False) |
|
81 | 81 | |
|
82 | 82 | # repositories |
|
83 | 83 | repo_list = Repository.get_all_repos(group_id=repo_group_id) |
|
84 | 84 | _perms = ['repository.read', 'repository.write', 'repository.admin'] |
|
85 | 85 | repo_list_acl = RepoList(repo_list, perm_set=_perms) |
|
86 | 86 | repo_data = RepoModel().get_repos_as_dict( |
|
87 | 87 | repo_list=repo_list_acl, admin=False) |
|
88 | 88 | |
|
89 | 89 | return repo_data, repo_group_data |
|
90 | 90 | |
|
91 | 91 | @LoginRequired() |
|
92 | 92 | def index(self): |
|
93 | 93 | c.repo_group = None |
|
94 | 94 | |
|
95 | 95 | repo_data, repo_group_data = self._get_groups_and_repos() |
|
96 | 96 | # json used to render the grids |
|
97 | 97 | c.repos_data = json.dumps(repo_data) |
|
98 | 98 | c.repo_groups_data = json.dumps(repo_group_data) |
|
99 | 99 | |
|
100 | 100 | return render('/index.html') |
|
101 | 101 | |
|
102 | 102 | @LoginRequired() |
|
103 | 103 | @HasRepoGroupPermissionAnyDecorator('group.read', 'group.write', |
|
104 | 104 | 'group.admin') |
|
105 | 105 | def index_repo_group(self, group_name): |
|
106 | 106 | """GET /repo_group_name: Show a specific item""" |
|
107 | 107 | c.repo_group = RepoGroupModel()._get_repo_group(group_name) |
|
108 | 108 | repo_data, repo_group_data = self._get_groups_and_repos( |
|
109 | 109 | c.repo_group.group_id) |
|
110 | 110 | |
|
111 | 111 | # json used to render the grids |
|
112 | 112 | c.repos_data = json.dumps(repo_data) |
|
113 | 113 | c.repo_groups_data = json.dumps(repo_group_data) |
|
114 | 114 | |
|
115 | 115 | return render('index_repo_group.html') |
|
116 | 116 | |
|
117 | 117 | def _get_repo_list(self, name_contains=None, repo_type=None, limit=20): |
|
118 | 118 | query = Repository.query()\ |
|
119 | 119 | .order_by(func.length(Repository.repo_name))\ |
|
120 | 120 | .order_by(Repository.repo_name) |
|
121 | 121 | |
|
122 | 122 | if repo_type: |
|
123 | 123 | query = query.filter(Repository.repo_type == repo_type) |
|
124 | 124 | |
|
125 | 125 | if name_contains: |
|
126 | 126 | ilike_expression = u'%{}%'.format(safe_unicode(name_contains)) |
|
127 | 127 | query = query.filter( |
|
128 | 128 | Repository.repo_name.ilike(ilike_expression)) |
|
129 | 129 | query = query.limit(limit) |
|
130 | 130 | |
|
131 | 131 | all_repos = query.all() |
|
132 | 132 | repo_iter = self.scm_model.get_repos(all_repos) |
|
133 | 133 | return [ |
|
134 | 134 | { |
|
135 | 135 | 'id': obj['name'], |
|
136 | 136 | 'text': obj['name'], |
|
137 | 137 | 'type': 'repo', |
|
138 | 138 | 'obj': obj['dbrepo'], |
|
139 | 139 | 'url': url('summary_home', repo_name=obj['name']) |
|
140 | 140 | } |
|
141 | 141 | for obj in repo_iter] |
|
142 | 142 | |
|
143 | 143 | def _get_repo_group_list(self, name_contains=None, limit=20): |
|
144 | 144 | query = RepoGroup.query()\ |
|
145 | 145 | .order_by(func.length(RepoGroup.group_name))\ |
|
146 | 146 | .order_by(RepoGroup.group_name) |
|
147 | 147 | |
|
148 | 148 | if name_contains: |
|
149 | 149 | ilike_expression = u'%{}%'.format(safe_unicode(name_contains)) |
|
150 | 150 | query = query.filter( |
|
151 | 151 | RepoGroup.group_name.ilike(ilike_expression)) |
|
152 | 152 | query = query.limit(limit) |
|
153 | 153 | |
|
154 | 154 | all_groups = query.all() |
|
155 | 155 | repo_groups_iter = self.scm_model.get_repo_groups(all_groups) |
|
156 | 156 | return [ |
|
157 | 157 | { |
|
158 | 158 | 'id': obj.group_name, |
|
159 | 159 | 'text': obj.group_name, |
|
160 | 160 | 'type': 'group', |
|
161 | 161 | 'obj': {}, |
|
162 | 162 | 'url': url('repo_group_home', group_name=obj.group_name) |
|
163 | 163 | } |
|
164 | 164 | for obj in repo_groups_iter] |
|
165 | 165 | |
|
166 | 166 | def _get_hash_commit_list(self, hash_starts_with=None, limit=20): |
|
167 | 167 | if not hash_starts_with or len(hash_starts_with) < 3: |
|
168 | 168 | return [] |
|
169 | 169 | |
|
170 | 170 | commit_hashes = re.compile('([0-9a-f]{2,40})').findall(hash_starts_with) |
|
171 | 171 | |
|
172 | 172 | if len(commit_hashes) != 1: |
|
173 | 173 | return [] |
|
174 | 174 | |
|
175 | 175 | commit_hash_prefix = commit_hashes[0] |
|
176 | 176 | |
|
177 | 177 | auth_user = AuthUser( |
|
178 | 178 | user_id=c.rhodecode_user.user_id, ip_addr=self.ip_addr) |
|
179 | 179 | searcher = searcher_from_config(config) |
|
180 | 180 | result = searcher.search( |
|
181 | 181 | 'commit_id:%s*' % commit_hash_prefix, 'commit', auth_user) |
|
182 | 182 | |
|
183 | 183 | return [ |
|
184 | 184 | { |
|
185 | 185 | 'id': entry['commit_id'], |
|
186 | 186 | 'text': entry['commit_id'], |
|
187 | 187 | 'type': 'commit', |
|
188 | 188 | 'obj': {'repo': entry['repository']}, |
|
189 | 189 | 'url': url('changeset_home', |
|
190 | 190 | repo_name=entry['repository'], revision=entry['commit_id']) |
|
191 | 191 | } |
|
192 | 192 | for entry in result['results']] |
|
193 | 193 | |
|
194 | 194 | @LoginRequired() |
|
195 | 195 | @XHRRequired() |
|
196 | 196 | @jsonify |
|
197 | 197 | def goto_switcher_data(self): |
|
198 | 198 | query = request.GET.get('query') |
|
199 | 199 | log.debug('generating goto switcher list, query %s', query) |
|
200 | 200 | |
|
201 | 201 | res = [] |
|
202 | 202 | repo_groups = self._get_repo_group_list(query) |
|
203 | 203 | if repo_groups: |
|
204 | 204 | res.append({ |
|
205 | 205 | 'text': _('Groups'), |
|
206 | 206 | 'children': repo_groups |
|
207 | 207 | }) |
|
208 | 208 | |
|
209 | 209 | repos = self._get_repo_list(query) |
|
210 | 210 | if repos: |
|
211 | 211 | res.append({ |
|
212 | 212 | 'text': _('Repositories'), |
|
213 | 213 | 'children': repos |
|
214 | 214 | }) |
|
215 | 215 | |
|
216 | 216 | commits = self._get_hash_commit_list(query) |
|
217 | 217 | if commits: |
|
218 | 218 | unique_repos = {} |
|
219 | 219 | for commit in commits: |
|
220 | 220 | unique_repos.setdefault(commit['obj']['repo'], [] |
|
221 | 221 | ).append(commit) |
|
222 | 222 | |
|
223 | 223 | for repo in unique_repos: |
|
224 | 224 | res.append({ |
|
225 | 225 | 'text': _('Commits in %(repo)s') % {'repo': repo}, |
|
226 | 226 | 'children': unique_repos[repo] |
|
227 | 227 | }) |
|
228 | 228 | |
|
229 | 229 | data = { |
|
230 | 230 | 'more': False, |
|
231 | 231 | 'results': res |
|
232 | 232 | } |
|
233 | 233 | return data |
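The commit-bucketing step in `goto_switcher_data` above groups flat search hits by repository with `dict.setdefault` before turning each bucket into a result section. A self-contained sketch of just that step (plain `%` formatting stands in for the `_()` translation call, and the function name is mine):

```python
# Group flat commit hits by repository, then emit one section per repository,
# matching the shape consumed by the goto-switcher widget.
def group_commits_by_repo(commits):
    unique_repos = {}
    for commit in commits:
        unique_repos.setdefault(commit['obj']['repo'], []).append(commit)
    return [
        {'text': 'Commits in %(repo)s' % {'repo': repo},
         'children': children}
        for repo, children in unique_repos.items()]
```

`setdefault` keeps the grouping to one pass and avoids the explicit "key missing" branch a naive dict build would need.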
|
234 | 234 | |
|
235 | 235 | @LoginRequired() |
|
236 | 236 | @XHRRequired() |
|
237 | 237 | @jsonify |
|
238 | 238 | def repo_list_data(self): |
|
239 | 239 | query = request.GET.get('query') |
|
240 | 240 | repo_type = request.GET.get('repo_type') |
|
241 | 241 | log.debug('generating repo list, query:%s', query) |
|
242 | 242 | |
|
243 | 243 | res = [] |
|
244 | 244 | repos = self._get_repo_list(query, repo_type=repo_type) |
|
245 | 245 | if repos: |
|
246 | 246 | res.append({ |
|
247 | 247 | 'text': _('Repositories'), |
|
248 | 248 | 'children': repos |
|
249 | 249 | }) |
|
250 | 250 | |
|
251 | 251 | data = { |
|
252 | 252 | 'more': False, |
|
253 | 253 | 'results': res |
|
254 | 254 | } |
|
255 | 255 | return data |
|
256 | 256 | |
|
257 | 257 | @LoginRequired() |
|
258 | 258 | @XHRRequired() |
|
259 | 259 | @jsonify |
|
260 | 260 | def user_autocomplete_data(self): |
|
261 | 261 | query = request.GET.get('query') |
|
262 | active = str2bool(request.GET.get('active') or True) |
|
262 | 263 | |
|
263 | 264 | repo_model = RepoModel() |
|
264 | _users = repo_model.get_users( |

265 | _users = repo_model.get_users( | |
|
266 | name_contains=query, only_active=active) | |
|
265 | 267 | |
|
266 | 268 | if request.GET.get('user_groups'): |
|
267 | 269 | # extend with user groups |
|
268 | _user_groups = repo_model.get_user_groups( |

270 | _user_groups = repo_model.get_user_groups( | |
|
271 | name_contains=query, only_active=active) | |
|
269 | 272 | _users = _users + _user_groups |
|
270 | 273 | |
|
271 | 274 | return {'suggestions': _users} |
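The `active = str2bool(request.GET.get('active') or True)` line added above defaults the filter to active users when the parameter is absent. A hypothetical stand-in for `rhodecode.lib.utils2.str2bool` (assumption: it maps common truthy strings to `True`, anything else to `False`, and passes booleans through; the real implementation may differ):

```python
# Hypothetical sketch of str2bool, NOT the actual rhodecode.lib.utils2 code.
def str2bool(value):
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')

# The controller pattern: when ?active=... is missing, request.GET.get returns
# None, the `or True` kicks in, and str2bool passes the boolean through.
```

This makes `?active=false` an explicit opt-in to include inactive users while keeping the no-parameter case conservative.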
|
272 | 275 | |
|
273 | 276 | @LoginRequired() |
|
274 | 277 | @XHRRequired() |
|
275 | 278 | @jsonify |
|
276 | 279 | def user_group_autocomplete_data(self): |
|
277 | return {'suggestions': []} | |
|
280 | query = request.GET.get('query') | |
|
281 | active = str2bool(request.GET.get('active') or True) | |
|
282 | ||
|
283 | repo_model = RepoModel() | |
|
284 | _user_groups = repo_model.get_user_groups( | |
|
285 | name_contains=query, only_active=active) | |
|
286 | _user_groups = _user_groups | |
|
287 | ||
|
288 | return {'suggestions': _user_groups} | |
|
289 |
@@ -1,31 +1,53 b'' | |||
|
1 | 1 | # Copyright (C) 2016-2016 RhodeCode GmbH |
|
2 | 2 | # |
|
3 | 3 | # This program is free software: you can redistribute it and/or modify |
|
4 | 4 | # it under the terms of the GNU Affero General Public License, version 3 |
|
5 | 5 | # (only), as published by the Free Software Foundation. |
|
6 | 6 | # |
|
7 | 7 | # This program is distributed in the hope that it will be useful, |
|
8 | 8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of |
|
9 | 9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
|
10 | 10 | # GNU General Public License for more details. |
|
11 | 11 | # |
|
12 | 12 | # You should have received a copy of the GNU Affero General Public License |
|
13 | 13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. |
|
14 | 14 | # |
|
15 | 15 | # This program is dual-licensed. If you wish to learn more about the |
|
16 | 16 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
17 | 17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
18 | 18 | |
|
19 | 19 | from zope.interface import implementer |
|
20 | from rhodecode.interfaces import |

20 | from rhodecode.interfaces import ( | |
|
21 | IUserRegistered, IUserPreCreate, IUserPreUpdate) | |
|
21 | 22 | |
|
22 | 23 | |
|
23 | 24 | @implementer(IUserRegistered) |
|
24 | 25 | class UserRegistered(object): |
|
25 | 26 | """ |
|
26 | 27 | An instance of this class is emitted as an :term:`event` whenever a user |
|
27 | 28 | account is registered. |
|
28 | 29 | """ |
|
29 | 30 | def __init__(self, user, session): |
|
30 | 31 | self.user = user |
|
31 | 32 | self.session = session |
|
33 | ||
|
34 | ||
|
35 | @implementer(IUserPreCreate) | |
|
36 | class UserPreCreate(object): | |
|
37 | """ | |
|
38 | An instance of this class is emitted as an :term:`event` before a new user | |
|
39 | object is created. | |
|
40 | """ | |
|
41 | def __init__(self, user_data): | |
|
42 | self.user_data = user_data | |
|
43 | ||
|
44 | ||
|
45 | @implementer(IUserPreUpdate) | |
|
46 | class UserPreUpdate(object): | |
|
47 | """ | |
|
48 | An instance of this class is emitted as an :term:`event` before a user | |
|
49 | object is updated. | |
|
50 | """ | |
|
51 | def __init__(self, user, user_data): | |
|
52 | self.user = user | |
|
53 | self.user_data = user_data |
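The event classes added in this hunk are plain value objects marked with `@implementer`; subscribers elsewhere receive them when the application calls notify. A minimal sketch of that dispatch pattern (assumption: a toy in-process registry, not the actual zope/Pyramid event bus RhodeCode wires these into):

```python
# Toy event registry illustrating how a UserPreCreate-style event is consumed.
# This is NOT RhodeCode's real notification machinery.
class UserPreCreate(object):
    """Emitted before a new user object is created."""
    def __init__(self, user_data):
        self.user_data = user_data

_subscribers = {}

def subscribe(event_type, handler):
    """Register *handler* to be called for events of *event_type*."""
    _subscribers.setdefault(event_type, []).append(handler)

def notify(event):
    """Deliver *event* to every handler registered for its exact type."""
    for handler in _subscribers.get(type(event), []):
        handler(event)
```

Keeping the event objects dumb (just attributes, no behaviour) lets subscribers inspect or mutate `user_data` before creation proceeds, which is the point of a "pre" event.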
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated.
|
1 | NO CONTENT: file was removed |
|
1 | NO CONTENT: file was removed, binary diff hidden |
|
General Comments 0