release: Merge default into stable for release preparation
marcink -
r802:f6088673 merge stable

The requested changes are too big and content was truncated.

@@ -0,0 +1,15 b''
1 {
2 "name": "rhodecode-elements",
3 "description": "User interface for elements for rhodecode",
4 "main": "index.html",
5 "dependencies": {
6 "webcomponentsjs": "^0.7.22",
7 "polymer": "Polymer/polymer#^1.6.1",
8 "paper-button": "PolymerElements/paper-button#^1.0.13",
9 "paper-spinner": "PolymerElements/paper-spinner#^1.2.0",
10 "paper-tooltip": "PolymerElements/paper-tooltip#^1.1.2",
11 "paper-toast": "PolymerElements/paper-toast#^1.3.0",
12 "paper-toggle-button": "PolymerElements/paper-toggle-button#^1.2.0",
13 "iron-ajax": "PolymerElements/iron-ajax#^1.4.3"
14 }
15 }
@@ -0,0 +1,60 b''
1
2 =======================
3 Dependency management
4 =======================
5
6
7 Overview
8 ========
9
10 We use the Nix package manager to handle our dependencies. In general we use
11 packages from the `nixpkgs` package collection. For frequently changing Python
12 and JavaScript dependencies we use the tools described in this section to
13 generate the needed Nix derivations.
14
15
16 Python dependencies
17 ===================
18
19 We use the tool `pip2nix` to generate the Nix derivations for our Python
20 dependencies.
21
22 Generating the dependencies should be done with the following command:
23
24 .. code:: shell
25
26 pip2nix generate --license
27
28
29 .. note::
30
31 License extraction support is still experimental; use the version from the
32 following pull request: https://github.com/ktosiek/pip2nix/pull/30
33
34
35
36 Node dependencies
37 =================
38
39 After adding new dependencies via ``npm install --save``, use `node2nix` to
40 update the corresponding Nix derivations:
41
42 .. code:: shell
43
44 cd pkgs
45 node2nix --input ../package.json \
46 -o node-packages.nix \
47 -e node-env.nix \
48 -c node-default.nix \
49 -d --flatten
50
51
52 Bower dependencies
53 ==================
54
55 Frontend dependencies are managed with `bower`. The `bower2nix` tool can
56 generate the needed Nix derivations:
57
58 .. code:: shell
59
60 bower2nix bower.json pkgs/bower-packages.nix
@@ -0,0 +1,78 b''
1 |RCE| 4.4.0 |RNS|
2 -----------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2016-09-16
8
9
10 General
11 ^^^^^^^
12
13 - UI: introduced Polymer webcomponents into the core application. RhodeCode is
14 now shipped together with the Polymer framework webcomponents. Most dynamic
15 UI components that require large amounts of interaction will now be
16 implemented with Polymer.
17 - live-notifications: use rhodecode-toast for live notifications instead of
18 the toastr jQuery plugin.
19 - Svn: moved svn http support out of labs settings. It's tested and stable now.
20
21
22 New Features
23 ^^^^^^^^^^^^
24
25 - Integrations: integrations can now be configured on a whole repo group to
26 apply the same integrations to multiple projects/groups at once.
27 - Integrations: added scopes on integrations; the scopes are: Global,
28 Repository Group (with/without children), Repositories, Root Repositories Only.
29 This allows configuring exactly which projects use which integrations.
30 - Integrations: show branches/commits separately when posting push events
31 to hipchat/slack, fixes #4192.
32 - Pull-requests: summary page now shows update dates for pull requests to make
33 it easier to see which ones were recently updated.
34 - UI: hidden inline comments will be shown in side view when browsing the diffs
35 - Diffs: added inline comments toggle into pull requests diff view. #2884
36 - Live-chat: added summon reviewers functionality. You can now request
37 presence from online users into a chat for collaborative code-review.
38 This requires channelstream to be enabled.
39 - UX: added a static 502 page for gateway error. Once configured via
40 Nginx or Apache it will present a custom RhodeCode page while
41 backend servers are offline. Fixes #4202.
42
43
44 Security
45 ^^^^^^^^
46
47 - Passwords: forced password change will not allow users to reuse the
48 old password as the new one.
49
50
51 Performance
52 ^^^^^^^^^^^
53
54 - Vcs: refactor vcs-middleware to handle the order of .ini file backends in
55 detection of the vcs protocol. Detection now ends on the first match, which
56 improves overall transaction speed.
57 - Summary: improve the algorithm and performance of detecting README files
58 on the summary page. In some cases we reduced cold-cache time from 50s to 1s.
59 - Safari: improved speed of large diffs on Safari browser.
60 - UX: remove position relative on diff td as it causes very slow
61 rendering in browsers.
62
63 Fixes
64 ^^^^^
65
66 - UX: change confirm password widget to have spacing between the fields to
67 match the rest of the UI, fixes: #4200.
68 - UX: show multiple tags/branches in changelog/summary instead of
69 truncating them.
70 - My-account: fix test notifications for IE10+
71 - Vcs: change the way refs are retrieved for git so same-name branches/tags
72 and remotes can be supported, fixes #298.
73 - Lexers: added small extensions table to extend syntax highlighting for file
74 sources. Fixes #4227.
75 - Search: fix bug where file path link was wrong when the repository name was
76 in the file path, fixes #4228
77 - Fixed INT overflow bug
78 - Events: send pushed commits always in the correct order.
@@ -0,0 +1,186 b''
1 {
2 "dirs": {
3 "css": {
4 "src":"rhodecode/public/css",
5 "dest":"rhodecode/public/css"
6 },
7 "js": {
8 "src": "rhodecode/public/js/src",
9 "dest": "rhodecode/public/js"
10 }
11 },
12 "copy": {
13 "main": {
14 "expand": true,
15 "cwd": "bower_components",
16 "src": "webcomponentsjs/webcomponents-lite.js",
17 "dest": "<%= dirs.js.dest %>/vendors"
18 }
19 },
20 "concat": {
21 "polymercss": {
22 "src": [
23 "<%= dirs.js.src %>/components/root-styles-prefix.html",
24 "<%= dirs.css.src %>/style-polymer.css",
25 "<%= dirs.js.src %>/components/root-styles-suffix.html"
26 ],
27 "dest": "<%= dirs.js.dest %>/src/components/root-styles.gen.html",
28 "nonull": true
29 },
30 "dist": {
31 "src": [
32 "<%= dirs.js.src %>/jquery-1.11.1.min.js",
33 "<%= dirs.js.src %>/logging.js",
34 "<%= dirs.js.src %>/bootstrap.js",
35 "<%= dirs.js.src %>/mousetrap.js",
36 "<%= dirs.js.src %>/moment.js",
37 "<%= dirs.js.src %>/appenlight-client-0.4.1.min.js",
38 "<%= dirs.js.src %>/i18n_utils.js",
39 "<%= dirs.js.src %>/deform.js",
40 "<%= dirs.js.src %>/plugins/jquery.pjax.js",
41 "<%= dirs.js.src %>/plugins/jquery.dataTables.js",
42 "<%= dirs.js.src %>/plugins/flavoured_checkbox.js",
43 "<%= dirs.js.src %>/plugins/jquery.auto-grow-input.js",
44 "<%= dirs.js.src %>/plugins/jquery.autocomplete.js",
45 "<%= dirs.js.src %>/plugins/jquery.debounce.js",
46 "<%= dirs.js.src %>/plugins/jquery.mark.js",
47 "<%= dirs.js.src %>/plugins/jquery.timeago.js",
48 "<%= dirs.js.src %>/plugins/jquery.timeago-extension.js",
49 "<%= dirs.js.src %>/select2/select2.js",
50 "<%= dirs.js.src %>/codemirror/codemirror.js",
51 "<%= dirs.js.src %>/codemirror/codemirror_loadmode.js",
52 "<%= dirs.js.src %>/codemirror/codemirror_hint.js",
53 "<%= dirs.js.src %>/codemirror/codemirror_overlay.js",
54 "<%= dirs.js.src %>/codemirror/codemirror_placeholder.js",
55 "<%= dirs.js.dest %>/mode/meta.js",
56 "<%= dirs.js.dest %>/mode/meta_ext.js",
57 "<%= dirs.js.dest %>/rhodecode/i18n/select2/translations.js",
58 "<%= dirs.js.src %>/rhodecode/utils/array.js",
59 "<%= dirs.js.src %>/rhodecode/utils/string.js",
60 "<%= dirs.js.src %>/rhodecode/utils/pyroutes.js",
61 "<%= dirs.js.src %>/rhodecode/utils/ajax.js",
62 "<%= dirs.js.src %>/rhodecode/utils/autocomplete.js",
63 "<%= dirs.js.src %>/rhodecode/utils/colorgenerator.js",
64 "<%= dirs.js.src %>/rhodecode/utils/ie.js",
65 "<%= dirs.js.src %>/rhodecode/utils/os.js",
66 "<%= dirs.js.src %>/rhodecode/utils/topics.js",
67 "<%= dirs.js.src %>/rhodecode/widgets/multiselect.js",
68 "<%= dirs.js.src %>/rhodecode/init.js",
69 "<%= dirs.js.src %>/rhodecode/codemirror.js",
70 "<%= dirs.js.src %>/rhodecode/comments.js",
71 "<%= dirs.js.src %>/rhodecode/constants.js",
72 "<%= dirs.js.src %>/rhodecode/files.js",
73 "<%= dirs.js.src %>/rhodecode/followers.js",
74 "<%= dirs.js.src %>/rhodecode/menus.js",
75 "<%= dirs.js.src %>/rhodecode/notifications.js",
76 "<%= dirs.js.src %>/rhodecode/permissions.js",
77 "<%= dirs.js.src %>/rhodecode/pjax.js",
78 "<%= dirs.js.src %>/rhodecode/pullrequests.js",
79 "<%= dirs.js.src %>/rhodecode/settings.js",
80 "<%= dirs.js.src %>/rhodecode/select2_widgets.js",
81 "<%= dirs.js.src %>/rhodecode/tooltips.js",
82 "<%= dirs.js.src %>/rhodecode/users.js",
83 "<%= dirs.js.src %>/rhodecode/appenlight.js",
84 "<%= dirs.js.src %>/rhodecode.js"
85 ],
86 "dest": "<%= dirs.js.dest %>/scripts.js",
87 "nonull": true
88 }
89 },
90 "crisper": {
91 "dist": {
92 "options": {
93 "cleanup": false,
94 "onlySplit": true
95 },
96 "src": "<%= dirs.js.dest %>/rhodecode-components.html",
97 "dest": "<%= dirs.js.dest %>/rhodecode-components.js"
98 }
99 },
100 "less": {
101 "development": {
102 "options": {
103 "compress": false,
104 "yuicompress": false,
105 "optimization": 0
106 },
107 "files": {
108 "<%= dirs.css.dest %>/style.css": "<%= dirs.css.src %>/main.less",
109 "<%= dirs.css.dest %>/style-polymer.css": "<%= dirs.css.src %>/polymer.less"
110 }
111 },
112 "production": {
113 "options": {
114 "compress": true,
115 "yuicompress": true,
116 "optimization": 2
117 },
118 "files": {
119 "<%= dirs.css.dest %>/style.css": "<%= dirs.css.src %>/main.less",
120 "<%= dirs.css.dest %>/style-polymer.css": "<%= dirs.css.src %>/polymer.less"
121 }
122 },
123 "components": {
124 "files": [
125 {
126 "cwd": "<%= dirs.js.src %>/components/",
127 "dest": "<%= dirs.js.src %>/components/",
128 "src": [
129 "**/*.less"
130 ],
131 "expand": true,
132 "ext": ".css"
133 }
134 ]
135 }
136 },
137 "watch": {
138 "less": {
139 "files": [
140 "<%= dirs.css.src %>/**/*.less",
141 "<%= dirs.js.src %>/components/**/*.less"
142 ],
143 "tasks": [
144 "less:development",
145 "less:components",
146 "concat:polymercss",
147 "vulcanize"
148 ]
149 },
150 "js": {
151 "files": [
152 "!<%= dirs.js.src %>/components/root-styles.gen.html",
153 "<%= dirs.js.src %>/**/*.js",
154 "<%= dirs.js.src %>/components/**/*.html"
155 ],
156 "tasks": [
157 "less:components",
158 "concat:polymercss",
159 "vulcanize",
160 "crisper",
161 "concat:dist"
162 ]
163 }
164 },
165 "jshint": {
166 "rhodecode": {
167 "src": "<%= dirs.js.src %>/rhodecode/**/*.js",
168 "options": {
169 "jshintrc": ".jshintrc"
170 }
171 }
172 },
173 "vulcanize": {
174 "default": {
175 "options": {
176 "abspath": "",
177 "inlineScripts": true,
178 "inlineCss": true,
179 "stripComments": true
180 },
181 "files": {
182 "<%= dirs.js.dest %>/rhodecode-components.html": "<%= dirs.js.src %>/components/shared-components.html"
183 }
184 }
185 }
186 }
@@ -0,0 +1,67 b''
1 # Backported buildBowerComponents so that we can also use it with the version
2 # 16.03 which is the current stable at the time of this writing.
3 #
4 # This file can be removed once building with 16.03 is not needed anymore.
5
6 { pkgs }:
7
8 { buildInputs ? [], generated, ... } @ attrs:
9
10 let
11 bower2nix-src = pkgs.fetchzip {
12 url = "https://github.com/rvl/bower2nix/archive/v3.0.1.tar.gz";
13 sha256 = "1zbvz96k2j6g0r4lvm5cgh41a73k9dgayk7x63cmg538dzznxvyb";
14 };
15
16 bower2nix = import "${bower2nix-src}/default.nix" { inherit pkgs; };
17
18 fetchbower = import ./backport-16.03-fetchbower.nix {
19 inherit (pkgs) stdenv lib;
20 inherit bower2nix;
21 };
22
23 # Fetches the bower packages. `generated` should be the result of a
24 # `bower2nix` command.
25 bowerPackages = import generated {
26 inherit (pkgs) buildEnv;
27 inherit fetchbower;
28 };
29
30 in pkgs.stdenv.mkDerivation (
31 attrs
32 //
33 {
34 name = "bower_components-" + attrs.name;
35
36 inherit bowerPackages;
37
38 builder = builtins.toFile "builder.sh" ''
39 source $stdenv/setup
40
41 # The project's bower.json is required
42 cp $src/bower.json .
43
44 # Dereference symlinks -- bower doesn't like them
45 cp --recursive --reflink=auto \
46 --dereference --no-preserve=mode \
47 $bowerPackages bc
48
49 # Bower install in offline mode -- links together the fetched
50 # bower packages.
51 HOME=$PWD bower \
52 --config.storage.packages=bc/packages \
53 --config.storage.registry=bc/registry \
54 --offline install
55
56 # Sets up a single bower_components directory within
57 # the output derivation.
58 mkdir -p $out
59 mv bower_components $out
60 '';
61
62 buildInputs = buildInputs ++ [
63 pkgs.git
64 pkgs.nodePackages.bower
65 ];
66 }
67 )
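A minimal sketch of calling this backported ``buildBowerComponents`` follows. The import path, derivation name and ``src`` below are assumptions for illustration; ``generated`` points at the ``bower2nix`` output described in the dependency-management docs above:

.. code:: nix

    { pkgs ? import <nixpkgs> {} }:

    let
      # assumed to live next to backport-16.03-fetchbower.nix under pkgs/
      buildBowerComponents =
        import ./pkgs/backport-16.03-buildBowerComponents.nix { inherit pkgs; };
    in buildBowerComponents {
      name = "rhodecode-enterprise-ce";          # placeholder name
      src = ./.;                                 # must contain bower.json
      generated = ./pkgs/bower-packages.nix;     # generated by bower2nix
    }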
@@ -0,0 +1,26 b''
1 { stdenv, lib, bower2nix }:
2 let
3 bowerVersion = version:
4 let
5 components = lib.splitString "#" version;
6 hash = lib.last components;
7 ver = if builtins.length components == 1 then version else hash;
8 in ver;
9
10 fetchbower = name: version: target: outputHash: stdenv.mkDerivation {
11 name = "${name}-${bowerVersion version}";
12 buildCommand = ''
13 fetch-bower --quiet --out=$PWD/out "${name}" "${target}" "${version}"
14 # In some cases, the result of fetchBower is different depending
15 # on the output directory (e.g. if the bower package contains
16 # symlinks). So use a local output directory before copying to
17 # $out.
18 cp -R out $out
19 '';
20 outputHashMode = "recursive";
21 outputHashAlgo = "sha256";
22 inherit outputHash;
23 buildInputs = [ bower2nix ];
24 };
25
26 in fetchbower
@@ -0,0 +1,31 b''
1 { fetchbower, buildEnv }:
2 buildEnv { name = "bower-env"; ignoreCollisions = true; paths = [
3 (fetchbower "webcomponentsjs" "0.7.22" "^0.7.22" "0ggh3k8ssafd056ib1m5bvzi7cpz3ry7gr5176d79na1w0c3i7dz")
4 (fetchbower "polymer" "Polymer/polymer#1.6.1" "Polymer/polymer#^1.6.1" "09mm0jgk457gvwqlc155swch7gjr6fs3g7spnvhi6vh5b6518540")
5 (fetchbower "paper-button" "PolymerElements/paper-button#1.0.13" "PolymerElements/paper-button#^1.0.13" "0i3y153nqk06pn0gk282vyybnl3g1w3w41d5i9z659cgn27g3fvm")
6 (fetchbower "paper-spinner" "PolymerElements/paper-spinner#1.2.0" "PolymerElements/paper-spinner#^1.2.0" "1av1m6y81jw3hjhz1yqy3rwcgxarjzl58ldfn4q6sn51pgzngfqb")
7 (fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.2" "PolymerElements/paper-tooltip#^1.1.2" "1j64nprcyk2d2bbl3qwjyr0lbjngm4wclpyfwgai1c4y6g6bigd2")
8 (fetchbower "paper-toast" "PolymerElements/paper-toast#1.3.0" "PolymerElements/paper-toast#^1.3.0" "0x9rqxsks5455s8pk4aikpp99ijdn6kxr9gvhwh99nbcqdzcxq1m")
9 (fetchbower "paper-toggle-button" "PolymerElements/paper-toggle-button#1.2.0" "PolymerElements/paper-toggle-button#^1.2.0" "0mphcng3ngspbpg4jjn0mb91nvr4xc1phq3qswib15h6sfww1b2w")
10 (fetchbower "iron-ajax" "PolymerElements/iron-ajax#1.4.3" "PolymerElements/iron-ajax#^1.4.3" "0m3dx27arwmlcp00b7n516sc5a51f40p9vapr1nvd57l3i3z0pzm")
11 (fetchbower "iron-flex-layout" "PolymerElements/iron-flex-layout#1.3.1" "PolymerElements/iron-flex-layout#^1.0.0" "0nswv3ih3bhflgcd2wjfmddqswzgqxb2xbq65jk9w3rkj26hplbl")
12 (fetchbower "paper-behaviors" "PolymerElements/paper-behaviors#1.0.12" "PolymerElements/paper-behaviors#^1.0.0" "012bqk97awgz55cn7rm9g7cckrdhkqhls3zvp8l6nd4rdwcrdzq8")
13 (fetchbower "paper-material" "PolymerElements/paper-material#1.0.6" "PolymerElements/paper-material#^1.0.0" "0rljmknfdbm5aabvx9pk77754zckj3l127c3mvnmwkpkkr353xnh")
14 (fetchbower "paper-styles" "PolymerElements/paper-styles#1.1.4" "PolymerElements/paper-styles#^1.0.0" "0j8vg74xrcxlni8i93dsab3y80f34kk30lv4yblqpkp9c3nrilf7")
15 (fetchbower "neon-animation" "PolymerElements/neon-animation#1.2.4" "PolymerElements/neon-animation#^1.0.0" "16mz9i2n5w0k5j8d6gha23cnbdgm5syz3fawyh89gdbq97bi2q5j")
16 (fetchbower "iron-a11y-announcer" "PolymerElements/iron-a11y-announcer#1.0.5" "PolymerElements/iron-a11y-announcer#^1.0.0" "0n7c7j1pwk3835s7s2jd9125wdcsqf216yi5gj07wn5s8h8p7m9d")
17 (fetchbower "iron-overlay-behavior" "PolymerElements/iron-overlay-behavior#1.8.6" "PolymerElements/iron-overlay-behavior#^1.0.9" "14brn9gz6qqskarg3fxk91xs7vg02vgcsz9a9743kidxr0l0413m")
18 (fetchbower "iron-fit-behavior" "PolymerElements/iron-fit-behavior#1.2.5" "PolymerElements/iron-fit-behavior#^1.1.0" "1msnlh8lp1xg6v4h6dkjwj9kzac5q5q208ayla3x9hi483ki6rlf")
19 (fetchbower "iron-checked-element-behavior" "PolymerElements/iron-checked-element-behavior#1.0.5" "PolymerElements/iron-checked-element-behavior#^1.0.0" "0l0yy4ah454s8bzfv076s8by7h67zy9ni6xb932qwyhx8br6c1m7")
20 (fetchbower "promise-polyfill" "polymerlabs/promise-polyfill#1.0.1" "polymerlabs/promise-polyfill#^1.0.0" "045bj2caav3famr5hhxgs1dx7n08r4s46mlzwb313vdy17is38xb")
21 (fetchbower "iron-behaviors" "PolymerElements/iron-behaviors#1.0.17" "PolymerElements/iron-behaviors#^1.0.0" "021qvkmbk32jrrmmphpmwgby4bzi5jyf47rh1bxmq2ip07ly4bpr")
22 (fetchbower "paper-ripple" "PolymerElements/paper-ripple#1.0.8" "PolymerElements/paper-ripple#^1.0.0" "0r9sq8ik7wwrw0qb82c3rw0c030ljwd3s466c9y4qbcrsbvfjnns")
23 (fetchbower "font-roboto" "PolymerElements/font-roboto#1.0.1" "PolymerElements/font-roboto#^1.0.1" "02jz43r0wkyr3yp7rq2rc08l5cwnsgca9fr54sr4rhsnl7cjpxrj")
24 (fetchbower "iron-meta" "PolymerElements/iron-meta#1.1.2" "PolymerElements/iron-meta#^1.0.0" "1wl4dx8fnsknw9z9xi8bpc4cy9x70c11x4zxwxnj73hf3smifppl")
25 (fetchbower "iron-resizable-behavior" "PolymerElements/iron-resizable-behavior#1.0.5" "PolymerElements/iron-resizable-behavior#^1.0.0" "1fd5zmbr2hax42vmcasncvk7lzi38fmb1kyii26nn8pnnjak7zkn")
26 (fetchbower "iron-selector" "PolymerElements/iron-selector#1.5.2" "PolymerElements/iron-selector#^1.0.0" "1ajv46llqzvahm5g6g75w7nfyjcslp53ji0wm96l2k94j87spv3r")
27 (fetchbower "web-animations-js" "web-animations/web-animations-js#2.2.2" "web-animations/web-animations-js#^2.2.0" "1izfvm3l67vwys0bqbhidi9rqziw2f8wv289386sc6jsxzgkzhga")
28 (fetchbower "iron-a11y-keys-behavior" "PolymerElements/iron-a11y-keys-behavior#1.1.7" "PolymerElements/iron-a11y-keys-behavior#^1.0.0" "070z46dbbz242002gmqrgy28x0y1fcqp9hnvbi05r3zphiqfx3l7")
29 (fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv")
30 (fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1")
31 ]; }
@@ -0,0 +1,15 b''
1 # This file has been generated by node2nix 1.0.0. Do not edit!
2
3 {pkgs ? import <nixpkgs> {
4 inherit system;
5 }, system ? builtins.currentSystem}:
6
7 let
8 nodeEnv = import ./node-env.nix {
9 inherit (pkgs) stdenv python utillinux runCommand writeTextFile nodejs;
10 };
11 in
12 import ./node-packages.nix {
13 inherit (pkgs) fetchurl fetchgit;
14 inherit nodeEnv;
15 } No newline at end of file
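The generated ``node-default.nix`` above only wires ``node-env.nix`` (shown next) together with the generated ``node-packages.nix``. A minimal sketch of evaluating it, assuming the files live under ``pkgs/`` as in the ``node2nix`` invocation from the docs:

.. code:: nix

    let
      pkgs = import <nixpkgs> {};
    in
      # returns the attribute set of node packages defined in the generated
      # node-packages.nix (whose content is truncated in this changeset)
      import ./pkgs/node-default.nix { inherit pkgs; }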
@@ -0,0 +1,292 b''
1 # This file originates from node2nix
2
3 {stdenv, python, nodejs, utillinux, runCommand, writeTextFile}:
4
5 let
6 # Create a tar wrapper that filters all the 'Ignoring unknown extended header keyword' noise
7 tarWrapper = runCommand "tarWrapper" {} ''
8 mkdir -p $out/bin
9
10 cat > $out/bin/tar <<EOF
11 #! ${stdenv.shell} -e
12 $(type -p tar) "\$@" --warning=no-unknown-keyword
13 EOF
14
15 chmod +x $out/bin/tar
16 '';
17
18 # Function that generates a TGZ file from a NPM project
19 buildNodeSourceDist =
20 { name, version, src, ... }:
21
22 stdenv.mkDerivation {
23 name = "node-tarball-${name}-${version}";
24 inherit src;
25 buildInputs = [ nodejs ];
26 buildPhase = ''
27 export HOME=$TMPDIR
28 tgzFile=$(npm pack)
29 '';
30 installPhase = ''
31 mkdir -p $out/tarballs
32 mv $tgzFile $out/tarballs
33 mkdir -p $out/nix-support
34 echo "file source-dist $out/tarballs/$tgzFile" >> $out/nix-support/hydra-build-products
35 '';
36 };
37
38 includeDependencies = {dependencies}:
39 stdenv.lib.optionalString (dependencies != [])
40 (stdenv.lib.concatMapStrings (dependency:
41 ''
42 # Bundle the dependencies of the package
43 mkdir -p node_modules
44 cd node_modules
45
46 # Only include dependencies if they don't exist. They may also be bundled in the package.
47 if [ ! -e "${dependency.name}" ]
48 then
49 ${composePackage dependency}
50 fi
51
52 cd ..
53 ''
54 ) dependencies);
55
56 # Recursively composes the dependencies of a package
57 composePackage = { name, packageName, src, dependencies ? [], ... }@args:
58 let
59 fixImpureDependencies = writeTextFile {
60 name = "fixDependencies.js";
61 text = ''
62 var fs = require('fs');
63 var url = require('url');
64
65 /*
66 * Replaces an impure version specification by *
67 */
68 function replaceImpureVersionSpec(versionSpec) {
69 var parsedUrl = url.parse(versionSpec);
70
71 if(versionSpec == "latest" || versionSpec == "unstable" ||
72 versionSpec.substr(0, 2) == ".." || versionSpec.substr(0, 2) == "./" || versionSpec.substr(0, 2) == "~/" || versionSpec.substr(0, 1) == '/')
73 return '*';
74 else if(parsedUrl.protocol == "git:" || parsedUrl.protocol == "git+ssh:" || parsedUrl.protocol == "git+http:" || parsedUrl.protocol == "git+https:" ||
75 parsedUrl.protocol == "http:" || parsedUrl.protocol == "https:")
76 return '*';
77 else
78 return versionSpec;
79 }
80
81 var packageObj = JSON.parse(fs.readFileSync('./package.json'));
82
83 /* Replace dependencies */
84 if(packageObj.dependencies !== undefined) {
85 for(var dependency in packageObj.dependencies) {
86 var versionSpec = packageObj.dependencies[dependency];
87 packageObj.dependencies[dependency] = replaceImpureVersionSpec(versionSpec);
88 }
89 }
90
91 /* Replace development dependencies */
92 if(packageObj.devDependencies !== undefined) {
93 for(var dependency in packageObj.devDependencies) {
94 var versionSpec = packageObj.devDependencies[dependency];
95 packageObj.devDependencies[dependency] = replaceImpureVersionSpec(versionSpec);
96 }
97 }
98
99 /* Replace optional dependencies */
100 if(packageObj.optionalDependencies !== undefined) {
101 for(var dependency in packageObj.optionalDependencies) {
102 var versionSpec = packageObj.optionalDependencies[dependency];
103 packageObj.optionalDependencies[dependency] = replaceImpureVersionSpec(versionSpec);
104 }
105 }
106
107 /* Write the fixed JSON file */
108 fs.writeFileSync("package.json", JSON.stringify(packageObj));
109 '';
110 };
111 in
112 ''
113 DIR=$(pwd)
114 cd $TMPDIR
115
116 unpackFile ${src}
117
118 # Make the base dir in which the target dependency resides first
119 mkdir -p "$(dirname "$DIR/${packageName}")"
120
121 if [ -f "${src}" ]
122 then
123 # Figure out what directory has been unpacked
124 packageDir=$(find . -type d -maxdepth 1 | tail -1)
125
126 # Restore write permissions to make building work
127 chmod -R u+w "$packageDir"
128
129 # Move the extracted tarball into the output folder
130 mv "$packageDir" "$DIR/${packageName}"
131 elif [ -d "${src}" ]
132 then
133 # Restore write permissions to make building work
134 chmod -R u+w $strippedName
135
136 # Move the extracted directory into the output folder
137 mv $strippedName "$DIR/${packageName}"
138 fi
139
140 # Unset the stripped name to not confuse the next unpack step
141 unset strippedName
142
143 # Some version specifiers (latest, unstable, URLs, file paths) force NPM to make remote connections or consult paths outside the Nix store.
144 # The following JavaScript replaces these by * to prevent that
145 cd "$DIR/${packageName}"
146 node ${fixImpureDependencies}
147
148 # Include the dependencies of the package
149 ${includeDependencies { inherit dependencies; }}
150 cd ..
151 ${stdenv.lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
152 '';
153
154 # Extract the Node.js source code which is used to compile packages with
155 # native bindings
156 nodeSources = runCommand "node-sources" {} ''
157 tar --no-same-owner --no-same-permissions -xf ${nodejs.src}
158 mv node-* $out
159 '';
160
161 # Builds and composes an NPM package including all its dependencies
162 buildNodePackage = { name, packageName, version, dependencies ? [], production ? true, npmFlags ? "", dontNpmInstall ? false, preRebuild ? "", ... }@args:
163
164 stdenv.lib.makeOverridable stdenv.mkDerivation (builtins.removeAttrs args [ "dependencies" ] // {
165 name = "node-${name}-${version}";
166 buildInputs = [ tarWrapper python nodejs ] ++ stdenv.lib.optional (stdenv.isLinux) utillinux ++ args.buildInputs or [];
167 dontStrip = args.dontStrip or true; # Striping may fail a build for some package deployments
168
169 inherit dontNpmInstall preRebuild;
170
171 unpackPhase = args.unpackPhase or "true";
172
173 buildPhase = args.buildPhase or "true";
174
175 compositionScript = composePackage args;
176 passAsFile = [ "compositionScript" ];
177
178 installPhase = args.installPhase or ''
179 # Create and enter a root node_modules/ folder
180 mkdir -p $out/lib/node_modules
181 cd $out/lib/node_modules
182
183 # Compose the package and all its dependencies
184 source $compositionScriptPath
185
186 # Patch the shebangs of the bundled modules to prevent them from
187 # calling executables outside the Nix store as much as possible
188 patchShebangs .
189
190 # Deploy the Node.js package by running npm install. Since the
191 # dependencies have been provided already by ourselves, it should not
192 # attempt to install them again, which is good, because we want to make
193 # it Nix's responsibility. If it needs to install any dependencies
194 # anyway (e.g. because the dependency parameters are
195 # incomplete/incorrect), it fails.
196 #
197 # The other responsibilities of NPM are kept -- version checks, build
198 # steps, postprocessing etc.
199
200 export HOME=$TMPDIR
201 cd "${packageName}"
202 runHook preRebuild
203 npm --registry http://www.example.com --nodedir=${nodeSources} ${npmFlags} ${stdenv.lib.optionalString production "--production"} rebuild
204
205 if [ "$dontNpmInstall" != "1" ]
206 then
207 npm --registry http://www.example.com --nodedir=${nodeSources} ${npmFlags} ${stdenv.lib.optionalString production "--production"} install
208 fi
209
210 # Create symlink to the deployed executable folder, if applicable
211 if [ -d "$out/lib/node_modules/.bin" ]
212 then
213 ln -s $out/lib/node_modules/.bin $out/bin
214 fi
215
216 # Create symlinks to the deployed manual page folders, if applicable
217 if [ -d "$out/lib/node_modules/${packageName}/man" ]
218 then
219 mkdir -p $out/share
220 for dir in "$out/lib/node_modules/${packageName}/man/"*
221 do
222 mkdir -p $out/share/man/$(basename "$dir")
223 for page in "$dir"/*
224 do
225 ln -s $page $out/share/man/$(basename "$dir")
226 done
227 done
228 fi
229 '';
230 });
231
232 # Builds a development shell
233 buildNodeShell = { name, packageName, version, src, dependencies ? [], production ? true, npmFlags ? "", dontNpmInstall ? false, ... }@args:
234 let
235 nodeDependencies = stdenv.mkDerivation {
236 name = "node-dependencies-${name}-${version}";
237
238 buildInputs = [ tarWrapper python nodejs ] ++ stdenv.lib.optional (stdenv.isLinux) utillinux ++ args.buildInputs or [];
239
240 includeScript = includeDependencies { inherit dependencies; };
241 passAsFile = [ "includeScript" ];
242
243 buildCommand = ''
244 mkdir -p $out/lib
245 cd $out/lib
246 source $includeScriptPath
247
248 # Create fake package.json to make the npm commands work properly
249 cat > package.json <<EOF
250 {
251 "name": "${packageName}",
252 "version": "${version}"
253 }
254 EOF
255
256 # Patch the shebangs of the bundled modules to prevent them from
257 # calling executables outside the Nix store as much as possible
258 patchShebangs .
259
260 export HOME=$TMPDIR
261 npm --registry http://www.example.com --nodedir=${nodeSources} ${npmFlags} ${stdenv.lib.optionalString production "--production"} rebuild
262
263 ${stdenv.lib.optionalString (!dontNpmInstall) ''
264 npm --registry http://www.example.com --nodedir=${nodeSources} ${npmFlags} ${stdenv.lib.optionalString production "--production"} install
265 ''}
266
267 ln -s $out/lib/node_modules/.bin $out/bin
268 '';
269 };
270 in
271 stdenv.lib.makeOverridable stdenv.mkDerivation {
272 name = "node-shell-${name}-${version}";
273
274 buildInputs = [ python nodejs ] ++ stdenv.lib.optional (stdenv.isLinux) utillinux ++ args.buildInputs or [];
275 buildCommand = ''
276 mkdir -p $out/bin
277 cat > $out/bin/shell <<EOF
278 #! ${stdenv.shell} -e
279 $shellHook
280 exec ${stdenv.shell}
281 EOF
282 chmod +x $out/bin/shell
283 '';
284
285 # Provide the dependencies in a development shell through the NODE_PATH environment variable
286 inherit nodeDependencies;
287 shellHook = stdenv.lib.optionalString (dependencies != []) ''
288 export NODE_PATH=$nodeDependencies/lib/node_modules
289 '';
290 };
291 in
292 { inherit buildNodeSourceDist buildNodePackage buildNodeShell; }
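For reference, a hand-written sketch of how ``buildNodePackage`` from this ``node-env.nix`` might be invoked; in the real setup these calls are emitted by ``node2nix`` into ``node-packages.nix``, and the package name, version and local ``src`` path here are placeholders:

.. code:: nix

    { pkgs ? import <nixpkgs> {} }:

    let
      nodeEnv = import ./pkgs/node-env.nix {
        inherit (pkgs) stdenv python utillinux runCommand writeTextFile nodejs;
      };
    in nodeEnv.buildNodePackage {
      name = "example-tool";        # placeholder package
      packageName = "example-tool";
      version = "0.1.0";
      src = ./example-tool;         # local directory containing a package.json
      dependencies = [];            # normally filled in by node2nix
      production = true;
    }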
(Several additional new files in this commit are too big and their content was truncated.)
@@ -1,6 +1,6 b''
1 1 [bumpversion]
2 current_version = 4.3.1
2 current_version = 4.4.0
3 3 message = release: Bump version {current_version} to {new_version}
4 4
5 5 [bumpversion:file:rhodecode/VERSION]
6 6
@@ -1,59 +1,65 b''
1 1 syntax: glob
2 2 *.egg
3 3 *.egg-info
4 4 *.idea
5 5 *.orig
6 6 *.pyc
7 7 *.sqlite-journal
8 8 *.swp
9 9 *.tox
10 10 *.DS_Store*
11 rhodecode/public/js/src/components/**/*.css
11 12
12 13 syntax: regexp
13 14
14 15 #.filename
15 16 ^\.settings$
16 17 ^\.project$
17 18 ^\.pydevproject$
18 19 ^\.coverage$
19 20 ^\.cache.*$
20 21 ^\.rhodecode$
21 22
22 23 ^rcextensions
23 24 ^_dev
24 25 ^._dev
25 26 ^build/
27 ^bower_components/
26 28 ^coverage\.xml$
27 29 ^data$
28 30 ^\.eggs/
29 31 ^configs/data$
30 32 ^dev.ini$
31 33 ^acceptance_tests/dev.*\.ini$
32 34 ^dist/
33 35 ^fabfile.py
34 36 ^htmlcov
35 37 ^junit\.xml$
36 38 ^node_modules/
37 39 ^pylint.log$
38 40 ^rcextensions/
39 41 ^result$
40 42 ^rhodecode/public/css/style.css$
43 ^rhodecode/public/css/style-polymer.css$
44 ^rhodecode/public/js/rhodecode-components.html$
41 45 ^rhodecode/public/js/scripts.js$
46 ^rhodecode/public/js/src/components/root-styles.gen.html$
47 ^rhodecode/public/js/vendors/webcomponentsjs/
42 48 ^rhodecode\.db$
43 49 ^rhodecode\.log$
44 50 ^rhodecode_dev\.log$
45 51 ^test\.db$
46 52
47 53 # ac-tests
48 54 ^acceptance_tests/\.cache.*$
49 55 ^acceptance_tests/externals
50 56 ^acceptance_tests/ghostdriver.log$
51 57 ^acceptance_tests/local(_.+)?\.ini$
52 58
53 59 # docs
54 60 ^docs/_build$
55 61 ^docs/result$
56 62 ^docs-internal/_build$
57 63
58 64 # Cythonized things
59 65 ^rhodecode/.*\.(c|so)$
@@ -1,33 +1,28 b''
1 1 [DEFAULT]
2 2 done = false
3 3
4 4 [task:bump_version]
5 5 done = true
6 6
7 [task:rc_tools_pinned]
8 done = true
9
10 7 [task:fixes_on_stable]
11 done = true
12 8
13 9 [task:pip2nix_generated]
14 done = true
15 10
16 11 [task:changelog_updated]
17 done = true
18 12
19 13 [task:generate_api_docs]
20 done = true
14
15 [task:updated_translation]
21 16
22 17 [release]
23 state = prepared
24 version = 4.3.1
18 state = in_progress
19 version = 4.4.0
25 20
26 [task:updated_translation]
21 [task:rc_tools_pinned]
27 22
28 23 [task:generate_js_routes]
29 24
30 25 [task:updated_trial_license]
31 26
32 27 [task:generate_oss_licenses]
33 28
@@ -1,144 +1,15 b''
1 module.exports = function(grunt) {
2 grunt.initConfig({
3
4 dirs: {
5 css: "rhodecode/public/css",
6 js: {
7 "src": "rhodecode/public/js/src",
8 "dest": "rhodecode/public/js"
9 }
10 },
11
12 concat: {
13 dist: {
14 src: [
15 // Base libraries
16 '<%= dirs.js.src %>/jquery-1.11.1.min.js',
17 '<%= dirs.js.src %>/logging.js',
18 '<%= dirs.js.src %>/bootstrap.js',
19 '<%= dirs.js.src %>/mousetrap.js',
20 '<%= dirs.js.src %>/moment.js',
21 '<%= dirs.js.src %>/appenlight-client-0.4.1.min.js',
22 '<%= dirs.js.src %>/i18n_utils.js',
23 '<%= dirs.js.src %>/deform.js',
24
25 // Plugins
26 '<%= dirs.js.src %>/plugins/jquery.pjax.js',
27 '<%= dirs.js.src %>/plugins/jquery.dataTables.js',
28 '<%= dirs.js.src %>/plugins/flavoured_checkbox.js',
29 '<%= dirs.js.src %>/plugins/jquery.auto-grow-input.js',
30 '<%= dirs.js.src %>/plugins/jquery.autocomplete.js',
31 '<%= dirs.js.src %>/plugins/jquery.debounce.js',
32 '<%= dirs.js.src %>/plugins/jquery.mark.js',
33 '<%= dirs.js.src %>/plugins/jquery.timeago.js',
34 '<%= dirs.js.src %>/plugins/jquery.timeago-extension.js',
35 '<%= dirs.js.src %>/plugins/toastr.js',
36
37 // Select2
38 '<%= dirs.js.src %>/select2/select2.js',
39
40 // Code-mirror
41 '<%= dirs.js.src %>/codemirror/codemirror.js',
42 '<%= dirs.js.src %>/codemirror/codemirror_loadmode.js',
43 '<%= dirs.js.src %>/codemirror/codemirror_hint.js',
44 '<%= dirs.js.src %>/codemirror/codemirror_overlay.js',
45 '<%= dirs.js.src %>/codemirror/codemirror_placeholder.js',
46 // TODO: mikhail: this is an exception. Since the code mirror modes
47 // are loaded "on the fly", we need to keep them in a public folder
48 '<%= dirs.js.dest %>/mode/meta.js',
49 '<%= dirs.js.dest %>/mode/meta_ext.js',
50 '<%= dirs.js.dest %>/rhodecode/i18n/select2/translations.js',
51
52 // Rhodecode utilities
53 '<%= dirs.js.src %>/rhodecode/utils/array.js',
54 '<%= dirs.js.src %>/rhodecode/utils/string.js',
55 '<%= dirs.js.src %>/rhodecode/utils/pyroutes.js',
56 '<%= dirs.js.src %>/rhodecode/utils/ajax.js',
57 '<%= dirs.js.src %>/rhodecode/utils/autocomplete.js',
58 '<%= dirs.js.src %>/rhodecode/utils/colorgenerator.js',
59 '<%= dirs.js.src %>/rhodecode/utils/ie.js',
60 '<%= dirs.js.src %>/rhodecode/utils/os.js',
61 '<%= dirs.js.src %>/rhodecode/utils/topics.js',
62
63 // Rhodecode widgets
64 '<%= dirs.js.src %>/rhodecode/widgets/multiselect.js',
1 var gruntConfig = require('./grunt_config.json');
65 2
66 // Rhodecode components
67 '<%= dirs.js.src %>/rhodecode/init.js',
68 '<%= dirs.js.src %>/rhodecode/connection_controller.js',
69 '<%= dirs.js.src %>/rhodecode/codemirror.js',
70 '<%= dirs.js.src %>/rhodecode/comments.js',
71 '<%= dirs.js.src %>/rhodecode/constants.js',
72 '<%= dirs.js.src %>/rhodecode/files.js',
73 '<%= dirs.js.src %>/rhodecode/followers.js',
74 '<%= dirs.js.src %>/rhodecode/menus.js',
75 '<%= dirs.js.src %>/rhodecode/notifications.js',
76 '<%= dirs.js.src %>/rhodecode/permissions.js',
77 '<%= dirs.js.src %>/rhodecode/pjax.js',
78 '<%= dirs.js.src %>/rhodecode/pullrequests.js',
79 '<%= dirs.js.src %>/rhodecode/settings.js',
80 '<%= dirs.js.src %>/rhodecode/select2_widgets.js',
81 '<%= dirs.js.src %>/rhodecode/tooltips.js',
82 '<%= dirs.js.src %>/rhodecode/users.js',
83 '<%= dirs.js.src %>/rhodecode/utils/notifications.js',
84 '<%= dirs.js.src %>/rhodecode/appenlight.js',
85
86 // Rhodecode main module
87 '<%= dirs.js.src %>/rhodecode.js'
88 ],
89 dest: '<%= dirs.js.dest %>/scripts.js',
90 nonull: true
91 }
92 },
93
94 less: {
95 development: {
96 options: {
97 compress: false,
98 yuicompress: false,
99 optimization: 0
100 },
101 files: {
102 "<%= dirs.css %>/style.css": "<%= dirs.css %>/main.less"
103 }
104 },
105 production: {
106 options: {
107 compress: true,
108 yuicompress: true,
109 optimization: 2
110 },
111 files: {
112 "<%= dirs.css %>/style.css": "<%= dirs.css %>/main.less"
113 }
114 }
115 },
116
117 watch: {
118 less: {
119 files: ["<%= dirs.css %>/*.less"],
120 tasks: ["less:production"]
121 },
122 js: {
123 files: ["<%= dirs.js.src %>/**/*.js"],
124 tasks: ["concat:dist"]
125 }
126 },
127
128 jshint: {
129 rhodecode: {
130 src: '<%= dirs.js.src %>/rhodecode/**/*.js',
131 options: {
132 jshintrc: '.jshintrc'
133 }
134 }
135 }
136 });
3 module.exports = function(grunt) {
4 grunt.initConfig(gruntConfig);
137 5
138 6 grunt.loadNpmTasks('grunt-contrib-less');
139 7 grunt.loadNpmTasks('grunt-contrib-concat');
140 8 grunt.loadNpmTasks('grunt-contrib-watch');
141 9 grunt.loadNpmTasks('grunt-contrib-jshint');
10 grunt.loadNpmTasks('grunt-vulcanize');
11 grunt.loadNpmTasks('grunt-crisper');
12 grunt.loadNpmTasks('grunt-contrib-copy');
142 13
143 grunt.registerTask('default', ['less:production', 'concat:dist']);
14 grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy','vulcanize', 'crisper', 'concat:dist']);
144 15 };
@@ -1,52 +1,55 b''
1 1 # top level files
2 2 include test.ini
3 3 include MANIFEST.in
4 4 include README.rst
5 5 include CHANGES.rst
6 6 include LICENSE.txt
7 7
8 8 include rhodecode/VERSION
9 9
10 10 # docs
11 11 recursive-include docs *
12 12
13 13 # all config files
14 14 recursive-include configs *
15 15
16 16 # translations
17 17 recursive-include rhodecode/i18n *
18 18
19 19 # hook templates
20 20 recursive-include rhodecode/config/hook_templates *
21 21
22 22 # non-python core stuff
23 23 recursive-include rhodecode *.cfg
24 24 recursive-include rhodecode *.json
25 25 recursive-include rhodecode *.ini_tmpl
26 26 recursive-include rhodecode *.sh
27 27 recursive-include rhodecode *.mako
28 28
29 29 # 502 page
30 30 include rhodecode/public/502.html
31 31
32 # 502 page
33 include rhodecode/public/502.html
34
32 35 # images, css
33 36 include rhodecode/public/css/*.css
34 37 include rhodecode/public/images/*.*
35 38
36 39 # sound files
37 40 include rhodecode/public/sounds/*.mp3
38 41 include rhodecode/public/sounds/*.wav
39 42
40 43 # fonts
41 44 recursive-include rhodecode/public/fonts/ProximaNova *
42 45 recursive-include rhodecode/public/fonts/RCIcons *
43 46
44 47 # js
45 48 recursive-include rhodecode/public/js *
46 49
47 50 # templates
48 51 recursive-include rhodecode/templates *
49 52
50 53 # skip any tests files
51 54 recursive-exclude rhodecode/tests *
52 55
@@ -1,672 +1,672 b''
1 1
2 2
3 3 ################################################################################
4 4 ## RHODECODE ENTERPRISE CONFIGURATION ##
5 5 # The %(here)s variable will be replaced with the parent directory of this file#
6 6 ################################################################################
7 7
8 8 [DEFAULT]
9 9 debug = true
10 10
11 11 ################################################################################
12 12 ## EMAIL CONFIGURATION ##
13 13 ## Uncomment and replace with the email address which should receive ##
14 14 ## any error reports after an application crash ##
15 15 ## Additionally these settings will be used by the RhodeCode mailing system ##
16 16 ################################################################################
17 17
18 18 ## prefix all emails subjects with given prefix, helps filtering out emails
19 19 #email_prefix = [RhodeCode]
20 20
21 21 ## email FROM address all mails will be sent
22 22 #app_email_from = rhodecode-noreply@localhost
23 23
24 24 ## Uncomment and replace with the address which should receive any error report
25 25 ## note: using appenlight for error handling doesn't need this to be uncommented
26 26 #email_to = admin@localhost
27 27
28 28 ## in case of Application errors, send the error email from this address
29 29 #error_email_from = rhodecode_error@localhost
30 30
31 31 ## additional error message to be send in case of server crash
32 32 #error_message =
33 33
34 34
35 35 #smtp_server = mail.server.com
36 36 #smtp_username =
37 37 #smtp_password =
38 38 #smtp_port =
39 39 #smtp_use_tls = false
40 40 #smtp_use_ssl = true
41 41 ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.)
42 42 #smtp_auth =
43 43
44 44 [server:main]
45 45 ## COMMON ##
46 46 host = 127.0.0.1
47 47 port = 5000
48 48
49 49 ##################################
50 50 ## WAITRESS WSGI SERVER ##
51 51 ## Recommended for Development ##
52 52 ##################################
53 53
54 54 use = egg:waitress#main
55 55 ## number of worker threads
56 56 threads = 5
57 57 ## MAX BODY SIZE 100GB
58 58 max_request_body_size = 107374182400
59 59 ## Use poll instead of select, fixes file descriptors limits problems.
60 60 ## May not work on old windows systems.
61 61 asyncore_use_poll = true
62 62
63 63
64 64 ##########################
65 65 ## GUNICORN WSGI SERVER ##
66 66 ##########################
67 67 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
68 68
69 69 #use = egg:gunicorn#main
70 70 ## Sets the number of process workers. You must set `instance_id = *`
71 71 ## when this option is set to more than one worker, recommended
72 72 ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers
73 73 ## The `instance_id = *` must be set in the [app:main] section below
74 74 #workers = 2
75 75 ## number of threads for each of the worker, must be set to 1 for gevent
76 76 ## generally recommended to be at 1
77 77 #threads = 1
78 78 ## process name
79 79 #proc_name = rhodecode
80 80 ## type of worker class, one of sync, gevent
81 81 ## for bigger setups it is recommended to use a worker class other than sync
82 82 #worker_class = sync
83 83 ## The maximum number of simultaneous clients. Valid only for Gevent
84 84 #worker_connections = 10
85 85 ## max number of requests that worker will handle before being gracefully
86 86 ## restarted, could prevent memory leaks
87 87 #max_requests = 1000
88 88 #max_requests_jitter = 30
89 89 ## amount of time a worker can spend with handling a request before it
90 90 ## gets killed and restarted. Set to 6hrs
91 91 #timeout = 21600
92 92
93 93
94 94 ## prefix middleware for RhodeCode, disables force_https flag.
95 95 ## recommended when using proxy setup.
96 96 ## allows to set RhodeCode under a prefix in server.
97 97 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
98 98 ## optionally set prefix like: `prefix = /<your-prefix>`
99 99 [filter:proxy-prefix]
100 100 use = egg:PasteDeploy#prefix
101 101 prefix = /
102 102
103 103 [app:main]
104 104 use = egg:rhodecode-enterprise-ce
105 105
106 106 ## enable proxy prefix middleware, defined above
107 107 #filter-with = proxy-prefix
108 108
109 109 # During development we want to have the debug toolbar enabled
110 110 pyramid.includes =
111 111 pyramid_debugtoolbar
112 112 rhodecode.utils.debugtoolbar
113 113 rhodecode.lib.middleware.request_wrapper
114 114
115 115 pyramid.reload_templates = true
116 116
117 117 debugtoolbar.hosts = 0.0.0.0/0
118 118 debugtoolbar.exclude_prefixes =
119 119 /css
120 120 /fonts
121 121 /images
122 122 /js
123 123
124 124 ## RHODECODE PLUGINS ##
125 125 rhodecode.includes =
126 126 rhodecode.api
127 127
128 128
129 129 # api prefix url
130 130 rhodecode.api.url = /_admin/api
131 131
132 132
133 133 ## END RHODECODE PLUGINS ##
134 134
135 135 ## encryption key used to encrypt social plugin tokens,
136 136 ## remote_urls with credentials etc, if not set it defaults to
137 137 ## `beaker.session.secret`
138 138 #rhodecode.encrypted_values.secret =
139 139
140 140 ## decryption strict mode (enabled by default). It controls if decryption raises
141 141 ## `SignatureVerificationError` in case of wrong key, or damaged encryption data.
142 142 #rhodecode.encrypted_values.strict = false
143 143
144 144 ## return gzipped responses from Rhodecode (static files/application)
145 145 gzip_responses = false
146 146
147 147 ## autogenerate javascript routes file on startup
148 148 generate_js_files = false
149 149
150 150 ## Optional Languages
151 151 ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh
152 152 lang = en
153 153
154 154 ## perform a full repository scan on each server start, this should be
155 155 ## set to false after first startup, to allow faster server restarts.
156 156 startup.import_repos = false
157 157
158 158 ## Uncomment and set this path to use archive download cache.
159 159 ## Once enabled, generated archives will be cached at this location
160 160 ## and served from the cache during subsequent requests for the same archive of
161 161 ## the repository.
162 162 #archive_cache_dir = /tmp/tarballcache
163 163
164 164 ## change this to unique ID for security
165 165 app_instance_uuid = rc-production
166 166
167 167 ## cut off limit for large diffs (size in bytes)
168 168 cut_off_limit_diff = 1024000
169 169 cut_off_limit_file = 256000
170 170
171 171 ## use cache version of scm repo everywhere
172 172 vcs_full_cache = true
173 173
174 174 ## force https in RhodeCode, fixes https redirects, assumes it's always https
175 175 ## Normally this is controlled by proper http flags sent from http server
176 176 force_https = false
177 177
178 178 ## use Strict-Transport-Security headers
179 179 use_htsts = false
180 180
181 181 ## number of commits stats will parse on each iteration
182 182 commit_parse_limit = 25
183 183
184 184 ## git rev filter option, --all is the default filter, if you need to
185 185 ## hide all refs in changelog switch this to --branches --tags
186 186 git_rev_filter = --branches --tags
187 187
188 188 # Set to true if your repos are exposed using the dumb protocol
189 189 git_update_server_info = false
190 190
191 191 ## RSS/ATOM feed options
192 192 rss_cut_off_limit = 256000
193 193 rss_items_per_page = 10
194 194 rss_include_diff = false
195 195
196 196 ## gist URL alias, used to create nicer urls for gist. This should be an
197 197 ## url that does rewrites to _admin/gists/<gistid>.
198 198 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
199 199 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
200 200 gist_alias_url =
201 201
202 202 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
203 203 ## used for access.
204 204 ## Adding ?auth_token = <token> to the url authenticates this request as if it
205 205 ## came from the logged-in user who owns this authentication token.
206 206 ##
207 207 ## Syntax is <ControllerClass>:<function_pattern>.
208 208 ## To enable access to raw_files put `FilesController:raw`.
209 209 ## To enable access to patches add `ChangesetController:changeset_patch`.
210 210 ## The list should be "," separated and on a single line.
211 211 ##
212 212 ## Recommended controllers to enable:
213 213 # ChangesetController:changeset_patch,
214 214 # ChangesetController:changeset_raw,
215 215 # FilesController:raw,
216 216 # FilesController:archivefile,
217 217 # GistsController:*,
218 218 api_access_controllers_whitelist =
219 219
220 220 ## default encoding used to convert from and to unicode
221 221 ## can be also a comma separated list of encoding in case of mixed encodings
222 222 default_encoding = UTF-8
223 223
224 224 ## instance-id prefix
225 225 ## a prefix key for this instance used for cache invalidation when running
226 226 ## multiple instances of rhodecode, make sure it's globally unique for
227 227 ## all running rhodecode instances. Leave empty if you don't use it
228 228 instance_id =
229 229
230 230 ## Fallback authentication plugin. Set this to a plugin ID to force the usage
231 231 ## of an authentication plugin even if it is disabled by its settings.
232 232 ## This could be useful if you are unable to log in to the system due to broken
233 233 ## authentication settings. Then you can enable e.g. the internal rhodecode auth
234 234 ## module to log in again and fix the settings.
235 235 ##
236 236 ## Available builtin plugin IDs (hash is part of the ID):
237 237 ## egg:rhodecode-enterprise-ce#rhodecode
238 238 ## egg:rhodecode-enterprise-ce#pam
239 239 ## egg:rhodecode-enterprise-ce#ldap
240 240 ## egg:rhodecode-enterprise-ce#jasig_cas
241 241 ## egg:rhodecode-enterprise-ce#headers
242 242 ## egg:rhodecode-enterprise-ce#crowd
243 243 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
244 244
245 245 ## alternative return HTTP header for failed authentication. Default HTTP
246 246 ## response is 401 HTTPUnauthorized. Currently HG clients have troubles with
247 247 ## handling that causing a series of failed authentication calls.
248 248 ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code
249 249 ## This will be served instead of default 401 on bad authentication
250 250 auth_ret_code =
251 251
252 252 ## use special detection method when serving auth_ret_code, instead of serving
253 253 ## ret_code directly, use 401 initially (Which triggers credentials prompt)
254 254 ## and then serve auth_ret_code to clients
255 255 auth_ret_code_detection = false
256 256
257 257 ## locking return code. When repository is locked return this HTTP code. 2XX
258 258 ## codes don't break the transactions while 4XX codes do
259 259 lock_ret_code = 423
260 260
261 261 ## allows to change the repository location in settings page
262 262 allow_repo_location_change = true
263 263
264 264 ## allows to setup custom hooks in settings page
265 265 allow_custom_hooks_settings = true
266 266
267 267 ## generated license token, goto license page in RhodeCode settings to obtain
268 268 ## new token
269 269 license_token =
270 270
271 271 ## supervisor connection uri, for managing supervisor and logs.
272 272 supervisor.uri =
273 273 ## supervisord group name/id we only want this RC instance to handle
274 274 supervisor.group_id = dev
275 275
276 276 ## Display extended labs settings
277 277 labs_settings_active = true
278 278
279 279 ####################################
280 280 ### CELERY CONFIG ####
281 281 ####################################
282 282 use_celery = false
283 283 broker.host = localhost
284 284 broker.vhost = rabbitmqhost
285 285 broker.port = 5672
286 286 broker.user = rabbitmq
287 287 broker.password = qweqwe
288 288
289 289 celery.imports = rhodecode.lib.celerylib.tasks
290 290
291 291 celery.result.backend = amqp
292 292 celery.result.dburi = amqp://
293 293 celery.result.serialier = json
294 294
295 295 #celery.send.task.error.emails = true
296 296 #celery.amqp.task.result.expires = 18000
297 297
298 298 celeryd.concurrency = 2
299 299 #celeryd.log.file = celeryd.log
300 300 celeryd.log.level = debug
301 301 celeryd.max.tasks.per.child = 1
302 302
303 303 ## tasks will never be sent to the queue, but executed locally instead.
304 304 celery.always.eager = false
305 305
306 306 ####################################
307 307 ### BEAKER CACHE ####
308 308 ####################################
309 309 # default cache dir for templates. Putting this into a ramdisk
310 310 ## can boost performance, eg. %(here)s/data_ramdisk
311 311 cache_dir = %(here)s/data
312 312
313 313 ## locking and default file storage for Beaker. Putting this into a ramdisk
314 314 ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data
315 315 beaker.cache.data_dir = %(here)s/data/cache/beaker_data
316 316 beaker.cache.lock_dir = %(here)s/data/cache/beaker_lock
317 317
318 318 beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long
319 319
320 320 beaker.cache.super_short_term.type = memory
321 321 beaker.cache.super_short_term.expire = 10
322 322 beaker.cache.super_short_term.key_length = 256
323 323
324 324 beaker.cache.short_term.type = memory
325 325 beaker.cache.short_term.expire = 60
326 326 beaker.cache.short_term.key_length = 256
327 327
328 328 beaker.cache.long_term.type = memory
329 329 beaker.cache.long_term.expire = 36000
330 330 beaker.cache.long_term.key_length = 256
331 331
332 332 beaker.cache.sql_cache_short.type = memory
333 333 beaker.cache.sql_cache_short.expire = 10
334 334 beaker.cache.sql_cache_short.key_length = 256
335 335
336 336 ## default is memory cache, configure only if required
337 337 ## using multi-node or multi-worker setup
338 338 #beaker.cache.auth_plugins.type = ext:database
339 339 #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock
340 340 #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode
341 341 #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode
342 342 #beaker.cache.auth_plugins.sa.pool_recycle = 3600
343 343 #beaker.cache.auth_plugins.sa.pool_size = 10
344 344 #beaker.cache.auth_plugins.sa.max_overflow = 0
345 345
346 346 beaker.cache.repo_cache_long.type = memorylru_base
347 347 beaker.cache.repo_cache_long.max_items = 4096
348 348 beaker.cache.repo_cache_long.expire = 2592000
349 349
350 350 ## default is memorylru_base cache, configure only if required
351 351 ## using multi-node or multi-worker setup
352 352 #beaker.cache.repo_cache_long.type = ext:memcached
353 353 #beaker.cache.repo_cache_long.url = localhost:11211
354 354 #beaker.cache.repo_cache_long.expire = 1209600
355 355 #beaker.cache.repo_cache_long.key_length = 256
356 356
357 357 ####################################
358 358 ### BEAKER SESSION ####
359 359 ####################################
360 360
361 361 ## .session.type is type of storage options for the session, current allowed
362 362 ## types are file, ext:memcached, ext:database, and memory (default).
363 363 beaker.session.type = file
364 364 beaker.session.data_dir = %(here)s/data/sessions/data
365 365
366 366 ## db based session, fast, and allows easy management over logged in users
367 367 #beaker.session.type = ext:database
368 368 #beaker.session.table_name = db_session
369 369 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
370 370 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
371 371 #beaker.session.sa.pool_recycle = 3600
372 372 #beaker.session.sa.echo = false
373 373
374 374 beaker.session.key = rhodecode
375 375 beaker.session.secret = develop-rc-uytcxaz
376 376 beaker.session.lock_dir = %(here)s/data/sessions/lock
377 377
378 378 ## Secure encrypted cookie. Requires AES and AES python libraries
379 379 ## you must disable beaker.session.secret to use this
380 380 #beaker.session.encrypt_key = <key_for_encryption>
381 381 #beaker.session.validate_key = <validation_key>
382 382
383 383 ## sets the session as invalid (also logging out the user) if it has not been
384 384 ## accessed for the given amount of time in seconds
385 385 beaker.session.timeout = 2592000
386 386 beaker.session.httponly = true
387 387 ## Path to use for the cookie.
388 388 #beaker.session.cookie_path = /<your-prefix>
389 389
390 390 ## uncomment for https secure cookie
391 391 beaker.session.secure = false
392 392
393 393 ## auto save the session so that calling .save() is not required
394 394 beaker.session.auto = false
395 395
396 396 ## default cookie expiration time in seconds, set to `true` to set expire
397 397 ## at browser close
398 398 #beaker.session.cookie_expires = 3600
399 399
400 400 ###################################
401 401 ## SEARCH INDEXING CONFIGURATION ##
402 402 ###################################
403 403 ## Full text search indexer is available in rhodecode-tools under
404 404 ## `rhodecode-tools index` command
405 405
406 406 # WHOOSH Backend, doesn't require additional services to run
407 407 # it works well with a few dozen repos
408 408 search.module = rhodecode.lib.index.whoosh
409 409 search.location = %(here)s/data/index
410 410
411 411 ########################################
412 412 ### CHANNELSTREAM CONFIG ####
413 413 ########################################
414 414 ## channelstream enables persistent connections and live notification
415 415 ## in the system. It's also used by the chat system
416 416
417 channelstream.enabled = true
417 channelstream.enabled = false
418 418 ## location of channelstream server on the backend
419 419 channelstream.server = 127.0.0.1:9800
420 420 ## location of the channelstream server from outside world
421 421 ## most likely this would be a special HTTP server backend URL that handles
422 422 ## websocket connections; see the nginx example config
423 423 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
424 424 channelstream.secret = secret
425 425 channelstream.history.location = %(here)s/channelstream_history
426 426
427 427
428 428 ###################################
429 429 ## APPENLIGHT CONFIG ##
430 430 ###################################
431 431
432 432 ## Appenlight is tailored to work with RhodeCode, see
433 433 ## http://appenlight.com for details how to obtain an account
434 434
435 435 ## appenlight integration enabled
436 436 appenlight = false
437 437
438 438 appenlight.server_url = https://api.appenlight.com
439 439 appenlight.api_key = YOUR_API_KEY
440 440 #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5
441 441
442 442 # used for JS client
443 443 appenlight.api_public_key = YOUR_API_PUBLIC_KEY
444 444
445 445 ## TWEAK AMOUNT OF INFO SENT HERE
446 446
447 447 ## enables 404 error logging (default False)
448 448 appenlight.report_404 = false
449 449
450 450 ## time in seconds after request is considered being slow (default 1)
451 451 appenlight.slow_request_time = 1
452 452
453 453 ## record slow requests in application
454 454 ## (needs to be enabled for slow datastore recording and time tracking)
455 455 appenlight.slow_requests = true
456 456
457 457 ## enable hooking to application loggers
458 458 appenlight.logging = true
459 459
460 460 ## minimum log level for log capture
461 461 appenlight.logging.level = WARNING
462 462
463 463 ## send logs only from erroneous/slow requests
464 464 ## (saves API quota for intensive logging)
465 465 appenlight.logging_on_error = false
466 466
467 467 ## list of additional keywords that should be grabbed from the environ object
468 468 ## can be a string with a comma separated list of words in lowercase
469 469 ## (by default the client will always send the following info:
470 470 ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
471 471 ## start with HTTP*); this list can be extended with additional keywords here
472 472 appenlight.environ_keys_whitelist =
473 473
474 474 ## list of keywords that should be blanked from the request object
475 475 ## can be a string with a comma separated list of words in lowercase
476 476 ## (by default the client will always blank keys that contain the following words:
477 477 ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf');
478 478 ## this list can be extended with additional keywords set here
479 479 appenlight.request_keys_blacklist =
480 480
481 481 ## list of namespaces that should be ignored when gathering log entries
482 482 ## can be a string with a comma separated list of namespaces
483 483 ## (by default the client ignores its own entries: appenlight_client.client)
484 484 appenlight.log_namespace_blacklist =
485 485
486 486
487 487 ################################################################################
488 488 ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ##
489 489 ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ##
490 490 ## execute malicious code after an exception is raised. ##
491 491 ################################################################################
492 492 #set debug = false
493 493
494 494
495 495 ##############
496 496 ## STYLING ##
497 497 ##############
498 498 debug_style = true
499 499
500 500 #########################################################
501 501 ### DB CONFIGS - EACH DB WILL HAVE IT'S OWN CONFIG ###
502 502 #########################################################
503 503 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
504 504 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
505 505 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode
506 506 sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
507 507
508 508 # see sqlalchemy docs for other advanced settings
509 509
510 510 ## print the sql statements to output
511 511 sqlalchemy.db1.echo = false
512 512 ## recycle the connections after this amount of seconds
513 513 sqlalchemy.db1.pool_recycle = 3600
514 514 sqlalchemy.db1.convert_unicode = true
515 515
516 516 ## the number of connections to keep open inside the connection pool.
517 517 ## 0 indicates no limit
518 518 #sqlalchemy.db1.pool_size = 5
519 519
520 520 ## the number of connections to allow in connection pool "overflow", that is
521 521 ## connections that can be opened above and beyond the pool_size setting,
522 522 ## which defaults to five.
523 523 #sqlalchemy.db1.max_overflow = 10
524 524
525 525
526 526 ##################
527 527 ### VCS CONFIG ###
528 528 ##################
529 529 vcs.server.enable = true
530 530 vcs.server = localhost:9900
531 531
532 532 ## Web server connectivity protocol, responsible for web based VCS operations
533 533 ## Available protocols are:
534 534 ## `pyro4` - using pyro4 server
535 535 ## `http` - using http-rpc backend
536 536 vcs.server.protocol = http
537 537
538 538 ## Push/Pull operations protocol, available options are:
539 539 ## `pyro4` - using pyro4 server
540 540 ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
541 541 ## `vcsserver.scm_app` - internal app (EE only)
542 542 vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http
543 543
544 544 ## Push/Pull operations hooks protocol, available options are:
545 545 ## `pyro4` - using pyro4 server
546 546 ## `http` - using http-rpc backend
547 547 vcs.hooks.protocol = http
548 548
549 549 vcs.server.log_level = debug
550 550 ## Start VCSServer with this instance as a subprocess, useful for development
551 551 vcs.start_server = true
552 552
553 553 ## List of enabled VCS backends, available options are:
554 554 ## `hg` - mercurial
555 555 ## `git` - git
556 556 ## `svn` - subversion
557 557 vcs.backends = hg, git, svn
558 558
559 559 vcs.connection_timeout = 3600
560 560 ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
561 561 ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible
562 562 #vcs.svn.compatible_version = pre-1.8-compatible
563 563
564 564
565 565 ############################################################
566 566 ### Subversion proxy support (mod_dav_svn) ###
567 567 ### Maps RhodeCode repo groups into SVN paths for Apache ###
568 568 ############################################################
569 569 ## Enable or disable the config file generation.
570 570 svn.proxy.generate_config = false
571 571 ## Generate config file with `SVNListParentPath` set to `On`.
572 572 svn.proxy.list_parent_path = true
573 573 ## Set location and file name of generated config file.
574 574 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
575 575 ## File system path to the directory containing the repositories served by
576 576 ## RhodeCode.
577 577 svn.proxy.parent_path_root = /path/to/repo_store
578 578 ## Used as a prefix to the <Location> block in the generated config file. In
579 579 ## most cases it should be set to `/`.
580 580 svn.proxy.location_root = /
581 581
582 582
583 583 ################################
584 584 ### LOGGING CONFIGURATION ####
585 585 ################################
586 586 [loggers]
587 587 keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates
588 588
589 589 [handlers]
590 590 keys = console, console_sql
591 591
592 592 [formatters]
593 593 keys = generic, color_formatter, color_formatter_sql
594 594
595 595 #############
596 596 ## LOGGERS ##
597 597 #############
598 598 [logger_root]
599 599 level = NOTSET
600 600 handlers = console
601 601
602 602 [logger_routes]
603 603 level = DEBUG
604 604 handlers =
605 605 qualname = routes.middleware
606 606 ## "level = DEBUG" logs the route matched and routing variables.
607 607 propagate = 1
608 608
609 609 [logger_beaker]
610 610 level = DEBUG
611 611 handlers =
612 612 qualname = beaker.container
613 613 propagate = 1
614 614
615 615 [logger_pyro4]
616 616 level = DEBUG
617 617 handlers =
618 618 qualname = Pyro4
619 619 propagate = 1
620 620
621 621 [logger_templates]
622 622 level = INFO
623 623 handlers =
624 624 qualname = pylons.templating
625 625 propagate = 1
626 626
627 627 [logger_rhodecode]
628 628 level = DEBUG
629 629 handlers =
630 630 qualname = rhodecode
631 631 propagate = 1
632 632
633 633 [logger_sqlalchemy]
634 634 level = INFO
635 635 handlers = console_sql
636 636 qualname = sqlalchemy.engine
637 637 propagate = 0
638 638
639 639 ##############
640 640 ## HANDLERS ##
641 641 ##############
642 642
643 643 [handler_console]
644 644 class = StreamHandler
645 645 args = (sys.stderr,)
646 646 level = DEBUG
647 647 formatter = color_formatter
648 648
649 649 [handler_console_sql]
650 650 class = StreamHandler
651 651 args = (sys.stderr,)
652 652 level = DEBUG
653 653 formatter = color_formatter_sql
654 654
655 655 ################
656 656 ## FORMATTERS ##
657 657 ################
658 658
659 659 [formatter_generic]
660 660 class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter
661 661 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
662 662 datefmt = %Y-%m-%d %H:%M:%S
663 663
664 664 [formatter_color_formatter]
665 665 class = rhodecode.lib.logging_formatter.ColorFormatter
666 666 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
667 667 datefmt = %Y-%m-%d %H:%M:%S
668 668
669 669 [formatter_color_formatter_sql]
670 670 class = rhodecode.lib.logging_formatter.ColorFormatterSql
671 671 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
672 672 datefmt = %Y-%m-%d %H:%M:%S
@@ -1,641 +1,641 b''
1 1
2 2
3 3 ################################################################################
4 4 ## RHODECODE ENTERPRISE CONFIGURATION ##
5 5 # The %(here)s variable will be replaced with the parent directory of this file#
6 6 ################################################################################
7 7
8 8 [DEFAULT]
9 9 debug = true
10 10
11 11 ################################################################################
12 12 ## EMAIL CONFIGURATION ##
13 13 ## Uncomment and replace with the email address which should receive ##
14 14 ## any error reports after an application crash ##
15 15 ## Additionally these settings will be used by the RhodeCode mailing system ##
16 16 ################################################################################
17 17
18 18 ## prefix all emails subjects with given prefix, helps filtering out emails
19 19 #email_prefix = [RhodeCode]
20 20
21 21 ## email FROM address all mails will be sent
22 22 #app_email_from = rhodecode-noreply@localhost
23 23
24 24 ## Uncomment and replace with the address which should receive any error report
25 25 ## note: using appenlight for error handling doesn't need this to be uncommented
26 26 #email_to = admin@localhost
27 27
28 28 ## in case of Application errors, send an error email from this address
29 29 #error_email_from = rhodecode_error@localhost
30 30
31 31 ## additional error message to be sent in case of server crash
32 32 #error_message =
33 33
34 34
35 35 #smtp_server = mail.server.com
36 36 #smtp_username =
37 37 #smtp_password =
38 38 #smtp_port =
39 39 #smtp_use_tls = false
40 40 #smtp_use_ssl = true
41 41 ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.)
42 42 #smtp_auth =
43 43
44 44 [server:main]
45 45 ## COMMON ##
46 46 host = 127.0.0.1
47 47 port = 5000
48 48
49 49 ##################################
50 50 ## WAITRESS WSGI SERVER ##
51 51 ## Recommended for Development ##
52 52 ##################################
53 53
54 54 #use = egg:waitress#main
55 55 ## number of worker threads
56 56 #threads = 5
57 57 ## MAX BODY SIZE 100GB
58 58 #max_request_body_size = 107374182400
59 59 ## Use poll instead of select, fixes file descriptors limits problems.
60 60 ## May not work on old windows systems.
61 61 #asyncore_use_poll = true
62 62
63 63
64 64 ##########################
65 65 ## GUNICORN WSGI SERVER ##
66 66 ##########################
67 67 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
68 68
69 69 use = egg:gunicorn#main
70 70 ## Sets the number of process workers. You must set `instance_id = *`
71 71 ## when this option is set to more than one worker, recommended
72 72 ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers
73 73 ## The `instance_id = *` must be set in the [app:main] section below
74 74 workers = 2
75 75 ## number of threads for each of the workers, must be set to 1 for gevent
76 76 ## generally recommended to be 1
77 77 #threads = 1
78 78 ## process name
79 79 proc_name = rhodecode
80 80 ## type of worker class, one of sync, gevent
81 81 ## for a bigger setup, using a worker class other than sync is recommended
82 82 worker_class = sync
83 83 ## The maximum number of simultaneous clients. Valid only for Gevent
84 84 #worker_connections = 10
85 85 ## max number of requests that worker will handle before being gracefully
86 86 ## restarted, could prevent memory leaks
87 87 max_requests = 1000
88 88 max_requests_jitter = 30
89 89 ## amount of time a worker can spend with handling a request before it
90 90 ## gets killed and restarted. Set to 6hrs
91 91 timeout = 21600
92 92
93 93
94 94 ## prefix middleware for RhodeCode, disables force_https flag.
95 95 ## recommended when using proxy setup.
96 96 ## allows to set RhodeCode under a prefix in server.
97 97 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
98 98 ## optionally set prefix like: `prefix = /<your-prefix>`
99 99 [filter:proxy-prefix]
100 100 use = egg:PasteDeploy#prefix
101 101 prefix = /
102 102
103 103 [app:main]
104 104 use = egg:rhodecode-enterprise-ce
105 105
106 106 ## enable proxy prefix middleware, defined above
107 107 #filter-with = proxy-prefix
108 108
109 109 ## encryption key used to encrypt social plugin tokens,
110 110 ## remote_urls with credentials etc, if not set it defaults to
111 111 ## `beaker.session.secret`
112 112 #rhodecode.encrypted_values.secret =
113 113
114 114 ## decryption strict mode (enabled by default). It controls if decryption raises
115 115 ## `SignatureVerificationError` in case of wrong key, or damaged encryption data.
116 116 #rhodecode.encrypted_values.strict = false
117 117
118 118 ## return gzipped responses from Rhodecode (static files/application)
119 119 gzip_responses = false
120 120
121 121 ## autogenerate javascript routes file on startup
122 122 generate_js_files = false
123 123
124 124 ## Optional Languages
125 125 ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh
126 126 lang = en
127 127
128 128 ## perform a full repository scan on each server start, this should be
129 129 ## set to false after first startup, to allow faster server restarts.
130 130 startup.import_repos = false
131 131
132 132 ## Uncomment and set this path to use archive download cache.
133 133 ## Once enabled, generated archives will be cached at this location
134 134 ## and served from the cache during subsequent requests for the same archive of
135 135 ## the repository.
136 136 #archive_cache_dir = /tmp/tarballcache
137 137
138 138 ## change this to unique ID for security
139 139 app_instance_uuid = rc-production
140 140
141 141 ## cut off limit for large diffs (size in bytes)
142 142 cut_off_limit_diff = 1024000
143 143 cut_off_limit_file = 256000
144 144
145 145 ## use cache version of scm repo everywhere
146 146 vcs_full_cache = true
147 147
148 148 ## force https in RhodeCode, fixes https redirects, assumes it's always https
149 149 ## Normally this is controlled by proper http flags sent from http server
150 150 force_https = false
151 151
152 152 ## use Strict-Transport-Security headers
153 153 use_htsts = false
154 154
155 155 ## number of commits stats will parse on each iteration
156 156 commit_parse_limit = 25
157 157
158 158 ## git rev filter option, --all is the default filter, if you need to
159 159 ## hide all refs in changelog switch this to --branches --tags
160 160 git_rev_filter = --branches --tags
161 161
162 162 # Set to true if your repos are exposed using the dumb protocol
163 163 git_update_server_info = false
164 164
165 165 ## RSS/ATOM feed options
166 166 rss_cut_off_limit = 256000
167 167 rss_items_per_page = 10
168 168 rss_include_diff = false
169 169
170 170 ## gist URL alias, used to create nicer urls for gist. This should be an
171 171 ## url that does rewrites to _admin/gists/<gistid>.
172 172 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
173 173 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
174 174 gist_alias_url =
175 175
176 176 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
177 177 ## used for access.
178 178 ## Adding ?auth_token = <token> to the url authenticates this request as if it
179 179 ## came from the logged in user who owns this authentication token.
180 180 ##
181 181 ## Syntax is <ControllerClass>:<function_pattern>.
182 182 ## To enable access to raw_files put `FilesController:raw`.
183 183 ## To enable access to patches add `ChangesetController:changeset_patch`.
184 184 ## The list should be "," separated and on a single line.
185 185 ##
186 186 ## Recommended controllers to enable:
187 187 # ChangesetController:changeset_patch,
188 188 # ChangesetController:changeset_raw,
189 189 # FilesController:raw,
190 190 # FilesController:archivefile,
191 191 # GistsController:*,
192 192 api_access_controllers_whitelist =
193 193
194 194 ## default encoding used to convert from and to unicode
195 195 ## can be also a comma separated list of encoding in case of mixed encodings
196 196 default_encoding = UTF-8
197 197
198 198 ## instance-id prefix
199 199 ## a prefix key for this instance used for cache invalidation when running
200 200 ## multiple instances of rhodecode, make sure it's globally unique for
201 201 ## all running rhodecode instances. Leave empty if you don't use it
202 202 instance_id =
203 203
204 204 ## Fallback authentication plugin. Set this to a plugin ID to force the usage
205 205 ## of an authentication plugin even if it is disabled by its settings.
206 206 ## This could be useful if you are unable to log in to the system due to broken
207 207 ## authentication settings. Then you can enable e.g. the internal rhodecode auth
208 208 ## module to log in again and fix the settings.
209 209 ##
210 210 ## Available builtin plugin IDs (hash is part of the ID):
211 211 ## egg:rhodecode-enterprise-ce#rhodecode
212 212 ## egg:rhodecode-enterprise-ce#pam
213 213 ## egg:rhodecode-enterprise-ce#ldap
214 214 ## egg:rhodecode-enterprise-ce#jasig_cas
215 215 ## egg:rhodecode-enterprise-ce#headers
216 216 ## egg:rhodecode-enterprise-ce#crowd
217 217 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
218 218
219 219 ## alternative return HTTP header for failed authentication. Default HTTP
220 220 ## response is 401 HTTPUnauthorized. Currently HG clients have trouble
221 221 ## handling that, causing a series of failed authentication calls.
222 222 ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code
223 223 ## This will be served instead of the default 401 on bad authentication
224 224 auth_ret_code =
225 225
226 226 ## use special detection method when serving auth_ret_code, instead of serving
227 227 ## ret_code directly, use 401 initially (Which triggers credentials prompt)
228 228 ## and then serve auth_ret_code to clients
229 229 auth_ret_code_detection = false
230 230
231 231 ## locking return code. When repository is locked return this HTTP code. 2XX
232 232 ## codes don't break the transactions while 4XX codes do
233 233 lock_ret_code = 423
234 234
235 235 ## allows changing the repository location in the settings page
236 236 allow_repo_location_change = true
237 237
238 238 ## allows setting up custom hooks in the settings page
239 239 allow_custom_hooks_settings = true
240 240
241 241 ## generated license token; go to the license page in RhodeCode settings to obtain
242 242 ## new token
243 243 license_token =
244 244
245 245 ## supervisor connection uri, for managing supervisor and logs.
246 246 supervisor.uri =
247 247 ## supervisord group name/id we only want this RC instance to handle
248 248 supervisor.group_id = prod
249 249
250 250 ## Display extended labs settings
251 251 labs_settings_active = true
252 252
253 253 ####################################
254 254 ### CELERY CONFIG ####
255 255 ####################################
256 256 use_celery = false
257 257 broker.host = localhost
258 258 broker.vhost = rabbitmqhost
259 259 broker.port = 5672
260 260 broker.user = rabbitmq
261 261 broker.password = qweqwe
262 262
263 263 celery.imports = rhodecode.lib.celerylib.tasks
264 264
265 265 celery.result.backend = amqp
266 266 celery.result.dburi = amqp://
267 267 celery.result.serialier = json
268 268
269 269 #celery.send.task.error.emails = true
270 270 #celery.amqp.task.result.expires = 18000
271 271
272 272 celeryd.concurrency = 2
273 273 #celeryd.log.file = celeryd.log
274 274 celeryd.log.level = debug
275 275 celeryd.max.tasks.per.child = 1
276 276
277 277 ## tasks will never be sent to the queue, but executed locally instead.
278 278 celery.always.eager = false
279 279
280 280 ####################################
281 281 ### BEAKER CACHE ####
282 282 ####################################
283 283 # default cache dir for templates. Putting this into a ramdisk
284 284 ## can boost performance, eg. %(here)s/data_ramdisk
285 285 cache_dir = %(here)s/data
286 286
287 287 ## locking and default file storage for Beaker. Putting this into a ramdisk
288 288 ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data
289 289 beaker.cache.data_dir = %(here)s/data/cache/beaker_data
290 290 beaker.cache.lock_dir = %(here)s/data/cache/beaker_lock
291 291
292 292 beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long
293 293
294 294 beaker.cache.super_short_term.type = memory
295 295 beaker.cache.super_short_term.expire = 10
296 296 beaker.cache.super_short_term.key_length = 256
297 297
298 298 beaker.cache.short_term.type = memory
299 299 beaker.cache.short_term.expire = 60
300 300 beaker.cache.short_term.key_length = 256
301 301
302 302 beaker.cache.long_term.type = memory
303 303 beaker.cache.long_term.expire = 36000
304 304 beaker.cache.long_term.key_length = 256
305 305
306 306 beaker.cache.sql_cache_short.type = memory
307 307 beaker.cache.sql_cache_short.expire = 10
308 308 beaker.cache.sql_cache_short.key_length = 256
309 309
310 310 ## default is memory cache, configure only if required
311 311 ## using multi-node or multi-worker setup
312 312 #beaker.cache.auth_plugins.type = ext:database
313 313 #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock
314 314 #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode
315 315 #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode
316 316 #beaker.cache.auth_plugins.sa.pool_recycle = 3600
317 317 #beaker.cache.auth_plugins.sa.pool_size = 10
318 318 #beaker.cache.auth_plugins.sa.max_overflow = 0
319 319
320 320 beaker.cache.repo_cache_long.type = memorylru_base
321 321 beaker.cache.repo_cache_long.max_items = 4096
322 322 beaker.cache.repo_cache_long.expire = 2592000
323 323
324 324 ## default is memorylru_base cache, configure only if required
325 325 ## using multi-node or multi-worker setup
326 326 #beaker.cache.repo_cache_long.type = ext:memcached
327 327 #beaker.cache.repo_cache_long.url = localhost:11211
328 328 #beaker.cache.repo_cache_long.expire = 1209600
329 329 #beaker.cache.repo_cache_long.key_length = 256
330 330
331 331 ####################################
332 332 ### BEAKER SESSION ####
333 333 ####################################
334 334
335 335 ## .session.type is the type of storage used for the session; currently allowed
336 336 ## types are file, ext:memcached, ext:database, and memory (default).
337 337 beaker.session.type = file
338 338 beaker.session.data_dir = %(here)s/data/sessions/data
339 339
340 340 ## db based session, fast, and allows easy management over logged in users
341 341 #beaker.session.type = ext:database
342 342 #beaker.session.table_name = db_session
343 343 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
344 344 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
345 345 #beaker.session.sa.pool_recycle = 3600
346 346 #beaker.session.sa.echo = false
347 347
348 348 beaker.session.key = rhodecode
349 349 beaker.session.secret = production-rc-uytcxaz
350 350 beaker.session.lock_dir = %(here)s/data/sessions/lock
351 351
352 352 ## Secure encrypted cookie. Requires AES and AES python libraries
353 353 ## you must disable beaker.session.secret to use this
354 354 #beaker.session.encrypt_key = <key_for_encryption>
355 355 #beaker.session.validate_key = <validation_key>
356 356
357 357 ## sets the session as invalid (also logging out the user) if it has not been
358 358 ## accessed for the given amount of time in seconds
359 359 beaker.session.timeout = 2592000
360 360 beaker.session.httponly = true
361 361 ## Path to use for the cookie.
362 362 #beaker.session.cookie_path = /<your-prefix>
363 363
364 364 ## uncomment for https secure cookie
365 365 beaker.session.secure = false
366 366
367 367 ## auto save the session so that calling .save() is not required
368 368 beaker.session.auto = false
369 369
370 370 ## default cookie expiration time in seconds, set to `true` to set expire
371 371 ## at browser close
372 372 #beaker.session.cookie_expires = 3600
373 373
374 374 ###################################
375 375 ## SEARCH INDEXING CONFIGURATION ##
376 376 ###################################
377 377 ## Full text search indexer is available in rhodecode-tools under
378 378 ## `rhodecode-tools index` command
379 379
380 380 # WHOOSH Backend, doesn't require additional services to run
381 381 # it works well with a few dozen repos
382 382 search.module = rhodecode.lib.index.whoosh
383 383 search.location = %(here)s/data/index
384 384
385 385 ########################################
386 386 ### CHANNELSTREAM CONFIG ####
387 387 ########################################
388 388 ## channelstream enables persistent connections and live notification
389 389 ## in the system. It's also used by the chat system
390 390
391 channelstream.enabled = true
391 channelstream.enabled = false
392 392 ## location of channelstream server on the backend
393 393 channelstream.server = 127.0.0.1:9800
394 394 ## location of the channelstream server from outside world
395 395 ## most likely this would be a special HTTP server backend URL that handles
396 396 ## websocket connections; see the nginx example config
397 397 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
398 398 channelstream.secret = secret
399 399 channelstream.history.location = %(here)s/channelstream_history
400 400
401 401
402 402 ###################################
403 403 ## APPENLIGHT CONFIG ##
404 404 ###################################
405 405
406 406 ## Appenlight is tailored to work with RhodeCode, see
407 407 ## http://appenlight.com for details how to obtain an account
408 408
409 409 ## appenlight integration enabled
410 410 appenlight = false
411 411
412 412 appenlight.server_url = https://api.appenlight.com
413 413 appenlight.api_key = YOUR_API_KEY
414 414 #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5
415 415
416 416 # used for JS client
417 417 appenlight.api_public_key = YOUR_API_PUBLIC_KEY
418 418
419 419 ## TWEAK AMOUNT OF INFO SENT HERE
420 420
421 421 ## enables 404 error logging (default False)
422 422 appenlight.report_404 = false
423 423
424 424 ## time in seconds after request is considered being slow (default 1)
425 425 appenlight.slow_request_time = 1
426 426
427 427 ## record slow requests in application
428 428 ## (needs to be enabled for slow datastore recording and time tracking)
429 429 appenlight.slow_requests = true
430 430
431 431 ## enable hooking to application loggers
432 432 appenlight.logging = true
433 433
434 434 ## minimum log level for log capture
435 435 appenlight.logging.level = WARNING
436 436
437 437 ## send logs only from erroneous/slow requests
438 438 ## (saves API quota for intensive logging)
439 439 appenlight.logging_on_error = false
440 440
441 441 ## list of additional keywords that should be grabbed from the environ object
442 442 ## can be a string with a comma separated list of words in lowercase
443 443 ## (by default the client will always send the following info:
444 444 ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
445 445 ## start with HTTP*); this list can be extended with additional keywords here
446 446 appenlight.environ_keys_whitelist =
447 447
448 448 ## list of keywords that should be blanked from the request object
449 449 ## can be a string with a comma separated list of words in lowercase
450 450 ## (by default the client will always blank keys that contain the following words:
451 451 ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf');
452 452 ## this list can be extended with additional keywords set here
453 453 appenlight.request_keys_blacklist =
454 454
455 455 ## list of namespaces that should be ignored when gathering log entries
456 456 ## can be a string with a comma separated list of namespaces
457 457 ## (by default the client ignores its own entries: appenlight_client.client)
458 458 appenlight.log_namespace_blacklist =
459 459
460 460
461 461 ################################################################################
462 462 ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ##
463 463 ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ##
464 464 ## execute malicious code after an exception is raised. ##
465 465 ################################################################################
466 466 set debug = false
467 467
468 468
469 469 #########################################################
470 470 ### DB CONFIGS - EACH DB WILL HAVE IT'S OWN CONFIG ###
471 471 #########################################################
472 472 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
473 473 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
474 474 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode
475 475 sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
476 476
477 477 # see sqlalchemy docs for other advanced settings
478 478
479 479 ## print the sql statements to output
480 480 sqlalchemy.db1.echo = false
482 482 ## recycle the connections after this amount of seconds
482 482 sqlalchemy.db1.pool_recycle = 3600
483 483 sqlalchemy.db1.convert_unicode = true
484 484
485 485 ## the number of connections to keep open inside the connection pool.
486 486 ## 0 indicates no limit
487 487 #sqlalchemy.db1.pool_size = 5
488 488
489 489 ## the number of connections to allow in connection pool "overflow", that is
490 490 ## connections that can be opened above and beyond the pool_size setting,
491 491 ## which defaults to five.
492 492 #sqlalchemy.db1.max_overflow = 10
493 493
494 494
495 495 ##################
496 496 ### VCS CONFIG ###
497 497 ##################
498 498 vcs.server.enable = true
499 499 vcs.server = localhost:9900
500 500
501 501 ## Web server connectivity protocol, responsible for web based VCS operations
502 502 ## Available protocols are:
503 503 ## `pyro4` - using pyro4 server
504 504 ## `http` - using http-rpc backend
505 505 #vcs.server.protocol = http
506 506
507 507 ## Push/Pull operations protocol, available options are:
508 508 ## `pyro4` - using pyro4 server
509 509 ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
510 510 ## `vcsserver.scm_app` - internal app (EE only)
511 511 #vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http
512 512
513 513 ## Push/Pull operations hooks protocol, available options are:
514 514 ## `pyro4` - using pyro4 server
515 515 ## `http` - using http-rpc backend
516 516 #vcs.hooks.protocol = http
517 517
518 518 vcs.server.log_level = info
519 519 ## Start VCSServer with this instance as a subprocess, useful for development
520 520 vcs.start_server = false
521 521
522 522 ## List of enabled VCS backends, available options are:
523 523 ## `hg` - mercurial
524 524 ## `git` - git
525 525 ## `svn` - subversion
526 526 vcs.backends = hg, git, svn
527 527
528 528 vcs.connection_timeout = 3600
529 529 ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
530 530 ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible
531 531 #vcs.svn.compatible_version = pre-1.8-compatible
532 532
533 533
534 534 ############################################################
535 535 ### Subversion proxy support (mod_dav_svn) ###
536 536 ### Maps RhodeCode repo groups into SVN paths for Apache ###
537 537 ############################################################
538 538 ## Enable or disable the config file generation.
539 539 svn.proxy.generate_config = false
540 540 ## Generate config file with `SVNListParentPath` set to `On`.
541 541 svn.proxy.list_parent_path = true
542 542 ## Set location and file name of generated config file.
543 543 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
544 544 ## File system path to the directory containing the repositories served by
545 545 ## RhodeCode.
546 546 svn.proxy.parent_path_root = /path/to/repo_store
547 547 ## Used as a prefix to the <Location> block in the generated config file. In
548 548 ## most cases it should be set to `/`.
549 549 svn.proxy.location_root = /
550 550
551 551
552 552 ################################
553 553 ### LOGGING CONFIGURATION ####
554 554 ################################
555 555 [loggers]
556 556 keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates
557 557
558 558 [handlers]
559 559 keys = console, console_sql
560 560
561 561 [formatters]
562 562 keys = generic, color_formatter, color_formatter_sql
563 563
564 564 #############
565 565 ## LOGGERS ##
566 566 #############
567 567 [logger_root]
568 568 level = NOTSET
569 569 handlers = console
570 570
571 571 [logger_routes]
572 572 level = DEBUG
573 573 handlers =
574 574 qualname = routes.middleware
575 575 ## "level = DEBUG" logs the route matched and routing variables.
576 576 propagate = 1
577 577
578 578 [logger_beaker]
579 579 level = DEBUG
580 580 handlers =
581 581 qualname = beaker.container
582 582 propagate = 1
583 583
584 584 [logger_pyro4]
585 585 level = DEBUG
586 586 handlers =
587 587 qualname = Pyro4
588 588 propagate = 1
589 589
590 590 [logger_templates]
591 591 level = INFO
592 592 handlers =
593 593 qualname = pylons.templating
594 594 propagate = 1
595 595
596 596 [logger_rhodecode]
597 597 level = DEBUG
598 598 handlers =
599 599 qualname = rhodecode
600 600 propagate = 1
601 601
602 602 [logger_sqlalchemy]
603 603 level = INFO
604 604 handlers = console_sql
605 605 qualname = sqlalchemy.engine
606 606 propagate = 0
607 607
608 608 ##############
609 609 ## HANDLERS ##
610 610 ##############
611 611
612 612 [handler_console]
613 613 class = StreamHandler
614 614 args = (sys.stderr,)
615 615 level = INFO
616 616 formatter = generic
617 617
618 618 [handler_console_sql]
619 619 class = StreamHandler
620 620 args = (sys.stderr,)
621 621 level = WARN
622 622 formatter = generic
623 623
624 624 ################
625 625 ## FORMATTERS ##
626 626 ################
627 627
628 628 [formatter_generic]
629 629 class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter
630 630 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
631 631 datefmt = %Y-%m-%d %H:%M:%S
632 632
633 633 [formatter_color_formatter]
634 634 class = rhodecode.lib.logging_formatter.ColorFormatter
635 635 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
636 636 datefmt = %Y-%m-%d %H:%M:%S
637 637
638 638 [formatter_color_formatter_sql]
639 639 class = rhodecode.lib.logging_formatter.ColorFormatterSql
640 640 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
641 641 datefmt = %Y-%m-%d %H:%M:%S
@@ -1,229 +1,241 b''
1 1 # Nix environment for the community edition
2 2 #
3 3 # This shall be as lean as possible, just producing the Enterprise
4 4 # derivation. For advanced tweaks to pimp up the development environment we use
5 5 # "shell.nix" so that it does not have to clutter this file.
6 6
7 7 { pkgs ? (import <nixpkgs> {})
8 8 , pythonPackages ? "python27Packages"
9 9 , pythonExternalOverrides ? self: super: {}
10 10 , doCheck ? true
11 11 }:
12 12
13 13 let pkgs_ = pkgs; in
14 14
15 15 let
16 16 pkgs = pkgs_.overridePackages (self: super: {
17 17 # Override subversion derivation to
18 18 # - activate python bindings
19 19 # - set version to 1.8
20 20 subversion = super.subversion18.override {
21 21 httpSupport = true;
22 22 pythonBindings = true;
23 23 python = self.python27Packages.python;
24 24 };
25 25 });
26 26
27 27 inherit (pkgs.lib) fix extends;
28 28
29 29 basePythonPackages = with builtins; if isAttrs pythonPackages
30 30 then pythonPackages
31 31 else getAttr pythonPackages pkgs;
32 32
33 buildBowerComponents =
34 pkgs.buildBowerComponents or
35 (import ./pkgs/backport-16.03-build-bower-components.nix { inherit pkgs; });
36
33 37 elem = builtins.elem;
34 38 basename = path: with pkgs.lib; last (splitString "/" path);
35 39 startsWith = prefix: full: let
36 40 actualPrefix = builtins.substring 0 (builtins.stringLength prefix) full;
37 41 in actualPrefix == prefix;
38 42
39 43 src-filter = path: type: with pkgs.lib;
40 44 let
41 45 ext = last (splitString "." path);
42 46 in
43 47 !elem (basename path) [
44 ".git" ".hg" "__pycache__" ".eggs" "node_modules"
45 "build" "data" "tmp"] &&
48 ".git" ".hg" "__pycache__" ".eggs"
49 "bower_components" "node_modules"
50 "build" "data" "result" "tmp"] &&
46 51 !elem ext ["egg-info" "pyc"] &&
52 # TODO: johbo: This check is wrong, since "path" contains an absolute path,
53 # it would still be good to restore it since we want to ignore "result-*".
47 54 !startsWith "result" path;
48 55
49 56 sources = pkgs.config.rc.sources or {};
57 version = builtins.readFile ./rhodecode/VERSION;
50 58 rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.;
51 59
52 # Load the generated node packages
53 nodePackages = pkgs.callPackage "${pkgs.path}/pkgs/top-level/node-packages.nix" rec {
54 self = nodePackages;
55 generated = pkgs.callPackage ./pkgs/node-packages.nix { inherit self; };
60 nodeEnv = import ./pkgs/node-default.nix {
61 inherit pkgs;
56 62 };
63 nodeDependencies = nodeEnv.shell.nodeDependencies;
57 64
58 # TODO: Should be taken automatically out of the generates packages.
59 # apps.nix has one solution for this, although I'd prefer to have the deps
60 # from package.json mapped in here.
61 nodeDependencies = with nodePackages; [
62 grunt
63 grunt-contrib-concat
64 grunt-contrib-jshint
65 grunt-contrib-less
66 grunt-contrib-watch
67 jshint
68 ];
65 bowerComponents = buildBowerComponents {
66 name = "enterprise-ce-${version}";
67 generated = ./pkgs/bower-packages.nix;
68 src = rhodecode-enterprise-ce-src;
69 };
69 70
70 71 pythonGeneratedPackages = self: basePythonPackages.override (a: {
71 72 inherit self;
72 73 })
73 74 // (scopedImport {
74 75 self = self;
75 76 super = basePythonPackages;
76 77 inherit pkgs;
77 78 inherit (pkgs) fetchurl fetchgit;
78 79 } ./pkgs/python-packages.nix);
79 80
80 81 pythonOverrides = import ./pkgs/python-packages-overrides.nix {
81 82 inherit
82 83 basePythonPackages
83 84 pkgs;
84 85 };
85 86
86 87 pythonLocalOverrides = self: super: {
87 88 rhodecode-enterprise-ce =
88 89 let
89 version = builtins.readFile ./rhodecode/VERSION;
90 linkNodeModules = ''
90 linkNodeAndBowerPackages = ''
91 echo "Export RhodeCode CE path"
92 export RHODECODE_CE_PATH=${rhodecode-enterprise-ce-src}
91 93 echo "Link node packages"
92 # TODO: check if this adds stuff as a dependency, closure size
93 94 rm -fr node_modules
94 mkdir -p node_modules
95 ${pkgs.lib.concatMapStrings (dep: ''
96 ln -sfv ${dep}/lib/node_modules/${dep.pkgName} node_modules/
97 '') nodeDependencies}
95 mkdir node_modules
96 # johbo: Linking individual packages allows us to run "npm install"
97 # inside of a shell to try things out. Re-entering the shell will
98 # restore a clean environment.
99 ln -s ${nodeDependencies}/lib/node_modules/* node_modules/
100
98 101 echo "DONE: Link node packages"
102
103 echo "Link bower packages"
104 rm -fr bower_components
105 mkdir bower_components
106
107 ln -s ${bowerComponents}/bower_components/* bower_components/
108 echo "DONE: Link bower packages"
99 109 '';
100 110 in super.rhodecode-enterprise-ce.override (attrs: {
101 111
102 112 inherit
103 113 doCheck
104 114 version;
105 115 name = "rhodecode-enterprise-ce-${version}";
106 116 releaseName = "RhodeCodeEnterpriseCE-${version}";
107 117 src = rhodecode-enterprise-ce-src;
108 118
109 119 buildInputs =
110 120 attrs.buildInputs ++
111 121 (with self; [
122 pkgs.nodePackages.bower
112 123 pkgs.nodePackages.grunt-cli
113 124 pkgs.subversion
114 125 pytest-catchlog
115 126 rhodecode-testdata
116 127 ]);
117 128
118 129 propagatedBuildInputs = attrs.propagatedBuildInputs ++ (with self; [
119 130 rhodecode-tools
120 131 ]);
121 132
122 133 # TODO: johbo: Make a nicer way to expose the parts. Maybe
123 134 # pkgs/default.nix?
124 135 passthru = {
125 136 inherit
126 linkNodeModules
137 bowerComponents
138 linkNodeAndBowerPackages
127 139 myPythonPackagesUnfix
128 140 pythonLocalOverrides;
129 141 pythonPackages = self;
130 142 };
131 143
132 144 LC_ALL = "en_US.UTF-8";
133 145 LOCALE_ARCHIVE =
134 146 if pkgs.stdenv ? glibc
135 147 then "${pkgs.glibcLocales}/lib/locale/locale-archive"
136 148 else "";
137 149
138 150 # Somewhat snappier setup of the development environment
139 151 # TODO: move into shell.nix
140 152 # TODO: think of supporting a stable path again, so that multiple shells
141 153 # can share it.
142 154 shellHook = ''
143 155 tmp_path=$(mktemp -d)
144 156 export PATH="$tmp_path/bin:$PATH"
145 157 export PYTHONPATH="$tmp_path/${self.python.sitePackages}:$PYTHONPATH"
146 158 mkdir -p $tmp_path/${self.python.sitePackages}
147 159 python setup.py develop --prefix $tmp_path --allow-hosts ""
148 '' + linkNodeModules;
160 '' + linkNodeAndBowerPackages;
149 161
150 162 preCheck = ''
151 163 export PATH="$out/bin:$PATH"
152 164 '';
153 165
154 166 postCheck = ''
155 167 rm -rf $out/lib/${self.python.libPrefix}/site-packages/pytest_pylons
156 168 rm -rf $out/lib/${self.python.libPrefix}/site-packages/rhodecode/tests
157 169 '';
158 170
159 preBuild = linkNodeModules + ''
171 preBuild = linkNodeAndBowerPackages + ''
160 172 grunt
161 173 rm -fr node_modules
162 174 '';
163 175
164 176 postInstall = ''
165 177 # python based programs need to be wrapped
166 178 ln -s ${self.supervisor}/bin/supervisor* $out/bin/
167 179 ln -s ${self.gunicorn}/bin/gunicorn $out/bin/
168 180 ln -s ${self.PasteScript}/bin/paster $out/bin/
169 181 ln -s ${self.channelstream}/bin/channelstream $out/bin/
170 182 ln -s ${self.pyramid}/bin/* $out/bin/ #*/
171 183
172 184 # rhodecode-tools
173 185 # TODO: johbo: re-think this. Do the tools import anything from enterprise?
174 186 ln -s ${self.rhodecode-tools}/bin/rhodecode-* $out/bin/
175 187
176 188 # note that condition should be restricted when adding further tools
177 189 for file in $out/bin/*; do #*/
178 190 wrapProgram $file \
179 191 --prefix PYTHONPATH : $PYTHONPATH \
180 192 --prefix PATH : $PATH \
181 193 --set PYTHONHASHSEED random
182 194 done
183 195
184 196 mkdir $out/etc
185 197 cp configs/production.ini $out/etc
186 198
187 199 echo "Writing meta information for rccontrol to nix-support/rccontrol"
188 200 mkdir -p $out/nix-support/rccontrol
189 201 cp -v rhodecode/VERSION $out/nix-support/rccontrol/version
190 202 echo "DONE: Meta information for rccontrol written"
191 203
192 204 # TODO: johbo: Make part of ac-tests
193 205 if [ ! -f rhodecode/public/js/scripts.js ]; then
194 206 echo "Missing scripts.js"
195 207 exit 1
196 208 fi
197 209 if [ ! -f rhodecode/public/css/style.css ]; then
198 210 echo "Missing style.css"
199 211 exit 1
200 212 fi
201 213 '';
202 214
203 215 });
204 216
205 217 rhodecode-testdata = import "${rhodecode-testdata-src}/default.nix" {
206 218 inherit
207 219 doCheck
208 220 pkgs
209 221 pythonPackages;
210 222 };
211 223
212 224 };
213 225
214 226 rhodecode-testdata-src = sources.rhodecode-testdata or (
215 227 pkgs.fetchhg {
216 228 url = "https://code.rhodecode.com/upstream/rc_testdata";
217 229 rev = "v0.8.0";
218 230 sha256 = "0hy1ba134rq2f9si85yx7j4qhc9ky0hjzdk553s3q026i7km809m";
219 231 });
220 232
221 233 # Apply all overrides and fix the final package set
222 234 myPythonPackagesUnfix =
223 235 (extends pythonExternalOverrides
224 236 (extends pythonLocalOverrides
225 237 (extends pythonOverrides
226 238 pythonGeneratedPackages)));
227 239 myPythonPackages = (fix myPythonPackagesUnfix);
228 240
229 241 in myPythonPackages.rhodecode-enterprise-ce
@@ -1,41 +1,31 b''
1 1 .. _lab-settings:
2 2
3 3 Lab Settings
4 4 ============
5 5
6 6 |RCE| Lab Settings is for delivering features which may require an additional
7 7 level of support to optimize for production scenarios. To enable lab settings,
8 8 use the following instructions:
9 9
10 10 1. Open the |RCE| configuration file,
11 11 :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`
12 12
13 13 2. Add the following configuration option in the ``[app:main]`` section.
14 14
15 15 .. code-block:: bash
16 16
17 17 [app:main]
18 18
19 19 ## Display extended labs settings
20 20 labs_settings_active = true
21 21
22 22 3. Restart your |RCE| instance
23 23
24 24 .. code-block:: bash
25 25
26 26 $ rccontrol restart enterprise-1
27 27
28 28 4. You will see the labs setting on the
29 29 :menuselection:`Admin --> Settings --> labs` page.
30 30
31 31 .. image:: ../images/lab-setting.png
32
33 Available Lab Extras
34 --------------------
35
36 Once lab settings are enabled, the following features are available.
37
38 .. toctree::
39 :maxdepth: 1
40
41 svn-http
@@ -1,301 +1,430 b''
1 1 .. _vcs-server:
2 2
3 3 VCS Server Management
4 4 ---------------------
5 5
6 6 The VCS Server handles |RCM| backend functionality. You need to configure
7 7 a VCS Server to run with a |RCM| instance. If you do not, you will be missing
8 8 the connection between |RCM| and its |repos|. This will cause error messages
9 9 on the web interface. You can run your setup in the following configurations,
10 10 currently the best performance is one VCS Server per |RCM| instance:
11 11
12 12 * One VCS Server per |RCM| instance.
13 13 * One VCS Server handling multiple instances.
14 14
15 15 .. important::
16 16
17 17 If your server locale settings are not correctly configured,
18 18 |RCE| and the VCS Server can run into issues. See this `Ask Ubuntu`_ post
19 19 which explains the problem and gives a solution.
20 20
21 21 For more information, see the following sections:
22 22
23 23 * :ref:`install-vcs`
24 24 * :ref:`config-vcs`
25 25 * :ref:`vcs-server-options`
26 26 * :ref:`vcs-server-versions`
27 27 * :ref:`vcs-server-maintain`
28 28 * :ref:`vcs-server-config-file`
29 * :ref:`svn-http`
29 30
30 31 .. _install-vcs:
31 32
32 33 VCS Server Installation
33 34 ^^^^^^^^^^^^^^^^^^^^^^^
34 35
35 36 To install a VCS Server, see
36 37 :ref:`Installing a VCS server <control:install-vcsserver>`.
37 38
38 39 .. _config-vcs:
39 40
40 41 Hooking |RCE| to its VCS Server
41 42 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
42 43
43 44 To configure a |RCE| instance to use a VCS server, see
44 45 :ref:`Configuring the VCS Server connection <control:manually-vcsserver-ini>`.
45 46
46 47 .. _vcs-server-options:
47 48
48 49 |RCE| VCS Server Options
49 50 ^^^^^^^^^^^^^^^^^^^^^^^^
50 51
51 52 The following list shows the available options on the |RCM| side of the
52 53 connection to the VCS Server. The settings are configured per
53 54 instance in the
54 55 :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file.
55 56
56 57 .. rst-class:: dl-horizontal
57 58
58 59 \vcs.backends <available-vcs-systems>
59 60 Set a comma-separated list of the |repo| options available from the
60 61 web interface. The default is ``hg, git, svn``,
61 62 which is all |repo| types available.
62 63
63 64 \vcs.connection_timeout <seconds>
64 65 Set the length of time in seconds that the VCS Server waits for
65 66 requests to process. After the timeout expires,
66 67 the request is closed. The default is ``3600``. Set to a higher
67 68 number if you experience network latency, or timeout issues with very
68 69 large push/pull requests.
69 70
70 71 \vcs.server.enable <boolean>
71 72 Enable or disable the VCS Server. The available options are ``true`` or
72 73 ``false``. The default is ``true``.
73 74
74 75 \vcs.server <host:port>
75 76 Set the host, either hostname or IP Address, and port of the VCS server
76 77 you wish to run with your |RCM| instance.
77 78
78 79 .. code-block:: ini
79 80
80 81 ##################
81 82 ### VCS CONFIG ###
82 83 ##################
83 84 # set this line to match your VCS Server
84 85 vcs.server = 127.0.0.1:10004
85 86 # Set to False to disable the VCS Server
86 87 vcs.server.enable = True
87 88 vcs.backends = hg, git, svn
88 89 vcs.connection_timeout = 3600
89 90
90 91
91 92 .. _vcs-server-versions:
92 93
93 94 VCS Server Versions
94 95 ^^^^^^^^^^^^^^^^^^^
95 96
96 97 An updated version of the VCS Server is released with each |RCE| version. Use
97 98 the VCS Server number that matches with the |RCE| version to pair the
98 99 appropriate ones together. For |RCE| versions pre 3.3.0,
99 100 VCS Server 1.X.Y works with |RCE| 3.X.Y, for example:
100 101
101 102 * VCS Server 1.0.0 works with |RCE| 3.0.0
102 103 * VCS Server 1.2.2 works with |RCE| 3.2.2
103 104
104 105 For |RCE| versions post 3.3.0, the VCS Server and |RCE| version numbers
105 106 match, for example:
106 107
107 108 * VCS Server |release| works with |RCE| |release|
108 109
109 110 .. _vcs-server-maintain:
110 111
111 112 VCS Server Memory Optimization
112 113 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
113 114
114 115 To configure the VCS server to manage the cache efficiently, you need to
115 116 configure the following options in the
116 117 :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini` file. Once
117 118 configured, restart the VCS Server.
118 119
119 120 .. rst-class:: dl-horizontal
120 121
121 122 \beaker.cache.repo_object.type = memorylru
122 123 Configures the cache to discard the least recently used items.
123 124 This setting takes the following valid options:
124 125
125 126 * ``memorylru``: The default setting, which removes the least recently
126 127 used items from the cache.
127 128 * ``memory``: Runs the VCS Server without clearing the cache.
128 129 * ``nocache``: Runs the VCS Server without a cache. This will
129 130 dramatically reduce the VCS Server performance.
130 131
131 132 \beaker.cache.repo_object.max_items = 100
132 133 Sets the maximum number of items stored in the cache, before the cache
133 134 starts to be cleared.
134 135
135 136 As a general rule of thumb, running this value at 120 resulted in a
136 137 5GB cache. Running it at 240 resulted in a 9GB cache. Your results
137 138 will differ based on usage patterns and |repo| sizes.
138 139
139 140 Tweaking this value to run at a fairly constant memory load on your
140 141 server will help performance.
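
A minimal, illustrative excerpt of the cache section in
:file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini` could look like the
following sketch; the values shown are examples only and should be tuned to
your own memory budget:

.. code-block:: ini

    # cache region used for repository objects, please don't change the name
    beaker.cache.regions = repo_object
    # evict the least recently used items first
    beaker.cache.repo_object.type = memorylru
    # start clearing the cache once this many items are stored
    beaker.cache.repo_object.max_items = 100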
141 142
142 143 To clear the cache completely, you can restart the VCS Server.
143 144
144 145 .. important::
145 146
146 147 While the VCS Server handles a restart gracefully on the web interface,
147 148 it will drop connections during push/pull requests. So it is recommended
148 149 you only perform this when there is very little traffic on the instance.
149 150
150 151 Use the following example to restart your VCS Server,
151 152 for full details see the :ref:`RhodeCode Control CLI <control:rcc-cli>`.
152 153
153 154 .. code-block:: bash
154 155
155 156 $ rccontrol status
156 157
157 158 .. code-block:: vim
158 159
159 160 - NAME: vcsserver-1
160 161 - STATUS: RUNNING
161 162 - TYPE: VCSServer
162 163 - VERSION: 1.0.0
163 164 - URL: http://127.0.0.1:10001
164 165
165 166 $ rccontrol restart vcsserver-1
166 167 Instance "vcsserver-1" successfully stopped.
167 168 Instance "vcsserver-1" successfully started.
168 169
169 170 .. _vcs-server-config-file:
170 171
171 172 VCS Server Configuration
172 173 ^^^^^^^^^^^^^^^^^^^^^^^^
173 174
174 175 You can configure settings for multiple VCS Servers on your
175 176 system using their individual configuration files. Use the following
176 177 properties inside the configuration file to set up your system. The default
177 178 location is :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini`.
178 179 For a more detailed explanation of the logger levels, see :ref:`debug-mode`.
179 180
180 181 .. rst-class:: dl-horizontal
181 182
182 183 \host <ip-address>
183 184 Set the host on which the VCS Server will run.
184 185
185 186 \port <int>
186 187 Set the port number on which the VCS Server will be available.
187 188
188 189 \locale <locale_utf>
189 190 Set the locale the VCS Server expects.
190 191
191 192 \threadpool_size <int>
192 193 Set the size of the threadpool used to communicate
193 194 with the WSGI workers. This should be at least 6 times the number of
194 195 WSGI worker processes.
195 196
196 197 \timeout <seconds>
197 198 Set the timeout for RPC communication in seconds.
198 199
199 200 .. note::
200 201
201 202 After making changes, you need to restart your VCS Server to pick them up.
202 203
203 204 .. code-block:: ini
204 205
205 206 ################################################################################
206 207 # RhodeCode VCSServer - configuration #
207 208 # #
208 209 ################################################################################
209 210
210 211 [DEFAULT]
211 212 host = 127.0.0.1
212 213 port = 9900
213 214 locale = en_US.UTF-8
214 215 # number of worker threads, this should be set based on a formula threadpool=N*6
215 216 # where N is number of RhodeCode Enterprise workers, eg. running 2 instances
216 217 # 8 gunicorn workers each would be 2 * 8 * 6 = 96, threadpool_size = 96
217 218 threadpool_size = 16
218 219 timeout = 0
219 220
220 221 # cache regions, please don't change
221 222 beaker.cache.regions = repo_object
222 223 beaker.cache.repo_object.type = memorylru
223 224 beaker.cache.repo_object.max_items = 1000
224 225
225 226 # cache auto-expires after N seconds
226 227 beaker.cache.repo_object.expire = 10
227 228 beaker.cache.repo_object.enabled = true
228 229
229 230
230 231 ################################
231 232 ### LOGGING CONFIGURATION ####
232 233 ################################
233 234 [loggers]
234 235 keys = root, vcsserver, pyro4, beaker
235 236
236 237 [handlers]
237 238 keys = console
238 239
239 240 [formatters]
240 241 keys = generic
241 242
242 243 #############
243 244 ## LOGGERS ##
244 245 #############
245 246 [logger_root]
246 247 level = NOTSET
247 248 handlers = console
248 249
249 250 [logger_vcsserver]
250 251 level = DEBUG
251 252 handlers =
252 253 qualname = vcsserver
253 254 propagate = 1
254 255
255 256 [logger_beaker]
256 257 level = DEBUG
257 258 handlers =
258 259 qualname = beaker
259 260 propagate = 1
260 261
261 262 [logger_pyro4]
262 263 level = DEBUG
263 264 handlers =
264 265 qualname = Pyro4
265 266 propagate = 1
266 267
267 268
268 269 ##############
269 270 ## HANDLERS ##
270 271 ##############
271 272
272 273 [handler_console]
273 274 class = StreamHandler
274 275 args = (sys.stderr,)
275 276 level = DEBUG
276 277 formatter = generic
277 278
278 279 [handler_file]
279 280 class = FileHandler
280 281 args = ('vcsserver.log', 'a',)
281 282 level = DEBUG
282 283 formatter = generic
283 284
284 285 [handler_file_rotating]
285 286 class = logging.handlers.TimedRotatingFileHandler
286 287     # 'D', 5 - rotate every 5 days
287 288     # you can also set 'h' or 'midnight'
288 289 args = ('vcsserver.log', 'D', 5, 10,)
289 290 level = DEBUG
290 291 formatter = generic
291 292
292 293 ################
293 294 ## FORMATTERS ##
294 295 ################
295 296
296 297 [formatter_generic]
297 298 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
298 299 datefmt = %Y-%m-%d %H:%M:%S
299 300
301 .. _svn-http:
300 302
301 .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue
303 |svn| With Write Over HTTP
304 ^^^^^^^^^^^^^^^^^^^^^^^^^^
305
306 To use |svn| with read/write support over the |svn| HTTP protocol, you have to
307 configure the HTTP |svn| backend.
308
309 Prerequisites
310 =============
311
312 - Enable HTTP support inside the admin VCS settings on your |RCE| instance
313 - You need to install the following tools on the machine that is running an
314 instance of |RCE|:
315 ``Apache HTTP Server`` and
316 ``mod_dav_svn``.
317
318
319 Using the Ubuntu distribution as an example, you can run:
320
321 .. code-block:: bash
322
323 $ sudo apt-get install apache2 libapache2-mod-svn
324
325 Once installed, you need to enable ``dav_svn``:
326
327 .. code-block:: bash
328
329 $ sudo a2enmod dav_svn
330
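Depending on your setup, you may also need to restart Apache afterwards so that
the module is actually loaded, for example (shown for an Ubuntu system; adjust
the command to your init system):

.. code-block:: bash

    $ sudo service apache2 restart
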
331 Configuring Apache Setup
332 ========================
333
334 .. tip::
335
336 It is recommended to run Apache on a port other than 80, due to possible
337 conflicts with other HTTP servers like nginx. To do this, set the
338 ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example
339 ``Listen 8090``.
340
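For illustration, the relevant line in the ``/etc/apache2/ports.conf`` file
would then look like this (the port number is only an example):

.. code-block:: apache

    Listen 8090
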
341
342 .. warning::
343
344     Make sure the Apache instance which runs the mod_dav_svn module is
345     accessible only by RhodeCode. Otherwise, anyone would be able to browse
346     the repositories or run Subversion operations (checkout/commit/etc.).
347
348     It is also recommended to run Apache as the same user as |RCE|; otherwise
349     permission issues could occur. To do this, edit the ``/etc/apache2/envvars`` file:
350
351 .. code-block:: apache
352
353 export APACHE_RUN_USER=rhodecode
354 export APACHE_RUN_GROUP=rhodecode
355
356 1. To configure Apache, create and edit a virtual hosts file, for example
357    :file:`/etc/apache2/sites-available/default.conf`. Below is an example
358    that includes the auto-generated ``mod_dav_svn.conf`` file from a
359    configured |RCE| instance.
360
361 .. code-block:: apache
362
363 <VirtualHost *:8080>
364 ServerAdmin rhodecode-admin@localhost
365 DocumentRoot /var/www/html
366             ErrorLog ${APACHE_LOG_DIR}/error.log
367             CustomLog ${APACHE_LOG_DIR}/access.log combined
368 Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf
369 </VirtualHost>
370
371
372 2. Go to the :menuselection:`Admin --> Settings --> VCS` page, and
373 enable :guilabel:`Proxy Subversion HTTP requests`, and specify the
374 :guilabel:`Subversion HTTP Server URL`.
375
376 3. Open the |RCE| configuration file,
377 :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`
378
379 4. Add the following configuration option in the ``[app:main]``
380 section if you don't have it yet.
381
382    This enables mapping of the created |RCE| repo groups onto special |svn| paths.
383    Each time a new repository group is created, the system updates the template
384    file and creates a new mapping. The Apache web server needs to be reloaded
385    to pick up the changes in this file. It is recommended to add the reload to
386    a crontab (see the example entry below) so the changes are picked up
387    automatically once someone creates a repository group inside RhodeCode.
388
389
390 .. code-block:: ini
391
392 ##############################################
393 ### Subversion proxy support (mod_dav_svn) ###
394 ##############################################
395 ## Enable or disable the config file generation.
396 svn.proxy.generate_config = true
397 ## Generate config file with `SVNListParentPath` set to `On`.
398 svn.proxy.list_parent_path = true
399 ## Set location and file name of generated config file.
400 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
401 ## File system path to the directory containing the repositories served by
402 ## RhodeCode.
403 svn.proxy.parent_path_root = /path/to/repo_store
404 ## Used as a prefix to the <Location> block in the generated config file. In
405 ## most cases it should be set to `/`.
406 svn.proxy.location_root = /
407
408
409 This creates a special template file called ``mod_dav_svn.conf``. That file
410 path is used in the Apache config above, inside the ``Include`` statement.
411
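A minimal sketch of the crontab entry mentioned in step 4 (the ``apache2ctl``
path and the five minute interval are assumptions; adjust both to your system):

.. code-block:: bash

    # reload Apache every 5 minutes to pick up new svn path mappings
    */5 * * * * /usr/sbin/apache2ctl graceful
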
412
413 Using |svn|
414 ===========
415
416 Once |svn| has been enabled on your instance, you can use it with the
417 following examples. For more |svn| information, see the `Subversion Red Book`_.
418
419 .. code-block:: bash
420
421     # To check out (clone) a repository
422     svn checkout http://my-svn-server.example.com/my-svn-repo
423
424     # To commit changes from within a working copy
425     svn commit
426
427 .. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn
428
429
430 .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue No newline at end of file
@@ -1,20 +1,21 b''
1 1 .. _contributing:
2 2
3 3 Contributing to RhodeCode
4 4 =========================
5 5
6 6
7 7
8 8 Welcome to the contribution guides and development docs of RhodeCode.
9 9
10 10
11 11
12 12 .. toctree::
13 13 :maxdepth: 1
14 14
15 15 overview
16 16 testing/index
17 17 dev-setup
18 18 db-schema
19 19 dev-settings
20 20 api
21 dependencies
@@ -1,157 +1,160 b''
1 1 .. _dev-setup:
2 2
3 3 ===================
4 4 Development setup
5 5 ===================
6 6
7 7
8 8 RhodeCode Enterprise runs inside a Nix managed environment. This ensures build
9 9 environment dependencies are correctly declared and installed during setup.
10 10 It also enables atomic upgrades, rollbacks, and multiple instances of RhodeCode
11 11 Enterprise running with isolation.
12 12
13 13 To set up RhodeCode Enterprise inside the Nix environment, use the following steps:
14 14
15 15
16 16
17 17 Setup Nix Package Manager
18 18 -------------------------
19 19
20 20 To install the Nix Package Manager, please run::
21 21
22 22 $ curl https://nixos.org/nix/install | sh
23 23
24 24 or go to https://nixos.org/nix/ and follow the installation instructions.
25 25 Once this is correctly set up on your system, you should be able to use the
26 26 following commands:
27 27
28 28 * `nix-env`
29 29
30 30 * `nix-shell`
31 31
32 32
33 33 .. tip::
34 34
35 35 Update your channels frequently by running ``nix-channel --upgrade``.
36 36
37 37
38 38 Switch nix to the latest STABLE channel
39 39 ---------------------------------------
40 40
41 41 run::
42 42
43 43 nix-channel --add https://nixos.org/channels/nixos-16.03 nixpkgs
44 44
45 45 Followed by::
46 46
47 47 nix-channel --update
48 48
49 49
50 50 Clone the required repositories
51 51 -------------------------------
52 52
53 53 After Nix is set up, clone the RhodeCode Enterprise Community Edition and
54 54 RhodeCode VCSServer repositories into the same directory.
55 55 To do this, use the following example::
56 56
57 57 mkdir rhodecode-develop && cd rhodecode-develop
58 58 hg clone https://code.rhodecode.com/rhodecode-enterprise-ce
59 59 hg clone https://code.rhodecode.com/rhodecode-vcsserver
60 60
61 61 .. note::
62 62
63 63 If you cannot clone the repository, please request read permissions
64 64 via support@rhodecode.com
65 65
66 66
67 67
68 68 Enter the Development Shell
69 69 ---------------------------
70 70
71 71 The final step is to start the development shell. To do this, run the
72 72 following command from inside the cloned repository::
73 73
74 74 cd ~/rhodecode-enterprise-ce
75 75 nix-shell
76 76
77 77 .. note::
78 78
79 79     On the first run, this will take a while to download and optionally compile
80 80     a few things. Subsequent runs will be faster. The development shell works
81 81     fine on both macOS and Linux platforms.
82 82
83 83
84 84
85 85 Creating a Development Configuration
86 86 ------------------------------------
87 87
88 88 To create a development environment for RhodeCode Enterprise,
89 89 use the following steps:
90 90
91 91 1. Create a copy of `~/rhodecode-enterprise-ce/configs/development.ini`
92 92 2. Adjust the configuration settings to your needs
93 93
94 94 .. note::
95 95
96 96 It is recommended to use the name `dev.ini`.
97 97
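A minimal sketch of the two steps above, run from inside the
``rhodecode-enterprise-ce`` checkout (the target file name follows the
recommendation in the note)::

    cp configs/development.ini dev.ini
    # edit dev.ini and adjust paths, ports and database settings as needed
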
98 98
99 99 Setup the Development Database
100 100 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
101 101
102 102 To create a development database, use the following example. This is a one
103 103 time operation::
104 104
105 105 paster setup-rhodecode dev.ini \
106 106 --user=admin --password=secret \
107 107 --email=admin@example.com \
108 108 --repos=~/my_dev_repos
109 109
110 110
111 111 Compile CSS and JavaScript
112 112 ^^^^^^^^^^^^^^^^^^^^^^^^^^
113 113
114 To use the application's frontend, you will need to compile the CSS and
115 JavaScript with Grunt. This is easily done from within the nix-shell using the
116 following command::
114 To use the application's frontend and prepare it for production deployment,
115 you will need to compile the CSS and JavaScript with Grunt.
116 This is easily done from within the nix-shell using the following command::
117
118 grunt
117 119
118 make web-build
120 When developing new features, you will need to recompile following any
121 changes made to the CSS or JavaScript files::
119 122
120 You will need to recompile following any changes made to the CSS or JavaScript
121 files.
123 grunt watch
122 124
125 This prepares the development versions of the files (with comments and whitespace preserved).
123 126
124 127 Start the Development Server
125 128 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
126 129
127 130 From the rhodecode-vcsserver directory, start the development server in another
128 131 nix-shell, using the following command::
129 132
130 133 pserve configs/development.ini http_port=9900
131 134
132 135 In the adjacent nix-shell which you created for your development server, you may
133 136 now start CE with the following command::
134 137
135 138
136 139 rcserver dev.ini
137 140
138 141 .. note::
139 142
140 143 To automatically refresh - and recompile the frontend assets - when changes
141 144 are made in the source code, you can use the option `--reload`.
142 145
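A minimal sketch of such an invocation (it is assumed here that the option is
simply appended to the command shown above)::

    rcserver dev.ini --reload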
143 146
144 147 Run the Environment Tests
145 148 ^^^^^^^^^^^^^^^^^^^^^^^^^
146 149
147 150 Please make sure that the tests are passing to verify that your environment is
148 151 set up correctly. RhodeCode uses py.test to run tests.
149 152 While your instance is running, start a new nix-shell and simply run
150 153 ``make test`` to run the basic test suite.
151 154
152 155
153 156 Need Help?
154 157 ^^^^^^^^^^
155 158
156 159 Join us on Slack via https://rhodecode.com/join or post questions in our
157 160 Community Portal at https://community.rhodecode.com
@@ -1,177 +1,182 b''
1 1
2 2 ======================
3 3 Contribution Standards
4 4 ======================
5 5
6 6 Standards help to improve the quality of our product and its development. Herein
7 7 we define our standards for processes and development to maintain consistency
8 8 and function well as a community. It is a work in progress; modifications to this
9 9 document should be discussed and agreed upon by the community.
10 10
11 11
12 12 --------------------------------------------------------------------------------
13 13
14 14 Code
15 15 ====
16 16
17 17 This provides an outline for standards we use in our codebase to keep our code
18 18 easy to read and easy to maintain. Much of our code guidelines are based on the
19 19 book `Clean Code <http://www.pearsonhighered.com/educator/product/Clean-Code-A-Handbook-of-Agile-Software-Craftsmanship/9780132350884.page>`_
20 20 by Robert Martin.
21 21
22 22 We maintain a Tech Glossary to provide consistency in terms and symbolic names
23 23 used for items and concepts within the application. This is found in the CE
24 24 project in /docs-internal/glossary.rst
25 25
26 26
27 27 Refactoring
28 28 -----------
29 29 Make it better than you found it!
30 30
31 31 Our codebase can always use improvement and often benefits from refactoring.
32 32 New code should be refactored as it is being written, and old code should be
33 33 treated with the same care as if it was new. Before doing any refactoring,
34 34 ensure that there is test coverage on the affected code; this will help
35 35 minimize issues.
36 36
37 37
38 38 Python
39 39 ------
40 40 For Python, we use `PEP8 <https://www.python.org/dev/peps/pep-0008/>`_.
41 41 We adjust lines of code to under 80 characters and use 4 spaces for indentation.
42 42
43 43
44 44 JavaScript
45 45 ----------
46 46 This currently remains undefined. Suggestions welcome!
47 47
48 However, we have decided to go forward with W3C standards and picked
49 WebComponents as the foundation of user interface. New functionality should
50 be implemented as components using the
51 `Polymer Project` <https://www.polymer-project.org>`_ library
52 and should avoid external dependencies like `jquery`.
48 53
49 54 HTML
50 55 ----
51 56 Unfortunately, we currently have no strict HTML standards, but there are a few
52 57 guidelines we do follow. Templates must work in all modern browsers. HTML should
53 58 be clean and easy to read, and additionally should be free of inline CSS or
54 59 JavaScript. It is recommended to use data attributes for JS actions where
55 60 possible in order to separate it from styling and prevent unintentional changes.
56 61
57 62
58 63 LESS/CSS
59 64 --------
60 65 We use LESS for our CSS; see :doc:`frontend` for structure and formatting
61 66 guidelines.
62 67
63 68
64 69 Linters
65 70 -------
66 71 Currently, we have a linter for pull requests which checks code against PEP8.
67 72 We intend to add more in the future as we clarify standards.
68 73
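If you want to run a similar check locally before opening a pull request, one
option (an assumption, not a tool mandated by this document) is ``flake8``::

    flake8 rhodecode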
69 74
70 75 --------------------------------------------------------------------------------
71 76
72 77 Naming Conventions
73 78 ==================
74 79
75 80 These still need to be defined for naming everything from Python variables to
76 81 HTML classes to files and folders.
77 82
78 83
79 84 --------------------------------------------------------------------------------
80 85
81 86 Testing
82 87 =======
83 88
84 89 Testing is a very important aspect of our process, especially as we are our own
85 90 quality control team. While it is of course unrealistic to hit every potential
86 91 combination, our goal is to cover every line of Python code with a test.
87 92
88 93 The following is a brief introduction to our test suite. Our tests are primarily
89 94 written using `py.test <http://pytest.org/>`_
90 95
91 96
92 97 Acceptance Tests
93 98 ----------------
94 99 Also known as "ac tests", these test from the user and business perspective to
95 100 check if the requirements of a feature are met. Scenarios are created at a
96 101 feature's inception and help to define its value.
97 102
98 103 py.test is used for ac tests; they are located in a repo separate from the
99 104 other tests which follow. Each feature has a .feature file which contains a
100 105 brief description and the scenarios to be tested.
101 106
102 107
103 108 Functional Tests
104 109 ----------------
105 110 These test specific functionality in the application, checking through the
106 111 entire stack. Typically these are user actions or permissions which go through
107 112 the web browser. They are located in rhodecode/tests.
108 113
109 114
110 115 Unit Tests
111 116 ----------
112 117 These test isolated, individual modules to ensure that they function correctly.
113 118 They are located in rhodecode/tests.
114 119
115 120
116 121 Integration Tests
117 122 -----------------
118 123 These are used for testing performance of larger systems than the unit tests.
119 124 They are located in rhodecode/tests.
120 125
121 126
122 127 JavaScript Testing
123 128 ------------------
124 129 Currently, we have not defined how to test our JavaScript code.
125 130
126 131
127 132 --------------------------------------------------------------------------------
128 133
129 134 Pull Requests
130 135 =============
131 136
132 137 Pull requests should generally contain only one thing: a single feature, one bug
133 138 fix, etc. The commit history should be concise and clean, and the pull request
134 139 should contain the ticket number (also a good idea for the commits themselves)
135 140 to provide context for the reviewer.
136 141
137 142 See also: :doc:`checklist-pull-request`
138 143
139 144
140 145 Reviewers
141 146 ---------
142 147 Each pull request must be approved by at least one member of the RhodeCode
143 148 team. Assignments may be based on expertise or familiarity with a particular
144 149 area of code, or simply availability. Switching up or adding extra community
145 150 members for different pull requests helps to share knowledge as well as provide
146 151 other perspectives.
147 152
148 153
149 154 Responsibility
150 155 --------------
151 156 The community is responsible for maintaining features and this must be taken
152 157 into consideration. External contributions must be held to the same standards
153 158 as internal contributions.
154 159
155 160
156 161 Feature Switch
157 162 --------------
158 163 Experimental and work-in-progress features can be hidden behind one of two
159 164 switches:
160 165
161 166 * A setting can be added to the Labs page in the Admin section which may allow
162 167 customers to access and toggle additional features.
163 168 * For work-in-progress or other features where customer access is not desired,
164 169   use a custom setting in the .ini file as a trigger (see the sketch below).
165 170
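A minimal sketch of such an ``.ini`` trigger (the setting name is purely
hypothetical; the feature's own code would read it)::

    ## hypothetical switch for a work-in-progress feature
    my_experimental_feature.enabled = true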
166 171
167 172 --------------------------------------------------------------------------------
168 173
169 174 Tickets
170 175 =======
171 176
172 177 Redmine tickets are a crucial part of our development process. Any code added or
173 178 changed in our codebase should have a corresponding ticket to document it. With
174 179 this in mind, it is important that tickets be as clear and concise as possible,
175 180 including what the expected outcome is.
176 181
177 182 See also: :doc:`checklist-tickets`
@@ -1,61 +1,66 b''
1 1
2 2 .. _test-unit-and-functional:
3 3
4 4 ===========================
5 5 Unit and Functional Tests
6 6 ===========================
7 7
8 8
9 9
10 10 py.test based test suite
11 11 ========================
12 12
13 13
14 14 The test suite is in the folder :file:`rhodecode/tests/` and should be run with
15 15 the test runner `py.test` inside of your `nix-shell` environment::
16 16
17 # In case you need the cythonized version
18 CYTHONIZE=1 python setup.py develop --prefix=$tmp_path
19
20 17 py.test rhodecode
21 18
22 19
23 20
24 21 py.test integration
25 22 -------------------
26 23
27 24 The integration with the test runner is based on the following three parts:
28 25
29 - `pytest_pylons` is a py.test plugin which does the integration with the
30 Pylons web framework. It sets up the Pylons environment based on the given ini
31 file.
26 - :file:`rhodecode/tests/pylons_plugin.py` is a py.test plugin which does the
27 integration with the Pylons web framework. It sets up the Pylons environment
28 based on the given ini file.
32 29
33 30 Tests which depend on the Pylons environment to be set up must request the
34 31 fixture `pylonsapp`.
35 32
36 33 - :file:`rhodecode/tests/plugin.py` contains the integration of py.test with
37 RhodeCode Enterprise itself.
34 RhodeCode Enterprise itself and it takes care of setting up the needed parts
35 of the Pyramid framework.
38 36
39 37 - :file:`conftest.py` plugins are used to provide a special integration for
40 38 certain groups of tests based on the directory location.
41 39
42 40
41 .. note::
42
43 We are migrating from Pylons to its successor Pyramid. Eventually the role of
44 the file `pylons_plugin.py` will change to provide only a Pyramid
45 integration.
46
47
43 48
44 49 VCS backend selection
45 50 ---------------------
46 51
47 52 The py.test integration provides a parameter `--backends`. It will skip all
48 53 tests which are marked for other backends.
49 54
50 55 To run only Subversion tests::
51 56
52 57 py.test rhodecode --backends=svn
53 58
54 59
55 60
56 61 Frontend / Styling support
57 62 ==========================
58 63
59 64 All relevant style components have an example inside of the "Style" section
60 65 within the application. Enable the setting `debug_style` to make this section
61 66 visible in your local instance of the application.
@@ -1,109 +1,110 b''
1 1 .. _quick-start:
2 2
3 3 Quick Start Installation Guide
4 4 ==============================
5 5
6 6 .. important::
7 7
8 8 These are quick start instructions. To optimize your |RCE|,
9 9 |RCC|, and |RCT| usage, read the more detailed instructions in our guides.
10 10 For detailed installation instructions, see
11 11 :ref:`RhodeCode Control Documentation <control:rcc>`
12 12
13 13 .. tip::
14 14
15 15 If using a non-SQLite database, install and configure the database, create
16 16 a new user, and grant permissions. You will be prompted for this user's
17 17 credentials during |RCE| installation. See the relevant database
18 18 documentation for more details.
19 19
20 20 To get |RCE| up and running, run through the below steps:
21 21
22 1. Download the latest |RCC| installer from your `rhodecode.com`_ profile
23 or main page.
22 1. Download the latest |RCC| installer from `rhodecode.com/download`_.
24 23 If you don't have an account, sign up at `rhodecode.com/register`_.
25 24
26 25 2. Run the |RCC| installer and accept the End User Licence using the
27 26 following example:
28 27
29 28 .. code-block:: bash
30 29
31 30 $ chmod 755 RhodeCode-installer-linux-*
32 31 $ ./RhodeCode-installer-linux-*
33 32
34 33 3. Install a VCS Server, and configure it to start at boot.
35 34
36 35 .. code-block:: bash
37 36
38 37 $ rccontrol install VCSServer
39 38
40 39 Agree to the licence agreement? [y/N]: y
41 40 IP to start the server on [127.0.0.1]:
42 41 Port for the server to start [10005]:
43 42 Creating new instance: vcsserver-1
44 43 Installing RhodeCode VCSServer
45 44 Configuring RhodeCode VCS Server ...
46 45 Supervisord state is: RUNNING
47 46 Added process group vcsserver-1
48 47
49 48
50 49 4. Install |RCEE| or |RCCE|. If using MySQL or PostgreSQL, during
51 50 installation you'll be asked for your database credentials, so have them at hand.
52 51 MySQL or PostgreSQL needs to be running and a new database needs to be created.
53 52 You don't need any credentials or to create a database for SQLite.
54 53
55 54 .. code-block:: bash
56 55 :emphasize-lines: 11-16
57 56
58 57 $ rccontrol install Community
59 58
60 59 or
61 60
62 61 $ rccontrol install Enterprise
63 62
64 63 Username [admin]: username
65 64 Password (min 6 chars):
66 65 Repeat for confirmation:
67 66 Email: your@mail.com
68 67 Repositories location [/home/brian/repos]:
69 68 IP to start the Enterprise server on [127.0.0.1]:
70 69 Port for the Enterprise server to use [10004]:
71 70 Database type - [s]qlite, [m]ysql, [p]ostgresql:
72 71 PostgreSQL selected
73 72 Database host [127.0.0.1]:
74 73 Database port [5432]:
75 74 Database username: db-user-name
76 75 Database password: somepassword
77 76 Database name: example-db-name
78 77
79 78 5. Check the status of your installation. You |RCEE|/|RCCE| instance runs
80 79 on the URL displayed in the status message.
81 80
82 81 .. code-block:: bash
83 82
84 83 $ rccontrol status
85 84
86 85 - NAME: enterprise-1
87 86 - STATUS: RUNNING
88 87 - TYPE: Enterprise
89 88 - VERSION: 4.1.0
90 89 - URL: http://127.0.0.1:10003
91 90
92 91 - NAME: vcsserver-1
93 92 - STATUS: RUNNING
94 93 - TYPE: VCSServer
95 94 - VERSION: 4.1.0
96 95 - URL: http://127.0.0.1:10001
97 96
98 97 .. note::
99 98
100 99 Recommended post quick start install instructions:
101 100
102 101 * Read the documentation
103 102 * Carry out the :ref:`rhodecode-post-instal-ref`
104 103 * Set up :ref:`indexing-ref`
105 104 * Familiarise yourself with the :ref:`rhodecode-admin-ref` section.
106 105
107 106 .. _rhodecode.com/download/: https://rhodecode.com/download/
108 107 .. _rhodecode.com: https://rhodecode.com/
109 108 .. _rhodecode.com/register: https://rhodecode.com/register/
109 .. _rhodecode.com/download: https://rhodecode.com/download/
110
@@ -1,24 +1,24 b''
1 1 .. _install-postgresql-database:
2 2
3 3 PostgreSQL
4 4 ----------
5 5
6 To use a PostgreSQL database you should install and configurevthe database
7 before installing |RCV|. This is becausevduring |RCV| installation you will
8 setup a connection to your PostgreSQL database. To work with PostgreSQL,
6 To use a PostgreSQL database, you should install and configure the database
7 before installing |RCV|. This is because during |RCV| installation you will
8 setup the connection to your PostgreSQL database. To work with PostgreSQL,
9 9 use the following steps:
10 10
11 1. Depending on your |os|, install avPostgreSQL database following the
11 1. Depending on your |os|, install a PostgreSQL database following the
12 12 appropriate instructions from the `PostgreSQL website`_.
13 2. Configure the database with a username and password which you will use
13 2. Configure the database with a username and password, which you will use
14 14 with |RCV|.
15 15 3. Install |RCV|, and during installation select PostgreSQL as your database.
16 4. Enter the following information to during the database setup:
16 4. Enter the following information during the database setup:
17 17
18 18 * Your network IP Address
19 * The port number for MySQL access. The default MySQL port is ``5434``
19 * The port number for PostgreSQL access; the default port is ``5434``
20 20 * Your database username
21 21 * Your database password
22 22 * A new database name
23 23
24 24 .. _PostgreSQL website: http://www.postgresql.org/
@@ -1,86 +1,87 b''
1 1 .. _rhodecode-release-notes-ref:
2 2
3 3 Release Notes
4 4 =============
5 5
6 6 |RCE| 4.x Versions
7 7 ------------------
8 8
9 9 .. toctree::
10 10 :maxdepth: 1
11 11
12 release-notes-4.4.0.rst
12 13 release-notes-4.3.1.rst
13 14 release-notes-4.3.0.rst
14 15 release-notes-4.2.1.rst
15 16 release-notes-4.2.0.rst
16 17 release-notes-4.1.2.rst
17 18 release-notes-4.1.1.rst
18 19 release-notes-4.1.0.rst
19 20 release-notes-4.0.1.rst
20 21 release-notes-4.0.0.rst
21 22
22 23 |RCE| 3.x Versions
23 24 ------------------
24 25
25 26 .. toctree::
26 27 :maxdepth: 1
27 28
28 29 release-notes-3.8.4.rst
29 30 release-notes-3.8.3.rst
30 31 release-notes-3.8.2.rst
31 32 release-notes-3.8.1.rst
32 33 release-notes-3.8.0.rst
33 34 release-notes-3.7.1.rst
34 35 release-notes-3.7.0.rst
35 36 release-notes-3.6.1.rst
36 37 release-notes-3.6.0.rst
37 38 release-notes-3.5.2.rst
38 39 release-notes-3.5.1.rst
39 40 release-notes-3.5.0.rst
40 41 release-notes-3.4.1.rst
41 42 release-notes-3.4.0.rst
42 43 release-notes-3.3.4.rst
43 44 release-notes-3.3.3.rst
44 45 release-notes-3.3.2.rst
45 46 release-notes-3.3.1.rst
46 47 release-notes-3.3.0.rst
47 48 release-notes-3.2.3.rst
48 49 release-notes-3.2.2.rst
49 50 release-notes-3.2.1.rst
50 51 release-notes-3.2.0.rst
51 52 release-notes-3.1.1.rst
52 53 release-notes-3.1.0.rst
53 54 release-notes-3.0.2.rst
54 55 release-notes-3.0.1.rst
55 56 release-notes-3.0.0.rst
56 57
57 58 |RCE| 2.x Versions
58 59 ------------------
59 60
60 61 .. toctree::
61 62 :maxdepth: 1
62 63
63 64 release-notes-2.2.8.rst
64 65 release-notes-2.2.7.rst
65 66 release-notes-2.2.6.rst
66 67 release-notes-2.2.5.rst
67 68 release-notes-2.2.4.rst
68 69 release-notes-2.2.3.rst
69 70 release-notes-2.2.2.rst
70 71 release-notes-2.2.1.rst
71 72 release-notes-2.2.0.rst
72 73 release-notes-2.1.0.rst
73 74 release-notes-2.0.2.rst
74 75 release-notes-2.0.1.rst
75 76 release-notes-2.0.0.rst
76 77
77 78 |RCE| 1.x Versions
78 79 ------------------
79 80
80 81 .. toctree::
81 82 :maxdepth: 1
82 83
83 84 release-notes-1.7.2.rst
84 85 release-notes-1.7.1.rst
85 86 release-notes-1.7.0.rst
86 87 release-notes-1.6.0.rst
@@ -1,12 +1,18 b''
1 1 {
2 2 "name": "rhodecode-enterprise",
3 3 "version": "0.0.1",
4 4 "devDependencies": {
5 5 "grunt": "^0.4.5",
6 "grunt-contrib-copy": "^1.0.0",
6 7 "grunt-contrib-concat": "^0.5.1",
7 8 "grunt-contrib-jshint": "^0.12.0",
8 9 "grunt-contrib-less": "^1.1.0",
9 10 "grunt-contrib-watch": "^0.6.1",
10 "jshint": "^2.9.1-rc3"
11 "crisper": "^2.0.2",
12 "vulcanize": "^1.14.8",
13 "grunt-crisper": "^1.0.1",
14 "grunt-vulcanize": "^1.0.0",
15 "jshint": "^2.9.1-rc3",
16 "bower": "^1.7.9"
11 17 }
12 18 }
1 NO CONTENT: modified file
The requested commit or file is too big and content was truncated.
@@ -1,281 +1,295 b''
1 1 # Overrides for the generated python-packages.nix
2 2 #
3 3 # This function is intended to be used as an extension to the generated file
4 4 # python-packages.nix. The main objective is to add needed dependencies of C
5 5 # libraries and tweak the build instructions where needed.
6 6
7 7 { pkgs, basePythonPackages }:
8 8
9 9 let
10 10 sed = "sed -i";
11 11 localLicenses = {
12 12 repoze = {
13 13 fullName = "Repoze License";
14 14 url = http://www.repoze.org/LICENSE.txt;
15 15 };
16 16 };
17
18 # johbo: Interim bridge which allows us to build with the upcoming
19 # nixos.16.09 branch (unstable at the moment of writing this note) and the
20 # current stable nixos-16.03.
21 backwardsCompatibleFetchgit = { ... }@args:
22 let
23 origSources = pkgs.fetchgit args;
24 in
25 pkgs.lib.overrideDerivation origSources (oldAttrs: {
26 NIX_PREFETCH_GIT_CHECKOUT_HOOK = ''
27 find $out -name '.git*' -print0 | xargs -0 rm -rf
28 '';
29 });
30
17 31 in
18 32
19 33 self: super: {
20 34
21 35 appenlight-client = super.appenlight-client.override (attrs: {
22 36 meta = {
23 37 license = [ pkgs.lib.licenses.bsdOriginal ];
24 38 };
25 39 });
26 40
27 41 future = super.future.override (attrs: {
28 42 meta = {
29 43 license = [ pkgs.lib.licenses.mit ];
30 44 };
31 45 });
32 46
33 47 gnureadline = super.gnureadline.override (attrs: {
34 48 buildInputs = attrs.buildInputs ++ [
35 49 pkgs.ncurses
36 50 ];
37 51 patchPhase = ''
38 52 substituteInPlace setup.py --replace "/bin/bash" "${pkgs.bash}/bin/bash"
39 53 '';
40 54 });
41 55
42 56 gunicorn = super.gunicorn.override (attrs: {
43 57 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
44 58 # johbo: futures is needed as long as we are on Python 2, otherwise
45 59 # gunicorn explodes if used with multiple threads per worker.
46 60 self.futures
47 61 ];
48 62 });
49 63
50 64 ipython = super.ipython.override (attrs: {
51 65 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
52 66 self.gnureadline
53 67 ];
54 68 });
55 69
56 70 kombu = super.kombu.override (attrs: {
57 71 # The current version of kombu needs some patching to work with the
58 72 # other libs. Should be removed once we update celery and kombu.
59 73 patches = [
60 74 ./patch-kombu-py-2-7-11.diff
61 75 ./patch-kombu-msgpack.diff
62 76 ];
63 77 });
64 78
65 79 lxml = super.lxml.override (attrs: {
66 80 buildInputs = with self; [
67 81 pkgs.libxml2
68 82 pkgs.libxslt
69 83 ];
70 84 });
71 85
72 86 MySQL-python = super.MySQL-python.override (attrs: {
73 87 buildInputs = attrs.buildInputs ++ [
74 88 pkgs.openssl
75 89 ];
76 90 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
77 91 pkgs.mysql.lib
78 92 pkgs.zlib
79 93 ];
80 94 });
81 95
82 96 psutil = super.psutil.override (attrs: {
83 97 buildInputs = attrs.buildInputs ++
84 98 pkgs.lib.optional pkgs.stdenv.isDarwin pkgs.darwin.IOKit;
85 99 });
86 100
87 101 psycopg2 = super.psycopg2.override (attrs: {
88 102 buildInputs = attrs.buildInputs ++
89 103 pkgs.lib.optional pkgs.stdenv.isDarwin pkgs.openssl;
90 104 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
91 105 pkgs.postgresql
92 106 ];
93 107 meta = {
94 108 license = pkgs.lib.licenses.lgpl3Plus;
95 109 };
96 110 });
97 111
98 112 py-gfm = super.py-gfm.override {
99 src = pkgs.fetchgit {
113 src = backwardsCompatibleFetchgit {
100 114 url = "https://code.rhodecode.com/upstream/py-gfm";
101 115 rev = "0d66a19bc16e3d49de273c0f797d4e4781e8c0f2";
102 116 sha256 = "0ryp74jyihd3ckszq31bml5jr3bciimhfp7va7kw6ld92930ksv3";
103 117 };
104 118 };
105 119
106 120 pycurl = super.pycurl.override (attrs: {
107 121 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
108 122 pkgs.curl
109 123 pkgs.openssl
110 124 ];
111 125 preConfigure = ''
112 126 substituteInPlace setup.py --replace '--static-libs' '--libs'
113 127 export PYCURL_SSL_LIBRARY=openssl
114 128 '';
115 129 meta = {
116 130 # TODO: It is LGPL and MIT
117 131 license = pkgs.lib.licenses.mit;
118 132 };
119 133 });
120 134
121 135 Pylons = super.Pylons.override (attrs: {
122 136 name = "Pylons-1.0.1-patch1";
123 src = pkgs.fetchgit {
137 src = backwardsCompatibleFetchgit {
124 138 url = "https://code.rhodecode.com/upstream/pylons";
125 139 rev = "707354ee4261b9c10450404fc9852ccea4fd667d";
126 140 sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e";
127 141 };
128 142 });
129 143
130 144 pyramid = super.pyramid.override (attrs: {
131 145 postFixup = ''
132 146 wrapPythonPrograms
133 147 # TODO: johbo: "wrapPython" adds this magic line which
134 148 # confuses pserve.
135 149 ${sed} '/import sys; sys.argv/d' $out/bin/.pserve-wrapped
136 150 '';
137 151 meta = {
138 152 license = localLicenses.repoze;
139 153 };
140 154 });
141 155
142 156 pyramid-debugtoolbar = super.pyramid-debugtoolbar.override (attrs: {
143 157 meta = {
144 158 license = [ pkgs.lib.licenses.bsdOriginal localLicenses.repoze ];
145 159 };
146 160 });
147 161
148 162 Pyro4 = super.Pyro4.override (attrs: {
149 163 # TODO: Was not able to generate this version, needs further
150 164 # investigation.
151 165 name = "Pyro4-4.35";
152 166 src = pkgs.fetchurl {
153 167 url = "https://pypi.python.org/packages/source/P/Pyro4/Pyro4-4.35.src.tar.gz";
154 168 md5 = "cbe6cb855f086a0f092ca075005855f3";
155 169 };
156 170 });
157 171
158 172 pysqlite = super.pysqlite.override (attrs: {
159 173 propagatedBuildInputs = [
160 174 pkgs.sqlite
161 175 ];
162 176 meta = {
163 177 license = [ pkgs.lib.licenses.zlib pkgs.lib.licenses.libpng ];
164 178 };
165 179 });
166 180
167 181 pytest-runner = super.pytest-runner.override (attrs: {
168 182 propagatedBuildInputs = [
169 183 self.setuptools-scm
170 184 ];
171 185 });
172 186
173 187 python-ldap = super.python-ldap.override (attrs: {
174 188 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
175 189 pkgs.cyrus_sasl
176 190 pkgs.openldap
177 191 pkgs.openssl
178 192 ];
179 193 NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl}/include/sasl";
180 194 });
181 195
182 196 python-pam = super.python-pam.override (attrs:
183 197 let
184 198 includeLibPam = pkgs.stdenv.isLinux;
185 199 in {
186 200 # TODO: johbo: Move the option up into the default.nix, we should
187 201 # include python-pam only on supported platforms.
188 202 propagatedBuildInputs = attrs.propagatedBuildInputs ++
189 203 pkgs.lib.optional includeLibPam [
190 204 pkgs.pam
191 205 ];
192 206 # TODO: johbo: Check if this can be avoided, or transform into
193 207 # a real patch
194 208 patchPhase = pkgs.lib.optionals includeLibPam ''
195 209 substituteInPlace pam.py \
196 210 --replace 'find_library("pam")' '"${pkgs.pam}/lib/libpam.so.0"'
197 211 '';
198 212 });
199 213
200 214 rhodecode-tools = super.rhodecode-tools.override (attrs: {
201 215 patches = [
202 216 ./patch-rhodecode-tools-setup.diff
203 217 ];
204 218 });
205 219
206 220 URLObject = super.URLObject.override (attrs: {
207 221 meta = {
208 222 license = {
209 223 spdxId = "Unlicense";
210 224 fullName = "The Unlicense";
211 225 url = http://unlicense.org/;
212 226 };
213 227 };
214 228 });
215 229
216 230 amqplib = super.amqplib.override (attrs: {
217 231 meta = {
218 232 license = pkgs.lib.licenses.lgpl3;
219 233 };
220 234 });
221 235
222 236 docutils = super.docutils.override (attrs: {
223 237 meta = {
224 238 license = pkgs.lib.licenses.bsd2;
225 239 };
226 240 });
227 241
228 242 colander = super.colander.override (attrs: {
229 243 meta = {
230 244 license = localLicenses.repoze;
231 245 };
232 246 });
233 247
234 248 pyramid-beaker = super.pyramid-beaker.override (attrs: {
235 249 meta = {
236 250 license = localLicenses.repoze;
237 251 };
238 252 });
239 253
240 254 pyramid-mako = super.pyramid-mako.override (attrs: {
241 255 meta = {
242 256 license = localLicenses.repoze;
243 257 };
244 258 });
245 259
246 260 repoze.lru = super.repoze.lru.override (attrs: {
247 261 meta = {
248 262 license = localLicenses.repoze;
249 263 };
250 264 });
251 265
252 266 recaptcha-client = super.recaptcha-client.override (attrs: {
253 267 meta = {
254 268 # TODO: It is MIT/X11
255 269 license = pkgs.lib.licenses.mit;
256 270 };
257 271 });
258 272
259 273 python-editor = super.python-editor.override (attrs: {
260 274 meta = {
261 275 license = pkgs.lib.licenses.asl20;
262 276 };
263 277 });
264 278
265 279 translationstring = super.translationstring.override (attrs: {
266 280 meta = {
267 281 license = localLicenses.repoze;
268 282 };
269 283 });
270 284
271 285 venusian = super.venusian.override (attrs: {
272 286 meta = {
273 287 license = localLicenses.repoze;
274 288 };
275 289 });
276 290
277 291 # Avoid that setuptools is replaced, this leads to trouble
278 292 # with buildPythonPackage.
279 293 setuptools = basePythonPackages.setuptools;
280 294
281 295 }
@@ -1,1732 +1,1719 b''
1 1 {
2 2 Babel = super.buildPythonPackage {
3 3 name = "Babel-1.3";
4 4 buildInputs = with self; [];
5 5 doCheck = false;
6 6 propagatedBuildInputs = with self; [pytz];
7 7 src = fetchurl {
8 8 url = "https://pypi.python.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
9 9 md5 = "5264ceb02717843cbc9ffce8e6e06bdb";
10 10 };
11 11 meta = {
12 12 license = [ pkgs.lib.licenses.bsdOriginal ];
13 13 };
14 14 };
15 15 Beaker = super.buildPythonPackage {
16 16 name = "Beaker-1.7.0";
17 17 buildInputs = with self; [];
18 18 doCheck = false;
19 19 propagatedBuildInputs = with self; [];
20 20 src = fetchurl {
21 21 url = "https://pypi.python.org/packages/97/8e/409d2e7c009b8aa803dc9e6f239f1db7c3cdf578249087a404e7c27a505d/Beaker-1.7.0.tar.gz";
22 22 md5 = "386be3f7fe427358881eee4622b428b3";
23 23 };
24 24 meta = {
25 25 license = [ pkgs.lib.licenses.bsdOriginal ];
26 26 };
27 27 };
28 28 CProfileV = super.buildPythonPackage {
29 29 name = "CProfileV-1.0.6";
30 30 buildInputs = with self; [];
31 31 doCheck = false;
32 32 propagatedBuildInputs = with self; [bottle];
33 33 src = fetchurl {
34 34 url = "https://pypi.python.org/packages/eb/df/983a0b6cfd3ac94abf023f5011cb04f33613ace196e33f53c86cf91850d5/CProfileV-1.0.6.tar.gz";
35 35 md5 = "08c7c242b6e64237bc53c5d13537e03d";
36 36 };
37 37 meta = {
38 38 license = [ pkgs.lib.licenses.mit ];
39 39 };
40 40 };
41 41 Chameleon = super.buildPythonPackage {
42 42 name = "Chameleon-2.24";
43 43 buildInputs = with self; [];
44 44 doCheck = false;
45 45 propagatedBuildInputs = with self; [];
46 46 src = fetchurl {
47 47 url = "https://pypi.python.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
48 48 md5 = "1b01f1f6533a8a11d0d2f2366dec5342";
49 49 };
50 50 meta = {
51 51 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
52 52 };
53 53 };
54 Fabric = super.buildPythonPackage {
55 name = "Fabric-1.10.0";
56 buildInputs = with self; [];
57 doCheck = false;
58 propagatedBuildInputs = with self; [paramiko];
59 src = fetchurl {
60 url = "https://pypi.python.org/packages/e3/5f/b6ebdb5241d5ec9eab582a5c8a01255c1107da396f849e538801d2fe64a5/Fabric-1.10.0.tar.gz";
61 md5 = "2cb96473387f0e7aa035210892352f4a";
62 };
63 meta = {
64 license = [ pkgs.lib.licenses.bsdOriginal ];
65 };
66 };
67 54 FormEncode = super.buildPythonPackage {
68 55 name = "FormEncode-1.2.4";
69 56 buildInputs = with self; [];
70 57 doCheck = false;
71 58 propagatedBuildInputs = with self; [];
72 59 src = fetchurl {
73 60 url = "https://pypi.python.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
74 61 md5 = "6bc17fb9aed8aea198975e888e2077f4";
75 62 };
76 63 meta = {
77 64 license = [ pkgs.lib.licenses.psfl ];
78 65 };
79 66 };
80 67 Jinja2 = super.buildPythonPackage {
81 68 name = "Jinja2-2.7.3";
82 69 buildInputs = with self; [];
83 70 doCheck = false;
84 71 propagatedBuildInputs = with self; [MarkupSafe];
85 72 src = fetchurl {
86 73 url = "https://pypi.python.org/packages/b0/73/eab0bca302d6d6a0b5c402f47ad1760dc9cb2dd14bbc1873ad48db258e4d/Jinja2-2.7.3.tar.gz";
87 74 md5 = "b9dffd2f3b43d673802fe857c8445b1a";
88 75 };
89 76 meta = {
90 77 license = [ pkgs.lib.licenses.bsdOriginal ];
91 78 };
92 79 };
93 80 Mako = super.buildPythonPackage {
94 81 name = "Mako-1.0.1";
95 82 buildInputs = with self; [];
96 83 doCheck = false;
97 84 propagatedBuildInputs = with self; [MarkupSafe];
98 85 src = fetchurl {
99 86 url = "https://pypi.python.org/packages/8e/a4/aa56533ecaa5f22ca92428f74e074d0c9337282933c722391902c8f9e0f8/Mako-1.0.1.tar.gz";
100 87 md5 = "9f0aafd177b039ef67b90ea350497a54";
101 88 };
102 89 meta = {
103 90 license = [ pkgs.lib.licenses.mit ];
104 91 };
105 92 };
106 93 Markdown = super.buildPythonPackage {
107 94 name = "Markdown-2.6.2";
108 95 buildInputs = with self; [];
109 96 doCheck = false;
110 97 propagatedBuildInputs = with self; [];
111 98 src = fetchurl {
112 99 url = "https://pypi.python.org/packages/62/8b/83658b5f6c220d5fcde9f9852d46ea54765d734cfbc5a9f4c05bfc36db4d/Markdown-2.6.2.tar.gz";
113 100 md5 = "256d19afcc564dc4ce4c229bb762f7ae";
114 101 };
115 102 meta = {
116 103 license = [ pkgs.lib.licenses.bsdOriginal ];
117 104 };
118 105 };
119 106 MarkupSafe = super.buildPythonPackage {
120 107 name = "MarkupSafe-0.23";
121 108 buildInputs = with self; [];
122 109 doCheck = false;
123 110 propagatedBuildInputs = with self; [];
124 111 src = fetchurl {
125 112 url = "https://pypi.python.org/packages/c0/41/bae1254e0396c0cc8cf1751cb7d9afc90a602353695af5952530482c963f/MarkupSafe-0.23.tar.gz";
126 113 md5 = "f5ab3deee4c37cd6a922fb81e730da6e";
127 114 };
128 115 meta = {
129 116 license = [ pkgs.lib.licenses.bsdOriginal ];
130 117 };
131 118 };
132 119 MySQL-python = super.buildPythonPackage {
133 120 name = "MySQL-python-1.2.5";
134 121 buildInputs = with self; [];
135 122 doCheck = false;
136 123 propagatedBuildInputs = with self; [];
137 124 src = fetchurl {
138 125 url = "https://pypi.python.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
139 126 md5 = "654f75b302db6ed8dc5a898c625e030c";
140 127 };
141 128 meta = {
142 129 license = [ pkgs.lib.licenses.gpl1 ];
143 130 };
144 131 };
145 132 Paste = super.buildPythonPackage {
146 133 name = "Paste-2.0.2";
147 134 buildInputs = with self; [];
148 135 doCheck = false;
149 136 propagatedBuildInputs = with self; [six];
150 137 src = fetchurl {
151 138 url = "https://pypi.python.org/packages/d5/8d/0f8ac40687b97ff3e07ebd1369be20bdb3f93864d2dc3c2ff542edb4ce50/Paste-2.0.2.tar.gz";
152 139 md5 = "4bfc8a7eaf858f6309d2ac0f40fc951c";
153 140 };
154 141 meta = {
155 142 license = [ pkgs.lib.licenses.mit ];
156 143 };
157 144 };
158 145 PasteDeploy = super.buildPythonPackage {
159 146 name = "PasteDeploy-1.5.2";
160 147 buildInputs = with self; [];
161 148 doCheck = false;
162 149 propagatedBuildInputs = with self; [];
163 150 src = fetchurl {
164 151 url = "https://pypi.python.org/packages/0f/90/8e20cdae206c543ea10793cbf4136eb9a8b3f417e04e40a29d72d9922cbd/PasteDeploy-1.5.2.tar.gz";
165 152 md5 = "352b7205c78c8de4987578d19431af3b";
166 153 };
167 154 meta = {
168 155 license = [ pkgs.lib.licenses.mit ];
169 156 };
170 157 };
171 158 PasteScript = super.buildPythonPackage {
172 159 name = "PasteScript-1.7.5";
173 160 buildInputs = with self; [];
174 161 doCheck = false;
175 162 propagatedBuildInputs = with self; [Paste PasteDeploy];
176 163 src = fetchurl {
177 164 url = "https://pypi.python.org/packages/a5/05/fc60efa7c2f17a1dbaeccb2a903a1e90902d92b9d00eebabe3095829d806/PasteScript-1.7.5.tar.gz";
178 165 md5 = "4c72d78dcb6bb993f30536842c16af4d";
179 166 };
180 167 meta = {
181 168 license = [ pkgs.lib.licenses.mit ];
182 169 };
183 170 };
184 171 Pygments = super.buildPythonPackage {
185 172 name = "Pygments-2.1.3";
186 173 buildInputs = with self; [];
187 174 doCheck = false;
188 175 propagatedBuildInputs = with self; [];
189 176 src = fetchurl {
190 177 url = "https://pypi.python.org/packages/b8/67/ab177979be1c81bc99c8d0592ef22d547e70bb4c6815c383286ed5dec504/Pygments-2.1.3.tar.gz";
191 178 md5 = "ed3fba2467c8afcda4d317e4ef2c6150";
192 179 };
193 180 meta = {
194 181 license = [ pkgs.lib.licenses.bsdOriginal ];
195 182 };
196 183 };
197 184 Pylons = super.buildPythonPackage {
198 185 name = "Pylons-1.0.1";
199 186 buildInputs = with self; [];
200 187 doCheck = false;
201 188 propagatedBuildInputs = with self; [Routes WebHelpers Beaker Paste PasteDeploy PasteScript FormEncode simplejson decorator nose Mako WebError WebTest Tempita MarkupSafe WebOb];
202 189 src = fetchurl {
203 190 url = "https://pypi.python.org/packages/a2/69/b835a6bad00acbfeed3f33c6e44fa3f936efc998c795bfb15c61a79ecf62/Pylons-1.0.1.tar.gz";
204 191 md5 = "6cb880d75fa81213192142b07a6e4915";
205 192 };
206 193 meta = {
207 194 license = [ pkgs.lib.licenses.bsdOriginal ];
208 195 };
209 196 };
210 197 Pyro4 = super.buildPythonPackage {
211 198 name = "Pyro4-4.41";
212 199 buildInputs = with self; [];
213 200 doCheck = false;
214 201 propagatedBuildInputs = with self; [serpent];
215 202 src = fetchurl {
216 203 url = "https://pypi.python.org/packages/56/2b/89b566b4bf3e7f8ba790db2d1223852f8cb454c52cab7693dd41f608ca2a/Pyro4-4.41.tar.gz";
217 204 md5 = "ed69e9bfafa9c06c049a87cb0c4c2b6c";
218 205 };
219 206 meta = {
220 207 license = [ pkgs.lib.licenses.mit ];
221 208 };
222 209 };
223 210 Routes = super.buildPythonPackage {
224 211 name = "Routes-1.13";
225 212 buildInputs = with self; [];
226 213 doCheck = false;
227 214 propagatedBuildInputs = with self; [repoze.lru];
228 215 src = fetchurl {
229 216 url = "https://pypi.python.org/packages/88/d3/259c3b3cde8837eb9441ab5f574a660e8a4acea8f54a078441d4d2acac1c/Routes-1.13.tar.gz";
230 217 md5 = "d527b0ab7dd9172b1275a41f97448783";
231 218 };
232 219 meta = {
233 220 license = [ pkgs.lib.licenses.bsdOriginal ];
234 221 };
235 222 };
236 223 SQLAlchemy = super.buildPythonPackage {
237 224 name = "SQLAlchemy-0.9.9";
238 225 buildInputs = with self; [];
239 226 doCheck = false;
240 227 propagatedBuildInputs = with self; [];
241 228 src = fetchurl {
242 229 url = "https://pypi.python.org/packages/28/f7/1bbfd0d8597e8c358d5e15a166a486ad82fc5579b4e67b6ef7c05b1d182b/SQLAlchemy-0.9.9.tar.gz";
243 230 md5 = "8a10a9bd13ed3336ef7333ac2cc679ff";
244 231 };
245 232 meta = {
246 233 license = [ pkgs.lib.licenses.mit ];
247 234 };
248 235 };
249 236 Sphinx = super.buildPythonPackage {
250 237 name = "Sphinx-1.2.2";
251 238 buildInputs = with self; [];
252 239 doCheck = false;
253 240 propagatedBuildInputs = with self; [Pygments docutils Jinja2];
254 241 src = fetchurl {
255 242 url = "https://pypi.python.org/packages/0a/50/34017e6efcd372893a416aba14b84a1a149fc7074537b0e9cb6ca7b7abe9/Sphinx-1.2.2.tar.gz";
256 243 md5 = "3dc73ccaa8d0bfb2d62fb671b1f7e8a4";
257 244 };
258 245 meta = {
259 246 license = [ pkgs.lib.licenses.bsdOriginal ];
260 247 };
261 248 };
262 249 Tempita = super.buildPythonPackage {
263 250 name = "Tempita-0.5.2";
264 251 buildInputs = with self; [];
265 252 doCheck = false;
266 253 propagatedBuildInputs = with self; [];
267 254 src = fetchurl {
268 255 url = "https://pypi.python.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
269 256 md5 = "4c2f17bb9d481821c41b6fbee904cea1";
270 257 };
271 258 meta = {
272 259 license = [ pkgs.lib.licenses.mit ];
273 260 };
274 261 };
275 262 URLObject = super.buildPythonPackage {
276 263 name = "URLObject-2.4.0";
277 264 buildInputs = with self; [];
278 265 doCheck = false;
279 266 propagatedBuildInputs = with self; [];
280 267 src = fetchurl {
281 268 url = "https://pypi.python.org/packages/cb/b6/e25e58500f9caef85d664bec71ec67c116897bfebf8622c32cb75d1ca199/URLObject-2.4.0.tar.gz";
282 269 md5 = "2ed819738a9f0a3051f31dc9924e3065";
283 270 };
284 271 meta = {
285 272 license = [ ];
286 273 };
287 274 };
288 275 WebError = super.buildPythonPackage {
289 276 name = "WebError-0.10.3";
290 277 buildInputs = with self; [];
291 278 doCheck = false;
292 279 propagatedBuildInputs = with self; [WebOb Tempita Pygments Paste];
293 280 src = fetchurl {
294 281 url = "https://pypi.python.org/packages/35/76/e7e5c2ce7e9c7f31b54c1ff295a495886d1279a002557d74dd8957346a79/WebError-0.10.3.tar.gz";
295 282 md5 = "84b9990b0baae6fd440b1e60cdd06f9a";
296 283 };
297 284 meta = {
298 285 license = [ pkgs.lib.licenses.mit ];
299 286 };
300 287 };
301 288 WebHelpers = super.buildPythonPackage {
302 289 name = "WebHelpers-1.3";
303 290 buildInputs = with self; [];
304 291 doCheck = false;
305 292 propagatedBuildInputs = with self; [MarkupSafe];
306 293 src = fetchurl {
307 294 url = "https://pypi.python.org/packages/ee/68/4d07672821d514184357f1552f2dad923324f597e722de3b016ca4f7844f/WebHelpers-1.3.tar.gz";
308 295 md5 = "32749ffadfc40fea51075a7def32588b";
309 296 };
310 297 meta = {
311 298 license = [ pkgs.lib.licenses.bsdOriginal ];
312 299 };
313 300 };
314 301 WebHelpers2 = super.buildPythonPackage {
315 302 name = "WebHelpers2-2.0";
316 303 buildInputs = with self; [];
317 304 doCheck = false;
318 305 propagatedBuildInputs = with self; [MarkupSafe six];
319 306 src = fetchurl {
320 307 url = "https://pypi.python.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
321 308 md5 = "0f6b68d70c12ee0aed48c00b24da13d3";
322 309 };
323 310 meta = {
324 311 license = [ pkgs.lib.licenses.mit ];
325 312 };
326 313 };
327 314 WebOb = super.buildPythonPackage {
328 315 name = "WebOb-1.3.1";
329 316 buildInputs = with self; [];
330 317 doCheck = false;
331 318 propagatedBuildInputs = with self; [];
332 319 src = fetchurl {
333 320 url = "https://pypi.python.org/packages/16/78/adfc0380b8a0d75b2d543fa7085ba98a573b1ae486d9def88d172b81b9fa/WebOb-1.3.1.tar.gz";
334 321 md5 = "20918251c5726956ba8fef22d1556177";
335 322 };
336 323 meta = {
337 324 license = [ pkgs.lib.licenses.mit ];
338 325 };
339 326 };
340 327 WebTest = super.buildPythonPackage {
341 328 name = "WebTest-1.4.3";
342 329 buildInputs = with self; [];
343 330 doCheck = false;
344 331 propagatedBuildInputs = with self; [WebOb];
345 332 src = fetchurl {
346 333 url = "https://pypi.python.org/packages/51/3d/84fd0f628df10b30c7db87895f56d0158e5411206b721ca903cb51bfd948/WebTest-1.4.3.zip";
347 334 md5 = "631ce728bed92c681a4020a36adbc353";
348 335 };
349 336 meta = {
350 337 license = [ pkgs.lib.licenses.mit ];
351 338 };
352 339 };
353 340 Whoosh = super.buildPythonPackage {
354 341 name = "Whoosh-2.7.0";
355 342 buildInputs = with self; [];
356 343 doCheck = false;
357 344 propagatedBuildInputs = with self; [];
358 345 src = fetchurl {
359 346 url = "https://pypi.python.org/packages/1c/dc/2f0231ff3875ded36df8c1ab851451e51a237dc0e5a86d3d96036158da94/Whoosh-2.7.0.zip";
360 347 md5 = "7abfd970f16fadc7311960f3fa0bc7a9";
361 348 };
362 349 meta = {
363 350 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
364 351 };
365 352 };
366 353 alembic = super.buildPythonPackage {
367 354 name = "alembic-0.8.4";
368 355 buildInputs = with self; [];
369 356 doCheck = false;
370 357 propagatedBuildInputs = with self; [SQLAlchemy Mako python-editor];
371 358 src = fetchurl {
372 359 url = "https://pypi.python.org/packages/ca/7e/299b4499b5c75e5a38c5845145ad24755bebfb8eec07a2e1c366b7181eeb/alembic-0.8.4.tar.gz";
373 360 md5 = "5f95d8ee62b443f9b37eb5bee76c582d";
374 361 };
375 362 meta = {
376 363 license = [ pkgs.lib.licenses.mit ];
377 364 };
378 365 };
379 366 amqplib = super.buildPythonPackage {
380 367 name = "amqplib-1.0.2";
381 368 buildInputs = with self; [];
382 369 doCheck = false;
383 370 propagatedBuildInputs = with self; [];
384 371 src = fetchurl {
385 372 url = "https://pypi.python.org/packages/75/b7/8c2429bf8d92354a0118614f9a4d15e53bc69ebedce534284111de5a0102/amqplib-1.0.2.tgz";
386 373 md5 = "5c92f17fbedd99b2b4a836d4352d1e2f";
387 374 };
388 375 meta = {
389 376 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
390 377 };
391 378 };
392 379 anyjson = super.buildPythonPackage {
393 380 name = "anyjson-0.3.3";
394 381 buildInputs = with self; [];
395 382 doCheck = false;
396 383 propagatedBuildInputs = with self; [];
397 384 src = fetchurl {
398 385 url = "https://pypi.python.org/packages/c3/4d/d4089e1a3dd25b46bebdb55a992b0797cff657b4477bc32ce28038fdecbc/anyjson-0.3.3.tar.gz";
399 386 md5 = "2ea28d6ec311aeeebaf993cb3008b27c";
400 387 };
401 388 meta = {
402 389 license = [ pkgs.lib.licenses.bsdOriginal ];
403 390 };
404 391 };
405 392 appenlight-client = super.buildPythonPackage {
406 393 name = "appenlight-client-0.6.14";
407 394 buildInputs = with self; [];
408 395 doCheck = false;
409 396 propagatedBuildInputs = with self; [WebOb requests];
410 397 src = fetchurl {
411 398 url = "https://pypi.python.org/packages/4d/e0/23fee3ebada8143f707e65c06bcb82992040ee64ea8355e044ed55ebf0c1/appenlight_client-0.6.14.tar.gz";
412 399 md5 = "578c69b09f4356d898fff1199b98a95c";
413 400 };
414 401 meta = {
415 402 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "DFSG approved"; } ];
416 403 };
417 404 };
418 405 authomatic = super.buildPythonPackage {
419 406 name = "authomatic-0.1.0.post1";
420 407 buildInputs = with self; [];
421 408 doCheck = false;
422 409 propagatedBuildInputs = with self; [];
423 410 src = fetchurl {
424 411 url = "https://pypi.python.org/packages/08/1a/8a930461e604c2d5a7a871e1ac59fa82ccf994c32e807230c8d2fb07815a/Authomatic-0.1.0.post1.tar.gz";
425 412 md5 = "be3f3ce08747d776aae6d6cc8dcb49a9";
426 413 };
427 414 meta = {
428 415 license = [ pkgs.lib.licenses.mit ];
429 416 };
430 417 };
431 418 backport-ipaddress = super.buildPythonPackage {
432 419 name = "backport-ipaddress-0.1";
433 420 buildInputs = with self; [];
434 421 doCheck = false;
435 422 propagatedBuildInputs = with self; [];
436 423 src = fetchurl {
437 424 url = "https://pypi.python.org/packages/d3/30/54c6dab05a4dec44db25ff309f1fbb6b7a8bde3f2bade38bb9da67bbab8f/backport_ipaddress-0.1.tar.gz";
438 425 md5 = "9c1f45f4361f71b124d7293a60006c05";
439 426 };
440 427 meta = {
441 428 license = [ pkgs.lib.licenses.psfl ];
442 429 };
443 430 };
444 431 bottle = super.buildPythonPackage {
445 432 name = "bottle-0.12.8";
446 433 buildInputs = with self; [];
447 434 doCheck = false;
448 435 propagatedBuildInputs = with self; [];
449 436 src = fetchurl {
450 437 url = "https://pypi.python.org/packages/52/df/e4a408f3a7af396d186d4ecd3b389dd764f0f943b4fa8d257bfe7b49d343/bottle-0.12.8.tar.gz";
451 438 md5 = "13132c0a8f607bf860810a6ee9064c5b";
452 439 };
453 440 meta = {
454 441 license = [ pkgs.lib.licenses.mit ];
455 442 };
456 443 };
457 444 bumpversion = super.buildPythonPackage {
458 445 name = "bumpversion-0.5.3";
459 446 buildInputs = with self; [];
460 447 doCheck = false;
461 448 propagatedBuildInputs = with self; [];
462 449 src = fetchurl {
463 450 url = "https://pypi.python.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
464 451 md5 = "c66a3492eafcf5ad4b024be9fca29820";
465 452 };
466 453 meta = {
467 454 license = [ pkgs.lib.licenses.mit ];
468 455 };
469 456 };
470 457 celery = super.buildPythonPackage {
471 458 name = "celery-2.2.10";
472 459 buildInputs = with self; [];
473 460 doCheck = false;
474 461 propagatedBuildInputs = with self; [python-dateutil anyjson kombu pyparsing];
475 462 src = fetchurl {
476 463 url = "https://pypi.python.org/packages/b1/64/860fd50e45844c83442e7953effcddeff66b2851d90b2d784f7201c111b8/celery-2.2.10.tar.gz";
477 464 md5 = "898bc87e54f278055b561316ba73e222";
478 465 };
479 466 meta = {
480 467 license = [ pkgs.lib.licenses.bsdOriginal ];
481 468 };
482 469 };
483 470 channelstream = super.buildPythonPackage {
484 471 name = "channelstream-0.5.2";
485 472 buildInputs = with self; [];
486 473 doCheck = false;
487 474 propagatedBuildInputs = with self; [gevent ws4py pyramid pyramid-jinja2 itsdangerous requests six];
488 475 src = fetchurl {
489 476 url = "https://pypi.python.org/packages/2b/31/29a8e085cf5bf97fa88e7b947adabfc581a18a3463adf77fb6dada34a65f/channelstream-0.5.2.tar.gz";
490 477 md5 = "1c5eb2a8a405be6f1073da94da6d81d3";
491 478 };
492 479 meta = {
493 480 license = [ pkgs.lib.licenses.bsdOriginal ];
494 481 };
495 482 };
496 483 click = super.buildPythonPackage {
497 484 name = "click-5.1";
498 485 buildInputs = with self; [];
499 486 doCheck = false;
500 487 propagatedBuildInputs = with self; [];
501 488 src = fetchurl {
502 489 url = "https://pypi.python.org/packages/b7/34/a496632c4fb6c1ee76efedf77bb8d28b29363d839953d95095b12defe791/click-5.1.tar.gz";
503 490 md5 = "9c5323008cccfe232a8b161fc8196d41";
504 491 };
505 492 meta = {
506 493 license = [ pkgs.lib.licenses.bsdOriginal ];
507 494 };
508 495 };
509 496 colander = super.buildPythonPackage {
510 497 name = "colander-1.2";
511 498 buildInputs = with self; [];
512 499 doCheck = false;
513 500 propagatedBuildInputs = with self; [translationstring iso8601];
514 501 src = fetchurl {
515 502 url = "https://pypi.python.org/packages/14/23/c9ceba07a6a1dc0eefbb215fc0dc64aabc2b22ee756bc0f0c13278fa0887/colander-1.2.tar.gz";
516 503 md5 = "83db21b07936a0726e588dae1914b9ed";
517 504 };
518 505 meta = {
519 506 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
520 507 };
521 508 };
522 509 configobj = super.buildPythonPackage {
523 510 name = "configobj-5.0.6";
524 511 buildInputs = with self; [];
525 512 doCheck = false;
526 513 propagatedBuildInputs = with self; [six];
527 514 src = fetchurl {
528 515 url = "https://pypi.python.org/packages/64/61/079eb60459c44929e684fa7d9e2fdca403f67d64dd9dbac27296be2e0fab/configobj-5.0.6.tar.gz";
529 516 md5 = "e472a3a1c2a67bb0ec9b5d54c13a47d6";
530 517 };
531 518 meta = {
532 519 license = [ pkgs.lib.licenses.bsdOriginal ];
533 520 };
534 521 };
535 522 cov-core = super.buildPythonPackage {
536 523 name = "cov-core-1.15.0";
537 524 buildInputs = with self; [];
538 525 doCheck = false;
539 526 propagatedBuildInputs = with self; [coverage];
540 527 src = fetchurl {
541 528 url = "https://pypi.python.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
542 529 md5 = "f519d4cb4c4e52856afb14af52919fe6";
543 530 };
544 531 meta = {
545 532 license = [ pkgs.lib.licenses.mit ];
546 533 };
547 534 };
548 535 coverage = super.buildPythonPackage {
549 536 name = "coverage-3.7.1";
550 537 buildInputs = with self; [];
551 538 doCheck = false;
552 539 propagatedBuildInputs = with self; [];
553 540 src = fetchurl {
554 541 url = "https://pypi.python.org/packages/09/4f/89b06c7fdc09687bca507dc411c342556ef9c5a3b26756137a4878ff19bf/coverage-3.7.1.tar.gz";
555 542 md5 = "c47b36ceb17eaff3ecfab3bcd347d0df";
556 543 };
557 544 meta = {
558 545 license = [ pkgs.lib.licenses.bsdOriginal ];
559 546 };
560 547 };
561 548 cssselect = super.buildPythonPackage {
562 549 name = "cssselect-0.9.1";
563 550 buildInputs = with self; [];
564 551 doCheck = false;
565 552 propagatedBuildInputs = with self; [];
566 553 src = fetchurl {
567 554 url = "https://pypi.python.org/packages/aa/e5/9ee1460d485b94a6d55732eb7ad5b6c084caf73dd6f9cb0bb7d2a78fafe8/cssselect-0.9.1.tar.gz";
568 555 md5 = "c74f45966277dc7a0f768b9b0f3522ac";
569 556 };
570 557 meta = {
571 558 license = [ pkgs.lib.licenses.bsdOriginal ];
572 559 };
573 560 };
574 561 decorator = super.buildPythonPackage {
575 562 name = "decorator-3.4.2";
576 563 buildInputs = with self; [];
577 564 doCheck = false;
578 565 propagatedBuildInputs = with self; [];
579 566 src = fetchurl {
580 567 url = "https://pypi.python.org/packages/35/3a/42566eb7a2cbac774399871af04e11d7ae3fc2579e7dae85213b8d1d1c57/decorator-3.4.2.tar.gz";
581 568 md5 = "9e0536870d2b83ae27d58dbf22582f4d";
582 569 };
583 570 meta = {
584 571 license = [ pkgs.lib.licenses.bsdOriginal ];
585 572 };
586 573 };
587 574 deform = super.buildPythonPackage {
588 575 name = "deform-2.0a2";
589 576 buildInputs = with self; [];
590 577 doCheck = false;
591 578 propagatedBuildInputs = with self; [Chameleon colander peppercorn translationstring zope.deprecation];
592 579 src = fetchurl {
593 580 url = "https://pypi.python.org/packages/8d/b3/aab57e81da974a806dc9c5fa024a6404720f890a6dcf2e80885e3cb4609a/deform-2.0a2.tar.gz";
594 581 md5 = "7a90d41f7fbc18002ce74f39bd90a5e4";
595 582 };
596 583 meta = {
597 584 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
598 585 };
599 586 };
600 587 docutils = super.buildPythonPackage {
601 588 name = "docutils-0.12";
602 589 buildInputs = with self; [];
603 590 doCheck = false;
604 591 propagatedBuildInputs = with self; [];
605 592 src = fetchurl {
606 593 url = "https://pypi.python.org/packages/37/38/ceda70135b9144d84884ae2fc5886c6baac4edea39550f28bcd144c1234d/docutils-0.12.tar.gz";
607 594 md5 = "4622263b62c5c771c03502afa3157768";
608 595 };
609 596 meta = {
610 597 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
611 598 };
612 599 };
613 600 dogpile.cache = super.buildPythonPackage {
614 601 name = "dogpile.cache-0.6.1";
615 602 buildInputs = with self; [];
616 603 doCheck = false;
617 604 propagatedBuildInputs = with self; [];
618 605 src = fetchurl {
619 606 url = "https://pypi.python.org/packages/f6/a0/6f2142c58c6588d17c734265b103ae1cd0741e1681dd9483a63f22033375/dogpile.cache-0.6.1.tar.gz";
620 607 md5 = "35d7fb30f22bbd0685763d894dd079a9";
621 608 };
622 609 meta = {
623 610 license = [ pkgs.lib.licenses.bsdOriginal ];
624 611 };
625 612 };
626 613 dogpile.core = super.buildPythonPackage {
627 614 name = "dogpile.core-0.4.1";
628 615 buildInputs = with self; [];
629 616 doCheck = false;
630 617 propagatedBuildInputs = with self; [];
631 618 src = fetchurl {
632 619 url = "https://pypi.python.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
633 620 md5 = "01cb19f52bba3e95c9b560f39341f045";
634 621 };
635 622 meta = {
636 623 license = [ pkgs.lib.licenses.bsdOriginal ];
637 624 };
638 625 };
639 626 dulwich = super.buildPythonPackage {
640 627 name = "dulwich-0.12.0";
641 628 buildInputs = with self; [];
642 629 doCheck = false;
643 630 propagatedBuildInputs = with self; [];
644 631 src = fetchurl {
645 632 url = "https://pypi.python.org/packages/6f/04/fbe561b6d45c0ec758330d5b7f5ba4b6cb4f1ca1ab49859d2fc16320da75/dulwich-0.12.0.tar.gz";
646 633 md5 = "f3a8a12bd9f9dd8c233e18f3d49436fa";
647 634 };
648 635 meta = {
649 636 license = [ pkgs.lib.licenses.gpl2Plus ];
650 637 };
651 638 };
652 639 ecdsa = super.buildPythonPackage {
653 640 name = "ecdsa-0.11";
654 641 buildInputs = with self; [];
655 642 doCheck = false;
656 643 propagatedBuildInputs = with self; [];
657 644 src = fetchurl {
658 645 url = "https://pypi.python.org/packages/6c/3f/92fe5dcdcaa7bd117be21e5520c9a54375112b66ec000d209e9e9519fad1/ecdsa-0.11.tar.gz";
659 646 md5 = "8ef586fe4dbb156697d756900cb41d7c";
660 647 };
661 648 meta = {
662 649 license = [ pkgs.lib.licenses.mit ];
663 650 };
664 651 };
665 652 elasticsearch = super.buildPythonPackage {
666 653 name = "elasticsearch-2.3.0";
667 654 buildInputs = with self; [];
668 655 doCheck = false;
669 656 propagatedBuildInputs = with self; [urllib3];
670 657 src = fetchurl {
671 658 url = "https://pypi.python.org/packages/10/35/5fd52c5f0b0ee405ed4b5195e8bce44c5e041787680dc7b94b8071cac600/elasticsearch-2.3.0.tar.gz";
672 659 md5 = "2550f3b51629cf1ef9636608af92c340";
673 660 };
674 661 meta = {
675 662 license = [ pkgs.lib.licenses.asl20 ];
676 663 };
677 664 };
678 665 elasticsearch-dsl = super.buildPythonPackage {
679 666 name = "elasticsearch-dsl-2.0.0";
680 667 buildInputs = with self; [];
681 668 doCheck = false;
682 669 propagatedBuildInputs = with self; [six python-dateutil elasticsearch];
683 670 src = fetchurl {
684 671 url = "https://pypi.python.org/packages/4e/5d/e788ae8dbe2ff4d13426db0a027533386a5c276c77a2654dc0e2007ce04a/elasticsearch-dsl-2.0.0.tar.gz";
685 672 md5 = "4cdfec81bb35383dd3b7d02d7dc5ee68";
686 673 };
687 674 meta = {
688 675 license = [ pkgs.lib.licenses.asl20 ];
689 676 };
690 677 };
691 678 flake8 = super.buildPythonPackage {
692 679 name = "flake8-2.4.1";
693 680 buildInputs = with self; [];
694 681 doCheck = false;
695 682 propagatedBuildInputs = with self; [pyflakes pep8 mccabe];
696 683 src = fetchurl {
697 684 url = "https://pypi.python.org/packages/8f/b5/9a73c66c7dba273bac8758398f060c008a25f3e84531063b42503b5d0a95/flake8-2.4.1.tar.gz";
698 685 md5 = "ed45d3db81a3b7c88bd63c6e37ca1d65";
699 686 };
700 687 meta = {
701 688 license = [ pkgs.lib.licenses.mit ];
702 689 };
703 690 };
704 691 future = super.buildPythonPackage {
705 692 name = "future-0.14.3";
706 693 buildInputs = with self; [];
707 694 doCheck = false;
708 695 propagatedBuildInputs = with self; [];
709 696 src = fetchurl {
710 697 url = "https://pypi.python.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
711 698 md5 = "e94079b0bd1fc054929e8769fc0f6083";
712 699 };
713 700 meta = {
714 701 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
715 702 };
716 703 };
717 704 futures = super.buildPythonPackage {
718 705 name = "futures-3.0.2";
719 706 buildInputs = with self; [];
720 707 doCheck = false;
721 708 propagatedBuildInputs = with self; [];
722 709 src = fetchurl {
723 710 url = "https://pypi.python.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
724 711 md5 = "42aaf1e4de48d6e871d77dc1f9d96d5a";
725 712 };
726 713 meta = {
727 714 license = [ pkgs.lib.licenses.bsdOriginal ];
728 715 };
729 716 };
730 717 gevent = super.buildPythonPackage {
731 718 name = "gevent-1.1.1";
732 719 buildInputs = with self; [];
733 720 doCheck = false;
734 721 propagatedBuildInputs = with self; [greenlet];
735 722 src = fetchurl {
736 723 url = "https://pypi.python.org/packages/12/dc/0b2e57823225de86f6e111a65d212c9e3b64847dddaa19691a6cb94b0b2e/gevent-1.1.1.tar.gz";
737 724 md5 = "1532f5396ab4d07a231f1935483be7c3";
738 725 };
739 726 meta = {
740 727 license = [ pkgs.lib.licenses.mit ];
741 728 };
742 729 };
743 730 gnureadline = super.buildPythonPackage {
744 731 name = "gnureadline-6.3.3";
745 732 buildInputs = with self; [];
746 733 doCheck = false;
747 734 propagatedBuildInputs = with self; [];
748 735 src = fetchurl {
749 736 url = "https://pypi.python.org/packages/3a/ee/2c3f568b0a74974791ac590ec742ef6133e2fbd287a074ba72a53fa5e97c/gnureadline-6.3.3.tar.gz";
750 737 md5 = "c4af83c9a3fbeac8f2da9b5a7c60e51c";
751 738 };
752 739 meta = {
753 740 license = [ pkgs.lib.licenses.gpl1 ];
754 741 };
755 742 };
756 743 gprof2dot = super.buildPythonPackage {
757 744 name = "gprof2dot-2015.12.1";
758 745 buildInputs = with self; [];
759 746 doCheck = false;
760 747 propagatedBuildInputs = with self; [];
761 748 src = fetchurl {
762 749 url = "https://pypi.python.org/packages/b9/34/7bf93c1952d40fa5c95ad963f4d8344b61ef58558632402eca18e6c14127/gprof2dot-2015.12.1.tar.gz";
763 750 md5 = "e23bf4e2f94db032750c193384b4165b";
764 751 };
765 752 meta = {
766 753 license = [ { fullName = "LGPL"; } ];
767 754 };
768 755 };
769 756 greenlet = super.buildPythonPackage {
770 757 name = "greenlet-0.4.9";
771 758 buildInputs = with self; [];
772 759 doCheck = false;
773 760 propagatedBuildInputs = with self; [];
774 761 src = fetchurl {
775 762 url = "https://pypi.python.org/packages/4e/3d/9d421539b74e33608b245092870156b2e171fb49f2b51390aa4641eecb4a/greenlet-0.4.9.zip";
776 763 md5 = "c6659cdb2a5e591723e629d2eef22e82";
777 764 };
778 765 meta = {
779 766 license = [ pkgs.lib.licenses.mit ];
780 767 };
781 768 };
782 769 gunicorn = super.buildPythonPackage {
783 770 name = "gunicorn-19.6.0";
784 771 buildInputs = with self; [];
785 772 doCheck = false;
786 773 propagatedBuildInputs = with self; [];
787 774 src = fetchurl {
788 775 url = "https://pypi.python.org/packages/84/ce/7ea5396efad1cef682bbc4068e72a0276341d9d9d0f501da609fab9fcb80/gunicorn-19.6.0.tar.gz";
789 776 md5 = "338e5e8a83ea0f0625f768dba4597530";
790 777 };
791 778 meta = {
792 779 license = [ pkgs.lib.licenses.mit ];
793 780 };
794 781 };
795 782 infrae.cache = super.buildPythonPackage {
796 783 name = "infrae.cache-1.0.1";
797 784 buildInputs = with self; [];
798 785 doCheck = false;
799 786 propagatedBuildInputs = with self; [Beaker repoze.lru];
800 787 src = fetchurl {
801 788 url = "https://pypi.python.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
802 789 md5 = "b09076a766747e6ed2a755cc62088e32";
803 790 };
804 791 meta = {
805 792 license = [ pkgs.lib.licenses.zpt21 ];
806 793 };
807 794 };
808 795 invoke = super.buildPythonPackage {
809 796 name = "invoke-0.13.0";
810 797 buildInputs = with self; [];
811 798 doCheck = false;
812 799 propagatedBuildInputs = with self; [];
813 800 src = fetchurl {
814 801 url = "https://pypi.python.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
815 802 md5 = "c0d1ed4bfb34eaab551662d8cfee6540";
816 803 };
817 804 meta = {
818 805 license = [ pkgs.lib.licenses.bsdOriginal ];
819 806 };
820 807 };
821 808 ipdb = super.buildPythonPackage {
822 809 name = "ipdb-0.8";
823 810 buildInputs = with self; [];
824 811 doCheck = false;
825 812 propagatedBuildInputs = with self; [ipython];
826 813 src = fetchurl {
827 814 url = "https://pypi.python.org/packages/f0/25/d7dd430ced6cd8dc242a933c8682b5dbf32eb4011d82f87e34209e5ec845/ipdb-0.8.zip";
828 815 md5 = "96dca0712efa01aa5eaf6b22071dd3ed";
829 816 };
830 817 meta = {
831 818 license = [ pkgs.lib.licenses.gpl1 ];
832 819 };
833 820 };
834 821 ipython = super.buildPythonPackage {
835 822 name = "ipython-3.1.0";
836 823 buildInputs = with self; [];
837 824 doCheck = false;
838 825 propagatedBuildInputs = with self; [];
839 826 src = fetchurl {
840 827 url = "https://pypi.python.org/packages/06/91/120c0835254c120af89f066afaabf81289bc2726c1fc3ca0555df6882f58/ipython-3.1.0.tar.gz";
841 828 md5 = "a749d90c16068687b0ec45a27e72ef8f";
842 829 };
843 830 meta = {
844 831 license = [ pkgs.lib.licenses.bsdOriginal ];
845 832 };
846 833 };
847 834 iso8601 = super.buildPythonPackage {
848 835 name = "iso8601-0.1.11";
849 836 buildInputs = with self; [];
850 837 doCheck = false;
851 838 propagatedBuildInputs = with self; [];
852 839 src = fetchurl {
853 840 url = "https://pypi.python.org/packages/c0/75/c9209ee4d1b5975eb8c2cba4428bde6b61bd55664a98290dd015cdb18e98/iso8601-0.1.11.tar.gz";
854 841 md5 = "b06d11cd14a64096f907086044f0fe38";
855 842 };
856 843 meta = {
857 844 license = [ pkgs.lib.licenses.mit ];
858 845 };
859 846 };
860 847 itsdangerous = super.buildPythonPackage {
861 848 name = "itsdangerous-0.24";
862 849 buildInputs = with self; [];
863 850 doCheck = false;
864 851 propagatedBuildInputs = with self; [];
865 852 src = fetchurl {
866 853 url = "https://pypi.python.org/packages/dc/b4/a60bcdba945c00f6d608d8975131ab3f25b22f2bcfe1dab221165194b2d4/itsdangerous-0.24.tar.gz";
867 854 md5 = "a3d55aa79369aef5345c036a8a26307f";
868 855 };
869 856 meta = {
870 857 license = [ pkgs.lib.licenses.bsdOriginal ];
871 858 };
872 859 };
873 860 kombu = super.buildPythonPackage {
874 861 name = "kombu-1.5.1";
875 862 buildInputs = with self; [];
876 863 doCheck = false;
877 864 propagatedBuildInputs = with self; [anyjson amqplib];
878 865 src = fetchurl {
879 866 url = "https://pypi.python.org/packages/19/53/74bf2a624644b45f0850a638752514fc10a8e1cbd738f10804951a6df3f5/kombu-1.5.1.tar.gz";
880 867 md5 = "50662f3c7e9395b3d0721fb75d100b63";
881 868 };
882 869 meta = {
883 870 license = [ pkgs.lib.licenses.bsdOriginal ];
884 871 };
885 872 };
886 873 lxml = super.buildPythonPackage {
887 874 name = "lxml-3.4.4";
888 875 buildInputs = with self; [];
889 876 doCheck = false;
890 877 propagatedBuildInputs = with self; [];
891 878 src = fetchurl {
892 879 url = "https://pypi.python.org/packages/63/c7/4f2a2a4ad6c6fa99b14be6b3c1cece9142e2d915aa7c43c908677afc8fa4/lxml-3.4.4.tar.gz";
893 880 md5 = "a9a65972afc173ec7a39c585f4eea69c";
894 881 };
895 882 meta = {
896 883 license = [ pkgs.lib.licenses.bsdOriginal ];
897 884 };
898 885 };
899 886 mccabe = super.buildPythonPackage {
900 887 name = "mccabe-0.3";
901 888 buildInputs = with self; [];
902 889 doCheck = false;
903 890 propagatedBuildInputs = with self; [];
904 891 src = fetchurl {
905 892 url = "https://pypi.python.org/packages/c9/2e/75231479e11a906b64ac43bad9d0bb534d00080b18bdca8db9da46e1faf7/mccabe-0.3.tar.gz";
906 893 md5 = "81640948ff226f8c12b3277059489157";
907 894 };
908 895 meta = {
909 896 license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
910 897 };
911 898 };
912 899 meld3 = super.buildPythonPackage {
913 900 name = "meld3-1.0.2";
914 901 buildInputs = with self; [];
915 902 doCheck = false;
916 903 propagatedBuildInputs = with self; [];
917 904 src = fetchurl {
918 905 url = "https://pypi.python.org/packages/45/a0/317c6422b26c12fe0161e936fc35f36552069ba8e6f7ecbd99bbffe32a5f/meld3-1.0.2.tar.gz";
919 906 md5 = "3ccc78cd79cffd63a751ad7684c02c91";
920 907 };
921 908 meta = {
922 909 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
923 910 };
924 911 };
925 912 mock = super.buildPythonPackage {
926 913 name = "mock-1.0.1";
927 914 buildInputs = with self; [];
928 915 doCheck = false;
929 916 propagatedBuildInputs = with self; [];
930 917 src = fetchurl {
931 918 url = "https://pypi.python.org/packages/15/45/30273ee91feb60dabb8fbb2da7868520525f02cf910279b3047182feed80/mock-1.0.1.zip";
932 919 md5 = "869f08d003c289a97c1a6610faf5e913";
933 920 };
934 921 meta = {
935 922 license = [ pkgs.lib.licenses.bsdOriginal ];
936 923 };
937 924 };
938 925 msgpack-python = super.buildPythonPackage {
939 926 name = "msgpack-python-0.4.6";
940 927 buildInputs = with self; [];
941 928 doCheck = false;
942 929 propagatedBuildInputs = with self; [];
943 930 src = fetchurl {
944 931 url = "https://pypi.python.org/packages/15/ce/ff2840885789ef8035f66cd506ea05bdb228340307d5e71a7b1e3f82224c/msgpack-python-0.4.6.tar.gz";
945 932 md5 = "8b317669314cf1bc881716cccdaccb30";
946 933 };
947 934 meta = {
948 935 license = [ pkgs.lib.licenses.asl20 ];
949 936 };
950 937 };
951 938 nose = super.buildPythonPackage {
952 939 name = "nose-1.3.6";
953 940 buildInputs = with self; [];
954 941 doCheck = false;
955 942 propagatedBuildInputs = with self; [];
956 943 src = fetchurl {
957 944 url = "https://pypi.python.org/packages/70/c7/469e68148d17a0d3db5ed49150242fd70a74a8147b8f3f8b87776e028d99/nose-1.3.6.tar.gz";
958 945 md5 = "0ca546d81ca8309080fc80cb389e7a16";
959 946 };
960 947 meta = {
961 948 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "GNU LGPL"; } ];
962 949 };
963 950 };
964 951 objgraph = super.buildPythonPackage {
965 952 name = "objgraph-2.0.0";
966 953 buildInputs = with self; [];
967 954 doCheck = false;
968 955 propagatedBuildInputs = with self; [];
969 956 src = fetchurl {
970 957 url = "https://pypi.python.org/packages/d7/33/ace750b59247496ed769b170586c5def7202683f3d98e737b75b767ff29e/objgraph-2.0.0.tar.gz";
971 958 md5 = "25b0d5e5adc74aa63ead15699614159c";
972 959 };
973 960 meta = {
974 961 license = [ pkgs.lib.licenses.mit ];
975 962 };
976 963 };
977 964 packaging = super.buildPythonPackage {
978 965 name = "packaging-15.2";
979 966 buildInputs = with self; [];
980 967 doCheck = false;
981 968 propagatedBuildInputs = with self; [];
982 969 src = fetchurl {
983 970 url = "https://pypi.python.org/packages/24/c4/185da1304f07047dc9e0c46c31db75c0351bd73458ac3efad7da3dbcfbe1/packaging-15.2.tar.gz";
984 971 md5 = "c16093476f6ced42128bf610e5db3784";
985 972 };
986 973 meta = {
987 974 license = [ pkgs.lib.licenses.asl20 ];
988 975 };
989 976 };
990 977 paramiko = super.buildPythonPackage {
991 978 name = "paramiko-1.15.1";
992 979 buildInputs = with self; [];
993 980 doCheck = false;
994 981 propagatedBuildInputs = with self; [pycrypto ecdsa];
995 982 src = fetchurl {
996 983 url = "https://pypi.python.org/packages/04/2b/a22d2a560c1951abbbf95a0628e245945565f70dc082d9e784666887222c/paramiko-1.15.1.tar.gz";
997 984 md5 = "48c274c3f9b1282932567b21f6acf3b5";
998 985 };
999 986 meta = {
1000 987 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1001 988 };
1002 989 };
1003 990 pep8 = super.buildPythonPackage {
1004 991 name = "pep8-1.5.7";
1005 992 buildInputs = with self; [];
1006 993 doCheck = false;
1007 994 propagatedBuildInputs = with self; [];
1008 995 src = fetchurl {
1009 996 url = "https://pypi.python.org/packages/8b/de/259f5e735897ada1683489dd514b2a1c91aaa74e5e6b68f80acf128a6368/pep8-1.5.7.tar.gz";
1010 997 md5 = "f6adbdd69365ecca20513c709f9b7c93";
1011 998 };
1012 999 meta = {
1013 1000 license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
1014 1001 };
1015 1002 };
1016 1003 peppercorn = super.buildPythonPackage {
1017 1004 name = "peppercorn-0.5";
1018 1005 buildInputs = with self; [];
1019 1006 doCheck = false;
1020 1007 propagatedBuildInputs = with self; [];
1021 1008 src = fetchurl {
1022 1009 url = "https://pypi.python.org/packages/45/ec/a62ec317d1324a01567c5221b420742f094f05ee48097e5157d32be3755c/peppercorn-0.5.tar.gz";
1023 1010 md5 = "f08efbca5790019ab45d76b7244abd40";
1024 1011 };
1025 1012 meta = {
1026 1013 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1027 1014 };
1028 1015 };
1029 1016 psutil = super.buildPythonPackage {
1030 1017 name = "psutil-2.2.1";
1031 1018 buildInputs = with self; [];
1032 1019 doCheck = false;
1033 1020 propagatedBuildInputs = with self; [];
1034 1021 src = fetchurl {
1035 1022 url = "https://pypi.python.org/packages/df/47/ee54ef14dd40f8ce831a7581001a5096494dc99fe71586260ca6b531fe86/psutil-2.2.1.tar.gz";
1036 1023 md5 = "1a2b58cd9e3a53528bb6148f0c4d5244";
1037 1024 };
1038 1025 meta = {
1039 1026 license = [ pkgs.lib.licenses.bsdOriginal ];
1040 1027 };
1041 1028 };
1042 1029 psycopg2 = super.buildPythonPackage {
1043 1030 name = "psycopg2-2.6.1";
1044 1031 buildInputs = with self; [];
1045 1032 doCheck = false;
1046 1033 propagatedBuildInputs = with self; [];
1047 1034 src = fetchurl {
1048 1035 url = "https://pypi.python.org/packages/86/fd/cc8315be63a41fe000cce20482a917e874cdc1151e62cb0141f5e55f711e/psycopg2-2.6.1.tar.gz";
1049 1036 md5 = "842b44f8c95517ed5b792081a2370da1";
1050 1037 };
1051 1038 meta = {
1052 1039 license = [ pkgs.lib.licenses.zpt21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1053 1040 };
1054 1041 };
1055 1042 py = super.buildPythonPackage {
1056 1043 name = "py-1.4.29";
1057 1044 buildInputs = with self; [];
1058 1045 doCheck = false;
1059 1046 propagatedBuildInputs = with self; [];
1060 1047 src = fetchurl {
1061 1048 url = "https://pypi.python.org/packages/2a/bc/a1a4a332ac10069b8e5e25136a35e08a03f01fd6ab03d819889d79a1fd65/py-1.4.29.tar.gz";
1062 1049 md5 = "c28e0accba523a29b35a48bb703fb96c";
1063 1050 };
1064 1051 meta = {
1065 1052 license = [ pkgs.lib.licenses.mit ];
1066 1053 };
1067 1054 };
1068 1055 py-bcrypt = super.buildPythonPackage {
1069 1056 name = "py-bcrypt-0.4";
1070 1057 buildInputs = with self; [];
1071 1058 doCheck = false;
1072 1059 propagatedBuildInputs = with self; [];
1073 1060 src = fetchurl {
1074 1061 url = "https://pypi.python.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1075 1062 md5 = "dd8b367d6b716a2ea2e72392525f4e36";
1076 1063 };
1077 1064 meta = {
1078 1065 license = [ pkgs.lib.licenses.bsdOriginal ];
1079 1066 };
1080 1067 };
1081 1068 py-gfm = super.buildPythonPackage {
1082 1069 name = "py-gfm-0.1.3";
1083 1070 buildInputs = with self; [];
1084 1071 doCheck = false;
1085 1072 propagatedBuildInputs = with self; [setuptools Markdown];
1086 1073 src = fetchurl {
1087 1074 url = "https://pypi.python.org/packages/12/e4/6b3d8678da04f97d7490d8264d8de51c2dc9fb91209ccee9c515c95e14c5/py-gfm-0.1.3.tar.gz";
1088 1075 md5 = "e588d9e69640a241b97e2c59c22527a6";
1089 1076 };
1090 1077 meta = {
1091 1078 license = [ pkgs.lib.licenses.bsdOriginal ];
1092 1079 };
1093 1080 };
1094 1081 pycrypto = super.buildPythonPackage {
1095 1082 name = "pycrypto-2.6.1";
1096 1083 buildInputs = with self; [];
1097 1084 doCheck = false;
1098 1085 propagatedBuildInputs = with self; [];
1099 1086 src = fetchurl {
1100 1087 url = "https://pypi.python.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1101 1088 md5 = "55a61a054aa66812daf5161a0d5d7eda";
1102 1089 };
1103 1090 meta = {
1104 1091 license = [ pkgs.lib.licenses.publicDomain ];
1105 1092 };
1106 1093 };
1107 1094 pycurl = super.buildPythonPackage {
1108 1095 name = "pycurl-7.19.5";
1109 1096 buildInputs = with self; [];
1110 1097 doCheck = false;
1111 1098 propagatedBuildInputs = with self; [];
1112 1099 src = fetchurl {
1113 1100 url = "https://pypi.python.org/packages/6c/48/13bad289ef6f4869b1d8fc11ae54de8cfb3cc4a2eb9f7419c506f763be46/pycurl-7.19.5.tar.gz";
1114 1101 md5 = "47b4eac84118e2606658122104e62072";
1115 1102 };
1116 1103 meta = {
1117 1104 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1118 1105 };
1119 1106 };
1120 1107 pyflakes = super.buildPythonPackage {
1121 1108 name = "pyflakes-0.8.1";
1122 1109 buildInputs = with self; [];
1123 1110 doCheck = false;
1124 1111 propagatedBuildInputs = with self; [];
1125 1112 src = fetchurl {
1126 1113 url = "https://pypi.python.org/packages/75/22/a90ec0252f4f87f3ffb6336504de71fe16a49d69c4538dae2f12b9360a38/pyflakes-0.8.1.tar.gz";
1127 1114 md5 = "905fe91ad14b912807e8fdc2ac2e2c23";
1128 1115 };
1129 1116 meta = {
1130 1117 license = [ pkgs.lib.licenses.mit ];
1131 1118 };
1132 1119 };
1133 1120 pyparsing = super.buildPythonPackage {
1134 1121 name = "pyparsing-1.5.7";
1135 1122 buildInputs = with self; [];
1136 1123 doCheck = false;
1137 1124 propagatedBuildInputs = with self; [];
1138 1125 src = fetchurl {
1139 1126 url = "https://pypi.python.org/packages/2e/26/e8fb5b4256a5f5036be7ce115ef8db8d06bc537becfbdc46c6af008314ee/pyparsing-1.5.7.zip";
1140 1127 md5 = "b86854857a368d6ccb4d5b6e76d0637f";
1141 1128 };
1142 1129 meta = {
1143 1130 license = [ pkgs.lib.licenses.mit ];
1144 1131 };
1145 1132 };
1146 1133 pyramid = super.buildPythonPackage {
1147 1134 name = "pyramid-1.6.1";
1148 1135 buildInputs = with self; [];
1149 1136 doCheck = false;
1150 1137 propagatedBuildInputs = with self; [setuptools WebOb repoze.lru zope.interface zope.deprecation venusian translationstring PasteDeploy];
1151 1138 src = fetchurl {
1152 1139 url = "https://pypi.python.org/packages/30/b3/fcc4a2a4800cbf21989e00454b5828cf1f7fe35c63e0810b350e56d4c475/pyramid-1.6.1.tar.gz";
1153 1140 md5 = "b18688ff3cc33efdbb098a35b45dd122";
1154 1141 };
1155 1142 meta = {
1156 1143 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1157 1144 };
1158 1145 };
1159 1146 pyramid-beaker = super.buildPythonPackage {
1160 1147 name = "pyramid-beaker-0.8";
1161 1148 buildInputs = with self; [];
1162 1149 doCheck = false;
1163 1150 propagatedBuildInputs = with self; [pyramid Beaker];
1164 1151 src = fetchurl {
1165 1152 url = "https://pypi.python.org/packages/d9/6e/b85426e00fd3d57f4545f74e1c3828552d8700f13ededeef9233f7bca8be/pyramid_beaker-0.8.tar.gz";
1166 1153 md5 = "22f14be31b06549f80890e2c63a93834";
1167 1154 };
1168 1155 meta = {
1169 1156 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1170 1157 };
1171 1158 };
1172 1159 pyramid-debugtoolbar = super.buildPythonPackage {
1173 1160 name = "pyramid-debugtoolbar-2.4.2";
1174 1161 buildInputs = with self; [];
1175 1162 doCheck = false;
1176 1163 propagatedBuildInputs = with self; [pyramid pyramid-mako repoze.lru Pygments];
1177 1164 src = fetchurl {
1178 1165 url = "https://pypi.python.org/packages/89/00/ed5426ee41ed747ba3ffd30e8230841a6878286ea67d480b1444d24f06a2/pyramid_debugtoolbar-2.4.2.tar.gz";
1179 1166 md5 = "073ea67086cc4bd5decc3a000853642d";
1180 1167 };
1181 1168 meta = {
1182 1169 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1183 1170 };
1184 1171 };
1185 1172 pyramid-jinja2 = super.buildPythonPackage {
1186 1173 name = "pyramid-jinja2-2.5";
1187 1174 buildInputs = with self; [];
1188 1175 doCheck = false;
1189 1176 propagatedBuildInputs = with self; [pyramid zope.deprecation Jinja2 MarkupSafe];
1190 1177 src = fetchurl {
1191 1178 url = "https://pypi.python.org/packages/a1/80/595e26ffab7deba7208676b6936b7e5a721875710f982e59899013cae1ed/pyramid_jinja2-2.5.tar.gz";
1192 1179 md5 = "07cb6547204ac5e6f0b22a954ccee928";
1193 1180 };
1194 1181 meta = {
1195 1182 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1196 1183 };
1197 1184 };
1198 1185 pyramid-mako = super.buildPythonPackage {
1199 1186 name = "pyramid-mako-1.0.2";
1200 1187 buildInputs = with self; [];
1201 1188 doCheck = false;
1202 1189 propagatedBuildInputs = with self; [pyramid Mako];
1203 1190 src = fetchurl {
1204 1191 url = "https://pypi.python.org/packages/f1/92/7e69bcf09676d286a71cb3bbb887b16595b96f9ba7adbdc239ffdd4b1eb9/pyramid_mako-1.0.2.tar.gz";
1205 1192 md5 = "ee25343a97eb76bd90abdc2a774eb48a";
1206 1193 };
1207 1194 meta = {
1208 1195 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1209 1196 };
1210 1197 };
1211 1198 pysqlite = super.buildPythonPackage {
1212 1199 name = "pysqlite-2.6.3";
1213 1200 buildInputs = with self; [];
1214 1201 doCheck = false;
1215 1202 propagatedBuildInputs = with self; [];
1216 1203 src = fetchurl {
1217 1204 url = "https://pypi.python.org/packages/5c/a6/1c429cd4c8069cf4bfbd0eb4d592b3f4042155a8202df83d7e9b93aa3dc2/pysqlite-2.6.3.tar.gz";
1218 1205 md5 = "7ff1cedee74646b50117acff87aa1cfa";
1219 1206 };
1220 1207 meta = {
1221 1208 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1222 1209 };
1223 1210 };
1224 1211 pytest = super.buildPythonPackage {
1225 1212 name = "pytest-2.8.5";
1226 1213 buildInputs = with self; [];
1227 1214 doCheck = false;
1228 1215 propagatedBuildInputs = with self; [py];
1229 1216 src = fetchurl {
1230 1217 url = "https://pypi.python.org/packages/b1/3d/d7ea9b0c51e0cacded856e49859f0a13452747491e842c236bbab3714afe/pytest-2.8.5.zip";
1231 1218 md5 = "8493b06f700862f1294298d6c1b715a9";
1232 1219 };
1233 1220 meta = {
1234 1221 license = [ pkgs.lib.licenses.mit ];
1235 1222 };
1236 1223 };
1237 1224 pytest-catchlog = super.buildPythonPackage {
1238 1225 name = "pytest-catchlog-1.2.2";
1239 1226 buildInputs = with self; [];
1240 1227 doCheck = false;
1241 1228 propagatedBuildInputs = with self; [py pytest];
1242 1229 src = fetchurl {
1243 1230 url = "https://pypi.python.org/packages/f2/2b/2faccdb1a978fab9dd0bf31cca9f6847fbe9184a0bdcc3011ac41dd44191/pytest-catchlog-1.2.2.zip";
1244 1231 md5 = "09d890c54c7456c818102b7ff8c182c8";
1245 1232 };
1246 1233 meta = {
1247 1234 license = [ pkgs.lib.licenses.mit ];
1248 1235 };
1249 1236 };
1250 1237 pytest-cov = super.buildPythonPackage {
1251 1238 name = "pytest-cov-1.8.1";
1252 1239 buildInputs = with self; [];
1253 1240 doCheck = false;
1254 1241 propagatedBuildInputs = with self; [py pytest coverage cov-core];
1255 1242 src = fetchurl {
1256 1243 url = "https://pypi.python.org/packages/11/4b/b04646e97f1721878eb21e9f779102d84dd044d324382263b1770a3e4838/pytest-cov-1.8.1.tar.gz";
1257 1244 md5 = "76c778afa2494088270348be42d759fc";
1258 1245 };
1259 1246 meta = {
1260 1247 license = [ pkgs.lib.licenses.mit ];
1261 1248 };
1262 1249 };
1263 1250 pytest-profiling = super.buildPythonPackage {
1264 1251 name = "pytest-profiling-1.0.1";
1265 1252 buildInputs = with self; [];
1266 1253 doCheck = false;
1267 1254 propagatedBuildInputs = with self; [six pytest gprof2dot];
1268 1255 src = fetchurl {
1269 1256 url = "https://pypi.python.org/packages/d8/67/8ffab73406e22870e07fa4dc8dce1d7689b26dba8efd00161c9b6fc01ec0/pytest-profiling-1.0.1.tar.gz";
1270 1257 md5 = "354404eb5b3fd4dc5eb7fffbb3d9b68b";
1271 1258 };
1272 1259 meta = {
1273 1260 license = [ pkgs.lib.licenses.mit ];
1274 1261 };
1275 1262 };
1276 1263 pytest-runner = super.buildPythonPackage {
1277 1264 name = "pytest-runner-2.7.1";
1278 1265 buildInputs = with self; [];
1279 1266 doCheck = false;
1280 1267 propagatedBuildInputs = with self; [];
1281 1268 src = fetchurl {
1282 1269 url = "https://pypi.python.org/packages/99/6b/c4ff4418d3424d4475b7af60724fd4a5cdd91ed8e489dc9443281f0052bc/pytest-runner-2.7.1.tar.gz";
1283 1270 md5 = "e56f0bc8d79a6bd91772b44ef4215c7e";
1284 1271 };
1285 1272 meta = {
1286 1273 license = [ pkgs.lib.licenses.mit ];
1287 1274 };
1288 1275 };
1289 1276 pytest-timeout = super.buildPythonPackage {
1290 1277 name = "pytest-timeout-0.4";
1291 1278 buildInputs = with self; [];
1292 1279 doCheck = false;
1293 1280 propagatedBuildInputs = with self; [pytest];
1294 1281 src = fetchurl {
1295 1282 url = "https://pypi.python.org/packages/24/48/5f6bd4b8026a26e1dd427243d560a29a0f1b24a5c7cffca4bf049a7bb65b/pytest-timeout-0.4.tar.gz";
1296 1283 md5 = "03b28aff69cbbfb959ed35ade5fde262";
1297 1284 };
1298 1285 meta = {
1299 1286 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1300 1287 };
1301 1288 };
1302 1289 python-dateutil = super.buildPythonPackage {
1303 1290 name = "python-dateutil-1.5";
1304 1291 buildInputs = with self; [];
1305 1292 doCheck = false;
1306 1293 propagatedBuildInputs = with self; [];
1307 1294 src = fetchurl {
1308 1295 url = "https://pypi.python.org/packages/b4/7c/df59c89a753eb33c7c44e1dd42de0e9bc2ccdd5a4d576e0bfad97cc280cb/python-dateutil-1.5.tar.gz";
1309 1296 md5 = "0dcb1de5e5cad69490a3b6ab63f0cfa5";
1310 1297 };
1311 1298 meta = {
1312 1299 license = [ pkgs.lib.licenses.psfl ];
1313 1300 };
1314 1301 };
1315 1302 python-editor = super.buildPythonPackage {
1316 1303 name = "python-editor-1.0.1";
1317 1304 buildInputs = with self; [];
1318 1305 doCheck = false;
1319 1306 propagatedBuildInputs = with self; [];
1320 1307 src = fetchurl {
1321 1308 url = "https://pypi.python.org/packages/2b/c0/df7b87d5cf016f82eab3b05cd35f53287c1178ad8c42bfb6fa61b89b22f6/python-editor-1.0.1.tar.gz";
1322 1309 md5 = "e1fa63535b40e022fa4fd646fd8b511a";
1323 1310 };
1324 1311 meta = {
1325 1312 license = [ pkgs.lib.licenses.asl20 ];
1326 1313 };
1327 1314 };
1328 1315 python-ldap = super.buildPythonPackage {
1329 1316 name = "python-ldap-2.4.19";
1330 1317 buildInputs = with self; [];
1331 1318 doCheck = false;
1332 1319 propagatedBuildInputs = with self; [setuptools];
1333 1320 src = fetchurl {
1334 1321 url = "https://pypi.python.org/packages/42/81/1b64838c82e64f14d4e246ff00b52e650a35c012551b891ada2b85d40737/python-ldap-2.4.19.tar.gz";
1335 1322 md5 = "b941bf31d09739492aa19ef679e94ae3";
1336 1323 };
1337 1324 meta = {
1338 1325 license = [ pkgs.lib.licenses.psfl ];
1339 1326 };
1340 1327 };
1341 1328 python-memcached = super.buildPythonPackage {
1342 1329 name = "python-memcached-1.57";
1343 1330 buildInputs = with self; [];
1344 1331 doCheck = false;
1345 1332 propagatedBuildInputs = with self; [six];
1346 1333 src = fetchurl {
1347 1334 url = "https://pypi.python.org/packages/52/9d/eebc0dcbc5c7c66840ad207dfc1baa376dadb74912484bff73819cce01e6/python-memcached-1.57.tar.gz";
1348 1335 md5 = "de21f64b42b2d961f3d4ad7beb5468a1";
1349 1336 };
1350 1337 meta = {
1351 1338 license = [ pkgs.lib.licenses.psfl ];
1352 1339 };
1353 1340 };
1354 1341 python-pam = super.buildPythonPackage {
1355 1342 name = "python-pam-1.8.2";
1356 1343 buildInputs = with self; [];
1357 1344 doCheck = false;
1358 1345 propagatedBuildInputs = with self; [];
1359 1346 src = fetchurl {
1360 1347 url = "https://pypi.python.org/packages/de/8c/f8f5d38b4f26893af267ea0b39023d4951705ab0413a39e0cf7cf4900505/python-pam-1.8.2.tar.gz";
1361 1348 md5 = "db71b6b999246fb05d78ecfbe166629d";
1362 1349 };
1363 1350 meta = {
1364 1351 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1365 1352 };
1366 1353 };
1367 1354 pytz = super.buildPythonPackage {
1368 1355 name = "pytz-2015.4";
1369 1356 buildInputs = with self; [];
1370 1357 doCheck = false;
1371 1358 propagatedBuildInputs = with self; [];
1372 1359 src = fetchurl {
1373 1360 url = "https://pypi.python.org/packages/7e/1a/f43b5c92df7b156822030fed151327ea096bcf417e45acc23bd1df43472f/pytz-2015.4.zip";
1374 1361 md5 = "233f2a2b370d03f9b5911700cc9ebf3c";
1375 1362 };
1376 1363 meta = {
1377 1364 license = [ pkgs.lib.licenses.mit ];
1378 1365 };
1379 1366 };
1380 1367 pyzmq = super.buildPythonPackage {
1381 1368 name = "pyzmq-14.6.0";
1382 1369 buildInputs = with self; [];
1383 1370 doCheck = false;
1384 1371 propagatedBuildInputs = with self; [];
1385 1372 src = fetchurl {
1386 1373 url = "https://pypi.python.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1387 1374 md5 = "395b5de95a931afa5b14c9349a5b8024";
1388 1375 };
1389 1376 meta = {
1390 1377 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1391 1378 };
1392 1379 };
1393 1380 recaptcha-client = super.buildPythonPackage {
1394 1381 name = "recaptcha-client-1.0.6";
1395 1382 buildInputs = with self; [];
1396 1383 doCheck = false;
1397 1384 propagatedBuildInputs = with self; [];
1398 1385 src = fetchurl {
1399 1386 url = "https://pypi.python.org/packages/0a/ea/5f2fbbfd894bdac1c68ef8d92019066cfcf9fbff5fe3d728d2b5c25c8db4/recaptcha-client-1.0.6.tar.gz";
1400 1387 md5 = "74228180f7e1fb76c4d7089160b0d919";
1401 1388 };
1402 1389 meta = {
1403 1390 license = [ { fullName = "MIT/X11"; } ];
1404 1391 };
1405 1392 };
1406 1393 repoze.lru = super.buildPythonPackage {
1407 1394 name = "repoze.lru-0.6";
1408 1395 buildInputs = with self; [];
1409 1396 doCheck = false;
1410 1397 propagatedBuildInputs = with self; [];
1411 1398 src = fetchurl {
1412 1399 url = "https://pypi.python.org/packages/6e/1e/aa15cc90217e086dc8769872c8778b409812ff036bf021b15795638939e4/repoze.lru-0.6.tar.gz";
1413 1400 md5 = "2c3b64b17a8e18b405f55d46173e14dd";
1414 1401 };
1415 1402 meta = {
1416 1403 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1417 1404 };
1418 1405 };
1419 1406 requests = super.buildPythonPackage {
1420 1407 name = "requests-2.9.1";
1421 1408 buildInputs = with self; [];
1422 1409 doCheck = false;
1423 1410 propagatedBuildInputs = with self; [];
1424 1411 src = fetchurl {
1425 1412 url = "https://pypi.python.org/packages/f9/6d/07c44fb1ebe04d069459a189e7dab9e4abfe9432adcd4477367c25332748/requests-2.9.1.tar.gz";
1426 1413 md5 = "0b7f480d19012ec52bab78292efd976d";
1427 1414 };
1428 1415 meta = {
1429 1416 license = [ pkgs.lib.licenses.asl20 ];
1430 1417 };
1431 1418 };
1432 1419 rhodecode-enterprise-ce = super.buildPythonPackage {
1433 name = "rhodecode-enterprise-ce-4.3.1";
1420 name = "rhodecode-enterprise-ce-4.4.0";
1434 1421 buildInputs = with self; [WebTest configobj cssselect flake8 lxml mock pytest pytest-cov pytest-runner];
1435 1422 doCheck = true;
1436 1423 propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt];
1437 1424 src = ./.;
1438 1425 meta = {
1439 1426 license = [ { fullName = "AGPLv3, and Commercial License"; } ];
1440 1427 };
1441 1428 };
1442 1429 rhodecode-tools = super.buildPythonPackage {
1443 1430 name = "rhodecode-tools-0.10.0";
1444 1431 buildInputs = with self; [];
1445 1432 doCheck = false;
1446 1433 propagatedBuildInputs = with self; [click future six Mako MarkupSafe requests Whoosh elasticsearch elasticsearch-dsl];
1447 1434 src = fetchurl {
1448 1435 url = "https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.10.0.zip";
1449 1436 md5 = "4762391473ded761bead3aa58c748044";
1450 1437 };
1451 1438 meta = {
1452 1439 license = [ { fullName = "AGPLv3 and Proprietary"; } ];
1453 1440 };
1454 1441 };
1455 1442 serpent = super.buildPythonPackage {
1456 1443 name = "serpent-1.12";
1457 1444 buildInputs = with self; [];
1458 1445 doCheck = false;
1459 1446 propagatedBuildInputs = with self; [];
1460 1447 src = fetchurl {
1461 1448 url = "https://pypi.python.org/packages/3b/19/1e0e83b47c09edaef8398655088036e7e67386b5c48770218ebb339fbbd5/serpent-1.12.tar.gz";
1462 1449 md5 = "05869ac7b062828b34f8f927f0457b65";
1463 1450 };
1464 1451 meta = {
1465 1452 license = [ pkgs.lib.licenses.mit ];
1466 1453 };
1467 1454 };
1468 1455 setproctitle = super.buildPythonPackage {
1469 1456 name = "setproctitle-1.1.8";
1470 1457 buildInputs = with self; [];
1471 1458 doCheck = false;
1472 1459 propagatedBuildInputs = with self; [];
1473 1460 src = fetchurl {
1474 1461 url = "https://pypi.python.org/packages/33/c3/ad367a4f4f1ca90468863ae727ac62f6edb558fc09a003d344a02cfc6ea6/setproctitle-1.1.8.tar.gz";
1475 1462 md5 = "728f4c8c6031bbe56083a48594027edd";
1476 1463 };
1477 1464 meta = {
1478 1465 license = [ pkgs.lib.licenses.bsdOriginal ];
1479 1466 };
1480 1467 };
1481 1468 setuptools = super.buildPythonPackage {
1482 1469 name = "setuptools-20.8.1";
1483 1470 buildInputs = with self; [];
1484 1471 doCheck = false;
1485 1472 propagatedBuildInputs = with self; [];
1486 1473 src = fetchurl {
1487 1474 url = "https://pypi.python.org/packages/c4/19/c1bdc88b53da654df43770f941079dbab4e4788c2dcb5658fb86259894c7/setuptools-20.8.1.zip";
1488 1475 md5 = "fe58a5cac0df20bb83942b252a4b0543";
1489 1476 };
1490 1477 meta = {
1491 1478 license = [ pkgs.lib.licenses.mit ];
1492 1479 };
1493 1480 };
1494 1481 setuptools-scm = super.buildPythonPackage {
1495 1482 name = "setuptools-scm-1.11.0";
1496 1483 buildInputs = with self; [];
1497 1484 doCheck = false;
1498 1485 propagatedBuildInputs = with self; [];
1499 1486 src = fetchurl {
1500 1487 url = "https://pypi.python.org/packages/cd/5f/e3a038292358058d83d764a47d09114aa5a8003ed4529518f9e580f1a94f/setuptools_scm-1.11.0.tar.gz";
1501 1488 md5 = "4c5c896ba52e134bbc3507bac6400087";
1502 1489 };
1503 1490 meta = {
1504 1491 license = [ pkgs.lib.licenses.mit ];
1505 1492 };
1506 1493 };
1507 1494 simplejson = super.buildPythonPackage {
1508 1495 name = "simplejson-3.7.2";
1509 1496 buildInputs = with self; [];
1510 1497 doCheck = false;
1511 1498 propagatedBuildInputs = with self; [];
1512 1499 src = fetchurl {
1513 1500 url = "https://pypi.python.org/packages/6d/89/7f13f099344eea9d6722779a1f165087cb559598107844b1ac5dbd831fb1/simplejson-3.7.2.tar.gz";
1514 1501 md5 = "a5fc7d05d4cb38492285553def5d4b46";
1515 1502 };
1516 1503 meta = {
1517 1504 license = [ pkgs.lib.licenses.mit pkgs.lib.licenses.afl21 ];
1518 1505 };
1519 1506 };
1520 1507 six = super.buildPythonPackage {
1521 1508 name = "six-1.9.0";
1522 1509 buildInputs = with self; [];
1523 1510 doCheck = false;
1524 1511 propagatedBuildInputs = with self; [];
1525 1512 src = fetchurl {
1526 1513 url = "https://pypi.python.org/packages/16/64/1dc5e5976b17466fd7d712e59cbe9fb1e18bec153109e5ba3ed6c9102f1a/six-1.9.0.tar.gz";
1527 1514 md5 = "476881ef4012262dfc8adc645ee786c4";
1528 1515 };
1529 1516 meta = {
1530 1517 license = [ pkgs.lib.licenses.mit ];
1531 1518 };
1532 1519 };
1533 1520 subprocess32 = super.buildPythonPackage {
1534 1521 name = "subprocess32-3.2.6";
1535 1522 buildInputs = with self; [];
1536 1523 doCheck = false;
1537 1524 propagatedBuildInputs = with self; [];
1538 1525 src = fetchurl {
1539 1526 url = "https://pypi.python.org/packages/28/8d/33ccbff51053f59ae6c357310cac0e79246bbed1d345ecc6188b176d72c3/subprocess32-3.2.6.tar.gz";
1540 1527 md5 = "754c5ab9f533e764f931136974b618f1";
1541 1528 };
1542 1529 meta = {
1543 1530 license = [ pkgs.lib.licenses.psfl ];
1544 1531 };
1545 1532 };
1546 1533 supervisor = super.buildPythonPackage {
1547 1534 name = "supervisor-3.3.0";
1548 1535 buildInputs = with self; [];
1549 1536 doCheck = false;
1550 1537 propagatedBuildInputs = with self; [meld3];
1551 1538 src = fetchurl {
1552 1539 url = "https://pypi.python.org/packages/44/80/d28047d120bfcc8158b4e41127706731ee6a3419c661e0a858fb0e7c4b2d/supervisor-3.3.0.tar.gz";
1553 1540 md5 = "46bac00378d1eddb616752b990c67416";
1554 1541 };
1555 1542 meta = {
1556 1543 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1557 1544 };
1558 1545 };
1559 1546 transifex-client = super.buildPythonPackage {
1560 1547 name = "transifex-client-0.10";
1561 1548 buildInputs = with self; [];
1562 1549 doCheck = false;
1563 1550 propagatedBuildInputs = with self; [];
1564 1551 src = fetchurl {
1565 1552 url = "https://pypi.python.org/packages/f3/4e/7b925192aee656fb3e04fa6381c8b3dc40198047c3b4a356f6cfd642c809/transifex-client-0.10.tar.gz";
1566 1553 md5 = "5549538d84b8eede6b254cd81ae024fa";
1567 1554 };
1568 1555 meta = {
1569 1556 license = [ pkgs.lib.licenses.gpl2 ];
1570 1557 };
1571 1558 };
1572 1559 translationstring = super.buildPythonPackage {
1573 1560 name = "translationstring-1.3";
1574 1561 buildInputs = with self; [];
1575 1562 doCheck = false;
1576 1563 propagatedBuildInputs = with self; [];
1577 1564 src = fetchurl {
1578 1565 url = "https://pypi.python.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
1579 1566 md5 = "a4b62e0f3c189c783a1685b3027f7c90";
1580 1567 };
1581 1568 meta = {
1582 1569 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
1583 1570 };
1584 1571 };
1585 1572 trollius = super.buildPythonPackage {
1586 1573 name = "trollius-1.0.4";
1587 1574 buildInputs = with self; [];
1588 1575 doCheck = false;
1589 1576 propagatedBuildInputs = with self; [futures];
1590 1577 src = fetchurl {
1591 1578 url = "https://pypi.python.org/packages/aa/e6/4141db437f55e6ee7a3fb69663239e3fde7841a811b4bef293145ad6c836/trollius-1.0.4.tar.gz";
1592 1579 md5 = "3631a464d49d0cbfd30ab2918ef2b783";
1593 1580 };
1594 1581 meta = {
1595 1582 license = [ pkgs.lib.licenses.asl20 ];
1596 1583 };
1597 1584 };
1598 1585 uWSGI = super.buildPythonPackage {
1599 1586 name = "uWSGI-2.0.11.2";
1600 1587 buildInputs = with self; [];
1601 1588 doCheck = false;
1602 1589 propagatedBuildInputs = with self; [];
1603 1590 src = fetchurl {
1604 1591 url = "https://pypi.python.org/packages/9b/78/918db0cfab0546afa580c1e565209c49aaf1476bbfe491314eadbe47c556/uwsgi-2.0.11.2.tar.gz";
1605 1592 md5 = "1f02dcbee7f6f61de4b1fd68350cf16f";
1606 1593 };
1607 1594 meta = {
1608 1595 license = [ pkgs.lib.licenses.gpl2 ];
1609 1596 };
1610 1597 };
1611 1598 urllib3 = super.buildPythonPackage {
1612 1599 name = "urllib3-1.16";
1613 1600 buildInputs = with self; [];
1614 1601 doCheck = false;
1615 1602 propagatedBuildInputs = with self; [];
1616 1603 src = fetchurl {
1617 1604 url = "https://pypi.python.org/packages/3b/f0/e763169124e3f5db0926bc3dbfcd580a105f9ca44cf5d8e6c7a803c9f6b5/urllib3-1.16.tar.gz";
1618 1605 md5 = "fcaab1c5385c57deeb7053d3d7d81d59";
1619 1606 };
1620 1607 meta = {
1621 1608 license = [ pkgs.lib.licenses.mit ];
1622 1609 };
1623 1610 };
1624 1611 venusian = super.buildPythonPackage {
1625 1612 name = "venusian-1.0";
1626 1613 buildInputs = with self; [];
1627 1614 doCheck = false;
1628 1615 propagatedBuildInputs = with self; [];
1629 1616 src = fetchurl {
1630 1617 url = "https://pypi.python.org/packages/86/20/1948e0dfc4930ddde3da8c33612f6a5717c0b4bc28f591a5c5cf014dd390/venusian-1.0.tar.gz";
1631 1618 md5 = "dccf2eafb7113759d60c86faf5538756";
1632 1619 };
1633 1620 meta = {
1634 1621 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1635 1622 };
1636 1623 };
1637 1624 waitress = super.buildPythonPackage {
1638 1625 name = "waitress-0.8.9";
1639 1626 buildInputs = with self; [];
1640 1627 doCheck = false;
1641 1628 propagatedBuildInputs = with self; [setuptools];
1642 1629 src = fetchurl {
1643 1630 url = "https://pypi.python.org/packages/ee/65/fc9dee74a909a1187ca51e4f15ad9c4d35476e4ab5813f73421505c48053/waitress-0.8.9.tar.gz";
1644 1631 md5 = "da3f2e62b3676be5dd630703a68e2a04";
1645 1632 };
1646 1633 meta = {
1647 1634 license = [ pkgs.lib.licenses.zpt21 ];
1648 1635 };
1649 1636 };
1650 1637 ws4py = super.buildPythonPackage {
1651 1638 name = "ws4py-0.3.5";
1652 1639 buildInputs = with self; [];
1653 1640 doCheck = false;
1654 1641 propagatedBuildInputs = with self; [];
1655 1642 src = fetchurl {
1656 1643 url = "https://pypi.python.org/packages/b6/4f/34af703be86939629479e74d6e650e39f3bd73b3b09212c34e5125764cbc/ws4py-0.3.5.zip";
1657 1644 md5 = "a261b75c20b980e55ce7451a3576a867";
1658 1645 };
1659 1646 meta = {
1660 1647 license = [ pkgs.lib.licenses.bsdOriginal ];
1661 1648 };
1662 1649 };
1663 1650 wsgiref = super.buildPythonPackage {
1664 1651 name = "wsgiref-0.1.2";
1665 1652 buildInputs = with self; [];
1666 1653 doCheck = false;
1667 1654 propagatedBuildInputs = with self; [];
1668 1655 src = fetchurl {
1669 1656 url = "https://pypi.python.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
1670 1657 md5 = "29b146e6ebd0f9fb119fe321f7bcf6cb";
1671 1658 };
1672 1659 meta = {
1673 1660 license = [ { fullName = "PSF or ZPL"; } ];
1674 1661 };
1675 1662 };
1676 1663 zope.cachedescriptors = super.buildPythonPackage {
1677 1664 name = "zope.cachedescriptors-4.0.0";
1678 1665 buildInputs = with self; [];
1679 1666 doCheck = false;
1680 1667 propagatedBuildInputs = with self; [setuptools];
1681 1668 src = fetchurl {
1682 1669 url = "https://pypi.python.org/packages/40/33/694b6644c37f28553f4b9f20b3c3a20fb709a22574dff20b5bdffb09ecd5/zope.cachedescriptors-4.0.0.tar.gz";
1683 1670 md5 = "8d308de8c936792c8e758058fcb7d0f0";
1684 1671 };
1685 1672 meta = {
1686 1673 license = [ pkgs.lib.licenses.zpt21 ];
1687 1674 };
1688 1675 };
1689 1676 zope.deprecation = super.buildPythonPackage {
1690 1677 name = "zope.deprecation-4.1.2";
1691 1678 buildInputs = with self; [];
1692 1679 doCheck = false;
1693 1680 propagatedBuildInputs = with self; [setuptools];
1694 1681 src = fetchurl {
1695 1682 url = "https://pypi.python.org/packages/c1/d3/3919492d5e57d8dd01b36f30b34fc8404a30577392b1eb817c303499ad20/zope.deprecation-4.1.2.tar.gz";
1696 1683 md5 = "e9a663ded58f4f9f7881beb56cae2782";
1697 1684 };
1698 1685 meta = {
1699 1686 license = [ pkgs.lib.licenses.zpt21 ];
1700 1687 };
1701 1688 };
1702 1689 zope.event = super.buildPythonPackage {
1703 1690 name = "zope.event-4.0.3";
1704 1691 buildInputs = with self; [];
1705 1692 doCheck = false;
1706 1693 propagatedBuildInputs = with self; [setuptools];
1707 1694 src = fetchurl {
1708 1695 url = "https://pypi.python.org/packages/c1/29/91ba884d7d6d96691df592e9e9c2bfa57a47040ec1ff47eff18c85137152/zope.event-4.0.3.tar.gz";
1709 1696 md5 = "9a3780916332b18b8b85f522bcc3e249";
1710 1697 };
1711 1698 meta = {
1712 1699 license = [ pkgs.lib.licenses.zpt21 ];
1713 1700 };
1714 1701 };
1715 1702 zope.interface = super.buildPythonPackage {
1716 1703 name = "zope.interface-4.1.3";
1717 1704 buildInputs = with self; [];
1718 1705 doCheck = false;
1719 1706 propagatedBuildInputs = with self; [setuptools];
1720 1707 src = fetchurl {
1721 1708 url = "https://pypi.python.org/packages/9d/81/2509ca3c6f59080123c1a8a97125eb48414022618cec0e64eb1313727bfe/zope.interface-4.1.3.tar.gz";
1722 1709 md5 = "9ae3d24c0c7415deb249dd1a132f0f79";
1723 1710 };
1724 1711 meta = {
1725 1712 license = [ pkgs.lib.licenses.zpt21 ];
1726 1713 };
1727 1714 };
1728 1715
1729 1716 ### Test requirements
1730 1717
1731 1718
1732 1719 }
@@ -1,151 +1,150 @@
1 1 Babel==1.3
2 2 Beaker==1.7.0
3 3 CProfileV==1.0.6
4 Fabric==1.10.0
5 4 FormEncode==1.2.4
6 5 Jinja2==2.7.3
7 6 Mako==1.0.1
8 7 Markdown==2.6.2
9 8 MarkupSafe==0.23
10 9 MySQL-python==1.2.5
11 10 Paste==2.0.2
12 11 PasteDeploy==1.5.2
13 12 PasteScript==1.7.5
14 13 Pygments==2.1.3
15 14
16 15 # TODO: This version is not available on PyPI
17 16 # Pylons==1.0.2.dev20160108
18 17 Pylons==1.0.1
19 18
20 19 # TODO: This version is not available, but newer ones are
21 20 # Pyro4==4.35
22 21 Pyro4==4.41
23 22
24 23 # TODO: This should probably not be in here
25 24 # -e hg+https://johbo@code.rhodecode.com/johbo/rhodecode-fork@3a454bd1f17c0b2b2a951cf2b111e0320d7942a9#egg=RhodeCodeEnterprise-dev
26 25
27 26 Routes==1.13
28 27 SQLAlchemy==0.9.9
29 28 Sphinx==1.2.2
30 29 Tempita==0.5.2
31 30 URLObject==2.4.0
32 31 WebError==0.10.3
33 32
34 33 # TODO: This is modified by us, needs a better integration. For now
35 34 # using the latest version before.
36 35 # WebHelpers==1.3.dev20150807
37 36 WebHelpers==1.3
38 37
39 38 WebHelpers2==2.0
40 39 WebOb==1.3.1
41 40 WebTest==1.4.3
42 41 Whoosh==2.7.0
43 42 alembic==0.8.4
44 43 amqplib==1.0.2
45 44 anyjson==0.3.3
46 45 appenlight-client==0.6.14
47 46 authomatic==0.1.0.post1;
48 47 backport-ipaddress==0.1
49 48 bottle==0.12.8
50 49 bumpversion==0.5.3
51 50 celery==2.2.10
52 51 channelstream==0.5.2
53 52 click==5.1
54 53 colander==1.2
55 54 configobj==5.0.6
56 55 cov-core==1.15.0
57 56 coverage==3.7.1
58 57 cssselect==0.9.1
59 58 decorator==3.4.2
60 59 deform==2.0a2
61 60 docutils==0.12
62 61 dogpile.cache==0.6.1
63 62 dogpile.core==0.4.1
64 63 dulwich==0.12.0
65 64 ecdsa==0.11
66 65 flake8==2.4.1
67 66 future==0.14.3
68 67 futures==3.0.2
69 68 gevent==1.1.1
70 69 gprof2dot==2015.12.1
71 70 greenlet==0.4.9
72 71 gunicorn==19.6.0
73 72
74 73 # TODO: Needs subvertpy and blows up without Subversion headers,
75 74 # actually we should not need this for Enterprise at all.
76 75 # hgsubversion==1.8.2
77 76
78 77 gnureadline==6.3.3
79 78 infrae.cache==1.0.1
80 79 invoke==0.13.0
81 80 ipdb==0.8
82 81 ipython==3.1.0
83 82 iso8601==0.1.11
84 83 itsdangerous==0.24
85 84 kombu==1.5.1
86 85 lxml==3.4.4
87 86 mccabe==0.3
88 87 meld3==1.0.2
89 88 mock==1.0.1
90 89 msgpack-python==0.4.6
91 90 nose==1.3.6
92 91 objgraph==2.0.0
93 92 packaging==15.2
94 93 paramiko==1.15.1
95 94 pep8==1.5.7
96 95 psutil==2.2.1
97 96 psycopg2==2.6.1
98 97 py==1.4.29
99 98 py-bcrypt==0.4
100 99 py-gfm==0.1.3
101 100 pycrypto==2.6.1
102 101 pycurl==7.19.5
103 102 pyflakes==0.8.1
104 103 pyparsing==1.5.7
105 104 pyramid==1.6.1
106 105 pyramid-beaker==0.8
107 106 pyramid-debugtoolbar==2.4.2
108 107 pyramid-jinja2==2.5
109 108 pyramid-mako==1.0.2
110 109 pysqlite==2.6.3
111 110 pytest==2.8.5
112 111 pytest-runner==2.7.1
113 112 pytest-catchlog==1.2.2
114 113 pytest-cov==1.8.1
115 114 pytest-profiling==1.0.1
116 115 pytest-timeout==0.4
117 116 python-dateutil==1.5
118 117 python-ldap==2.4.19
119 118 python-memcached==1.57
120 119 python-pam==1.8.2
121 120 pytz==2015.4
122 121 pyzmq==14.6.0
123 122
124 123 # TODO: This is not available in public
125 124 # rc-testdata==0.2.0
126 125
127 126 https://code.rhodecode.com/rhodecode-tools-ce/archive/v0.10.0.zip#md5=4762391473ded761bead3aa58c748044
128 127
129 128
130 129 recaptcha-client==1.0.6
131 130 repoze.lru==0.6
132 131 requests==2.9.1
133 132 serpent==1.12
134 133 setproctitle==1.1.8
135 134 setuptools==20.8.1
136 135 setuptools-scm==1.11.0
137 136 simplejson==3.7.2
138 137 six==1.9.0
139 138 subprocess32==3.2.6
140 139 supervisor==3.3.0
141 140 transifex-client==0.10
142 141 translationstring==1.3
143 142 trollius==1.0.4
144 143 uWSGI==2.0.11.2
145 144 venusian==1.0
146 145 waitress==0.8.9
147 146 wsgiref==0.1.2
148 147 zope.cachedescriptors==4.0.0
149 148 zope.deprecation==4.1.2
150 149 zope.event==4.0.3
151 150 zope.interface==4.1.3
@@ -1,1 +1,1 b''
1 4.3.1 No newline at end of file
1 4.4.0 No newline at end of file
@@ -1,62 +1,63 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22
23 23 RhodeCode, a web based repository management software
24 24 versioning implementation: http://www.python.org/dev/peps/pep-0386/
25 25 """
26 26
27 27 import os
28 28 import sys
29 29 import platform
30 30
31 31 VERSION = tuple(open(os.path.join(
32 32 os.path.dirname(__file__), 'VERSION')).read().split('.'))
33 33
34 34 BACKENDS = {
35 35 'hg': 'Mercurial repository',
36 36 'git': 'Git repository',
37 37 'svn': 'Subversion repository',
38 38 }
39 39
40 40 CELERY_ENABLED = False
41 41 CELERY_EAGER = False
42 42
43 43 # link to config for pylons
44 44 CONFIG = {}
45 45
46 46 # Populated with the settings dictionary from application init in
47 47 # rhodecode.conf.environment.load_pyramid_environment
48 48 PYRAMID_SETTINGS = {}
49 49
50 50 # Linked module for extensions
51 51 EXTENSIONS = {}
52 52
53 53 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
54 __dbversion__ = 55 # defines current db version for migrations
54 __dbversion__ = 58 # defines current db version for migrations
55 55 __platform__ = platform.system()
56 56 __license__ = 'AGPLv3, and Commercial License'
57 57 __author__ = 'RhodeCode GmbH'
58 58 __url__ = 'http://rhodecode.com'
59 59
60 60 is_windows = __platform__ in ['Windows']
61 61 is_unix = not is_windows
62 62 is_test = False
63 disable_error_handler = False
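
The hunk above bumps ``__dbversion__`` from 55 to 58 and introduces a module-level ``disable_error_handler`` flag, which the Pyramid environment loader later in this changeset flips for test runs so exceptions surface instead of being rendered. A minimal sketch of the version assembly, assuming the ``VERSION`` file simply contains ``4.4.0``:

.. code:: python

   # Minimal sketch, assuming the VERSION file contains the string "4.4.0".
   VERSION = tuple("4.4.0".split('.'))                 # ('4', '4', '0')
   __version__ = '.'.join(str(each) for each in VERSION[:3])
   assert __version__ == '4.4.0'

   # test runs are expected to flip the new flag (see environment.py below)
   is_test = True
   disable_error_handler = is_test
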
@@ -1,79 +1,81 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22
23 23 from pyramid.settings import asbool
24 24
25 25 from rhodecode.config.routing import ADMIN_PREFIX
26 26 from rhodecode.lib.ext_json import json
27 27
28 28
29 29 def url_gen(request):
30 30 urls = {
31 31 'connect': request.route_url('channelstream_connect'),
32 'subscribe': request.route_url('channelstream_subscribe')
32 'subscribe': request.route_url('channelstream_subscribe'),
33 'longpoll': request.registry.settings.get('channelstream.longpoll_url', ''),
34 'ws': request.registry.settings.get('channelstream.ws_url', '')
33 35 }
34 36 return json.dumps(urls)
35 37
36 38
37 39 PLUGIN_DEFINITION = {
38 40 'name': 'channelstream',
39 41 'config': {
40 42 'javascript': [],
41 43 'css': [],
42 44 'template_hooks': {
43 45 'plugin_init_template': 'rhodecode:templates/channelstream/plugin_init.html'
44 46 },
45 47 'url_gen': url_gen,
46 48 'static': None,
47 49 'enabled': False,
48 50 'server': '',
49 51 'secret': ''
50 52 }
51 53 }
52 54
53 55
54 56 def includeme(config):
55 57 settings = config.registry.settings
56 58 PLUGIN_DEFINITION['config']['enabled'] = asbool(
57 59 settings.get('channelstream.enabled'))
58 60 PLUGIN_DEFINITION['config']['server'] = settings.get(
59 61 'channelstream.server', '')
60 62 PLUGIN_DEFINITION['config']['secret'] = settings.get(
61 63 'channelstream.secret', '')
62 64 PLUGIN_DEFINITION['config']['history.location'] = settings.get(
63 65 'channelstream.history.location', '')
64 66 config.register_rhodecode_plugin(
65 67 PLUGIN_DEFINITION['name'],
66 68 PLUGIN_DEFINITION['config']
67 69 )
68 70 # create plugin history location
69 71 history_dir = PLUGIN_DEFINITION['config']['history.location']
70 72 if history_dir and not os.path.exists(history_dir):
71 73 os.makedirs(history_dir, 0750)
72 74
73 75 config.add_route(
74 76 name='channelstream_connect',
75 77 pattern=ADMIN_PREFIX + '/channelstream/connect')
76 78 config.add_route(
77 79 name='channelstream_subscribe',
78 80 pattern=ADMIN_PREFIX + '/channelstream/subscribe')
79 81 config.scan('rhodecode.channelstream')
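
The ``url_gen`` change above adds ``longpoll`` and ``ws`` entries, read from the ``channelstream.longpoll_url`` and ``channelstream.ws_url`` registry settings, so the frontend can pick its transport. A hedged sketch of the JSON it would now produce; the paths, hosts and ports here are illustrative only:

.. code:: python

   import json

   # illustrative values only; real ones come from route_url() and the ini file
   urls = {
       'connect': '/_admin/channelstream/connect',
       'subscribe': '/_admin/channelstream/subscribe',
       'longpoll': 'http://127.0.0.1:8000/listen',
       'ws': 'ws://127.0.0.1:8000/ws',
   }
   payload = json.dumps(urls)
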
@@ -1,177 +1,178 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 Channel Stream controller for rhodecode
23 23
24 24 :created_on: Oct 10, 2015
25 25 :author: marcinl
26 26 :copyright: (c) 2013-2015 RhodeCode GmbH.
27 27 :license: Commercial License, see LICENSE for more details.
28 28 """
29 29
30 30 import logging
31 31 import uuid
32 32
33 33 from pylons import tmpl_context as c
34 34 from pyramid.settings import asbool
35 35 from pyramid.view import view_config
36 36 from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPBadGateway
37 37
38 38 from rhodecode.lib.channelstream import (
39 39 channelstream_request,
40 40 ChannelstreamConnectionException,
41 41 ChannelstreamPermissionException,
42 42 check_channel_permissions,
43 43 get_connection_validators,
44 44 get_user_data,
45 45 parse_channels_info,
46 46 update_history_from_logs,
47 47 STATE_PUBLIC_KEYS)
48 48 from rhodecode.lib.auth import NotAnonymous
49 49 from rhodecode.lib.utils2 import str2bool
50 50
51 51 log = logging.getLogger(__name__)
52 52
53 53
54 54 class ChannelstreamView(object):
55 55 def __init__(self, context, request):
56 56 self.context = context
57 57 self.request = request
58 58
59 59 # Some of the decorators rely on this attribute to be present
60 60 # on the class of the decorated method.
61 61 self._rhodecode_user = request.user
62 62 registry = request.registry
63 63 self.channelstream_config = registry.rhodecode_plugins['channelstream']
64 64 if not self.channelstream_config.get('enabled'):
65 65 log.exception('Channelstream plugin is disabled')
66 66 raise HTTPBadRequest()
67 67
68 68 @NotAnonymous()
69 69 @view_config(route_name='channelstream_connect', renderer='json')
70 70 def connect(self):
71 71 """ handle authorization of users trying to connect """
72 72 try:
73 73 json_body = self.request.json_body
74 74 except Exception:
75 75 log.exception('Failed to decode json from request')
76 76 raise HTTPBadRequest()
77 77 try:
78 78 channels = check_channel_permissions(
79 79 json_body.get('channels'),
80 80 get_connection_validators(self.request.registry))
81 81 except ChannelstreamPermissionException:
82 82 log.error('Incorrect permissions for requested channels')
83 83 raise HTTPForbidden()
84 84
85 85 user = c.rhodecode_user
86 86 if user.user_id:
87 87 user_data = get_user_data(user.user_id)
88 88 else:
89 89 user_data = {
90 90 'id': None,
91 91 'username': None,
92 92 'first_name': None,
93 93 'last_name': None,
94 94 'icon_link': None,
95 95 'display_name': None,
96 96 'display_link': None,
97 97 }
98 user_data['permissions'] = c.rhodecode_user.permissions
98 99 payload = {
99 100 'username': user.username,
100 101 'user_state': user_data,
101 102 'conn_id': str(uuid.uuid4()),
102 103 'channels': channels,
103 104 'channel_configs': {},
104 105 'state_public_keys': STATE_PUBLIC_KEYS,
105 106 'info': {
106 107 'exclude_channels': ['broadcast']
107 108 }
108 109 }
109 110 filtered_channels = [channel for channel in channels
110 111 if channel != 'broadcast']
111 112 for channel in filtered_channels:
112 113 payload['channel_configs'][channel] = {
113 114 'notify_presence': True,
114 115 'history_size': 100,
115 116 'store_history': True,
116 117 'broadcast_presence_with_user_lists': True
117 118 }
118 119 # connect user to server
119 120 try:
120 121 connect_result = channelstream_request(self.channelstream_config,
121 122 payload, '/connect')
122 123 except ChannelstreamConnectionException:
123 124 log.exception('Channelstream service is down')
124 125 return HTTPBadGateway()
125 126
126 127 connect_result['channels'] = channels
127 128 connect_result['channels_info'] = parse_channels_info(
128 129 connect_result['channels_info'],
129 130 include_channel_info=filtered_channels)
130 131 update_history_from_logs(self.channelstream_config,
131 132 filtered_channels, connect_result)
132 133 return connect_result
133 134
134 135 @NotAnonymous()
135 136 @view_config(route_name='channelstream_subscribe', renderer='json')
136 137 def subscribe(self):
137 138 """ can be used to subscribe specific connection to other channels """
138 139 try:
139 140 json_body = self.request.json_body
140 141 except Exception:
141 142 log.exception('Failed to decode json from request')
142 143 raise HTTPBadRequest()
143 144 try:
144 145 channels = check_channel_permissions(
145 146 json_body.get('channels'),
146 147 get_connection_validators(self.request.registry))
147 148 except ChannelstreamPermissionException:
148 149 log.error('Incorrect permissions for requested channels')
149 150 raise HTTPForbidden()
150 151 payload = {'conn_id': json_body.get('conn_id', ''),
151 152 'channels': channels,
152 153 'channel_configs': {},
153 154 'info': {
154 155 'exclude_channels': ['broadcast']}
155 156 }
156 157 filtered_channels = [chan for chan in channels if chan != 'broadcast']
157 158 for channel in filtered_channels:
158 159 payload['channel_configs'][channel] = {
159 160 'notify_presence': True,
160 161 'history_size': 100,
161 162 'store_history': True,
162 163 'broadcast_presence_with_user_lists': True
163 164 }
164 165 try:
165 166 connect_result = channelstream_request(
166 167 self.channelstream_config, payload, '/subscribe')
167 168 except ChannelstreamConnectionException:
168 169 log.exception('Channelstream service is down')
169 170 return HTTPBadGateway()
170 171 # include_channel_info will limit history only to new channel
171 172 # to not overwrite histories on other channels in client
172 173 connect_result['channels_info'] = parse_channels_info(
173 174 connect_result['channels_info'],
174 175 include_channel_info=filtered_channels)
175 176 update_history_from_logs(self.channelstream_config,
176 177 filtered_channels, connect_result)
177 178 return connect_result
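
The ``connect`` view above now also attaches the user's permission map to ``user_data`` before building the channelstream payload. A hedged sketch of the request body a client would POST to the connect endpoint; the channel names are made up:

.. code:: python

   # illustrative connect request body; channel names are made up
   connect_body = {
       'channels': ['broadcast', '/some-repo'],
   }
   # check_channel_permissions() filters this list against the registered
   # connection validators, and the filtered channels (minus 'broadcast')
   # each get a channel_config with presence and history enabled.
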
@@ -1,35 +1,39 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2013-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 Various config settings for RhodeCode
23 23 """
24 24 from rhodecode import EXTENSIONS
25 25
26 26 from rhodecode.lib.utils2 import __get_lem
27 27
28 28
29 29 # language map is also used by whoosh indexer, which for those specified
30 30 # extensions will index its content
31 LANGUAGES_EXTENSIONS_MAP = __get_lem()
31 # custom extensions to lexers, format is 'ext': 'LexerClass'
32 extra = {
33 'vbs': 'VbNet'
34 }
35 LANGUAGES_EXTENSIONS_MAP = __get_lem(extra)
32 36
33 37 DATETIME_FORMAT = "%Y-%m-%d %H:%M:%S"
34 38
35 39 DATE_FORMAT = "%Y-%m-%d"
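
The hunk above passes a custom ``extra`` mapping to ``__get_lem`` so that ``.vbs`` files are handled by the ``VbNet`` lexer. ``__get_lem`` itself is not part of this diff, so the sketch below only illustrates the intended merge, with a made-up base map:

.. code:: python

   # Sketch only: base_map stands in for whatever __get_lem() builds from
   # the pygments lexer registry.
   base_map = {'py': 'Python', 'vb': 'VbNet'}
   extra = {'vbs': 'VbNet'}

   languages_extensions_map = dict(base_map, **extra)
   assert languages_extensions_map['vbs'] == 'VbNet'
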
@@ -1,188 +1,190 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 Pylons environment configuration
23 23 """
24 24
25 25 import os
26 26 import logging
27 27 import rhodecode
28 28 import platform
29 29 import re
30 30 import io
31 31
32 32 from mako.lookup import TemplateLookup
33 33 from pylons.configuration import PylonsConfig
34 34 from pylons.error import handle_mako_error
35 35 from pyramid.settings import asbool
36 36
37 37 # ------------------------------------------------------------------------------
38 38 # CELERY magic until refactor - issue #4163 - import order matters here:
39 39 from rhodecode.lib import celerypylons # this must be first, celerypylons
40 40 # sets config settings upon import
41 41
42 42 import rhodecode.integrations # any modules using celery task
43 43 # decorators should be added afterwards:
44 44 # ------------------------------------------------------------------------------
45 45
46 46 from rhodecode.lib import app_globals
47 47 from rhodecode.config import utils
48 48 from rhodecode.config.routing import make_map
49 49 from rhodecode.config.jsroutes import generate_jsroutes_content
50 50
51 51 from rhodecode.lib import helpers
52 52 from rhodecode.lib.auth import set_available_permissions
53 53 from rhodecode.lib.utils import (
54 54 repo2db_mapper, make_db_config, set_rhodecode_config,
55 55 load_rcextensions)
56 56 from rhodecode.lib.utils2 import str2bool, aslist
57 57 from rhodecode.lib.vcs import connect_vcs, start_vcs_server
58 58 from rhodecode.model.scm import ScmModel
59 59
60 60 log = logging.getLogger(__name__)
61 61
62 62 def load_environment(global_conf, app_conf, initial=False,
63 63 test_env=None, test_index=None):
64 64 """
65 65 Configure the Pylons environment via the ``pylons.config``
66 66 object
67 67 """
68 68 config = PylonsConfig()
69 69
70 70
71 71 # Pylons paths
72 72 root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
73 73 paths = {
74 74 'root': root,
75 75 'controllers': os.path.join(root, 'controllers'),
76 76 'static_files': os.path.join(root, 'public'),
77 77 'templates': [os.path.join(root, 'templates')],
78 78 }
79 79
80 80 # Initialize config with the basic options
81 81 config.init_app(global_conf, app_conf, package='rhodecode', paths=paths)
82 82
83 83 # store some globals into rhodecode
84 84 rhodecode.CELERY_ENABLED = str2bool(config['app_conf'].get('use_celery'))
85 85 rhodecode.CELERY_EAGER = str2bool(
86 86 config['app_conf'].get('celery.always.eager'))
87 87
88 88 config['routes.map'] = make_map(config)
89 89
90 90 if asbool(config.get('generate_js_files', 'false')):
91 91 jsroutes = config['routes.map'].jsroutes()
92 92 jsroutes_file_content = generate_jsroutes_content(jsroutes)
93 93 jsroutes_file_path = os.path.join(
94 94 paths['static_files'], 'js', 'rhodecode', 'routes.js')
95 95
96 96 with io.open(jsroutes_file_path, 'w', encoding='utf-8') as f:
97 97 f.write(jsroutes_file_content)
98 98
99 99 config['pylons.app_globals'] = app_globals.Globals(config)
100 100 config['pylons.h'] = helpers
101 101 rhodecode.CONFIG = config
102 102
103 103 load_rcextensions(root_path=config['here'])
104 104
105 105 # Setup cache object as early as possible
106 106 import pylons
107 107 pylons.cache._push_object(config['pylons.app_globals'].cache)
108 108
109 109 # Create the Mako TemplateLookup, with the default auto-escaping
110 110 config['pylons.app_globals'].mako_lookup = TemplateLookup(
111 111 directories=paths['templates'],
112 112 error_handler=handle_mako_error,
113 113 module_directory=os.path.join(app_conf['cache_dir'], 'templates'),
114 114 input_encoding='utf-8', default_filters=['escape'],
115 115 imports=['from webhelpers.html import escape'])
116 116
117 117 # sets the behaviour of `c` attribute access when non-existing attributes are accessed
118 118 config['pylons.strict_tmpl_context'] = True
119 119
120 120 # configure channelstream
121 121 config['channelstream_config'] = {
122 122 'enabled': asbool(config.get('channelstream.enabled', False)),
123 123 'server': config.get('channelstream.server'),
124 124 'secret': config.get('channelstream.secret')
125 125 }
126 126
127 127 set_available_permissions(config)
128 128 db_cfg = make_db_config(clear_session=True)
129 129
130 130 repos_path = list(db_cfg.items('paths'))[0][1]
131 131 config['base_path'] = repos_path
132 132
133 133 # store db config also in main global CONFIG
134 134 set_rhodecode_config(config)
135 135
136 136 # configure instance id
137 137 utils.set_instance_id(config)
138 138
139 139 # CONFIGURATION OPTIONS HERE (note: all config options will override
140 140 # any Pylons config options)
141 141
142 142 # store config reference into our module to skip import magic of pylons
143 143 rhodecode.CONFIG.update(config)
144 144
145 145 return config
146 146
147 147
148 148 def load_pyramid_environment(global_config, settings):
149 149 # Some parts of the code expect a merge of global and app settings.
150 150 settings_merged = global_config.copy()
151 151 settings_merged.update(settings)
152 152
153 153 # Store the settings to make them available to other modules.
154 154 rhodecode.PYRAMID_SETTINGS = settings_merged
155 155
156 156 # If this is a test run we prepare the test environment like
157 157 # creating a test database, test search index and test repositories.
158 158 # This has to be done before the database connection is initialized.
159 159 if settings['is_test']:
160 160 rhodecode.is_test = True
161 rhodecode.disable_error_handler = True
162
161 163 utils.initialize_test_environment(settings_merged)
162 164
163 165 # Initialize the database connection.
164 166 utils.initialize_database(settings_merged)
165 167
166 168 # Limit backends to `vcs.backends` from configuration
167 169 for alias in rhodecode.BACKENDS.keys():
168 170 if alias not in settings['vcs.backends']:
169 171 del rhodecode.BACKENDS[alias]
170 172 log.info('Enabled VCS backends: %s', rhodecode.BACKENDS.keys())
171 173
172 174 # initialize vcs client and optionally run the server if enabled
173 175 vcs_server_uri = settings['vcs.server']
174 176 vcs_server_enabled = settings['vcs.server.enable']
175 177 start_server = (
176 178 settings['vcs.start_server'] and
177 179 not int(os.environ.get('RC_VCSSERVER_TEST_DISABLE', '0')))
178 180
179 181 if vcs_server_enabled and start_server:
180 182 log.info("Starting vcsserver")
181 183 start_vcs_server(server_and_port=vcs_server_uri,
182 184 protocol=utils.get_vcs_server_protocol(settings),
183 185 log_level=settings['vcs.server.log_level'])
184 186
185 187 utils.configure_pyro4(settings)
186 188 utils.configure_vcs(settings)
187 189 if vcs_server_enabled:
188 190 connect_vcs(vcs_server_uri, utils.get_vcs_server_protocol(settings))
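
``load_pyramid_environment`` above prunes ``rhodecode.BACKENDS`` down to the backends listed in ``vcs.backends`` and, new in this change, sets ``rhodecode.disable_error_handler`` for test runs. A small self-contained sketch of the backend filtering, with the setting value stubbed out:

.. code:: python

   # stand-ins for rhodecode.BACKENDS and the parsed vcs.backends setting
   BACKENDS = {
       'hg': 'Mercurial repository',
       'git': 'Git repository',
       'svn': 'Subversion repository',
   }
   enabled = ['hg', 'git']

   for alias in list(BACKENDS.keys()):
       if alias not in enabled:
           del BACKENDS[alias]

   assert sorted(BACKENDS) == ['git', 'hg']
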
@@ -1,497 +1,505 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 Pylons middleware initialization
23 23 """
24 24 import logging
25 25 from collections import OrderedDict
26 26
27 27 from paste.registry import RegistryManager
28 28 from paste.gzipper import make_gzip_middleware
29 29 from pylons.wsgiapp import PylonsApp
30 30 from pyramid.authorization import ACLAuthorizationPolicy
31 31 from pyramid.config import Configurator
32 32 from pyramid.settings import asbool, aslist
33 33 from pyramid.wsgi import wsgiapp
34 34 from pyramid.httpexceptions import HTTPError, HTTPInternalServerError, HTTPFound
35 35 from pyramid.events import ApplicationCreated
36 36 import pyramid.httpexceptions as httpexceptions
37 37 from pyramid.renderers import render_to_response
38 38 from routes.middleware import RoutesMiddleware
39 39 import routes.util
40 40
41 41 import rhodecode
42 42 from rhodecode.model import meta
43 43 from rhodecode.config import patches
44 44 from rhodecode.config.routing import STATIC_FILE_PREFIX
45 45 from rhodecode.config.environment import (
46 46 load_environment, load_pyramid_environment)
47 from rhodecode.lib.exceptions import VCSServerUnavailable
48 from rhodecode.lib.vcs.exceptions import VCSCommunicationError
47 49 from rhodecode.lib.middleware import csrf
48 50 from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled
49 from rhodecode.lib.middleware.disable_vcs import DisableVCSPagesWrapper
50 51 from rhodecode.lib.middleware.https_fixup import HttpsFixup
51 52 from rhodecode.lib.middleware.vcs import VCSMiddleware
52 53 from rhodecode.lib.plugins.utils import register_rhodecode_plugin
53 54 from rhodecode.lib.utils2 import aslist as rhodecode_aslist
54 55 from rhodecode.subscribers import scan_repositories_if_enabled
55 56
56 57
57 58 log = logging.getLogger(__name__)
58 59
59 60
60 61 # this is used to avoid the route lookup overhead in RoutesMiddleware
61 62 # for certain routes which won't go to pylons - e.g. static files, debugger
62 63 # it is only needed for the pylons migration and can be removed once complete
63 64 class SkippableRoutesMiddleware(RoutesMiddleware):
64 65 """ Routes middleware that allows you to skip prefixes """
65 66
66 67 def __init__(self, *args, **kw):
67 68 self.skip_prefixes = kw.pop('skip_prefixes', [])
68 69 super(SkippableRoutesMiddleware, self).__init__(*args, **kw)
69 70
70 71 def __call__(self, environ, start_response):
71 72 for prefix in self.skip_prefixes:
72 73 if environ['PATH_INFO'].startswith(prefix):
73 74 # added to avoid the case when a missing /_static route falls
74 75 # through to pylons and causes an exception as pylons is
75 76 # expecting wsgiorg.routing_args to be set in the environ
76 77 # by RoutesMiddleware.
77 78 if 'wsgiorg.routing_args' not in environ:
78 79 environ['wsgiorg.routing_args'] = (None, {})
79 80 return self.app(environ, start_response)
80 81
81 82 return super(SkippableRoutesMiddleware, self).__call__(
82 83 environ, start_response)
83 84
84 85
85 86 def make_app(global_conf, static_files=True, **app_conf):
86 87 """Create a Pylons WSGI application and return it
87 88
88 89 ``global_conf``
89 90 The inherited configuration for this application. Normally from
90 91 the [DEFAULT] section of the Paste ini file.
91 92
92 93 ``app_conf``
93 94 The application's local configuration. Normally specified in
94 95 the [app:<name>] section of the Paste ini file (where <name>
95 96 defaults to main).
96 97
97 98 """
98 99 # Apply compatibility patches
99 100 patches.kombu_1_5_1_python_2_7_11()
100 101 patches.inspect_getargspec()
101 102
102 103 # Configure the Pylons environment
103 104 config = load_environment(global_conf, app_conf)
104 105
105 106 # The Pylons WSGI app
106 107 app = PylonsApp(config=config)
107 108 if rhodecode.is_test:
108 109 app = csrf.CSRFDetector(app)
109 110
110 111 expected_origin = config.get('expected_origin')
111 112 if expected_origin:
112 113 # The API can be accessed from other Origins.
113 114 app = csrf.OriginChecker(app, expected_origin,
114 115 skip_urls=[routes.util.url_for('api')])
115 116
116 117 # Establish the Registry for this application
117 118 app = RegistryManager(app)
118 119
119 120 app.config = config
120 121
121 122 return app
122 123
123 124
124 125 def make_pyramid_app(global_config, **settings):
125 126 """
126 127 Constructs the WSGI application based on Pyramid and wraps the Pylons based
127 128 application.
128 129
129 130 Specials:
130 131
131 132 * We migrate from Pylons to Pyramid. While doing this, we keep both
132 133 frameworks functional. This involves moving some WSGI middlewares around
133 134 and providing access to some data internals, so that the old code is
134 135 still functional.
135 136
136 137 * The application can also be integrated like a plugin via the call to
137 138 `includeme`. This is accompanied with the other utility functions which
138 139 are called. Changing this should be done with great care to not break
139 140 cases when these fragments are assembled from another place.
140 141
141 142 """
142 143 # The edition string should be available in pylons too, so we add it here
143 144 # before copying the settings.
144 145 settings.setdefault('rhodecode.edition', 'Community Edition')
145 146
146 147 # As long as our Pylons application does expect "unprepared" settings, make
147 148 # sure that we keep an unmodified copy. This avoids unintentional change of
148 149 # behavior in the old application.
149 150 settings_pylons = settings.copy()
150 151
151 152 sanitize_settings_and_apply_defaults(settings)
152 153 config = Configurator(settings=settings)
153 154 add_pylons_compat_data(config.registry, global_config, settings_pylons)
154 155
155 156 load_pyramid_environment(global_config, settings)
156 157
157 158 includeme_first(config)
158 159 includeme(config)
159 160 pyramid_app = config.make_wsgi_app()
160 161 pyramid_app = wrap_app_in_wsgi_middlewares(pyramid_app, config)
161 162 pyramid_app.config = config
162 163
163 164 # creating the app uses a connection - return it after we are done
164 165 meta.Session.remove()
165 166
166 167 return pyramid_app
167 168
168 169
169 170 def make_not_found_view(config):
170 171 """
171 172 This creates the view which should be registered as not-found-view to
172 173 pyramid. Basically it consists of the old pylons app, converted to a view.
173 174 Additionally it is wrapped by some other middlewares.
174 175 """
175 176 settings = config.registry.settings
176 177 vcs_server_enabled = settings['vcs.server.enable']
177 178
178 179 # Make pylons app from unprepared settings.
179 180 pylons_app = make_app(
180 181 config.registry._pylons_compat_global_config,
181 182 **config.registry._pylons_compat_settings)
182 183 config.registry._pylons_compat_config = pylons_app.config
183 184
184 185 # Appenlight monitoring.
185 186 pylons_app, appenlight_client = wrap_in_appenlight_if_enabled(
186 187 pylons_app, settings)
187 188
188 189 # The VCSMiddleware shall operate like a fallback if pyramid doesn't find
189 190 # a view to handle the request. Therefore we wrap it around the pylons app.
190 191 if vcs_server_enabled:
191 192 pylons_app = VCSMiddleware(
192 193 pylons_app, settings, appenlight_client, registry=config.registry)
193 194
194 195 pylons_app_as_view = wsgiapp(pylons_app)
195 196
196 # Protect from VCS Server error related pages when server is not available
197 if not vcs_server_enabled:
198 pylons_app_as_view = DisableVCSPagesWrapper(pylons_app_as_view)
199
200 197 def pylons_app_with_error_handler(context, request):
201 198 """
202 199 Handle exceptions from rc pylons app:
203 200
204 201 - old webob type exceptions get converted to pyramid exceptions
205 202 - pyramid exceptions are passed to the error handler view
206 203 """
207 204 def is_vcs_response(response):
208 205 return 'X-RhodeCode-Backend' in response.headers
209 206
210 207 def is_http_error(response):
211 208 # webob type error responses
212 209 return (400 <= response.status_int <= 599)
213 210
214 211 def is_error_handling_needed(response):
215 212 return is_http_error(response) and not is_vcs_response(response)
216 213
217 214 try:
218 215 response = pylons_app_as_view(context, request)
219 216 if is_error_handling_needed(response):
220 217 response = webob_to_pyramid_http_response(response)
221 218 return error_handler(response, request)
222 219 except HTTPError as e: # pyramid type exceptions
223 220 return error_handler(e, request)
224 except Exception:
225 if settings.get('debugtoolbar.enabled', False):
221 except Exception as e:
222 log.exception(e)
223
224 if (settings.get('debugtoolbar.enabled', False) or
225 rhodecode.disable_error_handler):
226 226 raise
227
228 if isinstance(e, VCSCommunicationError):
229 return error_handler(VCSServerUnavailable(), request)
230
227 231 return error_handler(HTTPInternalServerError(), request)
232
228 233 return response
229 234
230 235 return pylons_app_with_error_handler
231 236
232 237
233 238 def add_pylons_compat_data(registry, global_config, settings):
234 239 """
235 240 Attach data to the registry to support the Pylons integration.
236 241 """
237 242 registry._pylons_compat_global_config = global_config
238 243 registry._pylons_compat_settings = settings
239 244
240 245
241 246 def webob_to_pyramid_http_response(webob_response):
242 247 ResponseClass = httpexceptions.status_map[webob_response.status_int]
243 248 pyramid_response = ResponseClass(webob_response.status)
244 249 pyramid_response.status = webob_response.status
245 250 pyramid_response.headers.update(webob_response.headers)
246 251 if pyramid_response.headers['content-type'] == 'text/html':
247 252 pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8'
248 253 return pyramid_response
249 254
250 255
251 256 def error_handler(exception, request):
252 # TODO: dan: replace the old pylons error controller with this
253 257 from rhodecode.model.settings import SettingsModel
254 258 from rhodecode.lib.utils2 import AttributeDict
255 259
256 260 try:
257 261 rc_config = SettingsModel().get_all_settings()
258 262 except Exception:
259 263 log.exception('failed to fetch settings')
260 264 rc_config = {}
261 265
262 266 base_response = HTTPInternalServerError()
263 267 # prefer original exception for the response since it may have headers set
264 268 if isinstance(exception, HTTPError):
265 269 base_response = exception
266 270
267 271 c = AttributeDict()
268 272 c.error_message = base_response.status
269 273 c.error_explanation = base_response.explanation or str(base_response)
270 274 c.visual = AttributeDict()
271 275
272 276 c.visual.rhodecode_support_url = (
273 277 request.registry.settings.get('rhodecode_support_url') or
274 278 request.route_url('rhodecode_support')
275 279 )
276 280 c.redirect_time = 0
277 281 c.rhodecode_name = rc_config.get('rhodecode_title', '')
278 282 if not c.rhodecode_name:
279 283 c.rhodecode_name = 'Rhodecode'
280 284
285 c.causes = []
286 if hasattr(base_response, 'causes'):
287 c.causes = base_response.causes
288
281 289 response = render_to_response(
282 290 '/errors/error_document.html', {'c': c}, request=request,
283 291 response=base_response)
284 292
285 293 return response
286 294
287 295
288 296 def includeme(config):
289 297 settings = config.registry.settings
290 298
291 299 # plugin information
292 300 config.registry.rhodecode_plugins = OrderedDict()
293 301
294 302 config.add_directive(
295 303 'register_rhodecode_plugin', register_rhodecode_plugin)
296 304
297 305 if asbool(settings.get('appenlight', 'false')):
298 306 config.include('appenlight_client.ext.pyramid_tween')
299 307
300 308 # Includes which are required. The application would fail without them.
301 309 config.include('pyramid_mako')
302 310 config.include('pyramid_beaker')
303 311 config.include('rhodecode.channelstream')
304 312 config.include('rhodecode.admin')
305 313 config.include('rhodecode.authentication')
306 314 config.include('rhodecode.integrations')
307 315 config.include('rhodecode.login')
308 316 config.include('rhodecode.tweens')
309 317 config.include('rhodecode.api')
310 318 config.include('rhodecode.svn_support')
311 319 config.add_route(
312 320 'rhodecode_support', 'https://rhodecode.com/help/', static=True)
313 321
314 322 # Add subscribers.
315 323 config.add_subscriber(scan_repositories_if_enabled, ApplicationCreated)
316 324
317 325 # Set the authorization policy.
318 326 authz_policy = ACLAuthorizationPolicy()
319 327 config.set_authorization_policy(authz_policy)
320 328
321 329 # Set the default renderer for HTML templates to mako.
322 330 config.add_mako_renderer('.html')
323 331
324 332 # include RhodeCode plugins
325 333 includes = aslist(settings.get('rhodecode.includes', []))
326 334 for inc in includes:
327 335 config.include(inc)
328 336
329 337 # This is the glue which allows us to migrate in chunks. By registering the
330 338 # pylons based application as the "Not Found" view in Pyramid, we will
331 339 # fall back to the old application each time the new one does not yet know
332 340 # how to handle a request.
333 341 config.add_notfound_view(make_not_found_view(config))
334 342
335 343 if not settings.get('debugtoolbar.enabled', False):
336 344 # if no toolbar, then any exception gets caught and rendered
337 345 config.add_view(error_handler, context=Exception)
338 346
339 347 config.add_view(error_handler, context=HTTPError)
340 348
341 349
342 350 def includeme_first(config):
343 351 # redirect automatic browser favicon.ico requests to correct place
344 352 def favicon_redirect(context, request):
345 353 return HTTPFound(
346 354 request.static_path('rhodecode:public/images/favicon.ico'))
347 355
348 356 config.add_view(favicon_redirect, route_name='favicon')
349 357 config.add_route('favicon', '/favicon.ico')
350 358
351 359 config.add_static_view(
352 360 '_static/deform', 'deform:static')
353 361 config.add_static_view(
354 362 '_static/rhodecode', path='rhodecode:public', cache_max_age=3600 * 24)
355 363
356 364
357 365 def wrap_app_in_wsgi_middlewares(pyramid_app, config):
358 366 """
359 367 Apply outer WSGI middlewares around the application.
360 368
361 369 Part of this has been moved up from the Pylons layer, so that the
362 370 data is also available if old Pylons code is hit through an already ported
363 371 view.
364 372 """
365 373 settings = config.registry.settings
366 374
367 375 # enable https redirects based on HTTP_X_URL_SCHEME set by proxy
368 376 pyramid_app = HttpsFixup(pyramid_app, settings)
369 377
370 378 # Add RoutesMiddleware to support the pylons compatibility tween during
371 379 # migration to pyramid.
372 380 pyramid_app = SkippableRoutesMiddleware(
373 381 pyramid_app, config.registry._pylons_compat_config['routes.map'],
374 382 skip_prefixes=(STATIC_FILE_PREFIX, '/_debug_toolbar'))
375 383
376 384 pyramid_app, _ = wrap_in_appenlight_if_enabled(pyramid_app, settings)
377 385
378 386 if settings['gzip_responses']:
379 387 pyramid_app = make_gzip_middleware(
380 388 pyramid_app, settings, compress_level=1)
381 389
382 390
383 391 # this should be the outer most middleware in the wsgi stack since
384 392 # middleware like Routes make database calls
385 393 def pyramid_app_with_cleanup(environ, start_response):
386 394 try:
387 395 return pyramid_app(environ, start_response)
388 396 finally:
389 397 # Dispose current database session and rollback uncommitted
390 398 # transactions.
391 399 meta.Session.remove()
392 400
393 401 # In a single-threaded server, on a non-sqlite db we should have
394 402 # '0 Current Checked out connections' at the end of a request,
395 403 # if not, then something, somewhere is leaving a connection open
396 404 pool = meta.Base.metadata.bind.engine.pool
397 405 log.debug('sa pool status: %s', pool.status())
398 406
399 407
400 408 return pyramid_app_with_cleanup
401 409
402 410
403 411 def sanitize_settings_and_apply_defaults(settings):
404 412 """
405 413 Applies settings defaults and does all type conversion.
406 414
407 415 We would move all settings parsing and preparation into this place, so that
408 416 we have only one place left which deals with this part. The remaining parts
409 417 of the application would start to rely fully on well prepared settings.
410 418
411 419 This piece would later be split up per topic to avoid a big fat monster
412 420 function.
413 421 """
414 422
415 423 # Pyramid's mako renderer has to search in the templates folder so that the
416 424 # old templates still work. Ported and new templates are expected to use
417 425 # real asset specifications for the includes.
418 426 mako_directories = settings.setdefault('mako.directories', [
419 427 # Base templates of the original Pylons application
420 428 'rhodecode:templates',
421 429 ])
422 430 log.debug(
423 431 "Using the following Mako template directories: %s",
424 432 mako_directories)
425 433
426 434 # Default includes, possible to change as a user
427 435 pyramid_includes = settings.setdefault('pyramid.includes', [
428 436 'rhodecode.lib.middleware.request_wrapper',
429 437 ])
430 438 log.debug(
431 439 "Using the following pyramid.includes: %s",
432 440 pyramid_includes)
433 441
434 442 # TODO: johbo: Re-think this, usually the call to config.include
435 443 # should allow to pass in a prefix.
436 444 settings.setdefault('rhodecode.api.url', '/_admin/api')
437 445
438 446 # Sanitize generic settings.
439 447 _list_setting(settings, 'default_encoding', 'UTF-8')
440 448 _bool_setting(settings, 'is_test', 'false')
441 449 _bool_setting(settings, 'gzip_responses', 'false')
442 450
443 451 # Call split out functions that sanitize settings for each topic.
444 452 _sanitize_appenlight_settings(settings)
445 453 _sanitize_vcs_settings(settings)
446 454
447 455 return settings
448 456
449 457
450 458 def _sanitize_appenlight_settings(settings):
451 459 _bool_setting(settings, 'appenlight', 'false')
452 460
453 461
454 462 def _sanitize_vcs_settings(settings):
455 463 """
456 464 Applies settings defaults and does type conversion for all VCS related
457 465 settings.
458 466 """
459 467 _string_setting(settings, 'vcs.svn.compatible_version', '')
460 468 _string_setting(settings, 'git_rev_filter', '--all')
461 469 _string_setting(settings, 'vcs.hooks.protocol', 'pyro4')
462 470 _string_setting(settings, 'vcs.server', '')
463 471 _string_setting(settings, 'vcs.server.log_level', 'debug')
464 472 _string_setting(settings, 'vcs.server.protocol', 'pyro4')
465 473 _bool_setting(settings, 'startup.import_repos', 'false')
466 474 _bool_setting(settings, 'vcs.hooks.direct_calls', 'false')
467 475 _bool_setting(settings, 'vcs.server.enable', 'true')
468 476 _bool_setting(settings, 'vcs.start_server', 'false')
469 477 _list_setting(settings, 'vcs.backends', 'hg, git, svn')
470 478 _int_setting(settings, 'vcs.connection_timeout', 3600)
471 479
472 480
473 481 def _int_setting(settings, name, default):
474 482 settings[name] = int(settings.get(name, default))
475 483
476 484
477 485 def _bool_setting(settings, name, default):
478 486 input = settings.get(name, default)
479 487 if isinstance(input, unicode):
480 488 input = input.encode('utf8')
481 489 settings[name] = asbool(input)
482 490
483 491
484 492 def _list_setting(settings, name, default):
485 493 raw_value = settings.get(name, default)
486 494
487 495 old_separator = ','
488 496 if old_separator in raw_value:
489 497 # If we get a comma separated list, pass it to our own function.
490 498 settings[name] = rhodecode_aslist(raw_value, sep=old_separator)
491 499 else:
492 500 # Otherwise we assume it uses pyramids space/newline separation.
493 501 settings[name] = aslist(raw_value)
494 502
495 503
496 504 def _string_setting(settings, name, default):
497 505 settings[name] = settings.get(name, default).lower()
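
The ``_bool_setting``/``_list_setting``/``_string_setting`` helpers above normalize raw ini values in place. A short sketch of the conversions they are expected to perform, using pyramid's ``asbool`` directly; the sample settings dict is illustrative:

.. code:: python

   from pyramid.settings import asbool

   # illustrative raw values, as they would appear in an .ini file
   settings = {
       'gzip_responses': 'false',
       'vcs.backends': 'hg, git, svn',
       'vcs.connection_timeout': '3600',
   }

   assert asbool(settings['gzip_responses']) is False
   # comma separated lists end up split into clean items
   assert [s.strip() for s in settings['vcs.backends'].split(',')] == \
       ['hg', 'git', 'svn']
   assert int(settings['vcs.connection_timeout']) == 3600
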
@@ -1,1161 +1,1160 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 Routes configuration
23 23
24 24 The more specific and detailed routes should be defined first so they
25 25 may take precedence over the more generic routes. For more information
26 26 refer to the routes manual at http://routes.groovie.org/docs/
27 27
28 28 IMPORTANT: if you change any routing here, make sure to take a look at lib/base.py
29 29 and _route_name variable which uses some of stored naming here to do redirects.
30 30 """
31 31 import os
32 32 import re
33 33 from routes import Mapper
34 34
35 35 from rhodecode.config import routing_links
36 36
37 37 # prefix for non-repository related links; needs to be prefixed with `/`
38 38 ADMIN_PREFIX = '/_admin'
39 39 STATIC_FILE_PREFIX = '/_static'
40 40
41 41 # Default requirements for URL parts
42 42 URL_NAME_REQUIREMENTS = {
43 43 # group name can have a slash in them, but they must not end with a slash
44 44 'group_name': r'.*?[^/]',
45 'repo_group_name': r'.*?[^/]',
45 46 # repo names can have a slash in them, but they must not end with a slash
46 47 'repo_name': r'.*?[^/]',
47 48 # file path eats up everything at the end
48 49 'f_path': r'.*',
49 50 # reference types
50 51 'source_ref_type': '(branch|book|tag|rev|\%\(source_ref_type\)s)',
51 52 'target_ref_type': '(branch|book|tag|rev|\%\(target_ref_type\)s)',
52 53 }
53 54
54 55
55 56 def add_route_requirements(route_path, requirements):
56 57 """
57 58 Adds regex requirements to pyramid routes using a mapping dict
58 59
59 60 >>> add_route_requirements('/{action}/{id}', {'id': r'\d+'})
60 61 '/{action}/{id:\d+}'
61 62
62 63 """
63 64 for key, regex in requirements.items():
64 65 route_path = route_path.replace('{%s}' % key, '{%s:%s}' % (key, regex))
65 66 return route_path
66 67
67 68
68 69 class JSRoutesMapper(Mapper):
69 70 """
70 71 Wrapper for routes.Mapper to make pyroutes compatible url definitions
71 72 """
72 73 _named_route_regex = re.compile(r'^[a-z-_0-9A-Z]+$')
73 74 _argument_prog = re.compile('\{(.*?)\}|:\((.*)\)')
74 75 def __init__(self, *args, **kw):
75 76 super(JSRoutesMapper, self).__init__(*args, **kw)
76 77 self._jsroutes = []
77 78
78 79 def connect(self, *args, **kw):
79 80 """
80 81 Wrapper for connect to take an extra argument jsroute=True
81 82
82 83 :param jsroute: boolean, if True will add the route to the pyroutes list
83 84 """
84 85 if kw.pop('jsroute', False):
85 86 if not self._named_route_regex.match(args[0]):
86 87 raise Exception('only named routes can be added to pyroutes')
87 88 self._jsroutes.append(args[0])
88 89
89 90 super(JSRoutesMapper, self).connect(*args, **kw)
90 91
91 92 def _extract_route_information(self, route):
92 93 """
93 94 Convert a route into tuple(name, path, args), eg:
94 95 ('user_profile', '/profile/%(username)s', ['username'])
95 96 """
96 97 routepath = route.routepath
97 98 def replace(matchobj):
98 99 if matchobj.group(1):
99 100 return "%%(%s)s" % matchobj.group(1).split(':')[0]
100 101 else:
101 102 return "%%(%s)s" % matchobj.group(2)
102 103
103 104 routepath = self._argument_prog.sub(replace, routepath)
104 105 return (
105 106 route.name,
106 107 routepath,
107 108 [(arg[0].split(':')[0] if arg[0] != '' else arg[1])
108 109 for arg in self._argument_prog.findall(route.routepath)]
109 110 )
110 111
111 112 def jsroutes(self):
112 113 """
113 114 Return a list of pyroutes.js compatible routes
114 115 """
115 116 for route_name in self._jsroutes:
116 117 yield self._extract_route_information(self._routenames[route_name])
117 118
118 119
119 120 def make_map(config):
120 121 """Create, configure and return the routes Mapper"""
121 122 rmap = JSRoutesMapper(directory=config['pylons.paths']['controllers'],
122 123 always_scan=config['debug'])
123 124 rmap.minimization = False
124 125 rmap.explicit = False
125 126
126 127 from rhodecode.lib.utils2 import str2bool
127 128 from rhodecode.model import repo, repo_group
128 129
129 130 def check_repo(environ, match_dict):
130 131 """
131 132 check for valid repository for proper 404 handling
132 133
133 134 :param environ:
134 135 :param match_dict:
135 136 """
136 137 repo_name = match_dict.get('repo_name')
137 138
138 139 if match_dict.get('f_path'):
139 140 # fix for multiple initial slashes that causes errors
140 141 match_dict['f_path'] = match_dict['f_path'].lstrip('/')
141 142 repo_model = repo.RepoModel()
142 143 by_name_match = repo_model.get_by_repo_name(repo_name)
143 144 # if we match quickly from database, short circuit the operation,
144 145 # and validate repo based on the type.
145 146 if by_name_match:
146 147 return True
147 148
148 149 by_id_match = repo_model.get_repo_by_id(repo_name)
149 150 if by_id_match:
150 151 repo_name = by_id_match.repo_name
151 152 match_dict['repo_name'] = repo_name
152 153 return True
153 154
154 155 return False
155 156
156 157 def check_group(environ, match_dict):
157 158 """
158 159 check for valid repository group path for proper 404 handling
159 160
160 161 :param environ:
161 162 :param match_dict:
162 163 """
163 164 repo_group_name = match_dict.get('group_name')
164 165 repo_group_model = repo_group.RepoGroupModel()
165 166 by_name_match = repo_group_model.get_by_group_name(repo_group_name)
166 167 if by_name_match:
167 168 return True
168 169
169 170 return False
170 171
171 172 def check_user_group(environ, match_dict):
172 173 """
173 174 check for valid user group for proper 404 handling
174 175
175 176 :param environ:
176 177 :param match_dict:
177 178 """
178 179 return True
179 180
180 181 def check_int(environ, match_dict):
181 182 return match_dict.get('id').isdigit()
182 183
183 184
184 185 #==========================================================================
185 186 # CUSTOM ROUTES HERE
186 187 #==========================================================================
187 188
188 189 # MAIN PAGE
189 190 rmap.connect('home', '/', controller='home', action='index', jsroute=True)
190 191 rmap.connect('goto_switcher_data', '/_goto_data', controller='home',
191 192 action='goto_switcher_data')
192 193 rmap.connect('repo_list_data', '/_repos', controller='home',
193 194 action='repo_list_data')
194 195
195 196 rmap.connect('user_autocomplete_data', '/_users', controller='home',
196 197 action='user_autocomplete_data', jsroute=True)
197 198 rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home',
198 199 action='user_group_autocomplete_data')
199 200
200 201 rmap.connect(
201 202 'user_profile', '/_profiles/{username}', controller='users',
202 203 action='user_profile')
203 204
204 205 # TODO: johbo: Static links, to be replaced by our redirection mechanism
205 206 rmap.connect('rst_help',
206 207 'http://docutils.sourceforge.net/docs/user/rst/quickref.html',
207 208 _static=True)
208 209 rmap.connect('markdown_help',
209 210 'http://daringfireball.net/projects/markdown/syntax',
210 211 _static=True)
211 212 rmap.connect('rhodecode_official', 'https://rhodecode.com', _static=True)
212 213 rmap.connect('rhodecode_support', 'https://rhodecode.com/help/', _static=True)
213 214 rmap.connect('rhodecode_translations', 'https://rhodecode.com/translate/enterprise', _static=True)
214 215 # TODO: anderson - making this a static link since redirect won't play
215 216 # nice with POST requests
216 217 rmap.connect('enterprise_license_convert_from_old',
217 218 'https://rhodecode.com/u/license-upgrade',
218 219 _static=True)
219 220
220 221 routing_links.connect_redirection_links(rmap)
221 222
222 223 rmap.connect('ping', '%s/ping' % (ADMIN_PREFIX,), controller='home', action='ping')
223 224 rmap.connect('error_test', '%s/error_test' % (ADMIN_PREFIX,), controller='home', action='error_test')
224 225
225 226 # ADMIN REPOSITORY ROUTES
226 227 with rmap.submapper(path_prefix=ADMIN_PREFIX,
227 228 controller='admin/repos') as m:
228 229 m.connect('repos', '/repos',
229 230 action='create', conditions={'method': ['POST']})
230 231 m.connect('repos', '/repos',
231 232 action='index', conditions={'method': ['GET']})
232 233 m.connect('new_repo', '/create_repository', jsroute=True,
233 234 action='create_repository', conditions={'method': ['GET']})
234 235 m.connect('/repos/{repo_name}',
235 236 action='update', conditions={'method': ['PUT'],
236 237 'function': check_repo},
237 238 requirements=URL_NAME_REQUIREMENTS)
238 239 m.connect('delete_repo', '/repos/{repo_name}',
239 240 action='delete', conditions={'method': ['DELETE']},
240 241 requirements=URL_NAME_REQUIREMENTS)
241 242 m.connect('repo', '/repos/{repo_name}',
242 243 action='show', conditions={'method': ['GET'],
243 244 'function': check_repo},
244 245 requirements=URL_NAME_REQUIREMENTS)
245 246
246 247 # ADMIN REPOSITORY GROUPS ROUTES
247 248 with rmap.submapper(path_prefix=ADMIN_PREFIX,
248 249 controller='admin/repo_groups') as m:
249 250 m.connect('repo_groups', '/repo_groups',
250 251 action='create', conditions={'method': ['POST']})
251 252 m.connect('repo_groups', '/repo_groups',
252 253 action='index', conditions={'method': ['GET']})
253 254 m.connect('new_repo_group', '/repo_groups/new',
254 255 action='new', conditions={'method': ['GET']})
255 256 m.connect('update_repo_group', '/repo_groups/{group_name}',
256 257 action='update', conditions={'method': ['PUT'],
257 258 'function': check_group},
258 259 requirements=URL_NAME_REQUIREMENTS)
259 260
260 261 # EXTRAS REPO GROUP ROUTES
261 262 m.connect('edit_repo_group', '/repo_groups/{group_name}/edit',
262 263 action='edit',
263 264 conditions={'method': ['GET'], 'function': check_group},
264 265 requirements=URL_NAME_REQUIREMENTS)
265 266 m.connect('edit_repo_group', '/repo_groups/{group_name}/edit',
266 267 action='edit',
267 268 conditions={'method': ['PUT'], 'function': check_group},
268 269 requirements=URL_NAME_REQUIREMENTS)
269 270
270 271 m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced',
271 272 action='edit_repo_group_advanced',
272 273 conditions={'method': ['GET'], 'function': check_group},
273 274 requirements=URL_NAME_REQUIREMENTS)
274 275 m.connect('edit_repo_group_advanced', '/repo_groups/{group_name}/edit/advanced',
275 276 action='edit_repo_group_advanced',
276 277 conditions={'method': ['PUT'], 'function': check_group},
277 278 requirements=URL_NAME_REQUIREMENTS)
278 279
279 280 m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions',
280 281 action='edit_repo_group_perms',
281 282 conditions={'method': ['GET'], 'function': check_group},
282 283 requirements=URL_NAME_REQUIREMENTS)
283 284 m.connect('edit_repo_group_perms', '/repo_groups/{group_name}/edit/permissions',
284 285 action='update_perms',
285 286 conditions={'method': ['PUT'], 'function': check_group},
286 287 requirements=URL_NAME_REQUIREMENTS)
287 288
288 289 m.connect('delete_repo_group', '/repo_groups/{group_name}',
289 290 action='delete', conditions={'method': ['DELETE'],
290 291 'function': check_group},
291 292 requirements=URL_NAME_REQUIREMENTS)
292 293
293 294 # ADMIN USER ROUTES
294 295 with rmap.submapper(path_prefix=ADMIN_PREFIX,
295 296 controller='admin/users') as m:
296 297 m.connect('users', '/users',
297 298 action='create', conditions={'method': ['POST']})
298 299 m.connect('users', '/users',
299 300 action='index', conditions={'method': ['GET']})
300 301 m.connect('new_user', '/users/new',
301 302 action='new', conditions={'method': ['GET']})
302 303 m.connect('update_user', '/users/{user_id}',
303 304 action='update', conditions={'method': ['PUT']})
304 305 m.connect('delete_user', '/users/{user_id}',
305 306 action='delete', conditions={'method': ['DELETE']})
306 307 m.connect('edit_user', '/users/{user_id}/edit',
307 308 action='edit', conditions={'method': ['GET']})
308 309 m.connect('user', '/users/{user_id}',
309 310 action='show', conditions={'method': ['GET']})
310 311 m.connect('force_password_reset_user', '/users/{user_id}/password_reset',
311 312 action='reset_password', conditions={'method': ['POST']})
312 313 m.connect('create_personal_repo_group', '/users/{user_id}/create_repo_group',
313 314 action='create_personal_repo_group', conditions={'method': ['POST']})
314 315
315 316 # EXTRAS USER ROUTES
316 317 m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced',
317 318 action='edit_advanced', conditions={'method': ['GET']})
318 319 m.connect('edit_user_advanced', '/users/{user_id}/edit/advanced',
319 320 action='update_advanced', conditions={'method': ['PUT']})
320 321
321 322 m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens',
322 323 action='edit_auth_tokens', conditions={'method': ['GET']})
323 324 m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens',
324 325 action='add_auth_token', conditions={'method': ['PUT']})
325 326 m.connect('edit_user_auth_tokens', '/users/{user_id}/edit/auth_tokens',
326 327 action='delete_auth_token', conditions={'method': ['DELETE']})
327 328
328 329 m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions',
329 330 action='edit_global_perms', conditions={'method': ['GET']})
330 331 m.connect('edit_user_global_perms', '/users/{user_id}/edit/global_permissions',
331 332 action='update_global_perms', conditions={'method': ['PUT']})
332 333
333 334 m.connect('edit_user_perms_summary', '/users/{user_id}/edit/permissions_summary',
334 335 action='edit_perms_summary', conditions={'method': ['GET']})
335 336
336 337 m.connect('edit_user_emails', '/users/{user_id}/edit/emails',
337 338 action='edit_emails', conditions={'method': ['GET']})
338 339 m.connect('edit_user_emails', '/users/{user_id}/edit/emails',
339 340 action='add_email', conditions={'method': ['PUT']})
340 341 m.connect('edit_user_emails', '/users/{user_id}/edit/emails',
341 342 action='delete_email', conditions={'method': ['DELETE']})
342 343
343 344 m.connect('edit_user_ips', '/users/{user_id}/edit/ips',
344 345 action='edit_ips', conditions={'method': ['GET']})
345 346 m.connect('edit_user_ips', '/users/{user_id}/edit/ips',
346 347 action='add_ip', conditions={'method': ['PUT']})
347 348 m.connect('edit_user_ips', '/users/{user_id}/edit/ips',
348 349 action='delete_ip', conditions={'method': ['DELETE']})
349 350
350 351 # ADMIN USER GROUPS REST ROUTES
351 352 with rmap.submapper(path_prefix=ADMIN_PREFIX,
352 353 controller='admin/user_groups') as m:
353 354 m.connect('users_groups', '/user_groups',
354 355 action='create', conditions={'method': ['POST']})
355 356 m.connect('users_groups', '/user_groups',
356 357 action='index', conditions={'method': ['GET']})
357 358 m.connect('new_users_group', '/user_groups/new',
358 359 action='new', conditions={'method': ['GET']})
359 360 m.connect('update_users_group', '/user_groups/{user_group_id}',
360 361 action='update', conditions={'method': ['PUT']})
361 362 m.connect('delete_users_group', '/user_groups/{user_group_id}',
362 363 action='delete', conditions={'method': ['DELETE']})
363 364 m.connect('edit_users_group', '/user_groups/{user_group_id}/edit',
364 365 action='edit', conditions={'method': ['GET']},
365 366 function=check_user_group)
366 367
367 368 # EXTRAS USER GROUP ROUTES
368 369 m.connect('edit_user_group_global_perms',
369 370 '/user_groups/{user_group_id}/edit/global_permissions',
370 371 action='edit_global_perms', conditions={'method': ['GET']})
371 372 m.connect('edit_user_group_global_perms',
372 373 '/user_groups/{user_group_id}/edit/global_permissions',
373 374 action='update_global_perms', conditions={'method': ['PUT']})
374 375 m.connect('edit_user_group_perms_summary',
375 376 '/user_groups/{user_group_id}/edit/permissions_summary',
376 377 action='edit_perms_summary', conditions={'method': ['GET']})
377 378
378 379 m.connect('edit_user_group_perms',
379 380 '/user_groups/{user_group_id}/edit/permissions',
380 381 action='edit_perms', conditions={'method': ['GET']})
381 382 m.connect('edit_user_group_perms',
382 383 '/user_groups/{user_group_id}/edit/permissions',
383 384 action='update_perms', conditions={'method': ['PUT']})
384 385
385 386 m.connect('edit_user_group_advanced',
386 387 '/user_groups/{user_group_id}/edit/advanced',
387 388 action='edit_advanced', conditions={'method': ['GET']})
388 389
389 390 m.connect('edit_user_group_members',
390 391 '/user_groups/{user_group_id}/edit/members', jsroute=True,
391 392 action='edit_members', conditions={'method': ['GET']})
392 393
393 394 # ADMIN PERMISSIONS ROUTES
394 395 with rmap.submapper(path_prefix=ADMIN_PREFIX,
395 396 controller='admin/permissions') as m:
396 397 m.connect('admin_permissions_application', '/permissions/application',
397 398 action='permission_application_update', conditions={'method': ['POST']})
398 399 m.connect('admin_permissions_application', '/permissions/application',
399 400 action='permission_application', conditions={'method': ['GET']})
400 401
401 402 m.connect('admin_permissions_global', '/permissions/global',
402 403 action='permission_global_update', conditions={'method': ['POST']})
403 404 m.connect('admin_permissions_global', '/permissions/global',
404 405 action='permission_global', conditions={'method': ['GET']})
405 406
406 407 m.connect('admin_permissions_object', '/permissions/object',
407 408 action='permission_objects_update', conditions={'method': ['POST']})
408 409 m.connect('admin_permissions_object', '/permissions/object',
409 410 action='permission_objects', conditions={'method': ['GET']})
410 411
411 412 m.connect('admin_permissions_ips', '/permissions/ips',
412 413 action='permission_ips', conditions={'method': ['POST']})
413 414 m.connect('admin_permissions_ips', '/permissions/ips',
414 415 action='permission_ips', conditions={'method': ['GET']})
415 416
416 417 m.connect('admin_permissions_overview', '/permissions/overview',
417 418 action='permission_perms', conditions={'method': ['GET']})
418 419
419 420 # ADMIN DEFAULTS REST ROUTES
420 421 with rmap.submapper(path_prefix=ADMIN_PREFIX,
421 422 controller='admin/defaults') as m:
422 423 m.connect('admin_defaults_repositories', '/defaults/repositories',
423 424 action='update_repository_defaults', conditions={'method': ['POST']})
424 425 m.connect('admin_defaults_repositories', '/defaults/repositories',
425 426 action='index', conditions={'method': ['GET']})
426 427
427 428 # ADMIN DEBUG STYLE ROUTES
428 429 if str2bool(config.get('debug_style')):
429 430 with rmap.submapper(path_prefix=ADMIN_PREFIX + '/debug_style',
430 431 controller='debug_style') as m:
431 432 m.connect('debug_style_home', '',
432 433 action='index', conditions={'method': ['GET']})
433 434 m.connect('debug_style_template', '/t/{t_path}',
434 435 action='template', conditions={'method': ['GET']})
435 436
436 437 # ADMIN SETTINGS ROUTES
437 438 with rmap.submapper(path_prefix=ADMIN_PREFIX,
438 439 controller='admin/settings') as m:
439 440
440 441 # default
441 442 m.connect('admin_settings', '/settings',
442 443 action='settings_global_update',
443 444 conditions={'method': ['POST']})
444 445 m.connect('admin_settings', '/settings',
445 446 action='settings_global', conditions={'method': ['GET']})
446 447
447 448 m.connect('admin_settings_vcs', '/settings/vcs',
448 449 action='settings_vcs_update',
449 450 conditions={'method': ['POST']})
450 451 m.connect('admin_settings_vcs', '/settings/vcs',
451 452 action='settings_vcs',
452 453 conditions={'method': ['GET']})
453 454 m.connect('admin_settings_vcs', '/settings/vcs',
454 455 action='delete_svn_pattern',
455 456 conditions={'method': ['DELETE']})
456 457
457 458 m.connect('admin_settings_mapping', '/settings/mapping',
458 459 action='settings_mapping_update',
459 460 conditions={'method': ['POST']})
460 461 m.connect('admin_settings_mapping', '/settings/mapping',
461 462 action='settings_mapping', conditions={'method': ['GET']})
462 463
463 464 m.connect('admin_settings_global', '/settings/global',
464 465 action='settings_global_update',
465 466 conditions={'method': ['POST']})
466 467 m.connect('admin_settings_global', '/settings/global',
467 468 action='settings_global', conditions={'method': ['GET']})
468 469
469 470 m.connect('admin_settings_visual', '/settings/visual',
470 471 action='settings_visual_update',
471 472 conditions={'method': ['POST']})
472 473 m.connect('admin_settings_visual', '/settings/visual',
473 474 action='settings_visual', conditions={'method': ['GET']})
474 475
475 476 m.connect('admin_settings_issuetracker',
476 477 '/settings/issue-tracker', action='settings_issuetracker',
477 478 conditions={'method': ['GET']})
478 479 m.connect('admin_settings_issuetracker_save',
479 480 '/settings/issue-tracker/save',
480 481 action='settings_issuetracker_save',
481 482 conditions={'method': ['POST']})
482 483 m.connect('admin_issuetracker_test', '/settings/issue-tracker/test',
483 484 action='settings_issuetracker_test',
484 485 conditions={'method': ['POST']})
485 486 m.connect('admin_issuetracker_delete',
486 487 '/settings/issue-tracker/delete',
487 488 action='settings_issuetracker_delete',
488 489 conditions={'method': ['DELETE']})
489 490
490 491 m.connect('admin_settings_email', '/settings/email',
491 492 action='settings_email_update',
492 493 conditions={'method': ['POST']})
493 494 m.connect('admin_settings_email', '/settings/email',
494 495 action='settings_email', conditions={'method': ['GET']})
495 496
496 497 m.connect('admin_settings_hooks', '/settings/hooks',
497 498 action='settings_hooks_update',
498 499 conditions={'method': ['POST', 'DELETE']})
499 500 m.connect('admin_settings_hooks', '/settings/hooks',
500 501 action='settings_hooks', conditions={'method': ['GET']})
501 502
502 503 m.connect('admin_settings_search', '/settings/search',
503 504 action='settings_search', conditions={'method': ['GET']})
504 505
505 506 m.connect('admin_settings_system', '/settings/system',
506 507 action='settings_system', conditions={'method': ['GET']})
507 508
508 509 m.connect('admin_settings_system_update', '/settings/system/updates',
509 510 action='settings_system_update', conditions={'method': ['GET']})
510 511
511 512 m.connect('admin_settings_supervisor', '/settings/supervisor',
512 513 action='settings_supervisor', conditions={'method': ['GET']})
513 514 m.connect('admin_settings_supervisor_log', '/settings/supervisor/{procid}/log',
514 515 action='settings_supervisor_log', conditions={'method': ['GET']})
515 516
516 517 m.connect('admin_settings_labs', '/settings/labs',
517 518 action='settings_labs_update',
518 519 conditions={'method': ['POST']})
519 520 m.connect('admin_settings_labs', '/settings/labs',
520 521 action='settings_labs', conditions={'method': ['GET']})
521 522
522 523 # ADMIN MY ACCOUNT
523 524 with rmap.submapper(path_prefix=ADMIN_PREFIX,
524 525 controller='admin/my_account') as m:
525 526
526 527 m.connect('my_account', '/my_account',
527 528 action='my_account', conditions={'method': ['GET']})
528 529 m.connect('my_account_edit', '/my_account/edit',
529 530 action='my_account_edit', conditions={'method': ['GET']})
530 531 m.connect('my_account', '/my_account',
531 532 action='my_account_update', conditions={'method': ['POST']})
532 533
533 534 m.connect('my_account_password', '/my_account/password',
534 action='my_account_password', conditions={'method': ['GET']})
535 m.connect('my_account_password', '/my_account/password',
536 action='my_account_password_update', conditions={'method': ['POST']})
535 action='my_account_password', conditions={'method': ['GET', 'POST']})
537 536
538 537 m.connect('my_account_repos', '/my_account/repos',
539 538 action='my_account_repos', conditions={'method': ['GET']})
540 539
541 540 m.connect('my_account_watched', '/my_account/watched',
542 541 action='my_account_watched', conditions={'method': ['GET']})
543 542
544 543 m.connect('my_account_pullrequests', '/my_account/pull_requests',
545 544 action='my_account_pullrequests', conditions={'method': ['GET']})
546 545
547 546 m.connect('my_account_perms', '/my_account/perms',
548 547 action='my_account_perms', conditions={'method': ['GET']})
549 548
550 549 m.connect('my_account_emails', '/my_account/emails',
551 550 action='my_account_emails', conditions={'method': ['GET']})
552 551 m.connect('my_account_emails', '/my_account/emails',
553 552 action='my_account_emails_add', conditions={'method': ['POST']})
554 553 m.connect('my_account_emails', '/my_account/emails',
555 554 action='my_account_emails_delete', conditions={'method': ['DELETE']})
556 555
557 556 m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
558 557 action='my_account_auth_tokens', conditions={'method': ['GET']})
559 558 m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
560 559 action='my_account_auth_tokens_add', conditions={'method': ['POST']})
561 560 m.connect('my_account_auth_tokens', '/my_account/auth_tokens',
562 561 action='my_account_auth_tokens_delete', conditions={'method': ['DELETE']})
563 562 m.connect('my_account_notifications', '/my_account/notifications',
564 563 action='my_notifications',
565 564 conditions={'method': ['GET']})
566 565 m.connect('my_account_notifications_toggle_visibility',
567 566 '/my_account/toggle_visibility',
568 567 action='my_notifications_toggle_visibility',
569 568 conditions={'method': ['POST']})
570 569
571 570 # NOTIFICATION REST ROUTES
572 571 with rmap.submapper(path_prefix=ADMIN_PREFIX,
573 572 controller='admin/notifications') as m:
574 573 m.connect('notifications', '/notifications',
575 574 action='index', conditions={'method': ['GET']})
576 575 m.connect('notifications_mark_all_read', '/notifications/mark_all_read',
577 576 action='mark_all_read', conditions={'method': ['POST']})
578 577 m.connect('/notifications/{notification_id}',
579 578 action='update', conditions={'method': ['PUT']})
580 579 m.connect('/notifications/{notification_id}',
581 580 action='delete', conditions={'method': ['DELETE']})
582 581 m.connect('notification', '/notifications/{notification_id}',
583 582 action='show', conditions={'method': ['GET']})
584 583
585 584 # ADMIN GIST
586 585 with rmap.submapper(path_prefix=ADMIN_PREFIX,
587 586 controller='admin/gists') as m:
588 587 m.connect('gists', '/gists',
589 588 action='create', conditions={'method': ['POST']})
590 589 m.connect('gists', '/gists', jsroute=True,
591 590 action='index', conditions={'method': ['GET']})
592 591 m.connect('new_gist', '/gists/new', jsroute=True,
593 592 action='new', conditions={'method': ['GET']})
594 593
595 594 m.connect('/gists/{gist_id}',
596 595 action='delete', conditions={'method': ['DELETE']})
597 596 m.connect('edit_gist', '/gists/{gist_id}/edit',
598 597 action='edit_form', conditions={'method': ['GET']})
599 598 m.connect('edit_gist', '/gists/{gist_id}/edit',
600 599 action='edit', conditions={'method': ['POST']})
601 600 m.connect(
602 601 'edit_gist_check_revision', '/gists/{gist_id}/edit/check_revision',
603 602 action='check_revision', conditions={'method': ['GET']})
604 603
605 604 m.connect('gist', '/gists/{gist_id}',
606 605 action='show', conditions={'method': ['GET']})
607 606 m.connect('gist_rev', '/gists/{gist_id}/{revision}',
608 607 revision='tip',
609 608 action='show', conditions={'method': ['GET']})
610 609 m.connect('formatted_gist', '/gists/{gist_id}/{revision}/{format}',
611 610 revision='tip',
612 611 action='show', conditions={'method': ['GET']})
613 612 m.connect('formatted_gist_file', '/gists/{gist_id}/{revision}/{format}/{f_path}',
614 613 revision='tip',
615 614 action='show', conditions={'method': ['GET']},
616 615 requirements=URL_NAME_REQUIREMENTS)
617 616
618 617 # ADMIN MAIN PAGES
619 618 with rmap.submapper(path_prefix=ADMIN_PREFIX,
620 619 controller='admin/admin') as m:
621 620 m.connect('admin_home', '', action='index')
622 621 m.connect('admin_add_repo', '/add_repo/{new_repo:[a-z0-9\. _-]*}',
623 622 action='add_repo')
624 623 m.connect(
625 624 'pull_requests_global_0', '/pull_requests/{pull_request_id:[0-9]+}',
626 625 action='pull_requests')
627 626 m.connect(
628 627 'pull_requests_global', '/pull-requests/{pull_request_id:[0-9]+}',
629 628 action='pull_requests')
630 629
631 630
632 631 # USER JOURNAL
633 632 rmap.connect('journal', '%s/journal' % (ADMIN_PREFIX,),
634 633 controller='journal', action='index')
635 634 rmap.connect('journal_rss', '%s/journal/rss' % (ADMIN_PREFIX,),
636 635 controller='journal', action='journal_rss')
637 636 rmap.connect('journal_atom', '%s/journal/atom' % (ADMIN_PREFIX,),
638 637 controller='journal', action='journal_atom')
639 638
640 639 rmap.connect('public_journal', '%s/public_journal' % (ADMIN_PREFIX,),
641 640 controller='journal', action='public_journal')
642 641
643 642 rmap.connect('public_journal_rss', '%s/public_journal/rss' % (ADMIN_PREFIX,),
644 643 controller='journal', action='public_journal_rss')
645 644
646 645 rmap.connect('public_journal_rss_old', '%s/public_journal_rss' % (ADMIN_PREFIX,),
647 646 controller='journal', action='public_journal_rss')
648 647
649 648 rmap.connect('public_journal_atom',
650 649 '%s/public_journal/atom' % (ADMIN_PREFIX,), controller='journal',
651 650 action='public_journal_atom')
652 651
653 652 rmap.connect('public_journal_atom_old',
654 653 '%s/public_journal_atom' % (ADMIN_PREFIX,), controller='journal',
655 654 action='public_journal_atom')
656 655
657 656 rmap.connect('toggle_following', '%s/toggle_following' % (ADMIN_PREFIX,),
658 657 controller='journal', action='toggle_following', jsroute=True,
659 658 conditions={'method': ['POST']})
660 659
661 660 # FULL TEXT SEARCH
662 661 rmap.connect('search', '%s/search' % (ADMIN_PREFIX,),
663 662 controller='search')
664 663 rmap.connect('search_repo_home', '/{repo_name}/search',
665 664 controller='search',
666 665 action='index',
667 666 conditions={'function': check_repo},
668 667 requirements=URL_NAME_REQUIREMENTS)
669 668
670 669 # FEEDS
671 670 rmap.connect('rss_feed_home', '/{repo_name}/feed/rss',
672 671 controller='feed', action='rss',
673 672 conditions={'function': check_repo},
674 673 requirements=URL_NAME_REQUIREMENTS)
675 674
676 675 rmap.connect('atom_feed_home', '/{repo_name}/feed/atom',
677 676 controller='feed', action='atom',
678 677 conditions={'function': check_repo},
679 678 requirements=URL_NAME_REQUIREMENTS)
680 679
681 680 #==========================================================================
682 681 # REPOSITORY ROUTES
683 682 #==========================================================================
684 683
685 684 rmap.connect('repo_creating_home', '/{repo_name}/repo_creating',
686 685 controller='admin/repos', action='repo_creating',
687 686 requirements=URL_NAME_REQUIREMENTS)
688 687 rmap.connect('repo_check_home', '/{repo_name}/crepo_check',
689 688 controller='admin/repos', action='repo_check',
690 689 requirements=URL_NAME_REQUIREMENTS)
691 690
692 691 rmap.connect('repo_stats', '/{repo_name}/repo_stats/{commit_id}',
693 692 controller='summary', action='repo_stats',
694 693 conditions={'function': check_repo},
695 694 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
696 695
697 696 rmap.connect('repo_refs_data', '/{repo_name}/refs-data',
698 697 controller='summary', action='repo_refs_data', jsroute=True,
699 698 requirements=URL_NAME_REQUIREMENTS)
700 699 rmap.connect('repo_refs_changelog_data', '/{repo_name}/refs-data-changelog',
701 700 controller='summary', action='repo_refs_changelog_data',
702 701 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
703 702
704 703 rmap.connect('changeset_home', '/{repo_name}/changeset/{revision}',
705 704 controller='changeset', revision='tip', jsroute=True,
706 705 conditions={'function': check_repo},
707 706 requirements=URL_NAME_REQUIREMENTS)
708 707 rmap.connect('changeset_children', '/{repo_name}/changeset_children/{revision}',
709 708 controller='changeset', revision='tip', action='changeset_children',
710 709 conditions={'function': check_repo},
711 710 requirements=URL_NAME_REQUIREMENTS)
712 711 rmap.connect('changeset_parents', '/{repo_name}/changeset_parents/{revision}',
713 712 controller='changeset', revision='tip', action='changeset_parents',
714 713 conditions={'function': check_repo},
715 714 requirements=URL_NAME_REQUIREMENTS)
716 715
717 716 # repo edit options
718 717 rmap.connect('edit_repo', '/{repo_name}/settings', jsroute=True,
719 718 controller='admin/repos', action='edit',
720 719 conditions={'method': ['GET'], 'function': check_repo},
721 720 requirements=URL_NAME_REQUIREMENTS)
722 721
723 722 rmap.connect('edit_repo_perms', '/{repo_name}/settings/permissions',
724 723 jsroute=True,
725 724 controller='admin/repos', action='edit_permissions',
726 725 conditions={'method': ['GET'], 'function': check_repo},
727 726 requirements=URL_NAME_REQUIREMENTS)
728 727 rmap.connect('edit_repo_perms_update', '/{repo_name}/settings/permissions',
729 728 controller='admin/repos', action='edit_permissions_update',
730 729 conditions={'method': ['PUT'], 'function': check_repo},
731 730 requirements=URL_NAME_REQUIREMENTS)
732 731
733 732 rmap.connect('edit_repo_fields', '/{repo_name}/settings/fields',
734 733 controller='admin/repos', action='edit_fields',
735 734 conditions={'method': ['GET'], 'function': check_repo},
736 735 requirements=URL_NAME_REQUIREMENTS)
737 736 rmap.connect('create_repo_fields', '/{repo_name}/settings/fields/new',
738 737 controller='admin/repos', action='create_repo_field',
739 738 conditions={'method': ['PUT'], 'function': check_repo},
740 739 requirements=URL_NAME_REQUIREMENTS)
741 740 rmap.connect('delete_repo_fields', '/{repo_name}/settings/fields/{field_id}',
742 741 controller='admin/repos', action='delete_repo_field',
743 742 conditions={'method': ['DELETE'], 'function': check_repo},
744 743 requirements=URL_NAME_REQUIREMENTS)
745 744
746 745 rmap.connect('edit_repo_advanced', '/{repo_name}/settings/advanced',
747 746 controller='admin/repos', action='edit_advanced',
748 747 conditions={'method': ['GET'], 'function': check_repo},
749 748 requirements=URL_NAME_REQUIREMENTS)
750 749
751 750 rmap.connect('edit_repo_advanced_locking', '/{repo_name}/settings/advanced/locking',
752 751 controller='admin/repos', action='edit_advanced_locking',
753 752 conditions={'method': ['PUT'], 'function': check_repo},
754 753 requirements=URL_NAME_REQUIREMENTS)
755 754 rmap.connect('toggle_locking', '/{repo_name}/settings/advanced/locking_toggle',
756 755 controller='admin/repos', action='toggle_locking',
757 756 conditions={'method': ['GET'], 'function': check_repo},
758 757 requirements=URL_NAME_REQUIREMENTS)
759 758
760 759 rmap.connect('edit_repo_advanced_journal', '/{repo_name}/settings/advanced/journal',
761 760 controller='admin/repos', action='edit_advanced_journal',
762 761 conditions={'method': ['PUT'], 'function': check_repo},
763 762 requirements=URL_NAME_REQUIREMENTS)
764 763
765 764 rmap.connect('edit_repo_advanced_fork', '/{repo_name}/settings/advanced/fork',
766 765 controller='admin/repos', action='edit_advanced_fork',
767 766 conditions={'method': ['PUT'], 'function': check_repo},
768 767 requirements=URL_NAME_REQUIREMENTS)
769 768
770 769 rmap.connect('edit_repo_caches', '/{repo_name}/settings/caches',
771 770 controller='admin/repos', action='edit_caches_form',
772 771 conditions={'method': ['GET'], 'function': check_repo},
773 772 requirements=URL_NAME_REQUIREMENTS)
774 773 rmap.connect('edit_repo_caches', '/{repo_name}/settings/caches',
775 774 controller='admin/repos', action='edit_caches',
776 775 conditions={'method': ['PUT'], 'function': check_repo},
777 776 requirements=URL_NAME_REQUIREMENTS)
778 777
779 778 rmap.connect('edit_repo_remote', '/{repo_name}/settings/remote',
780 779 controller='admin/repos', action='edit_remote_form',
781 780 conditions={'method': ['GET'], 'function': check_repo},
782 781 requirements=URL_NAME_REQUIREMENTS)
783 782 rmap.connect('edit_repo_remote', '/{repo_name}/settings/remote',
784 783 controller='admin/repos', action='edit_remote',
785 784 conditions={'method': ['PUT'], 'function': check_repo},
786 785 requirements=URL_NAME_REQUIREMENTS)
787 786
788 787 rmap.connect('edit_repo_statistics', '/{repo_name}/settings/statistics',
789 788 controller='admin/repos', action='edit_statistics_form',
790 789 conditions={'method': ['GET'], 'function': check_repo},
791 790 requirements=URL_NAME_REQUIREMENTS)
792 791 rmap.connect('edit_repo_statistics', '/{repo_name}/settings/statistics',
793 792 controller='admin/repos', action='edit_statistics',
794 793 conditions={'method': ['PUT'], 'function': check_repo},
795 794 requirements=URL_NAME_REQUIREMENTS)
796 795 rmap.connect('repo_settings_issuetracker',
797 796 '/{repo_name}/settings/issue-tracker',
798 797 controller='admin/repos', action='repo_issuetracker',
799 798 conditions={'method': ['GET'], 'function': check_repo},
800 799 requirements=URL_NAME_REQUIREMENTS)
801 800 rmap.connect('repo_issuetracker_test',
802 801 '/{repo_name}/settings/issue-tracker/test',
803 802 controller='admin/repos', action='repo_issuetracker_test',
804 803 conditions={'method': ['POST'], 'function': check_repo},
805 804 requirements=URL_NAME_REQUIREMENTS)
806 805 rmap.connect('repo_issuetracker_delete',
807 806 '/{repo_name}/settings/issue-tracker/delete',
808 807 controller='admin/repos', action='repo_issuetracker_delete',
809 808 conditions={'method': ['DELETE'], 'function': check_repo},
810 809 requirements=URL_NAME_REQUIREMENTS)
811 810 rmap.connect('repo_issuetracker_save',
812 811 '/{repo_name}/settings/issue-tracker/save',
813 812 controller='admin/repos', action='repo_issuetracker_save',
814 813 conditions={'method': ['POST'], 'function': check_repo},
815 814 requirements=URL_NAME_REQUIREMENTS)
816 815 rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs',
817 816 controller='admin/repos', action='repo_settings_vcs_update',
818 817 conditions={'method': ['POST'], 'function': check_repo},
819 818 requirements=URL_NAME_REQUIREMENTS)
820 819 rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs',
821 820 controller='admin/repos', action='repo_settings_vcs',
822 821 conditions={'method': ['GET'], 'function': check_repo},
823 822 requirements=URL_NAME_REQUIREMENTS)
824 823 rmap.connect('repo_vcs_settings', '/{repo_name}/settings/vcs',
825 824 controller='admin/repos', action='repo_delete_svn_pattern',
826 825 conditions={'method': ['DELETE'], 'function': check_repo},
827 826 requirements=URL_NAME_REQUIREMENTS)
828 827
829 828 # still-working URL kept for backward compatibility
830 829 rmap.connect('raw_changeset_home_depraced',
831 830 '/{repo_name}/raw-changeset/{revision}',
832 831 controller='changeset', action='changeset_raw',
833 832 revision='tip', conditions={'function': check_repo},
834 833 requirements=URL_NAME_REQUIREMENTS)
835 834
836 835 # new URLs
837 836 rmap.connect('changeset_raw_home',
838 837 '/{repo_name}/changeset-diff/{revision}',
839 838 controller='changeset', action='changeset_raw',
840 839 revision='tip', conditions={'function': check_repo},
841 840 requirements=URL_NAME_REQUIREMENTS)
842 841
843 842 rmap.connect('changeset_patch_home',
844 843 '/{repo_name}/changeset-patch/{revision}',
845 844 controller='changeset', action='changeset_patch',
846 845 revision='tip', conditions={'function': check_repo},
847 846 requirements=URL_NAME_REQUIREMENTS)
848 847
849 848 rmap.connect('changeset_download_home',
850 849 '/{repo_name}/changeset-download/{revision}',
851 850 controller='changeset', action='changeset_download',
852 851 revision='tip', conditions={'function': check_repo},
853 852 requirements=URL_NAME_REQUIREMENTS)
854 853
855 854 rmap.connect('changeset_comment',
856 855 '/{repo_name}/changeset/{revision}/comment', jsroute=True,
857 856 controller='changeset', revision='tip', action='comment',
858 857 conditions={'function': check_repo},
859 858 requirements=URL_NAME_REQUIREMENTS)
860 859
861 860 rmap.connect('changeset_comment_preview',
862 861 '/{repo_name}/changeset/comment/preview', jsroute=True,
863 862 controller='changeset', action='preview_comment',
864 863 conditions={'function': check_repo, 'method': ['POST']},
865 864 requirements=URL_NAME_REQUIREMENTS)
866 865
867 866 rmap.connect('changeset_comment_delete',
868 867 '/{repo_name}/changeset/comment/{comment_id}/delete',
869 868 controller='changeset', action='delete_comment',
870 869 conditions={'function': check_repo, 'method': ['DELETE']},
871 870 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
872 871
873 872 rmap.connect('changeset_info', '/{repo_name}/changeset_info/{revision}',
874 873 controller='changeset', action='changeset_info',
875 874 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
876 875
877 876 rmap.connect('compare_home',
878 877 '/{repo_name}/compare',
879 878 controller='compare', action='index',
880 879 conditions={'function': check_repo},
881 880 requirements=URL_NAME_REQUIREMENTS)
882 881
883 882 rmap.connect('compare_url',
884 883 '/{repo_name}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}',
885 884 controller='compare', action='compare',
886 885 conditions={'function': check_repo},
887 886 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
888 887
889 888 rmap.connect('pullrequest_home',
890 889 '/{repo_name}/pull-request/new', controller='pullrequests',
891 890 action='index', conditions={'function': check_repo,
892 891 'method': ['GET']},
893 892 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
894 893
895 894 rmap.connect('pullrequest',
896 895 '/{repo_name}/pull-request/new', controller='pullrequests',
897 896 action='create', conditions={'function': check_repo,
898 897 'method': ['POST']},
899 898 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
900 899
901 900 rmap.connect('pullrequest_repo_refs',
902 901 '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}',
903 902 controller='pullrequests',
904 903 action='get_repo_refs',
905 904 conditions={'function': check_repo, 'method': ['GET']},
906 905 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
907 906
908 907 rmap.connect('pullrequest_repo_destinations',
909 908 '/{repo_name}/pull-request/repo-destinations',
910 909 controller='pullrequests',
911 910 action='get_repo_destinations',
912 911 conditions={'function': check_repo, 'method': ['GET']},
913 912 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
914 913
915 914 rmap.connect('pullrequest_show',
916 915 '/{repo_name}/pull-request/{pull_request_id}',
917 916 controller='pullrequests',
918 917 action='show', conditions={'function': check_repo,
919 918 'method': ['GET']},
920 919 requirements=URL_NAME_REQUIREMENTS)
921 920
922 921 rmap.connect('pullrequest_update',
923 922 '/{repo_name}/pull-request/{pull_request_id}',
924 923 controller='pullrequests',
925 924 action='update', conditions={'function': check_repo,
926 925 'method': ['PUT']},
927 926 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
928 927
929 928 rmap.connect('pullrequest_merge',
930 929 '/{repo_name}/pull-request/{pull_request_id}',
931 930 controller='pullrequests',
932 931 action='merge', conditions={'function': check_repo,
933 932 'method': ['POST']},
934 933 requirements=URL_NAME_REQUIREMENTS)
935 934
936 935 rmap.connect('pullrequest_delete',
937 936 '/{repo_name}/pull-request/{pull_request_id}',
938 937 controller='pullrequests',
939 938 action='delete', conditions={'function': check_repo,
940 939 'method': ['DELETE']},
941 940 requirements=URL_NAME_REQUIREMENTS)
942 941
943 942 rmap.connect('pullrequest_show_all',
944 943 '/{repo_name}/pull-request',
945 944 controller='pullrequests',
946 945 action='show_all', conditions={'function': check_repo,
947 946 'method': ['GET']},
948 947 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
949 948
950 949 rmap.connect('pullrequest_comment',
951 950 '/{repo_name}/pull-request-comment/{pull_request_id}',
952 951 controller='pullrequests',
953 952 action='comment', conditions={'function': check_repo,
954 953 'method': ['POST']},
955 954 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
956 955
957 956 rmap.connect('pullrequest_comment_delete',
958 957 '/{repo_name}/pull-request-comment/{comment_id}/delete',
959 958 controller='pullrequests', action='delete_comment',
960 959 conditions={'function': check_repo, 'method': ['DELETE']},
961 960 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
962 961
963 962 rmap.connect('summary_home_explicit', '/{repo_name}/summary',
964 963 controller='summary', conditions={'function': check_repo},
965 964 requirements=URL_NAME_REQUIREMENTS)
966 965
967 966 rmap.connect('branches_home', '/{repo_name}/branches',
968 967 controller='branches', conditions={'function': check_repo},
969 968 requirements=URL_NAME_REQUIREMENTS)
970 969
971 970 rmap.connect('tags_home', '/{repo_name}/tags',
972 971 controller='tags', conditions={'function': check_repo},
973 972 requirements=URL_NAME_REQUIREMENTS)
974 973
975 974 rmap.connect('bookmarks_home', '/{repo_name}/bookmarks',
976 975 controller='bookmarks', conditions={'function': check_repo},
977 976 requirements=URL_NAME_REQUIREMENTS)
978 977
979 978 rmap.connect('changelog_home', '/{repo_name}/changelog', jsroute=True,
980 979 controller='changelog', conditions={'function': check_repo},
981 980 requirements=URL_NAME_REQUIREMENTS)
982 981
983 982 rmap.connect('changelog_summary_home', '/{repo_name}/changelog_summary',
984 983 controller='changelog', action='changelog_summary',
985 984 conditions={'function': check_repo},
986 985 requirements=URL_NAME_REQUIREMENTS)
987 986
988 987 rmap.connect('changelog_file_home',
989 988 '/{repo_name}/changelog/{revision}/{f_path}',
990 989 controller='changelog', f_path=None,
991 990 conditions={'function': check_repo},
992 991 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
993 992
994 993 rmap.connect('changelog_details', '/{repo_name}/changelog_details/{cs}',
995 994 controller='changelog', action='changelog_details',
996 995 conditions={'function': check_repo},
997 996 requirements=URL_NAME_REQUIREMENTS)
998 997
999 998 rmap.connect('files_home', '/{repo_name}/files/{revision}/{f_path}',
1000 999 controller='files', revision='tip', f_path='',
1001 1000 conditions={'function': check_repo},
1002 1001 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1003 1002
1004 1003 rmap.connect('files_home_simple_catchrev',
1005 1004 '/{repo_name}/files/{revision}',
1006 1005 controller='files', revision='tip', f_path='',
1007 1006 conditions={'function': check_repo},
1008 1007 requirements=URL_NAME_REQUIREMENTS)
1009 1008
1010 1009 rmap.connect('files_home_simple_catchall',
1011 1010 '/{repo_name}/files',
1012 1011 controller='files', revision='tip', f_path='',
1013 1012 conditions={'function': check_repo},
1014 1013 requirements=URL_NAME_REQUIREMENTS)
1015 1014
1016 1015 rmap.connect('files_history_home',
1017 1016 '/{repo_name}/history/{revision}/{f_path}',
1018 1017 controller='files', action='history', revision='tip', f_path='',
1019 1018 conditions={'function': check_repo},
1020 1019 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1021 1020
1022 1021 rmap.connect('files_authors_home',
1023 1022 '/{repo_name}/authors/{revision}/{f_path}',
1024 1023 controller='files', action='authors', revision='tip', f_path='',
1025 1024 conditions={'function': check_repo},
1026 1025 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1027 1026
1028 1027 rmap.connect('files_diff_home', '/{repo_name}/diff/{f_path}',
1029 1028 controller='files', action='diff', f_path='',
1030 1029 conditions={'function': check_repo},
1031 1030 requirements=URL_NAME_REQUIREMENTS)
1032 1031
1033 1032 rmap.connect('files_diff_2way_home',
1034 1033 '/{repo_name}/diff-2way/{f_path}',
1035 1034 controller='files', action='diff_2way', f_path='',
1036 1035 conditions={'function': check_repo},
1037 1036 requirements=URL_NAME_REQUIREMENTS)
1038 1037
1039 1038 rmap.connect('files_rawfile_home',
1040 1039 '/{repo_name}/rawfile/{revision}/{f_path}',
1041 1040 controller='files', action='rawfile', revision='tip',
1042 1041 f_path='', conditions={'function': check_repo},
1043 1042 requirements=URL_NAME_REQUIREMENTS)
1044 1043
1045 1044 rmap.connect('files_raw_home',
1046 1045 '/{repo_name}/raw/{revision}/{f_path}',
1047 1046 controller='files', action='raw', revision='tip', f_path='',
1048 1047 conditions={'function': check_repo},
1049 1048 requirements=URL_NAME_REQUIREMENTS)
1050 1049
1051 1050 rmap.connect('files_render_home',
1052 1051 '/{repo_name}/render/{revision}/{f_path}',
1053 1052 controller='files', action='index', revision='tip', f_path='',
1054 1053 rendered=True, conditions={'function': check_repo},
1055 1054 requirements=URL_NAME_REQUIREMENTS)
1056 1055
1057 1056 rmap.connect('files_annotate_home',
1058 1057 '/{repo_name}/annotate/{revision}/{f_path}',
1059 1058 controller='files', action='index', revision='tip',
1060 1059 f_path='', annotate=True, conditions={'function': check_repo},
1061 1060 requirements=URL_NAME_REQUIREMENTS)
1062 1061
1063 1062 rmap.connect('files_edit',
1064 1063 '/{repo_name}/edit/{revision}/{f_path}',
1065 1064 controller='files', action='edit', revision='tip',
1066 1065 f_path='',
1067 1066 conditions={'function': check_repo, 'method': ['POST']},
1068 1067 requirements=URL_NAME_REQUIREMENTS)
1069 1068
1070 1069 rmap.connect('files_edit_home',
1071 1070 '/{repo_name}/edit/{revision}/{f_path}',
1072 1071 controller='files', action='edit_home', revision='tip',
1073 1072 f_path='', conditions={'function': check_repo},
1074 1073 requirements=URL_NAME_REQUIREMENTS)
1075 1074
1076 1075 rmap.connect('files_add',
1077 1076 '/{repo_name}/add/{revision}/{f_path}',
1078 1077 controller='files', action='add', revision='tip',
1079 1078 f_path='',
1080 1079 conditions={'function': check_repo, 'method': ['POST']},
1081 1080 requirements=URL_NAME_REQUIREMENTS)
1082 1081
1083 1082 rmap.connect('files_add_home',
1084 1083 '/{repo_name}/add/{revision}/{f_path}',
1085 1084 controller='files', action='add_home', revision='tip',
1086 1085 f_path='', conditions={'function': check_repo},
1087 1086 requirements=URL_NAME_REQUIREMENTS)
1088 1087
1089 1088 rmap.connect('files_delete',
1090 1089 '/{repo_name}/delete/{revision}/{f_path}',
1091 1090 controller='files', action='delete', revision='tip',
1092 1091 f_path='',
1093 1092 conditions={'function': check_repo, 'method': ['POST']},
1094 1093 requirements=URL_NAME_REQUIREMENTS)
1095 1094
1096 1095 rmap.connect('files_delete_home',
1097 1096 '/{repo_name}/delete/{revision}/{f_path}',
1098 1097 controller='files', action='delete_home', revision='tip',
1099 1098 f_path='', conditions={'function': check_repo},
1100 1099 requirements=URL_NAME_REQUIREMENTS)
1101 1100
1102 1101 rmap.connect('files_archive_home', '/{repo_name}/archive/{fname}',
1103 1102 controller='files', action='archivefile',
1104 1103 conditions={'function': check_repo},
1105 1104 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1106 1105
1107 1106 rmap.connect('files_nodelist_home',
1108 1107 '/{repo_name}/nodelist/{revision}/{f_path}',
1109 1108 controller='files', action='nodelist',
1110 1109 conditions={'function': check_repo},
1111 1110 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1112 1111
1113 1112 rmap.connect('files_nodetree_full',
1114 1113 '/{repo_name}/nodetree_full/{commit_id}/{f_path}',
1115 1114 controller='files', action='nodetree_full',
1116 1115 conditions={'function': check_repo},
1117 1116 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
1118 1117
1119 1118 rmap.connect('repo_fork_create_home', '/{repo_name}/fork',
1120 1119 controller='forks', action='fork_create',
1121 1120 conditions={'function': check_repo, 'method': ['POST']},
1122 1121 requirements=URL_NAME_REQUIREMENTS)
1123 1122
1124 1123 rmap.connect('repo_fork_home', '/{repo_name}/fork',
1125 1124 controller='forks', action='fork',
1126 1125 conditions={'function': check_repo},
1127 1126 requirements=URL_NAME_REQUIREMENTS)
1128 1127
1129 1128 rmap.connect('repo_forks_home', '/{repo_name}/forks',
1130 1129 controller='forks', action='forks',
1131 1130 conditions={'function': check_repo},
1132 1131 requirements=URL_NAME_REQUIREMENTS)
1133 1132
1134 1133 rmap.connect('repo_followers_home', '/{repo_name}/followers',
1135 1134 controller='followers', action='followers',
1136 1135 conditions={'function': check_repo},
1137 1136 requirements=URL_NAME_REQUIREMENTS)
1138 1137
1139 1138 # must be here for proper group/repo catching pattern
1140 1139 _connect_with_slash(
1141 1140 rmap, 'repo_group_home', '/{group_name}',
1142 1141 controller='home', action='index_repo_group',
1143 1142 conditions={'function': check_group},
1144 1143 requirements=URL_NAME_REQUIREMENTS)
1145 1144
1146 1145 # catch all, at the end
1147 1146 _connect_with_slash(
1148 1147 rmap, 'summary_home', '/{repo_name}', jsroute=True,
1149 1148 controller='summary', action='index',
1150 1149 conditions={'function': check_repo},
1151 1150 requirements=URL_NAME_REQUIREMENTS)
1152 1151
1153 1152 return rmap
1154 1153
1155 1154
1156 1155 def _connect_with_slash(mapper, name, path, *args, **kwargs):
1157 1156 """
1158 1157 Connect a route with an optional trailing slash in `path`.
1159 1158 """
1160 1159 mapper.connect(name + '_slash', path + '/', *args, **kwargs)
1161 1160 mapper.connect(name, path, *args, **kwargs)
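
As an aside on the helper above: the following is a minimal, self-contained sketch of why `_connect_with_slash` registers two routes, assuming only the standard `routes.Mapper` API; the `conditions`/`requirements` keyword arguments used in the real configuration are omitted, and the controller/action values here are illustrative. Routes matches URLs literally, so the trailing-slash and plain forms each need their own entry.

from routes import Mapper

def _connect_with_slash(mapper, name, path, *args, **kwargs):
    # Register both 'path/' and 'path' so either form resolves to the same action.
    mapper.connect(name + '_slash', path + '/', *args, **kwargs)
    mapper.connect(name, path, *args, **kwargs)

rmap = Mapper()
# Mirrors the catch-all summary route above; values are illustrative only.
_connect_with_slash(rmap, 'summary_home', '/{repo_name}',
                    controller='summary', action='index')

print(rmap.match('/some-repo'))   # matched via 'summary_home'
print(rmap.match('/some-repo/'))  # matched via 'summary_home_slash', same controller/action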
(Diffs for the remaining files in this merge were truncated by the viewer; most are recorded only as modified or removed. One rename was recorded: rhodecode/templates/admin/integrations/edit.html to rhodecode/templates/admin/integrations/form.html.)