Technical Debt and it Types Datasets
Collection
24 items
•
Updated
•
1
Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
17,991 | 24,010,646,572 | IssuesEvent | 2022-09-14 18:29:17 | googleapis/repo-automation-bots | https://api.github.com/repos/googleapis/repo-automation-bots | closed | migrate policy bot to yargs | type: process priority: p2 | It becomes difficult to catch up with meow updates.
Related:
https://github.com/googleapis/repo-automation-bots/pull/4372#issuecomment-1244453649 | 1.0 | migrate policy bot to yargs - It becomes difficult to catch up with meow updates.
Related:
https://github.com/googleapis/repo-automation-bots/pull/4372#issuecomment-1244453649 | process | migrate policy bot to yargs it becomes difficult to catch up with meow updates related | 1 |
9,624 | 12,562,174,286 | IssuesEvent | 2020-06-08 03:24:37 | pingcap/tidb | https://api.github.com/repos/pingcap/tidb | opened | Add unit test for Corp Cache in TiDB side | component/coprocessor type/enhancement | ## Development Task
Currently, Corp Cache has no related unit test in TiDB side.
We may need to implement Corp Cache protocol in mocktikv or unistore then write the unit test for it. | 1.0 | Add unit test for Corp Cache in TiDB side - ## Development Task
Currently, Corp Cache has no related unit test in TiDB side.
We may need to implement Corp Cache protocol in mocktikv or unistore then write the unit test for it. | process | add unit test for corp cache in tidb side development task currently corp cache has no related unit test in tidb side we may need to implement corp cache protocol in mocktikv or unistore then write the unit test for it | 1 |
1,093 | 3,561,210,242 | IssuesEvent | 2016-01-23 17:01:29 | csscomb/csscomb.js | https://api.github.com/repos/csscomb/csscomb.js | closed | Sass: sorting bug with mixins with content | bug gonzales preprocessors | csscomb 3.0.0-5, note how include breakpoint was moved and content left in original place:
Input file
```
html, body {
height: 100%;
}
body {
@include font-size(13);
}
.center-box {
margin-left: auto;
margin-right: auto;
max-width: $page-width;
.lt-ie9 & {
width: $page-width;
}
@include breakpoint(max-width ($page-width - 1)) {
margin-left: 15px;
margin-right: 15px;
}
}
#main-box {
min-height: 100%;
}
/* Header
------------------------------------ */
#header {
}
/* Content
------------------------------------ */
#content-box {
}
/* Footer
------------------------------------ */
$footerHeight: 100px;
#padder {
padding: 0 0 $footerHeight;
height: 0;
overflow: hidden;
}
#footer {
margin-top: -$footerHeight;
height: $footerHeight;
}
```
Output
```
html, body {
height: 100%;
}
body {
@include font-size(13);
}
.center-box {
@include breakpoint;
margin-right: auto;
margin-left: auto;
max-width: $page-width;
.lt-ie9 & {
width: $page-width;
}(max-width ($page-width - 1)) {
margin-right: 15px;
margin-left: 15px;
}
}
#main-box {
min-height: 100%;
}
/* Header
------------------------------------ */
#header {
}
/* Content
------------------------------------ */
#content-box {
}
/* Footer
------------------------------------ */
$footerHeight: 100px;
#padder {
padding: 0 0 $footerHeight;
height: 0;
overflow: hidden;
}
#footer {
margin-top: -$footerHeight;
height: $footerHeight;
}
```
Config:
```
{
"exclude": [
".git/**",
".hg/**",
"node_modules/**"
],
"always-semicolon": true,
"color-case": "lower",
"block-indent": "\t",
"color-shorthand": true,
"element-case": "lower",
"leading-zero": true,
"quotes": "single",
"space-before-colon": "",
"space-after-colon": " ",
"space-before-combinator": " ",
"space-after-combinator": " ",
"space-between-declarations": "\n",
"space-before-opening-brace": " ",
"space-after-opening-brace": "\n",
"space-before-selector-delimiter": "",
"space-before-closing-brace": "\n",
"strip-spaces": true,
"unitless-zero": true,
"sort-order-fallback": "abc",
"sort-order": [
[
"$variables",
"$include"
],
[
"content",
"position",
"z-index",
"top",
"right",
"bottom",
"left",
"margin",
"margin-top",
"margin-right",
"margin-bottom",
"margin-left",
"border",
"border-collapse",
"border-width",
"border-style",
"border-color",
"border-top",
"border-top-width",
"border-top-style",
"border-top-color",
"border-right",
"border-right-width",
"border-right-style",
"border-right-color",
"border-bottom",
"border-bottom-width",
"border-bottom-style",
"border-bottom-color",
"border-left",
"border-left-width",
"border-left-style",
"border-left-color",
"padding",
"padding-top",
"padding-right",
"padding-bottom",
"padding-left",
"-webkit-box-sizing",
"-moz-box-sizing",
"box-sizing",
"width",
"min-width",
"max-width",
"height",
"min-height",
"max-height",
"display",
"visibility",
"float",
"clear",
"overflow",
"overflow-x",
"overflow-y",
"-ms-overflow-x",
"-ms-overflow-y",
"-webkit-overflow-scrolling",
"clip",
"zoom",
"flex-direction",
"flex-order",
"flex-pack",
"flex-align",
"table-layout",
"empty-cells",
"caption-side",
"border-spacing",
"border-collapse",
"list-style",
"list-style-position",
"list-style-type",
"list-style-image",
"background",
"filter:progid:DXImageTransform.Microsoft.AlphaImageLoader",
"background-color",
"background-image",
"background-repeat",
"background-attachment",
"background-position",
"background-position-x",
"-ms-background-position-x",
"background-position-y",
"-ms-background-position-y",
"-webkit-background-clip",
"-moz-background-clip",
"background-clip",
"background-origin",
"-webkit-background-size",
"-moz-background-size",
"-o-background-size",
"background-size",
"color",
"font",
"font-family",
"font-size",
"font-weight",
"font-style",
"font-variant",
"font-size-adjust",
"font-stretch",
"font-effect",
"font-emphasize",
"font-emphasize-position",
"font-emphasize-style",
"font-smooth",
"line-height",
"text-align",
"-webkit-text-align-last",
"-moz-text-align-last",
"-ms-text-align-last",
"text-align-last",
"vertical-align",
"white-space",
"text-decoration",
"text-emphasis",
"text-emphasis-color",
"text-emphasis-style",
"text-emphasis-position",
"text-indent",
"-ms-text-justify",
"text-justify",
"text-transform",
"letter-spacing",
"word-spacing",
"-ms-writing-mode",
"text-outline",
"text-transform",
"text-wrap",
"text-overflow",
"-ms-text-overflow",
"text-overflow-ellipsis",
"text-overflow-mode",
"-ms-word-wrap",
"word-wrap",
"word-break",
"-ms-word-break",
"-moz-tab-size",
"-o-tab-size",
"tab-size",
"-webkit-hyphens",
"-moz-hyphens",
"hyphens",
"quotes",
"counter-reset",
"counter-increment",
"resize",
"cursor",
"pointer-events",
"-webkit-user-select",
"-moz-user-select",
"-ms-user-select",
"user-select",
"nav-index",
"nav-up",
"nav-right",
"nav-down",
"nav-left",
"opacity",
"filter:progid:DXImageTransform.Microsoft.Alpha(Opacity",
"-ms-filter:\\'progid:DXImageTransform.Microsoft.Alpha",
"-ms-interpolation-mode",
"-webkit-border-radius",
"-moz-border-radius",
"border-radius",
"-webkit-border-top-left-radius",
"-moz-border-radius-topleft",
"border-top-left-radius",
"-webkit-border-top-right-radius",
"-moz-border-radius-topright",
"border-top-right-radius",
"-webkit-border-bottom-right-radius",
"-moz-border-radius-bottomright",
"border-bottom-right-radius",
"-webkit-border-bottom-left-radius",
"-moz-border-radius-bottomleft",
"border-bottom-left-radius",
"-webkit-border-image",
"-moz-border-image",
"-o-border-image",
"border-image",
"-webkit-border-image-source",
"-moz-border-image-source",
"-o-border-image-source",
"border-image-source",
"-webkit-border-image-slice",
"-moz-border-image-slice",
"-o-border-image-slice",
"border-image-slice",
"-webkit-border-image-width",
"-moz-border-image-width",
"-o-border-image-width",
"border-image-width",
"-webkit-border-image-outset",
"-moz-border-image-outset",
"-o-border-image-outset",
"border-image-outset",
"-webkit-border-image-repeat",
"-moz-border-image-repeat",
"-o-border-image-repeat",
"border-image-repeat",
"outline",
"outline-width",
"outline-style",
"outline-color",
"outline-offset",
"box-decoration-break",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"filter:progid:DXImageTransform.Microsoft.gradient",
"-ms-filter:\\'progid:DXImageTransform.Microsoft.gradient",
"text-shadow",
"-webkit-transition",
"-moz-transition",
"-ms-transition",
"-o-transition",
"transition",
"-webkit-transition-delay",
"-moz-transition-delay",
"-ms-transition-delay",
"-o-transition-delay",
"transition-delay",
"-webkit-transition-timing-function",
"-moz-transition-timing-function",
"-ms-transition-timing-function",
"-o-transition-timing-function",
"transition-timing-function",
"-webkit-transition-duration",
"-moz-transition-duration",
"-ms-transition-duration",
"-o-transition-duration",
"transition-duration",
"-webkit-transition-property",
"-moz-transition-property",
"-ms-transition-property",
"-o-transition-property",
"transition-property",
"-webkit-transform",
"-moz-transform",
"-ms-transform",
"-o-transform",
"transform",
"-webkit-transform-origin",
"-moz-transform-origin",
"-ms-transform-origin",
"-o-transform-origin",
"transform-origin",
"-webkit-animation",
"-moz-animation",
"-ms-animation",
"-o-animation",
"animation",
"-webkit-animation-name",
"-moz-animation-name",
"-ms-animation-name",
"-o-animation-name",
"animation-name",
"-webkit-animation-duration",
"-moz-animation-duration",
"-ms-animation-duration",
"-o-animation-duration",
"animation-duration",
"-webkit-animation-play-state",
"-moz-animation-play-state",
"-ms-animation-play-state",
"-o-animation-play-state",
"animation-play-state",
"-webkit-animation-timing-function",
"-moz-animation-timing-function",
"-ms-animation-timing-function",
"-o-animation-timing-function",
"animation-timing-function",
"-webkit-animation-delay",
"-moz-animation-delay",
"-ms-animation-delay",
"-o-animation-delay",
"animation-delay",
"-webkit-animation-iteration-count",
"-moz-animation-iteration-count",
"-ms-animation-iteration-count",
"-o-animation-iteration-count",
"animation-iteration-count",
"-webkit-animation-iteration-count",
"-moz-animation-iteration-count",
"-ms-animation-iteration-count",
"-o-animation-iteration-count",
"animation-iteration-count",
"-webkit-animation-direction",
"-moz-animation-direction",
"-ms-animation-direction",
"-o-animation-direction",
"animation-direction"
]
]
}
```
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/3321043-sass-sorting-bug-with-mixins-with-content?utm_campaign=plugin&utm_content=tracker%2F214563&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F214563&utm_medium=issues&utm_source=github).
</bountysource-plugin> | 1.0 | Sass: sorting bug with mixins with content - csscomb 3.0.0-5, note how include breakpoint was moved and content left in original place:
Input file
```
html, body {
height: 100%;
}
body {
@include font-size(13);
}
.center-box {
margin-left: auto;
margin-right: auto;
max-width: $page-width;
.lt-ie9 & {
width: $page-width;
}
@include breakpoint(max-width ($page-width - 1)) {
margin-left: 15px;
margin-right: 15px;
}
}
#main-box {
min-height: 100%;
}
/* Header
------------------------------------ */
#header {
}
/* Content
------------------------------------ */
#content-box {
}
/* Footer
------------------------------------ */
$footerHeight: 100px;
#padder {
padding: 0 0 $footerHeight;
height: 0;
overflow: hidden;
}
#footer {
margin-top: -$footerHeight;
height: $footerHeight;
}
```
Output
```
html, body {
height: 100%;
}
body {
@include font-size(13);
}
.center-box {
@include breakpoint;
margin-right: auto;
margin-left: auto;
max-width: $page-width;
.lt-ie9 & {
width: $page-width;
}(max-width ($page-width - 1)) {
margin-right: 15px;
margin-left: 15px;
}
}
#main-box {
min-height: 100%;
}
/* Header
------------------------------------ */
#header {
}
/* Content
------------------------------------ */
#content-box {
}
/* Footer
------------------------------------ */
$footerHeight: 100px;
#padder {
padding: 0 0 $footerHeight;
height: 0;
overflow: hidden;
}
#footer {
margin-top: -$footerHeight;
height: $footerHeight;
}
```
Config:
```
{
"exclude": [
".git/**",
".hg/**",
"node_modules/**"
],
"always-semicolon": true,
"color-case": "lower",
"block-indent": "\t",
"color-shorthand": true,
"element-case": "lower",
"leading-zero": true,
"quotes": "single",
"space-before-colon": "",
"space-after-colon": " ",
"space-before-combinator": " ",
"space-after-combinator": " ",
"space-between-declarations": "\n",
"space-before-opening-brace": " ",
"space-after-opening-brace": "\n",
"space-before-selector-delimiter": "",
"space-before-closing-brace": "\n",
"strip-spaces": true,
"unitless-zero": true,
"sort-order-fallback": "abc",
"sort-order": [
[
"$variables",
"$include"
],
[
"content",
"position",
"z-index",
"top",
"right",
"bottom",
"left",
"margin",
"margin-top",
"margin-right",
"margin-bottom",
"margin-left",
"border",
"border-collapse",
"border-width",
"border-style",
"border-color",
"border-top",
"border-top-width",
"border-top-style",
"border-top-color",
"border-right",
"border-right-width",
"border-right-style",
"border-right-color",
"border-bottom",
"border-bottom-width",
"border-bottom-style",
"border-bottom-color",
"border-left",
"border-left-width",
"border-left-style",
"border-left-color",
"padding",
"padding-top",
"padding-right",
"padding-bottom",
"padding-left",
"-webkit-box-sizing",
"-moz-box-sizing",
"box-sizing",
"width",
"min-width",
"max-width",
"height",
"min-height",
"max-height",
"display",
"visibility",
"float",
"clear",
"overflow",
"overflow-x",
"overflow-y",
"-ms-overflow-x",
"-ms-overflow-y",
"-webkit-overflow-scrolling",
"clip",
"zoom",
"flex-direction",
"flex-order",
"flex-pack",
"flex-align",
"table-layout",
"empty-cells",
"caption-side",
"border-spacing",
"border-collapse",
"list-style",
"list-style-position",
"list-style-type",
"list-style-image",
"background",
"filter:progid:DXImageTransform.Microsoft.AlphaImageLoader",
"background-color",
"background-image",
"background-repeat",
"background-attachment",
"background-position",
"background-position-x",
"-ms-background-position-x",
"background-position-y",
"-ms-background-position-y",
"-webkit-background-clip",
"-moz-background-clip",
"background-clip",
"background-origin",
"-webkit-background-size",
"-moz-background-size",
"-o-background-size",
"background-size",
"color",
"font",
"font-family",
"font-size",
"font-weight",
"font-style",
"font-variant",
"font-size-adjust",
"font-stretch",
"font-effect",
"font-emphasize",
"font-emphasize-position",
"font-emphasize-style",
"font-smooth",
"line-height",
"text-align",
"-webkit-text-align-last",
"-moz-text-align-last",
"-ms-text-align-last",
"text-align-last",
"vertical-align",
"white-space",
"text-decoration",
"text-emphasis",
"text-emphasis-color",
"text-emphasis-style",
"text-emphasis-position",
"text-indent",
"-ms-text-justify",
"text-justify",
"text-transform",
"letter-spacing",
"word-spacing",
"-ms-writing-mode",
"text-outline",
"text-transform",
"text-wrap",
"text-overflow",
"-ms-text-overflow",
"text-overflow-ellipsis",
"text-overflow-mode",
"-ms-word-wrap",
"word-wrap",
"word-break",
"-ms-word-break",
"-moz-tab-size",
"-o-tab-size",
"tab-size",
"-webkit-hyphens",
"-moz-hyphens",
"hyphens",
"quotes",
"counter-reset",
"counter-increment",
"resize",
"cursor",
"pointer-events",
"-webkit-user-select",
"-moz-user-select",
"-ms-user-select",
"user-select",
"nav-index",
"nav-up",
"nav-right",
"nav-down",
"nav-left",
"opacity",
"filter:progid:DXImageTransform.Microsoft.Alpha(Opacity",
"-ms-filter:\\'progid:DXImageTransform.Microsoft.Alpha",
"-ms-interpolation-mode",
"-webkit-border-radius",
"-moz-border-radius",
"border-radius",
"-webkit-border-top-left-radius",
"-moz-border-radius-topleft",
"border-top-left-radius",
"-webkit-border-top-right-radius",
"-moz-border-radius-topright",
"border-top-right-radius",
"-webkit-border-bottom-right-radius",
"-moz-border-radius-bottomright",
"border-bottom-right-radius",
"-webkit-border-bottom-left-radius",
"-moz-border-radius-bottomleft",
"border-bottom-left-radius",
"-webkit-border-image",
"-moz-border-image",
"-o-border-image",
"border-image",
"-webkit-border-image-source",
"-moz-border-image-source",
"-o-border-image-source",
"border-image-source",
"-webkit-border-image-slice",
"-moz-border-image-slice",
"-o-border-image-slice",
"border-image-slice",
"-webkit-border-image-width",
"-moz-border-image-width",
"-o-border-image-width",
"border-image-width",
"-webkit-border-image-outset",
"-moz-border-image-outset",
"-o-border-image-outset",
"border-image-outset",
"-webkit-border-image-repeat",
"-moz-border-image-repeat",
"-o-border-image-repeat",
"border-image-repeat",
"outline",
"outline-width",
"outline-style",
"outline-color",
"outline-offset",
"box-decoration-break",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"-webkit-box-shadow",
"-moz-box-shadow",
"box-shadow",
"filter:progid:DXImageTransform.Microsoft.gradient",
"-ms-filter:\\'progid:DXImageTransform.Microsoft.gradient",
"text-shadow",
"-webkit-transition",
"-moz-transition",
"-ms-transition",
"-o-transition",
"transition",
"-webkit-transition-delay",
"-moz-transition-delay",
"-ms-transition-delay",
"-o-transition-delay",
"transition-delay",
"-webkit-transition-timing-function",
"-moz-transition-timing-function",
"-ms-transition-timing-function",
"-o-transition-timing-function",
"transition-timing-function",
"-webkit-transition-duration",
"-moz-transition-duration",
"-ms-transition-duration",
"-o-transition-duration",
"transition-duration",
"-webkit-transition-property",
"-moz-transition-property",
"-ms-transition-property",
"-o-transition-property",
"transition-property",
"-webkit-transform",
"-moz-transform",
"-ms-transform",
"-o-transform",
"transform",
"-webkit-transform-origin",
"-moz-transform-origin",
"-ms-transform-origin",
"-o-transform-origin",
"transform-origin",
"-webkit-animation",
"-moz-animation",
"-ms-animation",
"-o-animation",
"animation",
"-webkit-animation-name",
"-moz-animation-name",
"-ms-animation-name",
"-o-animation-name",
"animation-name",
"-webkit-animation-duration",
"-moz-animation-duration",
"-ms-animation-duration",
"-o-animation-duration",
"animation-duration",
"-webkit-animation-play-state",
"-moz-animation-play-state",
"-ms-animation-play-state",
"-o-animation-play-state",
"animation-play-state",
"-webkit-animation-timing-function",
"-moz-animation-timing-function",
"-ms-animation-timing-function",
"-o-animation-timing-function",
"animation-timing-function",
"-webkit-animation-delay",
"-moz-animation-delay",
"-ms-animation-delay",
"-o-animation-delay",
"animation-delay",
"-webkit-animation-iteration-count",
"-moz-animation-iteration-count",
"-ms-animation-iteration-count",
"-o-animation-iteration-count",
"animation-iteration-count",
"-webkit-animation-iteration-count",
"-moz-animation-iteration-count",
"-ms-animation-iteration-count",
"-o-animation-iteration-count",
"animation-iteration-count",
"-webkit-animation-direction",
"-moz-animation-direction",
"-ms-animation-direction",
"-o-animation-direction",
"animation-direction"
]
]
}
```
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/3321043-sass-sorting-bug-with-mixins-with-content?utm_campaign=plugin&utm_content=tracker%2F214563&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F214563&utm_medium=issues&utm_source=github).
</bountysource-plugin> | process | sass sorting bug with mixins with content csscomb note how include breakpoint was moved and content left in original place input file html body height body include font size center box margin left auto margin right auto max width page width lt width page width include breakpoint max width page width margin left margin right main box min height header header content content box footer footerheight padder padding footerheight height overflow hidden footer margin top footerheight height footerheight output html body height body include font size center box include breakpoint margin right auto margin left auto max width page width lt width page width max width page width margin right margin left main box min height header header content content box footer footerheight padder padding footerheight height overflow hidden footer margin top footerheight height footerheight config exclude git hg node modules always semicolon true color case lower block indent t color shorthand true element case lower leading zero true quotes single space before colon space after colon space before combinator space after combinator space between declarations n space before opening brace space after opening brace n space before selector delimiter space before closing brace n strip spaces true unitless zero true sort order fallback abc sort order variables include content position z index top right bottom left margin margin top margin right margin bottom margin left border border collapse border width border style border color border top border top width border top style border top color border right border right width border right style border right color border bottom border bottom width border bottom style border bottom color border left border left width border left style border left color padding padding top padding right padding bottom padding left webkit box sizing moz box sizing box sizing width min width max width height min height max height display visibility float clear overflow overflow x overflow y ms overflow x ms overflow y webkit overflow scrolling clip zoom flex direction flex order flex pack flex align table layout empty cells caption side border spacing border collapse list style list style position list style type list style image background filter progid dximagetransform microsoft alphaimageloader background color background image background repeat background attachment background position background position x ms background position x background position y ms background position y webkit background clip moz background clip background clip background origin webkit background size moz background size o background size background size color font font family font size font weight font style font variant font size adjust font stretch font effect font emphasize font emphasize position font emphasize style font smooth line height text align webkit text align last moz text align last ms text align last text align last vertical align white space text decoration text emphasis text emphasis color text emphasis style text emphasis position text indent ms text justify text justify text transform letter spacing word spacing ms writing mode text outline text transform text wrap text overflow ms text overflow text overflow ellipsis text overflow mode ms word wrap word wrap word break ms word break moz tab size o tab size tab size webkit hyphens moz hyphens hyphens quotes counter reset counter increment resize cursor pointer events webkit user select moz user select ms user select user select nav index nav up nav right nav down nav left opacity filter progid dximagetransform microsoft alpha opacity ms filter progid dximagetransform microsoft alpha ms interpolation mode webkit border radius moz border radius border radius webkit border top left radius moz border radius topleft border top left radius webkit border top right radius moz border radius topright border top right radius webkit border bottom right radius moz border radius bottomright border bottom right radius webkit border bottom left radius moz border radius bottomleft border bottom left radius webkit border image moz border image o border image border image webkit border image source moz border image source o border image source border image source webkit border image slice moz border image slice o border image slice border image slice webkit border image width moz border image width o border image width border image width webkit border image outset moz border image outset o border image outset border image outset webkit border image repeat moz border image repeat o border image repeat border image repeat outline outline width outline style outline color outline offset box decoration break webkit box shadow moz box shadow box shadow webkit box shadow moz box shadow box shadow webkit box shadow moz box shadow box shadow webkit box shadow moz box shadow box shadow filter progid dximagetransform microsoft gradient ms filter progid dximagetransform microsoft gradient text shadow webkit transition moz transition ms transition o transition transition webkit transition delay moz transition delay ms transition delay o transition delay transition delay webkit transition timing function moz transition timing function ms transition timing function o transition timing function transition timing function webkit transition duration moz transition duration ms transition duration o transition duration transition duration webkit transition property moz transition property ms transition property o transition property transition property webkit transform moz transform ms transform o transform transform webkit transform origin moz transform origin ms transform origin o transform origin transform origin webkit animation moz animation ms animation o animation animation webkit animation name moz animation name ms animation name o animation name animation name webkit animation duration moz animation duration ms animation duration o animation duration animation duration webkit animation play state moz animation play state ms animation play state o animation play state animation play state webkit animation timing function moz animation timing function ms animation timing function o animation timing function animation timing function webkit animation delay moz animation delay ms animation delay o animation delay animation delay webkit animation iteration count moz animation iteration count ms animation iteration count o animation iteration count animation iteration count webkit animation iteration count moz animation iteration count ms animation iteration count o animation iteration count animation iteration count webkit animation direction moz animation direction ms animation direction o animation direction animation direction want to back this issue we accept bounties via | 1 |
18,686 | 24,594,945,554 | IssuesEvent | 2022-10-14 07:29:17 | GoogleCloudPlatform/fda-mystudies | https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies | closed | [DID] The below entered name is not getting de-identified | Bug P1 Response datastore Process: Fixed Process: Tested dev | 1. The below entered name is not getting de-identified
'My name is Naga'
2. Locations not getting redacted

| 2.0 | [DID] The below entered name is not getting de-identified - 1. The below entered name is not getting de-identified
'My name is Naga'
2. Locations not getting redacted

| process | the below entered name is not getting de identified the below entered name is not getting de identified my name is naga locations not getting redacted | 1 |
14,589 | 17,703,525,472 | IssuesEvent | 2021-08-25 03:12:34 | tdwg/dwc | https://api.github.com/repos/tdwg/dwc | closed | Change term: footprintSRS | Term - change Class - Location normative Process - complete | ## Change term
* Term identifier (URL of the term to change): http://rs.tdwg.org/dwc/terms/#footprintSRS
* Justification (why is this change necessary?): currently footprintSRS must be provided using the Well-Known Text (WKT) representation of the Spatial Reference System (SRS) for the footprintWKT. This representation is very long (e.g. ESRI WKT for EPSG:28992), thus making it more prone to being written with error compared to writing the shorter EPSG code. For example, publishers might try to provide the Human-Readable OGC WKT using newline characters which can break tabular data. Furthermore, confusion can arise when deciding to use the OGC WKT or ESRI WKT representation. For all these reasons, allowing the SRS for the footprintWKT to also be provided using the EPSG code (assuming it is present in http://epsg.io) will make it easier for publishers to fill in footprintSRS, and for users of the data to understand it.
* Submitter: Kyle Braak
I suggest the following changes (leave blank whatever would not change):
* Term name (in lowerCamelCase): footprintSRS
* Class (e.g. Location, Taxon): Location
* Definition of the term: **The ellipsoid, geodetic datum, or spatial reference system (SRS) upon which the geometry given in footprintWKT is based.**
* Usage comments (recommendations regarding content, etc.): **Recommended best practice is to use the EPSG code of the SRS, if known. Otherwise use a controlled vocabulary for the name or code of the geodetic datum, if known. Otherwise use a controlled vocabulary for the name or code of the ellipsoid, if known. If none of these is known, use the value `unknown`. It is also permitted to provide the SRS in Well-Known-Text, especially if no EPSG code provides the necessary values for the attributes of the SRS. Do not use this term to describe the SRS of the decimalLatitude and decimalLongitude, nor of any verbatim coordinates - use the geodeticDatum and verbatimSRS instead.**
* Examples: **`epsg:4326`**, `GEOGCS["GCS_WGS_1984", DATUM["D_WGS_1984", SPHEROID["WGS_1984",6378137,298.257223563]], PRIMEM["Greenwich",0], UNIT["Degree",0.0174532925199433]]` (WKT for the standard WGS84 Spatial Reference System EPSG:4326)
* Refines (identifier of the broader term this term refines, if applicable):
* Replaces (identifier of the existing term that would be deprecated and replaced by this term, if applicable): http://rs.tdwg.org/dwc/terms/version/footprintSRS-2018-09-06
* ABCD 2.06 (XPATH of the equivalent term in ABCD, if applicable): not in ABCD
Original comment:
**Term name**: [footprintSRS](http://rs.tdwg.org/dwc/terms/#footprintSRS)
**Term change recommendation**: to allow footprintSRS _to also be_ provided using the EPSG code as a controlled vocabulary (e.g. "EPSG:4326").
**Term change justification**: currently footprintSRS must be provided using the Well-Known Text (WKT) representation of the Spatial Reference System (SRS) for the footprintWKT. This representation is very long (e.g. [ESRI WKT for EPSG:28992](http://spatialreference.org/ref/epsg/amersfoort-rd-new/esriwkt/)), thus making it more prone to being written with error compared to writing the shorter EPSG code. For example, publishers might try to provide the [Human-Readable OGC WKT](http://spatialreference.org/ref/epsg/wgs-84/prettywkt/) using newline characters which can break tabular data. Furthermore, confusion can arise when deciding to use the [OGC WKT](http://spatialreference.org/ref/epsg/wgs-84/ogcwkt/) or [ESRI WKT](http://spatialreference.org/ref/epsg/wgs-84/esriwkt/) representation. For all these reasons, allowing the SRS for the footprintWKT to also be provided using the EPSG code (assuming it is present in http://epsg.io) will make it easier for publishers to fill in footprintSRS, and for users of the data to understand it.
| 1.0 | Change term: footprintSRS - ## Change term
* Term identifier (URL of the term to change): http://rs.tdwg.org/dwc/terms/#footprintSRS
* Justification (why is this change necessary?): currently footprintSRS must be provided using the Well-Known Text (WKT) representation of the Spatial Reference System (SRS) for the footprintWKT. This representation is very long (e.g. ESRI WKT for EPSG:28992), thus making it more prone to being written with error compared to writing the shorter EPSG code. For example, publishers might try to provide the Human-Readable OGC WKT using newline characters which can break tabular data. Furthermore, confusion can arise when deciding to use the OGC WKT or ESRI WKT representation. For all these reasons, allowing the SRS for the footprintWKT to also be provided using the EPSG code (assuming it is present in http://epsg.io) will make it easier for publishers to fill in footprintSRS, and for users of the data to understand it.
* Submitter: Kyle Braak
I suggest the following changes (leave blank whatever would not change):
* Term name (in lowerCamelCase): footprintSRS
* Class (e.g. Location, Taxon): Location
* Definition of the term: **The ellipsoid, geodetic datum, or spatial reference system (SRS) upon which the geometry given in footprintWKT is based.**
* Usage comments (recommendations regarding content, etc.): **Recommended best practice is to use the EPSG code of the SRS, if known. Otherwise use a controlled vocabulary for the name or code of the geodetic datum, if known. Otherwise use a controlled vocabulary for the name or code of the ellipsoid, if known. If none of these is known, use the value `unknown`. It is also permitted to provide the SRS in Well-Known-Text, especially if no EPSG code provides the necessary values for the attributes of the SRS. Do not use this term to describe the SRS of the decimalLatitude and decimalLongitude, nor of any verbatim coordinates - use the geodeticDatum and verbatimSRS instead.**
* Examples: **`epsg:4326`**, `GEOGCS["GCS_WGS_1984", DATUM["D_WGS_1984", SPHEROID["WGS_1984",6378137,298.257223563]], PRIMEM["Greenwich",0], UNIT["Degree",0.0174532925199433]]` (WKT for the standard WGS84 Spatial Reference System EPSG:4326)
* Refines (identifier of the broader term this term refines, if applicable):
* Replaces (identifier of the existing term that would be deprecated and replaced by this term, if applicable): http://rs.tdwg.org/dwc/terms/version/footprintSRS-2018-09-06
* ABCD 2.06 (XPATH of the equivalent term in ABCD, if applicable): not in ABCD
Original comment:
**Term name**: [footprintSRS](http://rs.tdwg.org/dwc/terms/#footprintSRS)
**Term change recommendation**: to allow footprintSRS _to also be_ provided using the EPSG code as a controlled vocabulary (e.g. "EPSG:4326").
**Term change justification**: currently footprintSRS must be provided using the Well-Known Text (WKT) representation of the Spatial Reference System (SRS) for the footprintWKT. This representation is very long (e.g. [ESRI WKT for EPSG:28992](http://spatialreference.org/ref/epsg/amersfoort-rd-new/esriwkt/)), thus making it more prone to being written with error compared to writing the shorter EPSG code. For example, publishers might try to provide the [Human-Readable OGC WKT](http://spatialreference.org/ref/epsg/wgs-84/prettywkt/) using newline characters which can break tabular data. Furthermore, confusion can arise when deciding to use the [OGC WKT](http://spatialreference.org/ref/epsg/wgs-84/ogcwkt/) or [ESRI WKT](http://spatialreference.org/ref/epsg/wgs-84/esriwkt/) representation. For all these reasons, allowing the SRS for the footprintWKT to also be provided using the EPSG code (assuming it is present in http://epsg.io) will make it easier for publishers to fill in footprintSRS, and for users of the data to understand it.
| process | change term footprintsrs change term term identifier url of the term to change justification why is this change necessary currently footprintsrs must be provided using the well known text wkt representation of the spatial reference system srs for the footprintwkt this representation is very long e g esri wkt for epsg thus making it more prone to being written with error compared to writing the shorter epsg code for example publishers might try to provide the human readable ogc wkt using newline characters which can break tabular data furthermore confusion can arise when deciding to use the ogc wkt or esri wkt representation for all these reasons allowing the srs for the footprintwkt to also be provided using the epsg code assuming it is present in will make it easier for publishers to fill in footprintsrs and for users of the data to understand it submitter kyle braak i suggest the following changes leave blank whatever would not change term name in lowercamelcase footprintsrs class e g location taxon location definition of the term the ellipsoid geodetic datum or spatial reference system srs upon which the geometry given in footprintwkt is based usage comments recommendations regarding content etc recommended best practice is to use the epsg code of the srs if known otherwise use a controlled vocabulary for the name or code of the geodetic datum if known otherwise use a controlled vocabulary for the name or code of the ellipsoid if known if none of these is known use the value unknown it is also permitted to provide the srs in well known text especially if no epsg code provides the necessary values for the attributes of the srs do not use this term to describe the srs of the decimallatitude and decimallongitude nor of any verbatim coordinates use the geodeticdatum and verbatimsrs instead examples epsg geogcs primem unit wkt for the standard spatial reference system epsg refines identifier of the broader term this term refines if applicable replaces identifier of the existing term that would be deprecated and replaced by this term if applicable abcd xpath of the equivalent term in abcd if applicable not in abcd original comment term name term change recommendation to allow footprintsrs to also be provided using the epsg code as a controlled vocabulary e g epsg term change justification currently footprintsrs must be provided using the well known text wkt representation of the spatial reference system srs for the footprintwkt this representation is very long e g thus making it more prone to being written with error compared to writing the shorter epsg code for example publishers might try to provide the using newline characters which can break tabular data furthermore confusion can arise when deciding to use the or representation for all these reasons allowing the srs for the footprintwkt to also be provided using the epsg code assuming it is present in will make it easier for publishers to fill in footprintsrs and for users of the data to understand it | 1 |
1,802 | 3,123,069,268 | IssuesEvent | 2015-09-07 02:32:04 | orientechnologies/orientdb | https://api.github.com/repos/orientechnologies/orientdb | closed | OrientDB takes 10x more time to run on graph mode | in progress performance waiting reply | I am trying to run our load tests on a 2.1.1 cluster of 3 nodes. The operations / group of operations that usually takes 1.5 sec in standalone mode, is now taking 15sec + in clustered mode.
Is this normal? | True | OrientDB takes 10x more time to run on graph mode - I am trying to run our load tests on a 2.1.1 cluster of 3 nodes. The operations / group of operations that usually takes 1.5 sec in standalone mode, is now taking 15sec + in clustered mode.
Is this normal? | non_process | orientdb takes more time to run on graph mode i am trying to run our load tests on a cluster of nodes the operations group of operations that usually takes sec in standalone mode is now taking in clustered mode is this normal | 0 |
791,017 | 27,846,794,151 | IssuesEvent | 2023-03-20 15:58:28 | bounswe/bounswe2023group1 | https://api.github.com/repos/bounswe/bounswe2023group1 | closed | Standardization of Meeting Notes 6 | Priority/Low Type/Wiki Effort/Low State/Assigned | The action items table of meeting notes 6 is not in the standard form. | 1.0 | Standardization of Meeting Notes 6 - The action items table of meeting notes 6 is not in the standard form. | non_process | standardization of meeting notes the action items table of meeting notes is not in the standard form | 0 |
2,944 | 5,923,237,816 | IssuesEvent | 2017-05-23 07:21:39 | orbardugo/Hahot-Hameshulash | https://api.github.com/repos/orbardugo/Hahot-Hameshulash | closed | Create ZFR Wiki page | in process | ## Checklist:
- [x] Version Control
- [x] Create prototype #6
- [x] [User Manual](https://github.com/orbardugo/Hahot-Hameshulash/wiki/user-manual)
- [x] [Readme - for new development](https://github.com/orbardugo/Hahot-Hameshulash/blob/master/README.md#development)
- [x] [Create new labels](https://github.com/orbardugo/Hahot-Hameshulash/labels)
- [x] Create schedule tasks - new project
- [x] Next Iteration's planning
| 1.0 | Create ZFR Wiki page - ## Checklist:
- [x] Version Control
- [x] Create prototype #6
- [x] [User Manual](https://github.com/orbardugo/Hahot-Hameshulash/wiki/user-manual)
- [x] [Readme - for new development](https://github.com/orbardugo/Hahot-Hameshulash/blob/master/README.md#development)
- [x] [Create new labels](https://github.com/orbardugo/Hahot-Hameshulash/labels)
- [x] Create schedule tasks - new project
- [x] Next Iteration's planning
| process | create zfr wiki page checklist version control create prototype create schedule tasks new project next iteration s planning | 1 |
18,231 | 24,297,915,658 | IssuesEvent | 2022-09-29 11:40:19 | saibrotech/mentoria | https://api.github.com/repos/saibrotech/mentoria | closed | Fazer processo seletivo Santander Code 2022 | processo seletivo | https://letscode.com.br/processos-seletivos/santander-coders
Etapas
- [x] Realizar inscrição para "Web Full Stack"
- [x] Fazer curso online - 10/09 à 18/09
- [ ] Fazer teste de Lógica - 21/09
- [ ] Participar de Dinâmica com especialistas - 26/09 à 30/09
- [ ] Realizar coding Tank - 11/10 à 21/10
- [ ] Verificar resultado - 26/10 | 1.0 | Fazer processo seletivo Santander Code 2022 - https://letscode.com.br/processos-seletivos/santander-coders
Etapas
- [x] Realizar inscrição para "Web Full Stack"
- [x] Fazer curso online - 10/09 à 18/09
- [ ] Fazer teste de Lógica - 21/09
- [ ] Participar de Dinâmica com especialistas - 26/09 à 30/09
- [ ] Realizar coding Tank - 11/10 à 21/10
- [ ] Verificar resultado - 26/10 | process | fazer processo seletivo santander code etapas realizar inscrição para web full stack fazer curso online à fazer teste de lógica participar de dinâmica com especialistas à realizar coding tank à verificar resultado | 1 |
8,012 | 4,128,258,870 | IssuesEvent | 2016-06-10 04:52:35 | openshift/origin | https://api.github.com/repos/openshift/origin | closed | New build strategy to trigger external CI (Jenkins) | component/build kind/enhancement priority/P2 | In order to support things like the fabric8 CD pipelines, we'd like to add a new build strategy that triggers external CI via a webhook. This build strategy would be configured just with a URL & optionally a secret to use to trigger the build in an external CI server.
This would also involve adding the creation of a Jenkins job via `oc new-app` if the source repository contains the standard `Jenkinsfile` used for pipeline configuration.
/cc @jstrachan @rawlingsj | 1.0 | New build strategy to trigger external CI (Jenkins) - In order to support things like the fabric8 CD pipelines, we'd like to add a new build strategy that triggers external CI via a webhook. This build strategy would be configured just with a URL & optionally a secret to use to trigger the build in an external CI server.
This would also involve adding the creation of a Jenkins job via `oc new-app` if the source repository contains the standard `Jenkinsfile` used for pipeline configuration.
/cc @jstrachan @rawlingsj | non_process | new build strategy to trigger external ci jenkins in order to support things like the cd pipelines we d like to add a new build strategy that triggers external ci via a webhook this build strategy would be configured just with a url optionally a secret to use to trigger the build in an external ci server this would also involve adding the creation of a jenkins job via oc new app if the source repository contains the standard jenkinsfile used for pipeline configuration cc jstrachan rawlingsj | 0 |
18,353 | 24,480,289,074 | IssuesEvent | 2022-10-08 18:39:40 | RobertCraigie/prisma-client-py | https://api.github.com/repos/RobertCraigie/prisma-client-py | closed | Schema copying error when generating to the root directory | bug/2-confirmed kind/bug process/candidate priority/high level/unknown topic: generation | <!--
Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output.
See https://prisma-client-py.readthedocs.io/en/stable/reference/logging/ for how to enable additional logging output.
-->
## Bug description
<!-- A clear and concise description of what the bug is. -->
https://discord.com/channels/933860922039099444/933860923117043718/1027579126703468596
## How to reproduce
<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
TODO
## Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
This should not crash.
## Environment & setup
<!-- In which environment does the problem occur -->
- OS: <!--[e.g. Mac OS, Windows, Debian, CentOS, ...]--> MacOS
- Database: <!--[PostgreSQL, MySQL, MariaDB or SQLite]--> SQLite
- Python version: <!--[Run `python -V` to see your Python version]--> 3.9
| 1.0 | Schema copying error when generating to the root directory - <!--
Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output.
See https://prisma-client-py.readthedocs.io/en/stable/reference/logging/ for how to enable additional logging output.
-->
## Bug description
<!-- A clear and concise description of what the bug is. -->
https://discord.com/channels/933860922039099444/933860923117043718/1027579126703468596
## How to reproduce
<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
TODO
## Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
This should not crash.
## Environment & setup
<!-- In which environment does the problem occur -->
- OS: <!--[e.g. Mac OS, Windows, Debian, CentOS, ...]--> MacOS
- Database: <!--[PostgreSQL, MySQL, MariaDB or SQLite]--> SQLite
- Python version: <!--[Run `python -V` to see your Python version]--> 3.9
| process | schema copying error when generating to the root directory thanks for helping us improve prisma client python 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by enabling additional logging output see for how to enable additional logging output bug description how to reproduce steps to reproduce the behavior go to change run see error todo expected behavior this should not crash environment setup os macos database sqlite python version | 1 |
12,788 | 15,167,585,579 | IssuesEvent | 2021-02-12 18:01:16 | wordpress-mobile/gutenberg-mobile | https://api.github.com/repos/wordpress-mobile/gutenberg-mobile | closed | Test plan for 16.7 Beta version of WordPress Android app | release-process | Due to the recent issue regarding the merged [PR](https://github.com/wordpress-mobile/WordPress-Android/pull/13699) into `WordPress-Android` repository, we have to execute a specific test plan to ensure that it didn't break anything.
**This plan will be executed only on Android.**
- [x] [Writing flow tests](https://github.com/wordpress-mobile/test-cases/tree/trunk/test-cases/gutenberg/writing-flow) - @fluiddot
- [x] Random tests from the [sanity check test suites](https://github.com/wordpress-mobile/test-cases/blob/trunk/test-suites/gutenberg/sanity-tests.md) (preferably one test from each section to cover a wide range) @ceyhun
- [x] Smoke test the editor flow in general by doing: @fluiddot
- Entering exiting the editor
- Rotate the device in various cases
- Open and dismiss the bottom sheet
- Background the app and foreground it again | 1.0 | Test plan for 16.7 Beta version of WordPress Android app - Due to the recent issue regarding the merged [PR](https://github.com/wordpress-mobile/WordPress-Android/pull/13699) into `WordPress-Android` repository, we have to execute a specific test plan to ensure that it didn't break anything.
**This plan will be executed only on Android.**
- [x] [Writing flow tests](https://github.com/wordpress-mobile/test-cases/tree/trunk/test-cases/gutenberg/writing-flow) - @fluiddot
- [x] Random tests from the [sanity check test suites](https://github.com/wordpress-mobile/test-cases/blob/trunk/test-suites/gutenberg/sanity-tests.md) (preferably one test from each section to cover a wide range) @ceyhun
- [x] Smoke test the editor flow in general by doing: @fluiddot
- Entering exiting the editor
- Rotate the device in various cases
- Open and dismiss the bottom sheet
- Background the app and foreground it again | process | test plan for beta version of wordpress android app due to the recent issue regarding the merged into wordpress android repository we have to execute a specific test plan to ensure that it didn t break anything this plan will be executed only on android fluiddot random tests from the preferably one test from each section to cover a wide range ceyhun smoke test the editor flow in general by doing fluiddot entering exiting the editor rotate the device in various cases open and dismiss the bottom sheet background the app and foreground it again | 1 |
64,699 | 6,917,612,577 | IssuesEvent | 2017-11-29 09:13:08 | eclipse/californium | https://api.github.com/repos/eclipse/californium | closed | Can I run "cf-secure" on Android? | bug retest - validate PR | I want to run "cf-secure example" by running Android local server. Is it possible to change "JKS" to "BKS"? | 1.0 | Can I run "cf-secure" on Android? - I want to run "cf-secure example" by running Android local server. Is it possible to change "JKS" to "BKS"? | non_process | can i run cf secure on android i want to run cf secure example by running android local server is it possible to change jks to bks | 0 |
3,327 | 6,445,428,179 | IssuesEvent | 2017-08-13 04:38:21 | nodejs/node | https://api.github.com/repos/nodejs/node | closed | invalid floating point uid or gid for spawn/execSync causes uv to assert and abort node | child_process confirmed-bug v4.x v6.x v7.x | * **Version**: 0.12 to v8.0.0-pre
```
> child_process.spawnSync("cat", {uid: 3.5})
node: ../deps/uv/src/unix/core.c:166: uv_close: Assertion `0' failed.
zsh: abort (core dumped) ./node
```
Also
```
> child_process.execSync("date", {uid: 3.5})
node: ../deps/uv/src/unix/core.c:161: uv_close: Assertion `0' failed.
zsh: abort (core dumped)
% ./node --version
v8.0.0-pre
```
EDIT: git aborts, too | 1.0 | invalid floating point uid or gid for spawn/execSync causes uv to assert and abort node - * **Version**: 0.12 to v8.0.0-pre
```
> child_process.spawnSync("cat", {uid: 3.5})
node: ../deps/uv/src/unix/core.c:166: uv_close: Assertion `0' failed.
zsh: abort (core dumped) ./node
```
Also
```
> child_process.execSync("date", {uid: 3.5})
node: ../deps/uv/src/unix/core.c:161: uv_close: Assertion `0' failed.
zsh: abort (core dumped)
% ./node --version
v8.0.0-pre
```
EDIT: git aborts, too | process | invalid floating point uid or gid for spawn execsync causes uv to assert and abort node version to pre child process spawnsync cat uid node deps uv src unix core c uv close assertion failed zsh abort core dumped node also child process execsync date uid node deps uv src unix core c uv close assertion failed zsh abort core dumped node version pre edit git aborts too | 1 |
696,126 | 23,885,740,719 | IssuesEvent | 2022-09-08 07:31:10 | kubernetes/website | https://api.github.com/repos/kubernetes/website | closed | Grammar problem with Kubernetes Components concept | priority/awaiting-more-evidence lifecycle/stale language/en needs-triage | The subtopic for [Addons](https://kubernetes.io/docs/concepts/overview/components/#addons) has a possible grammatical error. The second sentence starts with 'Because'.
A likely fix would be to replace 'Because' with 'As'.
| 1.0 | Grammar problem with Kubernetes Components concept - The subtopic for [Addons](https://kubernetes.io/docs/concepts/overview/components/#addons) has a possible grammatical error. The second sentence starts with 'Because'.
A likely fix would be to replace 'Because' with 'As'.
| non_process | grammar problem with kubernetes components concept the subtopic for has a possible grammatical error the second sentence starts with because a likely fix would be to replace because with as | 0 |
4,617 | 7,461,450,392 | IssuesEvent | 2018-03-31 03:04:24 | dotnet/corefx | https://api.github.com/repos/dotnet/corefx | closed | NETFX x86 Release Build not running a set of tests | area-System.Diagnostics.Process test bug | Can't workout which ones, but open any recent PR and in the logs will be
```
xUnit.net Console Runner (64-bit Desktop .NET 4.0.30319.42000)
Copyright (C) .NET Foundation.
usage: xunit.console <assemblyFile> [configFile] [assemblyFile [configFile]...] [options] [reporter] [resultFormat filename [...]]
Note: Configuration files must end in .json (for JSON) or .config (for XML)
Valid options:
-nologo : do not show the copyright message
-nocolor : do not output results with colors
-noappdomain : do not use app domains to run test code
-failskips : convert skipped tests into failures
-parallel option : set parallelization based on option
: none - turn off all parallelization
: collections - only parallelize collections
: assemblies - only parallelize assemblies
: all - parallelize assemblies & collections
-maxthreads count : maximum thread count for collection parallelization
: default - run with default (1 thread per CPU thread)
: unlimited - run with unbounded thread count
: (number) - limit task thread pool size to 'count'
-noshadow : do not shadow copy assemblies
-wait : wait for input after completion
-diagnostics : enable diagnostics messages for all test assemblies
-internaldiagnostics : enable internal diagnostics messages for all test assemblies
-debug : launch the debugger to debug the tests
-serialize : serialize all test cases (for diagnostic purposes only)
-trait "name=value" : only run tests with matching name/value traits
: if specified more than once, acts as an OR operation
-notrait "name=value" : do not run tests with matching name/value traits
: if specified more than once, acts as an AND operation
-method "name" : run a given test method (should be fully specified;
: i.e., 'MyNamespace.MyClass.MyTestMethod')
: if specified more than once, acts as an OR operation
xUnit.net Console Runner (64-bit Desktop .NET 4.0.30319.42000)
-class "name" : run all methods in a given test class (should be fully
: specified; i.e., 'MyNamespace.MyClass')
: if specified more than once, acts as an OR operation
-namespace "name" : run all methods in a given namespace (i.e.,
: 'MyNamespace.MySubNamespace')
: if specified more than once, acts as an OR operation
-noautoreporters : do not allow reporters to be auto-enabled by environment
: (for example, auto-detecting TeamCity or AppVeyor)
Result formats: (optional, choose one or more)
-xml <filename> : output results to xUnit.net v2 XML file
-xmlv1 <filename> : output results to xUnit.net v1 XML file
-html <filename> : output results to HTML file
-nunit <filename> : output results to NUnit v2.5 XML file
```
So its outputting the usage rather than running | 1.0 | NETFX x86 Release Build not running a set of tests - Can't workout which ones, but open any recent PR and in the logs will be
```
xUnit.net Console Runner (64-bit Desktop .NET 4.0.30319.42000)
Copyright (C) .NET Foundation.
usage: xunit.console <assemblyFile> [configFile] [assemblyFile [configFile]...] [options] [reporter] [resultFormat filename [...]]
Note: Configuration files must end in .json (for JSON) or .config (for XML)
Valid options:
-nologo : do not show the copyright message
-nocolor : do not output results with colors
-noappdomain : do not use app domains to run test code
-failskips : convert skipped tests into failures
-parallel option : set parallelization based on option
: none - turn off all parallelization
: collections - only parallelize collections
: assemblies - only parallelize assemblies
: all - parallelize assemblies & collections
-maxthreads count : maximum thread count for collection parallelization
: default - run with default (1 thread per CPU thread)
: unlimited - run with unbounded thread count
: (number) - limit task thread pool size to 'count'
-noshadow : do not shadow copy assemblies
-wait : wait for input after completion
-diagnostics : enable diagnostics messages for all test assemblies
-internaldiagnostics : enable internal diagnostics messages for all test assemblies
-debug : launch the debugger to debug the tests
-serialize : serialize all test cases (for diagnostic purposes only)
-trait "name=value" : only run tests with matching name/value traits
: if specified more than once, acts as an OR operation
-notrait "name=value" : do not run tests with matching name/value traits
: if specified more than once, acts as an AND operation
-method "name" : run a given test method (should be fully specified;
: i.e., 'MyNamespace.MyClass.MyTestMethod')
: if specified more than once, acts as an OR operation
xUnit.net Console Runner (64-bit Desktop .NET 4.0.30319.42000)
-class "name" : run all methods in a given test class (should be fully
: specified; i.e., 'MyNamespace.MyClass')
: if specified more than once, acts as an OR operation
-namespace "name" : run all methods in a given namespace (i.e.,
: 'MyNamespace.MySubNamespace')
: if specified more than once, acts as an OR operation
-noautoreporters : do not allow reporters to be auto-enabled by environment
: (for example, auto-detecting TeamCity or AppVeyor)
Result formats: (optional, choose one or more)
-xml <filename> : output results to xUnit.net v2 XML file
-xmlv1 <filename> : output results to xUnit.net v1 XML file
-html <filename> : output results to HTML file
-nunit <filename> : output results to NUnit v2.5 XML file
```
So its outputting the usage rather than running | process | netfx release build not running a set of tests can t workout which ones but open any recent pr and in the logs will be xunit net console runner bit desktop net copyright c net foundation usage xunit console note configuration files must end in json for json or config for xml valid options nologo do not show the copyright message nocolor do not output results with colors noappdomain do not use app domains to run test code failskips convert skipped tests into failures parallel option set parallelization based on option none turn off all parallelization collections only parallelize collections assemblies only parallelize assemblies all parallelize assemblies collections maxthreads count maximum thread count for collection parallelization default run with default thread per cpu thread unlimited run with unbounded thread count number limit task thread pool size to count noshadow do not shadow copy assemblies wait wait for input after completion diagnostics enable diagnostics messages for all test assemblies internaldiagnostics enable internal diagnostics messages for all test assemblies debug launch the debugger to debug the tests serialize serialize all test cases for diagnostic purposes only trait name value only run tests with matching name value traits if specified more than once acts as an or operation notrait name value do not run tests with matching name value traits if specified more than once acts as an and operation method name run a given test method should be fully specified i e mynamespace myclass mytestmethod if specified more than once acts as an or operation xunit net console runner bit desktop net class name run all methods in a given test class should be fully specified i e mynamespace myclass if specified more than once acts as an or operation namespace name run all methods in a given namespace i e mynamespace mysubnamespace if specified more than once acts as an or operation noautoreporters do not allow reporters to be auto enabled by environment for example auto detecting teamcity or appveyor result formats optional choose one or more xml output results to xunit net xml file output results to xunit net xml file html output results to html file nunit output results to nunit xml file so its outputting the usage rather than running | 1 |
7,538 | 10,617,478,765 | IssuesEvent | 2019-10-12 19:23:12 | cetic/tsorage | https://api.github.com/repos/cetic/tsorage | closed | Add user and/or token as observation tags | enhancement processing | In a typical use case, a token is used in order to authenticate a submitted message containing new observations.
In addition to check whether the message should be accepted or refused,
~~1. The token could be added as one of the dynamic tag associated with the observations belonging to the message.~~
2. The user associated with the token could be added as one of the dynamic tag associated with the observations belonging to the message.
For instance, if a message is submitted with the tagset {"status": "ok"}, then, the tagset should be altered in order to become {"status": "ok", "user_id": "mgoeminne"}
If such tag names are already mentioned in the original tagset, their values are replaced in order to reflect the actual token / user ids.
The append / update of the token / user ids should be determined by the configuration file of the interface system. | 1.0 | Add user and/or token as observation tags - In a typical use case, a token is used in order to authenticate a submitted message containing new observations.
In addition to check whether the message should be accepted or refused,
~~1. The token could be added as one of the dynamic tag associated with the observations belonging to the message.~~
2. The user associated with the token could be added as one of the dynamic tag associated with the observations belonging to the message.
For instance, if a message is submitted with the tagset {"status": "ok"}, then, the tagset should be altered in order to become {"status": "ok", "user_id": "mgoeminne"}
If such tag names are already mentioned in the original tagset, their values are replaced in order to reflect the actual token / user ids.
The append / update of the token / user ids should be determined by the configuration file of the interface system. | process | add user and or token as observation tags in a typical use case a token is used in order to authenticate a submitted message containing new observations in addition to check whether the message should be accepted or refused the token could be added as one of the dynamic tag associated with the observations belonging to the message the user associated with the token could be added as one of the dynamic tag associated with the observations belonging to the message for instance if a message is submitted with the tagset status ok then the tagset should be altered in order to become status ok user id mgoeminne if such tag names are already mentioned in the original tagset their values are replaced in order to reflect the actual token user ids the append update of the token user ids should be determined by the configuration file of the interface system | 1 |
4,966 | 7,806,219,383 | IssuesEvent | 2018-06-11 13:29:10 | cptechinc/soft-dpluso | https://api.github.com/repos/cptechinc/soft-dpluso | closed | Customer Template | PHP Processwire | Break Customer Template and Break it into 3
1. Customer
2. Cust Index
3. Customer Add | 1.0 | Customer Template - Break Customer Template and Break it into 3
1. Customer
2. Cust Index
3. Customer Add | process | customer template break customer template and break it into customer cust index customer add | 1 |
14,009 | 16,814,685,329 | IssuesEvent | 2021-06-17 05:32:01 | e4exp/paper_manager_abstract | https://api.github.com/repos/e4exp/paper_manager_abstract | opened | GroupBERT: Enhanced Transformer Architecture with Efficient Grouped Structures | 2021 BERT Efficient Natural Language Processing | - https://arxiv.org/abs/2106.05822
- 2021
注意に基づく言語モデルは、最先端の自然言語処理システムにおいて重要な要素となっている。
しかし、これらのモデルは、長い学習時間、高密度の演算、膨大なパラメータ数のために、大きな計算量を必要とします。
本研究では、Transformer層の構造にいくつかの変更を加え、より効率的なアーキテクチャを実現しました。
まず、自己注意モジュールを補完するために、畳み込みモジュールを追加し、局所的な相互作用と大域的な相互作用の学習を分離します。
次に、グループ化された変換により、モデルの表現力を維持しつつ、密なフィードフォワード層と畳み込みの計算コストを削減します。
結果として得られたアーキテクチャを言語表現学習に適用し、様々な規模のBERTモデルと比較して、その優れた性能を実証しました。
さらに、浮動小数点演算(FLOPs)と学習時間の両方の観点から、効率性が向上していることを明らかにした。
| 1.0 | GroupBERT: Enhanced Transformer Architecture with Efficient Grouped Structures - - https://arxiv.org/abs/2106.05822
- 2021
注意に基づく言語モデルは、最先端の自然言語処理システムにおいて重要な要素となっている。
しかし、これらのモデルは、長い学習時間、高密度の演算、膨大なパラメータ数のために、大きな計算量を必要とします。
本研究では、Transformer層の構造にいくつかの変更を加え、より効率的なアーキテクチャを実現しました。
まず、自己注意モジュールを補完するために、畳み込みモジュールを追加し、局所的な相互作用と大域的な相互作用の学習を分離します。
次に、グループ化された変換により、モデルの表現力を維持しつつ、密なフィードフォワード層と畳み込みの計算コストを削減します。
結果として得られたアーキテクチャを言語表現学習に適用し、様々な規模のBERTモデルと比較して、その優れた性能を実証しました。
さらに、浮動小数点演算(FLOPs)と学習時間の両方の観点から、効率性が向上していることを明らかにした。
| process | groupbert enhanced transformer architecture with efficient grouped structures 注意に基づく言語モデルは、最先端の自然言語処理システムにおいて重要な要素となっている。 しかし、これらのモデルは、長い学習時間、高密度の演算、膨大なパラメータ数のために、大きな計算量を必要とします。 本研究では、transformer層の構造にいくつかの変更を加え、より効率的なアーキテクチャを実現しました。 まず、自己注意モジュールを補完するために、畳み込みモジュールを追加し、局所的な相互作用と大域的な相互作用の学習を分離します。 次に、グループ化された変換により、モデルの表現力を維持しつつ、密なフィードフォワード層と畳み込みの計算コストを削減します。 結果として得られたアーキテクチャを言語表現学習に適用し、様々な規模のbertモデルと比較して、その優れた性能を実証しました。 さらに、浮動小数点演算(flops)と学習時間の両方の観点から、効率性が向上していることを明らかにした。 | 1 |
70,706 | 15,099,069,782 | IssuesEvent | 2021-02-08 01:20:09 | TechnoConserve/personal_website | https://api.github.com/repos/TechnoConserve/personal_website | closed | CVE-2019-8331 (Medium) detected in bootstrap-3.3.7.min.js | security vulnerability | ## CVE-2019-8331 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p>
<p>Path to dependency file: personal_website/photo_blog/templates/base.html</p>
<p>Path to vulnerable library: personal_website/photo_blog/templates/base.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.7.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/TechnoConserve/personal_website/commit/ae1b99a0f747fe02c68548331e8e39042b28bc81">ae1b99a0f747fe02c68548331e8e39042b28bc81</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.
<p>Publish Date: 2019-02-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p>
<p>Release Date: 2019-02-20</p>
<p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-8331 (Medium) detected in bootstrap-3.3.7.min.js - ## CVE-2019-8331 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p>
<p>Path to dependency file: personal_website/photo_blog/templates/base.html</p>
<p>Path to vulnerable library: personal_website/photo_blog/templates/base.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.7.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/TechnoConserve/personal_website/commit/ae1b99a0f747fe02c68548331e8e39042b28bc81">ae1b99a0f747fe02c68548331e8e39042b28bc81</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.
<p>Publish Date: 2019-02-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p>
<p>Release Date: 2019-02-20</p>
<p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_process | cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file personal website photo blog templates base html path to vulnerable library personal website photo blog templates base html dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch main vulnerability details in bootstrap before and x before xss is possible in the tooltip or popover data template attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap bootstrap sass step up your open source security game with whitesource | 0 |
42,772 | 5,474,827,211 | IssuesEvent | 2017-03-11 03:52:20 | rust-lang/rust | https://api.github.com/repos/rust-lang/rust | closed | LLVM Assertion: Both operands to ICmp instruction are not of the same type! | E-needstest I-ICE | Unfortunately, I don't know what in hyper is causing this error.
```
rustc: /home/rustbuild/src/rust-buildbot/slave/nightly-dist-rustc-linux/build/src/llvm/include/llvm/IR/Instructions.h:997: void llvm::ICmpInst::AssertOK(): Assertion `getOperand(0)->getType() == getOperand(1)->getType() && "Both operands to ICmp instruction are not of the same type!"' failed.
```
| 1.0 | LLVM Assertion: Both operands to ICmp instruction are not of the same type! - Unfortunately, I don't know what in hyper is causing this error.
```
rustc: /home/rustbuild/src/rust-buildbot/slave/nightly-dist-rustc-linux/build/src/llvm/include/llvm/IR/Instructions.h:997: void llvm::ICmpInst::AssertOK(): Assertion `getOperand(0)->getType() == getOperand(1)->getType() && "Both operands to ICmp instruction are not of the same type!"' failed.
```
| non_process | llvm assertion both operands to icmp instruction are not of the same type unfortunately i don t know what in hyper is causing this error rustc home rustbuild src rust buildbot slave nightly dist rustc linux build src llvm include llvm ir instructions h void llvm icmpinst assertok assertion getoperand gettype getoperand gettype both operands to icmp instruction are not of the same type failed | 0 |
2,904 | 5,889,444,304 | IssuesEvent | 2017-05-17 12:55:35 | LOVDnl/LOVD3 | https://api.github.com/repos/LOVDnl/LOVD3 | opened | Reform the submission process | cat: submission process feature request | The current submission process was designed as a "wizard" type of process where the user was taken through the process step by step, which was hopefully simpler than to have all the submission information together on one screen. However, several submitters have now indicated they would like to see all data together on one screen, and furthermore it will probably cause less errors with submission because the user can move around more freely around the submission process.
The submission process is to be redesigned as follows:
- The basis is one page where all data is displayed, similar to the individual's VE. However, places where there is no data are to be filled with clear links to adding this data.
- The order in which data is added is still similar; individual data is needed before any other data, then screenings or phenotypes (the latter only if a disease has been linked), then finally variants can only be added once screenings have been defined, at least one variant should be added to be able to submit the actual submission for curation.
- After each data entry form, you are always returned to this overview page. | 1.0 | Reform the submission process - The current submission process was designed as a "wizard" type of process where the user was taken through the process step by step, which was hopefully simpler than to have all the submission information together on one screen. However, several submitters have now indicated they would like to see all data together on one screen, and furthermore it will probably cause less errors with submission because the user can move around more freely around the submission process.
The submission process is to be redesigned as follows:
- The basis is one page where all data is displayed, similar to the individual's VE. However, places where there is no data are to be filled with clear links to adding this data.
- The order in which data is added is still similar; individual data is needed before any other data, then screenings or phenotypes (the latter only if a disease has been linked), then finally variants can only be added once screenings have been defined, at least one variant should be added to be able to submit the actual submission for curation.
- After each data entry form, you are always returned to this overview page. | process | reform the submission process the current submission process was designed as a wizard type of process where the user was taken through the process step by step which was hopefully simpler than to have all the submission information together on one screen however several submitters have now indicated they would like to see all data together on one screen and furthermore it will probably cause less errors with submission because the user can move around more freely around the submission process the submission process is to be redesigned as follows the basis is one page where all data is displayed similar to the individual s ve however places where there is no data are to be filled with clear links to adding this data the order in which data is added is still similar individual data is needed before any other data then screenings or phenotypes the latter only if a disease has been linked then finally variants can only be added once screenings have been defined at least one variant should be added to be able to submit the actual submission for curation after each data entry form you are always returned to this overview page | 1 |
85,272 | 10,434,791,401 | IssuesEvent | 2019-09-17 15:53:26 | open-contracting/ocdskit | https://api.github.com/repos/open-contracting/ocdskit | closed | Add heredocs for undocumented library methods | documentation | * [x] mapping_sheet
* [x] get_schema_fields
* [x] Add first parameter to documented parameters in combine.py | 1.0 | Add heredocs for undocumented library methods - * [x] mapping_sheet
* [x] get_schema_fields
* [x] Add first parameter to documented parameters in combine.py | non_process | add heredocs for undocumented library methods mapping sheet get schema fields add first parameter to documented parameters in combine py | 0 |
11,484 | 14,355,631,644 | IssuesEvent | 2020-11-30 10:22:25 | DevExpress/testcafe-hammerhead | https://api.github.com/repos/DevExpress/testcafe-hammerhead | closed | Constructor Worker requires 'new' in firefox | AREA: client BROWSER: Firefox FREQUENCY: level 1 SYSTEM: client side processing TYPE: bug | ### What is your Test Scenario?
The main use of my app is to view different types of documents such as pdf, office files, images etc. It uses Web Worker to run some scripts that help processing the document so that it can be rendered/viewed.
### What is the Current behavior?
When the app is running in the localhost, the Web Worker can be instantiated correctly so that the document can be processed and rendered. However if I'm using testcafe to test the app, the console will error `Constructor Worker requires 'new'`. After tracking the error down a bit, I think it is coming from `window.ts`:
```
if (constructorIsCalledWithoutNewKeyword(this, window.Worker))
nativeMethods.Worker.apply(this, arguments);
```
### What is the Expected behavior?
The Web Worker can be instantiated without any error.
### What is your web application and your TestCafe test code?
Please clone the repo from: https://github.com/ZhijieZhang/testcafe_sample, run npm install and then npm start.
You will see the web application opened in a new tab.
You can check the TestCafe test code in `test.js` and start the test by `npm run test`.
### Your Environment details:
testcafe version: 1.1.0
node.js version: 9.9.0
command-line arguments: npm run test
browser name and version: Firefox 66.0
platform and version: macOs 10.13.6 | 1.0 | Constructor Worker requires 'new' in firefox - ### What is your Test Scenario?
The main use of my app is to view different types of documents such as pdf, office files, images etc. It uses Web Worker to run some scripts that help processing the document so that it can be rendered/viewed.
### What is the Current behavior?
When the app is running in the localhost, the Web Worker can be instantiated correctly so that the document can be processed and rendered. However if I'm using testcafe to test the app, the console will error `Constructor Worker requires 'new'`. After tracking the error down a bit, I think it is coming from `window.ts`:
```
if (constructorIsCalledWithoutNewKeyword(this, window.Worker))
nativeMethods.Worker.apply(this, arguments);
```
### What is the Expected behavior?
The Web Worker can be instantiated without any error.
### What is your web application and your TestCafe test code?
Please clone the repo from: https://github.com/ZhijieZhang/testcafe_sample, run npm install and then npm start.
You will see the web application opened in a new tab.
You can check the TestCafe test code in `test.js` and start the test by `npm run test`.
### Your Environment details:
testcafe version: 1.1.0
node.js version: 9.9.0
command-line arguments: npm run test
browser name and version: Firefox 66.0
platform and version: macOs 10.13.6 | process | constructor worker requires new in firefox what is your test scenario the main use of my app is to view different types of documents such as pdf office files images etc it uses web worker to run some scripts that help processing the document so that it can be rendered viewed what is the current behavior when the app is running in the localhost the web worker can be instantiated correctly so that the document can be processed and rendered however if i m using testcafe to test the app the console will error constructor worker requires new after tracking the error down a bit i think it is coming from window ts if constructoriscalledwithoutnewkeyword this window worker nativemethods worker apply this arguments what is the expected behavior the web worker can be instantiated without any error what is your web application and your testcafe test code please clone the repo from run npm install and then npm start you will see the web application opened in a new tab you can check the testcafe test code in test js and start the test by npm run test your environment details testcafe version node js version command line arguments npm run test browser name and version firefox platform and version macos | 1 |
13,293 | 15,768,035,826 | IssuesEvent | 2021-03-31 16:45:55 | googleapis/google-cloud-go | https://api.github.com/repos/googleapis/google-cloud-go | closed | all: audit startup time | help wanted type: process | Figure out a way to audit startup time (primarily time spent in init). Maybe using the profiler or statements added before/after init funcs.
This is more of a nice-to-have, since I haven't heard anyone mention this (other than in passing from the core Go team).
Cold starts are increasingly important for serverless platforms, which scale up on demand. | 1.0 | all: audit startup time - Figure out a way to audit startup time (primarily time spent in init). Maybe using the profiler or statements added before/after init funcs.
This is more of a nice-to-have, since I haven't heard anyone mention this (other than in passing from the core Go team).
Cold starts are increasingly important for serverless platforms, which scale up on demand. | process | all audit startup time figure out a way to audit startup time primarily time spent in init maybe using the profiler or statements added before after init funcs this is more of a nice to have since i haven t heard anyone mention this other than in passing from the core go team cold starts are increasingly important for serverless platforms which scale up on demand | 1 |
23,768 | 2,663,198,330 | IssuesEvent | 2015-03-20 02:07:54 | certtools/intelmq | https://api.github.com/repos/certtools/intelmq | closed | After reboot /var/run/intelmq | bug high priority | After reboot, if the manager throught intelmqctl try to execute any command, it will not work because /var/run/intelmq doest not exist and www-data doesnt have perms to create it. | 1.0 | After reboot /var/run/intelmq - After reboot, if the manager throught intelmqctl try to execute any command, it will not work because /var/run/intelmq doest not exist and www-data doesnt have perms to create it. | non_process | after reboot var run intelmq after reboot if the manager throught intelmqctl try to execute any command it will not work because var run intelmq doest not exist and www data doesnt have perms to create it | 0 |
22,397 | 31,142,288,680 | IssuesEvent | 2023-08-16 01:44:28 | cypress-io/cypress | https://api.github.com/repos/cypress-io/cypress | closed | Flaky test: Timed out retrying after 4000ms: expected undefined to be an object | OS: linux process: flaky test topic: flake ❄️ stage: flake stale | ### Link to dashboard or CircleCI failure
https://dashboard.cypress.io/projects/ypt4pf/runs/37673/test-results/be1a324d-2fba-4156-86ca-87b6577de629
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/xhr.cy.js#L2399
### Analysis
<img width="429" alt="Screen Shot 2022-08-10 at 9 28 40 AM" src="https://user-images.githubusercontent.com/26726429/183963541-f8efa18a-1643-487e-9b57-183f1b5b524b.png">
### Cypress Version
10.4.0
### Other
Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed | 1.0 | Flaky test: Timed out retrying after 4000ms: expected undefined to be an object - ### Link to dashboard or CircleCI failure
https://dashboard.cypress.io/projects/ypt4pf/runs/37673/test-results/be1a324d-2fba-4156-86ca-87b6577de629
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/xhr.cy.js#L2399
### Analysis
<img width="429" alt="Screen Shot 2022-08-10 at 9 28 40 AM" src="https://user-images.githubusercontent.com/26726429/183963541-f8efa18a-1643-487e-9b57-183f1b5b524b.png">
### Cypress Version
10.4.0
### Other
Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed | process | flaky test timed out retrying after expected undefined to be an object link to dashboard or circleci failure link to failing test in github analysis img width alt screen shot at am src cypress version other search for this issue number in the codebase to find the test s skipped until this issue is fixed | 1 |
105,558 | 16,652,828,922 | IssuesEvent | 2021-06-05 01:31:44 | cfscode/react-photoswipe | https://api.github.com/repos/cfscode/react-photoswipe | opened | CVE-2016-10540 (High) detected in minimatch-0.2.14.tgz, minimatch-2.0.10.tgz | security vulnerability | ## CVE-2016-10540 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimatch-0.2.14.tgz</b>, <b>minimatch-2.0.10.tgz</b></p></summary>
<p>
<details><summary><b>minimatch-0.2.14.tgz</b></p></summary>
<p>a glob matcher in javascript</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimatch/-/minimatch-0.2.14.tgz">https://registry.npmjs.org/minimatch/-/minimatch-0.2.14.tgz</a></p>
<p>Path to dependency file: react-photoswipe/package.json</p>
<p>Path to vulnerable library: react-photoswipe/node_modules/globule/node_modules/minimatch/package.json</p>
<p>
Dependency Hierarchy:
- gulp-3.9.1.tgz (Root Library)
- vinyl-fs-0.3.14.tgz
- glob-watcher-0.0.6.tgz
- gaze-0.5.2.tgz
- globule-0.1.0.tgz
- :x: **minimatch-0.2.14.tgz** (Vulnerable Library)
</details>
<details><summary><b>minimatch-2.0.10.tgz</b></p></summary>
<p>a glob matcher in javascript</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz">https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz</a></p>
<p>Path to dependency file: react-photoswipe/package.json</p>
<p>Path to vulnerable library: react-photoswipe/node_modules/minimatch/package.json</p>
<p>
Dependency Hierarchy:
- babel-core-5.8.38.tgz (Root Library)
- :x: **minimatch-2.0.10.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Minimatch is a minimal matching utility that works by converting glob expressions into JavaScript `RegExp` objects. The primary function, `minimatch(path, pattern)` in Minimatch 3.0.1 and earlier is vulnerable to ReDoS in the `pattern` parameter.
<p>Publish Date: 2018-05-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10540>CVE-2016-10540</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/118">https://nodesecurity.io/advisories/118</a></p>
<p>Release Date: 2016-06-20</p>
<p>Fix Resolution: Update to version 3.0.2 or later.</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2016-10540 (High) detected in minimatch-0.2.14.tgz, minimatch-2.0.10.tgz - ## CVE-2016-10540 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimatch-0.2.14.tgz</b>, <b>minimatch-2.0.10.tgz</b></p></summary>
<p>
<details><summary><b>minimatch-0.2.14.tgz</b></p></summary>
<p>a glob matcher in javascript</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimatch/-/minimatch-0.2.14.tgz">https://registry.npmjs.org/minimatch/-/minimatch-0.2.14.tgz</a></p>
<p>Path to dependency file: react-photoswipe/package.json</p>
<p>Path to vulnerable library: react-photoswipe/node_modules/globule/node_modules/minimatch/package.json</p>
<p>
Dependency Hierarchy:
- gulp-3.9.1.tgz (Root Library)
- vinyl-fs-0.3.14.tgz
- glob-watcher-0.0.6.tgz
- gaze-0.5.2.tgz
- globule-0.1.0.tgz
- :x: **minimatch-0.2.14.tgz** (Vulnerable Library)
</details>
<details><summary><b>minimatch-2.0.10.tgz</b></p></summary>
<p>a glob matcher in javascript</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz">https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz</a></p>
<p>Path to dependency file: react-photoswipe/package.json</p>
<p>Path to vulnerable library: react-photoswipe/node_modules/minimatch/package.json</p>
<p>
Dependency Hierarchy:
- babel-core-5.8.38.tgz (Root Library)
- :x: **minimatch-2.0.10.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Minimatch is a minimal matching utility that works by converting glob expressions into JavaScript `RegExp` objects. The primary function, `minimatch(path, pattern)` in Minimatch 3.0.1 and earlier is vulnerable to ReDoS in the `pattern` parameter.
<p>Publish Date: 2018-05-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10540>CVE-2016-10540</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/118">https://nodesecurity.io/advisories/118</a></p>
<p>Release Date: 2016-06-20</p>
<p>Fix Resolution: Update to version 3.0.2 or later.</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_process | cve high detected in minimatch tgz minimatch tgz cve high severity vulnerability vulnerable libraries minimatch tgz minimatch tgz minimatch tgz a glob matcher in javascript library home page a href path to dependency file react photoswipe package json path to vulnerable library react photoswipe node modules globule node modules minimatch package json dependency hierarchy gulp tgz root library vinyl fs tgz glob watcher tgz gaze tgz globule tgz x minimatch tgz vulnerable library minimatch tgz a glob matcher in javascript library home page a href path to dependency file react photoswipe package json path to vulnerable library react photoswipe node modules minimatch package json dependency hierarchy babel core tgz root library x minimatch tgz vulnerable library found in base branch master vulnerability details minimatch is a minimal matching utility that works by converting glob expressions into javascript regexp objects the primary function minimatch path pattern in minimatch and earlier is vulnerable to redos in the pattern parameter publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution update to version or later step up your open source security game with whitesource | 0 |
256,361 | 19,409,669,263 | IssuesEvent | 2021-12-20 08:05:41 | fremtind/jokul | https://api.github.com/repos/fremtind/jokul | closed | Komponentdok for Accordion | 📚 documentation | **Må ha**
- [x] Eksempler på riktig og feil bruk
- [x] Live kodeeksempel
**Vil ha**
- [ ] Eksempler på bruk i teamene
- [ ] Kontrollspørsmål for bruk
- [x] Lenker til relevante komponenter | 1.0 | Komponentdok for Accordion - **Må ha**
- [x] Eksempler på riktig og feil bruk
- [x] Live kodeeksempel
**Vil ha**
- [ ] Eksempler på bruk i teamene
- [ ] Kontrollspørsmål for bruk
- [x] Lenker til relevante komponenter | non_process | komponentdok for accordion må ha eksempler på riktig og feil bruk live kodeeksempel vil ha eksempler på bruk i teamene kontrollspørsmål for bruk lenker til relevante komponenter | 0 |
29,027 | 13,037,850,230 | IssuesEvent | 2020-07-28 14:24:06 | cityofaustin/atd-data-tech | https://api.github.com/repos/cityofaustin/atd-data-tech | closed | Document requirements for Project Tracking in AMD's Data Tracker | Product: AMD Data Tracker Product: Mobility Project Database Service: Apps Service: Product Workgroup: AMD | I will follow DTS' process for tracking requirements (i.e. user stories) for Knack apps, otherwise, I'm happy to get building in a non-production copy of Data Tracker. Will plan to deliver the requirements in the upcoming sprint; ideally, we can do it all (requirements + development) in the same sprint.
See #3042 for initial research/discussion, including proposed spreadsheet to track projects in Data Tracker. | 2.0 | Document requirements for Project Tracking in AMD's Data Tracker - I will follow DTS' process for tracking requirements (i.e. user stories) for Knack apps, otherwise, I'm happy to get building in a non-production copy of Data Tracker. Will plan to deliver the requirements in the upcoming sprint; ideally, we can do it all (requirements + development) in the same sprint.
See #3042 for initial research/discussion, including proposed spreadsheet to track projects in Data Tracker. | non_process | document requirements for project tracking in amd s data tracker i will follow dts process for tracking requirements i e user stories for knack apps otherwise i m happy to get building in a non production copy of data tracker will plan to deliver the requirements in the upcoming sprint ideally we can do it all requirements development in the same sprint see for initial research discussion including proposed spreadsheet to track projects in data tracker | 0 |
56,990 | 6,535,917,841 | IssuesEvent | 2017-08-31 16:07:42 | learn-co-curriculum/js-object-oriented-constructor-functions-readme | https://api.github.com/repos/learn-co-curriculum/js-object-oriented-constructor-functions-readme | closed | "we can create as many Puppies as we want" and then code example only shows one puppy being made | Test | And of course we can create as many objects we want with our constructor function.
```
function Puppy(name, age, color, size) {
this.name = name
this.age = age
this.color = color
this.size = size
}
let snoopy = new Puppy('snoopy', 3, 'white', 'medium')
// {name: 'snoopy', age: 3, color: 'white', size: 'medium'}
```
Maybe create two or three puppies in this code snippet | 1.0 | "we can create as many Puppies as we want" and then code example only shows one puppy being made - And of course we can create as many objects we want with our constructor function.
```
function Puppy(name, age, color, size) {
this.name = name
this.age = age
this.color = color
this.size = size
}
let snoopy = new Puppy('snoopy', 3, 'white', 'medium')
// {name: 'snoopy', age: 3, color: 'white', size: 'medium'}
```
Maybe create two or three puppies in this code snippet | non_process | we can create as many puppies as we want and then code example only shows one puppy being made and of course we can create as many objects we want with our constructor function function puppy name age color size this name name this age age this color color this size size let snoopy new puppy snoopy white medium name snoopy age color white size medium maybe create two or three puppies in this code snippet | 0 |
9,848 | 12,838,132,603 | IssuesEvent | 2020-07-07 16:54:59 | pystatgen/sgkit | https://api.github.com/repos/pystatgen/sgkit | closed | Tools for enforcing coding standards | process + tools | Which tools should we use for enforcing coding standards?
Here is a table summarising what (related projects) [Zarr](https://zarr.readthedocs.io/en/stable/contributing.html#code-standards), [Dask](https://docs.dask.org/en/latest/develop.html#code-formatting), and [Xarray](https://xarray.pydata.org/en/latest/contributing.html#contributing-to-the-code-base) use.
| |Zarr|Dask|Xarray|
|-|----|----|------|
|Formatting|-|Black|Black|
|Linting|Flake8|Flake8|Flake8|
|Imports|-|-|isort|
|Type checking|-|-|Mypy|
|Code coverage|Coveralls|Coveralls|Codecov|
|CI|Travis|Travis|Azure Pipelines|
I would like to propose we use:
* [Black](https://black.readthedocs.io/en/stable/)
* [Flake8](https://flake8.pycqa.org/en/latest/)
* [isort](https://timothycrosley.github.io/isort/)
* [Mypy](https://mypy.readthedocs.io/en/stable/)
* [Coveralls](https://coveralls.io/)
* [GitHub Actions](https://help.github.com/en/actions)
GitHub Actions is the only one that isn't used by any of the three other related projects, but it seems to be a popular choice for new projects due to its close integration with GitHub.
| 1.0 | Tools for enforcing coding standards - Which tools should we use for enforcing coding standards?
Here is a table summarising what (related projects) [Zarr](https://zarr.readthedocs.io/en/stable/contributing.html#code-standards), [Dask](https://docs.dask.org/en/latest/develop.html#code-formatting), and [Xarray](https://xarray.pydata.org/en/latest/contributing.html#contributing-to-the-code-base) use.
| |Zarr|Dask|Xarray|
|-|----|----|------|
|Formatting|-|Black|Black|
|Linting|Flake8|Flake8|Flake8|
|Imports|-|-|isort|
|Type checking|-|-|Mypy|
|Code coverage|Coveralls|Coveralls|Codecov|
|CI|Travis|Travis|Azure Pipelines|
I would like to propose we use:
* [Black](https://black.readthedocs.io/en/stable/)
* [Flake8](https://flake8.pycqa.org/en/latest/)
* [isort](https://timothycrosley.github.io/isort/)
* [Mypy](https://mypy.readthedocs.io/en/stable/)
* [Coveralls](https://coveralls.io/)
* [GitHub Actions](https://help.github.com/en/actions)
GitHub Actions is the only one that isn't used by any of the three other related projects, but it seems to be a popular choice for new projects due to its close integration with GitHub.
| process | tools for enforcing coding standards which tools should we use for enforcing coding standards here is a table summarising what related projects and use zarr dask xarray formatting black black linting imports isort type checking mypy code coverage coveralls coveralls codecov ci travis travis azure pipelines i would like to propose we use github actions is the only one that isn t used by any of the three other related projects but it seems to be a popular choice for new projects due to its close integration with github | 1 |
98,413 | 11,082,665,964 | IssuesEvent | 2019-12-13 12:42:51 | phpDocumentor/phpDocumentor | https://api.github.com/repos/phpDocumentor/phpDocumentor | closed | phpdoc v3 config regressions | documentation | This issue is to collect v3 config regressions.
Related to #2056 I discovered that phpdoc v3 config format might have an issue with the definition of visibility. Either we need to check how we can keep the current defined format and document the new format or we should fallback to the visibility format of phpdoc v2. | 1.0 | phpdoc v3 config regressions - This issue is to collect v3 config regressions.
Related to #2056 I discovered that phpdoc v3 config format might have an issue with the definition of visibility. Either we need to check how we can keep the current defined format and document the new format or we should fallback to the visibility format of phpdoc v2. | non_process | phpdoc config regressions this issue is to collect config regressions related to i discovered that phpdoc config format might have an issue with the definition of visibility either we need to check how we can keep the current defined format and document the new format or we should fallback to the visibility format of phpdoc | 0 |
14,651 | 17,776,547,949 | IssuesEvent | 2021-08-30 20:02:01 | GoogleCloudPlatform/professional-services-data-validator | https://api.github.com/repos/GoogleCloudPlatform/professional-services-data-validator | opened | PoC Random Row validation | type: process priority: p0 Release | Proof of concept random row validation.
This includes random sampling a source table (with a TABLESAMPLE or an ORDER BY/RAND) and retrieving a specific number of rows with LIMIT. Then, using the primary keys retrieved in the source query, build a query on the target DB with a WHERE/IN statement to sample the same rows as the source.
| 1.0 | PoC Random Row validation - Proof of concept random row validation.
This includes random sampling a source table (with a TABLESAMPLE or an ORDER BY/RAND) and retrieving a specific number of rows with LIMIT. Then, using the primary keys retrieved in the source query, build a query on the target DB with a WHERE/IN statement to sample the same rows as the source.
| process | poc random row validation proof of concept random row validation this includes random sampling a source table with a tablesample or an order by rand and retrieving a specific number of rows with limit then using the primary keys retrieved in the source query build a query on the target db with a where in statement to sample the same rows as the source | 1 |
39,371 | 12,663,417,192 | IssuesEvent | 2020-06-18 01:17:23 | TIBCOSoftware/ASAssets_Utilities | https://api.github.com/repos/TIBCOSoftware/ASAssets_Utilities | opened | CVE-2020-2934 (Medium) detected in mysql-connector-java-5.1.14.jar | security vulnerability | ## CVE-2020-2934 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mysql-connector-java-5.1.14.jar</b></p></summary>
<p>MySQL JDBC Type 4 driver</p>
<p>Library home page: <a href="http://dev.mysql.com/doc/connector-j/en/">http://dev.mysql.com/doc/connector-j/en/</a></p>
<p>Path to vulnerable library: _depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q303/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q3/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q301/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q3/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2017Q2/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q2/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q4/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q4/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2016Q1/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q302/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q2/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q305/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q1/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q304/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q401/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_1/ASAssets_Utilities/Release/archive/Utilities_2017Q4/Utilities_2017Q4/Utilities_2017Q4/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q307/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q306/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar</p>
<p>
Dependency Hierarchy:
- :x: **mysql-connector-java-5.1.14.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Vulnerability in the MySQL Connectors product of Oracle MySQL (component: Connector/J). Supported versions that are affected are 8.0.19 and prior and 5.1.48 and prior. Difficult to exploit vulnerability allows unauthenticated attacker with network access via multiple protocols to compromise MySQL Connectors. Successful attacks require human interaction from a person other than the attacker. Successful attacks of this vulnerability can result in unauthorized update, insert or delete access to some of MySQL Connectors accessible data as well as unauthorized read access to a subset of MySQL Connectors accessible data and unauthorized ability to cause a partial denial of service (partial DOS) of MySQL Connectors. CVSS 3.0 Base Score 5.0 (Confidentiality, Integrity and Availability impacts). CVSS Vector: (CVSS:3.0/AV:N/AC:H/PR:N/UI:R/S:U/C:L/I:L/A:L).
<p>Publish Date: 2020-04-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-2934>CVE-2020-2934</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.oracle.com/security-alerts/cpuapr2020.html">https://www.oracle.com/security-alerts/cpuapr2020.html</a></p>
<p>Release Date: 2020-04-15</p>
<p>Fix Resolution: mysql:mysql-connector-java:5.1.49,8.0.20</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"mysql","packageName":"mysql-connector-java","packageVersion":"5.1.14","isTransitiveDependency":false,"dependencyTree":"mysql:mysql-connector-java:5.1.14","isMinimumFixVersionAvailable":true,"minimumFixVersion":"mysql:mysql-connector-java:5.1.49,8.0.20"}],"vulnerabilityIdentifier":"CVE-2020-2934","vulnerabilityDetails":"Vulnerability in the MySQL Connectors product of Oracle MySQL (component: Connector/J). Supported versions that are affected are 8.0.19 and prior and 5.1.48 and prior. Difficult to exploit vulnerability allows unauthenticated attacker with network access via multiple protocols to compromise MySQL Connectors. Successful attacks require human interaction from a person other than the attacker. Successful attacks of this vulnerability can result in unauthorized update, insert or delete access to some of MySQL Connectors accessible data as well as unauthorized read access to a subset of MySQL Connectors accessible data and unauthorized ability to cause a partial denial of service (partial DOS) of MySQL Connectors. CVSS 3.0 Base Score 5.0 (Confidentiality, Integrity and Availability impacts). CVSS Vector: (CVSS:3.0/AV:N/AC:H/PR:N/UI:R/S:U/C:L/I:L/A:L).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-2934","cvss3Severity":"medium","cvss3Score":"5.0","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-2934 (Medium) detected in mysql-connector-java-5.1.14.jar - ## CVE-2020-2934 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mysql-connector-java-5.1.14.jar</b></p></summary>
<p>MySQL JDBC Type 4 driver</p>
<p>Library home page: <a href="http://dev.mysql.com/doc/connector-j/en/">http://dev.mysql.com/doc/connector-j/en/</a></p>
<p>Path to vulnerable library: _depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q303/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q3/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q301/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q3/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2017Q2/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q2/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q4/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q4/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2016Q1/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q302/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q2/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q305/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q1/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q304/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2015Q401/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_1/ASAssets_Utilities/Release/archive/Utilities_2017Q4/Utilities_2017Q4/Utilities_2017Q4/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q307/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar,_depth_0/ASAssets_Utilities/Release/archive/Utilities_2014Q306/files/conf/adapters/system/mysql_5_0/mysql-connector-java-5.1.14-bin.jar</p>
<p>
Dependency Hierarchy:
- :x: **mysql-connector-java-5.1.14.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Vulnerability in the MySQL Connectors product of Oracle MySQL (component: Connector/J). Supported versions that are affected are 8.0.19 and prior and 5.1.48 and prior. Difficult to exploit vulnerability allows unauthenticated attacker with network access via multiple protocols to compromise MySQL Connectors. Successful attacks require human interaction from a person other than the attacker. Successful attacks of this vulnerability can result in unauthorized update, insert or delete access to some of MySQL Connectors accessible data as well as unauthorized read access to a subset of MySQL Connectors accessible data and unauthorized ability to cause a partial denial of service (partial DOS) of MySQL Connectors. CVSS 3.0 Base Score 5.0 (Confidentiality, Integrity and Availability impacts). CVSS Vector: (CVSS:3.0/AV:N/AC:H/PR:N/UI:R/S:U/C:L/I:L/A:L).
<p>Publish Date: 2020-04-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-2934>CVE-2020-2934</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.oracle.com/security-alerts/cpuapr2020.html">https://www.oracle.com/security-alerts/cpuapr2020.html</a></p>
<p>Release Date: 2020-04-15</p>
<p>Fix Resolution: mysql:mysql-connector-java:5.1.49,8.0.20</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"mysql","packageName":"mysql-connector-java","packageVersion":"5.1.14","isTransitiveDependency":false,"dependencyTree":"mysql:mysql-connector-java:5.1.14","isMinimumFixVersionAvailable":true,"minimumFixVersion":"mysql:mysql-connector-java:5.1.49,8.0.20"}],"vulnerabilityIdentifier":"CVE-2020-2934","vulnerabilityDetails":"Vulnerability in the MySQL Connectors product of Oracle MySQL (component: Connector/J). Supported versions that are affected are 8.0.19 and prior and 5.1.48 and prior. Difficult to exploit vulnerability allows unauthenticated attacker with network access via multiple protocols to compromise MySQL Connectors. Successful attacks require human interaction from a person other than the attacker. Successful attacks of this vulnerability can result in unauthorized update, insert or delete access to some of MySQL Connectors accessible data as well as unauthorized read access to a subset of MySQL Connectors accessible data and unauthorized ability to cause a partial denial of service (partial DOS) of MySQL Connectors. CVSS 3.0 Base Score 5.0 (Confidentiality, Integrity and Availability impacts). CVSS Vector: (CVSS:3.0/AV:N/AC:H/PR:N/UI:R/S:U/C:L/I:L/A:L).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-2934","cvss3Severity":"medium","cvss3Score":"5.0","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_process | cve medium detected in mysql connector java jar cve medium severity vulnerability vulnerable library mysql connector java jar mysql jdbc type driver library home page a href path to vulnerable library depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities utilities utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar depth asassets utilities release archive utilities files conf adapters system mysql mysql connector java bin jar dependency hierarchy x mysql connector java jar vulnerable library vulnerability details vulnerability in the mysql connectors product of oracle mysql component connector j supported versions that are affected are and prior and and prior difficult to exploit vulnerability allows unauthenticated attacker with network access via multiple protocols to compromise mysql connectors successful attacks require human interaction from a person other than the attacker successful attacks of this vulnerability can result in unauthorized update insert or delete access to some of mysql connectors accessible data as well as unauthorized read access to a subset of mysql connectors accessible data and unauthorized ability to cause a partial denial of service partial dos of mysql connectors cvss base score confidentiality integrity and availability impacts cvss vector cvss av n ac h pr n ui r s u c l i l a l publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction required scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution mysql mysql connector java check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails vulnerability in the mysql connectors product of oracle mysql component connector j supported versions that are affected are and prior and and prior difficult to exploit vulnerability allows unauthenticated attacker with network access via multiple protocols to compromise mysql connectors successful attacks require human interaction from a person other than the attacker successful attacks of this vulnerability can result in unauthorized update insert or delete access to some of mysql connectors accessible data as well as unauthorized read access to a subset of mysql connectors accessible data and unauthorized ability to cause a partial denial of service partial dos of mysql connectors cvss base score confidentiality integrity and availability impacts cvss vector cvss av n ac h pr n ui r s u c l i l a l vulnerabilityurl | 0 |
21,052 | 27,997,634,677 | IssuesEvent | 2023-03-27 09:32:07 | nodejs/node | https://api.github.com/repos/nodejs/node | closed | node {16,17,18} doesn't respect spawn timeout and hangs | child_process | ### Version
16.*
### Platform
linux
`Linux f092196b2008 5.10.104-linuxkit #1 SMP Thu Mar 17 17:08:06 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux`
### What steps will reproduce the bug?
I've created a demo using docker so you can easily swap node versions and see the bug.
That being said, I'm sharing the tarball as bas64 because I wasn't sure if you'd like to download tarballs...that being said you may also not want to decode b64 🤷♂️
download here if you'd like: [poc<via wetransfer>](https://wetransfer.com/downloads/4395c16ad0eaf6ff6492d67b56060af420220706154704/4edc045984510764c4a4f8980d82ffe720220706154716/0d769f?utm_campaign=WT_email_tracking&utm_content=general&utm_medium=download_button&utm_source=notify_recipient_email)
or run the following:
```sh
base64 -d <<< H4sIACurxWIAA+1YbWvbMBDOZ/+KqzfqZCSyZcsxBLoLjpJp+fRKchuVA7HcQLfB912y9ZxSdkuAJi4Xey5PvE9cDAmJGiAX/3UGo2ZKMJcTiW8YGH8gJ90G40e6F+sY9W+ECB7OGNJ3Ek7+SxFYlJFjHX5x06wyr/nuzL/BHvdBjhVTOY+Xnn+32zZQ5baw1BMjCIc75gpjynudiZhOqa5acQ8Oqc56EMCnQLeSidAS7M8M9piPPcyavwlkP1Bp3LEElpVDMmHLiEP8B/7S/4T38eS/x7xSM3/TeDToP8NFOd7kvQiYVPD+N4ffP2wNwCbZ4Vh7PYPjgFJyhsf948Gxwf9vf0jOAGtE2YbTJbG9Bf6IUw4rVXg5QHZKpFnUx7PEiqqKQYfUf8RQhx9/+O6/tsIkL0kcHUx1um/G9zKv9dV+o+xV+v/JhDxVBRwLbLwMj28SqM57EBOf85YTptWNJFV31mW84gKYbUMQ3nzhKKEj5vWofoNS8cIIdm3GqFppdnUap9YTI4cJonVtpCdnY9l2+lc0HzIBbVO2zJmETPek24TmrPCakPBppTPih6WG/jOm7fuhtMBVCl6RWW45963/wU6NRX/B/AY/cdBIPnvup5b6/8mUOY/C6PzcEzlLcDTp4+xTv89n9zov0PU/e/X+r8ZXBsga/lwSs0emPIomG1lkDItGE+VDSMHOaU1piLKWVYsekrjNGT62+odoK2lo5AdKoA0ZDld3AfKmUYTDtYXmiQcRjmfgkgozaS437htWbC9rV8mgGzdrQaXg811gHN6dcnzWEU4OdWWcFZMeH4zsYRFNBV6XXuHu6Yxr++MP6Dk/3J/q4mx9v3vBff03wscXPN/E7hbYqljICs6SbtLVdLR4qgsyZrN1s77O66fOY9hynNVAFqtti7ZuurTAtuGEBJeyLqwpty/jnvvfykGTx/j8e9/3/fk/e8iVM2EbuOV879GjRqvF78BY21wXQAgAAA= > out.tar.gz
```
then `tar -zxf out.tar.gz` and you'll see the following files:
```sh
Dockerfile build-n-run.sh index.js node_modules pkg
```
you can run the `build-n-run.sh` that I've created (which is a silly 2liner), or build and run it yourself
### Steps to Reproduce
1. Run something like:
```js
const {spawnSync} = require('child_process')
console.log('Spawning...')
spawnSync('npm',['install','./pkg','--verbose'], {stdio:'inherit', timeout:1000*3})
console.log('spawner bye')
```
and make the package preinstall script for `pkg` to hang (via long `setTimeout` that will make it busy)
3. Expect `ETIMEOUT`, but get nothing
| 1.0 | node {16,17,18} doesn't respect spawn timeout and hangs - ### Version
16.*
### Platform
linux
`Linux f092196b2008 5.10.104-linuxkit #1 SMP Thu Mar 17 17:08:06 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux`
### What steps will reproduce the bug?
I've created a demo using docker so you can easily swap node versions and see the bug.
That being said, I'm sharing the tarball as bas64 because I wasn't sure if you'd like to download tarballs...that being said you may also not want to decode b64 🤷♂️
download here if you'd like: [poc<via wetransfer>](https://wetransfer.com/downloads/4395c16ad0eaf6ff6492d67b56060af420220706154704/4edc045984510764c4a4f8980d82ffe720220706154716/0d769f?utm_campaign=WT_email_tracking&utm_content=general&utm_medium=download_button&utm_source=notify_recipient_email)
or run the following:
```sh
base64 -d <<< H4sIACurxWIAA+1YbWvbMBDOZ/+KqzfqZCSyZcsxBLoLjpJp+fRKchuVA7HcQLfB912y9ZxSdkuAJi4Xey5PvE9cDAmJGiAX/3UGo2ZKMJcTiW8YGH8gJ90G40e6F+sY9W+ECB7OGNJ3Ek7+SxFYlJFjHX5x06wyr/nuzL/BHvdBjhVTOY+Xnn+32zZQ5baw1BMjCIc75gpjynudiZhOqa5acQ8Oqc56EMCnQLeSidAS7M8M9piPPcyavwlkP1Bp3LEElpVDMmHLiEP8B/7S/4T38eS/x7xSM3/TeDToP8NFOd7kvQiYVPD+N4ffP2wNwCbZ4Vh7PYPjgFJyhsf948Gxwf9vf0jOAGtE2YbTJbG9Bf6IUw4rVXg5QHZKpFnUx7PEiqqKQYfUf8RQhx9/+O6/tsIkL0kcHUx1um/G9zKv9dV+o+xV+v/JhDxVBRwLbLwMj28SqM57EBOf85YTptWNJFV31mW84gKYbUMQ3nzhKKEj5vWofoNS8cIIdm3GqFppdnUap9YTI4cJonVtpCdnY9l2+lc0HzIBbVO2zJmETPek24TmrPCakPBppTPih6WG/jOm7fuhtMBVCl6RWW45963/wU6NRX/B/AY/cdBIPnvup5b6/8mUOY/C6PzcEzlLcDTp4+xTv89n9zov0PU/e/X+r8ZXBsga/lwSs0emPIomG1lkDItGE+VDSMHOaU1piLKWVYsekrjNGT62+odoK2lo5AdKoA0ZDld3AfKmUYTDtYXmiQcRjmfgkgozaS437htWbC9rV8mgGzdrQaXg811gHN6dcnzWEU4OdWWcFZMeH4zsYRFNBV6XXuHu6Yxr++MP6Dk/3J/q4mx9v3vBff03wscXPN/E7hbYqljICs6SbtLVdLR4qgsyZrN1s77O66fOY9hynNVAFqtti7ZuurTAtuGEBJeyLqwpty/jnvvfykGTx/j8e9/3/fk/e8iVM2EbuOV879GjRqvF78BY21wXQAgAAA= > out.tar.gz
```
then `tar -zxf out.tar.gz` and you'll see the following files:
```sh
Dockerfile build-n-run.sh index.js node_modules pkg
```
you can run the `build-n-run.sh` that I've created (which is a silly 2liner), or build and run it yourself
### Steps to Reproduce
1. Run something like:
```js
const {spawnSync} = require('child_process')
console.log('Spawning...')
spawnSync('npm',['install','./pkg','--verbose'], {stdio:'inherit', timeout:1000*3})
console.log('spawner bye')
```
and make the package preinstall script for `pkg` to hang (via long `setTimeout` that will make it busy)
3. Expect `ETIMEOUT`, but get nothing
| process | node doesn t respect spawn timeout and hangs version platform linux linux linuxkit smp thu mar utc gnu linux what steps will reproduce the bug i ve created a demo using docker so you can easily swap node versions and see the bug that being said i m sharing the tarball as because i wasn t sure if you d like to download tarballs that being said you may also not want to decode 🤷♂️ download here if you d like or run the following sh d out tar gz then tar zxf out tar gz and you ll see the following files sh dockerfile build n run sh index js node modules pkg you can run the build n run sh that i ve created which is a silly or build and run it yourself steps to reproduce run something like js const spawnsync require child process console log spawning spawnsync npm stdio inherit timeout console log spawner bye and make the package preinstall script for pkg to hang via long settimeout that will make it busy expect etimeout but get nothing | 1 |
8,260 | 11,425,620,278 | IssuesEvent | 2020-02-03 20:12:20 | NationalSecurityAgency/ghidra | https://api.github.com/repos/NationalSecurityAgency/ghidra | closed | dsPIC30F program space strings | Feature: Processor/other | I compiled a simple dsPIC30F binary using the following code:
```
#include <stdio.h>
__prog__ const char __attribute__((space(prog))) str[] = "Hello!";
int main() {
printf("%s\n", str);
}
```
Because I am forcing the string into program space, it ends up looking like this in Ghidra:

Note the extra 0x00's in between some of the string's characters. This is due to the weird size and wordsize of the PIC program (ROM) space. Is there any way for Ghidra's string datatype to be made aware of these various features of the address space the string is contained in? Similar question for the string search feature. | 1.0 | dsPIC30F program space strings - I compiled a simple dsPIC30F binary using the following code:
```
#include <stdio.h>
__prog__ const char __attribute__((space(prog))) str[] = "Hello!";
int main() {
printf("%s\n", str);
}
```
Because I am forcing the string into program space, it ends up looking like this in Ghidra:

Note the extra 0x00's in between some of the string's characters. This is due to the weird size and wordsize of the PIC program (ROM) space. Is there any way for Ghidra's string datatype to be made aware of these various features of the address space the string is contained in? Similar question for the string search feature. | process | program space strings i compiled a simple binary using the following code include prog const char attribute space prog str hello int main printf s n str because i am forcing the string into program space it ends up looking like this in ghidra note the extra s in between some of the string s characters this is due to the weird size and wordsize of the pic program rom space is there any way for ghidra s string datatype to be made aware of these various features of the address space the string is contained in similar question for the string search feature | 1 |
2,092 | 4,928,880,201 | IssuesEvent | 2016-11-27 15:13:48 | brucemiller/LaTeXML | https://api.github.com/repos/brucemiller/LaTeXML | closed | Cannot use non-html5 equations | bug postprocessing | First off, excellent work with this package. I am able to convert a rather complex document to html5 with lots of equations and images. Ultimately my target is to get back to a Word document from a latex document. I think I am not far off.
That said, latexml always borks when I try to tell it to use svg or png equations (with ``--mathsvg`` or ``--mathimages`` respectively). The HTML5 equations will obviously be no good in Word.
I always get this error:
```
Fatal:perl:die Perl died
Postprocessing LaTeXML::Post::MathML::Presentation Paper.html
Wide character in subroutine entry at /usr/share/perl5/LaTeXML/Post.pm line 1188.
```
It would be nicer if the error would not kill LateXML, rather it would just skip that equation. | 1.0 | Cannot use non-html5 equations - First off, excellent work with this package. I am able to convert a rather complex document to html5 with lots of equations and images. Ultimately my target is to get back to a Word document from a latex document. I think I am not far off.
That said, latexml always borks when I try to tell it to use svg or png equations (with ``--mathsvg`` or ``--mathimages`` respectively). The HTML5 equations will obviously be no good in Word.
I always get this error:
```
Fatal:perl:die Perl died
Postprocessing LaTeXML::Post::MathML::Presentation Paper.html
Wide character in subroutine entry at /usr/share/perl5/LaTeXML/Post.pm line 1188.
```
It would be nicer if the error would not kill LateXML, rather it would just skip that equation. | process | cannot use non equations first off excellent work with this package i am able to convert a rather complex document to with lots of equations and images ultimately my target is to get back to a word document from a latex document i think i am not far off that said latexml always borks when i try to tell it to use svg or png equations with mathsvg or mathimages respectively the equations will obviously be no good in word i always get this error fatal perl die perl died postprocessing latexml post mathml presentation paper html wide character in subroutine entry at usr share latexml post pm line it would be nicer if the error would not kill latexml rather it would just skip that equation | 1 |
12,468 | 7,885,614,472 | IssuesEvent | 2018-06-27 13:01:04 | angular/angular | https://api.github.com/repos/angular/angular | closed | Allow prod mode to be enabled via build-time environment variables + support dead-code elimination | comp: core & compiler comp: packaging comp: performance fixed by Ivy freq3: high severity3: broken type: feature | other frameworks support DCE, dead-code elimination, with environment variables for switching between `"development"`, `"production"`, and `"test"` modes

can we do the same for `enableProdMode`
```
if (process.env.NODE_ENV !== 'production') {
console.log("I'm in development");
enableDevMode();
}
```
webpack(w/ DefinePlugin) and browserify(w/ envify) will replace `process.env.NODE_ENV` with `"production"`
```
if ("production" !== 'production') {
console.log("I'm in development");
enableDevMode();
}
```
which would then be replaced with
```
if (false) {
console.log("I'm in development");
enableDevMode();
}
```
and if they had UglifyJS then the statement would be removed. It would be great to have a list of standard environment variables that all libraries could use for universal modules for example.
- ENV: `'development' || 'production' || 'test'`
- PLATFORM: `'node' || 'worker' || 'browser' || 'browser-ui' || 'atom' || 'recreate-user-agent-standard'`
related: https://github.com/mishoo/UglifyJS2#conditional-compilation
[](https://tipe.io?ref=github-comment) | True | Allow prod mode to be enabled via build-time environment variables + support dead-code elimination - other frameworks support DCE, dead-code elimination, with environment variables for switching between `"development"`, `"production"`, and `"test"` modes

can we do the same for `enableProdMode`
```
if (process.env.NODE_ENV !== 'production') {
console.log("I'm in development");
enableDevMode();
}
```
webpack(w/ DefinePlugin) and browserify(w/ envify) will replace `process.env.NODE_ENV` with `"production"`
```
if ("production" !== 'production') {
console.log("I'm in development");
enableDevMode();
}
```
which would then be replaced with
```
if (false) {
console.log("I'm in development");
enableDevMode();
}
```
and if they had UglifyJS then the statement would be removed. It would be great to have a list of standard environment variables that all libraries could use for universal modules for example.
- ENV: `'development' || 'production' || 'test'`
- PLATFORM: `'node' || 'worker' || 'browser' || 'browser-ui' || 'atom' || 'recreate-user-agent-standard'`
related: https://github.com/mishoo/UglifyJS2#conditional-compilation
[](https://tipe.io?ref=github-comment) | non_process | allow prod mode to be enabled via build time environment variables support dead code elimination other frameworks support dce dead code elimination with environment variables for switching between development production and test modes can we do the same for enableprodmode if process env node env production console log i m in development enabledevmode webpack w defineplugin and browserify w envify will replace process env node env with production if production production console log i m in development enabledevmode which would then be replaced with if false console log i m in development enabledevmode and if they had uglifyjs then the statement would be removed it would be great to have a list of standard environment variables that all libraries could use for universal modules for example env development production test platform node worker browser browser ui atom recreate user agent standard related | 0 |
310,851 | 9,524,849,368 | IssuesEvent | 2019-04-28 07:26:22 | projectacrn/acrn-hypervisor | https://api.github.com/repos/projectacrn/acrn-hypervisor | closed | "ST_PERF_DS3_Resume_to_IVE_Android_OS_UI" test result value is low. | priority: P3-Medium status: Assigned type: bug | Kernel Version/Android Version | 4.19.19-quilt-2e5dc0ac-00135-gdfb7dfd764a1 (190208T031538Z) / 9
AOSP Version | PPR1.181005.003
ABL Version | 1906_GP20
IFWI Version | 3.1.55.2278a
IOC Version | 4.0.14
ACRN-Hypervisor | 2019w05.3.150000p_156
SOS version: 27700
SOS Kernel Version | 4.19.19-8-.iot-lts2018-sos #1 SMP PREEMPT Mon Feb 11
cAVS FW Version | 9.22.1.3472
Test environment settings
Connect 1 HDMI(on HDMI port2) and 1 eDP panel
Execution steps
Freshly flash the USER image on DUT (Device Under Test) and perform System Setup & Configuration section's steps.
Boot the device to home screen and wait 5 minutes.
Press 'ignition' button let the DUT go to S3 mode and wait about 1 minutes.
open mobile video
Start measurement by pressing the 'ignition' button.
Until the screen lights up and stop video
copy video from mobile to host
execute "python ffmpeg.py -i *.MOV"
calculating time
Expected result
less than 2 seconds
Actual result
more than 2seconds | 1.0 | "ST_PERF_DS3_Resume_to_IVE_Android_OS_UI" test result value is low. - Kernel Version/Android Version | 4.19.19-quilt-2e5dc0ac-00135-gdfb7dfd764a1 (190208T031538Z) / 9
AOSP Version | PPR1.181005.003
ABL Version | 1906_GP20
IFWI Version | 3.1.55.2278a
IOC Version | 4.0.14
ACRN-Hypervisor | 2019w05.3.150000p_156
SOS version: 27700
SOS Kernel Version | 4.19.19-8-.iot-lts2018-sos #1 SMP PREEMPT Mon Feb 11
cAVS FW Version | 9.22.1.3472
Test environment settings
Connect 1 HDMI(on HDMI port2) and 1 eDP panel
Execution steps
Freshly flash the USER image on DUT (Device Under Test) and perform System Setup & Configuration section's steps.
Boot the device to home screen and wait 5 minutes.
Press 'ignition' button let the DUT go to S3 mode and wait about 1 minutes.
open mobile video
Start measurement by pressing the 'ignition' button.
Until the screen lights up and stop video
copy video from mobile to host
execute "python ffmpeg.py -i *.MOV"
calculating time
Expected result
less than 2 seconds
Actual result
more than 2seconds | non_process | st perf resume to ive android os ui test result value is low kernel version android version quilt aosp version abl version ifwi version ioc version acrn hypervisor sos version sos kernel version iot sos smp preempt mon feb cavs fw version test environment settings connect hdmi on hdmi and edp panel execution steps freshly flash the user image on dut device under test and perform system setup configuration section s steps boot the device to home screen and wait minutes press ignition button let the dut go to mode and wait about minutes open mobile video start measurement by pressing the ignition button until the screen lights up and stop video copy video from mobile to host execute python ffmpeg py i mov calculating time expected result less than seconds actual result more than | 0 |
52,196 | 3,022,220,698 | IssuesEvent | 2015-07-31 19:01:06 | Microsoft/TypeScript | https://api.github.com/repos/Microsoft/TypeScript | closed | tsc no longer reports a non-zero error code when reporting errors | Bug High Priority | Try installing the nightly with `npm install -g typescript@next` and compile the following file:
```TypeScript
var asdf = 123
asdf = '123'
```
If you try to get the error code from 20150730 (or really any nightly that's available right now), it'll be `0`. If you switch back to 1.5 with `npm install -g typescript@latest`, you'll get an error code of `2`. | 1.0 | tsc no longer reports a non-zero error code when reporting errors - Try installing the nightly with `npm install -g typescript@next` and compile the following file:
```TypeScript
var asdf = 123
asdf = '123'
```
If you try to get the error code from 20150730 (or really any nightly that's available right now), it'll be `0`. If you switch back to 1.5 with `npm install -g typescript@latest`, you'll get an error code of `2`. | non_process | tsc no longer reports a non zero error code when reporting errors try installing the nightly with npm install g typescript next and compile the following file typescript var asdf asdf if you try to get the error code from or really any nightly that s available right now it ll be if you switch back to with npm install g typescript latest you ll get an error code of | 0 |
181,889 | 14,891,924,642 | IssuesEvent | 2021-01-21 01:38:10 | executablebooks/jupyter-book | https://api.github.com/repos/executablebooks/jupyter-book | opened | Document how to disable the download button | documentation | A few folks have asked whether they can *disable* the page download button. This is possible in the theme, as documented here:
https://sphinx-book-theme.readthedocs.io/en/latest/configure.html?highlight=use_download_button#download-page-button
but, it's not documented in jupyter book. We should either document it or add some kind of configuration for it! | 1.0 | Document how to disable the download button - A few folks have asked whether they can *disable* the page download button. This is possible in the theme, as documented here:
https://sphinx-book-theme.readthedocs.io/en/latest/configure.html?highlight=use_download_button#download-page-button
but, it's not documented in jupyter book. We should either document it or add some kind of configuration for it! | non_process | document how to disable the download button a few folks have asked whether they can disable the page download button this is possible in the theme as documented here but it s not documented in jupyter book we should either document it or add some kind of configuration for it | 0 |
133,460 | 12,541,958,373 | IssuesEvent | 2020-06-05 13:16:53 | OpenMined/SwiftSyft | https://api.github.com/repos/OpenMined/SwiftSyft | closed | Create a better Readme | Priority: 2 - High :cold_sweat: Severity: 4 - Low :sunglasses: Status: In Progress :star2: Type: Documentation :books: | [Based on this readme template](https://github.com/OpenMined/.github/blob/master/README-TEMPLATE.md), we should improve our readmes across all OpenMined projects.
More specifically, you should fill out the template **at the minimum**.
- [ ] Don't worry about the logo, I'll get this to you.
- [ ] Change all badges to reflect your repo, include other badges as desired, but use those at the minimum. You can generate more here: https://shields.io/
- [ ] Change the title
- [ ] Write a detailed description of what your library intends to accomplish. I would also advise that you provide links to the following papers: https://ai.googleblog.com/2017/04/federated-learning-collaborative.html, https://arxiv.org/pdf/1902.01046.pdf, https://research.google/pubs/pub47246/. I would also explain that the system is driven by developing a model in PySyft, hosting it in PyGrid, and then downloading it using a worker library. Be sure to also link to the other worker libraries that aren't yours, so we can cross-promote our work!
- [ ] Fill out a list of features that your library supports. A suggested list includes: PySyft plan execution, optional third-party JWT authentication, wifi detection, charge detection, sleep/wake detection, protocols for secure aggregation (put mark this as "in progress"), and a list of environments this library is expected to work in. That's a short list, you can add or remove what you please.
- [ ] Installation section should be updated and specify the appropriate package manager (with a link to our deployment page on that package manager)
- [ ] Usage section should be comprised of the implementation code for the MNIST example. Make sure to clean these up first. I've created an issue for this elsewhere - do that one first.
- [ ] Fill out some basic contributing information to tell people how to run your library locally, what the local development instructions are, etc.
- [ ] Fill out the list of contributors from All Contributors. Build this into your workflow and expect to use their Github issue commands in the future to make adding people to the readme easier. | 1.0 | Create a better Readme - [Based on this readme template](https://github.com/OpenMined/.github/blob/master/README-TEMPLATE.md), we should improve our readmes across all OpenMined projects.
More specifically, you should fill out the template **at the minimum**.
- [ ] Don't worry about the logo, I'll get this to you.
- [ ] Change all badges to reflect your repo, include other badges as desired, but use those at the minimum. You can generate more here: https://shields.io/
- [ ] Change the title
- [ ] Write a detailed description of what your library intends to accomplish. I would also advise that you provide links to the following papers: https://ai.googleblog.com/2017/04/federated-learning-collaborative.html, https://arxiv.org/pdf/1902.01046.pdf, https://research.google/pubs/pub47246/. I would also explain that the system is driven by developing a model in PySyft, hosting it in PyGrid, and then downloading it using a worker library. Be sure to also link to the other worker libraries that aren't yours, so we can cross-promote our work!
- [ ] Fill out a list of features that your library supports. A suggested list includes: PySyft plan execution, optional third-party JWT authentication, wifi detection, charge detection, sleep/wake detection, protocols for secure aggregation (put mark this as "in progress"), and a list of environments this library is expected to work in. That's a short list, you can add or remove what you please.
- [ ] Installation section should be updated and specify the appropriate package manager (with a link to our deployment page on that package manager)
- [ ] Usage section should be comprised of the implementation code for the MNIST example. Make sure to clean these up first. I've created an issue for this elsewhere - do that one first.
- [ ] Fill out some basic contributing information to tell people how to run your library locally, what the local development instructions are, etc.
- [ ] Fill out the list of contributors from All Contributors. Build this into your workflow and expect to use their Github issue commands in the future to make adding people to the readme easier. | non_process | create a better readme we should improve our readmes across all openmined projects more specifically you should fill out the template at the minimum don t worry about the logo i ll get this to you change all badges to reflect your repo include other badges as desired but use those at the minimum you can generate more here change the title write a detailed description of what your library intends to accomplish i would also advise that you provide links to the following papers i would also explain that the system is driven by developing a model in pysyft hosting it in pygrid and then downloading it using a worker library be sure to also link to the other worker libraries that aren t yours so we can cross promote our work fill out a list of features that your library supports a suggested list includes pysyft plan execution optional third party jwt authentication wifi detection charge detection sleep wake detection protocols for secure aggregation put mark this as in progress and a list of environments this library is expected to work in that s a short list you can add or remove what you please installation section should be updated and specify the appropriate package manager with a link to our deployment page on that package manager usage section should be comprised of the implementation code for the mnist example make sure to clean these up first i ve created an issue for this elsewhere do that one first fill out some basic contributing information to tell people how to run your library locally what the local development instructions are etc fill out the list of contributors from all contributors build this into your workflow and expect to use their github issue commands in the future to make adding people to the readme easier | 0 |