Dataset Viewer (auto-converted to Parquet). The table below summarizes the dataset schema; each preview record that follows lists its field values in this same column order.
| Column | Dtype | Range / distinct values |
| --- | --- | --- |
| Unnamed: 0 | int64 | 3 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 7 to 112 |
| repo_url | stringlengths | 36 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 2 to 742 |
| labels | stringlengths | 4 to 431 |
| body | stringlengths | 5 to 239k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 to 240k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 200k |
| binary_label | int64 | 0 to 1 |
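For anyone who wants to work with these records programmatically, here is a minimal loading sketch. The repository id used below is a placeholder, not this dataset's actual Hub id, and the column list simply mirrors the schema table above.

```python
# Minimal sketch: load the Parquet-converted dataset with the `datasets` library.
# "your-namespace/github-issues-usability" is a placeholder id, not the real one.
from datasets import load_dataset

ds = load_dataset("your-namespace/github-issues-usability", split="train")

# Column names should match the schema table above.
print(ds.column_names)

example = ds[0]
print(example["title"], example["label"], example["binary_label"])
```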
20,901
16,187,864,456
IssuesEvent
2021-05-04 01:26:38
phseiff/github-flavored-markdown-to-html
https://api.github.com/repos/phseiff/github-flavored-markdown-to-html
opened
Try to remove the warning that occurs when g`gh-md-to-html` is run from the command line.
enhancement usability
See https://github.com/phseiff/github-flavored-markdown-to-html/issues/39#issuecomment-830831571 and (from the same issue) my explanation https://github.com/phseiff/github-flavored-markdown-to-html/issues/39#issuecomment-830831954.
True
Try to remove the warning that occurs when g`gh-md-to-html` is run from the command line. - See https://github.com/phseiff/github-flavored-markdown-to-html/issues/39#issuecomment-830831571 and (from the same issue) my explanation https://github.com/phseiff/github-flavored-markdown-to-html/issues/39#issuecomment-830831954.
usab
try to remove the warning that occurs when g gh md to html is run from the command line see and from the same issue my explanation
1
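Comparing the fields of this first record suggests how the derived columns were built: text_combine looks like the title and body joined with " - ", and text looks like a lowercased copy with URLs, markup, punctuation, and digits stripped. The helper below is only a reconstruction of that apparent pipeline, not the dataset authors' actual preprocessing code.

```python
import re

def combine_and_clean(title: str, body: str) -> tuple[str, str]:
    """Approximate the derived columns seen in the preview records.

    text_combine appears to be "<title> - <body>"; text appears to be a lowercased
    version with URLs, markup, punctuation, and digits removed. This is a
    reconstruction from the sample rows, not the original pipeline.
    """
    text_combine = f"{title} - {body}"
    cleaned = text_combine.lower()
    cleaned = re.sub(r"https?://\S+", " ", cleaned)  # drop URLs
    cleaned = re.sub(r"[^a-z\s]", " ", cleaned)      # keep letters and spaces only
    cleaned = re.sub(r"\s+", " ", cleaned).strip()   # collapse whitespace
    return text_combine, cleaned
```

Applied to this record's title and body, the function reproduces the stored text_combine and comes very close to the stored text value.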
73,340
15,253,644,706
IssuesEvent
2021-02-20 08:38:14
gsylvie/madness
https://api.github.com/repos/gsylvie/madness
closed
CVE-2017-3523 (High) detected in mysql-connector-java-5.1.6.jar - autoclosed
security vulnerability
## CVE-2017-3523 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mysql-connector-java-5.1.6.jar</b></p></summary> <p>MySQL JDBC Type 4 driver</p> <p>Library home page: <a href="http://dev.mysql.com/doc/connector-j/en/">http://dev.mysql.com/doc/connector-j/en/</a></p> <p>Path to vulnerable library: madness/sub1/target/madness-sub1-2019.02.01/WEB-INF/lib/mysql-connector-java-5.1.6.jar,canner/.m2/repository/mysql/mysql-connector-java/5.1.6/mysql-connector-java-5.1.6.jar</p> <p> Dependency Hierarchy: - :x: **mysql-connector-java-5.1.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/gsylvie/madness/commit/032e0bc50a6a45a60e9aed1a5aae9530ad02548a">032e0bc50a6a45a60e9aed1a5aae9530ad02548a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Vulnerability in the MySQL Connectors component of Oracle MySQL (subcomponent: Connector/J). Supported versions that are affected are 5.1.40 and earlier. Difficult to exploit vulnerability allows low privileged attacker with network access via multiple protocols to compromise MySQL Connectors. While the vulnerability is in MySQL Connectors, attacks may significantly impact additional products. Successful attacks of this vulnerability can result in takeover of MySQL Connectors. CVSS 3.0 Base Score 8.5 (Confidentiality, Integrity and Availability impacts). CVSS Vector: (CVSS:3.0/AV:N/AC:H/PR:L/UI:N/S:C/C:H/I:H/A:H). <p>Publish Date: 2017-04-24 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-3523>CVE-2017-3523</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.oracle.com/technetwork/security-advisory/cpuapr2017-3236618.html">https://www.oracle.com/technetwork/security-advisory/cpuapr2017-3236618.html</a></p> <p>Release Date: 2017-04-24</p> <p>Fix Resolution: 5.1.41</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2017-3523 (High) detected in mysql-connector-java-5.1.6.jar - autoclosed - ## CVE-2017-3523 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mysql-connector-java-5.1.6.jar</b></p></summary> <p>MySQL JDBC Type 4 driver</p> <p>Library home page: <a href="http://dev.mysql.com/doc/connector-j/en/">http://dev.mysql.com/doc/connector-j/en/</a></p> <p>Path to vulnerable library: madness/sub1/target/madness-sub1-2019.02.01/WEB-INF/lib/mysql-connector-java-5.1.6.jar,canner/.m2/repository/mysql/mysql-connector-java/5.1.6/mysql-connector-java-5.1.6.jar</p> <p> Dependency Hierarchy: - :x: **mysql-connector-java-5.1.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/gsylvie/madness/commit/032e0bc50a6a45a60e9aed1a5aae9530ad02548a">032e0bc50a6a45a60e9aed1a5aae9530ad02548a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Vulnerability in the MySQL Connectors component of Oracle MySQL (subcomponent: Connector/J). Supported versions that are affected are 5.1.40 and earlier. Difficult to exploit vulnerability allows low privileged attacker with network access via multiple protocols to compromise MySQL Connectors. While the vulnerability is in MySQL Connectors, attacks may significantly impact additional products. Successful attacks of this vulnerability can result in takeover of MySQL Connectors. CVSS 3.0 Base Score 8.5 (Confidentiality, Integrity and Availability impacts). CVSS Vector: (CVSS:3.0/AV:N/AC:H/PR:L/UI:N/S:C/C:H/I:H/A:H). <p>Publish Date: 2017-04-24 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-3523>CVE-2017-3523</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.oracle.com/technetwork/security-advisory/cpuapr2017-3236618.html">https://www.oracle.com/technetwork/security-advisory/cpuapr2017-3236618.html</a></p> <p>Release Date: 2017-04-24</p> <p>Fix Resolution: 5.1.41</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_usab
cve high detected in mysql connector java jar autoclosed cve high severity vulnerability vulnerable library mysql connector java jar mysql jdbc type driver library home page a href path to vulnerable library madness target madness web inf lib mysql connector java jar canner repository mysql mysql connector java mysql connector java jar dependency hierarchy x mysql connector java jar vulnerable library found in head commit a href vulnerability details vulnerability in the mysql connectors component of oracle mysql subcomponent connector j supported versions that are affected are and earlier difficult to exploit vulnerability allows low privileged attacker with network access via multiple protocols to compromise mysql connectors while the vulnerability is in mysql connectors attacks may significantly impact additional products successful attacks of this vulnerability can result in takeover of mysql connectors cvss base score confidentiality integrity and availability impacts cvss vector cvss av n ac h pr l ui n s c c h i h a h publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
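The first two records illustrate the two classes: label usab pairs with binary_label 1 and non_usab with 0. A sketch of splitting the data by class (again using a placeholder dataset id) could look like this:

```python
from datasets import load_dataset

# Placeholder dataset id; substitute the real Hub id of this dataset.
ds = load_dataset("your-namespace/github-issues-usability", split="train")

# "usab" rows carry binary_label 1, "non_usab" rows carry binary_label 0.
usability = ds.filter(lambda row: row["binary_label"] == 1)
non_usability = ds.filter(lambda row: row["binary_label"] == 0)
print(len(usability), "usability issues,", len(non_usability), "non-usability issues")
```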
13,180
8,343,120,821
IssuesEvent
2018-09-30 00:09:03
ekaterinailin/AltaiPony
https://api.github.com/repos/ekaterinailin/AltaiPony
opened
Rewrite the structure description in a short and accessible way for new users.
usability
<!-- Fill in the information below before opening an issue. --> #### What needs to be created or improved? <!-- Provide a clear and concise description of the issue. --> #### Can you provide an example? <!-- Provide a link or minimal code snippet that demonstrates the issue. --> - top level modules - FlareLightCurve, _wrapper_ places - core functions - flare finding, injection/recovery, analysis - helper modules - IO, MAST caller #### What is the goal / expected behaviour? <!-- Describe the behavior you expected and how it differs from the behavior observed in the example. --> Regardless of verb-y and descriptive module names the code structure can remain opaque to users or my future self. To prevent this, note your current idea that the code's structure represents.
True
Rewrite the structure description in a short and accessible way for new users. - <!-- Fill in the information below before opening an issue. --> #### What needs to be created or improved? <!-- Provide a clear and concise description of the issue. --> #### Can you provide an example? <!-- Provide a link or minimal code snippet that demonstrates the issue. --> - top level modules - FlareLightCurve, _wrapper_ places - core functions - flare finding, injection/recovery, analysis - helper modules - IO, MAST caller #### What is the goal / expected behaviour? <!-- Describe the behavior you expected and how it differs from the behavior observed in the example. --> Regardless of verb-y and descriptive module names the code structure can remain opaque to users or my future self. To prevent this, note your current idea that the code's structure represents.
usab
rewrite the structure description in a short and accessible way for new users what needs to be created or improved can you provide an example top level modules flarelightcurve wrapper places core functions flare finding injection recovery analysis helper modules io mast caller what is the goal expected behaviour regardless of verb y and descriptive module names the code structure can remain opaque to users or my future self to prevent this note your current idea that the code s structure represents
1
8,373
5,636,992,025
IssuesEvent
2017-04-06 07:55:13
zaproxy/zaproxy
https://api.github.com/repos/zaproxy/zaproxy
closed
Enhancement: Additional Global Exclude default patterns
enhancement Usability
Food for thought ... re: <sub>https://github.com/zaproxy/zaproxy/blob/efce236ddf6f89516de6cd0592e22372149e0e05/src/org/zaproxy/zap/extension/globalexcludeurl/GlobalExcludeURLParam.java#L63</sub> - [x] `^https?://detectportal\.firefox\.com.*$` Firefox mechanism for detecting captive portals. - [x] `^https?://www\.google-analytics\.com.*$` Google analytics....nuff said. - [x] `^https?://ciscobinary\.openh264\.org.*$` Firefox codec download (https://support.mozilla.org/t5/Firefox/Where-is-a-check-that-http-ciscobinary-openh264-org-openh264-is/m-p/1316497#M1005892) - [x] `^https?://fonts.*$` This might be too generic, I built it broadly after seeing fonts.googleapis.com and fonts.gstatic.com used...
True
Enhancement: Additional Global Exclude default patterns - Food for thought ... re: <sub>https://github.com/zaproxy/zaproxy/blob/efce236ddf6f89516de6cd0592e22372149e0e05/src/org/zaproxy/zap/extension/globalexcludeurl/GlobalExcludeURLParam.java#L63</sub> - [x] `^https?://detectportal\.firefox\.com.*$` Firefox mechanism for detecting captive portals. - [x] `^https?://www\.google-analytics\.com.*$` Google analytics....nuff said. - [x] `^https?://ciscobinary\.openh264\.org.*$` Firefox codec download (https://support.mozilla.org/t5/Firefox/Where-is-a-check-that-http-ciscobinary-openh264-org-openh264-is/m-p/1316497#M1005892) - [x] `^https?://fonts.*$` This might be too generic, I built it broadly after seeing fonts.googleapis.com and fonts.gstatic.com used...
usab
enhancement additional global exclude default patterns food for thought re https detectportal firefox com firefox mechanism for detecting captive portals https www google analytics com google analytics nuff said https ciscobinary org firefox codec download https fonts this might be too generic i built it broadly after seeing fonts googleapis com and fonts gstatic com used
1
209,536
7,176,995,979
IssuesEvent
2018-01-31 12:02:39
fusetools/fuselibs-public
https://api.github.com/repos/fusetools/fuselibs-public
closed
Attempted to detach child that is not attached to us
Priority: High Severity: Bug
In my [Dice app](https://github.com/mortoray/dice) I can predictably get the below error with these steps: 1. Add New Set Twice (button at bottom) 2. Delete Second set (swipe left, press Dleete) 3. Delete another set ``` Error: Fuse.Scripting.ScriptException: Uncaught Error: Internal error: Attempted to detach child that is not attached to us Name: Error: Internal error: Attempted to detach child that is not attached to us File name: FuseJS/Internal/zone.js Line number: 196 Script stack trace: Error: Internal error: Attempted to detach child that is not attached to us at removeAsParentFrom (FuseJS/Internal/Model.js:320:12) at removeRange (FuseJS/Internal/Model.js:441:5) at Object.meta.diff (FuseJS/Internal/Model.js:246:7) at update (FuseJS/Internal/Model.js:392:15) at Object.meta.diff (FuseJS/Internal/Model.js:266:6) at FuseJS/Internal/Model.js:212:11 at ZoneDelegate.invokeTask (FuseJS/Internal/zone.js:425:31) at Zone.runTask (FuseJS/Internal/zone.js:192:47) at ZoneTask.invokeTask (FuseJS/Internal/zone.js:499:34) at ZoneTask.invoke (FuseJS/Internal/zone.js:488:48) ``` I'll see if I can find a smaller reproduction. High priority because it's a common use-case involving only basic behavior.
1.0
Attempted to detach child that is not attached to us - In my [Dice app](https://github.com/mortoray/dice) I can predictably get the below error with these steps: 1. Add New Set Twice (button at bottom) 2. Delete Second set (swipe left, press Dleete) 3. Delete another set ``` Error: Fuse.Scripting.ScriptException: Uncaught Error: Internal error: Attempted to detach child that is not attached to us Name: Error: Internal error: Attempted to detach child that is not attached to us File name: FuseJS/Internal/zone.js Line number: 196 Script stack trace: Error: Internal error: Attempted to detach child that is not attached to us at removeAsParentFrom (FuseJS/Internal/Model.js:320:12) at removeRange (FuseJS/Internal/Model.js:441:5) at Object.meta.diff (FuseJS/Internal/Model.js:246:7) at update (FuseJS/Internal/Model.js:392:15) at Object.meta.diff (FuseJS/Internal/Model.js:266:6) at FuseJS/Internal/Model.js:212:11 at ZoneDelegate.invokeTask (FuseJS/Internal/zone.js:425:31) at Zone.runTask (FuseJS/Internal/zone.js:192:47) at ZoneTask.invokeTask (FuseJS/Internal/zone.js:499:34) at ZoneTask.invoke (FuseJS/Internal/zone.js:488:48) ``` I'll see if I can find a smaller reproduction. High priority because it's a common use-case involving only basic behavior.
non_usab
attempted to detach child that is not attached to us in my i can predictably get the below error with these steps add new set twice button at bottom delete second set swipe left press dleete delete another set error fuse scripting scriptexception uncaught error internal error attempted to detach child that is not attached to us name error internal error attempted to detach child that is not attached to us file name fusejs internal zone js line number script stack trace error internal error attempted to detach child that is not attached to us at removeasparentfrom fusejs internal model js at removerange fusejs internal model js at object meta diff fusejs internal model js at update fusejs internal model js at object meta diff fusejs internal model js at fusejs internal model js at zonedelegate invoketask fusejs internal zone js at zone runtask fusejs internal zone js at zonetask invoketask fusejs internal zone js at zonetask invoke fusejs internal zone js i ll see if i can find a smaller reproduction high priority because it s a common use case involving only basic behavior
0
401,142
11,786,268,547
IssuesEvent
2020-03-17 11:56:36
vector-im/riot-web
https://api.github.com/repos/vector-im/riot-web
closed
Hide video and phone call buttons in composer when config option set
release-blocker type:voip 🔔 Priority 🔔
No changes by default. We would support a new config.json option that would hide the video and phone call buttons for situations where it is more confusing than helpful.
1.0
Hide video and phone call buttons in composer when config option set - No changes by default. We would support a new config.json option that would hide the video and phone call buttons for situations where it is more confusing than helpful.
non_usab
hide video and phone call buttons in composer when config option set no changes by default we would support a new config json option that would hide the video and phone call buttons for situations where it is more confusing than helpful
0
234,257
25,810,786,865
IssuesEvent
2022-12-11 20:48:11
argoproj/argo-cd
https://api.github.com/repos/argoproj/argo-cd
closed
"Deny sources" for Projects
enhancement security
# Summary Now that we have "deny destinations" on Projects (#9652), it would be natural for this to be extended to application sources as well. # Motivation We have a use case where we have a quite restricted `AppProject`, which only has a restricted number of repos and destinations which it is allowed to use. The dual of this is that we have a more liberal `AppProject` where we now use deny destinations, but it'd also be nice to add restrictions on which repos are _not_ allowed to be used. # Proposal In order to be consistent with the deny destinations feature, we'd also allow sources to be prefixed with `!`, which would negate the source value if it has been matched. The implementation would pretty much look like this: ```go func (proj AppProject) IsSourcePermitted(src ApplicationSource) bool { srcNormalized := git.NormalizeGitURL(src.RepoURL) var normalized string anySourceMatched := false noDenySourcesMatched := true for _, repoURL := range proj.Spec.SourceRepos { if isDenyPattern(repoURL) { normalized = "!" + git.NormalizeGitURL(strings.TrimPrefix("!", repoURL)) } else { normalized = git.NormalizeGitURL(repoURL) } matched := globMatch(normalized, srcNormalized, true, '/') if matched { anySourceMatched = true } else if !matched && isDenyPattern(normalized) { noDenySourcesMatched = false } } return anySourceMatched && noDenySourcesMatched } ``` As with deny destinations we would check that if there are any sources that are matched _and_ check that there are no deny sources matched.
True
"Deny sources" for Projects - # Summary Now that we have "deny destinations" on Projects (#9652), it would be natural for this to be extended to application sources as well. # Motivation We have a use case where we have a quite restricted `AppProject`, which only has a restricted number of repos and destinations which it is allowed to use. The dual of this is that we have a more liberal `AppProject` where we now use deny destinations, but it'd also be nice to add restrictions on which repos are _not_ allowed to be used. # Proposal In order to be consistent with the deny destinations feature, we'd also allow sources to be prefixed with `!`, which would negate the source value if it has been matched. The implementation would pretty much look like this: ```go func (proj AppProject) IsSourcePermitted(src ApplicationSource) bool { srcNormalized := git.NormalizeGitURL(src.RepoURL) var normalized string anySourceMatched := false noDenySourcesMatched := true for _, repoURL := range proj.Spec.SourceRepos { if isDenyPattern(repoURL) { normalized = "!" + git.NormalizeGitURL(strings.TrimPrefix("!", repoURL)) } else { normalized = git.NormalizeGitURL(repoURL) } matched := globMatch(normalized, srcNormalized, true, '/') if matched { anySourceMatched = true } else if !matched && isDenyPattern(normalized) { noDenySourcesMatched = false } } return anySourceMatched && noDenySourcesMatched } ``` As with deny destinations we would check that if there are any sources that are matched _and_ check that there are no deny sources matched.
non_usab
deny sources for projects summary now that we have deny destinations on projects it would be natural for this to be extended to application sources as well motivation we have a use case where we have a quite restricted appproject which only has a restricted number of repos and destinations which it is allowed to use the dual of this is that we have a more liberal appproject where we now use deny destinations but it d also be nice to add restrictions on which repos are not allowed to be used proposal in order to be consistent with the deny destinations feature we d also allow sources to be prefixed with which would negate the source value if it has been matched the implementation would pretty much look like this go func proj appproject issourcepermitted src applicationsource bool srcnormalized git normalizegiturl src repourl var normalized string anysourcematched false nodenysourcesmatched true for repourl range proj spec sourcerepos if isdenypattern repourl normalized git normalizegiturl strings trimprefix repourl else normalized git normalizegiturl repourl matched globmatch normalized srcnormalized true if matched anysourcematched true else if matched isdenypattern normalized nodenysourcesmatched false return anysourcematched nodenysourcesmatched as with deny destinations we would check that if there are any sources that are matched and check that there are no deny sources matched
0
10,682
6,875,259,750
IssuesEvent
2017-11-19 11:49:53
webanno/webanno
https://api.github.com/repos/webanno/webanno
closed
Bulk import leads to bulk messages
enhancement usability
Bulk import leads to bulk messages. these do not go away when changing tab and hide everything after. ![image](https://user-images.githubusercontent.com/1056051/29684495-770b43c4-8912-11e7-93a6-bdf1428db760.png) Browser: Chrome Versions: 3.3.x and SNAPSHOT Tested by: @ChrisBieTUDA
True
Bulk import leads to bulk messages - Bulk import leads to bulk messages. these do not go away when changing tab and hide everything after. ![image](https://user-images.githubusercontent.com/1056051/29684495-770b43c4-8912-11e7-93a6-bdf1428db760.png) Browser: Chrome Versions: 3.3.x and SNAPSHOT Tested by: @ChrisBieTUDA
usab
bulk import leads to bulk messages bulk import leads to bulk messages these do not go away when changing tab and hide everything after browser chrome versions x and snapshot tested by chrisbietuda
1
673,939
23,034,492,892
IssuesEvent
2022-07-22 17:06:30
apexcharts/apexcharts.js
https://api.github.com/repos/apexcharts/apexcharts.js
closed
column width bug when colors are functions
bug high-priority
# Bug report ## Codepen https://codepen.io/RomRider/pen/NWbNbdO ## Explanation - What is the behavior you expect? When using color functions to determine the color of a column vs using a plain color string, only the color should change. In the codepen: --> top chart: using color functions, width is weird --> bottom chart: using plain color, width is normal ![image](https://user-images.githubusercontent.com/21064206/107122013-b91c7d80-6895-11eb-8cb5-997dc2b9c665.png) - What is happening instead? When using plain color, the columns widths are fine, when using color functions, the width of each bar gets reduced - What error message are you getting? None
1.0
column width bug when colors are functions - # Bug report ## Codepen https://codepen.io/RomRider/pen/NWbNbdO ## Explanation - What is the behavior you expect? When using color functions to determine the color of a column vs using a plain color string, only the color should change. In the codepen: --> top chart: using color functions, width is weird --> bottom chart: using plain color, width is normal ![image](https://user-images.githubusercontent.com/21064206/107122013-b91c7d80-6895-11eb-8cb5-997dc2b9c665.png) - What is happening instead? When using plain color, the columns widths are fine, when using color functions, the width of each bar gets reduced - What error message are you getting? None
non_usab
column width bug when colors are functions bug report codepen explanation what is the behavior you expect when using color functions to determine the color of a column vs using a plain color string only the color should change in the codepen top chart using color functions width is weird bottom chart using plain color width is normal what is happening instead when using plain color the columns widths are fine when using color functions the width of each bar gets reduced what error message are you getting none
0
143,112
19,142,917,626
IssuesEvent
2021-12-02 02:20:00
arohablue/skill-india-backend
https://api.github.com/repos/arohablue/skill-india-backend
opened
CVE-2020-10969 (High) detected in jackson-databind-2.9.8.jar
security vulnerability
## CVE-2020-10969 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /skill-india-backend/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.1.2.RELEASE.jar (Root Library) - spring-boot-starter-json-2.1.2.RELEASE.jar - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to javax.swing.JEditorPane. <p>Publish Date: 2020-03-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10969>CVE-2020-10969</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969</a></p> <p>Release Date: 2020-03-26</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.8.11.6;com.fasterxml.jackson.core:jackson-databind:2.7.9.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-10969 (High) detected in jackson-databind-2.9.8.jar - ## CVE-2020-10969 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /skill-india-backend/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.1.2.RELEASE.jar (Root Library) - spring-boot-starter-json-2.1.2.RELEASE.jar - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to javax.swing.JEditorPane. <p>Publish Date: 2020-03-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10969>CVE-2020-10969</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969</a></p> <p>Release Date: 2020-03-26</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.8.11.6;com.fasterxml.jackson.core:jackson-databind:2.7.9.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_usab
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file skill india backend pom xml path to vulnerable library root repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring boot starter web release jar root library spring boot starter json release jar x jackson databind jar vulnerable library vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to javax swing jeditorpane publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
12,058
7,661,700,051
IssuesEvent
2018-05-11 15:01:31
rhoneyager/libicedb
https://api.github.com/repos/rhoneyager/libicedb
opened
Negative dipole coordinates
Usability
At the moment the code only accepts positive integers as dipole coordinates (uint64_t). When parsing shapefiles if a negative coordinate -x is encountered this is automatically transformed to x I have a lot of shapefiles centered around the origin which then includes negative coordinates. Are there good reasons to not assume signed integers as dipole coordinates?
True
Negative dipole coordinates - At the moment the code only accepts positive integers as dipole coordinates (uint64_t). When parsing shapefiles if a negative coordinate -x is encountered this is automatically transformed to x I have a lot of shapefiles centered around the origin which then includes negative coordinates. Are there good reasons to not assume signed integers as dipole coordinates?
usab
negative dipole coordinates at the moment the code only accepts positive integers as dipole coordinates t when parsing shapefiles if a negative coordinate x is encountered this is automatically transformed to x i have a lot of shapefiles centered around the origin which then includes negative coordinates are there good reasons to not assume signed integers as dipole coordinates
1
25,138
24,782,501,835
IssuesEvent
2022-10-24 06:58:26
eclipse/dirigible
https://api.github.com/repos/eclipse/dirigible
opened
[Templates] Integrate Email Templates
enhancement API usability component-template edm component-resources
Integrate enhanced email templates like Sendwithus in Dirigible: - https://github.com/sendwithus/templates Applicable for use cases of BPMN, EDM, API level or in samples: - https://www.dirigible.io/api/documents/pdf ```javascript const pdfDocuments = require("documents/v4/pdf"); let data = { title: "Lorem Ipsum", description: "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus lacinia fermentum magna, sit amet accumsan felis auctor ac.", columns: [{ name: "Id", key: "id" }, { name: "First Name", key: "firstName", }, { name: "Last Name", key: "lastName" }, { name: "Age", key: "age" }], rows: [{ id: 1001, firstName: "John", lastName: "Doe", age: 29 }, { id: 1002, firstName: "Jane", lastName: "Doe", age: 26 }] }; let pdf = pdfDocuments.generateTable(data); ```
True
[Templates] Integrate Email Templates - Integrate enhanced email templates like Sendwithus in Dirigible: - https://github.com/sendwithus/templates Applicable for use cases of BPMN, EDM, API level or in samples: - https://www.dirigible.io/api/documents/pdf ```javascript const pdfDocuments = require("documents/v4/pdf"); let data = { title: "Lorem Ipsum", description: "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus lacinia fermentum magna, sit amet accumsan felis auctor ac.", columns: [{ name: "Id", key: "id" }, { name: "First Name", key: "firstName", }, { name: "Last Name", key: "lastName" }, { name: "Age", key: "age" }], rows: [{ id: 1001, firstName: "John", lastName: "Doe", age: 29 }, { id: 1002, firstName: "Jane", lastName: "Doe", age: 26 }] }; let pdf = pdfDocuments.generateTable(data); ```
usab
integrate email templates integrate enhanced email templates like sendwithus in dirigible applicable for use cases of bpmn edm api level or in samples javascript const pdfdocuments require documents pdf let data title lorem ipsum description lorem ipsum dolor sit amet consectetur adipiscing elit vivamus lacinia fermentum magna sit amet accumsan felis auctor ac columns name id key id name first name key firstname name last name key lastname name age key age rows id firstname john lastname doe age id firstname jane lastname doe age let pdf pdfdocuments generatetable data
1
20,240
15,172,241,381
IssuesEvent
2021-02-13 08:00:27
microsoft/win32metadata
https://api.github.com/repos/microsoft/win32metadata
closed
Typedefs appear to be lost in the translation
usability
Hey all, I was looking at the Win32 metadata in ILSpy, and the first thing that jumped at me is that many typedefs are missing from the new representation. To give just one example among many others: `MsiSetPropertyW`' is defined in the headers as: ```c UINT WINAPI MsiSetPropertyW(MSIHANDLE hInstall, LPCWSTR szName, // property identifier, case-sensitive LPCWSTR szValue); // property value, null to undefine property ``` But in the WinMD file, it is defined as: ```cs [DllImport("msi", ExactSpelling = true)] public unsafe static extern uint MsiSetPropertyW(uint hInstall, [Const][NativeTypeInfo(UnmanagedType.LPWStr, IsNullTerminated = true)] ushort* szName, [Const][NativeTypeInfo(UnmanagedType.LPWStr, IsNullTerminated = true)] ushort* szValue); ``` Notice that hInstall is defined as simply being an `uint`. The typedefs here gives some nice information about the provenance of the handle, and could in theory be used to generate wrappers that wrap the uint into a struct to provide RAII semantics for instance.
True
Typedefs appear to be lost in the translation - Hey all, I was looking at the Win32 metadata in ILSpy, and the first thing that jumped at me is that many typedefs are missing from the new representation. To give just one example among many others: `MsiSetPropertyW`' is defined in the headers as: ```c UINT WINAPI MsiSetPropertyW(MSIHANDLE hInstall, LPCWSTR szName, // property identifier, case-sensitive LPCWSTR szValue); // property value, null to undefine property ``` But in the WinMD file, it is defined as: ```cs [DllImport("msi", ExactSpelling = true)] public unsafe static extern uint MsiSetPropertyW(uint hInstall, [Const][NativeTypeInfo(UnmanagedType.LPWStr, IsNullTerminated = true)] ushort* szName, [Const][NativeTypeInfo(UnmanagedType.LPWStr, IsNullTerminated = true)] ushort* szValue); ``` Notice that hInstall is defined as simply being an `uint`. The typedefs here gives some nice information about the provenance of the handle, and could in theory be used to generate wrappers that wrap the uint into a struct to provide RAII semantics for instance.
usab
typedefs appear to be lost in the translation hey all i was looking at the metadata in ilspy and the first thing that jumped at me is that many typedefs are missing from the new representation to give just one example among many others msisetpropertyw is defined in the headers as c uint winapi msisetpropertyw msihandle hinstall lpcwstr szname property identifier case sensitive lpcwstr szvalue property value null to undefine property but in the winmd file it is defined as cs public unsafe static extern uint msisetpropertyw uint hinstall ushort szname ushort szvalue notice that hinstall is defined as simply being an uint the typedefs here gives some nice information about the provenance of the handle and could in theory be used to generate wrappers that wrap the uint into a struct to provide raii semantics for instance
1
3,465
3,462,118,217
IssuesEvent
2015-12-20 17:23:56
tgstation/-tg-station
https://api.github.com/repos/tgstation/-tg-station
closed
Ventcrawling through pumps makes it impossible to see the atmos pipes you can move through
Bug Usability
Repro: - Play as drone (or any other ventcrawling creature - Head for atmos and crawl through any of the pumps - Be unable to see where you can go, unless you enter a vent again
True
Ventcrawling through pumps makes it impossible to see the atmos pipes you can move through - Repro: - Play as drone (or any other ventcrawling creature - Head for atmos and crawl through any of the pumps - Be unable to see where you can go, unless you enter a vent again
usab
ventcrawling through pumps makes it impossible to see the atmos pipes you can move through repro play as drone or any other ventcrawling creature head for atmos and crawl through any of the pumps be unable to see where you can go unless you enter a vent again
1
2,483
3,079,347,700
IssuesEvent
2015-08-21 15:46:01
piwik/piwik
https://api.github.com/repos/piwik/piwik
closed
core:archive output is sometimes too verbose eg. "0 visits in last last260 weeks,"
c: Usability
Our goal is to make Piwik the easiest possible to use and manage. This includes our console commands which should output messages that make sense to everyone. Recently we've made several improvements to the `core:archive` command messages in #7723 #7536 #8214 and in this issue we'll make the output a little better. Currently it looks like this: ``` INFO CoreConsole[2014-12-15 23:18:29] [9369d] Archived website id = 14, period = week, 0 visits in last last260 weeks, 0 visits this week, Time elapsed: 15.551s INFO CoreConsole[2014-12-15 23:18:35] [9369d] Archived website id = 14, period = month, 0 visits in last last52 months, ``` I suggest we simply remove the part ` visits in last last260 weeks, ` / `0 visits in last last52 months,` as it is not useful, and is confusing to some users. This small change was requested by Piwik PRO team.
True
core:archive output is sometimes too verbose eg. "0 visits in last last260 weeks," - Our goal is to make Piwik the easiest possible to use and manage. This includes our console commands which should output messages that make sense to everyone. Recently we've made several improvements to the `core:archive` command messages in #7723 #7536 #8214 and in this issue we'll make the output a little better. Currently it looks like this: ``` INFO CoreConsole[2014-12-15 23:18:29] [9369d] Archived website id = 14, period = week, 0 visits in last last260 weeks, 0 visits this week, Time elapsed: 15.551s INFO CoreConsole[2014-12-15 23:18:35] [9369d] Archived website id = 14, period = month, 0 visits in last last52 months, ``` I suggest we simply remove the part ` visits in last last260 weeks, ` / `0 visits in last last52 months,` as it is not useful, and is confusing to some users. This small change was requested by Piwik PRO team.
usab
core archive output is sometimes too verbose eg visits in last weeks our goal is to make piwik the easiest possible to use and manage this includes our console commands which should output messages that make sense to everyone recently we ve made several improvements to the core archive command messages in and in this issue we ll make the output a little better currently it looks like this info coreconsole archived website id period week visits in last weeks visits this week time elapsed info coreconsole archived website id period month visits in last months i suggest we simply remove the part visits in last weeks visits in last months as it is not useful and is confusing to some users this small change was requested by piwik pro team
1
1,495
2,862,962,623
IssuesEvent
2015-06-04 09:08:52
MISP/MISP
https://api.github.com/repos/MISP/MISP
closed
Freetext import tool - deduplicate parsed IOCs
enhancement usability
When importing a long list of IOCs, or copy/pasting text from a report, sometimes the same IOCs is presented at different places. It would be really nice if the freetext import tool could deduplicate the list of parsed imported results before presenting it to the user for editing before creating attributes in the event.
True
Freetext import tool - deduplicate parsed IOCs - When importing a long list of IOCs, or copy/pasting text from a report, sometimes the same IOCs is presented at different places. It would be really nice if the freetext import tool could deduplicate the list of parsed imported results before presenting it to the user for editing before creating attributes in the event.
usab
freetext import tool deduplicate parsed iocs when importing a long list of iocs or copy pasting text from a report sometimes the same iocs is presented at different places it would be really nice if the freetext import tool could deduplicate the list of parsed imported results before presenting it to the user for editing before creating attributes in the event
1
24,187
23,457,365,801
IssuesEvent
2022-08-16 10:02:34
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
opened
Synced blocks: Define them as such in the Inspector
[Feature] Reusable Blocks [Block] Template Part
Related to https://github.com/WordPress/gutenberg/issues/42482 and https://github.com/WordPress/gutenberg/issues/32163. When you select a template part or reusable block, it might be nice to clarify their global / synced nature – and perhaps outline usage – in the Inspector.
True
Synced blocks: Define them as such in the Inspector - Related to https://github.com/WordPress/gutenberg/issues/42482 and https://github.com/WordPress/gutenberg/issues/32163. When you select a template part or reusable block, it might be nice to clarify their global / synced nature – and perhaps outline usage – in the Inspector.
usab
synced blocks define them as such in the inspector related to and when you select a template part or reusable block it might be nice to clarify their global synced nature – and perhaps outline usage – in the inspector
1
21,468
17,112,367,925
IssuesEvent
2021-07-10 15:43:07
lutzhamel/asteroid
https://api.github.com/repos/lutzhamel/asteroid
opened
File I/O
enhancement usability
Currently, Asteroid lacks any file input/output functionality. Asteroids base IO module, io.ast, only provides methods for writing to stdout, reading from stdin, and reading/writing from the console.
True
File I/O - Currently, Asteroid lacks any file input/output functionality. Asteroids base IO module, io.ast, only provides methods for writing to stdout, reading from stdin, and reading/writing from the console.
usab
file i o currently asteroid lacks any file input output functionality asteroids base io module io ast only provides methods for writing to stdout reading from stdin and reading writing from the console
1
27,553
29,510,549,690
IssuesEvent
2023-06-03 21:53:05
tailscale/tailscale
https://api.github.com/repos/tailscale/tailscale
closed
Tailscale hangs indefinitely on misbehaving DNS
L1 Very few P2 Aggravating T5 Usability bug
### What is the issue? Sometimes when I try to go log into tailscale, I get this weird behaviour where running `sudo tailscale up` freezes. Seems like the login flow goes ok until `control: creating new noise client` comes up and then fails with `Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded` See logs below ```Oct 28 13:51:27 carbon tailscaled[7554]: ipnserver: conn2: connection from userid 0; root has access Oct 28 13:51:27 carbon tailscaled[7554]: Start Oct 28 13:51:27 carbon tailscaled[7554]: control: client.Shutdown() Oct 28 13:51:27 carbon tailscaled[7554]: control: client.Shutdown: inSendStatus=0 Oct 28 13:51:27 carbon tailscaled[7554]: control: mapRoutine: quit Oct 28 13:51:27 carbon tailscaled[7554]: control: Client.Shutdown done. Oct 28 13:51:27 carbon tailscaled[7554]: using backend prefs for "_daemon": Prefs{ra=false dns=true want=true routes=[] nf=on Persist=nil} Oct 28 13:51:27 carbon tailscaled[7554]: Backend: logs: be:495a217121ba33adf5c90d17c7138fb21263e9d8d9900b113a00357307310882 fe: Oct 28 13:51:27 carbon tailscaled[7554]: Switching ipn state NoState -> NeedsLogin (WantRunning=true, nm=false) Oct 28 13:51:27 carbon tailscaled[7554]: blockEngineUpdates(true) Oct 28 13:51:27 carbon tailscaled[7554]: Reconfig(down): no changes made to Engine config Oct 28 13:51:27 carbon tailscaled[7554]: StartLoginInteractive: url=false Oct 28 13:51:27 carbon tailscaled[7554]: control: client.Login(false, 2) Oct 28 13:51:27 carbon tailscaled[7554]: control: LoginInteractive -> regen=true Oct 28 13:51:27 carbon tailscaled[7554]: control: doLogin(regen=true, hasUrl=false) Oct 28 13:51:37 carbon tailscaled[7554]: trying bootstrapDNS("derp12c.tailscale.com", "149.28.119.105") for "controlplane.tailscale.com" ... Oct 28 13:51:37 carbon tailscaled[7554]: bootstrapDNS("derp12c.tailscale.com", "149.28.119.105") for "controlplane.tailscale.com" = [2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:203:4535:c15c:9ab:8258 2a05:d014:386:202:6dd4:fc68:35ae:2b80 18.193.143.177 18.156.90.224 18.193.255.254 3.121.104.141 52.29.8.43 18.157.173.201] Oct 28 13:51:37 carbon tailscaled[7554]: control: control server key from https://controlplane.tailscale.com: ts2021=[fSeS+], legacy=[nlFWp] Oct 28 13:51:37 carbon tailscaled[7554]: control: Generating a new nodekey. Oct 28 13:51:37 carbon tailscaled[7554]: control: RegisterReq: onode= node=[at8y+] fup=false Oct 28 13:51:37 carbon tailscaled[7554]: control: creating new noise client Oct 28 13:51:47 carbon tailscaled[7554]: Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded Oct 28 13:51:47 carbon tailscaled[7554]: trying bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" ... Oct 28 13:51:47 carbon tailscaled[7554]: trying bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" ... Oct 28 13:51:47 carbon tailscaled[7554]: control: LoginInteractive -> regen=true Oct 28 13:51:47 carbon tailscaled[7554]: control: doLogin(regen=true, hasUrl=false) Oct 28 13:51:47 carbon tailscaled[7554]: control: Generating a new nodekey. 
Oct 28 13:51:47 carbon tailscaled[7554]: control: RegisterReq: onode= node=[v3v4A] fup=false Oct 28 13:51:48 carbon tailscaled[7554]: bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" = [2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:203:4535:c15c:9ab:8258 18.156.90.224 18.193.255.254 52.29.8.43 3.121.104.141 18.157.173.201 18.193.143.177] Oct 28 13:51:48 carbon tailscaled[7554]: bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" = [2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:203:4535:c15c:9ab:8258 18.156.90.224 18.193.255.254 52.29.8.43 3.121.104.141 18.157.173.201 18.193.143.177] Oct 28 13:51:57 carbon tailscaled[7554]: Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded Oct 28 13:51:57 carbon tailscaled[7554]: trying bootstrapDNS("derp11.tailscale.com", "18.230.97.74") for "controlplane.tailscale.com" ... Oct 28 13:51:57 carbon tailscaled[7554]: trying bootstrapDNS("derp4e.tailscale.com", "134.122.74.153") for "controlplane.tailscale.com" ... Oct 28 13:51:57 carbon tailscaled[7554]: control: LoginInteractive -> regen=true Oct 28 13:51:57 carbon tailscaled[7554]: control: doLogin(regen=true, hasUrl=false) Oct 28 13:51:57 carbon tailscaled[7554]: control: Generating a new nodekey. Oct 28 13:51:57 carbon tailscaled[7554]: control: RegisterReq: onode= node=[p2yph] fup=false Oct 28 13:51:57 carbon tailscaled[7554]: bootstrapDNS("derp4e.tailscale.com", "134.122.74.153") for "controlplane.tailscale.com" = [2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:203:4535:c15c:9ab:8258 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 18.193.255.254 18.157.173.201 52.29.8.43 3.121.104.141 18.156.90.224 18.193.143.177] Oct 28 13:51:58 carbon tailscaled[7554]: bootstrapDNS("derp11.tailscale.com", "18.230.97.74") for "controlplane.tailscale.com" = [2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:203:4535:c15c:9ab:8258 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 18.193.255.254 18.156.90.224 3.121.104.141 18.193.143.177 18.157.173.201 52.29.8.43] Oct 28 13:52:07 carbon tailscaled[7554]: Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded ``` ### Steps to reproduce Run `tailscale down` or `tailscale logout` and try to login again I have tried resetting the tailscale client and reinstalling the package. I have also cleared `/var/lib/tailscale/` manually to get rid of old state ### Are there any recent changes that introduced the issue? No, this behaviour seems completely random ### OS Linux ### OS version Arch Linux kernel 6.0.2-arch ### Tailscale version 1.32.1 ### Bug report BUG-495a217121ba33adf5c90d17c7138fb21263e9d8d9900b113a00357307310882-20221028105730Z-fb4af2b507a9d037
True
Tailscale hangs indefinitely on misbehaving DNS - ### What is the issue? Sometimes when I try to go log into tailscale, I get this weird behaviour where running `sudo tailscale up` freezes. Seems like the login flow goes ok until `control: creating new noise client` comes up and then fails with `Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded` See logs below ```Oct 28 13:51:27 carbon tailscaled[7554]: ipnserver: conn2: connection from userid 0; root has access Oct 28 13:51:27 carbon tailscaled[7554]: Start Oct 28 13:51:27 carbon tailscaled[7554]: control: client.Shutdown() Oct 28 13:51:27 carbon tailscaled[7554]: control: client.Shutdown: inSendStatus=0 Oct 28 13:51:27 carbon tailscaled[7554]: control: mapRoutine: quit Oct 28 13:51:27 carbon tailscaled[7554]: control: Client.Shutdown done. Oct 28 13:51:27 carbon tailscaled[7554]: using backend prefs for "_daemon": Prefs{ra=false dns=true want=true routes=[] nf=on Persist=nil} Oct 28 13:51:27 carbon tailscaled[7554]: Backend: logs: be:495a217121ba33adf5c90d17c7138fb21263e9d8d9900b113a00357307310882 fe: Oct 28 13:51:27 carbon tailscaled[7554]: Switching ipn state NoState -> NeedsLogin (WantRunning=true, nm=false) Oct 28 13:51:27 carbon tailscaled[7554]: blockEngineUpdates(true) Oct 28 13:51:27 carbon tailscaled[7554]: Reconfig(down): no changes made to Engine config Oct 28 13:51:27 carbon tailscaled[7554]: StartLoginInteractive: url=false Oct 28 13:51:27 carbon tailscaled[7554]: control: client.Login(false, 2) Oct 28 13:51:27 carbon tailscaled[7554]: control: LoginInteractive -> regen=true Oct 28 13:51:27 carbon tailscaled[7554]: control: doLogin(regen=true, hasUrl=false) Oct 28 13:51:37 carbon tailscaled[7554]: trying bootstrapDNS("derp12c.tailscale.com", "149.28.119.105") for "controlplane.tailscale.com" ... Oct 28 13:51:37 carbon tailscaled[7554]: bootstrapDNS("derp12c.tailscale.com", "149.28.119.105") for "controlplane.tailscale.com" = [2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:203:4535:c15c:9ab:8258 2a05:d014:386:202:6dd4:fc68:35ae:2b80 18.193.143.177 18.156.90.224 18.193.255.254 3.121.104.141 52.29.8.43 18.157.173.201] Oct 28 13:51:37 carbon tailscaled[7554]: control: control server key from https://controlplane.tailscale.com: ts2021=[fSeS+], legacy=[nlFWp] Oct 28 13:51:37 carbon tailscaled[7554]: control: Generating a new nodekey. Oct 28 13:51:37 carbon tailscaled[7554]: control: RegisterReq: onode= node=[at8y+] fup=false Oct 28 13:51:37 carbon tailscaled[7554]: control: creating new noise client Oct 28 13:51:47 carbon tailscaled[7554]: Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded Oct 28 13:51:47 carbon tailscaled[7554]: trying bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" ... Oct 28 13:51:47 carbon tailscaled[7554]: trying bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" ... Oct 28 13:51:47 carbon tailscaled[7554]: control: LoginInteractive -> regen=true Oct 28 13:51:47 carbon tailscaled[7554]: control: doLogin(regen=true, hasUrl=false) Oct 28 13:51:47 carbon tailscaled[7554]: control: Generating a new nodekey. 
Oct 28 13:51:47 carbon tailscaled[7554]: control: RegisterReq: onode= node=[v3v4A] fup=false Oct 28 13:51:48 carbon tailscaled[7554]: bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" = [2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:203:4535:c15c:9ab:8258 18.156.90.224 18.193.255.254 52.29.8.43 3.121.104.141 18.157.173.201 18.193.143.177] Oct 28 13:51:48 carbon tailscaled[7554]: bootstrapDNS("derp1c.tailscale.com", "104.248.8.210") for "controlplane.tailscale.com" = [2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:203:4535:c15c:9ab:8258 18.156.90.224 18.193.255.254 52.29.8.43 3.121.104.141 18.157.173.201 18.193.143.177] Oct 28 13:51:57 carbon tailscaled[7554]: Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded Oct 28 13:51:57 carbon tailscaled[7554]: trying bootstrapDNS("derp11.tailscale.com", "18.230.97.74") for "controlplane.tailscale.com" ... Oct 28 13:51:57 carbon tailscaled[7554]: trying bootstrapDNS("derp4e.tailscale.com", "134.122.74.153") for "controlplane.tailscale.com" ... Oct 28 13:51:57 carbon tailscaled[7554]: control: LoginInteractive -> regen=true Oct 28 13:51:57 carbon tailscaled[7554]: control: doLogin(regen=true, hasUrl=false) Oct 28 13:51:57 carbon tailscaled[7554]: control: Generating a new nodekey. Oct 28 13:51:57 carbon tailscaled[7554]: control: RegisterReq: onode= node=[p2yph] fup=false Oct 28 13:51:57 carbon tailscaled[7554]: bootstrapDNS("derp4e.tailscale.com", "134.122.74.153") for "controlplane.tailscale.com" = [2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:203:4535:c15c:9ab:8258 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 18.193.255.254 18.157.173.201 52.29.8.43 3.121.104.141 18.156.90.224 18.193.143.177] Oct 28 13:51:58 carbon tailscaled[7554]: bootstrapDNS("derp11.tailscale.com", "18.230.97.74") for "controlplane.tailscale.com" = [2a05:d014:386:202:24dd:46aa:f98e:9997 2a05:d014:386:202:6dd4:fc68:35ae:2b80 2a05:d014:386:203:4535:c15c:9ab:8258 2a05:d014:386:201:7200:6340:b22f:7df6 2a05:d014:386:203:b7ba:b7d4:9703:6f8a 2a05:d014:386:202:a3e1:5be6:cbb1:1ac9 18.193.255.254 18.156.90.224 3.121.104.141 18.193.143.177 18.157.173.201 52.29.8.43] Oct 28 13:52:07 carbon tailscaled[7554]: Received error: register request: Post "https://controlplane.tailscale.com/machine/register": connection attempts aborted by context: context deadline exceeded ``` ### Steps to reproduce Run `tailscale down` or `tailscale logout` and try to login again I have tried resetting the tailscale client and reinstalling the package. I have also cleared `/var/lib/tailscale/` manually to get rid of old state ### Are there any recent changes that introduced the issue? No, this behaviour seems completely random ### OS Linux ### OS version Arch Linux kernel 6.0.2-arch ### Tailscale version 1.32.1 ### Bug report BUG-495a217121ba33adf5c90d17c7138fb21263e9d8d9900b113a00357307310882-20221028105730Z-fb4af2b507a9d037
usab
tailscale hangs indefinitely on misbehaving dns what is the issue sometimes when i try to go log into tailscale i get this weird behaviour where running sudo tailscale up freezes seems like the login flow goes ok until control creating new noise client comes up and then fails with received error register request post connection attempts aborted by context context deadline exceeded see logs below oct carbon tailscaled ipnserver connection from userid root has access oct carbon tailscaled start oct carbon tailscaled control client shutdown oct carbon tailscaled control client shutdown insendstatus oct carbon tailscaled control maproutine quit oct carbon tailscaled control client shutdown done oct carbon tailscaled using backend prefs for daemon prefs ra false dns true want true routes nf on persist nil oct carbon tailscaled backend logs be fe oct carbon tailscaled switching ipn state nostate needslogin wantrunning true nm false oct carbon tailscaled blockengineupdates true oct carbon tailscaled reconfig down no changes made to engine config oct carbon tailscaled startlogininteractive url false oct carbon tailscaled control client login false oct carbon tailscaled control logininteractive regen true oct carbon tailscaled control dologin regen true hasurl false oct carbon tailscaled trying bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled control control server key from legacy oct carbon tailscaled control generating a new nodekey oct carbon tailscaled control registerreq onode node fup false oct carbon tailscaled control creating new noise client oct carbon tailscaled received error register request post connection attempts aborted by context context deadline exceeded oct carbon tailscaled trying bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled trying bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled control logininteractive regen true oct carbon tailscaled control dologin regen true hasurl false oct carbon tailscaled control generating a new nodekey oct carbon tailscaled control registerreq onode node fup false oct carbon tailscaled bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled received error register request post connection attempts aborted by context context deadline exceeded oct carbon tailscaled trying bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled trying bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled control logininteractive regen true oct carbon tailscaled control dologin regen true hasurl false oct carbon tailscaled control generating a new nodekey oct carbon tailscaled control registerreq onode node fup false oct carbon tailscaled bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled bootstrapdns tailscale com for controlplane tailscale com oct carbon tailscaled received error register request post connection attempts aborted by context context deadline exceeded steps to reproduce run tailscale down or tailscale logout and try to login again i have tried resetting the tailscale client and reinstalling the package i have also cleared var lib tailscale manually to get rid of old state are there any recent changes that introduced the issue no this behaviour seems completely random os linux os version arch linux 
kernel arch tailscale version bug report bug
1
16,312
10,760,213,740
IssuesEvent
2019-10-31 18:07:27
OctopusDeploy/Issues
https://api.github.com/repos/OctopusDeploy/Issues
opened
Edit Environment screen acts as though there are changes if Jira Environment Type is not set
area/usability kind/bug
# Prerequisites - [x] I have verified the problem exists in the latest version - [x] I have searched [open](https://github.com/OctopusDeploy/Issues/issues) and [closed](https://github.com/OctopusDeploy/Issues/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aclosed) issues to make sure it isn't already reported - [x] I have written a descriptive issue title - [ ] I have linked the original source of this report - [x] I have tagged the issue appropriately (area/*, kind/bug, tag/regression?) # The bug If the Jira Environment Type field is available but empty on the Edit Environment screen, the page will ask you to save or discard your changes even if you haven't made any. ## What I expected to happen I don't expect the Unsaved Changes dialog if I haven't made any changes. ## Steps to reproduce 1. Enable Jira Issue Tracker in Configuration > Settings 2. Edit any environment that doesn't have the Jira Environment Type set 3. Try to leave the page 4. See Unsaved Changes dialog ### Screen capture ![Jira Environment Type bug](https://user-images.githubusercontent.com/1571799/67973696-fac47000-fbde-11e9-80f5-0a96ac602576.gif) ## Affected versions **Octopus Server:** 2019.10.1 ## Workarounds Set the Jira Environment Type to unmapped and save the environment.
True
Edit Environment screen acts as though there are changes if Jira Environment Type is not set - # Prerequisites - [x] I have verified the problem exists in the latest version - [x] I have searched [open](https://github.com/OctopusDeploy/Issues/issues) and [closed](https://github.com/OctopusDeploy/Issues/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aclosed) issues to make sure it isn't already reported - [x] I have written a descriptive issue title - [ ] I have linked the original source of this report - [x] I have tagged the issue appropriately (area/*, kind/bug, tag/regression?) # The bug If the Jira Environment Type field is available but empty on the Edit Environment screen, the page will ask you to save or discard your changes even if you haven't made any. ## What I expected to happen I don't expect the Unsaved Changes dialog if I haven't made any changes. ## Steps to reproduce 1. Enable Jira Issue Tracker in Configuration > Settings 2. Edit any environment that doesn't have the Jira Environment Type set 3. Try to leave the page 4. See Unsaved Changes dialog ### Screen capture ![Jira Environment Type bug](https://user-images.githubusercontent.com/1571799/67973696-fac47000-fbde-11e9-80f5-0a96ac602576.gif) ## Affected versions **Octopus Server:** 2019.10.1 ## Workarounds Set the Jira Environment Type to unmapped and save the environment.
usab
edit environment screen acts as though there are changes if jira environment type is not set prerequisites i have verified the problem exists in the latest version i have searched and issues to make sure it isn t already reported i have written a descriptive issue title i have linked the original source of this report i have tagged the issue appropriately area kind bug tag regression the bug if the jira environment type field is available but empty on the edit environment screen the page will ask you to save or discard your changes even if you haven t made any what i expected to happen i don t expect the unsaved changes dialog if i haven t made any changes steps to reproduce enable jira issue tracker in configuration settings edit any environment that doesn t have the jira environment type set try to leave the page see unsaved changes dialog screen capture affected versions octopus server workarounds set the jira environment type to unmapped and save the environment
1
160,461
12,512,549,157
IssuesEvent
2020-06-02 23:07:44
mapbox/mapbox-navigation-android
https://api.github.com/repos/mapbox/mapbox-navigation-android
closed
SoundButton shows wrong color in CustomUIComponentStyleActivity
bug test app
<!-- Hello and thanks for contributing! To help us diagnose your problem quickly, please: - Include a minimal demonstration of the bug, including code, logs, and screenshots. - Ensure you can reproduce the bug using the latest release. - Only post to report a bug or request a feature; direct all other questions to: https://stackoverflow.com/questions/tagged/mapbox --> **Android API:** Any **Mapbox Navigation SDK version:** UI SDK 1.0 ### Steps to trigger behavior 1. Launch `example` app 2. Launch `CustomUIComponentStyleActivity` example in Navigation UI SDK 3. Start a navigation 4. Click the `SoundButton` ### Expected behavior The `SoundButton` should show whatever it's before step 4 ### Actual behavior After step 4, the `SoundButton` stays black and doesn't change to the correct color even click it again.
1.0
SoundButton shows wrong color in CustomUIComponentStyleActivity - <!-- Hello and thanks for contributing! To help us diagnose your problem quickly, please: - Include a minimal demonstration of the bug, including code, logs, and screenshots. - Ensure you can reproduce the bug using the latest release. - Only post to report a bug or request a feature; direct all other questions to: https://stackoverflow.com/questions/tagged/mapbox --> **Android API:** Any **Mapbox Navigation SDK version:** UI SDK 1.0 ### Steps to trigger behavior 1. Launch `example` app 2. Launch `CustomUIComponentStyleActivity` example in Navigation UI SDK 3. Start a navigation 4. Click the `SoundButton` ### Expected behavior The `SoundButton` should show whatever it's before step 4 ### Actual behavior After step 4, the `SoundButton` stays black and doesn't change to the correct color even click it again.
non_usab
soundbutton shows wrong color in customuicomponentstyleactivity hello and thanks for contributing to help us diagnose your problem quickly please include a minimal demonstration of the bug including code logs and screenshots ensure you can reproduce the bug using the latest release only post to report a bug or request a feature direct all other questions to android api any mapbox navigation sdk version ui sdk steps to trigger behavior launch example app launch customuicomponentstyleactivity example in navigation ui sdk start a navigation click the soundbutton expected behavior the soundbutton should show whatever it s before step actual behavior after step the soundbutton stays black and doesn t change to the correct color even click it again
0
57,496
14,136,424,879
IssuesEvent
2020-11-10 04:15:54
streamnative/pulsar
https://api.github.com/repos/streamnative/pulsar
closed
ISSUE-7312: pulsar-2.6.0 cpp client build error
component/build triage/week-26 type/bug workflow::todo
Original Issue: apache/pulsar#7312 --- **Describe the bug** build error **To Reproduce** Steps to reproduce the behavior: [root@VM_72_166_tlinux /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/build]# cmake .. -- ARCHITECTURE: x86_64 -- BUILD_TESTS: ON -- BUILD_PYTHON_WRAPPER: ON -- LINK_STATIC: OFF -- USE_LOG4CXX: OFF -- CMAKE_BUILD_TYPE: RelWithDebInfo -- Found Boost: /usr/local/include (found version "1.54.0") found components: program_options regex system -- PYTHON: 2.7.10 -- Found Boost: /usr/local/include (found version "1.54.0") found components: python -- HAS_ZSTD: 0 -- HAS_SNAPPY: 1 -- Using Boost Python libs: /usr/local/lib/libboost_python.so clang-tidy not found clang-format not found -- Configuring done -- Generating done -- Build files have been written to: /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/build **Expected behavior** A clear and concise description of what you expected to happen. **Screenshots** If applicable, add screenshots to help explain your problem. **Desktop (please complete the following information):** - linux 3.10.106 **Additional context** /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/lib/CompressionCodecSnappy.cc: In member function 'virtual bool pulsar::CompressionCodecSnappy::decode(const pulsar::SharedBuffer&, uint32_t, pulsar::SharedBuffer&)': /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/lib/CompressionCodecSnappy.cc:43:42: error: cannot convert 'snappy::ByteArraySource*' to 'const char*' for argument '1' to 'bool snappy::Uncompress(const char*, size_t, snappy::string*)' if (snappy::Uncompress(&source, &sink)) { ^ make[2]: *** [lib/CMakeFiles/pulsarShared.dir/CompressionCodecSnappy.cc.o] Error 1 make[1]: *** [lib/CMakeFiles/pulsarShared.dir/all] Error 2
1.0
ISSUE-7312: pulsar-2.6.0 cpp client build error - Original Issue: apache/pulsar#7312 --- **Describe the bug** build error **To Reproduce** Steps to reproduce the behavior: [root@VM_72_166_tlinux /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/build]# cmake .. -- ARCHITECTURE: x86_64 -- BUILD_TESTS: ON -- BUILD_PYTHON_WRAPPER: ON -- LINK_STATIC: OFF -- USE_LOG4CXX: OFF -- CMAKE_BUILD_TYPE: RelWithDebInfo -- Found Boost: /usr/local/include (found version "1.54.0") found components: program_options regex system -- PYTHON: 2.7.10 -- Found Boost: /usr/local/include (found version "1.54.0") found components: python -- HAS_ZSTD: 0 -- HAS_SNAPPY: 1 -- Using Boost Python libs: /usr/local/lib/libboost_python.so clang-tidy not found clang-format not found -- Configuring done -- Generating done -- Build files have been written to: /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/build **Expected behavior** A clear and concise description of what you expected to happen. **Screenshots** If applicable, add screenshots to help explain your problem. **Desktop (please complete the following information):** - linux 3.10.106 **Additional context** /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/lib/CompressionCodecSnappy.cc: In member function 'virtual bool pulsar::CompressionCodecSnappy::decode(const pulsar::SharedBuffer&, uint32_t, pulsar::SharedBuffer&)': /data/commlibsrc/pulsar-2.6.0/pulsar-client-cpp/lib/CompressionCodecSnappy.cc:43:42: error: cannot convert 'snappy::ByteArraySource*' to 'const char*' for argument '1' to 'bool snappy::Uncompress(const char*, size_t, snappy::string*)' if (snappy::Uncompress(&source, &sink)) { ^ make[2]: *** [lib/CMakeFiles/pulsarShared.dir/CompressionCodecSnappy.cc.o] Error 1 make[1]: *** [lib/CMakeFiles/pulsarShared.dir/all] Error 2
non_usab
issue pulsar cpp client build error original issue apache pulsar describe the bug build error to reproduce steps to reproduce the behavior cmake architecture build tests on build python wrapper on link static off use off cmake build type relwithdebinfo found boost usr local include found version found components program options regex system python found boost usr local include found version found components python has zstd has snappy using boost python libs usr local lib libboost python so clang tidy not found clang format not found configuring done generating done build files have been written to data commlibsrc pulsar pulsar client cpp build expected behavior a clear and concise description of what you expected to happen screenshots if applicable add screenshots to help explain your problem desktop please complete the following information linux additional context data commlibsrc pulsar pulsar client cpp lib compressioncodecsnappy cc in member function virtual bool pulsar compressioncodecsnappy decode const pulsar sharedbuffer t pulsar sharedbuffer data commlibsrc pulsar pulsar client cpp lib compressioncodecsnappy cc error cannot convert snappy bytearraysource to const char for argument to bool snappy uncompress const char size t snappy string if snappy uncompress source sink make error make error
0
21,826
17,837,042,086
IssuesEvent
2021-09-03 03:40:28
simonw/datasette-app
https://api.github.com/repos/simonw/datasette-app
closed
Show error messages when open CSV/database fails
electron-wrapper usability
Currently it fails silently and logs to the console.
True
Show error messages when open CSV/database fails - Currently it fails silently and logs to the console.
usab
show error messages when open csv database fails currently it fails silently and logs to the console
1
376,294
26,193,872,194
IssuesEvent
2023-01-03 11:37:50
arturo-lang/arturo
https://api.github.com/repos/arturo-lang/arturo
closed
[Sockets/unplug] add documentation example
documentation library todo easy
[Sockets/unplug] add documentation example https://github.com/arturo-lang/arturo/blob/a7209b1e5afa19379d5ef718f8384ce18a8b3159/src/library/Sockets.nim#L233 ```text ## The main Sockets module ## (part of the standard library) #======================================= # Pragmas #======================================= {.used.} #======================================= # Libraries #======================================= when not defined(WEB): import std/net as netsock except Socket import nativesockets import vm/values/custom/[vsocket] import vm/lib when not defined(WEB): import vm/errors #======================================= # Methods #======================================= # TODO(Sockets) Verify the whole module & check for missing functionality # obviously this cannot be done with unit-tests as easily as with other modules, but # we'd still have to verify it works as expected and track down possibly-missing # features # labels: open discussion proc defineSymbols*() = when not defined(WEB): builtin "accept", alias = unaliased, rule = PrefixPrecedence, description = "accept incoming connection and return corresponding socket", args = { "server" : {Socket} }, attrs = NoAttrs, returns = {Socket}, # TODO(Sockets/accept) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("accept") var client: netsock.Socket x.sock.socket.accept(client) let (address,port) = getPeerAddr(client) let socket = initSocket(client, proto=x.sock.protocol, address=address, port=port) push newSocket(socket) builtin "connect", alias = unaliased, rule = PrefixPrecedence, description = "create new socket connection to given server port", args = { "port" : {Integer} }, attrs = { "to" : ({String},"set socket address"), "udp" : ({Logical},"use UDP instead of TCP") }, returns = {Socket}, # TODO(Sockets/connect) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("connect") let isUDP = hadAttr("udp") let protocol = if isUDP: IPPROTO_UDP else: IPPROTO_TCP var toAddress: string if checkAttr("to"): toAddress = aTo.s else: toAddress = "0.0.0.0" var port = Port(x.i) var sock: netsock.Socket = netsock.newSocket(protocol=protocol) if not isUDP: sock.connect(toAddress, port) let socket = initSocket(sock, proto=protocol, address=toAddress, port=port) push newSocket(socket) builtin "listen", alias = unaliased, rule = PrefixPrecedence, description = "start listening on given port and return new socket", args = { "port" : {Integer} }, attrs = { "blocking" : ({String},"set blocking mode (default: false)"), "udp" : ({Logical},"use UDP instead of TCP") }, returns = {Socket}, # TODO(Sockets/listen) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("listen") let blocking = hadAttr("blocking") let protocol = if hadAttr("udp"): IPPROTO_UDP else: IPPROTO_TCP var sock: netsock.Socket = netsock.newSocket(protocol=protocol) sock.setSockOpt(OptReuseAddr, true) sock.getFd().setBlocking(blocking) sock.bindAddr(Port(x.i)) sock.listen() let (address,port) = getLocalAddr(sock) let socket = initSocket(sock, proto=protocol, address=address, port=port) push newSocket(socket) builtin "receive", alias = unaliased, rule = 
PrefixPrecedence, description = "receive line of data from selected socket", args = { "origin" : {Socket} }, attrs = { "size" : ({Integer},"set maximum size of received data"), "timeout" : ({Integer},"set timeout (in milliseconds)") }, returns = {String}, # TODO(Sockets/receive) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("receive") var size = MaxLineLength if checkAttr("size"): size = aSize.i var timeout = -1 if checkAttr("timeout"): timeout = aTimeout.i push newString(x.sock.socket.recvLine(timeout=timeout, maxLength=size)) builtin "send", alias = unaliased, rule = PrefixPrecedence, description = "send given message to selected socket", args = { "destination" : {Socket}, "message" : {String} }, attrs = { "chunk" : ({Logical},"don't send data as a line of data") }, returns = {Nothing}, # TODO(Sockets/send) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("send") let asChunk = hadAttr("chunk") let message = if asChunk: y.s else: y.s & "\r\L" x.sock.socket.send(message) builtin "send?", alias = unaliased, rule = PrefixPrecedence, description = "send given message to selected socket and return true if successful", args = { "destination" : {Socket}, "message" : {String} }, attrs = NoAttrs, returns = {Logical}, # TODO(Sockets/send?) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("send?") push newLogical(x.sock.socket.trySend(y.s)) builtin "unplug", alias = unaliased, rule = PrefixPrecedence, description = "close given socket", args = { "socket" : {Socket} }, attrs = NoAttrs, returns = {Nothing}, # TODO(Sockets/unplug) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("unplug") x.sock.socket.close() else: discard #======================================= # Add Library #======================================= Libraries.add(defineSymbols) ndex fd9afdfc2..7555eeebc 100644 ++ b/src/library/System.nim ``` e787a4c0139eeb43ec42731ba666ce131e656c62
1.0
[Sockets/unplug] add documentation example - [Sockets/unplug] add documentation example https://github.com/arturo-lang/arturo/blob/a7209b1e5afa19379d5ef718f8384ce18a8b3159/src/library/Sockets.nim#L233 ```text ## The main Sockets module ## (part of the standard library) #======================================= # Pragmas #======================================= {.used.} #======================================= # Libraries #======================================= when not defined(WEB): import std/net as netsock except Socket import nativesockets import vm/values/custom/[vsocket] import vm/lib when not defined(WEB): import vm/errors #======================================= # Methods #======================================= # TODO(Sockets) Verify the whole module & check for missing functionality # obviously this cannot be done with unit-tests as easily as with other modules, but # we'd still have to verify it works as expected and track down possibly-missing # features # labels: open discussion proc defineSymbols*() = when not defined(WEB): builtin "accept", alias = unaliased, rule = PrefixPrecedence, description = "accept incoming connection and return corresponding socket", args = { "server" : {Socket} }, attrs = NoAttrs, returns = {Socket}, # TODO(Sockets/accept) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("accept") var client: netsock.Socket x.sock.socket.accept(client) let (address,port) = getPeerAddr(client) let socket = initSocket(client, proto=x.sock.protocol, address=address, port=port) push newSocket(socket) builtin "connect", alias = unaliased, rule = PrefixPrecedence, description = "create new socket connection to given server port", args = { "port" : {Integer} }, attrs = { "to" : ({String},"set socket address"), "udp" : ({Logical},"use UDP instead of TCP") }, returns = {Socket}, # TODO(Sockets/connect) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("connect") let isUDP = hadAttr("udp") let protocol = if isUDP: IPPROTO_UDP else: IPPROTO_TCP var toAddress: string if checkAttr("to"): toAddress = aTo.s else: toAddress = "0.0.0.0" var port = Port(x.i) var sock: netsock.Socket = netsock.newSocket(protocol=protocol) if not isUDP: sock.connect(toAddress, port) let socket = initSocket(sock, proto=protocol, address=toAddress, port=port) push newSocket(socket) builtin "listen", alias = unaliased, rule = PrefixPrecedence, description = "start listening on given port and return new socket", args = { "port" : {Integer} }, attrs = { "blocking" : ({String},"set blocking mode (default: false)"), "udp" : ({Logical},"use UDP instead of TCP") }, returns = {Socket}, # TODO(Sockets/listen) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("listen") let blocking = hadAttr("blocking") let protocol = if hadAttr("udp"): IPPROTO_UDP else: IPPROTO_TCP var sock: netsock.Socket = netsock.newSocket(protocol=protocol) sock.setSockOpt(OptReuseAddr, true) sock.getFd().setBlocking(blocking) sock.bindAddr(Port(x.i)) sock.listen() let (address,port) = getLocalAddr(sock) let socket = initSocket(sock, proto=protocol, address=address, port=port) push newSocket(socket) builtin 
"receive", alias = unaliased, rule = PrefixPrecedence, description = "receive line of data from selected socket", args = { "origin" : {Socket} }, attrs = { "size" : ({Integer},"set maximum size of received data"), "timeout" : ({Integer},"set timeout (in milliseconds)") }, returns = {String}, # TODO(Sockets/receive) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("receive") var size = MaxLineLength if checkAttr("size"): size = aSize.i var timeout = -1 if checkAttr("timeout"): timeout = aTimeout.i push newString(x.sock.socket.recvLine(timeout=timeout, maxLength=size)) builtin "send", alias = unaliased, rule = PrefixPrecedence, description = "send given message to selected socket", args = { "destination" : {Socket}, "message" : {String} }, attrs = { "chunk" : ({Logical},"don't send data as a line of data") }, returns = {Nothing}, # TODO(Sockets/send) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("send") let asChunk = hadAttr("chunk") let message = if asChunk: y.s else: y.s & "\r\L" x.sock.socket.send(message) builtin "send?", alias = unaliased, rule = PrefixPrecedence, description = "send given message to selected socket and return true if successful", args = { "destination" : {Socket}, "message" : {String} }, attrs = NoAttrs, returns = {Logical}, # TODO(Sockets/send?) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("send?") push newLogical(x.sock.socket.trySend(y.s)) builtin "unplug", alias = unaliased, rule = PrefixPrecedence, description = "close given socket", args = { "socket" : {Socket} }, attrs = NoAttrs, returns = {Nothing}, # TODO(Sockets/unplug) add documentation example # labels: library, documentation, easy example = """ """: #======================================================= when defined(SAFE): RuntimeError_OperationNotPermitted("unplug") x.sock.socket.close() else: discard #======================================= # Add Library #======================================= Libraries.add(defineSymbols) ndex fd9afdfc2..7555eeebc 100644 ++ b/src/library/System.nim ``` e787a4c0139eeb43ec42731ba666ce131e656c62
non_usab
add documentation example add documentation example text the main sockets module part of the standard library pragmas used libraries when not defined web import std net as netsock except socket import nativesockets import vm values custom import vm lib when not defined web import vm errors methods todo sockets verify the whole module check for missing functionality obviously this cannot be done with unit tests as easily as with other modules but we d still have to verify it works as expected and track down possibly missing features labels open discussion proc definesymbols when not defined web builtin accept alias unaliased rule prefixprecedence description accept incoming connection and return corresponding socket args server socket attrs noattrs returns socket todo sockets accept add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted accept var client netsock socket x sock socket accept client let address port getpeeraddr client let socket initsocket client proto x sock protocol address address port port push newsocket socket builtin connect alias unaliased rule prefixprecedence description create new socket connection to given server port args port integer attrs to string set socket address udp logical use udp instead of tcp returns socket todo sockets connect add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted connect let isudp hadattr udp let protocol if isudp ipproto udp else ipproto tcp var toaddress string if checkattr to toaddress ato s else toaddress var port port x i var sock netsock socket netsock newsocket protocol protocol if not isudp sock connect toaddress port let socket initsocket sock proto protocol address toaddress port port push newsocket socket builtin listen alias unaliased rule prefixprecedence description start listening on given port and return new socket args port integer attrs blocking string set blocking mode default false udp logical use udp instead of tcp returns socket todo sockets listen add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted listen let blocking hadattr blocking let protocol if hadattr udp ipproto udp else ipproto tcp var sock netsock socket netsock newsocket protocol protocol sock setsockopt optreuseaddr true sock getfd setblocking blocking sock bindaddr port x i sock listen let address port getlocaladdr sock let socket initsocket sock proto protocol address address port port push newsocket socket builtin receive alias unaliased rule prefixprecedence description receive line of data from selected socket args origin socket attrs size integer set maximum size of received data timeout integer set timeout in milliseconds returns string todo sockets receive add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted receive var size maxlinelength if checkattr size size asize i var timeout if checkattr timeout timeout atimeout i push newstring x sock socket recvline timeout timeout maxlength size builtin send alias unaliased rule prefixprecedence description send given message to selected socket args destination socket message string attrs chunk logical don t send data as a line of data returns nothing todo sockets send add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted send let aschunk hadattr chunk let message if 
aschunk y s else y s r l x sock socket send message builtin send alias unaliased rule prefixprecedence description send given message to selected socket and return true if successful args destination socket message string attrs noattrs returns logical todo sockets send add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted send push newlogical x sock socket trysend y s builtin unplug alias unaliased rule prefixprecedence description close given socket args socket socket attrs noattrs returns nothing todo sockets unplug add documentation example labels library documentation easy example when defined safe runtimeerror operationnotpermitted unplug x sock socket close else discard add library libraries add definesymbols ndex b src library system nim
0
6,458
4,295,646,892
IssuesEvent
2016-07-19 08:04:26
Tiendil/the-tale
https://api.github.com/repos/Tiendil/the-tale
opened
Increase the maximum energy for subscribers
comp_game_logic cont_usability est_simple type_improvement
So that it is not necessary to log in several times a day to spend it according to the schedule. Better to have a reserve for 1.5-2 days
True
Increase the maximum energy for subscribers - So that it is not necessary to log in several times a day to spend it according to the schedule. Better to have a reserve for 1.5-2 days
usab
increase the maximum energy for subscribers so that it is not necessary to log in several times a day to spend it according to the schedule better to have a reserve for days
1
16,537
11,035,856,243
IssuesEvent
2019-12-07 16:36:23
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
parameter Hints in autocompletion preview - to show expected type of variable and other info
enhancement topic:editor topic:gdscript usability
1. The editor and gdscript are not consistent with the value type they take for rotations A hint about set_rot() type of expected variable would have helped me avoid a bit of confusion here: https://github.com/godotengine/godot/issues/4516#issuecomment-217423197 This is an example of how in some cases your code might parse successfully, but not do what you expect. 2. In order to know the type of variable that is expected in a function or a method, you need to search the class api documentation explicitly. In game maker, the auto - completion gives hints like this: ![autocomplete_image](https://cloud.githubusercontent.com/assets/6495061/15072365/1fb1f19e-138a-11e6-8419-b19096c5c054.png) So no need to search for this, the actual autocompletion is showing you immediately what type of value or syntax it expects! Godot has no hints whatsoever. To make this even more awesome than godot, gamemaker users can write hints for their own functions parameters. Defining a hint is as simple as: ![intelligent-code-completion](https://cloud.githubusercontent.com/assets/6495061/15072382/48fa91e6-138a-11e6-8814-754a1220420d.png) http://gamemakerblog.com/2014/01/02/how-to-add-intelligent-code-completion-to-scripts/ This is a common feature of many other code editors. Look at how nice it is in superpowers game engine: ![kii8xad](https://cloud.githubusercontent.com/assets/6495061/15072444/afaa9328-138a-11e6-9c00-c3a073976dec.png) This makes them easier to learn or transition to!
True
parameter Hints in autocompletion preview - to show expected type of variable and other info - 1. The editor and gdscript are not consistent with the value type they take for rotations A hint about set_rot() type of expected variable would have helped me avoid a bit of confusion here: https://github.com/godotengine/godot/issues/4516#issuecomment-217423197 This is an example of how in some cases your code might parse successfully, but not do what you expect. 2. In order to know the type of variable that is expected in a function or a method, you need to search the class api documentation explicitly. In game maker, the auto - completion gives hints like this: ![autocomplete_image](https://cloud.githubusercontent.com/assets/6495061/15072365/1fb1f19e-138a-11e6-8419-b19096c5c054.png) So no need to search for this, the actual autocompletion is showing you immediately what type of value or syntax it expects! Godot has no hints whatsoever. To make this even more awesome than godot, gamemaker users can write hints for their own functions parameters. Defining a hint is as simple as: ![intelligent-code-completion](https://cloud.githubusercontent.com/assets/6495061/15072382/48fa91e6-138a-11e6-8814-754a1220420d.png) http://gamemakerblog.com/2014/01/02/how-to-add-intelligent-code-completion-to-scripts/ This is a common feature of many other code editors. Look at how nice it is in superpowers game engine: ![kii8xad](https://cloud.githubusercontent.com/assets/6495061/15072444/afaa9328-138a-11e6-9c00-c3a073976dec.png) This makes them easier to learn or transition to!
usab
parameter hints in autocompletion preview to show expected type of variable and other info the editor and gdscript are not consistent with the value type they take for rotations a hint about set rot type of expected variable would have helped me avoid a bit of confusion here this is an example of how in some cases your code might parse successfully but not do what you expect in order to know the type of variable that is expected in a function or a method you need to search the class api documentation explicitly in game maker the auto completion gives hints like this so no need to search for this the actual autocompletion is showing you immediately what type of value or syntax it expects godot has no hints whatsoever to make this even more awesome than godot gamemaker users can write hints for their own functions parameters defining a hint is as simple as this is a common feature of many other code editors look at how nice it is in superpowers game engine this makes them easier to learn or transition to
1
338,872
24,602,500,544
IssuesEvent
2022-10-14 13:37:39
brndnmtthws/conky
https://api.github.com/repos/brndnmtthws/conky
closed
Please remove pandoc
documentation
### What happened? I'm running Slackware 15.0 and when I tried to compile 1.14.0 and I get this message at the term; CMake Error at cmake/ConkyPlatformChecks.cmake:485 (message): Unable to find program 'pandoc' Call Stack (most recent call first): CMakeLists.txt:35 (include) I see in the doc README.md, pandoc listed. So I'm not looking Slackware biased here, look at Arch Linux for pandoc; (Dependencies (74)) https://archlinux.org/packages/community/x86_64/pandoc/ 74 dependencies needed for pandoc. I guess for now I'm going to have to use -DBUILD_DOCS=OFF if I want to use this version, but I would like to have docs installed. Can we please get this changed, since pandoc has an extreme amount of Haskell dependencies. I don't understand why from version 1.13.1 to 1.14.0 this had to change and now include this extreme amount of Haskell dependency just for docs. THANKS ### Version 1.14.0 ### Which OS/distro are you seeing the problem on? Slackware 15.0 ```P.S. Didn't mean to post this as an actual bug report, sorry.```
1.0
Please remove pandoc - ### What happened? I'm running Slackware 15.0 and when I tried to compile 1.14.0 and I get this message at the term; CMake Error at cmake/ConkyPlatformChecks.cmake:485 (message): Unable to find program 'pandoc' Call Stack (most recent call first): CMakeLists.txt:35 (include) I see in the doc README.md, pandoc listed. So I'm not looking Slackware biased here, look at Arch Linux for pandoc; (Dependencies (74)) https://archlinux.org/packages/community/x86_64/pandoc/ 74 dependencies needed for pandoc. I guess for now I'm going to have to use -DBUILD_DOCS=OFF if I want to use this version, but I would like to have docs installed. Can we please get this changed, since pandoc has an extreme amount of Haskell dependencies. I don't understand why from version 1.13.1 to 1.14.0 this had to change and now include this extreme amount of Haskell dependency just for docs. THANKS ### Version 1.14.0 ### Which OS/distro are you seeing the problem on? Slackware 15.0 ```P.S. Didn't mean to post this as an actual bug report, sorry.```
non_usab
please remove pandoc what happened i m running slackware and when i tried to compile and i get this message at the term cmake error at cmake conkyplatformchecks cmake message unable to find program pandoc call stack most recent call first cmakelists txt include i see in the doc readme md pandoc listed so i m not looking slackware biased here look at arch linux for pandoc dependencies dependencies needed for pandoc i guess for now i m going to have to use dbuild docs off if i want to use this version but i would like to have docs installed can we please get this changed since pandoc has an extreme amount of haskell dependencies i don t understand why from version to this had to change and now include this extreme amount of haskell dependency just for docs thanks version which os distro are you seeing the problem on slackware p s didn t mean to post this as an actual bug report sorry
0
44,730
18,169,784,608
IssuesEvent
2021-09-27 18:32:02
hashicorp/terraform-provider-aws
https://api.github.com/repos/hashicorp/terraform-provider-aws
closed
Data Source for aws_kinesis_firehose_delivery_stream
new-data-source service/firehose
It would be helpful to have a way to lookup an `aws_kinesis_firehose_delivery_stream` by name to get the ARN and other attributes. Usually Data Sources are used for this, but it seems to have been left out of the original implementation. Side note: I'd argue that most new resources should probably have corresponding data sources made at time they are added to this provider. Just saying...
1.0
Data Source for aws_kinesis_firehose_delivery_stream - It would be helpful to have a way to lookup an `aws_kinesis_firehose_delivery_stream` by name to get the ARN and other attributes. Usually Data Sources are used for this, but it seems to have been left out of the original implementation. Side note: I'd argue that most new resources should probably have corresponding data sources made at time they are added to this provider. Just saying...
non_usab
data source for aws kinesis firehose delivery stream it would be helpful to have a way to lookup an aws kinesis firehose delivery stream by name to get the arn and other attributes usually data sources are used for this but it seems to have been left out of the original implementation side note i d argue that most new resources should probably have corresponding data sources made at time they are added to this provider just saying
0
75,874
9,335,148,456
IssuesEvent
2019-03-28 17:52:13
GSA/pra.gov
https://api.github.com/repos/GSA/pra.gov
opened
Refer to OIRA/OMB consistently throughout the site
content design
Denote that OIRA and OMB are the same organization or use one or the other consistently throughout the site. Going back and forth between the two was confusing to users during testing.
1.0
Refer to OIRA/OMB consistently throughout the site - Denote that OIRA and OMB are the same organization or use one or the other consistently throughout the site. Going back and forth between the two was confusing to users during testing.
non_usab
refer to oira omb consistently throughout the site denote that oira and omb are the same organization or use one or the other consistently throughout the site going back and forth between the two was confusing to users during testing
0
22,322
30,884,968,413
IssuesEvent
2023-08-03 20:52:15
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
opened
Release 6.3.2 - Aug 2023
P1 type: process release team-OSS
# Status of Bazel 6.3.2 - Expected first release candidate date: 2023-08-03 - Expected release date: 2023-08-07 - [List of release blockers](https://github.com/bazelbuild/bazel/milestone/59) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into 6.3.2, simply send a PR against the `release-6.3.2` branch. **Task list:** <!-- The first item is only needed for major releases (X.0.0) --> - [ ] Create release candidate: 6.3.2 - [ ] Check downstream projects - [ ] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit) <!-- Note that there should be a new Bazel Release Announcement document for every major release. For minor and patch releases, use the latest open doc. --> - [ ] Push the blog post: [link to blog post] <!-- Only for major releases. --> - [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
1.0
Release 6.3.2 - Aug 2023 - # Status of Bazel 6.3.2 - Expected first release candidate date: 2023-08-03 - Expected release date: 2023-08-07 - [List of release blockers](https://github.com/bazelbuild/bazel/milestone/59) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into 6.3.2, simply send a PR against the `release-6.3.2` branch. **Task list:** <!-- The first item is only needed for major releases (X.0.0) --> - [ ] Create release candidate: 6.3.2 - [ ] Check downstream projects - [ ] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit) <!-- Note that there should be a new Bazel Release Announcement document for every major release. For minor and patch releases, use the latest open doc. --> - [ ] Push the blog post: [link to blog post] <!-- Only for major releases. --> - [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
non_usab
release aug status of bazel expected first release candidate date expected release date to report a release blocking bug please add a comment with the text bazel io flag to the issue a release manager will triage it and add it to the milestone to cherry pick a mainline commit into simply send a pr against the release branch task list create release candidate check downstream projects create push the blog post update the
0
16,584
11,095,274,224
IssuesEvent
2019-12-16 08:44:13
gama-platform/gama
https://api.github.com/repos/gama-platform/gama
closed
The interactive console return buggy value
Concerns Simulations OS All Version Git 🖥 Display All 😅* Workaround 😱 > Bug 🙅🏻‍♂️ Affects Usability
**Describe the bug** For somehow reasons, the interactive console (in the simulation perspective) returns some buggy results (results from previous command, nonsense results, etc). The bug seems to be created when the console is looking across some large lists of agents or when copy-pasting text on it. However, the bug does not reproduce every time. **To Reproduce** Totally random 🙊🙈🤷 **Expected behavior** Returning agents / properties requested every time **Screenshots** ![Screenshot from 2019-12-12 10-53-19](https://user-images.githubusercontent.com/16764085/70681513-35d1bd00-1cce-11ea-866c-817119135b22.png) _The command in the black box is a copy/paste which bugged, the intended text is the command just below_ ![image](https://user-images.githubusercontent.com/16764085/70681993-f99f5c00-1ccf-11ea-9ec2-538823004188.png) _I've restart the simulation after the screenshot above, wrote the first 3 lines by hand and just pressed the ⬆ arrow 1 time and execute the code displayed (which is the real code executed, but not the code displayed at the execution)_ **Desktop (please complete the following information):** - OS: Ubuntu 18.04.3 🐧 / macOS Mojave 🍎 - PC Model: Asus Republic of Gamers / MacBook Pro - GAMA version: git (probably all but not tested) - Java version: OpenJDK 1.8 / Oracle JDK 1.8 - Graphics cards / Display system: NVIDIA GeForce GTX 1050 Ti Mobile **Workaround** The code executed is the previous line of code executed like below : ``` [1] gaml> corridor[2192].shape nil [2] gaml> corridor[2711].agents_on polyline ([{748.278338593198,207.89073161548004,0.0},{724.1182492005173,273.08670051489025,0.0}]) ^--- Result from the previous command (code 1) [3] gaml> length(corridor[2711].agents_on) [people(89)] ^--- Result from the previous command (code 2) [4] gaml> fieohezhiofezhi 1 ^--- Result from the previous command (code 3) [5] gaml> corridor[2711].agents_on > Error: Block definition does not begin or end correctly ^--- Error from the previous command (code 4) ``` So, it's the most annoying workaround ever, but it's possible to get some results that way **Additional context** The problem is not a memory bug I'm starting GAMA with 4Go of RAM (to 10Go max) and the software never fill that init amount. Also I'm using the ESCAPE Project when I'm talking about "large lists of agents" _EDIT_ : Add screenshots & workaround
True
The interactive console return buggy value - **Describe the bug** For somehow reasons, the interactive console (in the simulation perspective) returns some buggy results (results from previous command, nonsense results, etc). The bug seems to be created when the console is looking across some large lists of agents or when copy-pasting text on it. However, the bug does not reproduce every time. **To Reproduce** Totally random 🙊🙈🤷 **Expected behavior** Returning agents / properties requested every time **Screenshots** ![Screenshot from 2019-12-12 10-53-19](https://user-images.githubusercontent.com/16764085/70681513-35d1bd00-1cce-11ea-866c-817119135b22.png) _The command in the black box is a copy/paste which bugged, the intended text is the command just below_ ![image](https://user-images.githubusercontent.com/16764085/70681993-f99f5c00-1ccf-11ea-9ec2-538823004188.png) _I've restart the simulation after the screenshot above, wrote the first 3 lines by hand and just pressed the ⬆ arrow 1 time and execute the code displayed (which is the real code executed, but not the code displayed at the execution)_ **Desktop (please complete the following information):** - OS: Ubuntu 18.04.3 🐧 / macOS Mojave 🍎 - PC Model: Asus Republic of Gamers / MacBook Pro - GAMA version: git (probably all but not tested) - Java version: OpenJDK 1.8 / Oracle JDK 1.8 - Graphics cards / Display system: NVIDIA GeForce GTX 1050 Ti Mobile **Workaround** The code executed is the previous line of code executed like below : ``` [1] gaml> corridor[2192].shape nil [2] gaml> corridor[2711].agents_on polyline ([{748.278338593198,207.89073161548004,0.0},{724.1182492005173,273.08670051489025,0.0}]) ^--- Result from the previous command (code 1) [3] gaml> length(corridor[2711].agents_on) [people(89)] ^--- Result from the previous command (code 2) [4] gaml> fieohezhiofezhi 1 ^--- Result from the previous command (code 3) [5] gaml> corridor[2711].agents_on > Error: Block definition does not begin or end correctly ^--- Error from the previous command (code 4) ``` So, it's the most annoying workaround ever, but it's possible to get some results that way **Additional context** The problem is not a memory bug I'm starting GAMA with 4Go of RAM (to 10Go max) and the software never fill that init amount. Also I'm using the ESCAPE Project when I'm talking about "large lists of agents" _EDIT_ : Add screenshots & workaround
usab
the interactive console return buggy value describe the bug for somehow reasons the interactive console in the simulation perspective returns some buggy results results from previous command nonsense results etc the bug seems to be created when the console is looking across some large lists of agents or when copy pasting text on it however the bug does not reproduce every time to reproduce totally random 🙊🙈🤷 expected behavior returning agents properties requested every time screenshots the command in the black box is a copy paste which bugged the intended text is the command just below i ve restart the simulation after the screenshot above wrote the first lines by hand and just pressed the ⬆ arrow time and execute the code displayed which is the real code executed but not the code displayed at the execution desktop please complete the following information os ubuntu 🐧 macos mojave 🍎 pc model asus republic of gamers macbook pro gama version git probably all but not tested java version openjdk oracle jdk graphics cards display system nvidia geforce gtx ti mobile workaround the code executed is the previous line of code executed like below gaml corridor shape nil gaml corridor agents on polyline result from the previous command code gaml length corridor agents on result from the previous command code gaml fieohezhiofezhi result from the previous command code gaml corridor agents on error block definition does not begin or end correctly error from the previous command code so it s the most annoying workaround ever but it s possible to get some results that way additional context the problem is not a memory bug i m starting gama with of ram to max and the software never fill that init amount also i m using the escape project when i m talking about large lists of agents edit add screenshots workaround
1
117,190
17,439,346,374
IssuesEvent
2021-08-05 01:05:29
rsoreq/lodash
https://api.github.com/repos/rsoreq/lodash
opened
CVE-2021-32804 (High) detected in tar-4.4.8.tgz
security vulnerability
## CVE-2021-32804 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.8.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p> <p> Dependency Hierarchy: - qunitjs-2.4.1.tgz (Root Library) - chokidar-1.6.1.tgz - fsevents-1.2.9.tgz - node-pre-gyp-0.12.0.tgz - :x: **tar-4.4.8.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has a arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar. <p>Publish Date: 2021-08-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804>CVE-2021-32804</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9">https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9</a></p> <p>Release Date: 2021-08-03</p> <p>Fix Resolution: tar - 3.2.2, 4.4.14, 5.0.6, 6.1.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"tar","packageVersion":"4.4.8","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"qunitjs:2.4.1;chokidar:1.6.1;fsevents:1.2.9;node-pre-gyp:0.12.0;tar:4.4.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"tar - 3.2.2, 4.4.14, 5.0.6, 6.1.1"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-32804","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has a arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804","cvss3Severity":"high","cvss3Score":"8.2","cvss3Metrics":{"A":"N/A","AC":"N/A","PR":"N/A","S":"N/A","C":"N/A","UI":"N/A","AV":"N/A","I":"N/A"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-32804 (High) detected in tar-4.4.8.tgz - ## CVE-2021-32804 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.8.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p> <p> Dependency Hierarchy: - qunitjs-2.4.1.tgz (Root Library) - chokidar-1.6.1.tgz - fsevents-1.2.9.tgz - node-pre-gyp-0.12.0.tgz - :x: **tar-4.4.8.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has a arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar. <p>Publish Date: 2021-08-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804>CVE-2021-32804</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9">https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9</a></p> <p>Release Date: 2021-08-03</p> <p>Fix Resolution: tar - 3.2.2, 4.4.14, 5.0.6, 6.1.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"tar","packageVersion":"4.4.8","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"qunitjs:2.4.1;chokidar:1.6.1;fsevents:1.2.9;node-pre-gyp:0.12.0;tar:4.4.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"tar - 3.2.2, 4.4.14, 5.0.6, 6.1.1"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-32804","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has a arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804","cvss3Severity":"high","cvss3Score":"8.2","cvss3Metrics":{"A":"N/A","AC":"N/A","PR":"N/A","S":"N/A","C":"N/A","UI":"N/A","AV":"N/A","I":"N/A"},"extraData":{}}</REMEDIATE> -->
non_usab
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href dependency hierarchy qunitjs tgz root library chokidar tgz fsevents tgz node pre gyp tgz x tar tgz vulnerable library found in base branch master vulnerability details the npm package tar aka node tar before versions and has a arbitrary file creation overwrite vulnerability due to insufficient absolute path sanitization node tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the preservepaths flag is not set to true this is achieved by stripping the absolute path root from any absolute file paths contained in a tar file for example home user bashrc would turn into home user bashrc this logic was insufficient when file paths contained repeated path roots such as home user bashrc node tar would only strip a single path root from such paths when given an absolute file path with repeating path roots the resulting path e g home user bashrc would still resolve to an absolute path thus allowing arbitrary file creation and overwrite this issue was addressed in releases and users may work around this vulnerability without upgrading by creating a custom onentry method which sanitizes the entry path or a filter method which removes entries with absolute paths see referenced github advisory for details be aware of cve which fixes a similar bug in later versions of tar publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree qunitjs chokidar fsevents node pre gyp tar isminimumfixversionavailable true minimumfixversion tar basebranches vulnerabilityidentifier cve vulnerabilitydetails the npm package tar aka node tar before versions and has a arbitrary file creation overwrite vulnerability due to insufficient absolute path sanitization node tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the preservepaths flag is not set to true this is achieved by stripping the absolute path root from any absolute file paths contained in a tar file for example home user bashrc would turn into home user bashrc this logic was insufficient when file paths contained repeated path roots such as home user bashrc node tar would only strip a single path root from such paths when given an absolute file path with repeating path roots the resulting path e g home user bashrc would still resolve to an absolute path thus allowing arbitrary file creation and overwrite this issue was addressed in releases and users may work around this vulnerability without upgrading by creating a custom onentry method which sanitizes the entry path or a filter method which removes entries with absolute paths see referenced github advisory for details be aware of cve which fixes a similar bug in later versions of tar vulnerabilityurl
0
264,503
23,121,208,219
IssuesEvent
2022-07-27 21:44:29
nrwl/nx
https://api.github.com/repos/nrwl/nx
closed
TypeError: Cannot read properties of null (reading 'constructor')
type: bug blocked: retry with latest stale scope: react-native
## Current Behavior <img width="1103" alt="Screen Shot 2022-06-21 at 08 39 47" src="https://user-images.githubusercontent.com/27006656/174704841-3fdc7418-543a-430c-8ab6-3986edc10537.png"> 50/50 in ci/cd pipeline (gitlab) build with @nrwl/expo break but nx think build is succesfull ## Expected Behavior 1. Successful build 2. If can't build throw error ## Steps to Reproduce @nrwl/expo build-web
1.0
TypeError: Cannot read properties of null (reading 'constructor') - ## Current Behavior <img width="1103" alt="Screen Shot 2022-06-21 at 08 39 47" src="https://user-images.githubusercontent.com/27006656/174704841-3fdc7418-543a-430c-8ab6-3986edc10537.png"> 50/50 in ci/cd pipeline (gitlab) build with @nrwl/expo break but nx think build is succesfull ## Expected Behavior 1. Successful build 2. If can't build throw error ## Steps to Reproduce @nrwl/expo build-web
non_usab
typeerror cannot read properties of null reading constructor current behavior img width alt screen shot at src in ci cd pipeline gitlab build with nrwl expo break but nx think build is succesfull expected behavior successful build if can t build throw error steps to reproduce nrwl expo build web
0
194,539
14,681,130,624
IssuesEvent
2020-12-31 12:17:02
promitor/charts
https://api.github.com/repos/promitor/charts
opened
Push CI image to CI Helm registry for Scraper agent
agents:scraper enhancement testing
Push CI image to CI Helm registry for Scraper agent once deployment succeeds.
1.0
Push CI image to CI Helm registry for Scraper agent - Push CI image to CI Helm registry for Scraper agent once deployment succeeds.
non_usab
push ci image to ci helm registry for scraper agent push ci image to ci helm registry for scraper agent once deployment succeeds
0
16,283
10,722,130,217
IssuesEvent
2019-10-27 09:33:46
brycx/orion
https://api.github.com/repos/brycx/orion
closed
Making the public API more familiar
Breaking change question usability
The public API for streaming structs and newtypes could be made more familiar to users that come from other Rust crypto libraries. In continuation of trying to make the streaming API more consistent (https://github.com/brycx/orion/pull/87) the general approach seems to be that all one-shot functions are also part of the struct (an example of this is the RustCrypto `Digest` trait). Maybe the current should be changed to: - `module::one_shot_function()` -> `module::Ctx::one_shot_function()` - `module::verify()` -> `module::Ctx::verify()` For the newtypes API, to be more consitent with types throughout Rust, we chould change: - `get_length` -> `len` Changing the newtype API seems worthwhile, but it's debatable whether changing the straming struct API will make a noticable difference in usability.
True
Making the public API more familiar - The public API for streaming structs and newtypes could be made more familiar to users that come from other Rust crypto libraries. In continuation of trying to make the streaming API more consistent (https://github.com/brycx/orion/pull/87) the general approach seems to be that all one-shot functions are also part of the struct (an example of this is the RustCrypto `Digest` trait). Maybe the current should be changed to: - `module::one_shot_function()` -> `module::Ctx::one_shot_function()` - `module::verify()` -> `module::Ctx::verify()` For the newtypes API, to be more consitent with types throughout Rust, we chould change: - `get_length` -> `len` Changing the newtype API seems worthwhile, but it's debatable whether changing the straming struct API will make a noticable difference in usability.
usab
making the public api more familiar the public api for streaming structs and newtypes could be made more familiar to users that come from other rust crypto libraries in continuation of trying to make the streaming api more consistent the general approach seems to be that all one shot functions are also part of the struct an example of this is the rustcrypto digest trait maybe the current should be changed to module one shot function module ctx one shot function module verify module ctx verify for the newtypes api to be more consitent with types throughout rust we chould change get length len changing the newtype api seems worthwhile but it s debatable whether changing the straming struct api will make a noticable difference in usability
1
308,306
23,242,963,864
IssuesEvent
2022-08-03 17:14:38
BCDevOps/developer-experience
https://api.github.com/repos/BCDevOps/developer-experience
opened
Notes for RHEL image build changes
automation documentation enhancement team/DXC ops and shared services
**Describe the issue** Create a brief notes for **Build with RHEL Base Images**. - RedHat's build entitlement subscription method has been changed to **SCA (Simple Content Access)**, which requires developers to use RH build entitlements as a volume and allows them to remove the build entitlement file copy command from their Dockerfile. **Additional context** - [#2915](https://app.zenhub.com/workspaces/platform-experience-5bb7c5ab4b5806bc2beb9d15/issues/bcdevops/developer-experience/2915) - [#2642](https://app.zenhub.com/workspaces/platform-experience-5bb7c5ab4b5806bc2beb9d15/issues/bcdevops/developer-experience/2642) **How does this benefit the users of our platform?** - Simplified they way of using [Red Hat Enterprise Linux (RHEL) based images](https://catalog.redhat.com/software/containers/search?p=1&architecture=amd64&vendor_name=Red%20Hat%7CRed%20Hat%2C%20Inc.) **Definition of done** - Create a note and email to Olena
1.0
Notes for RHEL image build changes - **Describe the issue** Create a brief notes for **Build with RHEL Base Images**. - RedHat's build entitlement subscription method has been changed to **SCA (Simple Content Access)**, which requires developers to use RH build entitlements as a volume and allows them to remove the build entitlement file copy command from their Dockerfile. **Additional context** - [#2915](https://app.zenhub.com/workspaces/platform-experience-5bb7c5ab4b5806bc2beb9d15/issues/bcdevops/developer-experience/2915) - [#2642](https://app.zenhub.com/workspaces/platform-experience-5bb7c5ab4b5806bc2beb9d15/issues/bcdevops/developer-experience/2642) **How does this benefit the users of our platform?** - Simplified they way of using [Red Hat Enterprise Linux (RHEL) based images](https://catalog.redhat.com/software/containers/search?p=1&architecture=amd64&vendor_name=Red%20Hat%7CRed%20Hat%2C%20Inc.) **Definition of done** - Create a note and email to Olena
non_usab
notes for rhel image build changes describe the issue create a brief notes for build with rhel base images redhat s build entitlement subscription method has been changed to sca simple content access which requires developers to use rh build entitlements as a volume and allows them to remove the build entitlement file copy command from their dockerfile additional context how does this benefit the users of our platform simplified they way of using definition of done create a note and email to olena
0
360
2,582,904,710
IssuesEvent
2015-02-15 19:42:45
cambridge-alpha-team/unvisual-frontend
https://api.github.com/repos/cambridge-alpha-team/unvisual-frontend
opened
Announce range of choices
usability
Theo suggested announcing the range of choices. For example, expanding "play 60" would indicate the minimum and maximum values allowed. Relevant to #10
True
Announce range of choices - Theo suggested announcing the range of choices. For example, expanding "play 60" would indicate the minimum and maximum values allowed. Relevant to #10
usab
announce range of choices theo suggested announcing the range of choices for example expanding play would indicate the minimum and maximum values allowed relevant to
1
End of preview.

Dataset Card for "binary-10IQR-usab"

More Information needed
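As a minimal usage sketch — assuming the dataset is published on the Hugging Face Hub under the repo id `karths/binary-10IQR-usab` (taken from the collection reference at the end of this page) and exposes a default train split — it can be loaded with the `datasets` library:

```python
# Minimal sketch: load the dataset from the Hugging Face Hub and peek at one row.
# The repo id "karths/binary-10IQR-usab" and the existence of a "train" split are
# assumptions based on this preview page, not confirmed by the dataset card.
from datasets import load_dataset

ds = load_dataset("karths/binary-10IQR-usab", split="train")

print(ds.column_names)  # inspect the schema rather than assuming exact column names
print(ds[0])            # first row, matching the preview rows shown above
```

The snippet deliberately prints the column names instead of hardcoding them, since the exact field names in the hosted version may differ from what this preview displays.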

Downloads last month: 19

Collection including karths/binary-10IQR-usab