CVE: CVE-2011-0779
CWE:
- 789
- 20
bugs:
- 62791
repo: https://chromium.googlesource.com/chromium/src/
vccs:
- note: |
    The entirety of the sandboxed_extension_unpacker.cc and .h files was written
    for this very large code change, which looks like the first step of a larger
    effort to refactor the way Chrome handled unpacking extensions within the
    ExtensionService. This commit essentially introduced the bug because the file
    did not exist previously. Logic was introduced to check other conditions
    surrounding the sizing of the key, but no logic was added to make sure the
    key size wasn't zero. It was likely just an oversight.
  commit: af1277b79f621b307fbcff76768cd7a225841e9e
fixes:
- note: |
    The fix was small and easy to understand. The author simply added a check to
    make sure that the header signature size was not zero. If the signature size
    was zero, the code would return false and report a failure rather than
    continuing.
  commit: ffeada1f2de5281d59ea48c94c4001a568092cd3
bounty:
  date:
  amount:
references: []
lessons:
  yagni:
    note:
    applies: false
  question: |
    Are there any common lessons we have learned from class that apply to this
    vulnerability? In other words, could this vulnerability serve as an example
    of one of those lessons? Leave "applies" blank or put false if you did not
    see that lesson (you do not need to put a reason). Put "true" if you feel
    the lesson applies and put a quick explanation of how it applies. Don't
    feel the need to claim that ALL of these apply, but it's pretty likely that
    one or two of them apply. If you think of another lesson we covered in
    class that applies here, feel free to give it a small name and add one in
    the same format as these.
  serial_killer:
    note:
    applies: false
  complex_inputs:
    note: |
      The key was trusted to have a non-zero size, but the input itself was
      complex. Many other checks were performed on the header size (for
      example, whether it was too big), but because a zero-size header was an
      edge case, everyone involved forgot to check for it. This likely happened
      because the complexity of the input meant there were many cases to test.
    applies: true
  distrust_input:
    note: |
      The vulnerability arose because the input (a user-developed, signed
      extension with a header) was trusted to have a non-zero header size. This
      led to a situation where a malicious attacker could create an extension
      with a zero-size header and crash Chrome, or where an unknowing or
      inexperienced developer could do the same by accident and crash the
      system.
    applies: true
  least_privilege:
    note:
    applies: false
  native_wrappers:
    note:
    applies: false
  defense_in_depth:
    note:
    applies: false
  secure_by_default:
    note:
    applies: false
  environment_variables:
    note:
    applies: false
  security_by_obscurity:
    note:
    applies: false
  frameworks_are_optional:
    note:
    applies: false
reviews:
- 4723007
upvotes: 5
mistakes:
  answer: |
    There were many other tests in the code around the header size, but a test
    for a zero-size header was never included. As far as I could tell, the
    author of the code was refactoring a large portion of the system. What they
    could have done better was to not assume that user-provided content could
    never take an unpredictable value such as 0 (the header is attached to an
    extension, which can be developed by users).

    However, because other checks on the header size were already in place, I
    think this edge case should have been considered by the team too.
    Especially when a variable's size is initialized from some kind of
    user-driven input, a few checks should always be made: could the user
    provide a huge value (overflowing a buffer), a negative value, or a zero
    value? The team should have checked all three cases rather than just the
    maximum-size case.
  question: |
    In your opinion, after all of this research, what mistakes were made that
    led to this vulnerability? Coding mistakes? Design mistakes?
    Maintainability? Requirements? Miscommunications? Look at the CWE entry for
    this vulnerability and examine the mitigations they have written there. Are
    they doing those? Does the fix look proper? Use those questions to inspire
    your answer. Don't feel obligated to answer every one. Write a thoughtful
    entry here that those in the software engineering industry would find
    interesting.
announced: '2011-02-04'
subsystem:
  name: extensions
  answer: |
    This was found in the Chrome extensions directory. I assume that the
    extensions directory pertains to all code that deals with extension-related
    activity (loading, unloading, rendering, verifying). Full URL:
    https://chromium.googlesource.com/chromium/src/chrome/browser/extensions/
  question: |
    What subsystems was the mistake in? Look at the path of the source code
    files that were fixed to get directory names. Look at comments in the code.
    Look at how the bug report was tagged. Examples: "clipboard", "gpu", "ssl",
    "speech", "renderer"
discovered:
  date: '2010-11-11'
  answer: |
    As mentioned in the unit test block, it is unclear how this was found. In
    the code review for the fix, the author of the review mentions this:
    Message: Simple fix for a report we got this morning.
    From the word 'report' I am lightly concluding that this was found
    automatically by some tool, but the tool is unnamed in the review and in
    the commit messages. No automated tests could be referenced by email
    signatures either. So I am not sure how this was found.
  google: true
  contest:
  question: |
    How was this vulnerability discovered? Go to the bug report and read the
    conversation to find out how this was originally found. Answer in longform
    below in "answer", fill in the date in YYYY-MM-DD, and then determine if
    the vulnerability was found by a Google employee (you can tell from their
    email address). If it's clear that the vulnerability was discovered by a
    contest, fill in the name there. The "automated" flag can be true, false,
    or nil. The "google" flag can be true, false, or nil. If there is no
    evidence as to how this vulnerability was found, then you may leave the
    entries blank except for "answer". Write down where you looked in "answer".
  automated: false
description: |
  Google Chrome does not properly handle a missing key in an extension, which
  allows remote attackers to cause a denial of service (application crash) via
  a crafted extension. When an extension is packaged, it is assigned a unique
  key pair. The extension's ID is based on a hash of the public key. The
  private key is used to sign each version of the extension and must be
  secured from public access.

  A variable initialized in the extension unpacking service is sized from the
  extension's header signature size, so if an attacker removed the signature
  (by providing an empty key), the variable would be sized to zero and, since
  that was unexpected, crash the browser, leading to a denial of service.
  Basically, the bug was introduced while refactoring the code to pull the
  extension-unpacking logic out of a service that had grown too complex, and
  the writer of the new file forgot to check that the variable being
  initialized would not be sized to 0 bytes.
unit_tested:
  fix: false
  code: true
  answer: |
    I did not see any unit testing related to finding this bug besides files
    named test_* in the same directory as the code. These look like unit
    tests. I can only assume no new tests were introduced, since the code
    review only points to the fix for the code, not to any unit test. As far
    as automated testing is concerned, in the code review for the fix the
    author mentions that it was a simple fix for a report. From the word
    'report' (which the author used in the code review), I am lightly
    concluding that this was found automatically by some tool, but the tool is
    unnamed in the review and in the commit messages. No automated tests could
    be referenced by email signatures either.
  question: |
    Were automated unit tests involved in this vulnerability? Was the original
    code unit tested, or not unit tested? Did the fix involve improving the
    automated tests? For the "code" answer below, look not only at the fix but
    the surrounding code near the fix and determine whether there were unit
    tests involved for this module. For the "fix" answer below, check if the
    fix for the vulnerability involves adding or improving an automated test
    to ensure this doesn't happen again.
major_events:
  answer: |
    It looks like in the span from the original bug being introduced to when
    the fix was submitted (2009-07-30 to 2010-11-11) there was a major push to
    implement a new interface for Chrome Extensions. Everyone looks to be very
    jovial (lots of funny messages in the commits, but also lots of bugs
    because of a new interface). The bug this CVE is based on was just one of
    the ones that didn't get caught for a little over a year. The team looks
    to have been the same 5-6 people over the course of the year, all with
    Google or Chromium emails. There were lots of reverts and reapplications
    of commits.
  events:
  - date: '2010-03-18'
    name: aa@chromium.org forces an app's origin to be the origin of the URL the crx (package or extension) is from.
  - date: '2010-03-19'
    name: erikkay@chromium.org loosens an app's origin constraints.
  - date: '2010-08-31'
    name: Add signing and verification to ownership API.
  - date: '2010-09-15'
    name: Add check when unpacking extensions for a null key.
  question: |
    Please record any major events you found in the history of this
    vulnerability. Was the code rewritten at some point? Was a nearby subsystem
    changed? Did the team change? The event doesn't need to be directly
    related to this vulnerability; rather, we want to capture what the
    development team was dealing with at the time.
curation_level: 0
CWE_instructions: |
  Please go to cwe.mitre.org and find the most specific, appropriate CWE entry
  that describes your vulnerability. (Tip: this may not be a good one to start
  with - spend time understanding this vulnerability before making your
  choice!)
bounty_instructions: |
  If you came across any indications that a bounty was paid out for this
  vulnerability, fill it out here. Or correct it if the information already
  here was wrong. Otherwise, leave it blank.
interesting_commits:
  answer:
  commits:
  - note: |
      The author of the original code that introduced the bug keeps
      refactoring code here, having to do with pulling Crx-related items out
      of the ExtensionService. It looks like the author is implementing a new
      UI-based system piece by piece.
    commit: fb3ef9384cc76c4237f98e5aa38d2689cc7b60cd
  - note: |
      The author rolls back the previous commit (the one mentioned above). It
      seems that because the area the author is working on is so large, it is
      hard to find all of the issues before pushing a fix. The author later
      goes on to implement it correctly (Windows UI first, Linux UI later).
    commit: 25e02aca12eabfdcd8ba0506ce242cf91ef54150
  question: |
    Are there any interesting commits between your VCC(s) and fix(es)? Write a
    brief (under 100 words) description of why you think this commit was
    interesting in light of the lessons learned from this vulnerability. Any
    emerging themes? If there are no interesting commits, demonstrate that you
    completed this section by explaining what happened between the VCCs and
    the fix.
curated_instructions: |
  If you are manually editing this file, then you are "curating" it. Set the
  entry below to "true" as soon as you start. This will enable additional
  integrity checks on this file to make sure you fill everything out properly.
  If you are a student, we cannot accept your work as finished unless curated
  is set to true.
upvotes_instructions: |
  For the first round, ignore this upvotes number. For the second round of
  reviewing, you will be giving a certain amount of upvotes to each
  vulnerability you see. Your peers will tell you how interesting they think
  this vulnerability is, and you'll add that to the upvotes score on your
  branch.
announced_instructions: |
  Was there a date that this vulnerability was announced to the world? You can
  find this in changelogs, blogs, bug reports, or perhaps the CVE date. A good
  source for this is Chrome's Stable Release Channel
  (https://chromereleases.googleblog.com/). Please enter your date in
  YYYY-MM-DD format.
fixes_vcc_instructions: |
  Please put the commit hash in "commit" below (see my example in
  CVE-2011-3092.yml). Fixes and VCCs follow the same format.
description_instructions: |
  You can get an initial description from the CVE entry on cve.mitre.org.
  These descriptions are a fine start, but they can be kind of jargony.
  Rewrite this description in your own words. Make it interesting and easy to
  read to anyone with some programming experience. We can always pull up the
  NVD description later to get more technical. Try to still be specific in
  your description, but remove Chromium-specific stuff. Remove references to
  versions, specific filenames, and other jargon that outsiders to Chromium
  would not understand. Technology like "regular expressions" is fine, and
  security phrases like "invalid write" are fine to keep too.
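
To make the shape of the bug and its fix concrete, here is a minimal, self-contained C++ sketch. It is not the actual Chromium code: the CrxHeader struct, its field names, and ValidateHeader are illustrative assumptions. It demonstrates the kind of check the fix note describes: reject a length field read from an attacker-controlled extension header when it is zero (or implausibly large) before using it to size a buffer.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Illustrative CRX-style header. Every length below comes straight from the
// (attacker-controlled) extension file, so each one must be validated.
struct CrxHeader {
  uint32_t magic;
  uint32_t version;
  uint32_t key_size;        // length of the public key that follows the header
  uint32_t signature_size;  // length of the signature that follows the key
};

// Hypothetical validation routine mirroring the shape of the fix: fail and
// report rather than continue when the key or signature length is zero, and
// also reject implausibly large values before sizing any buffers from them.
bool ValidateHeader(const CrxHeader& header, size_t max_field_size) {
  if (header.key_size == 0 || header.signature_size == 0) {
    // This is the check that was missing: a zero-length key/signature used to
    // slip through, and zero-sized buffers were created downstream.
    std::fprintf(stderr, "Extension header has an empty key or signature.\n");
    return false;
  }
  if (header.key_size > max_field_size ||
      header.signature_size > max_field_size) {
    std::fprintf(stderr, "Extension header declares an oversized field.\n");
    return false;
  }
  return true;
}

int main() {
  // A forged header with the key and signature stripped out (sizes of zero).
  CrxHeader forged{/*magic=*/0x34327243, /*version=*/2,
                   /*key_size=*/0, /*signature_size=*/0};
  if (!ValidateHeader(forged, /*max_field_size=*/1 << 20)) {
    return 1;  // report failure instead of continuing with zero-sized buffers
  }
  // Only size buffers from the header once the lengths are known to be sane.
  std::vector<uint8_t> key(forged.key_size);
  std::vector<uint8_t> signature(forged.signature_size);
  return 0;
}
```

Written this way, a stripped-down or corrupted extension is rejected with an error instead of crashing the unpacker, which matches the behavior described in the fix note above.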
